Design Challenge
The Assessment Work Group (AWG) led a multi-year Design Challenge from 2017 to 2019 with the goal of establishing a set of design principles to stimulate the development, continuous improvement, and adoption of direct assessments of students’ social and emotional competence. These principles were informed, in part, by identifying and studying 18 award-winning efforts across three annual challenges.
The AWG released the results of this effort – a set of eight design principles – in a final brief. If you are interested in how to capture what social and emotional skills youth can actually demonstrate, we urge you to review the principles and let us know what you think by taking a quick survey.
The Design Principles
Eight design principles were identified through the three-year Design Challenge. The principles emerged from a study of existing direct assessments and an evaluation by a panel of expert judges, with input from practitioners in the field. We acknowledge that direct assessments are used in a wide range of ways in educational settings and that some of these principles may be more valuable in certain contexts than in others. Nevertheless, a strong direct assessment will generally reflect most or all of these principles.
- Principle 1: Assessment is a Direct Measure of Competence
- Principle 2: Assessment is Educationally Appropriate
- Principle 3: Assessment is Developmentally Appropriate
- Principle 4: Assessment is Culturally Appropriate
- Principle 5: Assessment is Technically Appropriate
- Principle 6: Assessment is Practical to Implement
- Principle 7: Assessment Data Are Useful
- Principle 8: Assessment is a Good Fit Overall
Learn more.
Final Design Challenge (2019)
We identified six measures based on their successful implementation in educational settings, spanning three categories: web-based assessments that use virtual scenarios, rubric-based assessments created within school districts, and rubric-based assessments created within programs. Learn more.
Web-based assessments that use virtual scenarios to assess performance
- The Minnesota Executive Function Scale (MEFS App), developed by Reflection Sciences, is a direct assessment designed to measure the following executive function skills in children as young as 24 months: working memory, inhibitory control, and cognitive flexibility.
- SELweb, developed by xSEL Labs, measures emotion recognition, self-control, social perspective-taking, and social problem solving in students in kindergarten through 6th grade. SELweb is “played” using a web-based platform.
Rubric-based assessments created within school districts
- The 5 Scholarly Habits, developed by Two Rivers Public Charter School District, is an observational rubric that teachers fill out to measure students’ SEL competencies according to the following statements: I know myself; I am independent and resilient; I show compassion and embrace diversity; I can connect and collaborate; and I act with integrity.
- The Standards Based Report Card, developed by San Francisco Unified School District (SFUSD), displays the scores teachers give students in academic subjects and SEL.
Rubric-based assessments created within programs
- The Dream Life Skills Assessment Scale, developed by Dream a Dream, is an observational assessment of 4th through 9th graders that measures the following five life skills: interacting with each other, solving problems and overcoming difficulties, taking initiative, managing conflict, and understanding instructions.
- The Deep Learning Progressions, developed by New Pedagogies for Deeper Learning (NPDL), is used by teachers to measure each student’s developmental progression in the six Deep Learning Competencies, or 6Cs: communication, critical thinking, collaboration, creativity, citizenship, and character.
Second Design Challenge (2018)
In the spring of 2018, the SEL Assessment Work Group launched its Second Annual SE Competence Assessment Design Challenge. In the first year, we learned that we needed more input from practitioners to understand their most pressing needs for social-emotional (SE) competence assessments. In the second year, we addressed this by involving practitioner perspectives in the design of SE competence assessments.
Five proposals were selected from the 11 submissions we received. Proposals were rated by practitioners and assessment experts on the purpose of the assessment, the measurement method, the extent to which the assessment addresses a practitioner need, the research and frameworks that informed the design, data reporting, developmental and cultural appropriateness, usability, scalability, and technical merit. We accepted a range of submissions from both practitioners and assessment developers. Learn more.
First Place: Selected Response Assessment of Social Emotional Competence (SRASEC)
A selected-response assessment of elementary students’ social emotional competence, addressing knowledge via a game-like, computer-based administration platform. The final format will include about 35 items linked to about five vignettes, with content designed to elucidate elementary students’ thinking about issues related to SE competence (e.g. a child wants to join a group on the playground, an older child bullies a younger child on the bus, one child looks at another’s responses on a test).
Submitted by: Ryan Kettler PhD, Associate Professor, Rutgers, The State University of New Jersey; Kelly A. Feeney-Kettler, PhD, F K & K Consulting; Leah Dembitzer PsyD, Concordia College, New York
Second Place: Text-Based Decision Game
A text-based decision game for high school students in which they walk through scenarios, earn points, and are connected to resources to help them along their pathway. Scenario details will depend on an individual student’s academic and behavioral data. For example, an incoming 9th-grade student (on whom we will have little data) will start the game by thinking through the type of graduation pathway they want to pursue and the classes they need to take in the current and future years, whereas a returning 11th-grade student will start the game by thinking more specifically about how to prioritize submitting college applications, given how frequently they have missed homework assignments in the past month.
Submitted by: Miguel Rivera-Rios, Data Manager, Democracy Prep Public Schools; Sharese Maine, Democracy Prep Public Schools; Alize-Jazel Smith, Democracy Prep Public Schools
Third Place: Virtual Environment for Social Information Processing (VESIP)
A theory-based, web-based assessment for third through seventh grade students that uses an interactive and immersive simulation format to assess children’s social information processing skills by measuring a child’s ability to effectively reason through five types of challenging social situations: (a) ambiguous provocation, (b) bullying, (c) compromise, (d) peer entry into a group, and (e) friendship initiation. The overarching goal of VESIP is to fill a gap in available methods to assess social information processing, a key component of social-emotional learning competencies.
Submitted by: Nicole M. Russo-Ponsaran, Research Director, Rush NeuroBehavioral Center; Jeremiah Folsom-Kovarik, Soar Technology; Eric Tucker, Soar Technology; Jacob Crossman, Soar Technology
Fourth Place: The Skill Rubric Template (SRT)
A set of rubrics designed to allow for authentic assessment of high school students’ social and emotional skills as demonstrated during regular classroom or program activities. The SRT provides a central framework for defining and measuring skills of communication, collaboration, problem solving, innovation, grit, and self-management. Evaluators (teachers, coaches, mentors, etc.) construct a rubric for any given task by first identifying which skills and subskills students should exhibit in the performance of that task.
Submitted by: Stacie Furia, Senior Director of Evaluation, BUILD; Nicole Ramos, BUILD; Ryan Novack, BUILD; Diane Bezucha, BUILD
Fifth Place: Problem Solving Performance Assessment
This assessment is designed to determine high school students’ level of proficiency in problem solving. Students first generate solutions to an open-ended hypothetical scenario in which enrollment is declining at their school. Next, they examine their solutions for (1) strengths, (2) weaknesses, (3) opportunities, and (4) threats (a SWOT analysis) and choose the best solution. Students are scored on a 4-point rubric according to the degree to which they consistently apply thorough reasoning and logic in solving the problem.
Submitted by: Tara Laughlin, Curriculum Director, PAIRIN
Practitioner Needs Winners
In the first phase of the second annual Design Challenge, we issued a call to practitioners to share their most pressing SEL assessment needs. We received over 60 submissions and, after a thorough review, selected the 10 that most clearly and convincingly articulated a practitioner need. Read about the winners in our blog.
First Design Challenge (2017)
Seven proposals were selected from the 20 submissions we received. Applications of direct assessments of social-emotional competence were rated on innovation, clarity, usefulness, scalability, data reporting, developmental and cultural appropriateness, how engaging they are, and technical merit. We accepted a range of submissions, from assessments still in the early phases of development to those that had already been tested and were in use in schools. Learn more.
First Place: Student Assessment Engagement
When students take an achievement test on a computer, metadata such as the amount of time spent on each item are often collected. Research shows that students who respond extremely quickly, so quickly that they could not have processed the item’s content, are likely disengaged from the test. Our measure quantifies how often a student responds this quickly over the course of a test, and it is strongly correlated with scores from measures of social-emotional learning constructs like self-regulation and self-management.
Submitted by: James Soland, Research Scientist, NWEA; Nate Jensen, Senior Research Scientist, NWEA; Tran D. Keys, Executive Director of Research and Evaluation, Santa Ana Unified School District; Sharon Z. Bi, Educational Research Analyst, Santa Ana Unified School District; Emily Wolk, Assistant Director of Research and Evaluation, Santa Ana Unified School District
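To make the underlying computation concrete, here is a minimal sketch of how a rapid-response rate could be tallied from item-level response times, assuming a per-item time threshold below which a response counts as too fast. The function name, thresholds, and example values are illustrative assumptions; the submission does not specify how the NWEA team sets its cutoffs.

```python
from typing import List


def rapid_response_rate(response_times: List[float], thresholds: List[float]) -> float:
    """Return the fraction of items answered faster than an item-specific
    rapid-response threshold (a simple proxy for test disengagement)."""
    if len(response_times) != len(thresholds):
        raise ValueError("Need one threshold per item.")
    if not response_times:
        return 0.0
    rapid = sum(time < cutoff for time, cutoff in zip(response_times, thresholds))
    return rapid / len(response_times)


# Hypothetical example: response times in seconds for a 10-item test,
# with an illustrative 5-second cutoff applied to every item.
times = [3.2, 12.5, 2.1, 45.0, 4.8, 30.2, 1.9, 22.4, 3.5, 18.0]
cutoffs = [5.0] * len(times)
print(f"Rapid-response rate: {rapid_response_rate(times, cutoffs):.2f}")
```

In practice, a per-student rate like this could then be correlated with scores on self-regulation or self-management measures, as the description above suggests.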
Second Place: Social Detective
Panorama’s Social Detective is designed to measure and help students practice social perspective-taking, a malleable and central social competency that underlies a vast range of social-emotional functioning at school and in life. In this performance task, students are challenged to be a “social detective” whose job is to figure out other people’s values, interests, and perspectives. After watching short video interviews, students answer a series of questions to gauge how well they perceive and understand each person. Learn More Here
Submitted by: Panorama Education
Third Place (tie): PERC
The PERC is a computer-based tool that assesses students’ Persistence, Effort, Resilience and Challenge-seeking behavior. These are key behavioral expressions of a growth mindset of intelligence. Learn More Here
Submitted by: Tenelle Porter and Kali Trzesniewski, Department of Human Ecology, University of California, Davis; Lisa Blackwell and Sylvia Roberts, MindsetWorks
Third Place (tie): Zoo U Social Emotional Skills Assessment
Zoo U provides a game platform for performance-based formative assessment of social emotional skills in upper elementary grades. Learn More Here
Submitted by: Melissa E. DeRosier, PhD, 3C Institute and Centervention; James M. Thomas, PhD, 3C Institute and Centervention
Fourth Place: The Calendar Task
The Calendar Task (in development) is intended to provide a flexible and naturalistic platform for evaluating student self-management skills, with potential use cases that are both summative and formative in nature.
Submitted by: Patricia Inglese, Research Project Manager, Educational Testing Service; Adam Bacall, Research Assistant, Educational Testing Service; Patrick Barnwell, Supervisor of Research, Educational Testing Service; Sam Rikoon, Associate Research Scientist, Educational Testing Service
Fifth Place: PLUS Executive Functioning Assessment
In order to employ direct assessments of executive function (EF) skills at scale, we developed a group-based assessment procedure that is time-efficient and cost-effective. We adapted four developmentally appropriate, widely used EF tasks for administration on tablet computers in a classroom setting. Our classroom protocol allows a minimally disruptive assessment of EF skills in all students at the same time. Learn More Here
Submitted by: Jelena Obradovic, Stanford University
Sixth Place: An Incentivized Method for Measuring Grit
This assessment measures grit, i.e., the propensity to set ambitious goals, persevere in the face of failure, and invest effort in building skill. We use an incentivized methodology that involves rewarding successful outcomes.