Second Annual SE Competence Assessment Design Challenge
In the Spring of 2018, the SEL Assessment Work Group launched our Second Annual SE Competence Assessment Design Challenge. In our first year, we learned that more input from practitioners was needed to help us understand their most pressing needs for social-emotional (SE) competence assessments. This year, we aimed to address this by involving practitioner perspectives in the design of SE competence assessments.
Our second Design Challenge consisted of two stages:
1) A call for practitioners to describe their student SEL assessment needs, from which a panel of practitioner reviewers chose the ten most compelling and actionable statements of need.
2) A challenge to developers to submit assessment proposals that address the selected practitioner needs, from which a panel of practitioners and researchers picked the best-designed direct assessments.
Congratulations to our Assessment Winners!
We are thrilled to announce the winners of the second annual Design Challenge. Five proposals were selected out of the 11 submissions we received.
Assessment proposals were rated by practitioners and assessment experts on the purpose of the assessment, the measurement method, the extent to which the assessment addresses a practitioner need, research and frameworks that informed the design, data reporting, developmental and cultural appropriateness, usability, scalability, and technical merit. We accepted a range of submissions from both practitioners and assessment developers.
View our recent webinar on the results of the Design Challenge:
Or view the slides here.
The webinar showcased three of the winning measures from the Design Challenge. The three awardees discussed their work, including which social-emotional competencies they assess and how.
- 1st place: Ryan Kettler, Rutgers, The State University of New Jersey
- 2nd place: Miguel Rivera-Rios, Democracy Prep Schools
- 3rd place: Ashley Karls, presenting on behalf of Nicole Russo, Rush Neurobehavioral Center
Selected Response Assessment of Social Emotional Competence (SRASEC)
A selected-response assessment of elementary students’ social emotional competence, addressing knowledge via a game-like, computer-based administration platform. The final format will include about 35 items linked to about five vignettes, with content designed to elucidate elementary students’ thinking about issues related to SE competence (e.g., a child wants to join a group on the playground, an older child bullies a younger child on the bus, one child looks at another’s responses on a test). Students will be asked to describe the feelings that characters in the vignette are experiencing, the reasoning for their actions, and how the characters are likely to behave next. Items will be dichotomously scored (i.e., correct or incorrect) and will use formats common to knowledge tests, such as multiple choice, true/false, or matching.
Ryan Kettler, PhD, Associate Professor, Rutgers, The State University of New Jersey
Kelly A. Feeney-Kettler, PhD, F K & K Consulting
Leah Dembitzer, PsyD, Concordia College, New York
Text-Based Decision Game
A text-based decision game for high school students in which they walk through scenarios, earn points, and are connected to resources to help them along their pathway. Scenario details will depend on an individual student’s academic and behavioral data. For example, an incoming 9th grade student (on whom we will have little data) will start the game by thinking through the type of graduation pathway they want to pursue and what types of classes they need to take during the current and future years, whereas a returning 11th grade student will start the game by thinking more specifically about how they will prioritize submitting college applications, considering their frequency of missed homework assignments in the past month. In its initial design, the game will be targeted at addressing the unique set of values at Democracy Prep Public Schools (Discipline, Respect, Enthusiasm, Accountability, Maturity, Bravery, Initiative, and Grit), but it will later be able to include other school values.
Miguel Rivera-Rios, Data Manager, Democracy Prep Public Schools
Sharese Maine, Democracy Prep Public Schools
Alize-Jazel Smith, Democracy Prep Public Schools
Virtual Environment for Social Information Processing (VESIP)
A theory-based, web-based assessment for third through seventh grade students that utilizes an interactive and immersive simulation format to assess children’s social information processing skills by measuring a child’s ability to effectively reason through five types of challenging social situations: (a) ambiguous provocation, (b) bullying, (c) compromise, (d) peer entry into a group, and (e) friendship initiation. The overarching goal of VESIP is to fill a gap in available methods to assess social information processing, a key component of social-emotional learning competencies.
Nicole M. Russo-Ponsaran, Research Director, Rush NeuroBehavioral Center
Jeremiah Folsom-Kovarik, Soar Technology
Eric Tucker, Soar Technology
Jacob Crossman, Soar Technology
The Skill Rubric Template (SRT)
A set of rubrics designed to allow for authentic assessment of high school students’ social and emotional skills as demonstrated during regular classroom or program activities. The SRT provides a central framework for defining and measuring skills of communication, collaboration, problem solving, innovation, grit, and self-management. Evaluators (teachers, coaches, mentors, etc.) construct a rubric for any given task by first identifying which skills and subskills students should exhibit in the performance of that task. The SRT ensures that anyone assessing a student’s skills is (a) using the same definition of the skill, (b) looking for the same indicators, and (c) assigning scores in a similar way so that assessment scores can be meaningfully linked across different contexts.
Stacie Furia, Senior Director of Evaluation, BUILD
Nicole Ramos, BUILD
Ryan Novack, BUILD
Diane Bezucha, BUILD
Problem Solving Performance Assessment
This assessment is designed to determine high school students’ level of proficiency in problem solving. Students first generate solutions to an open-ended hypothetical scenario in which enrollment is declining at the student’s school. Next, they examine their solutions for (1) Strengths, (2) Weaknesses, (3) Opportunities, and (4) Threats (a SWOT analysis) and choose the best solution. Students are assessed on a 4-point rubric according to the degree to which they consistently apply thorough reasoning and logic in solving the problem.
Tara Laughlin, Curriculum Director, PAIRIN
Practitioner Needs Winners
In the first phase of the second annual Design Challenge, we issued a call to practitioners to share their most pressing SEL assessment needs. We received over 60 submissions and, after a thorough review, selected the 10 that most clearly and convincingly articulated a practitioner need. Read about the winners below, then read more at our blog.
Lucia Alfaro, Livingston Union School District
I would like to measure whether and how parents and their children understand the same SEL strategies: managing emotions, recognizing mental health crisis warning signs that call for additional support, and practicing empathy daily. Measuring tools could include pre/post tests administered before and after calibrated SEL parent education and before and after schoolwide K-5 student SEL education. Data will be used to target additional education needs within parent and student populations so that SEL can continue in the home in coordination with school SEL education efforts.
Margaret Borelli, Meriden Public Schools
I would like to be able to determine a baseline, teach an evidence-based curriculum, and measure the effectiveness of the curriculum. In order to meet IEP goals and help students develop their problem-solving skills, I need to be able to measure current skills and teach a curriculum to develop them. I work with over 100 students in small groups or individually. I would like to be able to collect concrete data regarding lagging skills and skill improvement following direct instruction and practice of skills. The data collected will drive programming, ensure fidelity of instruction, and help with the generalization of skills learned in small groups or individually.
Randie Chubin, Cheder Lubavitch Hebrew Day School
I would love a way to know when to push a student to try again and when to let the child stop working for the time being. Many of the students I work with believe they won’t be able to do the assignment put in front of them. My job is to break assignments down into manageable pieces and explain them in terms the students will understand. The problem is, when they come to me, they already have a “stuck” attitude. I would like a way to measure, beyond their words, whether I should use my time convincing the students that they can do it, or whether I should be more of a social worker and ask questions, such as what is the most challenging part of this…. I go with my gut and usually end up doing both, but a tool to measure this would be helpful for guidance, as well as documentation. How often? Each session. How many students? About 20 a day, from 1-6 at a time.
Carolyn Coli, FXW
Background info: I started new after-school social skills groups for grades K-3. I want to create a screener and pre/post assessments to measure self-regulation (body and emotion), growth mindset/confidence, and social perspective-taking. I have 16 students total in four groupings. I want this assessment to inform my curriculum, show administrators and parents data that supports my program, and highlight the need for this support/resource.
Elizabeth Grant, Boston Green Academy
We would like to be able to prepare our 100+ high school juniors and seniors for college success by measuring their ability to 1) problem-solve when new or unexpected situations arise in college; 2) initiate social interactions with professors, fellow students, and staff to build support networks in a new environment; and 3) collaborate with others to find ways to be successful. If we could measure these skills in high school, then perhaps we could collaborate with families to help develop or enhance these skills and better prepare students to successfully navigate the transition from high school to college.
Theresa Lewis, Saint Mary’s University
I would like to have a variety of ways K-12 teachers could assess how their students are growing in their abilities to collaborate with their classmates in small groups. I would also like to have some way of assessing how the team approach fosters academic success for all members of the team. The teachers in my program are designing lessons to teach their students how to learn as a social process. They have picked certain aspects of social learning on which to focus their lessons. Their lessons are based on skills necessary to make a team function effectively, such as communication, self-motivation, responsibility, problem solving, social awareness, and self-management. They have used self-assessment, reflection, and observation to assess the progress of their groups. They are looking for more ideas to help them decide whether their students are actually using the skills they are teaching them to enhance the learning of content in their small groups.
Jeffrey Lund, Hyde Foundation
At Hyde Foundation we have a small network (5 schools, 2,300 students) serving schools in urban, low-income communities of color (in the USA, approximately 12 million students in 20,000 schools). We have an integrated character and leadership development framework with practices, programs, and processes. Our mantra is that individuals are born with a unique potential and character defines a destiny. We help schools, teachers, students, and parents forge a partnership to help transform student lives. We call it family-based character education. We want to better assess and correlate the impact of student character and leadership development on academic performance, graduation, college graduation, and success in life. We would love to expand our data collection across more schools to enrich our findings. Our findings would be made available to participating schools.
Nicole Ramos, BUILD
The BUILD curriculum uses entrepreneurship and experiential learning to drive growth in key skills (Communication, Collaboration, Problem Solving, Innovation, Grit, Self-Management, and Growth Mindset) that matter for future success in high school, college, career and beyond. Because the skills are central to the theory of change for the curriculum, we seek to understand how students are growing in their skills, and even further, understand at what points during the curriculum they experience most growth. While that information would be helpful to understand whether our program “works” in the way we believe it to, what we really want to know is whether or how students are transferring the skills that we explicitly aim to teach into other contexts. We imagine this might be done by engaging students in ongoing self-reported behavioral assessment via SMS or an app to understand if/how they are thinking about and activating the skills outside of the classroom – in other classes, in their personal lives, etc. This type of assessment would help us focus on not only the skill development, but also building the right strategies (metacognitive and otherwise) to help students use the skills to foster success more broadly.
Christine Rick, School District of Palm Beach County
As a school administrator, I would like to measure the impact of SEL on student performance. Working with the Wallace Foundation and CASEL, we are implementing Morning Meeting. Although I am excited to continue the SEL journey with my students and staff, I need an assessment to measure the impact.
Laura Robinson, Kent Intermediate School District
I am wondering if there is a group you have unintentionally left out of the SEL discussions: career and technical education students. In our district we are overlaying PBIS and the career and employability (non-cognitive) skills demanded by the employers hiring our students (11th and 12th grade) across our campus (4 high schools). We need to be able to assess these skills to certify our students going directly into careers (businesses are demanding this from the school district). We serve (at an Intermediate School District level) approximately 3,000 students coming to us from all of our local districts.
View our recent webinar on the results of our 2nd Design Challenge
Key Design Principles for SEL Direct Assessments: key lessons from the 2nd Design Challenge.
(And check out what we learned from the 1st Design Challenge here.)
Join our Collaborator Network to receive announcements about the Design Challenge.
The Winners of the 2017 Design Challenge
These seven proposals were selected out of the 20 submissions we received. Proposals for direct assessments of social-emotional competence were rated on innovation, clarity, usefulness, scalability, data reporting, developmental and cultural appropriateness, how engaging they are, and technical merit. We accepted a range of submissions, from assessments that were still in the early phases of development to those that had been tested and were being used in schools.
You can view the slides or watch the recording of our recent webinar:
The webinar showcased four of the winning measures of the Design Challenge. The four awardees discussed their work, including what social-emotional competencies they assess and how.
- 1st place: Jim Soland, NWEA
- 2nd place: Sam Moulton, Panorama
- 3rd place (tie): Tenelle Porter, UC Davis, and Melissa DeRosier, 3C Institute and Centervention
First Design Challenge Media Coverage
Student Assessment Engagement
When students take an achievement test on a computer, metadata like the amount of time spent on each item are often collected. Research shows that students who often respond extremely fast (so quickly they could not have understood the item’s content) are likely disengaged from the test. Our measure quantifies how often students respond extremely quickly over the course of a test, which is strongly correlated with scores from measures of social-emotional learning constructs like self-regulation and self-management.
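As a rough illustration of the kind of metric described above (a sketch of the general idea, not the awardees’ actual scoring procedure), a per-student rapid-guessing rate can be computed from item response times; the 2-second threshold and data layout here are hypothetical placeholders:

```python
# Illustrative sketch: fraction of items answered faster than a plausible
# reading time. The 2-second threshold is an assumed placeholder, not a
# detail taken from the winning proposal.

def rapid_guess_rate(response_times, threshold=2.0):
    """Return the fraction of items answered in under `threshold` seconds."""
    if not response_times:
        raise ValueError("no response times recorded")
    rapid = sum(1 for t in response_times if t < threshold)
    return rapid / len(response_times)

# A student who answered 3 of 40 items in under 2 seconds:
times = [1.2, 0.8, 1.5] + [12.0] * 37
print(round(rapid_guess_rate(times), 3))  # 0.075
```

In practice, thresholds in this literature are often set per item rather than globally, since a long reading passage warrants more time than a short one.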
James Soland, Research Scientist, NWEA
Nate Jensen, Senior Research Scientist, NWEA
Tran D. Keys, Executive Director of Research and Evaluation, Santa Ana Unified School District
Sharon Z. Bi, Educational Research Analyst, Santa Ana Unified School District
Emily Wolk, Assistant Director of Research and Evaluation, Santa Ana Unified School District
Social Detective
Panorama’s Social Detective is designed to measure and help students practice social perspective-taking, a malleable and central social competency that underlies a vast range of social-emotional functioning at school and in life. In this performance task, students are challenged to be a “social detective” whose job is to figure out other people’s values, interests, and perspectives. After watching short video interviews, students answer a series of questions to gauge how well they perceive and understand each person.
Third Place (tie)
PERC
The PERC is a computer-based tool that assesses students’ Persistence, Effort, Resilience, and Challenge-seeking behavior. These are key behavioral expressions of a growth mindset of intelligence.
Tenelle Porter and Kali Trzesniewski, Department of Human Ecology, University of California, Davis
Lisa Blackwell and Sylvia Roberts, MindsetWorks
Third Place (tie)
Zoo U Social Emotional Skills Assessment
Zoo U provides a game platform for performance-based formative assessment of social emotional skills in the upper elementary grades.
The Calendar Task
The calendar task (in development) will provide a flexible and naturalistic platform to evaluate student self-management skills, with the potential for multiple use cases, both summative and formative in nature.
Patricia Inglese, Research Project Manager, Educational Testing Service
Adam Bacall, Research Assistant, Educational Testing Service
Patrick Barnwell, Supervisor of Research, Educational Testing Service
Sam Rikoon, Associate Research Scientist, Educational Testing Service
PLUS Executive Functioning Assessment
In order to employ direct assessments of executive function (EF) skills at scale, we developed a group-based assessment procedure that is time-efficient and cost-effective. We adapted four developmentally appropriate, widely used EF tasks for administration on tablet computers in a classroom setting. Our classroom protocol allows a minimally disruptive assessment of EF skills in all students at the same time.
Jelena Obradovic, Stanford University
An Incentivized Method for Measuring Grit
This assessment measures grit, i.e., the propensity to set ambitious goals, persevere in the face of failures, and put in effort to build skill. We use an incentivized methodology that rewards successful outcomes.