Design Challenge Winner: Failing Forward: How a Flawed Assessment Inspired a Bold and Innovative New Approach

September 12, 2018

By: Nicole Ramos, M.Ed. (NERDi Consulting) and Stacie R. Furia, Ph.D. (NERDi Consulting)


BUILD is a non-profit organization that uses entrepreneurship and experiential education to propel students through high school to college and career success. BUILD believes that its outstanding success in helping low-income youth of color enroll and persist in college is due to mentorship, entrepreneurship, and the development of Spark Skills* (a set of six skills educators might find familiar: Communication, Collaboration, Problem Solving, Innovation, Grit, and Self-Management).

BUILD set out to create an assessment to help prove and improve its impact on students’ Spark Skills growth. Like many of BUILD’s peers, our first attempt was a pre-test/post-test self-perception questionnaire, with items largely adapted from existing validated instruments. The result: a flop. After two years of tweaking the tool and its implementation, we learned what Angela Duckworth and David Yeager were learning at the same time: along with implementation difficulties, perception questionnaires have serious limitations.

Tapping into our own entrepreneurial spirit, we learned a lot from this “failure” and used those lessons to craft a new approach. Here are three lessons learned and how they helped us create an assessment that won 4th place in the Assessment Work Group’s 2018 Design Challenge.

Lesson 1: Assessment methodology should match pedagogy

BUILD’s curriculum and pedagogy are rooted in experiential learning. It is no wonder that sitting down for 30+ minutes to answer a bevy of Likert-scale questions felt odd to students! In our new assessment design, we sought to align the assessment methodology with the curriculum.

Lesson 2: Reliability and validity are only as good as meaningful participation

In our first version, we spent a great deal of effort ensuring strong construct validity and face validity, and we were obsessive about reliability as well. But we failed to pay equal attention to getting buy-in from teachers and students. The lack of student engagement with the assessment eliminated any chance of real reliability. In our next attempt, we sought input from practitioners and spent more time on in-person training.

Lesson 3: Don’t put all of the eggs in one basket

It was painful to have spent two years of hard, high-quality work on an assessment that ultimately did not meet our standards. Moving forward, we designed and piloted several assessments, each on a much smaller scale. This increased our chances of finding the tools best suited to BUILD’s needs, or rather, it demonstrated the importance of having a portfolio of assessments that gives a holistic picture of students’ Spark Skills attainment.

When you work with an entrepreneurial program, where failure = opportunity, there is only one way to deal with disappointment: learn from it. We hope this mantra inspires the field, and this working group specifically, to continue to look critically at assessment work and try, try again. Collectively, as we share what we learn, not only about what is working but about what is NOT working, we can advance much more quickly to develop effective assessment and measurement tools.

Have you had any assessments that didn’t work out as planned, or didn’t produce useful data? What did or could you learn from these missteps? What are/were some lessons you can take away as you move forward in your assessment journey?


Disclaimer: The Assessment Work Group is committed to enabling a rich dialogue on key issues in the field and seeking out diverse perspectives. The views and opinions expressed in this blog are those of the authors and do not necessarily reflect the official policy or position of the Assessment Work Group, CASEL or any of the organizations involved with the work group.



Comments

  1. Ryan says

    September 13, 2018 at 4:17 pm

    Excellent article! Very informative and speaks volumes to the real possibility that the PROCESS of evaluation can be meaningful to everybody that is involved. Also, thank you for being so open about your experience with failing, and then learning from it.
