Social and Emotional Skills Stealth Assessment
By: Kate E. Walton, John Whitmer, & Jeremy Burrus – ACT, Inc.
Can we measure social and emotional (SE) skills without using an assessment? Students’ SE skills are reflected every day in real-world behaviors, such as their daily interactions with online learning environments. Can these behaviors be analyzed to provide scores and feedback on students’ SE skills? Here we discuss the results of an exploratory study demonstrating that this may indeed be possible.
First, we discuss some of the challenges associated with assessing SE skills. We then propose linking learning analytics research and SE skill assessment. Next, we describe a study we conducted using learning analytics methods to examine associations between SE skills, behaviors recorded in an online learning environment, and course grades. The results of this study point to an innovative approach for SE skill assessment. We conclude with recommendations for assessment of SE skills and personalized early SE skill interventions.
Challenges of Assessing SE Skills
There are methodological and logistical challenges associated with assessing SE skills, two of which are detailed below.
- The use of Likert items: Many SE skill assessments rely on Likert-type self-report items, which are subject to various response biases. For example, people may agree with all statements regardless of item wording, they may simply not pay attention when responding, or they may answer in a way that makes them look better or more “socially desirable” (Wetzel, Böhnke, & Brown, 2016). These issues and others lead to skepticism about the validity of many SE skill assessments.
- Too much testing time: In terms of logistical challenges, an overwhelming majority (81%) of teachers feel as though their students spend too much time taking mandated tests (Rentner, Kober, & Frizzell, 2016), and many educators hear complaints from students regarding the amount of time they spend preparing for and taking tests.
However, a sea change is occurring in education: social and emotional learning is becoming increasingly prominent in the classroom. Stakeholders may recognize the importance of assessing and teaching SE skills, but assessing them with traditional instruments means still more testing time for students and teachers.
Therein lies the rub.
Teachers cite limited time as a barrier to implementing social and emotional learning, and the limited time is linked to increased standardized testing (Humphries, Williams, & May, 2018). In short, assessing SE skills may get in the way of teaching SE skills. Perhaps there is an alternative. Is it possible to assess students’ SE skills without relying on standard assessment instruments or requiring additional time and testing? One answer may lie in a new cross-disciplinary area known as learning analytics.
Learning Analytics and SE Skills
Researchers in learning analytics have developed methods to analyze data collected “in the wild” through student interactions with educational technologies (e.g., a student opened the syllabus before the first course session, a student frequently participated in the discussion forum) to gain insights into student learning and ultimately improve learning outcomes (Siemens & Long, 2011). These methods have been used to create highly accurate predictive models of student performance. Less well understood, however, are the student characteristics that underlie these predictive models. That is, we can use learning analytics to accurately predict outcomes, but we do not know precisely which student characteristics explain the predictions. This problem has been called the “clicks to constructs” issue (Lang, Siemens, Wise, & Gašević, 2017), and learning analytics researchers have argued that solving it is necessary if predictive models are to inform interventions and ultimately improve student success (Krumm et al., 2016). Could learning analytics methods be joined with SE skill assessment to provide insights into these skills? In other words, could combining the two not only answer the question of what students will do, but also why they do it?
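To make the “clicks” side of this concrete, here is a minimal sketch, in Python, of how a learning analytics pipeline typically turns raw LMS event logs into a predictive model of a course outcome. The file names, column names, and model choice are illustrative assumptions, not a description of any particular system.

```python
# Minimal sketch of a learning analytics pipeline: LMS event logs -> features
# -> predictive model of course outcomes. All file and column names assumed.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Raw clickstream: one row per LMS event.
events = pd.read_csv("lms_events.csv")      # student_id, event_type, timestamp
outcomes = pd.read_csv("outcomes.csv")      # student_id, passed (0 or 1)

# "Clicks": aggregate each student's activity into per-event-type counts,
# e.g., assignment submissions, discussion-board visits, gradebook checks.
features = events.pivot_table(index="student_id", columns="event_type",
                              aggfunc="size", fill_value=0)

data = features.join(outcomes.set_index("student_id"), how="inner")
X, y = data.drop(columns="passed"), data["passed"]

# A simple, interpretable classifier; its coefficients show which behaviors
# carry predictive signal, but not the constructs behind those behaviors.
model = LogisticRegression(max_iter=1000)
print(cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean())
```

A model like this can flag at-risk students with considerable accuracy, but its inputs are bare event counts; nothing in the model itself names the constructs, such as grit or organization, that drive those behaviors. That is the clicks-to-constructs gap.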
Our Research Examining Links between SE Skills, Online Study Behaviors, and Grades
In partnership with the University of Maryland, Baltimore County and Blackboard, we carried out a study designed to answer these questions (Whitmer et al., 2019). At the start of the semester, college students (N = 905) in five courses using Blackboard, a learning management system (LMS), took the college version of ACT® Tessera®, an SE skills assessment. Their activities in the LMS were tracked and synthesized throughout the semester. These activities included completing assignments, reviewing exam materials, visiting the discussion board, and reviewing announcements, emails, and the course gradebook. At the conclusion of the course, we examined the associations between students’ SE skills, LMS activities, and course grades. We found that student behaviors and grades were related to their SE skills as measured by ACT Tessera.
Grit, for example, was associated with greater LMS activity and better course grades (see the figure below). Moreover, we could accurately predict not only course grades but also SE skills from students’ LMS activity by the second week of the semester. Whereas the accuracy of the course grade predictions increased as the course progressed, the accuracy of the SE skill predictions remained fairly constant. In short, we demonstrated that, with some degree of accuracy, we can estimate students’ SE skills from their online study behaviors, without administering a standard assessment, and we can do so early in the semester.
[Figure: grit scores in relation to LMS activity and course grades]
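To illustrate the stealth-assessment idea, the hypothetical sketch below restricts the event log to the first two weeks of the semester and asks how well those early behaviors recover a self-reported grit score. The data files, column names, score format, and model are assumptions made for illustration; Whitmer et al. (2019) describe the study’s actual analyses.

```python
# Hypothetical sketch: predict a self-reported grit score from LMS activity
# in the first two weeks only. File and column names are assumptions.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

events = pd.read_csv("lms_events.csv", parse_dates=["timestamp"])
tessera = pd.read_csv("tessera_scores.csv")   # student_id, grit (assumed)

# Keep only events from the first two weeks of the semester.
start = events["timestamp"].min()
early = events[events["timestamp"] < start + pd.Timedelta(weeks=2)]

# Same "clicks" featurization as before: per-student event-type counts.
X = early.pivot_table(index="student_id", columns="event_type",
                      aggfunc="size", fill_value=0)
y = tessera.set_index("student_id").reindex(X.index)["grit"].dropna()
X = X.loc[y.index]

# Cross-validated R^2: how much variance in self-reported grit the
# early online behaviors can recover.
model = RandomForestRegressor(n_estimators=200, random_state=0)
print(cross_val_score(model, X, y, cv=5, scoring="r2").mean())
```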
Implications for SE Skill Assessment and Interventions
We started this blog by asking the question, “Can we measure social and emotional skills without using an assessment?” The initial answer is a cautious “yes.” Although this is a single study of college students, it offers insights into how we can assess SE skills in innovative ways, specifically by applying learning analytics methods to educational technologies. Technology usage is increasing throughout K-12 and higher education, with many students engaging in full-time online or blended learning. Even students who learn exclusively face-to-face frequently use educational technologies. The digital footprints they leave behind provide a wealth of data that can yield insights into their SE skills.
What’s more, these analyses could be used to create personalized interventions delivered through the same technologies. For example, if early in the year it is estimated that a student could benefit from stronger organizational skills, the LMS or other technologies could prompt the student with reminders to complete homework, set aside time for studying, organize study materials, and so on.
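As one illustration of how such an intervention layer might work, the rule-based sketch below maps low estimated skill scores to supportive prompts. The thresholds, score scale, and message wording are invented for the example; in practice, rules like these would be tuned and validated against outcomes.

```python
# Illustrative rule only: how an LMS might trigger personalized nudges from
# estimated SE skill scores. Thresholds and messages are invented.
from dataclasses import dataclass

@dataclass
class SkillEstimate:
    student_id: str
    organization: float   # estimated score, assumed 0-100 scale
    grit: float

def nudges_for(est: SkillEstimate) -> list[str]:
    """Map low estimated skills to supportive prompts."""
    prompts = []
    if est.organization < 40:
        prompts.append("Reminder: list this week's assignments and due dates.")
        prompts.append("Tip: block out two study sessions on your calendar.")
    if est.grit < 40:
        prompts.append("Keep going: revisit the practice problems you skipped.")
    return prompts

print(nudges_for(SkillEstimate("s123", organization=35.0, grit=62.0)))
```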
In this new decade, the focus on SE skill development will continue to increase. We ought to be thinking beyond traditional means of assessing and teaching these skills, and advanced learning analytics techniques offer a wealth of opportunity.
References
Humphries, M. L., Williams, B. V., & May, T. (2018). Early childhood teachers’ perspectives on social-emotional competence and learning in urban classrooms. Journal of Applied School Psychology, 34, 157-179.
Krumm, A. E., Beattie, R., Takahashi, S., D’Angelo, C., Feng, M., & Cheng, B. (2016). Practical measurement and productive persistence: Strategies for using digital learning system data to drive improvement. Journal of Learning Analytics, 3, 116-138.
Lang, C., Siemens, G., Wise, A., & Gašević, D. (2017). The handbook of learning analytics. Society for Learning Analytics Research.
Rentner, D. S., Kober, N., & Frizzell, M. (2016). Listen to us: Teacher views and voices. Washington, DC: Center on Education Policy.
Siemens, G., & Long, P. (2011). Penetrating the fog: Analytics in learning and education. EDUCAUSE Review, 46(5), 30-40.
Wetzel, E., Böhnke, J. R., & Brown, A. (2016). Response biases. In F. T. L. Leong & D. Iliescu (Eds.), The ITC international handbook of testing and assessment (pp. 349-363). New York, NY: Oxford University Press.
Whitmer, J., San Pedro, S., Liu, R., Walton, K. E., Moore, J. L., & Andrade Lotero, A. (2019). The constructs behind the clicks (Report No. 2019-6). Iowa City, IA: ACT, Inc.
Disclaimer: The Assessment Work Group is committed to enabling a rich dialogue on key issues in the field and seeking out diverse perspectives. The views and opinions expressed in this blog are those of the authors and do not necessarily reflect the official policy or position of the Assessment Work Group, CASEL or any of the organizations involved with the work group.