Section 4: Measuring for Learning

Goal: At all levels, our education system will leverage the power of technology to measure what matters and use assessment data to improve learning.

Measuring learning is a necessary part of every teacher’s work. Teachers need to check for student understanding, and parents, students, and leaders need to know how students are doing overall in order to help them successfully prepare for college and work. In addition to supporting learning across content areas, technology-enabled assessments can help reduce the time, resources, and disruption to learning required for the administration of paper assessments.1 Assessments delivered using technology also can provide a more complete and nuanced picture of student needs, interests, and abilities than can traditional assessments, allowing educators to personalize learning.

Through embedded assessments, educators can see evidence of students’ thinking during the learning process and provide near real-time feedback through learning dashboards so they can take action in the moment.2 Families can be more informed about what and how their children learned during the school day. In the long term, educators, schools, districts, states, and the nation can use the information to support continuous improvement and innovations in learning.

Technology-enabled tools also can support teacher evaluation and coaching. These tools capture video and other evidence of qualities of teaching such as teamwork and collaboration. They provide new avenues for self-reflection, peer reflection and feedback, and supervisor evaluation.

Educators and institutions should be mindful of whether they are measuring what is easy to measure or what is most valuable to measure. Traditional assessments in schools and post-secondary institutions today rely largely on multiple-choice questions and fill-in-the-bubble answers.3 Many assessments also happen after learning has occurred and with results delivered months later, usually after the course has ended. Assessments are more instructionally useful when they afford timely feedback.

Continued advances in technology will expand the use of ongoing, formative, and embedded assessments that are less disruptive and more useful for improving learning. These advances also ensure that all students have the best opportunity to demonstrate their knowledge and skills on statewide assessments that increasingly focus on real-world skills and complex demonstrations of understanding. Statewide assessment—coupled with meaningful accountability—is an essential part of ensuring students have equitable access to high-quality educational experiences. At the same time, it is crucial to focus time and effort on tests worth taking—those that reflect the kind of instructional experiences students need and that provide actionable insight.

As technology gives us the capability to improve on long-standing assessment approaches, our public education system has a responsibility to use the information we collect during assessment in ways that can have the greatest impact on learning. This means using assessments that ask students to demonstrate what they have learned in meaningful ways. And students and parents know there is more to a sound education than picking the right answer on a multiple-choice question or answering an extended-response question outside of the context of students’ daily lives. All learners deserve assessments that better reflect what they know and are able to do with that knowledge.


Approaches to Assessment

Various types of assessments are appropriate for different uses and at different times. Summative assessments measure student knowledge and skills at a specific point in time. Summative assessments often are administered in common to a group of students, whether an entire class, grade level at a school, or grade level across a district. These assessment results can help to determine whether students are meeting standards in a given subject and to evaluate the effectiveness of an instructional curriculum or model.4

Many PK–12 schools administer formal summative tests at the end of the year, which they may augment with interim tests earlier in the year. These assessments provide system-wide data on student achievement as well as data by sub-groups of learners.5 The data can provide valuable insights regarding the achievement and progress of all students, including efforts to promote equitable access to excellent educational opportunities and to narrow achievement gaps.

In contrast, formative assessments are frequent, instructionally embedded checks for understanding that provide quick, continual snapshots of student progress across time. Formative assessments provide information during the instructional process, before summative assessments are administered. Both teachers and students can use the results from formative assessments to determine what actions to take to help promote further learning. These assessments help identify students’ understanding, inform and improve the instructional practice of teachers, and help students track their own learning.6

Optimally, a comprehensive assessment system balances multiple assessment approaches to ensure that students, families, educators, and policymakers have timely and appropriate information to support individual learners and to make good decisions to strengthen educational systems overall.

Using Assessment Data to Support Learning

In almost all aspects of our daily lives, data help us personalize and adapt experiences to our individual needs. However, there is much work remaining to realize the full potential of using assessment data to improve learning. One recent study of teacher perceptions of the use of data revealed a range of frustrations with many current implementations. These frustrations include being overwhelmed with large amounts of data from disparate sources, incompatibility of data systems and tools that make data analysis unnecessarily time-consuming, inconsistency in the level of detail and quality of data, and delays in being able to access data in time to modify instruction.7

Education data systems do not always maximize the use of interoperability standards that would enable easy and secure sharing of information with educators, schools, districts, states, students, and their families. As a result, educators are missing out on significant opportunities to use data to improve and personalize learning. With improved educational data systems, leaders can leverage aggregate data to improve the quality and effectiveness of technology-enabled learning tools and resources.

For example, it is now possible to gather data during formative and summative assessments that can be used to create personalized digital learning experiences. In addition, teachers can use these data to inform interventions and decisions about how to engage individual students; personalize learning; and create more engaging, relevant, and accessible learning experiences for all learners.

Assessment data can be made available directly to students. When they have access to their data, students can play a larger role in choosing their own learning pathways.8 The data also can be made available to family members so students’ advocates can play a more active role in supporting their children’s education. Moreover, data can be used to support teachers’ efforts—individually or in teams, departments, or schools—to improve professional practice and learning.9 For personalized learning systems to reach their full potential, data systems and learning platforms should include seamless interoperability with a focus on data security and issues related to privacy.

In many cases, pre-service teaching candidates do not receive sufficient instruction on understanding and using data. At the same time, in-service teachers can benefit from ongoing professional development on the integration of technology to enhance their teaching. According to the Data Quality Campaign, as of February 2014, just 19 states included the demonstration of data literacy skills as a requirement for teacher licensure.10 Although data from technology-based assessments and data systems hold great potential, they are meaningful only when educators use them effectively. Teachers deserve ongoing support to strengthen their skills in how to use data to meet the needs of students better.

Addressing these challenges will take a three-pronged approach: (1) preparing and supporting educators in realizing the full potential of using assessment data, (2) encouraging the development of data assessment tools that are more intuitive and include visualizations that clearly indicate what the data mean for instruction, and (3) ensuring the security and privacy of student data within these systems.

For a more complete discussion of student data safety and privacy, see Section 5: Infrastructure.

How Technology Transforms Assessment

Technology can help us imagine and redefine assessment in a variety of ways. These tools can provide unobtrusive measurements for learners who are designing and building products, conducting experiments using mobile devices, and manipulating parameters in simulations. Problems can be situated in real-world environments, where students perform tasks, or include multi-stage scenarios that simulate authentic, progressive engagement with the subject matter. Teachers can access information on student progress and learning throughout the school day, which allows them to adapt instruction to personalize learning or intervene to address particular learning shortfalls. The unique attributes of technology-based assessments that enable these activities include the following.

Enable enhanced question types

Technology-based assessments allow for a variety of question types beyond the limited multiple-choice, true-or-false, or fill-in-the-blank options that have characterized traditional assessments. Examples of enhanced question types include the following:

  • Graphic response, which includes any item to which students respond by drawing, moving, arranging, or selecting graphic regions
  • Hot text, in which students select or rearrange sentences or phrases within a passage
  • Equation response, in which students respond by entering an equation
  • Performance-based assessments, in which students perform a series of complex tasks

Technology-enhanced questions allow students to demonstrate more complex thinking and share their understanding of material in a way that was previously difficult to assess using traditional means.

In particular, performance-based assessments are designed so that students must complete a series of complex skills that ask them to synthesize information from multiple sources, analyze that information, and justify their conclusions. For example, a performance task in English language arts might include reading passages from primary documents, analyzing the set of passages, and writing an essay in response to a prompt. In a mathematics class, a performance task might ask students to analyze a graph based on actual data and describe the linear relationship between the quantities. Because performance-based assessments allow students to construct an original response rather than selecting the right answer from a list, they can measure students’ cognitive thinking skills and their ability to apply their knowledge to solve realistic, meaningful problems.11

Using the technology offered in performance-based assessments, students can enter their responses in the online interface. For tasks that require hand scoring, scores can be merged with machine-scored items in the same system, thus providing complete test results. For example, the Partnership for Assessment of Readiness for College and Careers and the Smarter Balanced Assessment Consortium evaluate students’ ability to excel at classroom speaking and listening assignments in addition to more traditional machine-scored prompts.


PISA is a triennial international survey that aims to evaluate education systems worldwide by testing the skills and knowledge of 15-year-old students.

Measure complex competencies

A recent convening of the National Research Council (NRC) underscored the importance of broadening the focus of assessment to include non-cognitive competencies and the importance of technology in measuring knowledge, skills, and abilities.12

As an example, the NRC highlighted the work of the international comparative assessment, Programme for International Student Assessment (PISA). PISA administers a novel technology-based assessment of student performance in creative problem solving designed to measure students’ capacity to respond to non-routine situations to achieve their potential as constructive and reflective citizens. The NRC also highlighted the SimScientists simulation-based curriculum unit and assessments, which are designed to use technology to measure middle school students’ understanding of ecosystems and scientific inquiry.

Similarly, the National Assessment of Educational Progress (NAEP) recently announced plans to expand its testing program to begin to include measures of students’ motivation, mindset, and perseverance in an effort to build the evidence base for more widespread use.

Technology Enables Assessment of Growth Mindset

With funding from the U.S. Department of Education’s Small Business Innovation Research program, Mindset Works developed SchoolKit, an app designed to strengthen academic and social and emotional success. Through animations, assessments, and classroom activities, students learn a growth mindset—the understanding that ability develops with effort. Pilot research in nine middle schools showed significant increases in student growth mindset, which related to increases in learning goals, positive beliefs about effort, and positive academic habits and behaviors (such as resilient responses to failure and better learning strategies).19

These changes also related to increases in students’ grade point averages. Since launching in 2012, SchoolKit has been used by tens of thousands of students around the country, including all middle schools in Washington, D.C. The app is based on Carol Dweck’s research on growth mindsets.20

Provide Real-Time Feedback

Technology-based formative assessments can offer real-time reporting of results, allowing stakeholders to understand students’ strengths and weaknesses, while guiding them to make valid, actionable interpretations of the assessment data. Such assessments can enable educators to see, evaluate, and respond to student work more quickly than can traditional assessments. Similarly, learners and their families can access this information almost in real time. Technology-based summative assessments also facilitate faster turnaround of results.

Some of today’s technology-based assessments also allow for a richer menu of approaches to feedback than do traditional or even first-generation online assessments. Certain formative assessment platforms allow educators to provide feedback to students via in-line comments (through video, audio, or text), engage in online chats, e-mail feedback directly to families and learners, and connect learners to additional resources for practicing specific skills or developing key understandings.

These technologies also can increase the efficiency of the process of giving feedback, allowing educators more time to focus on areas of greatest need. For example, for giving feedback on areas of frequent concern, educators can pre-populate a menu of responses to use as comments, allowing them to shift focus to areas of feedback unique to each student. Automated responses can be generated as well when assignments are late or incomplete. Although automated essay scoring is still a nascent technology, recent advances may make it a more powerful tool for generating timely feedback.
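The comment-bank workflow described above can be sketched in a few lines of code. Everything here — the comment keys, the messages, and the late-submission rule — is an illustrative assumption, not a description of any particular product:

```python
from datetime import date

# Hypothetical comment bank for areas of frequent concern; a teacher
# selects canned feedback by key and adds student-specific notes.
COMMENT_BANK = {
    "thesis": "Your thesis needs to state a clear, arguable claim.",
    "evidence": "Support this point with a quotation or data from the text.",
    "citation": "Remember to cite the source of this statistic.",
}

def build_feedback(codes, custom_note="", due=None, submitted=None):
    """Combine canned comments with unique feedback, and auto-append
    a notice when an assignment is submitted late."""
    lines = [COMMENT_BANK[c] for c in codes if c in COMMENT_BANK]
    if custom_note:
        lines.append(custom_note)
    if due and submitted and submitted > due:
        days = (submitted - due).days
        lines.append(f"Automated notice: submitted {days} day(s) after the due date.")
    return "\n".join(lines)

print(build_feedback(
    ["thesis", "citation"],
    custom_note="Great improvement in your transitions this week.",
    due=date(2016, 3, 1),
    submitted=date(2016, 3, 3),
))
```

The canned responses handle recurring issues quickly, leaving the educator's attention for the custom note that is unique to each student.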

Increase Accessibility

Advances in technology grounded in universal design (UD) and systems that align to universal design for learning (UDL) have made assessments more accessible and valid for a greater number of students, including those with diverse abilities and language capabilities. These advances have allowed a greater proportion of the population access to assessments.

Special features include the ability to increase font sizes and change color contrast, text-to-speech, bilingual dictionaries, glossaries, and more. These features can be embedded in assessments and made available to students, depending on what the assessment is measuring and identified learner needs. Seamless accessibility features embedded in technology-based assessments reduce the need to single out individual students for extra supports, providing an added benefit for students and educators alike.

Similarly, assistive technology, such as text-to-speech, alternate response systems, and refreshable braille, supports students with disabilities in accessing learning. These technologies continue to advance and can make it possible for students to interact with digital learning resources in ways that would be impossible with standard print-based assessments. When both assistive technologies and assessments effectively interoperate, students are better able to demonstrate what they know and how to apply this knowledge.

Adapt to Learner Ability and Knowledge

Computer adaptive testing has facilitated the ability of assessments to estimate accurately what students know and can do across the curriculum in a shorter testing session than would otherwise be necessary. Computer adaptive testing uses algorithms to adjust the difficulty of questions throughout an assessment on the basis of a student’s responses. For example, if the student answers a question correctly, a slightly more challenging item is presented next; if the student answers incorrectly, he or she receives another opportunity to demonstrate knowledge in a different manner.

Because adaptive tests target content and test items aligned with each student’s ability level, the adaptation leads to more precise scores for all students across the achievement continuum in a greatly reduced time period. Achieving the same level of precision in a traditional paper-and-pencil test would require students to answer many more questions, potentially impacting instructional time. Moving forward, these assessments can benefit from increased interoperability so that the data from these adaptive measures can be pulled into a centralized dashboard that allows a more integrated understanding of student performance.
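The adaptive loop described above — harder items after a correct answer, easier items after a miss — can be sketched as follows. The ten-point difficulty scale and the one-step selection rule are illustrative simplifications; operational adaptive engines rely on item response theory rather than a fixed step:

```python
# A minimal sketch of a computer adaptive test over a simple item bank
# where each question has a difficulty on a 1-10 scale. This illustrates
# only the core loop of adjusting difficulty based on responses.

def run_adaptive_test(item_bank, answer_fn, num_items=5, start=5):
    """Select each next item based on whether the last answer was correct."""
    difficulty = start
    history = []
    for _ in range(num_items):
        # Pick the available item closest to the current target difficulty.
        item = min(item_bank, key=lambda q: abs(q["difficulty"] - difficulty))
        item_bank = [q for q in item_bank if q is not item]
        correct = answer_fn(item)
        history.append((item["id"], item["difficulty"], correct))
        # Step difficulty up after a correct answer, down after a miss.
        difficulty = min(10, difficulty + 1) if correct else max(1, difficulty - 1)
    return history

bank = [{"id": i, "difficulty": d} for i, d in enumerate([2, 3, 4, 5, 6, 7, 8, 9])]
# Simulated student who answers correctly on items of difficulty 6 or below.
result = run_adaptive_test(bank, lambda item: item["difficulty"] <= 6)
print(result)
```

Because each item is chosen near the student's demonstrated level, the test concentrates its questions where they are most informative, which is why adaptive tests can reach a precise score with fewer items.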

Embedded With the Learning Process

Embedded assessments are woven directly into the fabric of learning activities students undertake. Such assessments may be technology driven or simply a part of effective instruction, and they may appear in digital learning tools and games. Because they are embedded in regular classroom activities, they are largely invisible to students as assessments. Embedded assessments have the potential to be useful for diagnostic and support purposes in that they provide insights into why students are having difficulties mastering concepts and into how to personalize feedback to address these challenges.13

Game-based assessment is designed to leverage parallels between video game design and next-generation learning and assessment.14, 15 Recent research has focused on promising ways that digital learning can support formative assessment practices16—including wraparound features such as annotation tools and dashboards—and ways that games can support more nuanced conclusions about student learning outcomes.17

Incorporating Student Interests: Games and Assessment

GlassLab creates and supports high-impact games that make learning visible, conducting research and building infrastructure that lowers entry costs for new developers. For example, GlassLab has conducted a number of studies investigating the efficacy of games as a tool for learning and unobtrusive assessment.

Students using GlassLab’s games regularly report that they persist in the face of challenging academic content in the games and that they feel ownership over their learning. SimCityEDU: Pollution Challenge!, one of GlassLab’s digital games, provides educators with the tools and content to engage students in real-world challenges faced by countries globally. The game focuses on the countries’ need to reduce dependence on cheaper, pollution-generating resources such as coal while at the same time growing their economies.

In SimCityEDU: Pollution Challenge!, students play the role of a city mayor faced with a growing pollution problem and a shrinking economy. While learning how economic and environmental issues influence one another, students are assessed on their ability to problem-solve and understand relationships in complex systems. The GlassLab assessment system gathers evidence for students’ problem-solving and systems-thinking skills unobtrusively in the course of students’ gameplay by logging student activities. To support teacher facilitation and enrich teacher-student interactions, the game also includes lesson plans, teacher and student dashboards, and student data reporting.

Embedding Assessment: Understanding Middle School Students’ Knowledge of Physics Concepts

Valerie Shute, the Mack and Effie Campbell Tyner Endowed Professor in Education at Florida State University, is studying the impact of video games on learning, with a focus on building a greater understanding of the future of embedded assessment.

One study of middle school students conducted by Shute and her colleagues focused on the acquisition and embedded assessment of physics concepts by having students play the relatively simple video game Newton’s Playground. Players guide a ball to a balloon across a set of increasingly challenging two-dimensional environments involving the placement and manipulation of ramps, pendulums, levers, and springboards. After taking a traditional pre-test and answering a background questionnaire to assess prior knowledge, students played the game during six class periods—about four hours in total—and concluded their participation by completing a traditional post-test.

Newton’s Playground generates detailed log files as students play, capturing data such as time spent on the level, number of restarts of the level, total number of objects used in a solution attempt, whether the solution ultimately worked, and the trajectory of the ball. Each of these data points provides information that the game uses to make inferences about how well each student is doing in the game and to gauge the student’s current understanding of the physics concepts being taught.
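The kind of log-based inference described above might look like the following sketch. The field names and the mastery heuristic are hypothetical, offered only to show how gameplay data points can feed an estimate of understanding, not how Newton’s Playground actually scores students:

```python
# Hypothetical event log for a physics puzzle game: one record per level
# attempt, capturing time, restarts, objects used, and the outcome.
level_log = [
    {"level": 3, "time_sec": 95, "restarts": 1, "objects_used": 4, "solved": True},
    {"level": 4, "time_sec": 240, "restarts": 5, "objects_used": 11, "solved": True},
    {"level": 5, "time_sec": 300, "restarts": 7, "objects_used": 14, "solved": False},
]

def estimate_mastery(log):
    """Infer a rough 0-1 mastery estimate: solving levels with few restarts
    and economical solutions suggests stronger conceptual understanding."""
    score = 0.0
    for entry in log:
        if entry["solved"]:
            efficiency = 1.0 / (1 + entry["restarts"])
            economy = 1.0 / (1 + entry["objects_used"] / 5)
            score += efficiency * economy
    return round(score / len(log), 2)

print(estimate_mastery(level_log))
```

Even this toy heuristic shows the appeal of embedded assessment: the evidence accumulates as a byproduct of play, with no separate testing event interrupting the learning activity.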

On the basis of analyses of the pre- and post-test data, game log files, and the background questionnaire, Shute and her colleagues demonstrated the following:

  • Students playing the game improved their conceptual physics understanding.
  • Students who were more engaged in playing the game learned more than those who were less engaged.
  • The assessments embedded in the video game could be used to substitute for the traditional assessments commonly used in today’s classrooms.

Shute’s work underscores the potential for embedded assessment to play an increasingly important role in helping students to gain and demonstrate mastery of important knowledge, skills, and abilities.21

Access for Ongoing Learning

Technology provides students with multiple pathways to create assessable work throughout the year. To demonstrate their understanding, students can create multimedia productions, construct websites to organize and analyze information, and design interactive presentations to serve as products for assessment. These pathways allow teachers to understand how students access and understand information across given categories. For students who need individual accommodations, advances in technology allow for dynamic and personalized presentation and assessment using alternative representations of the same concept or skill. For example, alternative text can be provided for images through the work of the Diagram Center to make graphics accessible to learners with print disabilities.

Moving forward, increasingly sophisticated technology-driven assessments will enable more powerful personalized learning, likely accelerating the shift from time-based learning to competency-based learning.

The Future of Technology-Based Assessment

Although the process is often challenging, in many places, transitioning to technology-based assessment is well under way. Such assessments will continue to improve across time in the following ways.

Continuous Improvement of Assessments

Traditional paper-and-pencil tests, and even some first-generation technology-based assessments, usually are reviewed and updated only on a designated schedule, often driven by printing and distribution cycles rather than when test items need to be updated. Online delivery of assessments allows for continuous improvement of test items.

Integrated Learning and Assessment Systems

Technology has the potential to move assessment from disjointed, separate measures of student progress to an integrated system of assessments and personalized instruction that meets the needs of the learner. Technology can more fully integrate students’ classroom experiences, homework assignments, and formative and summative assessments, all of which are tied closely to academic standards. Online learning platforms can display the effects of missing assignments, progress toward goals, and channels for communication with mentors and teachers.

We also should expect to see integrated systems that make the learning process more seamless for students and educators. As students progress along personalized learning pathways, they will be assessed when they are ready to demonstrate mastery over particular skills and content rather than when the calendar indicates there is a testing date. At the same time, we have a responsibility to ensure that all students are held to high standards and offered excellent educational experiences. Ensuring equity while also providing accelerated personalization is one of the greatest challenges and opportunities moving forward for technology in assessment.

Using Data Effectively and Appropriately

To realize the vision of sharing data across student information systems, we need to address several challenges. On the technical front, several student data systems running side by side, disparate data formats, and a lack of interoperability across systems create formidable barriers to developing multi-level assessment systems. Student and program data today are collected at various levels and in various amounts to address different needs in the educational system. State data systems generally provide macro solutions, institution-level performance management systems offer micro solutions, and student data generated by embedded assessments create nano solutions. Providing meaningful, actionable information that is collected across all of these systems will require agreement on the technical format for sharing data while attending to student privacy and security.

To assist with overcoming these challenges, the National Center for Education Statistics at the U.S. Department of Education has been leading the Common Education Data Standards (CEDS) Initiative, a national, collaborative effort to develop voluntary, common data standards. The CEDS Initiative’s objective is to help state and local education agencies and higher education organizations work together to identify a minimal set of key data elements common across organizations and come to agreement on definitions, business rules, and technical specifications to improve the comparability of and ability to share those elements. (Note: Version 5 was released in January 2015.)
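The core idea of a common data standard — agreeing on shared element names so that locally formatted records become comparable — can be illustrated with a small crosswalk. The element and field names below are invented for illustration and are not actual CEDS definitions:

```python
# Hypothetical crosswalks mapping each district's local field names to a
# shared, agreed-upon vocabulary of common data elements.
DISTRICT_A_MAP = {"StuID": "StudentIdentifier", "GrLvl": "GradeLevel", "DOB": "Birthdate"}
DISTRICT_B_MAP = {"student_number": "StudentIdentifier", "grade": "GradeLevel", "birth_date": "Birthdate"}

def to_common(record, field_map):
    """Translate a local record into the shared vocabulary, dropping
    fields that have no agreed-upon common element."""
    return {field_map[k]: v for k, v in record.items() if k in field_map}

a = to_common({"StuID": "A-1001", "GrLvl": "07", "DOB": "2003-04-09", "LocalFlag": "x"}, DISTRICT_A_MAP)
b = to_common({"student_number": "B-2002", "grade": "07", "birth_date": "2003-05-21"}, DISTRICT_B_MAP)
# Both records now share the same keys and can be aggregated or compared.
print(sorted(a) == sorted(b))
```

Agreeing on the target vocabulary is the hard, collaborative part; once it exists, the translation itself is mechanical, which is precisely what makes comparability across organizations feasible.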

For more information on protecting student data and privacy, see Section 5: Infrastructure.

Learning Dashboards That Enable Visualizations

Although systems that support real-time feedback can increase educator and learner understanding of progress toward learning goals, the feedback is even more valuable if it is available in one easily accessible place. To achieve this, we need to connect information about learning that happens across digital tools and platforms.

Learning dashboards integrate information from assessments, learning tools, educator observations, and other sources to provide compelling, comprehensive visual representations of student progress in real time. A learner’s attendance data, feedback from instructors, summative evaluation data, and other useful information all can be made available in formats specific to different stakeholders. Learning dashboards can present this data in easy-to-understand graphic interfaces.

These dashboards also can offer recommendations about resources to help students continue their learning progression as well as help identify students who may be at risk of going off track or even dropping out of school. Across larger education systems, these dashboards can help educators to track learner performance across time as well as monitor groups of students to identify shifts in equity, opportunity, and achievement gaps. Although teacher dashboards are becoming commonplace, student and family dashboards can offer promising opportunities to help students take control of their own learning.
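As a rough sketch of the integration described above, a dashboard might merge attendance, formative scores, and educator observations into one per-student view, with a simple early-warning rule for students at risk of going off track. All source names and thresholds here are illustrative assumptions:

```python
# Hypothetical data from three separate sources a dashboard might integrate.
attendance = {"maria": 0.97, "devon": 0.81}
formative_scores = {"maria": [0.9, 0.85, 0.95], "devon": [0.6, 0.55, 0.5]}
teacher_flags = {"devon": ["missing two assignments"]}

def dashboard_view(student):
    """Merge the separate sources into a single per-student summary."""
    scores = formative_scores.get(student, [])
    avg = sum(scores) / len(scores) if scores else None
    view = {
        "student": student,
        "attendance": attendance.get(student),
        "average_score": avg,
        "flags": list(teacher_flags.get(student, [])),
    }
    # Simple early-warning rule (illustrative thresholds only).
    if (view["attendance"] or 0) < 0.85 or (avg or 0) < 0.6:
        view["flags"].append("at risk: review with student and family")
    return view

print(dashboard_view("devon"))
```

The value of the dashboard lies less in any single number than in the juxtaposition: attendance, performance, and educator judgment appear side by side, so patterns that would be invisible in any one system become actionable.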

Putting Learning on Display: Summit Public Schools’ Student Dashboards Personalize Learning

Each morning, students at Summit Public Schools connect to their Personalized Learning Plans by using their devices. Here, students find both their short-term and long-term project views, the materials they need to complete their projects, and just-in-time formative feedback to improve their individual learning, all in one location. Using a color-coded system, each project is linked explicitly with the associated content knowledge standards, and students can see the progress they have made toward those standards as well as areas in which they need more practice.

This automated feedback and work management system gives students easy access and greater control over their learning and frees educators to spend more time teaching and less time on administrative and organizational tasks. “It was really difficult to track where my students were on their progress towards meeting a learning objective and giving them timely feedback,” says Elizabeth Doggett, a teacher at Summit Public Schools. “Often I would take student work home over the weekend, but by the time I got through giving them all feedback, it would be too late for them to make meaningful changes.”22

With the Personalized Learning Plan system, students have the formative feedback they need in real time, and their educators, such as Doggett, are able to plan and execute differentiated instruction more efficiently and effectively so that all of her students can succeed. Students also are benefiting individually from the student-facing side of the Personalized Learning Plan. Educators have taken notice of how these plans promote student agency and motivation. “Students should be able to access what they need at the moment they need it, and we provide the resources so that they can do that,” says Jon Deane, the former chief information officer of Summit Public Schools.23

Doggett sums up the effect of implementing the Personalized Learning Plan, saying, “It makes the students’ lives so much easier. It makes me a better teacher, and it makes them more successful students.”24

Set of Shared Skill Standards

As we shift toward personalized learning, there is increased need for a shared set of common skill standards. The development of micro-credentials is one approach to address this need by creating a shared language and system for communicating success in developing these competencies.

Micro-credentials, often referred to as badges, focus on mastery of a single competency and are more focused and granular than diplomas, degrees, or certificates. The earning and awarding of micro-credentials typically is supported by a technology-based system that enables students and evaluators to be located anywhere and these activities to take place at any time. Micro-credentials also make evidence of mastery portable. Information about the student’s work that earned a badge can be embedded in its metadata, as can the standards the work reflects and information about the awarder of the badge. As with other data systems, a key goal for the next generation of micro-credentialing platforms is interoperability with other educational information systems.18
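To make the idea of portable badge metadata concrete, the sketch below builds a badge assertion shaped after the Open Badges vocabulary (one widely used micro-credential format), as a plain Python dictionary. The field names follow that specification, but the issuer, learner, evidence, and standard shown are all invented for illustration.

```python
import json

# A hypothetical badge assertion. The keys ("badge", "recipient",
# "evidence", "alignment", ...) follow the Open Badges vocabulary;
# every value below is invented for illustration.
assertion = {
    "@context": "https://w3id.org/openbadges/v2",
    "type": "Assertion",
    "recipient": {                # who earned the badge
        "type": "email",
        "hashed": False,
        "identity": "learner@example.org",
    },
    "badge": {                    # what was earned
        "type": "BadgeClass",
        "name": "Digital Citizenship: Evaluating Online Sources",
        "criteria": {"narrative": "Completed three source-evaluation quests."},
        "issuer": {               # who awarded it
            "type": "Issuer",
            "name": "Example Transfer School",
            "url": "https://example.org",
        },
        # the standards the student's work reflects
        "alignment": [{
            "targetName": "Example Digital Literacy Standard 2.1",
            "targetUrl": "https://example.org/standards/2.1",
        }],
    },
    "evidence": [{                # the student work behind the badge
        "type": "Evidence",
        "narrative": "Annotated bibliography comparing five sources.",
    }],
    "issuedOn": "2016-01-15T00:00:00Z",
    "verification": {"type": "hosted"},
}

# Because the assertion serializes to plain JSON, any compliant
# platform can read the evidence and the standards alignment,
# which is the interoperability goal described above.
serialized = json.dumps(assertion, indent=2)
print(serialized.splitlines()[0])
```

Because the evidence, the standards alignment, and the issuer all travel inside the badge itself, a receiving system needs no access to the awarding school's internal records to interpret it.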

Recognizing Digital Literacy Skills: Assigning Micro-Credentials

LearningTimes, in partnership with the New York City Department of Education’s Office of Postsecondary Readiness, has developed DIG/IT, a digital learning course that introduces students in transfer schools (second-chance high schools) to digital literacy skills while they develop their plans for college, careers, and life after high school. DIG/IT is an open standards-based system designed specifically for badge-empowered social learning that uses challenge-based quests and badges to recognize competencies and positive behaviors in four areas: digital citizenship; college and career explorations; financial literacy; and arts, culture, and games. At the end of the course, students design a learning experience for a family member or another important person in their lives.

Upon completing a series of related quests, students earn badges acknowledging tangible new skills they have acquired. They also earn reward badges for contributions to the online and classroom community. As they gather enough rewards, they “level up” and continue to earn rewards for participating in the community and for helping others.

DIG/IT is currently in use in 36 New York City transfer schools. The initial pilot has had promising results, including positive teacher and student feedback and reportedly higher levels of student engagement in school. Student attendance in the DIG/IT-based course has been higher than in courses not using the approach. The DIG/IT program will be rolled out to approximately 50 transfer schools over the next two years, reaching more than 5,000 students.

Since DIG/IT’s development, LearningTimes has spun off Credly to focus on earning, managing, and analyzing digital credentials and badges in an open and portable way. Credly hosts more than 6,000 organizations and their respective micro-credential initiatives. BadgeOS, the open source environment for setting up progressive credentialing programs, has been installed more than 30,000 times by organizations around the world and supports millions of learners.

Educators also can benefit from earning micro-credentials because they can gain recognition for discrete new skills they learn throughout their careers. The nonprofit Digital Promise has developed an educator micro-credentialing system, noting that educator micro-credentials can identify, capture, recognize, and share the practices of our best educators. Proponents view micro-credentials as a promising emerging professional development strategy.


Revise practices, policies, and regulations to ensure privacy and information protection while enabling a model of assessment that includes ongoing gathering and sharing of data for continuous improvement of learning and teaching. This will require not only stronger systems interoperability standards but also increased capacity on the part of educators and administrators to understand the types of systems they want to establish within schools and colleges and the interoperability standards they should demand from vendors. This increased capacity should ensure that educational leaders firmly understand privacy and security concerns, know how those concerns are addressed within the school or system, and communicate policies and procedures clearly to all stakeholders. Achieving this recommendation would benefit from the involvement and guidance of organizations, such as CoSN, ISTE, and the State Educational Technology Directors Association (SETDA), that have developed specialized expertise in these areas.

States, districts, and others should design, develop, and implement learning dashboards, response systems, and communication pathways that give students, educators, families, and other stakeholders timely and actionable feedback about student learning to improve achievement and instructional practices. The next generation of such tools should integrate seamlessly across platforms and tools, be designed with a mobile-first mindset, and be guided by Universal Design (UD) and Universal Design for Learning (UDL) principles to ensure accessibility by all stakeholders. Although current products and dashboards include basic functionality and features that improve on those of their predecessors, future iterations should be built on a premise of feedback and conversation, allowing learners and families to discuss learning outcomes and evidence and increasing agency and ownership across stakeholder groups.

Create and validate an integrated system for designing and implementing valid, reliable, and cost-effective assessments of complex aspects of 21st-century expertise and competencies across academic disciplines. Interoperable formative assessment formats offered by major testing consortia for use by educators throughout the year are an important first step. However, work remains to ensure more educators have access to high-quality formative assessment tools and to develop additional capacities to better assess both cognitive and non-cognitive skills. Moving forward, increasing educator capacity for the design and deployment of valid and reliable formative assessments will require the concerted efforts of current assessment developers, teacher preparation programs, school systems, and researchers. Furthermore, colleges and universities will benefit from system-wide reviews of assessment practices and from ensuring all faculty have a deep understanding of key principles and practices surrounding the design and implementation of effective learning assessments.

Conduct research and development that explores how embedded assessment technologies, such as simulations, collaboration environments, virtual worlds, games, and cognitive tutors, can be used to engage and motivate learners while assessing complex skills. Although some of this research is in its early stages, the way forward will require close collaboration among organizations with a deep understanding of how game mechanics increase learner motivation, such as GlassLab, Games for Change, and iCivics, as well as colleges, universities, informal learning spaces, schools, philanthropic organizations, and research institutions. This collaboration can increase the likelihood that effective and engaging experiences will be built to support learning.

  1. Gohl, E. M., Gohl, D., & Wolf, M. A. (2009). Assessments and technology: A powerful combination for improving teaching and learning. In L. M. Pinkus (Ed.), Meaningful measurement: The role of assessments in improving high school education in the twenty-first century (pp. 183–197). Washington, DC: Alliance for Excellent Education.
  2. Reeves, D. (2007). Ahead of the curve: The power of assessment to transform teaching and learning. Bloomington, IN: Solution Tree Press.
  3. U.S. Department of Education. (2010). Beyond the bubble tests: The next generation of assessments—Secretary Arne Duncan’s remarks to state leaders at Achieve’s American Diploma Project leadership team meeting.
  4. Chappuis, J., Chappuis, S., & Stiggins, R. (2009). Formative assessment and assessment for learning. In L. M. Pinkus (Ed.), Meaningful measurement: The role of assessments in improving high school education in the twenty-first century (pp. 55–77). Washington, DC: Alliance for Excellent Education.
  5. Chappuis, S., & Chappuis, J. (2008). The best value in formative assessment. Educational Leadership, 65(4), 14–19.
  6. Stiggins, R., & DuFour, R. (2009). Maximizing the power of formative assessments. Phi Delta Kappan, 90(9), 640–644.
  7. Bill & Melinda Gates Foundation. (2015). Teachers know best: Making data work for teachers and students.
  8. Darling-Hammond, L. (2010). Teacher education and the American future. Journal of Teacher Education, 61(1-2), 35–47.
  9. Data Quality Campaign. (2014). Data for action 2014.
  10. Data Quality Campaign. (2014). Teacher data literacy: It’s about time.
  11. Darling-Hammond, L., & Adamson, F. (2010). Beyond basic skills: The role of performance assessment in achieving 21st century standards of learning. Stanford, CA: Stanford Center for Opportunity Policy in Education.
  12. Pellegrino, J. W., & Hilton, M. L. (Eds.). (2012). Education for life and work: Developing transferable knowledge and skills in the 21st century. Washington, DC: National Research Council of the National Academies.
  13. Shute, V. J., Ventura, M., & Kim, Y. J. (2013). Assessment and learning of qualitative physics in Newton’s Playground. The Journal of Educational Research, 106(6), 423–430.
  14. Gee, J. P. (2003). What video games have to teach us about learning and literacy. Computers in Entertainment, 1(1), 20–20.
  15. Toppo, G. (2015). The game believes in you: How digital play can make our kids smarter. New York, NY: Palgrave Macmillan Trade.
  16. Fishman, B., Riconscente, M., Snider, R., Tsai, T., & Plass, J. (2014). Empowering educators: Supporting student progress in the classroom with digital games. Ann Arbor, MI: University of Michigan.
  17. Owen, V. E., Ramirez, D., Salmon, A., & Halverson, R. (2014, April). Capturing learner trajectories in educational games through ADAGE (Assessment Data Aggregator for Game Environments): A click-stream data framework for assessment of learning in play. Presentation given at the annual meeting of the American Educational Research Association, Philadelphia, PA.
  18. HASTAC. (2014). Open badges case study.
  19. Mindset Works. (2010). Brainology: Transforming students’ motivation to learn.
  20. Dweck, C., & Rule, M. (2013, September). Mindsets: Helping students to fulfill their potential. Presentation given at the Smith College Lecture Series, Northampton, MA.
  21. Shute, V. J., Ventura, M., & Kim, Y. J. (2013). Assessment and learning of qualitative physics in Newton’s Playground. The Journal of Educational Research, 106(6), 423–430.
  22. Bill & Melinda Gates Foundation. (2015). Reaching the summit of data-driven instruction.
  23. Ibid.
  24. Ibid.
