Introduction

Extracted Annotations (2020-10-14, 7:41:31 a.m.)

Assessment of Twenty‐First‐Century Skills (p. 22)

Such complex competencies are sometimes called "twenty‐first‐century skills" and include cognitive skills such as problem‐solving, systems thinking, and argumentation skills; intrapersonal skills such as self‐regulation, adaptability, and persistence; as well as interpersonal skills such as collaboration skills, leadership skills, and conflict resolution skills. (p. 23)

As a result, the kinds of intellectual and creative tasks that effective citizens need to be able to solve nowadays with digital tools are often qualitatively different in important ways from the tasks of the past. Consequently, considerations of smart assessment design, delivery, scoring, and reporting have become much more complex. (p. 23)

Since the tasks that we create for learners are lenses that allow us to learn particular things about them and tell evidence‐based stories about them, we are nowadays confronted with the reality that these stories have become more complex rather than less complex. (p. 23)

Methodologies for Innovative Assessment (p. 23)

In the most innovative and immersive digital environments the nature of the data that are being collected for assessment purposes has also become much more complex. (p. 23)

Integrating disciplinary traditions. (p. 24)

Integrating statistical modeling approaches. (p. 24)

Put differently, it has helped many quantitative methodologists to re‐appreciate the fact that any model, no matter how elegantly it is specified or estimated, is, in the end, just a technological tool. Like any tool, it can be used very thoughtfully as a "healthy connective tissue" for evidence, or rather inappropriately, leading to serious evidentiary "injuries." (p. 25)

Integrating assessment design and validity argumentation. (p. 25)

From a validity perspective, which is foundational for all educational assessment arguments, the constellation of design choices within an assessment life cycle has to be based on sound scientific reasoning and has to cohere rhetorically to provide added value to key stakeholders. This typically means that the information provided by such assessments should offer real insight into learning, performance, and the various factors that affect them. (p. 25)

As such, smart assessment design considers the system into which the assessment is embedded just as much as the tool itself. (p. 25)

Instead, assessment information needs to be interpreted, actions need to be taken, experiences need to be shaped, and new information needs to be collected in an ever‐continuing cycle of learning, assessment, and development. (p. 25)

This perspective critically affects the societal reverberations that assessments can have when serving underrepresented or disadvantaged groups in order to improve the life experiences of all learners across the societal spectrum and lifespan. (p. 25)

However, ensuring that assessment arguments are thoughtful, differentiated, and responsible in light of the societal missions of assessment is important, especially in interdisciplinary teams that are charged with various critical design decisions throughout the assessment life cycle. It will help these teams stay motivated to keep track of controversial design decisions, limitations of assessment inferences, and critical assumptions. In short, it will help them make sure they know what evidence they already have and what evidence still needs to be collected in order to support responsible interpretation and decision making. (p. 25)

Integrating professional development and practical workflows. (p. 25)

There is still a notable gap in how universities train graduate students at the Master's or PhD level in the practices of assessment design, deployment, and use. (p. 25)

These mindsets require time to cultivate, and established processes require time to change, which is again why frameworks like assessment engineering and evidence‐centered design can be so powerful for engendering best practices. (p. 26)