The future of assessment - five principles and five targets for 2025

Commentary - How Assessment is Changing in The Digital Age—Five Guiding Principles. (2020, August 20). [Blog]. Contact North.

Extracted Annotations (2020-08-29, 8:34:59 p.m.)

Assessment is crucial to the educational process. Done properly, it drives improvement, shapes learner behaviour and provides accountability to employers and others. (p. 6)

It can also be a source of dissatisfaction, frustration and anxiety. Does it assess the right things? Is it getting the best from learners? Does it take place at the right points in the learning journey? Is it susceptible to cheating? Does it involve a sustainable workload? (p. 6)

The foundational measures of validity and reliability are only part of the story of assessment, especially when set against these questions. What faculty workload is required to develop valid and reliable measurements and items in relation to learning outcomes? If that workload is too high, can we trust publishers to do the work? How much of their assessment and measurement work should faculty outsource?

We argue that universities and colleges could use technology to transform assessment by making it more:

Authentic: preparing the learner for what they are going to do next, meeting employer needs and testing knowledge and skills in a more realistic, contextualised and motivating way.

Accessible: designed throughout to be usable by everyone to the greatest possible extent, including those who have a long-term disability, a short-term injury or a mental health challenge.

Appropriately automated: easing teachers' marking and feedback workload, and providing quicker, more detailed and more actionable feedback for students.

Continuous: rich in practice opportunities and reflecting the fact that students today need to be capable of lifelong learning, adapting to changes in the world of work and across their lives rather than succeeding at one high-stakes, high-stress exam.

Secure: ensuring that the right student is taking the right assessment and that the work they submit is their own and abides by the rules. (p. 7)

Organisational culture and readiness are key for the kind of transformational programmes required to change assessment. (p. 7)

However, in the Jisc digital experience insights survey 2019, only 34% of HE teaching staff and 36% of FE teaching staff said they were offered regular opportunities to develop their digital skills, and only 13% and 15% respectively were given time and support to innovate2. (p. 7)

FE - Further education (note on p.7)

Some European universities are using technologies far more than their UK counterparts3 and are sending very strong messages about constructive alignment: if you digitise teaching and learning, you need to digitise assessment and vice versa. Digital technologies offer the possibility of making assessment more authentic and less burdensome for all involved. (p. 7)

3 Newcastle University is understood to be one of the institutions conducting the most digital exams of any UK HEI, with about 10% of exams (around 170 exams involving 15,000 students) being digital in 2018. In the Netherlands and Norway a number of institutions are close to exams being 100% digital. (p. 7)

Authentic assessment lets the learner express themselves in ways which feel natural to them and prepares them for what they are going to do next. Technology offers opportunities to test knowledge and skills in a more realistic and motivating way than pen and paper tests, which can appear irrelevant outside the academic world. More authentic assessment also encourages the learner to integrate knowledge and skills, and act on knowledge. It develops deeper, more integrative personal learning and knowing. (p. 9)

Principle one: Authentic assessment (p. 9)

Crucially, assessment design encompasses the process as well as the end-product, giving students experience of working in teams and in ways they are likely to meet in industry (p. 9)

How is technology being used? (p. 10)

Peer and self-assessment

PeerWise (https://peerwise.cs.auckland.ac.nz) is a free tool used by many universities to allow students to create questions for formative peer testing. Recent research5 shows that while writing questions improves student performance, the gain is even greater when students also comment on questions posted by their peers. (p. 10)

5 Duret, D., Christley, R., Denny, P., and Senior, A. (2018) Collaborative learning with PeerWise. Research in Learning Technology, 26. https://doi.org/10.25304/rlt.v26.1979 (p. 10)

Principle two: Accessible assessment (p. 12)

Accessibility needs to be viewed in the very broadest sense and designed into assessment practice from the outset. In practice, many institutions are still at the stage of 'retrofitting' accessibility by taking a non-accessible assessment and providing adaptations to meet the needs of students with specific disabilities. However, any student with a sports injury or a short-term illness affecting their vision or hearing may have exactly the same needs as a learner with a longer-term disability (p. 12)

How is technology being used? (p. 12)

Principle three: Appropriately automated assessment (p. 14)

Designing, marking and feeding back all take up a large amount of teachers' time, which can mean they are unable to deliver results quickly or provide the detailed feedback that students want. The human element of feedback is valued by teachers and students alike, but rising student numbers are adding to marking workloads, and according to the National Student Survey students express more dissatisfaction with assessment and feedback (ucl.ac.uk/teaching-learning/case-studies/2018/apr/how-we-raised-our-nss-feedback-and-assessment-scores-26-three-years) than with any other aspect of their learning experience. (p. 14)

Technology can certainly help by automating some aspects of assessment such as marking and feedback. And it can do this in a way that improves the student experience and frees teacher time and expertise for the most valuable human activities, ensuring assessment is for learning and not simply of learning. (p. 14)

Case study: exploring natural language processing at Bolton College (p. 15)

How is technology being used? (p. 15)

A clear concern is the extent to which technology might reduce student/academic staff interaction, and to what extent human judgement could or should be replaced entirely - hence the need for 'appropriately' automated assessment. (p. 16)

Adaptive comparative judgement

Comparative judgement works on the principle that the human brain finds it easier to compare two items and decide that one is better or worse than the other than to make an objective assessment of the quality of either against a given rubric. (p. 16)
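A minimal sketch of how pairwise judgements of this kind can be turned into a rank order. The Elo-style update, the k factor and the sample judgements below are illustrative assumptions; real ACJ tools use more sophisticated models (for example, Bradley-Terry fitting) and adaptively choose which pairs to present to judges.

```python
# Illustrative only: an Elo-style rating update applied to pairwise
# comparative judgements of student scripts. Assumed, not taken from
# any actual ACJ product.

def update(scores, winner, loser, k=32):
    """Nudge two scripts' scores after one judge picks a winner."""
    expected = 1 / (1 + 10 ** ((scores[loser] - scores[winner]) / 400))
    scores[winner] += k * (1 - expected)
    scores[loser] -= k * (1 - expected)

# Hypothetical judgements: each tuple is (better script, worse script)
judgements = [("A", "B"), ("A", "C"), ("B", "C"), ("A", "B"), ("C", "B")]

scores = {"A": 1000.0, "B": 1000.0, "C": 1000.0}
for winner, loser in judgements:
    update(scores, winner, loser)

ranking = sorted(scores, key=scores.get, reverse=True)
print(ranking)  # "A" ranks first, having won all of its comparisons
```

Because each judgement only asks "which of these two is better?", the judging task stays simple, while the accumulated comparisons still yield a full rank order.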

ACJ is being used at the universities of Edinburgh, Glasgow and Manchester, and those who have used it appear very positive about the advantages, while sounding a note of caution that judging criteria need to be very clearly formulated. (p. 16)

Principle four: Continuous assessment (p. 17)

Our approach to, and expectations of, education will change as future workplace requirements change, so students need to develop skills of independent, self-directed learning. Too much emphasis on a limited number of high-stakes assessment points places stress on individuals and institutional processes, and reduces the time and effort that can be put into formative opportunities. (p. 17)

Overassessment can have a detrimental effect on student attainment as, with too many different assignments to complete, students cannot concentrate sufficient effort on each one. This is a particular problem if combined with 'assessment bunching', where submission deadlines fall closely together8. (p. 17)

How is technology being used? (p. 17)

Learning analytics

The 'Quantified Student' (https://quantifiedstudent.nl/) project in the Netherlands is using learning analytics to make students more aware of their study behaviours. (p. 18)

Similarly, Jisc's Study Goal (jisc.ac.uk/ learning-analytics) is an app for students that enables them to view their data and to access other features that support their study. (p. 18)

Dutch universities, notably Amsterdam and Maastricht, have also been researching what data can best help predict a student's final grade. They concluded that the 'undoubted winner' for predictive ability seemed to be performance in formative tests in a digital environment - the act of doing the formative tests seemed to have more effect than different learning approaches and strategies in the run-up to the tests. (p. 18)
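The kind of analysis described above can be sketched with a toy example: fitting a simple least-squares line from formative test performance to final grade. The data and the one-variable model are illustrative assumptions, not the universities' actual methods or findings.

```python
# Illustrative only: hypothetical (formative test average, final grade)
# pairs, and an ordinary least-squares fit for a single predictor.

data = [(45, 52), (60, 58), (70, 71), (80, 78), (90, 88), (55, 60)]

n = len(data)
mean_x = sum(x for x, _ in data) / n
mean_y = sum(y for _, y in data) / n

# Slope and intercept of the least-squares line y = intercept + slope * x
slope = sum((x - mean_x) * (y - mean_y) for x, y in data) / sum(
    (x - mean_x) ** 2 for x, _ in data
)
intercept = mean_y - slope * mean_x

def predict(formative_score):
    return intercept + slope * formative_score

print(round(predict(75), 1))  # predicted final grade for a 75% formative average
```

In practice, a model like this would be validated against held-out cohorts before being used to flag students who might benefit from early support.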

Principle five: Secure assessment (p. 19)

However, there is also a growing need to monitor how new technology is being used to break assessment rules. Cheating has never been easier or more prevalent. A recent study by Swansea University (swansea.ac.uk/press-office/latest-research/) analysed surveys dating back to 1978 in which students were asked if they had ever paid for someone else to complete their work. The findings, covering 54,514 participants, showed that the proportion of students admitting to this rose to 15.7% in studies conducted between 2014 and 2018. (p. 19)

Academic integrity is at the heart of this issue. Better assessment design, resulting in fresh, situated, personalised assessment tasks which effectively design out reasons and opportunities to buy in essays, has a role to play in reducing the likelihood of cheating. Technology can play a part. (p. 19)

Contract cheating is a growing and concerning issue. From a technological perspective, plagiarism detection tools are of little use here, because work written to order is original and will not match existing sources. (p. 21)

Developing academic integrity, good assessment design and an understanding of the pressures that students are under - pressures that may lead them to consider cheating - are the keys to more secure assessment. (p. 21)

Five targets for the next five years (p. 23)

How could we meet the targets? (p. 24)

Assessment needs to be treated as a large-scale transformation programme requiring consistent high-level management and organisational support alongside the freedom to innovate and permission to fail. Students must be part of the journey. (p. 24)

Authentic: Understand assessment from the student perspective and build assessment that supports learning and works for today's digital society. (p. 25)

Accessible: Establish a universal design approach to assessment, ensuring that all assessments are as widely usable as possible. (p. 25)

Automated: Communicate, network and collaborate to identify technical solutions and to influence platform providers to better meet the needs of the sector. (p. 25)

Continuous: Take a structured, organisation-wide approach to learning and assessment design that aligns organisational values, learning and teaching principles and the desired learning outcomes for any programme of study. (p. 25)

Secure: Develop a two-pronged strategy for dealing with the critical issue of assessment security by promoting a culture of academic integrity among students and also making better use of technological tools. (p. 25)