E-scape

E-scape was a project run by the Technology Education Research Unit (TERU) at Goldsmiths, University of London, England, that developed an approach to the authentic assessment of creativity and collaboration based on open-ended but structured activities. As such it is an alternative to traditional assessment methodologies.

Background

Project e-scape originated in a UK Qualifications and Curriculum Authority (QCA) project run in 2003 and 2004, titled 'Assessing Design Innovation', which developed an approach to assessment in design and technology that encouraged creativity and teamwork and was based on a 6-hour structured coursework activity.[1] The activity was broken down into a series of sub-activities that provided placeholders for students to record the development of their thinking in words and the development of their prototypes in photographs. This approach has subsequently been adopted by Oxford, Cambridge and RSA Examinations in their General Certificate of Secondary Education for product design. Phase 1 of the e-scape project examined how information and communication technologies could be used within subject teaching and learning to support the assessment of creativity and teamwork. The UK Department for Education and Skills and QCA supported the phase 1 proof of concept.[2][3]

In e-scape phase 1 it was established that the use of digital peripheral tools could enable learners to create authentic, real-time, electronic portfolios of their performance.[4] The value of peripheral tools lay in their 'back-pocket' potential: learners were not tied to desktops and workstations, but could roam the classroom or workshop. The peripheral digital tools enabled them to build an authentic story of their designing through a combination of drawings, photos, voice files and text. Their story emerged as the trace left behind by their purposeful activity in the task. The focus of phase 2 was to integrate these techniques into a complete system.

In e-scape phase 2 a prototype system was built that enabled teachers to run design and technology test activities in 11 schools across England. This resulted in 250 performance portfolios on a website, which were then assessed using an adaptive comparative judgement methodology based on Thurstone's law of comparative judgment. Learners were enthusiastic about using the system in schools, and the reliability of the subsequent assessments was significantly higher than is possible using conventional approaches.[5] However, there were two limitations with the phase 2 system:

Firstly, it operated only in design and technology, which raised the question of whether the approach would transfer to other subjects.

Secondly, the phase 2 tests had been run as a research project – with the research team operating the system in schools. This was not a scalable model for national assessment. It was necessary for such a national system to be operable by teachers in their own classrooms and this became the focus of the third phase of the project.[6]

Phase 3 focused additionally on science[7] and geography,[8] with the work evolving through several steps:

  • creating subject teams in geography and science
  • development and trialling of tasks (science, geography and d&t)
  • development of the technology to facilitate task evolution (authoring tool); to enable teachers to run activities in schools (EMS); and to manage the pairs judging (pairs adaptive comparative judgement engine)[9]
  • running test activities (geography, science and d&t) in schools across England and Wales
  • conducting the judging and analysing the outcome[10]
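The pairs judging mentioned above rests on Thurstone-style comparative judgement: judges repeatedly choose the better of two portfolios, and a scale value is then fitted to each portfolio from the accumulated wins and losses. The following is a minimal sketch of that fitting step (the data and function names are hypothetical, and the actual e-scape engine also selects pairs adaptively, which is not shown here), using a simple Bradley–Terry-style gradient fit:

```python
import math
from collections import defaultdict

def fit_scale(judgements, iterations=200, lr=0.1):
    """Fit a scale value to each portfolio from pairwise judgements.

    judgements: list of (winner, loser) portfolio ids, one per judge decision.
    Returns a dict mapping portfolio id -> scale value (higher = judged better).
    """
    items = {p for pair in judgements for p in pair}
    theta = {p: 0.0 for p in items}
    for _ in range(iterations):
        grad = defaultdict(float)
        for winner, loser in judgements:
            # Modelled probability that the winner beats the loser,
            # given current scale values (logistic model).
            p_win = 1.0 / (1.0 + math.exp(theta[loser] - theta[winner]))
            # Gradient of the log-likelihood: push the winner up and
            # the loser down by the "surprise" of the observed result.
            grad[winner] += 1.0 - p_win
            grad[loser] -= 1.0 - p_win
        for p in items:
            theta[p] += lr * grad[p]
        # Centre the scale so values sum to zero (the scale has no
        # absolute origin, only relative positions).
        mean = sum(theta.values()) / len(theta)
        for p in theta:
            theta[p] -= mean
    return theta

# Hypothetical judging data: each tuple records one judge's decision.
judgements = [("A", "B"), ("A", "C"), ("B", "C"), ("A", "B"), ("C", "B")]
scale = fit_scale(judgements)
ranking = sorted(scale, key=scale.get, reverse=True)
```

Because each portfolio is compared many times by many judges, the fitted scale reflects a pooled judgement rather than any single marker's scheme, which is the source of the high reliability reported for the method.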

References

  1. Richard Kimbell, Jenny Bain, Soo Miller, Kay Stables, Tony Wheeler, Ruth Wright (2005). "Assessing Design Innovation - Final Report: A Research and Development Project" Archived 2011-07-16 at the Wayback Machine for the Department for Education and Skills (DfES) and the Qualifications and Curriculum Authority (QCA) (Paperback).
  2. QCA (2006). "New ways of assessing creativity in design and technology" (PDF). Archived from the original (PDF) on 2009-06-09. Retrieved 2009-07-02.
  3. Richard Kimbell, Tony Wheeler (2005), Assessing Design Innovation Archived 2011-07-16 at the Wayback Machine. Goldsmiths, University of London / Technology Education Research Unit, ISBN 9781904158639
  4. Soo Miller, Richard Kimbell, Tony Wheeler & T. Shepard (2006). "E-scape Portfolio Assessment Project e-scape: Phase 1 report" Archived 2011-09-30 at the Wayback Machine. A research and development project for the Department for Education and Skills (DfES) and the QCA. Project Report. TERU.
  5. Richard Kimbell; Tony Wheeler; Soo Miller & Alistair Pollitt (2007). "e-scape portfolio assessment phase 2 report" (PDF). Goldsmiths, University of London / Technology Education Research Unit. Archived from the original (PDF) on 2011-07-16. Retrieved 2009-06-29.
  6. Martin Ripley (2007). "E-assessment – an update on research, policy and practice" (PDF). Futurelab.
  7. "E-SCAPE Project". Archived from the original on December 6, 2008. Retrieved August 16, 2016.
  8. "Geographical Association - E-scape". Archived from the original on 2010-11-29. Retrieved 2010-01-21.
  9. Alastair Pollitt; Victoria Crisp (September 2004). Could Comparative Judgements Of Script Quality Replace Traditional Marking And Improve The Validity Of Exam Questions? (PDF). British Educational Research Association Annual Conference. Manchester. Archived from the original (PDF) on September 30, 2011.
  10. "e-scape portfolio assessment phase 3 report" (PDF) (Report). Goldsmiths, University of London. March 2009. Archived from the original (PDF) on February 15, 2010.