School Assessment
Friday 11:45 – 12:45
SCHOOL ASSESSMENT: IDENTIFYING DATA, ANALYSING RESULTS, IMPROVING LEARNING AND TEACHING
A look at the complicated view of the assessment universe

Sharing with you
• Why we chose to focus on Assessment
• The various assessment views
• How we gather, analyse, interpret and use assessment data
• Some thoughts about additional assessment tools for an internationally-minded school

Contributions
• Matthew Savage (former Secondary Principal)
• Gil Bierman (Secondary Principal) and Sarah Ford (IB PYP Leader) & their Assessment Team
• GL Assessment

Background
• Danube International School Vienna – IB World School (IB PYP, IB MYP and IB Diploma)
– 21 years old
– 510 students
– 60+ nationalities of students
– 27 nationalities of staff

Background
• Director since 2009
• My 18th year in International Education and 4th IB World School
• BSc Hons in Mathematics, Operational Research and Economics
• PGCE in Secondary Mathematics
• MSc in Management Sciences – Dissertation: Use of Data Envelopment Analysis in UK Primary Schools
• MEd in Educational Management – Dissertation: Development of an Educational Possibility Model for an internationally-minded school

Why a presentation on Assessment?
• Four years ago we collaboratively created a new vision
• Three years ago (led by principals) we collaboratively created an aspirational strategy for learning and teaching
– At the heart were:
• Assessment
• Language
• Physical and Emotional Learning Environments

[Embedded one-page poster, "DISV Learning and Teaching Strategy", garbled in the transcript. Recoverable headings: the aim of synthesising the science of learning with the art of teaching from the EYC to Grade 12; a climate for learning split into an EMOTIONAL environment (safety, mutual respect, praise, fun and excitement, diversity) and a PHYSICAL environment (display, organisation, layout, seating plans, the senses); and a five-part structure for learning: connecting, activating, demonstrating and consolidating the learning, and linking it to previous and future connections.]
Why a presentation on Assessment?
• Two years ago, our Secondary Principal and PYP Leader collaboratively led and created a whole-school assessment policy.
• We are not "boldly going where no man has gone before", but we need to go there …
• … as we are passionate about improving student learning.

Why a presentation on Assessment?
• John Hattie, in "Visible Learning", states that we need to have high student expectations, develop assessment-capable students and use the power of critique and feedback.
• Grant Wiggins – "Backward Design": the need for assessment to drive curriculum planning.
• In short, in its various forms assessment is one of the most cost-effective and important drivers of improvement in student learning.

What is Assessment?
• Assessment is the process of gathering, analysing, interpreting and using information about students' progress and achievement to improve teaching and learning. (Visible Learning Plus)

Types of Assessment?
• work samples (writing, drawing, concept map, model);
• tests (verbal, essay, multiple-choice, matching);
• interviews and conferences (taped, verbal, peer assessment, group discussion);
• portfolios (diaries, sketches, journals, digital files, notes);
• performance (problem-solving, role play, structured discussions, debates);
• major work (exhibition, invention, investigative project, recital).

Data Points
• School wide
• Programme wide
• Subject wide
• Grade wide
• Teacher specific
• Student specific
• Task (knowledge)
• Task (skills)
• Task (understanding)
• Abilities
• Achievements
• Engagement
• Targets (Predictions)
• External (paper/computer) generated
• Teacher generated
• Student (self/peer) generated

Can Assessment Data help us? Assessment Analysis
• Boring
• Takes time
• Good quality assessment data can enable …
• Poor quality assessment data can muddy …
• Vital if a school wants to improve student learning

Assessment – the BIG picture
• Is the sun shining today?
• The end-of-semester grade in Mathematics for Jane is an A

Assessment – the BIGGER picture
• The sun is hot and looks to be complex.
• A breakdown of Jane's grade in Mathematics – various pieces of summative and formative assessment, which use a variety of assessment tools on a range of tasks and different assessment rubrics

Assessment – THE BIG picture: the assessment universe
• Where is our sun amongst all of that?
• The overall level of achievement for all students in end-of-year internal assessments for all subjects
• The overall level of achievement for all students in external assessments
• All ongoing formative and summative school assessment data

Assessment – a BEAUTIFUL picture: an assessment cluster
• A beautiful formation when seen clearly
• Students' results across all their subjects during a year, after you have stripped away any 'institutional biases' (a STEM school, a bilingual school, a Music school)
• An individual student's results after you have factored in additional factors (ELD, dyslexia, divorce, major illness, ability)

Assessment – the INVISIBLE picture
• http://static.bbc.co.uk/universe/img/ic/640/questions_and_ideas/dark_matter/dark_matter_large.jpg

Dark Assessment Matter
• We know something is out there having a massive effect on the assessment universe, but we can't see it.
• We know students have different talents, attitudes and abilities that indicate their current potential.
• We need to look for them!

Assessment – The Force! Dark Assessment Energy
• Drives and accelerates the expansion of the universe
• What can drive and accelerate the expansion of a student's potential (brain plasticity)?
– Developing learning styles?
– Enhancing their learning 'skills' toolbox?
– Engaging them in their learning?
• "We have found all life forms in the galaxy are capable of superior development." – Kirk in 'The Gamesters of Triskelion'

Assessment – Is there a Map?
• 12.5 to 14 billion light years from our sun
• Mayan Star Map

Assessment Map
• Daunting or Mysterious
1. Ensure you have the overall Vision
2. Be Aspirational
3. Be Collaborative and Communicate every step of the way
4. Follow a process

Assessment Circle
• What data should we gather? – Information needed
• How can we gather data? – Methods
• How can we analyse and interpret data? – Tools, Patterns
• How can data be used and shared? – Communication, Use of data

Assessment Views
• An individual student's subject results from a distance
• An individual student's subject results close up
• The school, the various phases, the range of subjects, the assessments within the subjects
• An individual student or group of students after 'stripping out' the background assessment noise
• An individual student's talents and abilities
• An individual student's learning styles, approaches-to-learning 'skills' toolbox, level of engagement

Gathering the Data
• Student self-assessment and peer-assessment results …
• Teachers' records of assessment tasks and feedback, for both formative and summative assessment, are on ManageBac
– Other gradebook systems are stand-alone (MyGradeBook, GradeKeeper) or part of integrated school databases (SIMS, Serco, Blackbaud, etc.)
– Transparent

Gathering – ManageBac
Gathering – ManageBac

Analysing Internal Assessment
• Individual student results (achievement and engagement) per subject, and the average

Analysing
• Average assessment level & engagement result per subject per grade
• Average assessment level & engagement result in the same grade

Level        7    6    5    4    3    2    1    INC   Av
Students     34   67   89   49   24   8    0    3     5.0

Engagement   A    B    C    D    E    Av
Students     110  108  70   20   2    3.9

Analysing
• IB PYP internal reports – Exceeding, Meeting, Developing, Beginning for each Unit of Inquiry

           UoI 1                UoI 2                UoI 3–6
Class D    5E 17M 20D 12B       14E 21M 15D 4B       –
Class V    12E 18M 17D 7B       14E 18M 17D 5B       –

• Benchmarked against the expectation of what they should be able to do by the end of the grade vs the expectation of what they should be able to do now
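To make the kind of per-subject summary in the slides above concrete, here is a minimal sketch in Python. It assumes a simple list of task records; the field names and the A–E to 5–1 engagement mapping are assumptions for the example, not ManageBac's actual export format.

# A minimal sketch (not ManageBac's export format) of the per-subject
# summary shown above: average achievement level (7-1) and average
# engagement (A-E) per grade and subject.
from collections import defaultdict

# Assumed mapping of engagement grades to points; A=5 ... E=1.
ENGAGEMENT_POINTS = {"A": 5, "B": 4, "C": 3, "D": 2, "E": 1}

def summarise(records):
    """Return {(grade, subject): (avg_level, avg_engagement)}."""
    levels = defaultdict(list)
    engagement = defaultdict(list)
    for r in records:
        key = (r["grade"], r["subject"])
        if r["level"] != "INC":            # incomplete tasks excluded from the level average
            levels[key].append(r["level"])
        engagement[key].append(ENGAGEMENT_POINTS[r["engagement"]])
    return {key: (sum(levels[key]) / len(levels[key]),
                  sum(engagement[key]) / len(engagement[key]))
            for key in levels}

# Illustrative records only: one row per assessed task per student.
records = [
    {"grade": 8, "subject": "Mathematics", "level": 5, "engagement": "B"},
    {"grade": 8, "subject": "Mathematics", "level": 6, "engagement": "A"},
    {"grade": 8, "subject": "English", "level": 4, "engagement": "C"},
]
for (grade, subject), (avg_level, avg_eng) in sorted(summarise(records).items()):
    print(f"Grade {grade} {subject}: average level {avg_level:.1f}, average engagement {avg_eng:.1f}")

Run over a full gradebook export, the same calculation produces grade-level averages of the kind shown in the table above (such as the 5.0 and 3.9).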
Interpreting and Using
• Interventions with students
– Have they passed the mid-semester and the semester in achievement and engagement? If not, why did this not occur, and can they be supported to do so?
• Opportunities to celebrate
– We have an honor roll on a stairwell
• Average grade of 5.5
• Most improved ELD student
• Interventions with staff
– A teacher's engagement/effort results are out of sync with every other teacher's
• Learn how this teacher got students really engaged in the subject?
• Is the teacher applying the assessment levels too strictly?
• Greater consistency across subjects and grades
– Standardisation achieved as a product of data analysis
• New staff alignment

IB Diploma Students
– Write extended essays, lab reports, world literature essays, complete orals, answer multiple choice, interpret data, write a timed essay in a final exam …
– Predictions by teachers
• They produce a lot of data

Gathering

Analysing and Interpreting
• Did this student achieve as highly as possible?
• Why did this student gain 34 points, a 6 in History, a B in TOK but a D in the extended essay?
• Compare to a measure of the student's abilities

Gathering – Predictions

Analysing and Interpreting
• Calculate the average difference between the predicted grade and the actual grade for each class.
• e.g. under-predictions: Biology 10%, German B 9%, Economics 7% – why?
• Does this raise a concern about a student being under-predicted for a university application?

Gathering
Gathering

Analysing and Interpreting
• Accounting for different ability levels in different classes
• Find the average grade per student
• Student's grade in each subject minus the average grade of the student
• Further discount using the IB average subject score: IB average subject score – (IB average subject scores × number of students in the subject) / total number of subject candidates
(A small sketch of this value-added calculation appears after the ISA results below.)

Subjects: Analysing & Interpreting
• Which subjects appear to have a better value-added grade than the average grade of the class of students and the IB worldwide exam?

Analysing and Interpreting
Gathering
Analysing and Interpreting
• Why did English A SL students perform less effectively in Paper 1?

Interpreting
• English A1 HL students were mainly native English speakers
• English A1 SL students were mainly non-native English speakers
– Look for strategies to further support this group

Programme Wide: Analysing & Interpreting
• Find the percentage of students with the Bilingual Diploma
• Compare with IB averages (available later that year)

Using
• Celebrate!!
• Inform the board/governance
• Use to reinforce high expectations
• Use for marketing purposes

• International Schools' Assessment (ISA) – ACER (Australian Council for Educational Research)
• Tests Grades 3, 5, 7 & 9 in October
• Measures Reading, Mathematical Literacy, Writing (Narrative and Exposition)
• Results at the end of December – can distribute and use in January
• OECD uses PISA to compare countries
• Other schools may use MAP (Northwest Evaluation Association)
(chart slides: Writing, Mathematical Literacy, Reading)

Analysing, Interpreting and Using
• Marketing purposes, reassure parents

Interpreting
ISA mean scale scores, 2011/12 (Like Schools / DISV):

Grade   Reading     Writing Task A   Writing Task B   Mathematical Literacy
3       232 / 221   353 / 363        378 / 405        301 / 253
5       362 / 350   441 / 435        452 / 474        416 / 376
7       439 / 448   497 / 490        502 / 487        493 / 478
9       508 / 485   557 / 539        562 / 543        558 / 540

Grade 7 to 9 value-added scores, 2009/10 to 2011/12 (Like Schools / DISV): 78 / 103, 82 / 103, 69 / 61, 59 / 65
Grade 5 to 7 value-added scores, 2009/10 to 2011/12 (Like Schools / DISV): 73 / 86, 105 / 183, 69 / 120, 52 / 69

Using
• Our ISA reading scores were not as strong as we would have liked.
• Introduced a more structured reading support programme in the Elementary and now the Secondary School.
• Adapted our new-to-English programme, and increased the number and improved the skills level of English Language Development assistants for the School.
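As flagged in the value-added slide above, here is a minimal sketch of that calculation: each student's grade in a subject compared with the student's own average grade, with a simple adjustment against IB worldwide subject averages. The data, the subject list and the form of the worldwide adjustment are assumptions made for illustration, not the school's actual method.

# A minimal sketch of the per-subject value-added idea described above:
# each student's grade in a subject compared with that student's own
# average grade. The data, subjects and the worldwide-average adjustment
# are assumptions for illustration, not the school's actual formula.
from collections import defaultdict

results = {  # hypothetical IB grades per student
    "Student 1": {"Biology": 6, "History": 5, "English A": 6},
    "Student 2": {"Biology": 4, "History": 5, "English A": 3},
}
ib_world_avg = {"Biology": 4.3, "History": 4.6, "English A": 4.9}  # hypothetical figures

value_added = defaultdict(list)
for student, grades in results.items():
    own_avg = sum(grades.values()) / len(grades)
    for subject, grade in grades.items():
        # positive = the student did better here than in their subjects overall
        value_added[subject].append(grade - own_avg)

# Simple (unweighted) view of how each subject's worldwide average compares
# with the worldwide averages of the subjects this cohort took.
cohort_world_avg = sum(ib_world_avg[s] for s in value_added) / len(value_added)

for subject, diffs in sorted(value_added.items()):
    school_va = sum(diffs) / len(diffs)
    difficulty = ib_world_avg[subject] - cohort_world_avg
    print(f"{subject}: value-added {school_va:+.2f}, worldwide-average offset {difficulty:+.2f}")

Averaging these per-subject differences across a cohort gives one way of asking which subjects appear to add more value than the class average, as in the Subjects slide above.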
Using
• GL Assessment – British based
• Assesses Grade 4 (in September) and Grades 5, 7 & 9 in the last week of the school year, plus all new students to Grades 6–12
• Measures Verbal, Quantitative, Non-Verbal and Spatial reasoning abilities
• Results are immediate, as it is digital
• Some predictive ability re: British exams and, from September 2013, even IB results
• Very little change over the 2 years of students taking the tests, even for ELD students (verbal increased by 4 points and quantitative/non-verbal by 2 points)
• The only major changes came from Learning Support students.
• The School also uses the NGRT (New Group Reading Test), which is an adaptive digital assessment
• Other schools may use MidYIS (Durham), but it only tests ages 11–14

Gathering
• NGRT – New Group Reading Test
(sample report: Age in Months 116, Standard Age Score 117)
• We measure in October and in June to check progress in reading from Grade 2 to Grade 10.
• The NGRT results will help us target students in need of further support. (A small sketch of this kind of progress check appears near the end of this section, before "Where to go now".)

(chart slides: CAT4 Non-Verbal, CAT4 Quantitative, CAT4 Verbal and CAT4 Spatial results – Analysing; Analysing & Interpreting; Using; Analysing & Interpreting; Analysing & Interpreting)

Using
• Admissions
• Inform teachers
• The placement of students (8D vs 8V)
• The placement in Mathematics (Extended/Standard)
• Need for ELD lessons? Use of ELD assistants
• Need for Learning Support?
• Need for differentiation for G & T?
• Approach in various subjects (a verbally strong, quantitatively weak class in Science)

Using – Learning Scorecard
• IB Diploma Results
• CAT4 Assessments
• NGRT results
• ISA results
• Surveys on Service Engagement
• Surveys on Activities Participation
• Future University Placement
• Survey of Parent Satisfaction
(Bill Gerritz, Head of School, International School Bangkok)

Collect, Analyse and Interpret Data on Student Engagement
• Grade Z consistently has more students engaged in a variety of Community and Service activities than other grades
– Learn and share the factors that have caused this

Collect Data on Attitudes – Survey
Learning Skills – Gathering

Plasticity
• Stretching their potential – Jo Ann Deak – stretching the elasticity of ability (visual learners, auditory learners, kinesthetic learners)
– Enhancing the students' 'skills' toolbox
• It's not just wanting students to be the best that they can be; it is supporting students to strive to be better than their best

For the future
• Assessments of Emotional Intelligence – MSCEIT
• Assessment of Physical Attributes

Using Data
• Every child is a wonderful mass of energy
• Students "don't care how much we know until they know how much we care"
• Filter out the "assessment" background noise and find something to celebrate
– Honor Roll, Most improved ELD students, Commitment to Community and Service, Athletic excellence
– Praise from teachers, peers, leaders. Smileys
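As referenced in the NGRT slide above, here is a minimal sketch of the October-to-June reading progress check used to target students for further support. The record layout and the flagging thresholds are illustrative assumptions, not GL Assessment's actual report format.

# A minimal sketch of the October-to-June NGRT progress check referenced
# above. The record layout and the flagging thresholds are assumptions,
# not GL Assessment's actual report format.

ngrt_results = [  # hypothetical standard age scores (SAS, mean 100)
    {"student": "Student A", "grade": 5, "october_sas": 117, "june_sas": 119},
    {"student": "Student B", "grade": 5, "october_sas": 96, "june_sas": 89},
    {"student": "Student C", "grade": 7, "october_sas": 104, "june_sas": 103},
]

LOW_SAS = 90    # assumed cut-off for "needs further reading support"
BIG_DROP = -5   # assumed cut-off for a worrying fall between sittings

for r in ngrt_results:
    change = r["june_sas"] - r["october_sas"]
    if r["june_sas"] < LOW_SAS or change <= BIG_DROP:
        print(f"{r['student']} (Grade {r['grade']}): SAS {r['october_sas']} -> "
              f"{r['june_sas']} ({change:+d}) - flag for reading support")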
Where to go now …
• According to James T. Kirk, the best place to get help with assessment analysis is a planet called Vulcan.
• "Mr. Spock, the women on your planet are logical. That's the only planet in the galaxy that can make that claim." – Kirk
• Queensland Department of Education – http://education.qld.gov.au/staff/learning/diversity/teaching/assessment.html
• http://www.learning-styles-online.com
• http://www.gp-training.net/training/educational_theory/multint/multint.htm
• www.ibo.org
• ISA
• GL Assessments
• http://www.atlasoftheuniverse.com/
• www.bbc.co.uk
• Star Trek
• [email protected]