Q: Darcy, tell us about your Bright Spots project.
We are focusing on providing research-level data analysis within a school so that teachers can evaluate their performance on a year-on-year basis and measure the impact of teaching initiatives. For example, you might want to try a new technique such as using mini whiteboards or introducing e-learning into your classroom, and this data analysis system allows you to measure whether it has a significant effect on student outcomes. I’m working both with secondary schools (looking at NCEA outcomes and other measures like e-asTTle) and with primary and intermediate schools (where we’re looking at curriculum levels and PAT testing).
Q: Why is this project important?
Currently, teachers are not able to evaluate their initiatives with statistical validity. You can look at your NCEA results or percentages, and the best-case scenario is to compare the percentages or averages of your current cohort with those of last year, or, if you are really dedicated, you can look at long-term averages. But this always raises the question of how much improvement is good enough, or, if marks go down, whether it is a disaster.
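As a rough illustration of what statistical validity adds beyond eyeballing averages (a sketch only, not the project's actual SPSS analysis), comparing two cohorts' average scores can be framed as Welch's t-test, which does not assume equal cohort sizes or equal spread. The scores below are entirely hypothetical.

```python
from math import sqrt
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's t statistic and approximate degrees of freedom for two
    cohorts' scores (e.g. this year's vs last year's e-asTTle results)."""
    n_a, n_b = len(sample_a), len(sample_b)
    # Per-cohort squared standard errors of the mean.
    se2_a = variance(sample_a) / n_a
    se2_b = variance(sample_b) / n_b
    t = (mean(sample_a) - mean(sample_b)) / sqrt(se2_a + se2_b)
    # Welch-Satterthwaite approximation for degrees of freedom.
    df = (se2_a + se2_b) ** 2 / (
        se2_a**2 / (n_a - 1) + se2_b**2 / (n_b - 1)
    )
    return t, df

# Hypothetical cohorts: this year's scores vs last year's.
this_year = [55, 60, 62, 58, 65, 61, 59, 63]
last_year = [52, 54, 57, 53, 56, 55, 58, 51]
t, df = welch_t(this_year, last_year)
print(f"t = {t:.2f}, df = {df:.1f}")
```

A |t| well above roughly 2 (for moderate degrees of freedom) suggests the difference between cohorts is unlikely to be chance variation, which is exactly the "how much improvement is good enough" question made precise.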
Statistics already has standard procedures for evaluating these types of questions. That’s why this is important. Year after year, decade after decade, professional development initiatives have been introduced in education around the world, and we keep on doing this. But if you can’t evaluate whether something works, should you carry on doing it or should you stop? If you don’t know, you can’t make any kind of systematic progress, and it just comes down to people’s preferences. Teachers need to be able to evaluate their teaching initiatives so that they can make progress.
Q: What progress have you made so far?
A huge amount! Last year my colleagues and I spent two days with Bright Spots and SPARK designing the project. I’ve now developed a script for SPSS, a computer programme used for statistical analysis. For secondary schools, this script analyses NCEA credits and standards, comparing current cohort results with those of previous years. The analysis allows you to say with statistical confidence whether changes in results year on year represent a genuine improvement (for example, as a result of a new teaching approach), or whether slight variations in the data are the result of chance.
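For pass/fail outcomes such as achievement of an NCEA standard, the year-on-year comparison described here could be sketched as a two-proportion z-test. This is a simplified Python illustration with made-up figures, not the actual SPSS script.

```python
from math import erf, sqrt

def two_proportion_z_test(pass_a, n_a, pass_b, n_b):
    """Compare pass rates of two cohorts (e.g. this year's vs last year's
    achievement rate on one NCEA standard).
    Returns (z statistic, two-sided p-value)."""
    p_a, p_b = pass_a / n_a, pass_b / n_b
    # Pooled pass rate under the null hypothesis of no real difference.
    pooled = (pass_a + pass_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF (via math.erf).
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical figures: 78 of 100 students achieved the standard this
# year, against 65 of 100 last year.
z, p = two_proportion_z_test(78, 100, 65, 100)
print(f"z = {z:.2f}, p = {p:.3f}")  # prints z = 2.04, p = 0.042
```

With these invented numbers the p-value falls below 0.05, so the improvement would count as statistically significant rather than chance variation; with a smaller gap or smaller cohorts it typically would not.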
My colleagues and I have looked at a lot of distributions: one for every standard, every class, and every department at our school. Every department at Gisborne Boys’ High School has now evaluated its NCEA data using these methods, and HODs have presented our findings and set goals for the year ahead. Feedback from management and teachers has been really positive. While it was pretty arduous, especially for those without any statistical background, everybody agrees it was one of the best pieces of professional development they’ve ever undertaken.
Q: Has the project been extended beyond Gisborne Boys’ High School?
Yes, I’ve also been working with other schools in the area. At the end of last year, the maths and science departments at Gisborne Girls’ High School were taking part, and now around half their departments are on board. I’m also working with another high school across all their departments, as well as with some local primary schools and an intermediate.
Q: What are some of the challenges?
One of the major challenges initially was that extracting data manually from our student management system is an onerous task with much room for error – data entry can go wrong so easily! One of the fantastic things is that NZQA and the MOE are now providing us with essential support. NZQA has designed and organised a new output providing all of a school’s NCEA results from 2012 to 2018. Meanwhile, the MOE has written an amazing script that turns this whole-school data into subject-based data. Without this, we would still be exporting records from our student management system and pasting the data together into incomplete and error-ridden records. This was a huge step forward for the project. Furthermore, thanks to Bright Spots’ networks, we’ve started to make the same thing happen with NZCER. This has made analysis so much easier.
Q: What do you hope will be the long term impact of your work?
I’m really hoping that we will be able to roll out this type of analysis and way of working to schools across New Zealand. We have some amazing teachers in this country and some amazing research initiatives, but we have to get these things running directly in the schools. At the moment, evaluation and assessment can be incredibly difficult. We need to create an environment where it’s easy for teachers to evaluate their performance, the performance of their students, and their initiatives.
Q: How has being a Bright Spots Awards recipient helped you with this project?
I’ve been working on this for a long time by myself, but there hasn’t been enough time in the day to make progress on a large scale. I now have a role as a cross-community teacher in our Community of Learning, which gives me ten hours a week of paid time to work on the project. Bright Spots has combined perfectly with this by providing funding to release teachers from both our school and other schools, and by supporting me to upskill. The funds, the feedback, the advice and the training have been invaluable.
Applications for the 2019 Bright Spots Awards open April 29. The Bright Spots Awards support the development, evaluation and sharing of innovative practice in schools and ECE centres. For more information and to access the application form, go to www.theeducationhub.org.nz/brightspots