All right, be honest: how many of you have begun to dig into some educational research article, and before you make it all the way through the abstract, you're feeling a little bit overwhelmed? Trust me, my hand is up too! As educators, we all know the value of research-based strategies and skills, of using best practices, and of trying new ideas to meet the needs of our kids. Even with that knowledge, there are times when it is so hard to make it through the research.
Add to that another frustration that I sometimes come across when reading research: the studies don't all seem to agree! I can read study A and it says to try one strategy, but study B gives me another option, and before I can make up my mind, someone shares study C with me, and it tells me something totally different. Talk about frustrating!
So imagine my happiness when I started to learn about the idea of a meta-analysis. For those of you who aren't familiar with that phrase, a meta-analysis combines the results of multiple studies on the same question to reach conclusions with greater statistical power than any single study could provide.
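For the statistically curious, here is a minimal sketch of the core idea behind one common approach (fixed-effect, inverse-variance pooling): more precise studies get more weight when the results are combined. The studies and numbers below are entirely made up for illustration.

```python
# A minimal sketch of how a fixed-effect meta-analysis pools results.
# All study numbers here are hypothetical, for illustration only.

def pooled_effect(effects, variances):
    """Inverse-variance weighted average of study effect sizes."""
    weights = [1.0 / v for v in variances]          # precise studies weigh more
    total = sum(weights)
    return sum(w * e for w, e in zip(weights, effects)) / total

# Three hypothetical studies of the same strategy: each reports an
# effect size and its variance (smaller variance = more precise study).
effects = [0.55, 0.30, 0.45]
variances = [0.04, 0.02, 0.08]

print(round(pooled_effect(effects, variances), 2))
```

Notice that the pooled answer lands closest to the most precise study, rather than simply averaging the three numbers.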
That brings me to the work of John Hattie, the author of Visible Learning. In his work, Hattie developed a system of ranking a variety of influences based on how great an effect they had on student learning. The study was most recently updated in 2015 and includes a list of 195 influences, each with a number score based on its effect size. Hattie identified the average effect size as 0.40 and uses that "hinge point" as his guide for what works best in education. Influences that fall above 0.40 are considered to accelerate learning, while those that fall below it do not have as meaningful an effect.
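If you're wondering what an "effect size" actually measures, the common version (Cohen's d) is the difference between two group averages, expressed in standard-deviation units. Here is a rough sketch using invented test scores; the function name and data are mine, not Hattie's, and this simplified version skips the small-sample corrections a real analysis would use.

```python
# A rough sketch of an effect size (Cohen's d) and Hattie's 0.40 hinge
# point. The group scores below are invented for illustration.
from statistics import mean, stdev

def cohens_d(treatment, control):
    """Standardized mean difference between two groups (pooled SD)."""
    n1, n2 = len(treatment), len(control)
    pooled_var = ((n1 - 1) * stdev(treatment) ** 2 +
                  (n2 - 1) * stdev(control) ** 2) / (n1 + n2 - 2)
    return (mean(treatment) - mean(control)) / pooled_var ** 0.5

# Hypothetical test scores with and without some strategy.
with_strategy = [78, 85, 90, 82, 88]
without_strategy = [72, 80, 84, 76, 79]

d = cohens_d(with_strategy, without_strategy)
print(f"d = {d:.2f}, above Hattie's 0.40 hinge point: {d > 0.40}")
```

A d of 0.40 roughly means the average student in the treatment group outperformed about two-thirds of the comparison group, which is why Hattie treats it as the bar worth clearing.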
While not every strategy will work for every kid in every situation, the more you rely on the strategies that have higher effect sizes, the more likely you are to provide meaningful learning opportunities for your students.
So I guess all that’s left is to look at the rankings. This link will take you to an interactive visualization of Hattie’s rankings: Hattie Rankings Interactive
Some advice on how to read this visualization:
- The larger the effect size (in other words, the higher the number or the larger the bar), the more likely this strategy would be listed in a "What works best in education?" blog post.
- Several things that we often think of as “go to” reasons for student success and failure (home life and things that happen outside of school) fall pretty low on the list.
- Many things that are within our control have very large effect sizes.
Hopefully you will take some time to analyze the items that appear on this list, think about what they mean, and think about how integrating some of these strategies into your teaching (or maybe just your thinking) could have an impact on learning for your students. One of my big takeaways as I analyzed the data was that we should change the things that are within our control and stop worrying about the things we can't control. The data here, along with our own common sense of what's best for kids, tell us that we do have control over the learning that happens in our classrooms every day.
What are your thoughts on Hattie's rankings? Are there any that you feel missed the mark and are too high or too low? What about things that aren't on the list? Anything that you think needs to be there? Share your thoughts with us below!