So... What is the evidence base?
Meta-studies: education’s analogue of medicine’s Cochrane reviews

There are thousands of medical research journals containing the results of hundreds of thousands of research trials, which frequently yield contrasting results. How has medicine gone about finding definitive answers from such a disparate research base? In 1993 the Cochrane Collaboration began synthesising reviews of similar trials on medical interventions. There are now approximately 4,500 Cochrane Reviews (updated regularly in response to new linked research) that summarise the available research on individual treatments, allowing physicians and patients to make evidence-based decisions on the efficacy of a specific treatment.

Education is likewise replete with research results, all of which seem to suggest that "the new approach works". A look back over educational change over the last decade will yield innumerable ideas in the shape of reforms, all designed to improve education. How many were based on evidence rather than politics or personal experience? Evidence on educational impact is available for hundreds of treatments (or innovations) in teaching, in the form of thousands of research studies on millions of students. How can you get a definitive view of what works best?

John Hattie and Robert Marzano

These two educational researchers have compiled comprehensive meta-analyses of research trials on the aspects of education that influence achievement. John Hattie alone, in his work Visible Learning, synthesises the findings of 50,000 studies on many millions of students. He clearly identifies key factors that have been shown to influence the achievement of learners, many of which are under the direct control of the classroom teacher. Robert Marzano has also conducted a wide range of meta-analyses attempting to distinguish effective classroom practice from that which is less beneficial.
The work of these two researchers provides clear guidelines on how to maximise student achievement.
Effect Sizes

Hattie and Marzano provide effect sizes: measures of the impact of educational initiatives on achievement. Effect sizes typically range from -0.2 to 1.2, with an average of 0.4. It would also appear that nearly everything tried in classrooms works, with about 95% of factors yielding positive effect sizes:
[Figure: distribution of effect sizes. Adapted from Petty, 2009]
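The effect sizes discussed here follow the standard definition used in meta-analysis: the difference between group means divided by a pooled standard deviation (Cohen's d). A minimal sketch, with invented example scores (the data and function name are illustrative, not from Hattie or Marzano):

```python
from statistics import mean, stdev

def effect_size(control_scores, treatment_scores):
    """Cohen's d: difference in group means divided by the
    pooled standard deviation of the two groups."""
    n1, n2 = len(control_scores), len(treatment_scores)
    s1, s2 = stdev(control_scores), stdev(treatment_scores)
    pooled_sd = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (mean(treatment_scores) - mean(control_scores)) / pooled_sd

# Invented test scores for a control class and a class taught
# with some intervention (purely illustrative numbers)
control = [52, 58, 61, 47, 55, 60, 49, 57]
treatment = [59, 64, 66, 55, 62, 68, 58, 63]
d = effect_size(control, treatment)
```

On Hattie's scale, a d computed this way can then be read against the d=0.4 hinge point discussed below.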
Hattie states that an effect size of d=0.2 may be judged a small effect, d=0.4 a medium effect, and d=0.6 a large effect on outcomes. He defines d=0.4 as the hinge point: the effect size at which an initiative can be said to have a 'greater than average influence' on achievement. Hattie claims that the intellectual maturation of students alone leads to effect sizes of between d=0.0 and d=0.15, as revealed by studies of children with no or limited schooling. Average teacher effects range from d=0.2 to d=0.4. According to Hattie's work, teachers should aim for achievement gains greater than d=0.4 to be considered above average; to be considered excellent, they should demonstrate achievement gains for their students of d=0.6 or higher.
[Figure: benchmark effect sizes. Adapted from Petty, 2009]
Effect Sizes and Exam Grades

A rule of thumb from Petty:
- An effect size of d=0.5 represents a one-grade increase in achievement at GCSE or A-level.
- An effect size of d=1.0 represents a two-grade increase in achievement at GCSE or A-level.
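Petty's rule of thumb amounts to a linear conversion, roughly two grades per unit of effect size. A trivial sketch of that arithmetic (the function name is illustrative, not from Petty):

```python
def grade_gain(d):
    """Petty's rule of thumb: d=0.5 is roughly one grade at
    GCSE/A-level, so grades gained is approximately 2 * d."""
    return 2 * d

# e.g. an intervention with d=0.6 (Hattie's 'excellent' threshold)
# would be expected to yield a little over one grade of improvement
gain = grade_gain(0.6)
```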
Challenging goals, success criteria, active learning, recognition of effort and rich feedback.