#Lower case r, upper case ED, 17

ResearchEd17 could be forgiven for being a bit self-conscious: in recent weeks it has been spoken of less favourably, had its grassroots raked over, its biases heat-mapped. Sure enough, when I arrived (like a marathon runner along Cheering Lane), it was clear that the cheering crowds had stayed away.

‘Total Gridlock’

There was barely a complimentary canvas bag in sight. Was TB defeatED? Was his love affair with geeks in the staffroom and policy wonkers in the anglosphere endED? Had his hash been tagged for the last time?

Nope.

It turns out that Chobham Academy, in the shadow of the Queen Elizabeth stadium, is larger than the average comp: they hosted the 2012 volleyball in their foyer – which is where we found the hordes and their canvas bags. Tom was in pink, Helene was handing out raffle tickets to win lunches, and all was right with the world. Someone (we blamed the Harris peeps) forgot the free pens, but we are high starters with 21st Century skills, happy to photograb speakers’ slides and live-tweet our research. Our only concern was failing phone-battery power. #WorkingOutHowToMakeADyingPhoneWork.

I have engaged with ResearchED on many previous occasions, but this was the first time I got to engage in ResearchED. If you have never done it, you should: it’s miles better than sitting on the floor all day, and you get to rub shoulders with famous people and Nick Gibb. And you get to start the day in the speakers’ lounge, this year styled as the training room. I limbered up there with my fellow presenter James Mannion and teachers from the City of London School.


Imagine if all the boxers from the undercard were put in the same dressing room: it’s just like that. Your competitors (those you suspect will draw a bigger crowd) are in there, as are real live people who misleadingly look nothing like their Gravatar. I’m represented to the online world by a three-year-old drawing by my daughter, who at the time loved me enough to ignore my greying hair. In short, there is little small talk in the speakers’ room; just people doing some research before it’s their turn to present.

You would get fit working in this school. It’s designed like one of those spiralling coin boxes that entertain you as you give your old euros to charity. I joined Lisa Pettifer for a lap on the second floor. ‘Is M213 this way, Lisa?’ ‘Just keep on walking, Mark, and it soon will be.’ Jonny Peacock and Christine Ohuruogu went to school here. Sensible presenters like Christian Bokhove wear t-shirts with penguins on them. He told us that spinach does not contain lodes of iron, that the myth apparently occurred during the Great Decimal Point shift, and that that too is a myth, traceable to the Reader’s Digest, which may or may not be available in the Netherlands. And the moral of his tale was: don’t pretend that you know stuff really well unless you really do; try a little nuance when discussing cognitive psychology on twitter. I will try my best, Christian, but it’s in my nature…

‘Mark Quinn et al’ were giving their talk on practitioner enquiry during session 3. Six of us at the front, going for the prize of Those Most Likely to Outnumber Their Audience. Tom helped us out by scheduling against us Sherrington, Christodoulou, Weston, Jones, Creaby, Davenport, and Hood and Fletcher-Wood… et al. Well, I don’t know how many flocked to the gurus this year but we were very happy with our little turnout. Everyone had a seat, they could join in on the chat, and I could pick out old colleagues Barbara Terziyski and Vivienne Porritt.

There are more people just out of shot.

We were making the case for the gnomes of the research garden, teachers carrying out the sort of micro-research that tests out the grand theories without ever being reported. Nick wanted to know if his year 8 had a growth mindset and, if they did, whether it showed up in achievement and effort data. (They did, and it did not.) Joe, the head of RE, has an ontological interest in creativity: he wanted to know what his students thought about it and where they would like to see more of it. Richard wanted his year 11s to be more reflective about their work, had a hunch that peer feedback would help him get there, and found that it did. The great thing about ResearchED is that it showcases some of the disciplined enquiries that real teachers are conducting, but even if ResearchED did not exist these teachers would still be gnomically enquiring away.

Amanda Spielman finished my day. That’s great because she is passionate about workload, so much so that Ofsted will ask headteachers how they are reducing it. She is also passionate about research and will turn the inspectorate’s attention that way increasingly. I asked her if she would research the impact Ofsted have on workload, and act on the results. I can imagine headteachers replying to Sean Harford’s questionnaire by saying they tell their staff to ignore Ofsted. Ofsted could write a best practice review of all of those schools that ignore them. That would be great, because lots more schools would read Ofsted’s how-to guide to ignoring Ofsted. Spielman might pull her hair out at that unintended consequence. One to watch.

 

 


Research: Learning by Doing


I provoked some minor debate with my Research Home Guard post, suggesting that ‘research ayatollahs’ ought to be more relaxed when more practitioner researchers commandeer their favourite R-word. My point is basic: if I want my colleagues to engage with research, they need to have some experience of also being engaged in it.

What does that mean? Engaging with research is more than just clicking on twitter, or scouting around the EEF site – though I would do nothing to discourage either activity. To be properly engaged with it implies that I can read the findings critically, that I can ask questions about validity, that I have an appreciation of methodology. It also means that I don’t let go of my own professional judgement: I may catch some shining new insight escaping out of an academic hole, but I also have long years of my own experience to call upon. In short, if I am to understand what I am reading, I need also to understand how the knowledge was put together. And that is where school-based, action/practitioner enquiry comes in. Done well (and – I own – it can sometimes be done badly), it can be systematic and rigorous. All I know about evidence-collection, I have learned from doing it and from guiding others to do it. Action researchers learn by doing. In other words, they can read other stuff because they have had to write their own.

My social network analysis (courtesy of Chris Brown).

This map arose out of a survey conducted among my staff, compiled by David Godfrey at UCL-IOE. It revealed that, although I was at the centre of much of what my colleagues perceived as the school’s research culture, I was not alone: our Lead Teacher team was vital to the wider dissemination of ‘what we know’. I wanted to build upon, and to further democratise, this distribution. So, as my homework for the Leading Evidence Informed Practice in Schools course (led by David and Karen Spence-Thomas at UCL-IOE), I proposed the creation of volunteer Research Co-ordinator posts at my school. With the green light from SLT, I advertised and successfully recruited two colleagues who had recently completed excellent school-based MAs with me. (See here.) They are @louleggo7 (our head of Psychology) and @DSaunders1106 (a PE teacher). With @BTerziyski, we have now completed our first meeting. They have taken on a lot, and I need to reconsider the ‘volunteer’ aspect of their job title. As their work proceeds I, and they, will report further. But, for now, here are our plans:

To increase staff engagement with research:

- DS and LL will start ‘research reading’ groups (name tbd), convening possibly monthly to share thoughts on a piece of recently published research. This could be something with a controversial edge, or an enquiry conducted by a colleague at Chace.

- DS and LL will publish a termly ‘research digest’ (name tbd), either synthesised by them or ‘found’ elsewhere. It will be placed on our ChacePD website and flagged in the staff bulletin.

- Either as well as, or instead of, the above, DS and LL will compile a booklet of research to underpin our Development Time focus, e.g. pieces on resilience, growth mindset, collaboration and independence.

- LL and DS will contribute short School Briefing slots on research into pedagogy, school leadership and the educational system.

To increase the Research Coordinators’ personal effectiveness:

- DS and LL will continue to pursue their own research interests (and through this contribute to School Briefing, as described above).

- Through twitter, blogs and publications they will enhance their skills in data collection and their appreciation of how best to engage a school in and with research. Among the commentators and academics they will familiarise themselves with: Chris Brown, Louise Stoll, Gary Jones, Alex Quigley, Tom Sherrington.

To support internal research and review:

- RCs will conduct a review of the impacts of MDT. They will design their evidence-gathering methods at the outset and measure progress towards desired outcomes.

- RCs are keen to work with one or two middle leaders as they write and review their improvement plans, using enquiry questions such as: What do I need to focus on? What might success in this area look like? What evidence can I gather against this? What is my current position? What does research – and my experience – tell me might work? Therefore, what will I do?

- They will assist with the construction of enquiry questions when SLT are conducting reviews and learning walks.

- They will respond to ‘commissions’ from SLT for research findings into particular areas, on an ad hoc basis.

- They will work with NQTs on one action enquiry across their induction year.

To impact on teaching and learning, and on school culture:

- LL and DS are keen to implement next steps from their MA enquiries. They will explore opportunities to work further with interested departments.

To create a repository of items of research interest:

- MQ, CLTs and RCs are increasingly sharing insights and thinkpieces from twitter, blogs and online publications, which we need to store more efficiently.

- We will share using the #ChacePD hashtag, so our PD website can maintain items on its timeline.

- MQ and BT will create space – and place interesting items – on the ChacePD site, under the Research Enquiry at Chace heading.

To support the RC team:

- Request that MQ and the RC team have a common free period on the timetable, to meet to plan and review work.

- MQ to enquire how to remunerate RCs (via TLR or time), as their input could be substantial.

The Research Home Guard

The estimable Alex Quigley’s recent post Just Don’t Call it Research again raised the issue of what should, and should not, be called ‘research’ in schools. I’ve debated the issue myself here and here. Despite his post’s title, Alex seems happy to take Dylan Wiliam’s notion of disciplined inquiry and apply it to the hard work of school improvement. I’m just not sure why the research mavens are so jealous of the word ‘research’. If what we do in schools passes the test of asking interesting questions in a systematic way and making our answers somehow public, wouldn’t Lawrence Stenhouse himself be satisfied?

Full Disclosure: I do a little work for the UCL IOE Research and Development Network, where this blog is also featured. I have been a participant on their Leading Evidence Informed Practice in Schools course, led by Karen Spence-Thomas and David Godfrey. Their working definition of research is much more open:

By ‘research’ we include a broad range of activity that can be loosely defined as ‘systematic enquiry made public’. In other words, it must be consciously planned and involve some collection of data/evidence. Making research ‘public’ need not involve writing the research up formally in an article or report but it must have been shared in some way. This can include a wide range of practitioner or academic research or research and development activity.

They have been sharing some of the insights from their colleague Chris Brown’s Leading the use of research & evidence in schools, which includes the notion of information flow around an institution’s ‘social network’. This was my effort:

Research Social Network at Chace

I am the hairy dot in the middle. It was based partly on my own observations, but mainly on the findings of a staff survey which David is using in a number of schools. I have selected just a few graphs here as indicators of the sorts of things my colleagues think about the idea of research engagement.
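For the curious, the arithmetic behind a map like this is simple. Here is a minimal sketch in Python – the names and edges below are invented for illustration, not our actual survey data – of how an in-degree tally (‘who do you go to to talk about research?’) picks out the central nodes in a staff network:

```python
# Toy social-network sketch: each directed edge (a, b) means
# "a goes to b to talk about research". The in-degree (how often
# someone is named as a go-to person) approximates centrality.
from collections import Counter

edges = [
    ("NQT1", "MQ"), ("NQT2", "MQ"), ("HoD_Sci", "MQ"),
    ("NQT1", "LeadTeacher_A"), ("HoD_En", "LeadTeacher_A"),
    ("HoD_Sci", "LeadTeacher_B"), ("LeadTeacher_A", "MQ"),
]

# Count how many times each person appears as the target of an edge.
in_degree = Counter(dst for _, dst in edges)

# Most-named first: the 'hairy dot in the middle' tops the list.
for person, score in in_degree.most_common():
    print(person, score)
```

A fuller analysis would weight reciprocal ties and cliques, but even this crude count shows how one person can dominate a school’s research traffic.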

The first set of questions were about our culture.

Research Culture at Chace

 

There is pretty clear agreement that the culture of our school supports engaging in and with research for professional development. They also agree that we use and do research for wider school development, and that the school’s leadership is committed to this. Very many could point to specific inquiries that they, or colleagues, had been engaged in. This included Masters level work (see here for recent examples), but also inquiries conducted as part of our NQT induction programme, action research conducted within some departments, and school improvement work done as part of National College programmes. Interestingly, very many decided that our professional development programme (what we call Development Time) also amounted to research engagement, as it often starts with sharing a piece of externally-produced research and urges colleagues to experiment with ideas in their own classrooms. While few purists would accept this as research, it is clear that our staff think of it as such; for many, it will be the principal way in which they interact critically with the research.

So my colleagues are clearly not averse to this business. What would help more of them get into it? The next question revealed that a quarter of them did not know of a named member of staff responsible for promoting research engagement. That person is me. Subsequent questions show that a majority do not know of funding to support research (there is, from the governors), or of training to develop research skills.


This is obviously an area we can work on. I have taken 18 teachers through to completion of their MAs over the past few years, and whipped numerous NQTs through various forms of action enquiry, but clearly several others have not been invited to join the party. There is a capacity issue: if I alone am the ‘research guy’, too many opportunities will be missed.

So, where next? The social network analysis provides me with a potential way forward. Although I placed myself at its centre, I was not alone. The CLT (Lead Teacher) team leads the Development Time which very many saw as their exposure to research. We have a hardy band of volunteer Development Coordinators, who consciously develop strategies to share with their colleagues. My plan is to build upon this habit of discretionary input from colleagues, to mobilise some of the skills that my ‘MA teachers’ have acquired, and muster a small team of Research Coordinators.

A Modest Proposal

And, daring the wrath of the research ayatollahs, we are going to use the word ‘research’ at every opportunity.

MAs Re-Mastered


It is that time of year again. A pasty-faced band of colleagues return from their Christmas holidays having churned out 20,000 words for their MA Education dissertations. I have commented here before on some of the Masters work I have supported in my school. This year I want to share their work a little further (if my blog and twitter followers amount to ‘further’.)

With their kind permission I have reproduced their abstracts here. If you want to read their full reports, or to follow up a query with them, you can either type your email address into the comment box at the bottom of this post, or DM your email via twitter. It would be well worth your while!

 

Exploring the indirect impact of regular early morning football training on academic success: Can sport be used to develop a more academic culture?

By Daniel Saunders

Abstract

Background: There are numerous studies which discuss the benefits being physically active can have on physical and mental wellbeing, as well as on cognitive development, in young people. Equally, there is currently much debate regarding the potential academic benefits of sport, although many studies have struggled to find a causative link. Over the past three academic years, students at our North London mixed comprehensive school have had the opportunity to attend early morning football training sessions. This study seeks to determine whether any academic and non-academic benefits develop as a result. Method: Students who attended over 50% of the sessions were eligible for the study, and the rest of their cohorts were deemed to be control groups. Student progress data were measured over the academic year; form tutors were asked for their opinion of students’ development; students were given questionnaires to determine their enjoyment of school and the perceived impact of the study; parents were interviewed for their opinion of their son’s improvement; and attendance, punctuality and homework completion were measured to determine whether the study had impacted upon students’ academic culture for learning. Conclusion: Students who participate in early morning football training achieve results comparable with their respective cohorts. Form tutors and parents made numerous suggestions that lead the researcher to believe that students who participated in the study developed significantly more positive attitudes towards school. Punctuality of students involved in the study was significantly better on days when early morning sessions were available than on non-training days. The data obtained suggest that homework completion may be beginning to improve among those in the research sample who have been attending early morning training sessions for a longer period, suggesting a dose–response relationship which is yet to be understood.

 

How does a Mastery Learning teaching method compare to a responsive teaching method actively developed over time? 

By Darren Glyde

Abstract

This paper analyses in detail the Mastery Learning teaching approach and compares it to a tried and tested technique developed by a colleague over time. The two methods were applied to two GCSE Art classes, each class taught by one method. The data collected show that both methods of teaching are able to raise attainment within the four core skills being investigated. The findings demonstrate that over a short period a Mastery approach to delivering skills-based learning is effective; however, in the long term the tried and tested method of teaching produced higher levels of attainment overall.

 

An action enquiry to assess the effectiveness of a deconstruction model of teaching when applied to exam style reading-questions in year 8 English lessons, with a focus on: How far it can aid understanding of the question and whether it provides structural support when writing an independent response to a question.

By Megan Clarke

Abstract

This study explored the effectiveness of a deconstruction model when applied to exam-style reading questions in year 8 English lessons. It focused on the extent to which it aided students’ understanding of a question and whether it provided a structural support when students wrote independent responses to the question.

Data analysis revealed that implementing a model which focused on vocabulary, the deconstruction and reconstruction of a question, and student-led scaffolding was beneficial in affective terms and promoted self-efficacy in students as learners, including independence. The results showed that the specific areas of the model were successful in strengthening the breadth of responses to questions when delivered in a classroom environment, although the small scale of the research would necessitate further quantitative and qualitative data collection and analysis in order to generalise and validate the use of the model further.

 

An Investigation into the Effectiveness of Distributed Practice and Practice Testing Revision Strategies on Students’ Learning.

By Louise Legg

Abstract

Students often struggle to complete revision in preparation for important and final exams, and teachers are sometimes uncertain of how best to support students at this time. Experts in the field of cognitive psychology propose two revision strategies as being the most effective in improving students’ learning, namely distributed practice and practice testing. This research tested the effectiveness of these strategies in improving students’ learning in A-Level Psychology lessons. An action research approach was employed and a variety of techniques used to test the effectiveness of these strategies over the course of several months. Findings suggest that practice testing and distributed practice can enhance students’ learning and lead to improved performance in assessments. It was concluded that embedding these strategies within the teaching of subjects across the school, and from an early age, could lead to an improvement in students’ ability to prepare for exams and in their overall academic achievement.

Action Research: More than a Hobby

For Tom Bennett (@tombennett71), ResearchEd2015 turned out to be all lasagne and toilet rolls. I’m sure this was not a comment on the catering. His was a unique perspective and, if he took the opportunity to sit back and survey the landscape, then he deserved the respite for this was a wondrous landscape he had created.

It had certain must-see features. For anyone who found their way to Finchley Road station via Twitter (and, really, was there any other way?) surely the likes of John Tomsett (@johntomsett), Tom Sherrington (@headguruteacher), David Didau (@LearningSpy) and Sam Freedman (@samfr) were the main draw. These are among those to whom Nick Gibb was referring when he spoke of how teachers have the potential to affect policy. They would have provided the meat in the conference lasagne for many a conference-goer. But, being a quiche-eater, I had a conference experience that was a touch more marginal: no more adventurous and no less fulfilling, just not what everyone would have chosen. But being ResearchEd, it was impossible not to slip into talks delivered by what Tom B labelled somewhere ‘the illuminati.’

This was not my first. Not a veteran, not a neonate: I suppose I may be a ResearchEd toddler. I took my first steps at Dulwich College in 2013. Much has changed, it seems to me, between #reEd13 and #reEd15. Then, many of us needed to be told what the Education Endowment Foundation toolkit was; now it can be the butt of barbed references to its generous funding, and we all nod sagely. At South Hampstead High School, no one needed a primer on Lesson Study. We have come such a distance, and have done such a lot to dismantle the walls dividing our educational estates, that the promises of Connect-Ed to put us all in touch with each other seem quaintly out of time. But, for all this advance, the issue which dominated for me two years ago still lingers: what place does teacher action enquiry have in the research garden? Is it the well-groomed lawn, the pride and joy; or is it rather the embarrassing collection of gnomes, pointless but stubbornly present?

Becky Allen (@drbeckyallen) invited us to hack our own teacher-researcher career: like a domestic goddess in a business suit, we too could have it all. What was not available to her ‘in her day’, was to us on Twitterday. How good we were feeling… until the first slide.

“Almost all teachers should never do educational research.”

Like Dr Husbands in 2013, her point was that ‘piffling around’ with classroom enquiries did not a research study make. Do it for pleasure – or PD – but don’t pretend you are adding anything meaningful to the knowledge base. Yet, though most of us action researchers should leave well alone, ‘Education research needs practising teachers on the team.’ They need us to make up the numbers in their big projects. They. Us. Their. The hacking of her title seems to apply only to the few who can ‘create an identity’ for themselves online, build a team, get a summer job with someone reputable such as Education Datalab, then find a sugar daddy to fund us while we go part-time at school. No one quite asked, but why would we want to do all that, when the ‘pleasures’ of piffling around with action don’t-call-it-research are so enticing?

Because professional educational researchers are doing such a fine job. As the excellent Alex Quigley (@HuntingEnglish) reminded us, someone like John Bohannon can fool even reputable media outlets (not just the Daily Star) into reporting that eating chocolate helps you lose weight. And, at the end of the day, Professor Robert Coe (@ProfCoe) drew attention to the flaws in the ‘Screen time drags down your grades’ story. Both gave valuable advice, not so much to teacher-researchers as to teacher-consumers-of-research. Coe departed from Dylan Wiliam when he said that a research-engaged teacher can be happy with basing just ‘some’ of her decisions upon the evidence. Like Quigley, he offered several tips for the teacher hoping to cut through the ‘BS’ (perhaps the same substance coyly referred to by our headmistress host at the end). But Coe seemed to reach out to the action research fraternity, in answer to a question from James Mannion (@pedagog_machine): yes, teachers should monitor and review the messages passed down to them by the experts, by themselves trialling the ideas and methods systematically in their own classrooms.

I know that much practitioner research can be dodgy. It may lack validity in terms of its wider applications; and those of us in school feel more queasy than those who are not about setting up a control group and deliberately denying them the fairy dust we are sprinkling on our treatment group. Like the joyously upbeat Nick Martin and Clare Hood at the Samuel Whitbread school, where every teacher is in a research lesson study triad, we cannot always reliably locate the factor which truly caused the progress. Our sample sizes may be tiny, and the claims we make for the size of our effects may be overblown. Action researchers are hobbyists, and professionals may feel a little embarrassed in our company.

They should get over themselves.