Schools must be constantly evolving if they are to meet the challenges facing our educational system. School development is a complex process, shaped by the varied perspectives of the parties involved. These include educators (administrators and subject-specific teachers) as well as parents, students, social workers, researchers, and other external individuals.

StEG Tandem, a subproject of the Study on the Development of All-Day Schools (StEG), is an example of a school development process. With the support of scientific experts, it was implemented at five schools in Germany.

In response to evidence that homework supervision at many German all-day schools is problematic and rarely offers students individual support, my colleagues and I, in collaboration with the participating schools, devised a plan to enlist older students to help younger children with their homework. This approach, known as cross-age peer mentoring, was intended to improve homework support.

Exactly how the plan was to be carried out was left up to each school. Meetings were held to discuss the project’s implementation, with researchers present to provide advice. To determine the effects of cross-age peer mentoring, we gathered data on English and math achievement, the quality of the children’s relationships with their mentors, and the prevailing learning environment during homework supervision.

How do practitioners respond to scientific evidence?

Many academics complain that practitioners pay little attention to insights from research projects and scientific experiments. The StEG Tandem project examined communication between theoreticians and practitioners by looking at the process of school development. It was evident that the two sides do not agree on what is relevant.

Initial analyses of the project’s conceptual phase revealed competition between the insights of researchers and practitioners. During these early meetings, the researchers focused particular attention on the guidelines that had been drawn up for the schools, which contained information about the effectiveness of peer mentoring and the necessary conditions for its success. The educators viewed these findings through the lens of their practical experience, and generally reached the conclusion that “things don’t really work like that.”

Reasons why “things don’t work” include organizational factors and school culture; the school’s faculty might be opposed to innovation, for example.

The participating educators rarely responded to the information offered by the researchers, instead sharing their own ideas and interpreting scientific findings in the light of their own practical experiences. The researchers’ findings were therefore placed into a different context, which robbed them of their “scientific” character.

How schools responded to the project’s results

During both the conceptual and the implementation phase, students were asked about their perceptions of homework supervision. We also tested their academic achievement. The aim was to identify the effects of the project: Are students who do their homework under the supervision of older students more satisfied than students from the previous year, who were supervised only by teachers? Do they receive more subject-specific support? Do they perform better in English and mathematics?

It turns out that students who are mentored by older students are just as satisfied as those supervised by teachers. They also report that the older students are “good at explaining things.” At some schools, they performed somewhat better in English than the group from the previous year. This did not hold true for mathematics, however. Indeed, there was a slight decline in math performance at some schools. The students also observed that the rooms where they did their homework were noisier than the previous year.

My colleagues and I offered to provide a more detailed explanation of our results to help the schools continue the project on their own, but they showed very little interest. A presentation of the study’s results failed to elicit further questions or discussion. One school did not even respond to our offer to share our results.

Does this lack of interest suggest that the schools have no desire to continue the project? Surprisingly, this is not the case.

Different priorities

Academics and practitioners have very different priorities. Researchers want to identify positive effects, such as correlations with student achievement, and to publish their results. Practitioners (the schools) are mainly interested in continuing the project.

However, they want to do so only under certain conditions. Their focus is not on finding measurable effects, but on implementing the project smoothly, within the context of existing school structures. To them, the relevant questions are: Are there problems in the project’s implementation? Is additional effort required (even after the initial conceptual year)? Are teachers resistant? To answer these questions, we looked at the subjective views of the practitioners and their understanding of this project.

What do our results tell us about communication between researchers and practitioners?

Successful collaboration requires both sides to recognize that they are operating from different perspectives. Researchers must acknowledge that practitioners, too, have insights to offer, and they should not dismiss that knowledge as inferior. They also need to abandon the idea that they can “control” schools, based on the evidence, and force them to adopt methods that have been tested in “laboratory” settings.

We found that a certain accommodation was achieved in the course of the project – in some cases, one side succeeded briefly in adopting the perspective of the other, but not until several meetings had already taken place. Dialogue requires continuity and time. It is then possible to come to an understanding of the other perspective, and to apply scientific findings to the school context in dialogue with practitioners. This is evident from the fact that all five schools are continuing the project.

Footnotes

The StEG Tandem project is a component of the Study on the Development of All-Day Schools (StEG) and is funded by the German Federal Ministry of Education and Research.

Participating research associates: Brigitte Brisson, Julia Dohrmann, Katrin Heyl, Jana Herr, Markus N. Sauerwein and Desiree Theis.

Information about the latest publications can be found on the website of the StEG Tandem project.

