Throughout its 25-year history, EDC’s Center for Children and Technology (CCT) has worked to strike a balance between promoting the potential of new technologies to significantly improve public education and respecting the traditional knowledge and culture of public schools and classroom teachers. This attention to local relevance is not limited to CCT, however—it’s a vital part of EDC’s work in the education and health fields. EDC staff approach every research project as a genuine collaboration between staff researchers and school personnel. Rather than importing knowledge into a district, we conceive of our work as a process of simultaneous learning alongside our school-based partners. We believe we have something to contribute to school improvement efforts, but we also know that we have much to learn from the process of immersion and collaboration with school and district staff. So while this article discusses CCT’s specific research experience, it really illustrates the approach and attitude of the many centers across EDC.
Getting to know schools on their own terms has helped CCT, in particular, forge effective collaborations with a wide range of school districts. We understand that educational technology and school reform programs, whether national reform initiatives or small pilot studies, take place in local contexts driven by complex social, political, and economic demands. No matter how educationally sound a practice or model of technology use is, when it is introduced into a new school, that practice or model changes to fit the particular needs, strengths, and structures of an individual school, classroom, or teacher. Often, this intersection of intervention and local adaptation releases new teaching and learning opportunities, in a process not unlike fusion. It also yields crucial insights into the nature of school improvement and reform.
Our evaluations look broadly and systematically at how technological interventions are integrated into, and altered by, the particularities of individual districts, schools, classrooms, and teachers and their widely varying circumstances and goals.
Often, this partnership-based research yields important research findings and, consequently, encourages us to rethink our basic suppositions. For instance, we recently conducted a two-year independent study of the implementation of a Web-based test data reporting system, called the Grow Report®, in the New York City school system. The Grow Report provides an interface to the state and city testing results by organizing raw data into information that is aligned with New York State standards. For example, a sixth-grade math teacher in the 2003–2004 school year would have access to a customized report organized around three questions: (1) How did my students do?, (2) What do they need to learn?, and (3) What tools are on the Web? The report summarizes and analyzes the data and identifies “class priorities.”
We began our research with a hypothesis based on the current literature on data-driven decision-making in school districts, which contends that (1) teachers lack a psychometric understanding of the data and (2) appropriate data use has to start with a clear vision of data-driven decision-making among the upper leadership. The data we uncovered in this project shifted our thinking about what teachers know about data and became the intellectual basis of much of our current work in data-driven decision-making. We now realize that though both of these claims from the literature may be true, they do not adequately capture the reality of data in the classroom environment.
Our initial interviews with board and district personnel uncovered their very vague sense of the Grow Report and its role in the school system. Administrators couldn’t really describe or construct examples of how the Grow Report should work in the schools. Based on the literature, this suggested that the teachers would have even less of an understanding. “What we found was almost the reverse,” says senior researcher Daniel Light. “Teachers had a much more grounded and balanced understanding of how the test scores reported through Grow [Report] might or might not fit into their current teaching practices. They did not use the correct psychometric vocabulary, but they clearly used the same concepts of reliability and validity.”
Our research approach was critical to making this finding. Had we interviewed teachers in a more traditional fashion, asking them to list their undergraduate courses in testing and measurement or to define the psychometric terminology behind assessments, we might not have come to this realization. Teachers lacked knowledge of the terms but understood the practical aspects of data-driven decision-making. Instead of targeting what they didn’t know, however, we targeted what they did know and would use in their classroom practice. We brought a sample Grow Report to our teacher interviews and then asked teachers to talk about how they would use the report with their students. This research approach gave teachers the opportunity to unpack their understanding of data—not just standardized test data but also classroom assessment data—and describe how it informs their teaching practice. Though the teachers’ grasp of testing terminology was limited, their understanding of data was surprisingly sophisticated. By selecting methods that allowed teachers to express their understanding of data in a way that was meaningful to them, researchers were able to capture valuable information. Survey work or structured interviews where teachers are asked to explain what they know about test concepts might not have yielded the same data. As a result of this research, we now realize that the challenge is not to give teachers stronger psychometric knowledge but to create data tools that present reliable and valid information in a format that is instructionally relevant to teachers.
Conducting research in partnership with local schools or districts means staying attuned to their priorities. Our evaluations need to generate evidence that is relevant and useful to multiple stakeholders, including teachers, administrators, corporate partners, and policymakers, while maintaining high standards of rigor. Working with local educators as partners means co-constructing questions, methods, and goals with school personnel, defining the research so that it is both practical and useful to them and contributes to a broader theoretical understanding in the field.
For example, although an initial objective of our long-term collaboration with the Union City (New Jersey) public school district was to investigate and understand how technology changes school culture and student-teacher relationships, we ultimately broadened our data gathering to include a larger focus on standardized test data in order to accommodate the needs of the district. While moving the school culture toward a more integrated, student-centered approach, and documenting this transformation, was central to the district’s reform efforts and our research goals, the schools still had a bottom line to meet: improving students’ test performance. We had to be responsive to the district’s imperatives while not losing sight of our own research agenda.
School-based partnerships also require flexibility of methods. As part of our Union City work, we enlisted current and former students as researchers to expand the type and depth of research data we collected. In one instance, we asked recent high school graduates to interview current eleventh graders about their plans for the future, and we found that making them partners in our research had advantages. Students showed a greater willingness to share their fears and doubts about the future with peers than with adults. Students also felt more comfortable describing their teachers, including their own opinions and their sense of undercurrents, counter-tensions, and issues of racism, with former students than with adults. The participation of former students as researchers yielded a richer body of evidence that deepened our understanding of how the school system’s reforms had affected them.
Key Elements of Locally Relevant Research
- Collaborations with teachers and administrators are at the core of the process of defining research and innovation.
- The research grows out of felt needs and important challenges that districts and schools are facing.
- The collaboration sets goals that are practical, generalizable, helpful to the immediate community, and informative to the larger community.
- As researchers, we benefit from the process by being consistently exposed to and challenged by the complexities of real school situations.
- Educators benefit by gaining experience in using reflective, critical lenses on their own experiences to learn about the strengths and weaknesses of their current practices and to identify paths toward successful change.
Originally published on June 1, 2006