May 27, 2014

A Renewed Focus on Data

EDC is helping Head Start programs around the country apply the data that they collect

As part of their efforts to help prepare at-risk children for school, the nation’s Head Start programs gather data. And it’s a lot of data—about the children, their health, their families, and the communities where they live.

But little of this data, which could show a program’s impact or guide programmatic decisions, is actually used by program directors and managers. It’s a missed opportunity, according to EDC’s Patricia Fahey, principal investigator for the Office of Head Start’s National Center on Program Management and Fiscal Operations (PMFO).

“We have found that programs frequently collect the data only because Head Start requires them to do it,” she says. “They don’t often look to the data as a source of information about the services they can offer.”

PMFO, which is based at EDC, is supporting the Office of Head Start’s efforts to educate program managers by developing two interactive, online courses that lay the groundwork for better integration of data and programmatic decision making. Stacy Dimino, director of PMFO, says that the courses have already led to a shift in how program staff think about the impact they’re having in their communities.

“Programs are starting to switch to a mind-set that builds data-based decision making into the fabric of their operations,” she says. “It’s all pretty new to them. They’ve never had to focus on the impact they were having, and the Office of Head Start is asking for that.”

“The expectation to use data has always been there,” adds Fahey. “Our task now is really to help people rise to these expectations.”

Changing perspectives

In 2013, PMFO staff introduced the courses to more than 3,500 Head Start staff. And more than 13,000 individuals have visited the course webpage since it launched last year.

According to the course website, the first course, titled Creating a Culture that Embraces Data, aims to “help everyone in the program—from bus drivers and cooks to managers and governing board members—become enthusiastic about using data and to base decisions on facts rather than hunches.” It was developed as an interactive learning module and uses realistic scenarios to reinforce key skills.

The second course, Digging into Data, explores the statistical techniques that directors and staff can use to extract meaning from their data sets. It also emphasizes the importance of examining a variety of data sources to better understand complicated issues.

The PMFO courses also seek to address one of the biggest reasons why Head Start program staff do not use the data they have: analyzing data can be intimidating, especially for staff who often do not have any formal experience in program management or statistical analysis.

“People sometimes get nervous when the data reveals a surprise—especially if the surprise points to a need for a change,” says Fahey. “So rather than using the data to find ways to improve staff capacity or service delivery, they may be tempted to ignore the data and just keep doing what they have been doing.”

Positive outcomes

Evidence of change can be seen in Southern Oregon Head Start (SOHS), a program serving 1,141 children from primarily rural areas in the state. SOHS Director Nancy Nordyke says the program began applying data to its work in a more meaningful way within the past two years, and trainings from PMFO have helped staff continue that transition.

“The trainings enhanced our ability to look at data in an efficient way, and to use it in a way that’s meaningful as well,” says Nordyke.

For example, the program created a weekly monitoring report that all SOHS centers submit. The report collects programmatic data, such as total enrollment, the number of open slots, and the number of children on the waiting list. It also collects child assessment data and data on children's health and well-being, including records of family service work, routine doctor visits, and dental exams. The results are sent to Nordyke, who then has information about all of her centers, and all of the children in her care, at her fingertips. The report is distributed to the entire agency so that monitoring and follow-up happen promptly.

“Creating a uniform reporting system has enabled us to get data on who we need to follow up with,” she says. “We have developed it into a tool that is much more uniform across the board for us.”

This uniformity is important. Prior to the weekly monitoring report, each center collected its own data—making it difficult for Nordyke to see what was going on throughout her region, especially when the types of data that each center was collecting were not consistent.

Now, data has become part of a cycle that informs practice in all aspects of SOHS, from providing family services to making sure that children are meeting academic and social benchmarks. This new focus has had positive results, says Nordyke.

“Now, we’re really trying to focus on ways that families can be engaged and on activities for families that will help their children,” she says. “We intend to use data to monitor our progress and help us determine the effectiveness of our efforts.”