Over the 20-year history of community technology centers (CTCs), impact has tended to be measured in one way: Is anybody here? CTCs were established to provide technology access—and by extension, new opportunities for learning and skill development—to people who didn’t have computers at home or at work. However, bringing technology to disadvantaged communities proved much easier than bringing people from the community into the center. From the beginning, the challenge has been creating a center that people would return to over and over again.
“I remember visiting this fabulous CTC in a Latino neighborhood in L.A.,” recalls Laura Breeden, director of the America Connects Consortium (ACC) based at EDC’s Center for Education, Employment, and Community. “It was obviously a very successful center, very busy and offering lots of valuable and rigorous programs. It was run by a nun who had been a school principal, and I asked her how she evaluated the impact of the center. She said, ‘I believe feet back is feedback.’”
Today, CTCs are under increasing pressure to document more than just foot traffic. They need to know who is coming, who is returning, and who is leaving for good; why people are coming (or not); and what they’re accomplishing while in the center. “CTC staff realize that in order to sustain their programs and attract new funding, they need to present real data showing the impact they’ve had on people in the community,” says Breeden.
The U.S. Department of Education, which funds ACC, has made research and evaluation the top priority for the consortium. Designing tools to evaluate impact and learning outcomes in CTCs is a unique challenge, given the diversity of centers and the clients they serve—from elementary school children to adult learners who are building literacy and workplace skills. In response, ACC has developed a set of tools tailored to the needs of each center; on a deeper level, ACC is working to promote the concept of evaluation as an essential management tool.
“Too many people view evaluation as something you do at the end of a program to see if it was a ‘good’ program,” observes Breeden. “Evaluation should be embedded in the program from the outset. It’s a planning tool and a program development tool. It helps you to establish goals and to track progress toward reaching your goals.
“For example, some directors run their CTC as if there is a fixed number of classes that they need to offer,” Breeden continues. “There is no one solution for all centers; you need to evaluate the whole roster of classes to see what your clients need. Are you offering Excel classes because you think you should, even though no one in your community is taking them?”
The evaluation tools developed by ACC grow out of a pioneering evaluation of the impact of CTCs, conducted by EDC in the late 1990s. “That study—and a handful of others—established the model for assessing the effects CTCs were having on people in terms of employment, education, and personal growth,” says Breeden. “Now ACC is helping centers conduct that sort of research on their own programs.” In fact, ACC’s evaluation tools include two that were developed by CTCs under ACC’s Field Innovation Grants.
ACC tools allow CTCs to monitor progress on a number of fronts:
- Self-Evaluation and Benchmarking Toolkit: The toolkit, developed by ACC, is a database specifically designed for use by CTCs. Centers can access the database via the ACC Web site and use it to track everything from their equipment inventory to traffic in the center to client evaluations of their programs. A key feature of the database is that it allows centers to track their progress over time and in relation to their goals. (A brief illustrative sketch of this kind of record-keeping follows this list.)
- CTC Database Management System: Fast Forward Neighborhood Technology Center in Columbia, South Carolina, developed this database system through a Field Innovation Grant from ACC. It allows CTCs to capture and manipulate data in more depth than the toolkit does, adding scheduling and course-tracking features and the ability to track evaluations by groups of clients.
- Online Course on Evaluation Design: Through another grant from ACC, Charles Plummer at the Rochester (New York) Museum & Science Center developed this self-paced course, which provides a “big picture” perspective on program design and evaluation. The course includes modules on rationale, goal setting, and data analysis, along with a simulation of a sample CTC dealing with evaluation issues.
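To make the kind of record-keeping these tools support more concrete, here is a minimal, purely illustrative sketch in Python. It is not based on the ACC toolkit or the Fast Forward system; every name in it (Visit, CenterLog, and so on) is hypothetical. It simply shows how logging individual visits lets a center derive the measures discussed above: monthly traffic, returning clients, and per-class enrollment.

```python
from collections import Counter
from dataclasses import dataclass, field
from datetime import date

# Hypothetical data model for a CTC tracking database.
# Names and structure are illustrative only.

@dataclass
class Visit:
    client_id: str
    visit_date: date
    program: str          # e.g., "Excel basics", "GED prep", "open lab"

@dataclass
class CenterLog:
    visits: list[Visit] = field(default_factory=list)

    def record(self, client_id: str, when: date, program: str) -> None:
        self.visits.append(Visit(client_id, when, program))

    def monthly_traffic(self) -> Counter:
        # Raw foot traffic per month -- the "feet back" measure.
        return Counter(v.visit_date.strftime("%Y-%m") for v in self.visits)

    def returning_clients(self) -> set[str]:
        # Clients who came back after a first visit (retention).
        counts = Counter(v.client_id for v in self.visits)
        return {c for c, n in counts.items() if n > 1}

    def enrollment_by_program(self) -> dict[str, int]:
        # Distinct clients per program -- flags classes no one is taking.
        by_program: dict[str, set[str]] = {}
        for v in self.visits:
            by_program.setdefault(v.program, set()).add(v.client_id)
        return {p: len(clients) for p, clients in by_program.items()}

# Example: three logged visits answer "who is coming, who is returning,
# and which classes are actually drawing clients."
log = CenterLog()
log.record("c001", date(2003, 3, 4), "Excel basics")
log.record("c001", date(2003, 3, 11), "Excel basics")
log.record("c002", date(2003, 3, 5), "open lab")
print(log.monthly_traffic())        # Counter({'2003-03': 3})
print(log.returning_clients())      # {'c001'}
print(log.enrollment_by_program())  # {'Excel basics': 1, 'open lab': 1}
```

Counts like these cover only the quantitative side; as Breeden notes below, they still need to be paired with interviews and questionnaires to capture attitudes and growing confidence.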
Taken together, these tools provide CTCs with a variety of strategies for tracking their impact, which is essential, according to Breeden. “You have to use several different measures,” she comments. “For example, through interviews and questionnaires, you can get at things like impressions and attitudes. Of course, that kind of information is subjective, but it can yield valuable insight into some critical issues—such as whether the learners are gaining confidence in their abilities. In the end, you need a balance of qualitative and quantitative data.”
Originally published on September 1, 2003