What we do

PERG Learning provides planning, evaluation and applied research services in support of quality educational programs in formal settings (early childhood, PK-12 schools and higher education) and informal contexts (museums, science centers, out-of-school and community programs). We have worked with hundreds of organizations over the decades on projects in several broad areas, primarily STEM, arts integration, and support for students with children in higher education. PERG’s projects range in scale from small developmental projects to national and international partnerships, with funding from federal and state agencies and foundations.

    • Evaluation
    • Applied research and development
    • Planning and technical assistance

Evaluation

Our projects are primarily evaluations of grant-funded innovations. When people think of the role of an evaluator, they often conjure up the image of a trench-coated investigator who keeps a distance from the action or who sums up the results of a project at the end. Our stance is both objective (as far as that is possible) and supportive. We are known for our collaborative, formative work that seeks to understand what is happening in relation to questions that project designers and leaders care about, and that supports the refinement of project design and implementation along the way toward strong outcomes.

Depending on the particular project, our role may be framed as evaluators, critical friends, design-based research partners or consultants. But across all projects and partners, we see evaluation as a process of learning not just about what happens, but about why and how things work – or don’t. This is what keeps our work so deeply interesting, and what, we hope, benefits our clients and partners in understanding and tuning their own work with learners toward equitable access, experiences and outcomes.

Our evaluations are usually, but not always, tied to grant-funded projects. We work collaboratively with clients from the proposal stage and throughout the life of the project, collecting and analyzing data to provide feedback for improving the project’s design and implementation and, ultimately, supporting strong outcomes.

Project proposal
Ideally, we start working with the project team as they develop their proposal. We meet to discuss the proposal, the program and its intended outcomes, and develop an evaluation plan customized to the program’s goals, budget and the funder’s requirements. Being involved from the beginning allows us to get started right away when the project is funded. We are always available to discuss proposals, and do not bill clients unless the project is funded.

Evaluation Plan
The heart of the evaluation plan is the logic model – a map of what your program will accomplish and how. We collaborate with the team to draft a logic model and then use it as the basis for fleshing out the evaluation plan; the logic model also helps guide the proposal itself, ensuring its coherence. The evaluation plan specifies evaluation questions (such as “How do participants describe their identity as scientists?”) and evaluation activities that answer those questions (such as interviews or focus groups with participants at the program site). A matrix linking each question to its data sources helps ensure validity and reliability. The evaluation plan may be refined over time to fit evolving program needs, and most plans include an annual evaluation report as well as formative feedback along the way.

Applied research and development

Sometimes our role is to investigate a question or questions, often within a project, rather than to focus on the design and implementation of the project itself. Examples include a study of youths’ use of drawing and digital photography to learn about natural organisms, and a study of programs to support students with children in colleges across the country. The approach we use for research studies is parallel to that of evaluations in terms of design and methods.  

We also take on research and development projects. Two recent examples are the development and testing of the Family Friendly Campus Toolkit, which grew out of our research on programs for students with children; and a tool for assessing youths’ attitudes toward data science.

Planning and technical assistance 

PERG offers its expertise to organizations and programs outside of formal evaluation and research projects. 

    • Customized evaluation planning and capacity-building workshops for organizations that want to develop staff skills and organizational systems for internal evaluation
    • Logic mapping and program design
    • Design of instruments or mapping systems
    • Technical assistance for small and large needs (from conducting focus groups, to reviewing surveys, to leading professional development that builds staff capacity for internal evaluation)