by Anjanette Delgado,
Guest Blogger for MIP
Journalists don't, as a rule, have a specific impact in mind when we begin our journalism. We list project goals, but beyond raising awareness of an issue or event, we do not identify what we'd like to see happen next.
This is a story about the one time we did.
Rockland County, just northwest of Manhattan, is one of New York's fastest-growing counties. Neighborhoods that used to feel suburban and bucolic now choke with high-density sprawl: multi-family homes rise in backyards next door, traffic clogs the roads and religious schools pop up on residential streets.
“A generation ago, there were few problems between Ramapo's small ultra-religious Jewish communities and the gentiles and other Jews who made up the bulk of the town's population. Things have changed. As the ultra-religious community has grown, Ramapo has become a flashpoint in a continuing conflict over what it means to live in the suburbs. … The conditions fueling that conflict are now threatening to spread beyond Ramapo's borders. Surrounding communities have taken notice, and they are adopting measures aimed at heading off the strife that has become the norm for their municipal neighbor. No place, it seems, wants to become ‘the next Ramapo.’”
People in Rockland County, one of two counties we cover at The Journal News and lohud.com, argue over this. Things get ugly. Reader comments with the stories we write turn hostile and border on anti-Semitism. No matter what the topic, the conversation inevitably devolves into “us” vs. “them.”
But here’s the real problem, and it’s not religion: Ramapo’s loose zoning, lax enforcement of fire and building codes and largely unchecked development put everyone at risk.
We wanted to know: Do people understand what’s really happening in Ramapo? If we were to explain the issue — lay it out in a longform story, year by year and decision by decision — could we all (or most of us) finally agree? Could we change minds? If we could achieve consensus, would we stop arguing religion and start attacking the real problems?
So, after years of having the same tired, angry, circular discussion, we set out — in one project — to change the conversation.
We researched the story over several months. We talked to longtime residents, members of the Jewish community, government officials and the county’s Fire and Emergency Services coordinator. We wrote and rewrote, shot video and mapped the region.
“Ramapo nears breaking point” is a longform piece of explanatory journalism that laid out all of the problems that began decades ago.
At the end of our story, we included a short survey to measure the impact of our explanatory journalism — to find out if we successfully proved that the issues in Ramapo are a lack of zoning and safety code enforcement, not religion. (We considered asking for opinion at the beginning of the story to gauge change at the end, but felt that would inhibit readership and we could accomplish what we needed with one survey at the end.)
The survey was four questions, beginning with two easy Likert-scale queries:
Then we expected a steep drop-off in answers to the final two questions because the first takes time and the second raises privacy concerns:
We built the survey using PollDaddy because it prevents repeat responses and provides useful back-end data, including geolocation, a time/date stamp and referrer; it also assigns each respondent a unique PollDaddy ID. This helped us connect each respondent’s answers from question 1 to question 2, and so on, and get a more complete picture of their response.
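Because each answer is keyed to a respondent’s unique ID, stitching a respondent’s answers back together is essentially a group-by. A minimal sketch in Python, using an invented export format (the field names here are illustrative, not PollDaddy’s actual schema):

```python
from collections import defaultdict

# Hypothetical export: one record per (respondent, question) pair.
# Real PollDaddy exports differ; these field names are illustrative only.
records = [
    {"respondent_id": "pd-1001", "question": 1, "answer": "Strongly agree"},
    {"respondent_id": "pd-1001", "question": 2, "answer": "Yes"},
    {"respondent_id": "pd-1002", "question": 1, "answer": "Disagree"},
]

# Group every answer under its respondent ID to rebuild complete responses.
by_respondent = defaultdict(dict)
for rec in records:
    by_respondent[rec["respondent_id"]][rec["question"]] = rec["answer"]

for rid, answers in sorted(by_respondent.items()):
    print(rid, answers)
```

Grouping on the unique ID is what lets a newsroom see each respondent’s full set of answers as one record rather than four disconnected data points.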
We embedded the survey at the bottom of the longform story. For two days, that was the only place you could find the survey; we didn’t embed it elsewhere or share it on social media. After two days we added it to our editorial, “Ramapo’s shoddy governance is by design.”
Surprisingly, nearly 900 people responded. Here’s what they said:
Do people understand what’s really happening in Ramapo? Yes. Most (67%) either agreed or strongly agreed with our explanation of the issues, with “strongly agree” ranking the highest (54%).
Could we change minds? Yes. Of those 67%, 14% said our reporting convinced them, while 35% said they agreed beforehand.
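The agreement figure above is just a tally across the Likert categories. As a rough sketch with made-up counts (not the survey’s actual raw data), computing the share who agreed or strongly agreed looks like this:

```python
# Illustrative Likert tallies -- invented counts, NOT the survey's real data.
tallies = {
    "Strongly agree": 486,
    "Agree": 117,
    "Neutral": 90,
    "Disagree": 100,
    "Strongly disagree": 107,
}

total = sum(tallies.values())  # 900 respondents in this made-up example
agree_share = (tallies["Strongly agree"] + tallies["Agree"]) / total
print(f"Agreed or strongly agreed: {agree_share:.0%}")
```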
Finally, we believed we and the community could stop having the same conversation and instead move forward toward solutions.
Emails. Almost a third — 29% — gave us their email address for follow-up.
Sources and commentary. Even more surprising than the total number of respondents was the percentage of them — 38% — who also took the time to write about how Ramapo’s issues have affected them. Hearing from homeowners, people who have moved away, educators, fire safety officials and people who say they were denied housing when they tried to move in gave us a list of future story ideas and sources.
A few wrote to thank us for our reporting, for exposing the issues and doing so in such an unbiased way. A couple said we should have written this years ago. One called it “conversation-framing analysis” and others called for more discussion on the topic. One response shows how we raised awareness and encouraged civic involvement:
“I live in a neighboring town and am concerned about Ramapo’s issues becoming issues here, too. So I will be paying more attention and doing what I can to ensure that we do not fall victim to the same lack of zoning and code enforcement that has plagued Ramapo.”
What we learned
Anjanette Delgado is the digital director and head of audience for lohud.com and poughkeepsiejournal.com, part of the USA Today Network. Email: email@example.com, Twitter: @anjdelgado. Special thanks to Lindsay Green-Barber at the Center for Investigative Reporting, who helped with survey methodology.
Erica Watson-Currie, PhD: November 2017
At the Media Impact Project, our mission is to understand the effects of media on viewers. We also strive to apply acquired knowledge to projects that serve the social good, and to be a thought leader on research issues in our field. This means we assume two distinctly different roles: the “critical friend” of the evaluator and the impartiality of the researcher.
By evaluating the work of individual documentarians, journalists, and other media makers, we help find ways to improve engagement opportunities with audiences on a micro level. In studying the general effects of media on society, we contribute to the greater body of research on this topic that touches us all. These are important distinctions to make: evaluations of specific programs or films provide us with information about the whats (audience demographics, how audiences responded, what actions they took and whether their behaviors changed), while research on that same show can help us understand the whys of media impact (what it was about the program that sparked the response, spurred the action, or shifted behaviors).
Myriad scholars have weighed in on the distinctions between evaluation and research. Michael Scriven's oft-referenced explanation of the differences between the two disciplines posits that although both practices apply social science tools to conduct empirical investigations and analyze data, evaluators do so to assess the value of what is being examined, with an eye to whether predetermined standards are met, whereas researchers collect data to test hypotheses and reach conclusions based on "factual results." Into this fray we contribute our own nine distinctions as they relate to media evaluation projects and our overarching research program here at the USC Annenberg Norman Lear Center’s Media Impact Project.
Nine Key Differences Between Evaluation and Research
1. Value: Evaluation focuses on the effectiveness and/or value of a program, message campaign, or other communications; Research strives to be value-free or at least value-neutral in pursuit of increasing knowledge.
2. Role: Evaluators work with stakeholders to understand a program's objectives and goals, and develop agreements on the relevant (and obtainable) "Key Performance Indicators" which constitute evidence these are being achieved. Researchers develop an initial question and design their study (e.g., intervention, experiment), deciding what variables will be tested, on whom, under what conditions, and over what period of time.
3. Application of Critical Thinking Skills: Evaluators engage as a "critical friend" to program leaders, helping them understand analyses, determine effective changes, and refine data collection as the program evolves (a posteriori/ad hoc). Researchers engage in critical thinking at the outset of a study to implement procedures that prevent them from biasing data collection or the interpretation of findings (a priori).
4. Use and Timing of Operationalization: Evaluators work with project leaders to operationalize terms and agree upon methods at the outset of a project; however, these may shift, expand, or evolve along with the project. Thus, the process of evaluation is responsive to, and incorporates, what is learned along the way. Researchers operationalize variables and set methods and procedures at the outset, which are to be followed until the project is completed.
5. Role in Avoiding Potential Pitfalls: External evaluators may be better positioned to recognize barriers and threats to a program's success, as well as unintended effects. Thus, evaluators often play a role in mediating collaborations between program leaders and key stakeholders to increase effective participation and encourage development of effective procedures. Researchers strive to be mindful of the possibility that confounding variables may affect their results, in order to exclude or control for these at the outset, or statistically eliminate their effects in analysis.
6. Review of Academic Literature: In Evaluation, the purposes of literature reviews vary depending upon the stage of the program:
7. Purposes and Procedures: Evaluations are conducted to discover strategies and tactics to improve a program. Thus, Evaluation Reports are provided at regular intervals as part of an ongoing process in order to encourage reflection and stimulate discussion among project leaders and key stakeholders to help discover and implement effective adjustments to materials and procedures (e.g., to enhance innovations) while the program is underway. Research seeks evidence to prove a program had the hypothesized effect and/or support a theory, with data analyzed and findings reported in full at the end of the study. Researchers would require approval from an Institutional Review Board to make substantial procedural changes to a research plan while in progress.
8. Dissemination of Findings: Evaluators report findings to program leaders and stakeholders to: provide a record of how the program developed and evolved over time; document the program's effects; and help articulate best practices for institutionalizing a program and/or implementing it more widely. Researchers disseminate new knowledge for peer review as a contribution to an ongoing academic narrative.
9. Tables, Charts, and Graphs: Evaluators often get to use more engaging quantitative and qualitative data visualization techniques in reports to clients than are permitted in academic journals. Research is reported in academic journals, which often limit the number of tables and figures and most often display only black-and-white or grayscale images.
Thus, at MIP, our evaluations of entertainment projects, documentaries, films and other programs function as a vital component within our overall research mission to study the influence of news and entertainment on viewers. In my next blog, I will discuss how our evaluations of news and entertainment programs play an important role within our research.