Second in a series
Both the utility and the limits of the new Jewish community demographic study come down to its methodology: how the study’s authors conducted the research to arrive at their results.
Any survey is an estimate. At its best, it is a snapshot of a moment in time that is, hopefully, reflective of a larger trend or theme, said Janet Krasner Aronson, one of the new study’s authors. This means it can help the Jewish community determine its future priorities and programs by understanding the makeup of its members.
With this new study, which was
released last week and conducted by the Cohen Center for Modern Jewish Studies at Brandeis University, the Jewish community of Greater Washington saw an overall 37 percent population gain since 2003. And of the nearly 300,000 Jews in the area, 41 percent are in Northern Virginia, 39 percent in suburban Maryland and 19 percent in the District. The study was done in cooperation with the Jewish Federation of Greater Washington and funded by The Morningstar Foundation established by Susie and Michael Gelman. (Susie and Michael Gelman are members of the ownership group of Mid-Atlantic Media, which publishes Washington Jewish Week.)
What this study can’t do, however, is provide direct comparisons to the previous 2003 study — except for those overall population numbers — because the two studies used different methods. According to Aronson, people could have been counted differently, and even if the categories appear the same — the number of people who are members of a synagogue, for instance — the questions could have been framed differently from study to study. The 2003 study, for instance, limited synagogue membership to “brick and mortar” synagogues, while the new one included all forms of worship communities, including Chabad and chavurot.
Ira Sheskin, author of the 2003 survey, said that changing methodologies between the two studies decreases the usefulness of the new one precisely because those comparisons can’t be made. But each study, while facing similar methodological challenges, did its work in drastically different environments.
Among the common questions researchers face: How do you reach Jews who aren’t already known to be involved in the community? And in the age of caller ID, when many local residents’ cell phones carry non-Washington area codes, how do you find people and get them to answer the phone?
The 2003 methodology, said Aronson, no longer works in the 2017 environment and would have produced misleading results. In 2003, for example, it was still possible to get most respondents by calling landlines.
“We struggle with this because it is so important to communities,” Aronson said. But, she added, they made the decision not to use the same methodology as the previous survey because “using the same method 10 to 15 years later ignores how the world is.”
The newest community survey used counting and analysis methods that differ from those of the 2003 study. The earlier study relied on random digit dialing, better known by the acronym RDD. RDD, a practice used by many polling companies, fixes the local area and exchange codes — the first six digits of a telephone number — and then appends four computer-generated digits.
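The mechanics of RDD can be sketched in a few lines of Python. The area code and exchange below are placeholders for the example, not numbers used in either study:

```python
import random

def rdd_sample(area_code, exchange, n):
    """Generate n random-digit-dial numbers sharing one area code and exchange.

    The first six digits (area code + exchange) are fixed; the last four
    are drawn at random, so listed and unlisted lines are equally likely
    to be dialed.
    """
    return [f"({area_code}) {exchange}-{random.randint(0, 9999):04d}"
            for _ in range(n)]

# "202" and "555" are placeholder values for illustration.
sample = rdd_sample("202", "555", 5)
```

Because the trailing digits are random, this approach reaches households no organization has on any list, which is exactly the property the researchers say is now undermined by low answer rates.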
Response rates for this method have plummeted with the advent of cell phones and caller ID.
“Everyone is struggling with RDD” in the survey and polling world, said Aronson. “We’re hitting a moving target with the [survey] landscape. This is particularly a problem when doing an RDD study designed to find a small population in a targeted geographic area.”
Instead, for the overall population number of the latest study, the authors used hundreds of national RDD surveys and polls that asked about religion and location to estimate the overall Jewish population.
“They gather all of these other polls in the region to statistically determine the population of Jews in a given area,” said Gil Preuss, CEO of The Jewish Federation of Greater Washington, which executed the study. “It’s called meta-analysis. It’s bringing together all of these different studies and putting them into a single database to do some of this analysis.”
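As a rough illustration of what pooling many small subsamples looks like, consider the sketch below; the figures are invented for the example, not data from the actual meta-analysis:

```python
# Each tuple is (jewish_respondents, total_respondents) from one
# hypothetical national poll's local subsample. No single poll is large
# enough on its own, but pooled together they yield a usable estimate.
polls = [(30, 1000), (12, 400), (55, 1600)]

jewish_total = sum(j for j, _ in polls)
respondent_total = sum(n for _, n in polls)

# Estimated Jewish share of the area's population, from the pooled data.
pooled_share = jewish_total / respondent_total
```

A real meta-analysis would also account for differences in how each poll asked about religion and for each poll’s sampling error, but the core move is the one Preuss describes: combining many small, noisy estimates into a single larger dataset.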
For the rest of the population analysis, the list of people to contact was compiled from numerous smaller lists maintained by hundreds of local Jewish organizations. Organizers added a directory of residents with Jewish-sounding first or last names to the list. The study’s authors drew both their primary and supplemental samples from this modified master list.
For the primary sample, which is meant to be representative of the community as a whole, the researchers chose 8,900 households to contact by mail, phone and email (although the survey could only be taken by phone or email). They successfully reached more than 3,700 households, but after screening out incomplete surveys and ineligible respondents (such as those who turned out not to be Jewish), the primary sample came to just under 2,000 households.
Another 42,000 households were contacted by email to generate the supplemental sample. The researchers ended up with 4,460 usable responses. Because people who are already heavily involved in the community are more likely to respond to such a survey, the researchers made adjustments, a process called weighting, to account for that bias.
Weighting, which is widely used in surveys, gives some responses more influence than others to correct for groups that are over- or underrepresented in the sample; any survey like this one will skew toward responses from people already involved in the community. Because of this, Aronson said, researchers gave more weight to responses from less-involved households to ensure that population is represented.
The new survey also used some RDD to generate a small sample of 200 people to test the larger results for consistency, Aronson said. This RDD sample confirmed the results of the population estimates produced based on the national surveys. Because it was based on only 200 respondents, however, it was far less precise than the other survey results. As another check on their numbers, the group also solicited numbers from synagogues to get an estimate of current membership levels.
The concern with this mix of methods, said Sheskin, the author of the 2003 study, is the people it could leave uncontacted. Without a large RDD sample, he said, it is easy to miss Jews who don’t have distinctively Jewish names and who aren’t very involved in the community. That could inflate the results, especially the counts of people who are involved.
“You can’t just believe things and take them on face value,” he said. “They have to pass the ‘smell test,’ as I call it.”
However, Aronson responded, no RDD method can reach all of the local households whose cell phones carry non-local numbers, and thus two populations of great importance to the D.C. community — young adults and newcomers — are systematically excluded from RDD.
There is no easy answer to the comparison issue, said Aronson, adding that methodologies need to be updated with the times and that the recent study’s authors got a more accurate look at the community with their approach.
“We’re balancing the desire for comparability with wanting to capture the full spectrum of evolving ways people are engaging in Jewish life,” she said.