- A list in Part 2, Where Blacks Bank, is pretty informative and gives the reader a good idea of how the two black-owned banks' rates of non-white customers compare to their competition. However, I noticed that the list uses numbers from a market study conducted with telephone interviews. Wouldn't this be considered a voluntary response sample, and therefore less accurate? Are these sorts of methodological errors less meaningful in a journalistic piece than in, for example, academic research?
- While the way Dedman and his team handled the data was groundbreaking at the time, it seems that they frequently had to make do with limited information and logical assumptions to fill the gaps. For example, they were forced to match federal lending reports to the most recent census data, simply because that was as close as they could get. How much more accurate could a similar study be today, with modern information sharing?
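For readers curious what that kind of record linkage looks like with modern tools, here is a minimal sketch of joining tract-level lending records to the most recent available census data. All tract IDs, column names, and figures below are invented for illustration, not taken from the original study:

```python
# Hypothetical sketch: matching lending records to census data by tract ID,
# the sort of pairing Dedman's team had to assemble by hand.
import pandas as pd

# Federal lending reports, one row per census tract (invented data)
lending = pd.DataFrame({
    "tract_id": ["001", "002", "003"],
    "loans_made": [120, 15, 80],
})

# Most recent available census data, which may lag the lending data (invented)
census = pd.DataFrame({
    "tract_id": ["001", "002", "004"],
    "pct_nonwhite": [12.5, 88.0, 45.0],
})

# A left join keeps every lending record; tracts with no census match
# surface as NaN instead of silently disappearing from the analysis.
merged = lending.merge(census, on="tract_id", how="left")
print(merged)
```

The point of the left join is transparency: gaps in the match show up explicitly, which is exactly the kind of limitation a journalist should disclose to readers.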
This is a terrific question, and we will discuss it in class.
“Wouldn’t this be considered a voluntary response sample, and therefore less accurate?”
Yes, basically, you have a great point. I would allow such a method for publication but definitely would flag it for its lack of analytical rigor. We need more questions such as these on the editing desks of websites and news stations.
Great second question too about dealing with the available data. This still happens today. We may report on an older dataset, say the Census, but we tell our readers it is the most recent data we have. So the best practice here is to disclose to your readers and viewers the methods behind the madness.