I make a lot of fun of "studies" here and the reporting of them by journalists. Here's why:
Lots of cultural writing these days, in books and magazines and newspapers, relies on the so-called Chump Effect. The Effect is defined by its discoverer, me, as the eagerness of laymen and journalists to swallow whole the claims made by social scientists. Entire journalistic enterprises, whole books from cover to cover, would simply collapse into dust if even a smidgen of skepticism were summoned whenever we read that “scientists say” or “a new study finds” or “research shows” or “data suggest.” Most such claims of social science, we would soon find, fall into one of three categories: the trivial, the dubious, or the flatly untrue.
As a result of such unquestioning credulousness (or maybe as its cause), the soft science of sociology, which tries to quantify things like attitudes and the causes of behavior that really can't be quantified, is falsely treated as a hard science resting on the precise measurement of carefully accumulated data. I think most of these studies belong in that middle "dubious" category. At best, they merely confirm the researcher's biases. At worst, they try to bring about the researcher's desired result.
I minored in sociology, so I was exposed to a lot of this crap, and I do mean crap.