
March 14, 2005



I ignore the studies. But I also go one step further - I've stopped reading the parenting magazines.

See, no one at those mags has ever met *my* children. And we've not yet been interviewed by anyone conducting studies.

So any 'evidence' presented is hearsay, as far as I'm concerned - and worth about as much as any other 'hearsay' source.

Hardcore stance, no? (And I'm a former researcher and current information professional, no less!)

But I'm here to tell you that it makes life much much much easier this way...


I too ignore them. If it were really important, my pediatrician would have told me about it.


Ignore them...the statistics, that is. What do they do for us anyway? Make us feel bad about what we are or aren't doing.


What is scary to me is when the government, and the self-serving extremist groups that lobby it, use these biased studies (gotta follow the money!) to force us all to conform to their values via legislation. Very scary. I rarely believe most studies, and when I hear words like "the sky is falling" and "if we don't force everyone to do this, all of our children will suffer" with regards to some statistic, I just yawn and go on with my day.


when my husband was in grad school he took a class on statistical methodology for studies, and one of the first things we now ask ourselves when we see a study is who is paying for it!!!

the results of some studies are clearly obvious...all cigarette smoke is bad...excess amounts of television watching is harmful to kids. fruit sprayed with pesticides is not healthy. now tell me something i couldn't figure out on my own!!!

i think it's important to read new information, but, and i hate this cliche, to take it with a grain of salt.


I think you're right to look for the methodology behind the statistics. That's something that I learned when taking a class on statistics in college.

For example, if a study is based on a sample of people that is NOT randomly selected, the results are likely to be skewed. I've noticed that this is a common problem in studies by psychology grad students. They just solicit their friends and anyone else they can find to take their surveys. That really isn't scientifically valid. Ideally a study should be randomized and double-blind (neither the scientist nor the research subject knows which group a person is in until after the results are reached). Otherwise, unconscious and conscious bias skews the results.

If there are too few people in the group that is studied, that also makes the results much less reliable. The margin of error can become huge.

And you'll often hear that the risk of something such as a disease is "doubled" by certain actions, but you're not told what the risk was to begin with. If the risk of something to begin with is only one in a million, I don't really care if the risk is "doubled."

You're also right to examine more closely studies that support a particular political agenda. Smoking is a political issue for many. So when a study comes out that supports a particular political point of view, I examine it and the methodology behind it more closely. That doesn't mean I won't believe the result that was reached if it seems to hold up after examination; I just want a little more detail about how the study was done and how many subjects were in it.

One question I would have about the secondhand smoke study is whether the researchers "controlled" for household income, parental education levels, and every other factor that could possibly explain the results. In other words, it could be that the parents who are careful about not smoking at all, or not smoking around their kids, are also more likely to work harder with them on their homework -- it isn't necessarily the physical effects of secondhand smoke that hurts the kids' math and reading scores, but the fact that their parents are not quite as focused on the kids' daily needs. Since people can't be studied in a pure "double-blind" way like lab rats (you can't randomly assign 1000 households to smoke and another 1000 not to, and make sure that starting income, education, etc. are identical), there is a big question mark next to every study that tries to correlate one factor with another. Correlation is not causation. Sticky asphalt may tend to happen on the same days as hot weather (correlation), but sticky asphalt doesn't cause hot weather. Maybe it's not smoke that hurt the kids' reading and math skills -- maybe the parents' poorer reading and math skills caused them (a) to smoke and (b) to not be quite as adept at helping with their kids' reading and math homework.

I think the single most important thing to do when reading statistics is to ask oneself whether the source of the statistic has a reason to exaggerate -- even the reason you identified, that they want to sell their magazine or newspaper.

Some people, including many journalists, think that there is no harm in exaggerating certain things, such as the harm of secondhand smoke, because after all we know that smoking isn't good for you anyway. The problem is that when people exaggerate, there are always unintended negative side effects. For example, a person who realizes that they survived second hand smoke just fine might be more inclined to think that warnings about the dangers of sharing intravenous needles might be exaggerated too -- and then they end up with hepatitis or HIV.

Exaggerating bad news is like "crying wolf" too often -- after a while nobody believes it anymore. That can end up hurting everybody when a real threat comes along.

Statistics are like a knife: properly used, they are extremely valuable, even indispensable. Improperly used, they can injure people.

Even the devil can quote the bible to his own ends, they say. The same is true of statistics.


I, too, subscribe to Parents and read that article you were talking about. I was exposed to secondhand smoke when I was younger, but does that mean I was bad at math? No. I think at some point, common sense has to take the place of all these studies and statistics. I'm glad someone did a study to find out about the causes of SIDS, but I couldn't care less about the studies that show exposure to music at X age causes X increase in SAT scores (for example). I will raise my daughter to the best of my ability, and our lives won't cater to the studies!
