Is Health Data Ever Harmful?

In yesterday’s post I described how searching the medical literature helped me avoid a dangerous surgery with no obvious benefit. The surgeon I consulted, who recommended the surgery, said that published evidence backed her up. I could not find that evidence, however. Others found evidence that contradicted her recommendation.

Among the comments on that post were similar stories: Searching/reading the medical literature had been helpful. Learning what had happened (in research studies) was better than relying on an expert (a doctor). Here is an example:

A little over two years ago, I was “depressed”. My psychiatrist wanted to prescribe an SNRI [serotonin-norepinephrine reuptake inhibitor]. I related, once again, my poor experience with an SSRI and asked for evidence that an SNRI would be any more effective. He said there was evidence that SSRIs [selective serotonin reuptake inhibitors] worked. I pointed out the 2004 meta-analysis that showed no meaningful difference between SSRIs and placebos. Then I asked whether there was any better evidence for SNRIs. Since he wasn’t able to provide any, I told him that since we know that extremely low Vitamin D blood levels, poor diet, no exercise, and no social life can cause depression (all things I had at the time), I’ll try fixing those things first and then resort to drugs if that fails. It did not fail and I quit seeing him.

None of the stories in the comments described the opposite outcome: Knowing the data made things worse.

Are there exceptions? Is it always helpful (or at least not harmful) to know what happened (i.e., know research outcomes)? Has anyone reading this had an experience where knowing health research data was harmful?

Suppose You Write the Times to Fix an Error (part 2)

The Roberts-Schwartz correspondence continued. I replied to Schwartz:

“Dining establishments”? [His previous email stated: “Four restaurants simply cannot represent the variety of dining establishments in New York City”] I thought the survey was about sushi restaurants. Places where raw fish is available.

Quite apart from that, I am sorry to see such a fundamental error perpetuated in a science section. If you don’t believe me that the teenagers’ survey was far better than you said, you might consult a friend of mine, Andrew Gelman, a professor of statistics at Columbia.

John Tukey — the most influential statistician of the last half of the 20th century — really did say that a well-chosen sample of 3 was worthwhile when it came to learning about sexual behavior, which varies even more widely than sushi restaurants. A sample of 4 is better than a sample of 3.

Schwartz replied:

The survey included 4 restaurants and 10 stores.

The girls would not disclose the names of any of the restaurants, and only gave me the name of one store whose samples were not mislabeled. Their restaurants and stores might have been chosen with exquisite care and scientific validity, but without proof of that I could not say it in the article.

I wrote:

I realize the NY Times has an “answer every letter” policy and I am a little sorry to subject you to it. Except that this was a huge goof and you caused your subjects damage by vastly undervaluing their work. Yes, I knew the survey included 4 restaurants and 10 stores. That was clear.

As a reader I had no need to know the names of the places; I realized the girls were trying to reach broad conclusions. They were right not to give you the names because to do so might have obscured the larger point. It was on your side that the big failing occurred, as far as I can tell. Did you ask the girls about their sampling method? That was crucial info. Apparently The Times doesn’t correct errors of omission but that was a major error in your article: That info (how they sampled) wasn’t included.

He replied:

I could have been more clear on the subject of sample size, but I did not commit an error. Neither do my editors. That is why they asked me to write a letter to you instead of writing up a correction.

I don’t feel I have been “subjected to” anything, or that this is some kind of punishment. This is an interesting collision between the precise standards of someone with deep grounding in social science and statistical proof and someone who tries to write intelligible stories about science for a daily newspaper and a general interest audience. But I am not sorry that you wrote to me, even a little sorry.

I wrote:

“I did not commit an error.” Huh? What am I missing? Your article had two big errors:

1. An error of commission. You stated the study should not be taken seriously because the sample size was too small. For most purposes, especially those of NY Times readers, the sample size was large enough.

2. An error of omission. You failed to describe the sampling protocol — how those 10 stores and 4 restaurants were chosen. This was crucial info for knowing to what population the results should be generalized.

If you could explain why these aren’t errors, that would be a learning experience.

Did you ask the girls how they sampled?

His full reply:

We’re not getting anywhere here.

Not so. After complaining he didn’t have “proof” that the teenagers used a good sampling method, he won’t say if he asked them about their sampling method. That’s revealing.
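The statistical point at the heart of the exchange can be illustrated with a quick back-of-the-envelope calculation. Under a simple binomial model, even a sample of 4 restaurants can make low citywide mislabeling rates implausible. The rates below are hypothetical, chosen for illustration — they are not the survey's actual results:

```python
from math import comb

def prob_at_least_k(n, k, p):
    """Probability of seeing at least k mislabeled samples out of n,
    if the true citywide mislabeling rate is p (simple binomial model)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical scenario: suppose all 4 sampled restaurants had turned up
# mislabeled fish. How surprising would that be under various assumed rates?
for p in (0.2, 0.5, 0.8):
    print(f"true rate {p}: P(4 of 4 mislabeled) = {prob_at_least_k(4, 4, p):.4f}")
```

Even n = 4 carries real information: if the true citywide rate were only 20%, seeing 4 of 4 mislabeled would have probability 0.0016. What a small sample cannot tell you, without knowing how the restaurants were chosen, is which population that rate describes — which is exactly why the omitted sampling-protocol information mattered more than the sample size.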

Something similar happened with a surgeon I was referred to, Dr. Eileen Consorti, in Berkeley. I have a tiny hernia that I cannot detect but one day my primary-care doctor did. He referred me to Dr. Consorti, a general surgeon. She said I should have surgery for it. Why? I asked. Because it could get worse, she said. Eventually I asked: Why do you think it’s better to have surgery than not? Surgery is dangerous. (Not to mention expensive and time-consuming.) She said there were clinical trials that showed this. Just use google, you’ll find them, she said.

I tried to find them. I looked and looked but failed to find any relevant evidence. My mom, who does medical searching for a living, was unable to find any completed clinical trials. One was in progress (which implied the answer to my question wasn’t known).

I spoke to Dr. Consorti again. I can’t find any studies, I said, nor can my mom. Okay, we’ll find some and copy them for you, she said, you can come by the office and pick them up. She sounded completely sure the studies existed. I waited. Nothing from Dr. Consorti’s office. After a few weeks, I phoned her office and left a message. No reply. I waited a month, phoned again, and left another message. No reply.

More. In spite of Dr. Consorti’s statement in the comments (see below) that “I will call you once I clear my desk and do my own literature search,” one year later (August 2009) I haven’t heard from her.