Filter bubbles and parachutes
What the heck is a “filter bubble,” and what’s it got to do with parachutes? Internet activist Eli Pariser, who coined the term, describes a filter bubble as “your own personal universe of information that’s been generated by algorithms that are trying to guess what you’re interested in.” And because we spend so much of our lives online these days, we increasingly dwell inside these bubbles. OK, you might wonder – what’s wrong with that? Many researchers now believe filter bubbles could be a threat to democracy (more on that later). WTF? How can that be? Whose fault is it? Blame it on those pesky algorithms that decide what you read.
Choice of media
We tend to choose media that agree with our beliefs or reflect our interests. That’s obvious. But algorithmically curated media are different: we don’t know how the algorithms decide what to show us, and we have no idea who they think we are. More importantly, we don’t know whether they are showing us the entire picture or leaving out some essential parts. (You’ll read more about that later when I get to “parachutes.”) And if you’re like me, you often end up seeing items you wouldn’t usually choose to see. That’s because the information the algorithms base their decisions on doesn’t tell the whole story of who we are.
Take Facebook, for example. It looks at our wide range of clicks and tries to figure out what we like and don’t like. I click on a lot of things just to see if I’m interested in them. Most often, I’m not, but the click still took place, and that’s the data the algorithms are built on. But does my web history truly represent me? Not a chance! When I look back at my web history, I realize just how many things I clicked on that I never read. OK, back to the threat to democracy I mentioned earlier. “Some 45% of U.S. adults get at least some of their news from Facebook, with half of that amount using Facebook as their only news outlet,” according to a recent Pew Research poll. If social media algorithms choose what people read, that could mean we never read anything we disagree with! And that, in turn, can lead to confirmation bias and perhaps intolerance. If you’re interested in learning more about this, I can heartily recommend Eli Pariser’s book, The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think.
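To make the click-history problem concrete, here is a toy sketch (not Facebook’s actual algorithm, whose details are proprietary – the function and data are entirely hypothetical) of a recommender that ranks articles purely by how often a user clicked on a topic, never by whether they actually read anything:

```python
from collections import Counter

def rank_by_clicks(click_log, candidate_topics):
    """Rank candidate topics by the user's past click counts.

    click_log is a list of (topic, actually_read) pairs.
    Note that the 'actually_read' flag is never consulted --
    a click counts the same whether or not we stayed to read.
    """
    topic_clicks = Counter(topic for topic, _read in click_log)
    return sorted(candidate_topics,
                  key=lambda topic: topic_clicks[topic],
                  reverse=True)

# Three idle-curiosity clicks on politics (never read),
# one genuine read about gardening.
log = [("politics", False), ("politics", False),
       ("politics", False), ("gardening", True)]

ranking = rank_by_clicks(log, ["gardening", "politics", "science"])
print(ranking)  # ['politics', 'gardening', 'science']
```

Politics lands on top even though the user never read a word of it, which is exactly the mismatch between click data and real interest described above.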
OK, so what’s this got to do with parachutes? A lot, it turns out, because it raises questions about how we read data. Thanks to an article in Big Think by Scotty Hendricks, I read about a study that proves that parachutes are useless. A group of scientists at U.S. medical schools discovered – wait for it – “that parachutes don’t lower the death rate of people jumping out of airplanes.” While these results fly in the face of common sense and years of evidence, the scientists who conducted the study stand by their findings but note that the results should be “applied carefully due to minor caveats with the experimental structure.”
Let me recap. Yes, you read that correctly – “scientists found that jumping out of a plane with a parachute didn’t lower the death rate of test subjects compared to those who jumped without one.” The study was published in the Christmas edition of the BMJ and “involved 23 test subjects who were randomly sorted into two groups.” Participants in one group jumped from an airplane wearing a parachute, while those in the other group jumped wearing an ordinary backpack (no parachute!). The scientists found that survival rates after hitting the ground were the same for both groups. This amazing finding “could” have serious implications for military and civilian parachutists.
But what about the “minor caveat” in the study’s design mentioned earlier? It turns out that, to get people to agree to participate, the airplane was standing still on the ground. The authors, however, don’t admit this until several paragraphs into the paper! So, what’s the point of all this? We should always be wary of headlines, and we should be aware of the limitations of randomized trials. More importantly for the average reader, there is great danger in not reading past the opening paragraph of a study.
Read carefully and critically
The authors of the study said: “The parachute trial satirically highlights some of the limitations of randomized controlled trials. Nevertheless, we believe that such trials remain the gold standard for the evaluation of most new treatments. The parachute trial does suggest, however, that their accurate interpretation requires more than a cursory reading of the abstract. Rather, interpretation requires a complete and critical appraisal of the study. In addition, our study highlights that studies evaluating devices that are already entrenched in clinical practice face the particularly difficult task of ensuring that patients with the greatest expected benefit from treatment are included during enrollment.” Read things carefully and critically.
Don’t just skim the headlines
OK, the entire study was silly (and humorous), but we’ve all read a headline and later talked about it as if we’d read the entire article. Bottom line: read more than the abstract of a study, and make sure you understand the context before you “jump on the bandwagon.” And this brings me back to the title of this post. If we live in a filter bubble, we’re more likely to skim headlines and abstracts without fully understanding them. And that, as I mentioned earlier, can lead to confirmation bias and perhaps even intolerance. Don’t we already have enough of that in the world today?