Our Very Human Need to Pop “Online Filter Bubbles”

A long time ago, when I was a nerd in grad school (totally not a nerd anymore), one of the concepts I researched most was the idea of the “filter bubble.” That is to say: does the massive amount of content on the web – and the means by which we go about accessing that content – reshape people’s pre-existing worldviews, or cement them?

Two professors – Bruce Bimber of the University of California and Richard Davis of Brigham Young University – have been exploring the idea of the online filter bubble since the 2000 presidential election.

In their study “The Internet In Campaign 2000: How Political Web Sites Reinforce Partisan Engagement,” they put this quite elegantly – thanks to a quote from singer-songwriter (and household favorite) Paul Simon:

“People tend to select out for attention those stories and claims that confirm their existing beliefs and predispositions. And when confronted with news or other information that tends to conflict with their assumptions about public life, people are especially likely to disbelieve what they see or hear.

These political habits call to mind the lyrics to Paul Simon’s 1970 song “The Boxer”: ‘a man hears what he wants to hear and disregards the rest.’”

A few years later, Bimber and Davis explored the idea further in their book Campaigning Online: The Internet in U.S. Elections.  Narrowcasting – the niche counterpart of broadcasting, in which a fragmented media environment makes it possible to access content designed for a smaller, more specific audience – is “one of the defining features of the Internet.” When users have a nearly infinite number of outlets and diverse opinions available, they’re better able to seek out stories that interest them. But as a result, they consume content that reinforces their current worldview.

In a TED talk released this week, MoveOn.org founder Eli Pariser takes on this very topic.  Looking at personalized Google search results and the “relevant” Facebook feed (both based on what we’re clicking when these pages return results to us), Pariser says that we’re increasingly in danger of “algorithmically editing the web.”

As a result, Pariser says, we’re moving “toward a world in which the Internet is showing us what it thinks we want to see, but not necessarily what we need to see.” He calls this a “filter bubble,” which he says is “your own unique universe of information that you live in online” – but a universe where you don’t decide what gets in, and you don’t see what gets edited out.

The “online filter bubble” problem that Pariser describes demonstrates the disconnect between “our future aspirational selves and our more impulsive present selves.”

Human behavior doesn’t always sync up with our dreams, goals, and visions.  What we believe morally certainly helps us endeavor to behave in certain ways, but when we’re clicking around the web, are these even decisions we’re conscious of?  I wonder how much of my aspirational self is visible in the way I consume the web.  It would be a snapshot, for sure – like looking through a garbage bag – but I’m willing to bet it would show just the crumbs and wrappers of all the Internet junk food I eat on an ongoing basis.

As Pariser points out, this suggests that we need to reevaluate the mythology of the Internet – that it will be some great democracy enhancer that connects us with everything that’s happening.  That isn’t the case if Internet algorithms edit out content that challenges us or causes us to re-evaluate concepts we believe in.

As humans, we long for more than just correlations about relevancy.  We need an experience that mixes the snack food of relevancy with the sustenance of our human aspirations.

What Happens When They Expect “Fake”?

There’s a lot of hoopla around a transparency “scandal” that has just broken.

According to MobileCrunch, a leading mobile communication and technology site, a PR firm called Reverb Communications has “managed to find astounding success on Apple’s App Store for its clients.”  One tactic in particular involves hiring “a team of interns to trawl iTunes and other community forums posing as real users, and has them write positive reviews for their clients.”

This development is startling to some, but in reality, I’m not terribly surprised.  My younger brother, who just wrapped up his college degree in marketing, once interned in the mobile gaming industry, and he told me this practice is rampant within that community.  Completely commonplace.

For one, there’s a huge issue in having interns perform unethical communications.  These interns, eager to please in a hostile job market, are being taught that this is a professional method of conducting online marketing.  Whereas these firms should be teaching basic, standard fundamentals like transparency – methods that ensure the client who hired them is protected and that the client’s brand is safe – they’re instead teaching future marketing, communications, and public relations professionals how to take shortcuts.  They’re ingraining these kinds of practices into our industry’s future.

But I think there’s also a larger issue here.  When I talked to my brother about these practices, he essentially told me that consumers should expect them.  He didn’t mean it as a “this is actually an ethical practice” argument, but rather that younger people (look at me – I’m not even 26, and I’m talking about “younger people”) completely expect these communications to be fake.

For one thing, it makes a communications professional’s job harder.  The burden of proof is on us to show that what we’re doing is, in fact, real.

For instance, one campaign I’m currently working on is called the Campaign for Quality Services.  It’s about adding the voice of food service workers to the debate around passing an improved Child Nutrition Act (end plug).  Since my goal is to add their voice, I’m striving to ensure that it is authentic and prevalent throughout.

In building the site, one of my first goals was to collect quotes and stories from workers – real, actual quotes from interviews and conversations we shared.  But what I’ve found is that simply pairing a quote with a picture of the worker isn’t enough.  The audience doesn’t believe the quote really comes from the worker whose picture is on my site.  Instead, I’ve had to move to video on various pages.  The burden of proof is simply on me.

So, in essence, when a company isn’t transparent – when it lies about who it is and whom it represents – it doesn’t just damage that company, and it doesn’t just damage its clients.  It hurts all of us in the field, who then have to take the extra step of creating an environment where we’re believed.

The good news, however, is that situations like this challenge us and force us to think outside the box.  They push us to innovate and to strive to create content that is more real and more authentic.  They force us to really live by the best practices we preach, and to work to develop and discover new ones.

In that way, perhaps there is some good in these developments after all?