Please Manipulate Me

7/14/2014

By Marty Kaplan, Director, Norman Lear Center

What do you call it when media try to manipulate your feelings without first asking for informed consent?

Tuesday.

Example: The average Facebook user sees only 20 percent of the 1,500 stories per day that could have shown up in their news feed. The posts you receive are determined by algorithms whose bottom line is Facebook's bottom line. The company is constantly adjusting all kinds of dials, quietly looking for the optimal mix to make us spend more of our time and money on Facebook. Of course the more we're on Facebook, the more information they have about us to fine-tune their formulas for picking ads to show us. That's their business model: We create and give Facebook, for free, the content they use and the data they mine to hold our attention, which Facebook in turn sells to advertisers.

Those are the terms of service that everyone, without reading, clicks "I Agree" to — and not just for Facebook. We make comparable mindless contracts all the time with Gmail, Yahoo, Twitter, Amazon, Siri, Yelp, Pandora and tons of other apps, retailers and advertiser-supported news and entertainment. If you're online, if you use a smartphone, you're an experimental subject in proprietary research studies of how best to target, engage and monetize you. They're always testing content, design, headlines, graphics, prices, promotions, profiling tools, you name it, and you've opted in whether you realize it or not.
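
To make the mechanism concrete, here is a deliberately toy sketch of what engagement-ranked feed selection could look like. Everything in it (the Story fields, the weights, the scoring formula) is hypothetical; real feed-ranking systems are proprietary and vastly more complex.

    # Toy sketch of engagement-ranked feed selection, to make the
    # paragraphs above concrete. Every name, field and weight here is
    # hypothetical; real ranking systems are proprietary.
    from dataclasses import dataclass

    @dataclass
    class Story:
        story_id: int
        predicted_clicks: float   # model's estimate that the user engages
        predicted_seconds: float  # expected attention, in seconds
        ad_value: float           # expected revenue if the story is seen

    def score(story: Story) -> float:
        # Hypothetical weights: the "dials" a platform can quietly adjust.
        return (0.5 * story.predicted_clicks
                + 0.3 * story.predicted_seconds
                + 0.2 * story.ad_value)

    def build_feed(candidates: list[Story], slots: int) -> list[Story]:
        # Rank the ~1,500 candidate stories and keep only the top slice
        # (roughly the 20 percent a user actually sees).
        return sorted(candidates, key=score, reverse=True)[:slots]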

Many of these experiments hinge on our feelings, because much of what makes us come, stay, buy, like, share, comment and come back is emotional, not rational. So it should surprise no one that Facebook wants to know what makes its users happier. But when they acknowledged last month that they had tested — on 700,000 people, for one week — whether increasing the fraction of upbeat posts in their news feeds made them feel more upbeat (it did), a firestorm broke out.

The charge: People are being treated like guinea pigs without their consent. Unaccountable corporations are secretly manipulating our emotions. This is the slippery slope to Brave New World.

So what else is new? Neil Postman first warned us about Amusing Ourselves to Death in his book of that name in 1985, before the Web was spun. But that didn't stop entertainment, which is exquisitely attuned to the marketplace, from making its long march through our institutions. Today, politics is all about unaccountable corporations manipulating our emotions; they're constantly testing and targeting their paid messages to voters, none of whom are asked for informed consent. The news industry is all about the audience, and much of its content has long been driven by the primal power of danger, sex and novelty to trap our attention, but there's no clamor for shows and sites to warn us we're lab chimps.

John Kenneth Galbraith called advertising "the management of specific demand." Ads tell us stories, which are all variants of: If you buy this, you'll be happy. Their words and images were tested on audiences even before Don Draper was a boy, and now digital analytics gives marketers new attention management techniques to use on us. Today, every tweet, every YouTube or blog post aspires to be viral, and when that happens, no one complains that some cat or cute kid or Kardashian has used Orwellian mind-control to manipulate our mood.

I'll give the Facebook freakout this: University partners did the research using Facebook's data, and the academic vetting process could have gone the other way and nixed the project. But even if that had happened, Facebook could still have conducted this experiment, just as they and Google and plenty of other companies no doubt continue to adjust algorithms, run randomized trials of content and design (known as A/B tests) and discover the many economic, political and cultural micro-tribes we consumers belong to. Academic committees called Institutional Review Boards rule on what professors can do to research subjects, but informed consent in Silicon Valley is basically what someone can get away with, which is what's been true for commerce, politics and the content industries since at least the 1980s.
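
Since the argument leans on the term, it may help to spell out what an A/B test is: users are randomly split into groups, each group gets a different variant, and an outcome metric is compared between them. Below is a minimal illustrative sketch; the "upbeat score" metric is a made-up proxy for whatever a platform would actually measure.

    # Illustrative A/B test sketch. The randomized assignment is the
    # standard technique; the outcome metric ("upbeat score per post")
    # is a made-up stand-in.
    import random
    from statistics import mean

    def assign_group(user_id: int) -> str:
        # Seed on the user ID so each user consistently sees one variant.
        return "treatment" if random.Random(user_id).random() < 0.5 else "control"

    def compare_groups(posts_by_user: dict[int, list[float]]) -> dict[str, float]:
        # posts_by_user: per-user upbeat scores observed during the test week.
        groups: dict[str, list[float]] = {"control": [], "treatment": []}
        for user_id, scores in posts_by_user.items():
            groups[assign_group(user_id)].extend(scores)
        # Compare mean outcomes; a real analysis would also run a
        # significance test before claiming an effect.
        return {name: mean(scores) for name, scores in groups.items()}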

In fact, ever since people first gathered around the fire, storytellers have perfected their skills by studying the data in their audiences' eyes. Today, we may think that our media savvy and bullshit detectors protect us from being played like piccolos, but people have always believed that thinking could reliably prevent their emotions from running away with them, and they've always been wrong. Neuroscience now shows what happens: Our emotions are faster than our reason, which we then use to reverse engineer some rationalization for our actions.

Is there any way to protect people from the hidden persuaders, as Vance Packard called an earlier era's desire wizards? After all, the arts and technologies of manipulation are only going to get more powerful. Consumer protection is only going to grow weaker. Mass education's ability to turn out critical thinkers is hardly going to spike upward. The best plan Plato could come up with to protect future leaders from being enslaved by their appetites was to exile the most powerful manipulators of his time: the poets, who whipped crowds into frenzies with their artifice and illusions.

But banishment is an authoritarian solution. More speech, not less, is the democratic answer to assaults on freedom and agency. Open-source research, with methods and tools freely available, can serve the public interest. (We're up to that at the Norman Lear Center's Media Impact Project.)

The place where countervailing speech really wants to get heard is in the media, whose industrial success, like Facebook's, depends on monetizing our attention. I've seen a lot of stories about Facebook fiddling with the happiness of our feeds. The irony is that I encountered all of them on media whose owners are just as determined to push my buttons as Mark Zuckerberg.

The Norman Lear Center's Media Impact Project researches how entertainment and news influence our thoughts, attitudes, beliefs, knowledge and actions. We work with researchers, the film and TV industry, nonprofits, and news organizations, and share our research with the public. We are part of the University of Southern California's Annenberg School for Communication and Journalism.