Product details

Subject category: Entrepreneurship
Published by: Harvard Business Publishing
Originally published in: 2022
Version: 30 June 2023
Length: 26 pages
Data source: Published sources

Abstract

The case discusses the relatively low-technology approach used by Russia to influence the US Presidential Election in 2016. Although political parties manipulating the media was not a new phenomenon, the Russians ran a broad, well-financed, and sophisticated social media campaign that started in 2014 and grew each year. Russia's IRA (Internet Research Agency) managed messages and posted links and content across Twitter, YouTube, Facebook, and Instagram. Like any disciplined marketer, it tested content on a few sites and doubled down on messages that worked. Messages relied heavily on shareable memes tailored to the identity of each target group based on political affiliation, religion, ethnicity, and geography. The IRA initially focused on building trust and group identity by creating a sense of belonging. Over time, these messages morphed into portrayals of external threats to the group identity, with the aim of swaying behavior. Russia's ability to meddle in the Presidential election was partly the result of systemic weaknesses in US governance of social media platforms. The leaders of those platforms admitted that state actors had gamed them to influence politics. Underlying the misinformation campaign, however, were opaque, influential algorithms that determined what content was viewed by billions of internet users. In a quest to capture attention and maximize engagement, these algorithms had fractured the social norms necessary for a healthy democracy, leaving populations vulnerable to online misinformation.
Other setting(s):
2016

