If you are a digital marketer and think Russian troll farms only impact the world of politics, think again.

By now, you have probably heard how trolls and bots were used to influence the 2016 elections in the United States. And if you are a marketer with expertise in social media, you can easily understand how social media channels, because of their network effects, can support the rapid dissemination of any message, positive or negative.

If you are not familiar with how they work in the context of politics, let’s start with simple definitions:

  • Bots: automated accounts that repost content, usually focused on a specific hashtag or a specific digital destination. A single message can be disseminated in seconds by thousands of them working together without human intervention.
  • Trolls: accounts created by individuals, usually under fake identities, focused on writing content (typically on controversial topics) that is then posted organically or promoted via paid ads and amplified by an army of bots for reposts. The individual may be sitting somewhere in Russia, but the account is presented as, for example, a 30-year-old housewife in Michigan or an 18-year-old gun enthusiast.
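The coordinated, near-simultaneous reposting described above leaves a statistical footprint: many distinct accounts pushing the same text within seconds of each other. As a rough illustration (not any platform's actual detection method; the data shape, window, and threshold below are made-up assumptions), here is one way such a burst could be flagged:

```python
from collections import defaultdict

def flag_coordinated_reposts(posts, window_secs=60, min_accounts=50):
    """Flag message texts reposted by many distinct accounts within a
    short time window -- the bot-like burst pattern described above.

    `posts` is a list of (timestamp_seconds, account_id, text) tuples;
    the threshold values are illustrative, not tuned on real data.
    """
    by_text = defaultdict(list)  # text -> list of (timestamp, account)
    for ts, account, text in posts:
        by_text[text].append((ts, account))

    flagged = []
    for text, events in by_text.items():
        events.sort()  # order by timestamp
        i = 0
        for j in range(len(events)):
            # shrink the window so events[i..j] span <= window_secs
            while events[j][0] - events[i][0] > window_secs:
                i += 1
            distinct_accounts = {acct for _, acct in events[i : j + 1]}
            if len(distinct_accounts) >= min_accounts:
                flagged.append(text)
                break
    return flagged
```

A human conversation rarely produces fifty distinct accounts posting identical text inside a minute, which is why burst-plus-duplication heuristics like this are a common starting point before heavier analysis.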

Bots and trolls can be found anywhere in the world, but the most sophisticated operation is in Russia, where it has been used internally to promote Putin’s agenda while making it appear that individual people are talking about their priorities on social media. This video provides additional information about the Russian troll farms.

Suppose, however, that you are not interested in politics. As a marketer, how does this impact you and your priorities?

US social-media ad spend is expected to reach $14 billion in 2018, up from just $6.1 billion in 2013. If you are a CMO or a digital marketer, you know a significant part of your budget is spent on social media. But what happens when the platform includes a significant number of trolls sitting somewhere in Russia but posing as individuals in the US? The result is that audience metrics are significantly distorted, and your money may be spent reaching fake accounts.

  • 10% of Facebook’s 2.07 billion monthly users are now estimated to be duplicate accounts, up from the 6% estimated previously. The social network’s share of fake accounts, or accounts not associated with a real person, increased from 1% to 2–3%. These figures mean there are now roughly 207 million duplicate accounts and as many as 60 million fake accounts on the network. Facebook says it is working on ways to take this into account when campaigns are created, but is that enough?
  • Twitter is estimated to have about 50 million fake accounts.

As advertisers, we should demand more focus on getting this issue fixed. After all, you need to make sure your money is not wasted on advertising to fake accounts. Technically it is probably a difficult challenge, but just as email systems had to find ways to reduce the impact of spam by building better spam filters, it is time for social media organizations to invest in technologies that reduce this problem. A couple of approaches come to mind: increasing the use of machine learning models to identify bot and troll accounts, and using technologies like blockchain for digital ID so that people on social networks are actually who they say they are.
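To make the machine-learning idea concrete, a bot classifier typically consumes behavioral features of an account and outputs a probability. The sketch below shows the simplest form such a model can take, a logistic function over weighted features; every field name, weight, and threshold here is an illustrative assumption, not a trained model or any platform's real method:

```python
import math

def bot_likelihood_features(account):
    """Extract illustrative behavioral features a bot-detection model
    might consume. `account` is a plain dict; all fields are hypothetical."""
    age_days = max(account["age_days"], 1)
    return {
        "posts_per_day": account["post_count"] / age_days,
        "follower_following_ratio": account["followers"] / max(account["following"], 1),
        "default_avatar": 1.0 if account["default_avatar"] else 0.0,
        "duplicate_post_share": account["duplicate_posts"] / max(account["post_count"], 1),
    }

def bot_score(account, weights=None):
    """Weighted sum of features squashed to (0, 1) with a logistic
    function -- the shape a learned classifier takes. The weights are
    made-up placeholders; a real system would learn them from labeled data."""
    weights = weights or {
        "posts_per_day": 0.02,
        "follower_following_ratio": -0.5,
        "default_avatar": 1.0,
        "duplicate_post_share": 2.0,
    }
    features = bot_likelihood_features(account)
    z = sum(weights[k] * features[k] for k in weights) - 1.0  # fixed bias term
    return 1 / (1 + math.exp(-z))
```

The point of the sketch is the feature side: accounts that post hundreds of times a day, reuse identical text, and follow far more people than follow them back separate cleanly from typical human accounts, which is why even simple models catch a meaningful share of bots.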
