Socially Responsible? Social Media Companies Gearing Up With Big Changes For Election Day

[Photo caption: A combination of images showing logos for, from left, Twitter, YouTube and Facebook. Social media companies are failing to stop manipulated activity, according to a report Friday, Dec. 6, 2019, by NATO-affiliated researchers who said they were easily able to buy tens of thousands of likes, comments and views on Facebook, Twitter, YouTube and Instagram. Most of the phony accounts and their activity remained online weeks later, even after researchers at the NATO Strategic Communications Centre of Excellence flagged them as fake. (AP Photos/File)]

The trend across social media toward battling misinformation continues, and now it involves one of the biggest stories of the century: the 2020 presidential election.

And trying to do the right thing will be no small task.

Google-owned YouTube announced that on Nov. 3, Election Day, it will display an information panel above election-related search results and videos carrying the message: “Results may not be final. See the latest on Google.”

Once users click the link, they will be taken to a separate Google page with real-time results based on data from the Associated Press.

In another blog post, YouTube said that the company will promote “authoritative” sources, such as CNN and Fox News, while “limiting the spread of harmful election-related misinformation and borderline content.”

Among the recent moves by social media outlets, from a story Tuesday on CNBC:

Facebook said it would ban political ads after the election. Google said it would temporarily pause ads referencing the 2020 election starting on Election Day. “Given the likelihood of delayed election results this year, when polls close on November 3 we will pause ads referencing the 2020 election, the candidates, or its outcome,” Google announced.

YouTube announced a policy prohibiting conspiracy-theory content that could lead to real-world violence, including content tied to the far-right QAnon movement, though it stopped short of banning such content outright.

Twitter, Facebook and TikTok have explicitly stated measures they will take if a candidate or party prematurely claims victory. YouTube said it would follow existing rules that prohibit “misleading claims about voting or content that encourages interference in the democratic process.”

On Monday, Twitter posted a message saying that it is seeking to ensure the safety and accuracy of mail-in voting, noting, in part, that “experts and fact-checkers have continued to assure American voters that voting by mail is a safe and secure option, especially in the middle of a pandemic.”

So at least it appears that the social media platforms are aware of – and gearing up for – a wild first week of November.
