
But while the criticism can be ignored, the CEO of Facebook’s parent company, Meta Platforms Inc., needs to reassess his priorities and turn his attention back to Facebook. Otherwise, the company runs the risk of spreading misleading videos about election fraud that could once again disrupt the democratic process.
Zuckerberg can rethink his priorities, starting by doing what thousands of managers before him have done.
The metaverse project is just getting started. Facebook has nearly 3 billion active users, but Horizon Worlds, the VR platform underlying the metaverse experience, has just 200,000 active users, according to internal documents uncovered by The Wall Street Journal.
Zuckerberg candidly states that it will take more than five years for Meta’s metaverse to be fully realized. Surely, then, his passion project can afford to lose his attention for a few months, especially at such a critical time for democracy.
So far, he’s shown no signs of shifting his focus. According to The New York Times, Facebook’s core election team no longer reports to Zuckerberg the way it did in 2020, when he made the US election his top priority.
He has also loosened the reins on key executives tasked with handling election misinformation. Nick Clegg, head of global affairs, currently splits his time between the UK and Silicon Valley, while Guy Rosen, who is responsible for the company’s information security, has moved to Israel, a company spokesperson confirmed in an email.
Researchers who track disinformation on social media say there’s little evidence that Facebook is better at thwarting conspiracy theories today than it was in 2020. The company has not improved data access for external researchers trying to quantify the spread of misleading posts. Anecdotally, such posts are still proliferating: Smith said she found a Facebook group recruiting election poll watchers for the purpose of intimidating voters on Election Day.
She also pointed to a video posted on the Facebook page of Florida Rep. Matt Gaetz claiming the 2020 election was stolen. It was posted a month ago but carries no fact-check warning label.
Smith also cited a recent Facebook post, shared hundreds of times, inviting people to an event discussing how “Chinese communists” are running local elections in the United States. One poster said certain politicians “should go to jail for their role in stolen elections.” Posts by candidates tend to spread particularly widely, Smith said.
Meta says its primary approach to handling such content ahead of the 2022 midterm elections will be warning labels. But warning labels are not very effective. A study by the Integrity Institute, a nonprofit research organization founded by former employees of large tech companies, found that more than 70% of misinformation posts on Facebook were labeled only after they had been up for more than two days, long enough to go viral. Research shows that disinformation gets 90% of its total engagement on social media within a day.
Ultimately, the problem is that Facebook surfaces the content most likely to keep people on the site. According to Jeff Allen, a former Meta data scientist and co-founder of the Integrity Institute, a better approach would be “quality-based ranking” that prioritizes consistently authoritative sources, similar to Google’s PageRank system.
Facebook’s increasing focus on video exacerbates the problem. In September 2022, misinformation was shared far more frequently via video than through regular Facebook posts, Allen said, citing recent research by the Integrity Institute. In general, false content gets more engagement than truthful content, so it tends to be favored by engagement-based systems, he added (1).
In 2020, Facebook rolled out “break-glass” measures to combat a surge in posts alleging the election was being stolen by then-President-elect Joe Biden.
Meta should never have to resort to such drastic measures again. If Zuckerberg is serious about bringing people together and doing so responsibly, he needs to get out of the virtual-reality bubble and rethink the ranking systems that keep people’s eyes glued to Facebook’s content. At the very least, he can tell his employees and the public that he’s making electoral integrity a priority again.
More from Bloomberg Opinion:
If Musk owning Twitter is a security risk, what about Tesla?: Liam Denning
When Musk Gobbles Twitter, He’s a Threat to All of Us: Tim Culpan
Zuckerberg’s $1,499 headset is meta useless: Parmy Olson
(1) According to Allen’s research, the misinformation amplification factor on Facebook in September 2022 was 4.2 for regular posts and 14 for videos.
This column does not necessarily reflect the opinions of the editorial board or Bloomberg LP and its owners.
Parmy Olson is a Bloomberg Opinion columnist covering technology. She is a former reporter for The Wall Street Journal and Forbes, and the author of “We Are Anonymous.”
More articles like this can be found at bloomberg.com/opinion.