Facebook Knows There's A Problem With the 'Like' And 'Share' Buttons

Following whistleblower revelations by Frances Haugen last month, Facebook is facing increased scrutiny over its impact on US society and the world at large. The internal documents Haugen shared with the US Securities and Exchange Commission and The Wall Street Journal have prompted the US Congress to open an investigation into the company's operations and whether it is knowingly allowing hate and misinformation to thrive on the platform for financial gain.

Another revelation concerned the company's so-called 'Elite Whitelist,' which created a two-tiered system of policy enforcement, treating celebrities and influential personalities more leniently even for gross violations of company policy. Internal research also reportedly showed that the company was aware of drug and human trafficking activity on its platform but failed to take adequate measures to address those issues.

Related: Is It Time to Delete Facebook?

Adding to the growing saga surrounding Facebook, a new report from The New York Times claims that an internal study commissioned by the company found the 'Like' button causes "stress and anxiety" among many users. This was especially true for teenagers, who constantly worry about how many 'Likes' their photos and other posts receive from friends and peers. Following the study, the company ran a limited test with select users to see whether removing the button had any positive impact on those teens. As it turned out, removing the button reduced ad clicks and other interactions but didn't encourage teenagers to post more photos. So Facebook let the 'Like' button remain despite knowing it was negatively impacting its users.

Facebook also reportedly studied the impact of the 'Share' button, which, the study found, amplified harmful content, including hate and misinformation. The study even concluded that "the mechanics of our platform are not neutral." Still, the company allowed the 'Share' button to remain because it is a core feature of the website. The report also quotes former high-level Facebook executives as saying the company sacrificed societal responsibility in favor of growth and user engagement. According to Brian Boland, a Facebook VP who left last year, employees often have frank discussions about what is happening on the platform and what needs to change for the better. However, "getting change done can be much harder," he noted.

For its part, Facebook, one of the five trillion-dollar tech titans alongside Apple, Amazon, Google, and Microsoft, has denied any wrongdoing. According to spokesperson Andy Stone, the company does not put profit over people's well-being and has invested $13 billion to improve safety on the platform. He further claimed that the company has hired 40,000 people to reduce toxicity on the platform, arguing that its own commercial interests would suffer if it didn't care about users' well-being. He also said the company supports updated regulations, with government regulators setting standards and norms for social media platforms to follow.

Next: How To Stop Facebook Identifying You By Disabling Facial Recognition

Source: The New York Times
