Executives from four social media platforms appeared before a Senate committee on Tuesday. Members of the Senate Homeland Security Committee criticized Meta over the flood of child sexual abuse material spreading across its platforms. They also challenged TikTok over the risks posed by its Chinese ownership.
The committee members also criticized each of the platforms for the role they played in spreading QAnon conspiracies, as well as misinformation about elections and vaccines. Senator Rob Portman said social media platforms have offered groundbreaking connectivity, often in positive ways.
He added that these platforms have also raised serious concerns for our children, our national security, and our culture. Extremists, terrorists, drug dealers, criminals, and other dangerous actors have used them to pursue their objectives. The same committee separately heard from Brian Boland, a former Facebook vice president.
Social Platforms' Incentive Is to Drive User Engagement
Boland said that you cannot know today what is happening inside most of these companies, so you have to trust them, and that he had lost that trust. Chairman Gary Peters picked up that thread early on, building on the former executives' testimony to portray the platforms as lacking any real financial incentive to keep users safe.
Peters said the platforms' incentive is to maximize user engagement and grow in order to generate revenue, and in his opening remarks he challenged each of the companies to rebut that narrative. Meta chief product officer Chris Cox said he was one of the first 15 software engineers at the company, and that it is essential for social platforms to help people feel safe.
Meta Stands Firmly Against the Misuse of Its Platforms
Cox added that his company stands firmly against the misuse of social media to spread hate and violence, and that Meta prohibits hate speech, terrorist content, and other dangerous material. He also described the mechanisms Meta uses to enforce its policies: the company has hired global content review teams and invested billions of dollars in technology.
YouTube chief product officer Neal Mohan also appeared before the committee. His testimony came a day after a report from disinformation researchers at Bot Sentinel, which described a pattern of unchecked hate speech, racism, misogyny, and targeted harassment, much of it directed at high-profile women.
Pew Research Published a Study in 2017
Bot Sentinel founder Christopher Bouzy, however, said YouTube shares the blame, arguing that most people would not engage in such behavior if YouTube stopped rewarding it. Mohan responded that the vast majority of creators, advertisers, and viewers do not want to be associated with harmful or borderline content. Notably, Pew Research published a study in 2017 that was critical of social platforms.
The study found that indignant rhetoric and politically divisive posts tended to elicit more user engagement. It also noted that Facebook's ranking algorithm began treating emoji reactions, such as angry faces and smiley faces, as five times more valuable than likes, on the theory that posts provoking reactions keep users more engaged.
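As a purely illustrative sketch of that weighting (Facebook's actual ranking system is proprietary and far more complex; the function name, weights, and example numbers below are assumptions for illustration only), a post's engagement score might be computed like this:

```python
# Toy illustration of the reported weighting: emoji reactions counted
# five times as heavily as likes when scoring a post for ranking.
# This is NOT Facebook's code; names and numbers are assumptions.

LIKE_WEIGHT = 1
REACTION_WEIGHT = 5  # angry faces, smiley faces, etc.

def engagement_score(likes: int, reactions: int) -> int:
    """Hypothetical engagement score with reactions weighted 5x likes."""
    return LIKE_WEIGHT * likes + REACTION_WEIGHT * reactions

# A post with 100 likes and 40 emoji reactions (score 300) would outrank
# a post with 250 likes and no reactions (score 250).
print(engagement_score(250, 0))   # 250
print(engagement_score(100, 40))  # 300
```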