April 15, 2024
Lawmakers questioned social media CEOs — but don't expect anything to change


At a Senate Judiciary Committee hearing this week, tech CEOs faced questions about how they protect children on their platforms. The hearing came during a turbulent stretch for social media, with UMG pulling its music from TikTok, X (formerly known as Twitter) responding to AI-generated deepfakes of Taylor Swift, and more.

However, rather than a workable plan for protecting children while maintaining privacy for users, we are likely to see more political theater, with executives continuing to apologize for the harm their platforms have caused while lawmakers demand more action.

“I don’t think this time it will be much different in terms of consequences than what we’ve seen in the past,” warned Dr. Cliff Lampe, professor of information and associate dean for academic affairs in the School of Information at the University of Michigan.

Kids Online Safety Act

Lawmakers are pushing the Kids Online Safety Act (KOSA), which would establish a “duty of care” for social media and other technology platforms and require them to provide more parental controls. Given that privacy advocates, including the Electronic Frontier Foundation and the American Civil Liberties Union, have already expressed concerns about the proposed legislation, a middle ground seems unlikely.

“Many of these social and digital platform CEOs have appeared at Senate hearings before, where lawmakers used it as an opportunity to push their agenda,” said Jason Mollica, a lecturer in the School of Communication at American University. “Senators took the opportunity to question them again on January 31, but they basically agreed that online safety for children is uppermost in their minds. Despite both parties finding common ground, the Kids Online Safety Act (KOSA) is being criticized by those it is potentially intended to protect: young people.”

Then there’s the fact that existing law already shields the same companies that keep finding themselves in the hot seat with lawmakers.

“The key law to understand here is Section 230 of the 1996 Communications Decency Act, which protects platforms from being held responsible for the content they host,” Lampe explained. “That means Congress can put moral and ‘bully pulpit’ pressure on social media companies but can’t do much that is meaningful under the law. They can shame the companies, but they can’t do much else without amending or changing 230. There have been some halting efforts from both parties to do so, but it’s pretty well-established law at this point.”

Business as usual once again

Even though Meta CEO Mark Zuckerberg apologized to parents who said their children suffered or even died because of content on Facebook’s platform, it’s likely that business as usual will continue.

“The main problem is the economic model under which these services operate, where advertisers foot the bill and users are the product,” said Rob Enderle, a technology industry analyst at the Enderle Group.

“As long as that’s the case, the motivation to protect users will be subordinate to the need for revenue-generating clicks. The right thing to do would be to make it illegal to monetize the users of a service through advertising, so that the focus would rest on the safety and satisfaction of the users who are the revenue source, rather than on maximizing revenue by giving user safety a much lower priority than clicks,” Enderle said.

He added that problems arise whenever the incentives are wrong, and decoupling revenue from users has proven to be just such a case.

“Products and services in which users provide the revenue are safer, and I doubt that any regulation that falls short of aligning user care with revenue generation will be effective, because compensation drives behavior,” Enderle said.

Another possible solution would be to hold advertisers responsible for harms caused on the platforms they fund, which would push the industry to focus more on user safety.

“So, either tie profits to the users or punish the advertisers for user abuse, since you can argue that the services are effectively paid agents of the advertisers,” Enderle said. “It makes me wonder whether this latter approach could also fix our problem with fake news.”

What if the business model doesn’t change?

Since it appears the business model will not change, the question is whether lawmakers have enough incentive to actually address the issue. Can KOSA even get enough support, or is it just DOA?

“On paper, this looks good,” Mollica suggested. “However, many, including human rights organizations and LGBTQ groups, have pushed back against it. KOSA aims to require social and digital platforms to take measures to prevent harm to children. However, the definition of ‘harm’ is not particularly clear. That’s why some are saying it could censor conversations on gender identity and reproductive health issues, as well as political content.”

As it stands, the legislation faces an uphill climb, despite bipartisan support and the backing of Snapchat’s parent company Snap. Even if KOSA becomes law, its legality could be tied up in the courts for years to come.

“Protecting young people online is obviously a good idea. Limiting content as a form of protection is not the way to go, however,” Mollica added. “There will be more statements about protecting children from legislators and social and digital CEOs. That’s likely to be all.”

Lawmakers have previously taken the stance that they do not want to regulate tech companies. There is no reason to believe it is any different now.

“It’s not clear that legislators have any strong incentive to actually change what the platforms are doing here,” Lampe said. “Replacing 230 would be a challenge, but there are other privacy and security laws they could pass in this area, and they haven’t. My concern is that this hearing was more performative than anything else.”
