The uproar for social media companies started in early January 2021. The attack on the United States Capitol prompted Twitter, Facebook and YouTube to ban then-President Donald Trump. Throughout the year, the companies were challenged to stop the spread of baseless claims about the 2020 presidential election, as well as damaging misinformation about vaccines.
Facebook had to respond to a whistleblower's revelations just as it wanted to draw everyone's attention to the "metaverse." Twitter's eccentric CEO abruptly departed, handing over the company, along with its ambitions to create a new take on social media, to a little-known deputy. The Trump administration's attempt to ban TikTok on national security grounds failed, allowing the Chinese-owned app to consolidate its grip as a defining engine of youth culture.
It’s fair to say that social media apps were at the center of politics and society in 2021, and not always for the better. And yet many of them prospered financially, posting record profits.
So what will 2022 bring? Here are four areas to watch this year.
Lawmakers say they want to regulate Silicon Valley. Can they agree on what this means?
If members of Congress agree on one thing, it’s that the tech giants are too big and too powerful. (Apple became the first publicly traded company worth $3 trillion on Monday.)
But the agreement ends there. Democrats want laws that force tech companies to remove more harmful content. Republicans say the platforms censor conservative views, despite evidence showing that right-wing content and figures thrive on social media.
The best chance for bipartisanship may come from a Senate commerce subcommittee headed by Sen. Richard Blumenthal, D-Conn., and Sen. Marsha Blackburn, R-Tenn. They say they want to work together, especially when it comes to protecting children and teens online. (The two decried what Instagram shows young users after their staff created fake accounts on the photo-sharing platform.)
Lawmakers have introduced a slew of bills targeting the tech giants, ranging from making social media platforms liable for health disinformation, to requiring companies to open more data to outside researchers, to updating the two-decade-old Children’s Online Privacy Protection Act. They also want to strengthen competition law and give more firepower to the Federal Trade Commission and the Department of Justice, the agencies that police Big Tech.
The question is whether any of these bills will become law in 2022.
As Washington stalls, Europe rushes to counter Big Tech
European regulators have been more willing to take on the tech giants, perhaps because they have little interest in protecting U.S. supremacy in the industry, and because many Europeans are more comfortable with government intervention to protect citizens.
The European Union is writing tough new rules that would prevent big tech companies from prioritizing their own products and services, like Amazon pushing people to the items it sells over those from third-party vendors. They would also force companies to crack down on harmful content, such as child sexual abuse and terrorism, and give users more control over how their data is used to target ads.
The UK recently set new standards for how apps should be designed for children, including providing parental controls, disabling location tracking and limiting the data they collect. Companies like Instagram, TikTok, and YouTube are already making changes to comply. For these global companies, it is often easier to implement new rules universally than to try to apply a mishmash of different policies for users in different countries. Regulations adopted in Europe could affect users far beyond the continent.
Businesses will struggle to understand how their apps affect children’s mental health and safety
Reports that Instagram was building a version of its photo-sharing app for children under 13 sparked criticism from parents and regulators and bipartisan outrage in Congress. The outcry was amplified by internal research leaked by Facebook whistleblower Frances Haugen, which revealed that Instagram knew its platform was toxic for some teenage girls.
Under pressure, Instagram paused work on the kids’ app in September, but Instagram head Adam Mosseri made it clear to Congress in December that the company still plans to pursue the project. He says kids are already online, so it would be better if they were using a version of Instagram with parental controls.
All social media platforms, from Instagram to TikTok and Snapchat, will be grappling with this in 2022. Children and teens are a critical demographic, essential for business growth.
The midterm elections are approaching. What will Facebook, TikTok and others do against disinformation?
Companies say they have learned a lot by dealing with adversaries ranging from Russian trolls and Chinese influence operations to elected officials spreading disinformation and companies selling espionage as a service.
But the challenges facing their platforms also continue to evolve. Using social media to sow discord, undermine authoritative information, and spread rumors and lies is now a tactic used by anti-vaccine campaigners, far-right extremists and climate change deniers. So, in 2022, you can expect elected officials and candidates to continue spreading disinformation online.
There is already pressure on social media companies to staff up before the campaigns really begin. Some lawmakers are eager to pass laws that push companies to do more to stop the spread of harmful or bogus content, but such mandates could clash with the First Amendment rights of the technology platforms.
Meanwhile, executives like Mark Zuckerberg, CEO of Meta (formerly Facebook), have made it clear that they don’t want to be the arbiters of what people can say online. And do people really want these unelected business leaders to have that power? We will await the answer to that question in 2022.
Editor’s Note: Amazon, Apple and Google are among the financial backers of NPR. Meta pays NPR to license NPR content.