Facebook fixed? How Europe’s data law puts pressure on Big Tech.




Why We Wrote This

A new EU law calls on Big Tech companies to open up their algorithmic “black boxes” and better moderate online speech. The goal is no less than preserving the public square on which democracies depend.

Brussels

Sweeping new European Union legislation seeks to “revolutionize” the internet, forcing social media giants including Facebook and YouTube to take steps to tamp down the spread of extremism and disinformation online.

Known as the Digital Services Act (DSA), it is likely to create ripple effects that could change how social media platforms behave in America, too.

In one of the most remarkable requirements of the new law, Big Tech companies with more than 45 million users will have to hand over access to their so-called algorithmic black boxes, lending greater clarity to how certain posts – particularly the divisive ones – end up at the top of social media news feeds.

Companies must also put in place systems designed to speed up how quickly illegal content is pulled from the web, prioritizing requests from “trusted flaggers.”

And if platforms recognize patterns that are causing harm and fail to act, they will face hefty fines.

“We need to get under the hood of platforms and look at the ways in which they are amplifying and spreading unhealthy content such as hate speech,” says Joe Westby, deputy director of Amnesty Tech at Amnesty International in London.  

“The DSA is a landmark law trying to hold these Big Tech companies to account,” he adds.

Unlocking the Big Tech business model

Big Tech companies have long endeavored to shrug off regulation by invoking freedom of speech. The DSA takes the tack that while ugly and divisive speech shouldn’t be policed, neither should it be promoted – or artificially amplified. 

But in order to sell ads and collect user data – which they also sell – the big online platforms have been doing precisely this. 

The key to this business model is keeping users online for as long as possible, in order to collect as much data about them as possible.

And research has shown that what keeps people reading and clicking is content that makes them mad, notes Jan Penfrat, senior policy adviser at European Digital Rights, a Brussels-based association.

This in turn gives Big Tech companies motive to prioritize and push out anger-inducing content that provokes users “to react and respond,” he says. 
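The dynamic Mr. Penfrat describes can be illustrated with a toy example. The sketch below is a hypothetical engagement-weighted feed ranking – not any platform’s actual algorithm – and the weights and post data are invented.

```python
# Hypothetical sketch of engagement-based feed ranking (not any platform's
# real algorithm): posts that provoke reactions score higher and rise to
# the top of the feed, keeping users online longer.

def engagement_score(post):
    # Comments and shares are weighted more heavily than likes, since
    # they signal the strong reactions that drive further engagement.
    return (post["comments"] * 3.0
            + post["shares"] * 2.0
            + post["likes"] * 1.0)

def rank_feed(posts):
    # Highest engagement score first.
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"id": "calm-news", "likes": 120, "comments": 10, "shares": 5},
    {"id": "divisive-rant", "likes": 40, "comments": 90, "shares": 60},
]
print([p["id"] for p in rank_feed(posts)])  # the divisive post ranks first
```

Under these invented weights, the post with fewer likes but far more comments and shares outranks the calmer one – the pattern critics say rewards anger-inducing content.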

This point was driven home last year by a trove of internal documents made public by whistleblower and former Facebook data engineer Frances Haugen.

In leaked company communications, an employee laments that extremist political parties were celebrating Facebook’s algorithms, because they rewarded their “provocation strategies” on subjects ranging from racism to immigration and the welfare state.

It was one of many examples in those documents of how Facebook’s algorithms appeared to “artificially amplify” hate speech and disinformation. 

To endeavor to fix this, the DSA will require Big Tech companies to conduct and publish annual “impact assessments,” which will examine their “ecosystem of users and whether or not – or how – recommendation algorithms direct traffic,” says Peter Chase, senior fellow at the German Marshall Fund in Brussels. 

“It’s asking these large platforms to think about the social impact they have.” 

There are insights to be had from these sorts of regular exercises, analysts say. 

Twitter, which has a reputation for publishing self-critical research, made public an internal evaluation last October that found its own algorithms favor conservative over left-leaning political content.

What it couldn’t quite figure out, the company admitted, was why.

The DSA aims to provide some clarity on this front by requiring Big Tech companies to open up their algorithmic black boxes to academic researchers approved by the European Commission. 

In this way, EU officials hope to glean insights into, among other things, how Big Tech companies moderate and rank social media posts. “On what basis do they recommend certain types of content over others? Hide or demote it?” Mr. Penfrat asks.

And under the law, if Big Tech companies discover patterns of artificial amplification that favor hate speech and disinformation pushed out by bad actors and bots – what social media companies call “coordinated inauthentic behavior” – and don’t take action to stop it, they face devastating fines.

These could run up to 6% of a company’s global annual sales. Repeat offenders could be barred from operating in the EU.
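To illustrate the scale of that ceiling, here is a rough calculation using an invented revenue figure; the number is for illustration only, not any real company’s sales.

```python
# Illustration of the DSA's fine ceiling: up to 6% of a company's global
# annual sales. The revenue figure below is invented for illustration.

def max_dsa_fine(global_annual_sales, rate=0.06):
    # The DSA caps fines at 6% of global annual sales.
    return global_annual_sales * rate

sales = 100_000_000_000  # hypothetical $100 billion in global annual sales
print(f"Maximum fine: ${max_dsa_fine(sales):,.0f}")  # up to $6 billion
```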

“They have to do something about it, or they can get caught,” says Alex Engler, fellow in Governance Studies at the Brookings Institution.

“So they can’t just shrug their shoulders and say, ‘We don’t have a problem.’”

Margrethe Vestager, who chairs the European Commission’s Group on a Europe Fit for the Digital Age, speaks during a news conference about the Digital Services Act and the Digital Markets Act at the European Commission headquarters in Brussels, Dec. 15, 2020.

“Weaponized” ads – and the law’s response

Up until now, such evasiveness is precisely what has characterized Big Tech companies, and analysts say it’s largely because promoting divisive content has been so wildly profitable.

Mr. Penfrat recalls the surprise of EU policymakers he lobbied when he would explain the nearly unfathomable amount of personal data that tech giants commodify – and how they often tap the emotional power of anger via a “surveillance-based” advertising model.

“Every single time you open a website, hundreds of companies are bidding for your eyeballs,” he says. In a matter of “milliseconds,” the ads pushed by data brokers who have won the bid are loaded for web users to view.

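The millisecond-scale bidding Mr. Penfrat describes is known in the ad industry as real-time bidding. A minimal sketch of one common variant – a second-price auction, where the winner pays the runner-up’s bid – with invented bidder names and amounts:

```python
# Minimal sketch of a real-time-bidding style auction: advertisers bid on a
# single ad impression, the highest bidder wins, and (in a second-price
# auction, common in programmatic advertising) pays the runner-up's price.
# Bidder names and amounts are invented for illustration.

def run_auction(bids):
    # Rank bidders from highest to lowest bid.
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    # Winner pays the second-highest bid (or its own, if unopposed).
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

bids = {"brand_a": 0.85, "brand_b": 1.20, "data_broker_c": 0.95}
winner, price = run_auction(bids)
print(winner, price)  # brand_b wins the impression, pays 0.95
```

In a real exchange this whole auction, from bid request to ad delivery, completes in the milliseconds it takes a page to load.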
But it’s not just goods and services that advertisers are selling. “Anyone can pay Facebook to promote certain types of content – that’s what ads are. It can be political and issues-based,” Mr. Penfrat says.

And bad actors have taken advantage of this, he notes, pointing to how the Russian government “weaponized” ads to push its preferred candidates in U.S. elections and justify war in Ukraine.

The DSA will ban using sensitive data, including race and religion, to target ads, and prohibit ads aimed at children altogether. It also makes it illegal to use so-called dark patterns, manipulative practices that trick people into things such as consenting to let online companies track their data.

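A platform’s compliance check for these targeting rules might look roughly like the sketch below; the category names and function are illustrative, not drawn from any actual codebase or from the law’s own terminology.

```python
# Hypothetical compliance check reflecting the DSA's ad-targeting bans:
# strip targeting attributes based on sensitive data (e.g., race, religion)
# and reject campaigns aimed at children. Categories are illustrative.

SENSITIVE_CATEGORIES = {"race", "ethnicity", "religion",
                        "sexual_orientation", "health", "political_opinion"}

def validate_targeting(targeting):
    # Ads may not be aimed at minors.
    if targeting.get("min_age", 18) < 18:
        raise ValueError("targeting minors is prohibited")
    # Drop any targeting keys that rely on sensitive personal data.
    return {k: v for k, v in targeting.items()
            if k not in SENSITIVE_CATEGORIES}

clean = validate_targeting(
    {"min_age": 25, "religion": "any", "interests": ["sports"]})
print(clean)  # sensitive key removed: {'min_age': 25, 'interests': ['sports']}
```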
What’s more, it requires Big Tech companies to speed up their processes for taking down illegal posts – including terrorist content, so-called revenge porn, and hate speech in countries that ban it – in part by prioritizing the recommendations of “trusted flaggers,” which could include nonprofit groups approved by the EU.

Likewise, if companies remove content that they say violates these rules, they must notify people whose posts are taken down, explain why, and have appeals procedures.

“You’ve got these mechanisms today, but they’re very untransparent,” Mr. Penfrat says. “You can appeal but never get a response.”

A European law with U.S. effects

The DSA has been received by data-policy experts with a mix of skepticism and praise – with some voicing worry about unintended harm to competition or the diversity of online speech.

Yet the DSA is expected to drive policy in the United States as well as in Europe, says Mr. Engler, who studies the impact of data technologies on society.

“This is a typical example of the ‘Brussels effect’: the idea that when Europe regulates, it ends up having a global impact,” he adds. “Platforms don’t want to build different infrastructure based on whether the IP address is in Europe.”

And as academics are able to delve into Big Tech’s black boxes, the mitigating measures they suggest will not only be a good starting point for public debate, but could provide inspiration for America, too.

During Ms. Haugen’s whistleblower testimony, U.S. lawmakers signaled that they could be open to the sorts of regulations that the DSA puts in place.

At a press conference following the congressional testimony last October, Sen. Richard Blumenthal, a Connecticut Democrat, marveled at the bipartisan agreement on the need for reform.

“If you closed your eyes, you wouldn’t even know if it was a Republican or a Democrat” speaking, he said. “Every part of the country has felt the harms that are inflicted by Facebook and Instagram.”

American and European regulators say this is true on both sides of the Atlantic.