The Digital Services Act and the American Web Space

By: Julianna Simpson

What is the Digital Services Act?


The European Union’s (“EU”) Digital Services Act (“DSA” or the “Act”), adopted by the European Parliament in July 2022, will apply to all platforms operating in the EU beginning in February 2024. Primarily seeking to protect consumer rights in the online space, the DSA will impose robust data privacy measures, require the reporting of illegal content, and enhance protections for children on the Internet.

The multi-faceted Act addresses often-repeated concerns about online platforms. For instance, it seeks to combat the spread of misinformation, algorithmic targeting, the sale of illegal products, and mental health harms to children and teens. The DSA restricts data-driven advertising, which often targets children or singles out users based upon their sexual orientation, ethnicity, or other protected characteristics. Additionally, the Act bans coercive designs that pressure users into giving up their personal data. Companies commonly employ these designs, called “dark patterns,” by requesting permission to track users’ data and then burying the option to opt out of such tracking through selective font size or placement. Such measures are part of the Act’s larger initiative to protect users’ privacy and improve transparency within the digital space.


Who Does the Digital Services Act Affect?

The DSA affects both consumers and providers of digital services in three distinct categories: businesses, platforms, and users. As defined within the Act, digital services range “from simple websites to Internet infrastructure services and online platforms.” This broad definition primarily reaches social media networks, online marketplaces, and companies’ websites, which are often the source of concern among policymakers and consumers alike.

The DSA seeks to make digital platforms gatekeepers of the online space, imposing the strictest obligations upon this group. Upon its initial enactment, the DSA applied only to designated “very large online platforms,” defined as those with over 45 million users in the EU. Companies designated as such must perform risk assessments and implement mitigation measures to address the risks they identify. For instance, minors accessing TikTok in the EU will no longer receive targeted ads within the application. The Act has been phased in gradually, and beginning on February 17, 2024, all platforms must comply with its standards. While very large platforms face more robust regulation, smaller platforms will soon have to meet transparency requirements, vet third-party suppliers, and satisfy other similar obligations designed to protect users.

The Act similarly affects businesses that use the online space. Such businesses have special reporting privileges regarding illegal content, as well as obligations to protect their own users from it. Incitement of terrorism, the sale of illegal goods, and illegal hate speech are among the activities the Act seeks to address through robust flagging protocols. The Act also aims to both protect and empower consumers, primarily by safeguarding rights of privacy and expression in online forums. Under the Act, for instance, a website’s Terms & Conditions must be explained in plain language to keep users informed. The Act seeks to give users more knowledge about their rights and protections and provides additional reporting mechanisms.


Why Should I Care?

Although the DSA currently applies only within the EU, many proponents suggest that it may have broader implications. First, many of the “very large online platforms” are based in the United States, such as Apple, Amazon, Meta, and Google. Many of these companies are working with the European Commission to come into compliance and, ideally, to improve their platforms. Indeed, the language of the Act reaches both platforms headquartered in the EU and those with a “substantial connection” to the EU. As such, improvements may spread beyond the EU and set a trend for the online space globally.

Some argue that the DSA fundamentally conflicts with recent United States laws that take a more hands-off approach to online regulation. For instance, Texas House Bill 20 of 2021 (“HB 20”) prohibits large online platforms from moderating online speech based on viewpoint, which directly clashes with some of the content-moderation policies implemented in the DSA. Under the DSA, content such as illegal speech or misinformation must be removed upon notice to the platform, while the Texas law seeks to do just the opposite.

Nonetheless, HB 20 has come under fire as unconstitutionally restricting First Amendment rights by prohibiting media platforms from censoring content and by imposing strict disclosure requirements. The United States Court of Appeals for the Fifth Circuit rejected the constitutional challenge to HB 20 in NetChoice, LLC v. Paxton, and the appeal is set for oral argument at the Supreme Court in February. In contrast, the United States Court of Appeals for the Eleventh Circuit reached the opposite result, determining that a similar Florida law likely could not survive First Amendment scrutiny because its content-moderation requirements did not promote any legitimate state interest. That case is also set for argument on February 26th, and the outcomes of these cases will likely indicate the potential for DSA-style regulation to succeed in the United States.

Finally, the DSA raises questions about cross-jurisdictional regulation of the internet. The DSA could become a model for regulation elsewhere or, alternatively, something to avoid. Whether the DSA becomes an example for global regulation remains to be seen, but it will certainly have a profound impact on the way we operate online.
