Why? It comes down to an important piece of technology legislation passed in the EU last year that has not received enough attention (IMO), particularly in the US. I'm referring to a pair of bills called the Digital Services Act (DSA) and the Digital Markets Act (DMA), and this is your signal to start paying attention.
The legislation is actually quite revolutionary, setting a global benchmark for how user-generated content is regulated on tech platforms. The DSA deals with digital safety and transparency at tech companies, while the DMA addresses antitrust and competition in the industry. Let me explain.
The DSA hit a major milestone a few weeks ago. As of February 17, 2023, every major tech platform operating in Europe was required to self-report its size, which is used to sort companies into tiers. The largest platforms, those with over 45 million monthly active users in the EU (about 10% of the EU's population), are creatively named “Very Large Online Platforms” (VLOPs) or “Very Large Online Search Engines” (VLOSEs) and will be held to the strictest transparency and regulatory standards. Smaller online platforms have far fewer obligations, part of a policy intended to encourage competition and innovation while still holding Big Tech to account.
“If you ask [small businesses] to hire 30,000 moderators, for example, you will kill small business,” Henri Verdier, France's digital ambassador, told me last year.
So what will the DSA actually do? So far, at least 18 companies have declared that they qualify as VLOPs or VLOSEs, including big players like YouTube, TikTok, Instagram, Pinterest, Google, and Snapchat. (If you want a full list, Martin Husovec, a law professor at the London School of Economics, has a nifty Google doc tracking where all the big players stand, along with an accompanying explainer.)
The DSA will require these companies to assess risks on their platforms, such as the potential for illegal content or election manipulation, and to make plans for mitigating those risks, with independent audits to verify safety. Smaller companies (those with under 45 million users) will also have to meet new content moderation standards, including promptly removing illegal content once it is flagged, notifying users of that removal, and stepping up enforcement of their existing company policies.
Supporters of the legislation say it will help bring an end to the era of tech companies' self-regulation. “I don't want companies deciding what is and isn't forbidden without any separation of powers, without any accountability, without any reporting, without any possibility to contest,” Verdier said. “It's very dangerous.”
Importantly, the bill makes clear that platforms are not liable for illegal user-generated content unless they become aware of the content and fail to remove it.