Digital Services Act | Commission designates first set of Very Large Online Platforms and Search Engines

The Commission adopted, on Tuesday, 25 April 2023, the first designation decisions under the Digital Services Act (DSA), designating 17 Very Large Online Platforms (VLOPs) and 2 Very Large Online Search Engines (VLOSEs) that reach at least 45 million monthly active users.

Very Large Online Platforms: Alibaba AliExpress; Amazon Store; Apple AppStore; Booking.com; Facebook; Google Play; Google Maps; Google Shopping; Instagram; LinkedIn; Pinterest; Snapchat; TikTok; Twitter; Wikipedia; YouTube; Zalando

Very Large Online Search Engines: Bing; Google Search

The platforms have been designated based on the user numbers that they were required to publish by 17 February 2023. Following their designation, the companies will have to comply with the full set of new obligations under the DSA within four months. These obligations aim to empower and protect users online, including minors, by requiring the designated services to assess and mitigate their systemic risks and to provide robust content moderation tools. This includes:

  • More user empowerment:
    • Users will get clear information on why they are recommended certain information and will have the right to opt out of recommendation systems based on profiling;
    • Users will be able to report illegal content easily, and platforms will have to process such reports diligently;
    • Advertisements cannot be displayed based on the sensitive data of the user (such as ethnic origin, political opinions or sexual orientation);
    • Platforms need to label all ads and inform users about who is promoting them;
    • Platforms need to provide an easily understandable, plain-language summary of their terms and conditions, in the languages of the Member States where they operate.
  • Strong protection of minors:
    • Platforms will have to redesign their systems to ensure a high level of privacy, security, and safety of minors;
    • Targeted advertising based on profiling towards children is no longer permitted;
    • Special risk assessments, including for negative effects on mental health, will have to be provided to the Commission four months after designation and made public at the latest a year later;
  • More diligent content moderation, less disinformation:
    • Platforms and search engines need to take measures to address risks linked to the dissemination of illegal content online and to negative effects on freedom of expression and information;
    • Platforms need to have clear terms and conditions and enforce them diligently and non-arbitrarily;
    • Platforms need to have a mechanism for users to flag illegal content and act upon notifications expeditiously;
    • Platforms need to analyse their specific risks, and put in place mitigation measures – for instance, to address the spread of disinformation and inauthentic use of their service.
  • More transparency and accountability:
    • Platforms need to ensure that their risk assessments and their compliance with all the DSA obligations are externally and independently audited;
    • They will have to give access to publicly available data to researchers; later on, a special mechanism for vetted researchers will be established;
    • They will need to publish repositories of all the ads served on their interface;
    • Platforms need to publish transparency reports on content moderation decisions and risk management.

Within four months of notification of the designation decisions, the designated platforms and search engines need to adapt their systems, resources, and processes for compliance, set up an independent system of compliance, and carry out, and report to the Commission, their first annual risk assessment.
