EU Commission opens formal investigation into TikTok, focused on child protection


The European Commission opened formal proceedings against TikTok under the Digital Services Act on Monday (19 February), due to possible breaches in several areas, including child protection.

The Digital Services Act (DSA), which became applicable to all platforms operating in the EU on 17 February, is a horizontal piece of legislation regulating how online actors must deal with illegal and harmful content.

The decision follows a preliminary investigation based on the risk assessment report TikTok submitted to the Commission in September last year, as well as on the Commission's formal requests for information on illegal content, the protection of minors, and data access.

As European Commissioner for the Internal Market Thierry Breton also pointed out in a post on X, the investigation will focus on possible breaches of transparency and minor-protection obligations, in particular addictive design, screen time limits, the 'rabbit hole' effect, age verification, and default privacy settings.

“TikTok has pioneered features and settings to protect teens and keep under 13s off the platform, issues the whole industry is grappling with,” a TikTok spokesperson told Euractiv.


“We’ll continue to work with experts and industry to keep young people on TikTok safe, and look forward to now having the opportunity to explain this work in detail to the Commission,” the spokesperson added.

The Commission noted that compliance with the DSA includes the assessment and mitigation of systemic risks. However, existing mitigation measures, such as TikTok's age verification tools, may not be deemed reasonable, proportionate, or fully effective in addressing these concerns.

The Commission also points out that the DSA requires platforms to implement suitable and proportionate measures to guarantee minors' privacy, safety, and security, including default privacy settings for minors within the design and functioning of their recommender systems. Compliance also entails maintaining a searchable and reliable repository of advertisements shown on TikTok.

TikTok’s efforts to enhance platform transparency are also under scrutiny. The inquiry focuses on potential deficiencies in granting researchers access to the platform’s publicly available data, which is also a requirement under the DSA.


Addictive design

Social media platforms have been criticised for being addictive by design, built to make people spend as much time on them as possible. One such feature is the 'rabbit hole' effect, now mentioned by Breton, in which algorithms show users more of a specific type of content the more they interact with it, for example by liking it or simply by spending more time on one type of content than another.

On 12 December last year, the European Parliament adopted with a broad majority the initiative to make digital platforms less addictive at its plenary session in Strasbourg.

Child protection

Under the DSA's rules, online platforms with more than 45 million monthly users are considered to pose a 'systemic risk' to society; hence, they must follow a specific content moderation regime, including transparency and risk management obligations.

The EU executive announced the first batch of very large online platforms (VLOPs) and very large online search engines (VLOSEs) in April last year, a list that was then updated at the end of December.

The list also includes social media platforms such as Meta's Instagram and Facebook, as well as TikTok and X.

In the December update of the list, which added three pornography platforms (XVideos, Pornhub, and Stripchat), both Breton and Executive Vice-President in charge of competition Margrethe Vestager had already emphasised the protection of minors as one of the priorities under the DSA.


Now, according to a document seen by Reuters, Breton said, “The protection of minors is a top enforcement priority for the DSA.”

“As a platform that reaches millions of children and teenagers, TikTok must fully comply with the DSA and has a particular role to play in the protection of minors online,” the Commissioner added.

Next steps

The Commission will now carry out an in-depth investigation, continuing to gather evidence that could establish whether TikTok committed infringements under the DSA.

By opening formal proceedings, the Commission gains further enforcement powers, such as imposing interim measures and adopting non-compliance decisions, or accepting remedies if TikTok offers them.

The timeline of such investigations depends on several factors, and there is no legal deadline by which the Commission must conclude the proceedings.

Meanwhile, on 7 February, Meta and TikTok confirmed they are suing the European Commission over the annual supervisory fee that companies designated under the DSA must pay.

[Edited by Nathalie Weatherald]
