
EU DSA enforcers send Snapchat, TikTok and YouTube more questions on AI risks

The European Union on Wednesday asked Snapchat, TikTok and YouTube for more details about their respective content recommendation algorithms, activities covered by the EU’s online governance rulebook, the Digital Services Act (DSA).

In a press release, the Commission said it had sent requests for information (RFIs) to the three social media platforms, asking for more details about the design and functioning of their algorithms. The trio has until November 15 to supply the requested information.

The EU said their responses would inform further steps, such as potentially opening a formal investigation.

The bloc’s online governance framework includes tough penalties for violations (up to 6% of global annual turnover). It applies an additional layer of systemic risk mitigation rules to the three platforms on account of their designation as VLOPs (i.e. very large online platforms).

These rules require larger platforms to identify and mitigate risks that may arise from their use of artificial intelligence as a content recommendation tool, with the law stating that they must take action to prevent negative impacts in a range of areas, including users’ mental health and civic discourse. The EU has also warned that algorithms designed to maximize engagement can drive the spread of harmful content. This appears to be the focus of the latest RFIs.

“Questions also concern the measures used by platforms to mitigate the potential impact of their recommendation systems on the spread of illegal content, such as the promotion of illicit drugs and hate speech,” the EU added.

In the case of TikTok, the Commission is requesting more detailed information on the anti-manipulation measures implemented to prevent malicious actors from using the platform to spread harmful content. The EU is also asking TikTok for more information on how it mitigates risks related to elections, media pluralism and civic discourse – systemic risks it says can be amplified by recommendation systems.

These latest requests for information aren’t the first the Commission has sent to the three platforms. Earlier DSA requests quizzed the trio (and several other VLOPs) about electoral threats ahead of the European Parliament elections earlier this year. It also previously questioned all three about child protection issues. Additionally, last year the Commission sent TikTok an RFI asking how it was responding to risks related to content about the war between Israel and Hamas.

However, the ByteDance platform is the only one of the three social media products under formal DSA investigation to date. In February, the bloc opened an investigation into TikTok’s DSA compliance, expressing concern over a range of issues including the platform’s approach to protecting minors and its management of the risk of addictive design and harmful content. That investigation is ongoing.

TikTok spokesperson Paolo Ganino emailed TechCrunch a statement confirming the development: “This morning we received a request for information from the European Commission, which we will now consider. We will cooperate with the Commission throughout the RFI process.”

We also contacted Snap and YouTube for responses to the Commission’s latest requests for information.

The DSA’s VLOP rules have been in place since late last summer, but the bloc has yet to conclude any of the several probes it has opened into larger platforms. However, in July, the Commission presented preliminary findings in part of its investigation into X, saying it suspects the social network of violating the DSA’s rules on dark pattern design; on providing researchers with access to data; and on advertising transparency.

This article was originally published on techcrunch.com.
