The European Union suspects that Meta's social networks, Facebook and Instagram, have breached the bloc's rules for larger platforms in the area of election integrity.
The Commission has opened formal infringement proceedings to examine Meta's compliance with the Digital Services Act (DSA), the bloc's online governance and content moderation framework. Breaches of the rules can attract penalties of up to 6% of a company's global annual turnover.
The EU has several concerns about Meta's practices. First, it suspects Meta's moderation of political ads may be ineffective. Second, it considers Meta's policies for moderating non-paid political content both unclear and overly restrictive, whereas the DSA requires platforms to apply transparent and accountable policies. Finally, it is scrutinizing Meta's approach to enabling outside monitoring of elections.
The investigation also covers Meta's mechanisms for users to report illegal content, which the Commission believes are not sufficiently user-friendly, as well as its internal system for handling complaints about content moderation decisions, which it suspects is ineffective.
Meta does not appear to have implemented an effective system for moderating content when it is paid to display advertising, according to a Commission official briefing journalists on background, and this is one of the factors that led to the opening of multiple investigations. "This includes advertisements that a generative AI might produce, such as deepfakes," the official said. "Advertisements have been used, or seem to have been used, by malicious actors to interfere in foreign affairs."
The EU is drawing on independent research, enabled by a DSA requirement for large platforms to publish a searchable ad archive, which suggests that Russian influence campaigns targeting elections through paid advertising have exploited Meta's ad platform. The Commission also says it has found evidence that scammers have taken advantage of Meta's inadequate ad moderation, leading to a significant rise in financial scam ads on its platforms.
On organic (non-paid) content, the EU says Meta appears to limit the visibility of political posts for users by default, without adequately explaining how it identifies political content or how that moderation is carried out. The Commission also says it has found indications that Meta engages in shadowbanning, restricting the visibility and reach of certain accounts that post heavily about politics.
If confirmed, such practices would breach the DSA, which places a legal duty on platforms to clearly communicate the policies they apply to their users.
The EU is also deeply concerned about Meta's recent decision to deprecate CrowdTangle, a tool that researchers have relied on for real-time election monitoring.
The Commission has not opened an investigation on this point yet, but it has sent Meta an urgent formal request for information (RFI) about its decision to deprecate the research tool, giving the company five days to respond. Officials said during the briefing that they may take further steps, including opening a formal investigation, depending on Meta's response.
The tight deadline underlines the urgency. Since taking over oversight of larger platforms' compliance with a dedicated set of transparency and risk-mitigation rules under the DSA, the Commission has identified election integrity as a priority area for enforcement.
At today’s briefing, officials from the Commission raised concerns about the timing of Meta’s decision to deprecate CrowdTangle, particularly in light of the upcoming European elections in June. “Our primary concern, and the reason we view this as an urgent matter, is that Meta has chosen to discontinue this tool shortly before the European election. This tool has been instrumental in enabling journalists, civil society actors, and researchers to monitor election-related risks, such as those observed during the 2020 US elections.”
The Commission is concerned that the tool Meta says will replace CrowdTangle does not offer comparable or better capabilities, and it is particularly worried about losing the ability for external parties to monitor election risks in real time. Officials also flagged the slow onboarding process for Meta's new tool.
During the briefing, a senior Commission official said Meta must explain how it plans to address the absence of a real-time election monitoring tool. "We are also asking them for additional documents regarding their decision to deprecate CrowdTangle and their evaluation of the new tool's capabilities."
Meta was contacted for comment on the Commission's actions. A company spokesperson said Meta has a well-established process for identifying and mitigating risks on its platforms, adding that the company looks forward to continuing its cooperation with the European Commission and providing it with further details of this work.
These are the first formal DSA investigations Meta has faced, although it has previously fielded requests for information (RFIs): the EU sent it several last year on topics including the Israel-Hamas war, election security, and child safety.
Given the range of RFIs concerning Meta's platforms, the company may face further investigations under the DSA as Commission enforcers work through those submissions.