Google says it will provide more information about ads targeted at EU users and give data access to third-party researchers studying systemic content risks in the region. The moves are among several it's announcing today to comply with the bloc's Digital Services Act (DSA).
Friday (August 25) is the DSA compliance deadline for very large platforms, those with more than 45 million regional users, 19 of which the EU designated in April.
Tech giants like TikTok, Meta, and Snap have announced their responses to the bloc’s law ahead of the deadline.
Now Google has added its two (euro) cents, and like other platform giants, it is portraying the move as an expansion of existing efforts rather than a change of course. Still, the threat of major fines for violating the pan-EU regime, up to 6% of global annual turnover, is forcing all platforms to be more open.
In theory, tech giants could leave the EU if they don't like the new rules, and Amazon and Zalando are already suing over their VLOP designations. In practice, though, few C-suites would approve of abandoning a market of 450 million consumers. Twitter/X's erratic owner Elon Musk, however, will bear watching.
X (formerly Twitter), itself a designated VLOP, has appeared to move away from DSA compliance since Musk took over. The Commission has warned for months that the platform has a lot of work to do to avoid violating the regulation.
The other tech giants on the VLOP/VLOSE list can at least be assured they haven't painted a Musk-sized EU compliance target on themselves. They should still expect European Commission regulators to scrutinize their claimed compliance in detail, though perhaps with a better chance of avoiding the first enforcement action.
In a blog post titled "Complying with the Digital Services Act," Google writes that it will expand the Ads Transparency Center, "a global searchable repository of advertisers across all our platforms," to meet DSA provisions and provide more information about targeting for ads served in the EU. "These steps build on our many years of work to expand online ad transparency," it adds.
The adtech giant adds, “Building on our prior efforts to help advance public understanding of our services, we will increase data access for researchers looking to understand how Google Search, YouTube, Google Maps, Google Play and Shopping work in practice, and conducting research related to understanding systemic content risks in the EU.”
Google says its DSA compliance strategy includes increasing transparency around its content moderation decisions, giving users multiple ways to contact it, and updating its reporting and appeals processes to include “specified types of information and context about our decisions”.
It also launched a new Transparency Center to provide product-by-product policy information, reporting and appeals tools, Transparency Reports, and policy development information.
Google is expanding its Transparency Reports to include content moderation across Google Search, Google Play, Google Maps, and Shopping as part of another DSA measure.
As required by the DSA, the tech giant will assess risks related to the dissemination of illegal content, fundamental rights, public health, and civic discourse, and report the results to EU regulators and independent auditors.
“We are committed to assessing risks related to our largest online platforms and our search engine in line with DSA requirements,” it writes, adding that it will publish a public summary of the assessments “at a later date” and report to the EU and independent auditors. It will be interesting to see how quickly those assessments become public and how detailed Google’s summaries are.
The DSA will apply to many more digital platforms and services from a general compliance deadline early next year, but the regulation places extra obligations, and a tighter compliance timeline, on VLOPs and VLOSEs.
To increase transparency and accountability, these larger platforms must give users more choice over how recommender algorithms shape the content they see, proactively address AI-driven risks on their services, and provide data to independent researchers studying the societal impacts of algorithmic content-shaping systems.
The EU opened a new AI research hub in Seville, Spain, last year to support its oversight of Big Tech. The bloc also hopes the regulation will boost platform research and algorithmic auditing across Europe, helping to make the region a leader in the field.
The DSA also regulates VLOPs'/VLOSEs' recommender systems that profile users (aka content "personalization"). Platforms must allow EU users to opt out of such tracking and receive content feeds or search results that are not based on the platform's analysis of their activity to predict what might interest them most.
Google's blog post doesn't mention how it's complying with this DSA provision, so we contacted the company, though the likely answer is that Google's settings already allow users to disable "personalized" search results. Update: Google also pointed out that YouTube recently stopped showing watch-page recommendations to users who have watch history turned off.
The pan-EU regulation also prohibits tracking minors to microtarget them with ads, as well as using sensitive personal data for ad targeting.
Google's post does not mention the latter requirement, so we asked how it will comply. Update: Google says it has a longstanding policy prohibiting advertisers from targeting ads at users based on sensitive interests such as race, religion, and sexuality.
Its blog post does highlight a two-year-old decision to block personalized advertising to minors ("based on the age, gender, or interests"). "The DSA will require other providers to take similar approaches," it notes.
Google does not name any rivals, but Meta and Snap appear to be continuing to target ads at minors using some of the parameters Google says it does not use, such as age, location, and language settings.
It will be interesting to see whether EU regulators pick up on such discrepancies in how platforms define personalization/profiling in ad targeting. Snap, for instance, describes language settings, age, and location as "basic essential information," whereas Google evidently disagrees, at least where age is concerned.