Google And Meta Exposed For Secret Ad Campaign Targeting Teens

Google and Meta allegedly ran a secret ad campaign targeting teenagers, violating advertising policies. The campaign was shut down following media exposure.

Google and Meta are facing scrutiny after reports of a secret ad campaign targeting teenagers. According to the Financial Times, the campaign aimed to reach 13- to 17-year-olds on YouTube with Instagram ads, violating Google’s advertising policies, which prohibit personalized ads targeting users under 18.

Investigation and Response

Google reportedly investigated and shut down the project after being approached by the media outlet. In a statement to Quartz, Google described the campaign as “small in nature” and claimed to have “thoroughly reviewed the allegations” of policy circumvention. Google also said it plans to refresh its training to ensure sales representatives better understand the rules.

The ads were targeted at a group labeled “unknown” in Google’s advertising system, meaning users whose age, gender, and other demographics have not been identified. However, Google could reportedly infer that these “unknown” users were teenagers by analyzing data from app downloads and online activity.

Orchestration of the Campaign

The secret campaign was reportedly carried out with the assistance of Spark Foundry, a U.S.-based advertising agency. The program ran in Canada and was tested in the U.S. in May, with plans for a global rollout to promote Meta services such as Facebook.

After canceling the program, Google stated to the Financial Times, “We prohibit ads being personalized to people under 18, period.”

Legislative Measures

This revelation comes as the U.S. Senate recently passed legislation aimed at holding tech giants accountable for harming minors. The Children and Teens’ Online Privacy Protection Act (COPPA 2.0) prohibits targeted advertising to minors and bans collecting their data without consent. Another bill, the Kids Online Safety Act, requires tech companies to design platforms that prevent harms including cyberbullying, sexual exploitation, and drug use.

These developments highlight the growing concerns over tech companies’ influence on minors and their adherence to privacy and safety regulations.
