# Meta/Facebook $5 Billion FTC Fine for Cambridge Analytica

Jul 2019 · $5B fine

By Karim El Labban · ZERO|TOLERANCE

Between 2013 and 2018, Facebook’s platform design allowed third-party applications to harvest the personal data of not only their direct users but also those users’ entire friend networks without meaningful consent. Cambridge Analytica, a political consulting firm, exploited this architecture through a personality quiz app to collect detailed psychological profiles of approximately 87 million Facebook users, which were then used for targeted political advertising. In July 2019, the FTC imposed a $5 billion penalty, the largest in FTC history, for violations of a 2012 consent decree. The SEC separately fined Facebook $100 million for misleading investors about data misuse risks. CEO Mark Zuckerberg was subjected to unprecedented personal compliance obligations.

## Key Facts

- **What:** Cambridge Analytica harvested data via Facebook's Graph API for political profiling.
- **Who:** 87 million Facebook users worldwide.
- **Data Exposed:** Psychological profiles, friend lists, demographics, and page likes.
- **Outcome:** $5B FTC fine, $100M SEC penalty, and CEO personal compliance mandate.

## What Was Exposed

- Detailed profile data for approximately 87 million Facebook users, harvested without their direct consent
- Psychological and personality trait profiles derived from Facebook activity, likes, and engagement patterns
- Friend lists, group memberships, and social graph data revealing relationship networks
- Demographic information including age, gender, location, and political affiliations
- Page likes and content engagement data used to construct psychographic models
- For approximately 270,000 users who directly used the quiz app: full profile data including posts, messages, and personal details shared with the application

The Cambridge Analytica data harvest was not a traditional breach involving unauthorized system access. Instead, it exploited the intentional architecture of Facebook’s platform, which by design permitted applications to access the data of their users’ friends through the Graph API v1.0.

When approximately 270,000 users installed the “thisisyourdigitallife” quiz app created by researcher Aleksandr Kogan, the app legally, under Facebook’s terms of service at the time, accessed the profile data of those users’ entire friend networks, reaching 87 million people. The harvested data was then transferred to Cambridge Analytica in violation of Facebook’s platform policies, but Facebook’s enforcement of those policies was virtually nonexistent.
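The amplification mechanism described above can be sketched in a few lines. This is a toy model with invented names and hypothetical functions, not Facebook's actual API surface:

```python
# Toy model of the Graph API v1.0 friend-permission design: consent from
# one installing user exposes that user's entire friend network. Names
# and functions here are illustrative, not Facebook's actual schema.

def exposed_under_v1(friends_of, installers):
    """User IDs an app could read under v1.0: the installers plus all
    of their friends, who never installed the app or consented."""
    exposed = set(installers)
    for uid in installers:
        exposed |= friends_of[uid]   # friends' data flows without consent
    return exposed

def exposed_under_v2(friends_of, installers):
    """Under v2.0 (rolled out from April 2014), only direct users."""
    return set(installers)

friends_of = {
    "alice": {"bob", "carol", "dan"},
    "erin":  {"bob", "frank"},
}
installers = ["alice", "erin"]

print(sorted(exposed_under_v1(friends_of, installers)))
# ['alice', 'bob', 'carol', 'dan', 'erin', 'frank'] -- 6 people, 2 consents
print(sorted(exposed_under_v2(friends_of, installers)))
# ['alice', 'erin']
```

In the real incident the amplification was far more extreme: roughly 270,000 installers exposed about 87 million profiles, a factor of over 300.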

## From Data Harvest to Voter Manipulation

Cambridge Analytica used the harvested Facebook data to build psychographic models of American voters. By analyzing users’ Facebook likes, content interactions, and demographic information, the firm developed algorithms to predict personality traits using the “Big Five” personality model: openness, conscientiousness, extroversion, agreeableness, and neuroticism.
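As a rough illustration of how page likes can feed a trait model, here is a toy linear scorer. The pages, weights, and scoring function are invented for illustration only; they are not Cambridge Analytica's actual model:

```python
# Toy like-based trait scoring. All pages and weights are made up.

TRAITS = ["openness", "conscientiousness", "extroversion",
          "agreeableness", "neuroticism"]

# Hypothetical per-page weight vectors, as if learned from training data.
PAGE_WEIGHTS = {
    "philosophy_page": {"openness": 0.8, "extroversion": -0.2},
    "party_events":    {"extroversion": 0.9, "conscientiousness": -0.3},
    "planner_apps":    {"conscientiousness": 0.7},
}

def score_profile(liked_pages):
    """Sum per-page weights into a crude Big Five score vector."""
    scores = {t: 0.0 for t in TRAITS}
    for page in liked_pages:
        for trait, weight in PAGE_WEIGHTS.get(page, {}).items():
            scores[trait] += weight
    return scores

profile = score_profile(["philosophy_page", "party_events"])
# The highest-scoring trait would drive which ad variant the user sees.
target_trait = max(profile, key=profile.get)
print(target_trait)  # openness (0.8 vs. 0.7 for extroversion)
```

A production model would of course be trained on millions of like vectors, but the pipeline shape, likes in, trait scores out, ad variant chosen per trait, is the same.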

These psychographic profiles were then used to craft targeted political advertising designed to influence individual voters based on their predicted psychological vulnerabilities. The firm worked with multiple political campaigns, most notably the 2016 U.S. presidential campaign of Donald Trump and the Leave.EU campaign during the UK Brexit referendum.

The use of psychological profiling for political manipulation raised fundamental questions about the boundaries of data use in democratic processes, questions that continue to shape privacy regulation globally.

## Facebook’s Knowledge and Inaction

Facebook learned about the data transfer to Cambridge Analytica in December 2015, when The Guardian reported that Cambridge Analytica had acquired Facebook user data through Kogan’s app. Facebook requested that Cambridge Analytica delete the data and accepted the firm’s certification that it had done so.

However, Facebook did not verify the deletion, did not notify the 87 million affected users, did not report the incident to the FTC, and did not publicly disclose the data misuse. The full scope of the Cambridge Analytica data harvesting was not publicly revealed until March 2018, when The New York Times and The Guardian/Observer published detailed investigations based on information from whistleblower Christopher Wylie.

## Regulatory Analysis

**FTC Act Section 5 - Consent Decree Violation:** The FTC’s enforcement action was grounded in Facebook’s violation of a 2012 consent decree. In 2012, the FTC had settled charges that Facebook had deceived users about their ability to control the privacy of their personal information.

The 2012 consent decree required Facebook to obtain affirmative express consent from users before sharing their data beyond their privacy settings and prohibited Facebook from making misrepresentations about the extent to which users could control the privacy of their data.

The FTC found that Facebook violated the 2012 consent decree in multiple ways:

- Facebook’s platform design allowed apps to access friends’ data without the friends’ knowledge or consent, directly violating the consent requirement
- Facebook’s privacy settings created a false impression of user control while the platform architecture undermined that control
- Facebook failed to verify that app developers complied with platform data use policies
- When it learned of Cambridge Analytica’s policy violations, it failed to ensure actual deletion of the harvested data

**The $5 Billion Penalty:** The $5 billion fine was, by the FTC’s own description, almost 20 times greater than the largest privacy or data security penalty previously imposed worldwide. The FTC justified this unprecedented penalty based on the severity of the consent decree violations, the scope of harm affecting 87 million users, Facebook’s pattern of privacy violations, and the company’s financial capacity. The penalty represented approximately 9% of Facebook’s 2018 revenue.

The FTC vote was 3-2 along party lines, with the two dissenting commissioners arguing that the penalty was insufficient and that the settlement should have imposed stricter structural remedies, including meaningful limits on Facebook’s data collection practices.

**Personal CEO Liability:** For the first time in a major FTC action against a technology company, the settlement imposed personal compliance obligations on CEO Mark Zuckerberg. Under the order, Zuckerberg is required to personally certify Facebook’s compliance with the privacy program on a quarterly basis. False certifications would expose Zuckerberg to personal civil and criminal penalties.

This provision was designed to prevent the compartmentalization of privacy compliance away from senior leadership, a pattern the FTC identified in its investigation.

**SEC Exchange Act Enforcement:** Separately from the FTC action, the Securities and Exchange Commission charged Facebook with violating the Securities Exchange Act by making misleading disclosures about the risk of data misuse. Facebook’s public filings treated the risk of user data being accessed by third parties as merely hypothetical even after the company knew of the Cambridge Analytica data harvest.

The SEC imposed a $100 million penalty, finding that Facebook’s risk factor disclosures were materially misleading because they presented a known data misuse incident as a theoretical risk.

**Structural Remedies:** Beyond financial penalties, the FTC order imposed sweeping structural requirements on Facebook’s operations:

- An independent privacy committee on its board of directors, separate from the audit committee, with members independent of Zuckerberg’s control
- Designated compliance officers who cannot be removed without the privacy committee’s approval
- Privacy reviews of new or modified products, features, and services before launch
- A comprehensive privacy program subject to biennial independent third-party assessments for 20 years

## What Should Have Been Done

**Privacy by Design in Platform Architecture:** The root cause of the Cambridge Analytica scandal was not a security failure but a design choice. Facebook’s Graph API v1.0 was intentionally designed to allow applications to access friends’ data, because broad data access made the platform more attractive to developers.

Privacy by design principles would have required Facebook to evaluate the privacy implications of this architectural decision and implement controls ensuring that data access beyond the direct user required explicit, informed consent from each affected individual.

**Consent Decree Compliance Infrastructure:** Facebook’s violation of the 2012 consent decree suggests that compliance with the decree was not embedded in the company’s product development processes. Consent decree requirements should be translated into specific engineering constraints enforced through technical controls, not merely communicated as policy guidelines.
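What "decree requirements as engineering constraints" could mean in practice is a hard gate in the data-release path rather than a written policy. The sketch below is a minimal, hypothetical design (the types, names, and scopes are invented, not Facebook's actual systems):

```python
# Sketch of a consent-decree requirement enforced as a technical control:
# no data release beyond a user's own privacy setting without an
# affirmative, recorded consent for that scope. Hypothetical design.

from dataclasses import dataclass, field

@dataclass
class User:
    user_id: str
    privacy_setting: str = "friends_only"       # the scope the user chose
    consents: set = field(default_factory=set)  # scopes explicitly granted

class ConsentDecreeViolation(Exception):
    """Raised when code attempts a release the decree forbids."""

def release_data(user: User, requested_scope: str) -> str:
    """Gate every data release at one choke point."""
    if requested_scope == user.privacy_setting:
        return f"data:{user.user_id}:{requested_scope}"
    if requested_scope in user.consents:
        return f"data:{user.user_id}:{requested_scope}"
    raise ConsentDecreeViolation(
        f"{user.user_id}: no affirmative consent for scope "
        f"'{requested_scope}'")

u = User("alice")
release_data(u, "friends_only")       # allowed: within the user's setting
u.consents.add("third_party_apps")
release_data(u, "third_party_apps")   # allowed: explicit recorded consent
try:
    release_data(u, "advertisers")    # blocked: no consent on record
except ConsentDecreeViolation as e:
    print(e)
```

The point of the single choke point is that a product team cannot ship a feature that bypasses the consent check, which is exactly the failure mode the FTC found in the v1.0 friend-data design.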

**Third-Party App Auditing:** Facebook’s failure to verify Cambridge Analytica’s certification that it had deleted the harvested data was symptomatic of a broader failure to audit third-party applications. Platform companies that allow third-party access to user data must implement ongoing technical auditing of how that data is used, stored, and retained by app developers.
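One way to make deletion verifiable rather than self-certified is for the platform to keep fingerprints of every dataset it exports to an app, then check an app's actual holdings against that ledger during an audit. The sketch below is a hypothetical design, not Facebook's actual tooling:

```python
# Sketch of an auditable deletion check: the platform records a hash of
# every record exported to an app; an audit intersects the app's current
# holdings with the export ledger instead of trusting a certification.

import hashlib

def fingerprint(record: bytes) -> str:
    return hashlib.sha256(record).hexdigest()

class ExportLedger:
    def __init__(self):
        self.exports = {}  # app_id -> set of exported record fingerprints

    def record_export(self, app_id: str, record: bytes) -> None:
        self.exports.setdefault(app_id, set()).add(fingerprint(record))

    def audit_deletion(self, app_id: str, records_still_held) -> set:
        """Fingerprints the app was required to delete but still holds,
        based on an on-site inspection of its storage."""
        held = {fingerprint(r) for r in records_still_held}
        return self.exports.get(app_id, set()) & held

ledger = ExportLedger()
ledger.record_export("quiz_app", b"profile:alice")
ledger.record_export("quiz_app", b"profile:bob")

# A self-certification would end here; an audit actually inspects.
violations = ledger.audit_deletion("quiz_app", [b"profile:bob"])
print(len(violations))  # 1: bob's profile was never deleted
```

Had anything like this existed in 2015, accepting Cambridge Analytica's deletion certification without inspection would have been a visible process gap rather than the default.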

**Proactive Regulatory Engagement:** Facebook’s decision to handle the Cambridge Analytica data misuse quietly, by requesting deletion and accepting a self-certification, rather than disclosing it to the FTC and affected users, transformed a manageable compliance incident into a historic enforcement action. Organizations operating under consent decrees must adopt a posture of proactive disclosure to their regulators.

The Facebook/Cambridge Analytica case established that platform companies bear direct responsibility for how third parties use data accessed through their APIs, that consent decree violations carry sharply escalating penalties, and that CEO personal liability is now a tool in the FTC’s enforcement arsenal. At $5 billion, it remains the largest privacy penalty in U.S. history and a permanent benchmark for the financial consequences of treating user privacy as an externality rather than a design constraint.
