What you need to know about the CCPA draft rules on AI and automated decision-making technology

In November 2023, the California Privacy Protection Agency (CPPA) released a set of draft regulations on the use of artificial intelligence (AI) and automated decision-making technology (ADMT).

The proposed rules are still in development, but organizations may want to pay close attention to their evolution. Because the state is home to many of the world’s biggest technology companies, any AI regulations that California adopts could have an impact far beyond its borders.

Furthermore, a California appeals court recently ruled that the CPPA can begin enforcing rules as soon as they are finalized. By following how the ADMT rules progress, organizations can better position themselves to comply the moment the regulations take effect.

The CPPA is still accepting public comments and reviewing the rules, so the regulations are likely to change before they are officially adopted. This post is based on the most current draft as of 9 April 2024.

Why is California developing new rules for ADMT and AI?

The California Consumer Privacy Act (CCPA), California’s landmark data privacy law, did not originally address the use of ADMT directly. That changed with the passage of the California Privacy Rights Act (CPRA) in 2020, which amended the CCPA in several important ways.

The CPRA created the CPPA, a regulatory agency that implements and enforces CCPA rules. The CPRA also gave the CPPA the authority to issue regulations concerning California consumers’ rights to access information about, and opt out of, automated decisions. The CPPA is working on the ADMT rules under this authority.

Who must comply with California’s ADMT and AI rules?

As with the rest of the CCPA, the draft rules would apply to for-profit organizations that do business in California and meet at least one of the following criteria:

  • The business has a total annual revenue of more than USD 25 million.
  • The business buys, sells, or shares the personal data of 100,000 or more California residents.
  • The business makes at least half of its total annual revenue from selling the data of California residents.

Furthermore, the proposed regulations would apply only to certain uses of AI and ADMT: making significant decisions, extensively profiling consumers, and training ADMT tools.
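
To make these thresholds and covered uses concrete, here is a minimal sketch in Python. The field names, flags, and helper functions are illustrative assumptions made for this post, not any official CPPA test; only the thresholds themselves come from the criteria listed above.

```python
from dataclasses import dataclass

# Illustrative sketch: names and helpers are assumptions, not an official
# CPPA test. The thresholds mirror the draft criteria described above.

COVERED_USES = {"significant_decisions", "extensive_profiling", "admt_training"}

@dataclass
class Business:
    is_for_profit: bool
    does_business_in_california: bool
    annual_revenue_usd: float
    ca_residents_data_bought_sold_shared: int
    revenue_share_from_selling_ca_data: float  # 0.0 to 1.0

def ccpa_applies(biz: Business) -> bool:
    """Any one of the CCPA's three alternative thresholds suffices."""
    if not (biz.is_for_profit and biz.does_business_in_california):
        return False
    return (
        biz.annual_revenue_usd > 25_000_000
        or biz.ca_residents_data_bought_sold_shared >= 100_000
        or biz.revenue_share_from_selling_ca_data >= 0.5
    )

def admt_rules_apply(biz: Business, use: str) -> bool:
    """The draft ADMT rules reach only covered businesses and covered uses."""
    return ccpa_applies(biz) and use in COVERED_USES
```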

How does the CPPA define ADMT?

The current draft (PDF, 827 KB) defines automated decision-making technology as any software or program that processes personal data and uses computation to execute a decision, replace human decision-making, or substantially facilitate human decision-making. The draft specifically notes that this definition includes software and programs “derived from machine learning, statistics, other data-processing techniques or artificial intelligence.”

The draft rules explicitly name some tools that do not count as ADMT, including spam filters, spreadsheets, and firewalls. However, if an organization attempts to use these exempt tools to make automated decisions in a way that circumvents the regulations, the rules will apply to that use.
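
As a rough illustration of how the draft’s definition works as a decision rule, consider the sketch below. The attribute names and the exempt-tool set are assumptions made for this example, not terminology from the draft.

```python
# Illustrative sketch: parameter names are assumptions, not terms from the
# draft regulations.

EXEMPT_TOOLS = {"spam_filter", "spreadsheet", "firewall"}

def is_admt(tool_type: str,
            processes_personal_data: bool,
            executes_or_replaces_decision: bool,
            substantially_facilitates_decision: bool,
            used_to_circumvent_rules: bool = False) -> bool:
    """Approximate the draft definition of ADMT.

    A tool qualifies if it processes personal data and uses computation to
    execute a decision, replace human decision-making, or substantially
    facilitate it. Named exempt tools fall outside the definition unless
    they are used to make automated decisions in a way that circumvents
    the rules.
    """
    if tool_type in EXEMPT_TOOLS and not used_to_circumvent_rules:
        return False
    return processes_personal_data and (
        executes_or_replaces_decision or substantially_facilitates_decision
    )
```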

Covered uses of ADMT

Making significant decisions

The draft rules would apply to any use of ADMT to make decisions that have significant effects on consumers. Generally speaking, a significant decision is one that affects a person’s rights or access to critical goods, services, and opportunities.

For example, the draft rules would cover automated decisions that impact a person’s ability to get a job, go to school, receive healthcare, or obtain a loan.

Extensive profiling

Profiling is the act of automatically processing someone’s personal information to evaluate, analyze, or predict their traits and characteristics, such as job performance, product interests, or behavior.

“Extensive profiling” refers to particular kinds of profiling:

  • Systematically profiling consumers in the context of work or school, such as by using a keystroke logger to track employee performance.
  • Systematically profiling consumers in publicly accessible places, such as by using facial recognition to analyze shoppers’ emotions in a store.
  • Profiling consumers for behavioral advertising, which is the act of using someone’s personal data to display targeted ads to them.

Training ADMT

The draft rules would apply to businesses’ use of consumer personal data to train certain ADMT tools. Specifically, the rules would cover training an ADMT that can be used to make significant decisions, identify people, generate deepfakes, or perform physical or biological identification and profiling.

Who would be protected under the AI and ADMT rules?

As a California law, the CCPA’s consumer protections extend only to consumers who reside in California. The same holds true for the protections that the draft ADMT rules would grant.

That said, these rules define “consumer” more broadly than many other data privacy regulations. In addition to people who interact with a business, the rules cover employees, students, independent contractors, and school and job applicants.

What are the draft CCPA rules on AI and automated decision-making technology?

The draft CCPA AI regulations have three key requirements. Organizations that use covered ADMT must issue pre-use notices to consumers, offer ways to opt out of ADMT, and explain how the business’s use of ADMT affects the consumer.

While the CPPA has revised the regulations once and is likely to do so again before the rules are formally adopted, these core requirements appear in each draft so far. Their persistence suggests they will remain in the final rules, even if the details of their implementation change.



Learn how IBM Security® Guardium® Insights helps organizations meet their cybersecurity and data compliance requirements.

Pre-use notices

Before using ADMT for one of the covered purposes, organizations must clearly and conspicuously serve consumers a pre-use notice. The notice must detail in plain language how the company uses ADMT and explain consumers’ rights to access more information about the ADMT and to opt out of the process.

The company cannot fall back on generic language to describe how it uses ADMT, such as “We use automated tools to improve our services.” Instead, the organization must describe the specific use.

The notice must direct consumers to additional information about how the ADMT works, including the tool’s logic and how the business uses its outputs. This information does not have to be in the body of the notice; the organization can give consumers a hyperlink or another way to access it.

If the business allows consumers to appeal automated decisions, the pre-use notice must explain the appeals process.
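
To picture what those requirements add up to, the sketch below models a pre-use notice as a simple data structure with a rough completeness check. The schema and the helper are illustrative assumptions; the draft prescribes the content of a notice, not its format.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative sketch: all field names are assumptions; the draft rules
# prescribe what a pre-use notice must say, not how it is structured.

GENERIC_PHRASES = ("improve our services", "automated tools")

@dataclass
class PreUseNotice:
    specific_use_description: str          # plain-language, specific use of ADMT
    access_rights_explanation: str         # right to more information about the ADMT
    opt_out_explanation: str               # right to opt out of the process
    more_info_link: str                    # e.g., hyperlink to logic/output details
    appeals_process: Optional[str] = None  # required only if appeals are offered

def notice_looks_complete(n: PreUseNotice, offers_appeals: bool) -> bool:
    """Rough checks against the draft requirements described above."""
    too_generic = any(p in n.specific_use_description.lower() for p in GENERIC_PHRASES)
    has_required_parts = all([n.specific_use_description, n.access_rights_explanation,
                              n.opt_out_explanation, n.more_info_link])
    appeals_ok = (not offers_appeals) or bool(n.appeals_process)
    return has_required_parts and not too_generic and appeals_ok
```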

Opt-out rights

Consumers have a right to opt out of most covered uses of ADMT. Businesses must facilitate this right by giving consumers at least two ways to submit opt-out requests.

At least one of the opt-out methods must use the same channel through which the business primarily interacts with consumers. For example, a digital retailer could offer a web form for users to complete.

Opt-out methods must be simple and cannot have extraneous steps, such as requiring users to create accounts.

Upon receiving an opt-out request, a business must stop processing the consumer’s personal information with that automated decision-making technology within 15 days. The business can no longer use any of the consumer’s data that it previously processed. It must also notify any service providers or third parties with whom it shared the user’s data.
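
A minimal sketch of an opt-out pipeline, assuming simple in-memory stores invented for this example, might enforce the 15-day deadline and the downstream duties like this. The 15-day figure comes from the draft; everything else is scaffolding.

```python
from datetime import date, timedelta

# Minimal in-memory sketch: store names and sample data are assumptions.
OPT_OUT_DEADLINE_DAYS = 15  # deadline taken from the draft rules

active_admt_consumers: set[str] = {"c-001", "c-002"}             # consumers the ADMT processes
prior_outputs: dict[str, list[str]] = {"c-001": ["score=0.82"]}  # previously processed results
shared_with: dict[str, list[str]] = {"c-001": ["ad-partner-a", "vendor-b"]}

def handle_opt_out(consumer_id: str, received: date) -> date:
    """Apply the draft opt-out duties and return the compliance deadline."""
    active_admt_consumers.discard(consumer_id)   # 1. stop ADMT processing
    prior_outputs.pop(consumer_id, None)         # 2. stop using previously processed data
    for recipient in shared_with.get(consumer_id, []):
        print(f"notify {recipient}: consumer {consumer_id} opted out")  # 3. notify recipients
    return received + timedelta(days=OPT_OUT_DEADLINE_DAYS)

print(handle_opt_out("c-001", date(2024, 4, 9)))  # processing must stop by this date
```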

Exemptions

Organizations do not need to let consumers opt out of ADMT used for safety, security, and fraud prevention. The draft rules specifically mention using ADMT to detect and respond to data security incidents, prevent and prosecute fraudulent and illegal acts, and ensure the physical safety of a natural person.

Under the human appeal exception, an organization need not enable opt-outs if it allows people to appeal automated decisions to a qualified human reviewer with the authority to overturn those decisions.

Organizations can also forgo opt-outs for certain narrow uses of ADMT in work and school contexts. These uses include:

  • Evaluating a person’s performance to make admission, acceptance, and hiring decisions.
  • Allocating tasks and determining compensation at work.
  • Profiling used solely to assess a person’s performance as a student or employee.

However, these work and school uses are exempt from opt-outs only if they meet the following criteria:

  • The ADMT in question must be necessary to achieve the business’s specific purpose and used only for that purpose.
  • The business must formally evaluate the ADMT to ensure that it is accurate and does not discriminate.
  • The business must put safeguards in place to ensure that the ADMT remains accurate and unbiased.

None of these exemptions apply to behavioral advertising or training ADMT. Consumers can always opt out of these uses.
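
Pulling the exemptions together, the decision helper below sketches when an opt-out must still be offered. The use categories and flags are simplified assumptions for illustration only.

```python
# Illustrative sketch: use categories and flag names are assumptions, not
# terms of art from the draft rules.

ALWAYS_OPT_OUT = {"behavioral_advertising", "admt_training"}
SAFETY_USES = {"security_incident_response", "fraud_prevention", "physical_safety"}
WORK_SCHOOL_USES = {"hiring_evaluation", "task_allocation", "performance_profiling"}

def opt_out_required(use: str,
                     offers_human_appeal: bool = False,
                     meets_work_school_criteria: bool = False) -> bool:
    """Decide whether a business must offer an ADMT opt-out under the draft rules.

    Behavioral advertising and ADMT training always require an opt-out.
    Safety/security/fraud uses are exempt, as are uses offering a qualified
    human appeal and narrow work/school uses that satisfy the necessity,
    evaluation, and safeguard criteria listed above.
    """
    if use in ALWAYS_OPT_OUT:
        return True
    if use in SAFETY_USES:
        return False
    if offers_human_appeal:
        return False
    if use in WORK_SCHOOL_USES and meets_work_school_criteria:
        return False
    return True
```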



Learn how IBM data security solutions protect data across hybrid clouds and help simplify compliance requirements.

The right to access information about ADMT use

Consumers have a right to access information about how a business uses ADMT on them. Organizations must give consumers an easy way to request this information.

When responding to access requests, organizations must provide details such as the reason for using ADMT, the output of the ADMT regarding the consumer, and a description of how the business used the output to make a decision.

Access request responses should also include information on how the consumer can exercise their CCPA rights, such as filing complaints or requesting the deletion of their data.
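
As one way to visualize an access-request response, the hypothetical payload below bundles the required details. The field names, example values, and URLs are invented for illustration; the draft specifies the substance of a response, not its structure.

```python
import json

# Hypothetical payload for an ADMT access request. All field names and
# values are illustrative assumptions.
def build_access_response(consumer_id: str) -> str:
    response = {
        "consumer_id": consumer_id,
        "purpose_of_admt_use": "screening rental applications for creditworthiness",
        "admt_output_for_consumer": {"risk_score": 0.27, "category": "low risk"},
        "how_output_was_used": "score combined with income verification to approve the application",
        "your_ccpa_rights": {
            "file_a_complaint": "https://example.com/privacy/complaints",
            "request_deletion": "https://example.com/privacy/delete",
        },
    }
    return json.dumps(response, indent=2)

print(build_access_response("c-001"))
```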

Notification of adverse significant decisions

If a business uses ADMT to make a significant decision that negatively affects a consumer (for example, by leading to job termination), the business must send a special notice to the consumer about their access rights regarding this decision.

The notice must include:

  • An explanation that the business used ADMT to make an adverse decision.
  • Notification that the business cannot retaliate against the consumer for exercising their CCPA rights.
  • A description of how the consumer can access additional information about how ADMT was used.
  • Information on how to appeal the decision, if applicable.
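
A minimal sketch of such a notice, with wording and structure that are our own assumptions rather than a template from the draft, might look like this:

```python
from typing import Optional

# Illustrative sketch: the notice wording below is invented, not prescribed.
def adverse_decision_notice(decision: str, info_url: str,
                            appeal_url: Optional[str] = None) -> str:
    lines = [
        f"An automated decision-making technology was used to reach an adverse decision: {decision}.",
        "We cannot retaliate against you for exercising your CCPA rights.",
        f"You can access more information about how the ADMT was used at {info_url}.",
    ]
    if appeal_url:  # include appeal instructions only if an appeal is offered
        lines.append(f"To appeal this decision, visit {appeal_url}.")
    return "\n".join(lines)

print(adverse_decision_notice(
    "employment termination",
    "https://example.com/admt-details",
    "https://example.com/appeal",
))
```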

Risk assessments for AI and ADMT

The CPPA is developing draft regulations on risk assessments alongside the proposed rules on AI and ADMT. While these are technically two separate sets of rules, the risk assessment regulations would affect how organizations use AI and ADMT.

The risk assessment rules would require organizations to conduct assessments before they use ADMT to make significant decisions or carry out extensive profiling. Organizations would also need to conduct risk assessments before they use personal information to train certain ADMT or AI models.

Risk assessments must identify the risks that the ADMT poses to consumers, the potential benefits to the organization or other stakeholders, and safeguards to mitigate or remove the risk. Organizations must refrain from using AI and ADMT where the risk outweighs the benefits.
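
The bookkeeping side of such an assessment could be sketched as below. The 1-to-5 scales and the crude risk-versus-benefit gate are assumptions invented for this example; the draft requires the comparison but does not prescribe a scoring method.

```python
from dataclasses import dataclass, field

# Illustrative sketch: scoring scales and the mitigation heuristic are
# assumptions, not a method from the draft regulations.

@dataclass
class RiskAssessment:
    admt_use: str
    risks_to_consumers: dict[str, int] = field(default_factory=dict)  # risk -> severity 1-5
    benefits: dict[str, int] = field(default_factory=dict)            # benefit -> value 1-5
    safeguards: list[str] = field(default_factory=list)

    def may_proceed(self) -> bool:
        """Crude gate: residual risk must not outweigh total benefit."""
        mitigation = min(len(self.safeguards), 3)  # assume each safeguard offsets one point, capped
        residual_risk = max(sum(self.risks_to_consumers.values()) - mitigation, 0)
        return residual_risk <= sum(self.benefits.values())

assessment = RiskAssessment(
    admt_use="resume screening",
    risks_to_consumers={"discriminatory screening": 4, "opaque rejections": 2},
    benefits={"faster hiring": 3, "consistent criteria": 2},
    safeguards=["annual bias audit", "human review of rejections"],
)
print(assessment.may_proceed())  # True under this toy scoring
```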

How do the draft CCPA regulations relate to other AI laws?

California’s draft rules on ADMT are far from the first attempt at regulating the use of AI and automated decisions.

The European Union’s AI Act imposes strict requirements on the development and use of AI in Europe.

In the US, the Colorado Privacy Act and the Virginia Consumer Data Protection Act both give consumers the right to opt out of having their personal information processed to make significant decisions.

At the national level, President Biden signed an executive order in October 2023 directing federal agencies and departments to create standards for developing, using, and overseeing AI in their respective jurisdictions.

But California’s proposed ADMT regulations attract more attention than other state laws because they could affect how companies behave beyond the state’s borders.

Much of the global technology industry is headquartered in California, so many of the organizations that make the most advanced automated decision-making tools will have to comply with these rules. The consumer protections extend only to California residents, but organizations might give consumers outside of California the same options for simplicity’s sake.

The original CCPA is often considered the US version of the General Data Protection Regulation (GDPR) because it raised the bar for data privacy practices nationwide. These new AI and ADMT rules might produce similar results.

When do the CCPA AI and ADMT regulations take effect?

The rules are not finalized yet, so it’s impossible to say with certainty. That said, many observers estimate that the rules won’t take effect until mid-2025 at the earliest.

The CPPA is expected to hold another board meeting in July 2024 to discuss the rules further. Many believe that the CPPA Board is likely to begin the formal rulemaking process at this meeting. If so, the agency would have a year to finalize the rules, hence the estimated effective date of mid-2025.

How will the rules be enforced?

As with other parts of the CCPA, the CPPA will be empowered to investigate violations and fine organizations. The California attorney general can also levy civil penalties for noncompliance.

Organizations can be fined USD 2,500 for unintentional violations and USD 7,500 for intentional ones. These amounts are per violation, and each affected consumer counts as one violation. Penalties can quickly escalate when violations involve multiple consumers, as they often do.
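
To see how quickly the per-consumer arithmetic compounds, consider a hypothetical intentional violation affecting 100,000 consumers:

```python
# Worked example using the statutory per-violation amounts above.
# The consumer count is a hypothetical chosen for illustration.
UNINTENTIONAL_FINE = 2_500  # USD per violation
INTENTIONAL_FINE = 7_500    # USD per violation

affected_consumers = 100_000  # each affected consumer is one violation
exposure = affected_consumers * INTENTIONAL_FINE
print(f"USD {exposure:,}")  # USD 750,000,000
```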

What is the status of the CCPA AI and ADMT regulations?

The draft rules are still in flux. The CPPA continues to solicit public comments and hold board discussions, and the rules are likely to change further before they are adopted.

The CPPA has already made significant revisions to the rules based on prior feedback. For example, following the December 2023 board meeting, the agency added new exemptions from the right to opt out and placed restrictions on physical and biological profiling.

The agency also adjusted the definition of ADMT to limit the number of tools the rules would apply to. While the original draft included any technology that facilitated human decision-making, the most current draft applies only to ADMT that substantially facilitates human decision-making.

Many industry groups feel the updated definition better reflects the practical realities of ADMT use, while privacy advocates worry it creates exploitable loopholes.

Even the CPPA Board itself is split on how the final rules should look. At a March 2024 meeting, two board members expressed concerns that the current draft exceeds the board’s authority.

Given how the rules have evolved so far, the core requirements for pre-use notices, opt-out rights, and access rights have a strong chance of remaining intact. However, organizations may have lingering questions, such as:

  • What kinds of AI and automated decision-making technology will the final rules cover?
  • How will consumer protections be implemented on a practical level?
  • What kinds of exemptions, if any, will organizations be granted?

Whatever the outcome, these rules will have significant implications for how AI and automation are regulated nationwide, and for how consumers are protected as this booming technology spreads.

Explore data compliance solutions


Disclaimer: The client is responsible for ensuring compliance with all applicable laws and regulations. IBM does not provide legal advice nor represent or warrant that its services or products will ensure that the client is compliant with any law or regulation.
