
This week we bring a new type of AI lawsuit to light. While many of you may be aware of AI lawsuits centered on who owns the data and whether it was properly used to train large language models (LLMs), we are now grappling with whether the application of AI tools violates individuals' privacy rights, particularly in the state of California.
California has two laws that cover the use of consumer data: the California Consumer Privacy Act (CCPA) and the California Privacy Rights Act (CPRA).
California Consumer Privacy Act
The CCPA took effect on January 1, 2020, and applies to businesses that collect the personal data of California residents and meet certain criteria (e.g., annual gross revenue over $25 million; buying, selling, or sharing the personal information of 50,000 or more consumers; or deriving 50% or more of annual revenue from selling consumers' personal information).
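Note that the applicability test is disjunctive: a business only needs to meet one of those thresholds. The minimal sketch below illustrates that logic; the function and parameter names are illustrative only and are not drawn from the statute or any official compliance tool.

```python
# Rough sketch of the CCPA applicability thresholds described above.
# All names are illustrative; consult counsel for the actual legal test.

def ccpa_likely_applies(annual_gross_revenue_usd: float,
                        consumers_data_bought_sold_or_shared: int,
                        pct_revenue_from_selling_personal_info: float) -> bool:
    """Return True if ANY one of the CCPA threshold criteria is met."""
    return (
        annual_gross_revenue_usd > 25_000_000
        or consumers_data_bought_sold_or_shared >= 50_000
        or pct_revenue_from_selling_personal_info >= 50.0
    )

# Example: a $10M-revenue retailer whose vendor touches data on 75,000
# callers still clears the second threshold.
print(ccpa_likely_applies(10_000_000, 75_000, 0.0))  # True
```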
It grants consumers several key rights:
Right to Know: Consumers can request details about the personal information a business collects about them and how it is used and shared.
Right to Delete: Consumers can request deletion of their personal information.
Right to Opt-Out: Consumers can opt out of the sale of their personal information.
Right to Non-Discrimination: Consumers have the right to not face discrimination for exercising their CCPA rights.
California Privacy Rights Act
The CPRA took effect on January 1, 2023, and expands consumer rights to include:
Right to Correct: Consumers can request correction of inaccurate personal information.
Right to Limit Use and Disclosure of Sensitive Personal Information: Provides consumers control over sensitive personal information (e.g., social security numbers, precise geolocation).
Data Minimization and Storage Limitation: Businesses must only collect, use, and retain personal information necessary to fulfill the purposes disclosed to the consumer.
Contractual Requirements: Strengthens requirements for contracts between businesses and third parties processing personal information.
Risk Assessments and Audits: Requires businesses to perform regular risk assessments and audits concerning their privacy practices.
The Lawsuit
A recent class action filing in California against Patagonia argues that while customers who call Patagonia are told that the conversation “may be recorded for quality training purposes,” they are not informed that the information will be accessed by a third party (Talkdesk), or how that third party will use the data. This lack of disclosure would appear to violate the right-to-know provision of the CCPA and the data minimization and storage limitation provision of the CPRA.
The lawsuit also claims that there is economic value in the data being captured as it is used to further improve existing AI tools and also inform the development of new capabilities within those platforms.
The core issue is that the disclosures used today aren't comprehensive enough to cover all the ways the data is used by the companies involved, or to reveal that the data may be used by companies other than the one the consumer is directly interacting with.
In this case, the customer consented to Patagonia using their data in limited ways (a disclosure that still was not comprehensive enough even if the data had remained inside Patagonia's four walls) but did not consent to the data being processed, used, and stored by Talkdesk.
The issues raised in this lawsuit aren't specific to Talkdesk; they would apply to any solution you are using that involves NLP, sentiment analysis, generative AI, CX analytics, AI-based quality assurance and management, etc. Similar lawsuits have been brought against Navy Federal Credit Union and Home Depot for their use of other vendors' AI tools during customer interactions.
The typical disclosure of "this call may be recorded for training and quality purposes" is likely not detailed enough to satisfy California's privacy laws when it comes to AI usage.
Our Recommendations
We recommend you take the following two steps immediately, out of an abundance of caution:
Remove any and all testimonials or mentions of AI technology from your website and your vendors' websites immediately. Much of the discovery by law firms determining which companies to target under these California laws is being done by scrubbing websites for testimonials, case studies, logos, etc. In the lawsuit examined above, the plaintiffs specifically cited a case study found on Talkdesk's website.
Review your disclosures with your legal and compliance teams and update them as needed. Pay particular attention to the California Invasion of Privacy Act and the GDPR (assuming you serve any customers in the EU). For now, these are some of the most stringent privacy laws in the world and are being used as weapons by law firms looking to file class action lawsuits.