A lawsuit against retailer Patagonia, which the plaintiffs hope will eventually be certified as a class action, raises a variety of privacy and data leakage issues stemming from the company’s use of generative artificial intelligence (GenAI) in its customer service organization.
The problem: CIOs are required to disclose all data being collected and how it is being used, but with GenAI, CIOs today simply cannot know either of those things with any certainty. That makes corporate counsel very nervous.
The GenAI situation at issue involves three distinct players: the enterprise (in this litigation, that would be retailer Patagonia), a third party (here that would be Talkdesk, a cloud-based contact center vendor leveraging AI), and whatever AI products are being used by either company.
This lawsuit only directly involves Patagonia and Talkdesk, but how the AI programs leverage accessed data is very much in the picture.
What’s alleged
Talkdesk captures all customer service conversations from Patagonia customers and business partners, transcribes them and uses unspecified AI programs to “analyze callers’ words to determine what the caller is talking about and how the caller is feeling,” according to the California filing. “Neither Talkdesk nor Patagonia disclose to individuals that their conversations are being intercepted, listened to, recorded, and used by Talkdesk. They also do not obtain customer consent for Talkdesk to intercept, listen to, record, and use the contents of the call. This is illegal under California law.”
The lawsuit refers to one of the Talkdesk AI products, Copilot, which it describes as a GenAI “assistant that listens, guides, and assists contact-center agents during customer interactions. For example, Copilot will automatically suggest relevant responses to agents in chats, emails, calls, and texts based on the content of customers’ communications. Talkdesk saves all this information in the cloud and builds an interaction history, which enables companies to keep track of customers’ prior conversations — even if those conversations occurred in a different medium. All of this data is stored on Talkdesk’s servers.”
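What the complaint describes amounts to a cross-channel interaction history keyed to an individual caller. As a rough illustration only — this is not Talkdesk’s actual schema, and every name below is hypothetical — such a record might look something like this:

```python
from dataclasses import dataclass, field
from datetime import datetime

# Hypothetical sketch of the kind of cross-channel interaction
# history the complaint describes. None of these names come from
# Talkdesk; they are invented for illustration.

@dataclass
class Interaction:
    channel: str        # "call", "chat", "email", or "text"
    timestamp: datetime
    transcript: str     # transcribed or captured content
    topic: str          # AI-inferred subject of the contact
    sentiment: str      # AI-inferred emotional state

@dataclass
class CustomerHistory:
    customer_id: str
    interactions: list[Interaction] = field(default_factory=list)

    def add(self, interaction: Interaction) -> None:
        # Every contact, regardless of medium, lands in one profile,
        # which is what lets an agent see prior conversations.
        self.interactions.append(interaction)
```

The design point the lawsuit highlights is the merge itself: once every channel feeds one profile, the profile outlives any single conversation.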
Patagonia gives a generic description of data sharing in its privacy policies, the lawsuit said, but is not specific about what Talkdesk does with the data.
Headaches for CIOs
The potential headaches for enterprise CIOs involve where the data goes next. Mark Rasch, an attorney specializing in technology and privacy issues, offered a hypothetical scenario:
Suppose a consumer calls a retailer and gets into an animated argument about a possible product refund. The dispute is resolved with that retailer, but the GenAI emotion analysis labels the customer argumentative and a troublemaker.
Let’s further assume that the software is being used by hundreds of other retailers and other businesses. What if the software decides to subject that “troublesome” customer to longer-than-necessary hold times at all of those other businesses? Maybe it instructs those calls to be disconnected after the long holds. What if it recommends to its various clients that all refund requests from that customer be rejected?
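To make the mechanics of that hypothetical concrete, imagine a profile store shared across a vendor’s client businesses and consulted at call-routing time. The sketch below is entirely invented to illustrate Rasch’s scenario; no vendor is known to do this:

```python
# Purely hypothetical sketch of Rasch's scenario: a label assigned
# during one retailer's call influencing treatment at every other
# business that shares the same AI vendor.

SHARED_PROFILES = {
    # customer_id -> labels assigned by emotion analysis
    "cust-1138": {"argumentative", "troublemaker"},
}

BASE_HOLD_SECONDS = 60

def hold_time(customer_id: str) -> int:
    """Return the hold time a routing engine might assign."""
    labels = SHARED_PROFILES.get(customer_id, set())
    if "troublemaker" in labels:
        # The harm in the scenario: one dispute at one retailer
        # quietly penalizes the caller everywhere else.
        return BASE_HOLD_SECONDS * 10
    return BASE_HOLD_SECONDS

print(hold_time("cust-1138"))  # 600 seconds, at any client business
```

The caller never sees the label, only its effects, which is exactly why cause and effect would be so hard to demonstrate in court.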
“When we start using AI, we don’t know how we are being harmed. With AI, nobody can say for certain how the data is being used or how it will be used down the road,” Rasch said. “That means that it is likely impossible to show cause and effect.”
The root of the problem is the unpredictability of AI itself, especially GenAI, Rasch said. Hallucinations are not the only issue; because AI learns and adjusts, no one can know precisely how it will react, what data it will gather, what data it will infer, and how it might end up leveraging that data.
“Whenever you are collecting data for whatever purposes, the company collecting the data needs to advise the customer not only what is being collected, but all of the intended uses,” Rasch said. “That is especially true where data is being mined for AI purposes.” But the uncertainty of AI may make both tasks all but impossible.
That forces CIOs to draw clean distinctions, especially in privacy policies and other disclosures, between what their team knowingly and deliberately collects and how that is knowingly and deliberately used, versus what AI gathers and how AI later uses it. CIOs need to stick strictly to what they know.
Rasch stressed that this is not merely a matter of the words being used, but also how those words are said, the phone number and IP address used, the time of day, background sounds, and anything else AI can detect and analyze. Consumers are “voluntarily giving up a lot of stuff when they make a phone call,” Rasch said. “When most companies write a privacy policy, they don’t really consider how much of that information they are collecting.”
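One way to operationalize that discipline is a data inventory that keeps what the team deliberately collects in a separate column from what the AI layer detects or infers, so the privacy policy can be audited against both. Below is a minimal sketch; the field names are hypothetical and chosen to mirror the signals Rasch lists:

```python
# Minimal sketch of a disclosure inventory separating deliberate
# collection from AI-side detection and inference. Field names are
# hypothetical; the point is keeping the two categories distinct.

DECLARED_COLLECTION = {
    "name", "order_number", "call_recording",
}

AI_DETECTED_OR_INFERRED = {
    # Signals Rasch flags that rarely appear in privacy policies:
    "tone_of_voice", "caller_phone_number", "ip_address",
    "time_of_day", "background_sounds", "inferred_sentiment",
}

def undisclosed(policy_terms: set[str]) -> set[str]:
    """Everything captured or inferred that the policy never mentions."""
    return (DECLARED_COLLECTION | AI_DETECTED_OR_INFERRED) - policy_terms

# A policy written only around deliberate collection leaves the
# entire AI-inferred column undisclosed:
print(undisclosed({"name", "order_number", "call_recording"}))
```

A policy drafted only from the first set, as Rasch suggests most are, silently omits everything in the second.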
Legislation lags
Attorney Shavon Jones, who specializes in AI strategies for business development, said laws today have not yet caught up with AI, which puts CIOs in a precarious position.
“Where should the line be drawn for civil liability for AI? The fact is that I don’t think we will know where the line is for many years. AI litigation is an emerging subspecialty of law,” Jones said. “Whenever a field is emerging, there are only two ways to decide where the line is: legislatures and regulators can enact statutes and rules, and litigators can bring cases to trial in multiple jurisdictions and let those cases work their way through appeals until we get some high court decisions that inject certainty into the law. That will take many years of trial and error.”
Other attorneys agreed that this area of law has barely begun to evolve.
“The AI and customer data landscape is a legal minefield, and recent lawsuits highlight the urgent need for clear boundaries,” said Los Angeles attorney James E. Wright. “Enterprises pushing the envelope with AI need to recognize that customer data isn’t a free-for-all playground. The legal line must be drawn firmly at transparency and consent. Customers should know exactly how their data is being used and give explicit permission. Anything less is a breach of trust and, frankly, a ticking legal time bomb.
“The courts will continue to define this space, but companies need to get ahead of the curve. Playing fast and loose with AI and data will only lead to more lawsuits and eroded consumer trust. It’s time for enterprises to wise up and play by the rules.”