How your vendors are exploiting your data for their own gain
The dark side of AI-driven solutions and platforms
Artificial intelligence (AI) is transforming the way businesses operate and interact with their customers. More and more service providers are using AI-driven solutions and platforms to offer innovative, more efficient, and streamlined services to their clients. However, these solutions and platforms depend on data to learn and improve, and that data may include your company’s data or your customers’ data. Have you ever wondered how your vendors are using your data to train their AI tools?
AI models require large amounts of training data to perform tasks such as speech recognition, natural language processing, computer vision, and recommendation effectively. However, sharing data for AI training poses significant risks to data privacy, security, and ownership. If your vendors are using your data to train their AI tools, they may be accessing, processing, storing, or transferring sensitive company data or personal information without your knowledge or consent, or the consent of your customers. They may also be using your data for purposes that are not aligned with your expectations or interests, such as developing or enhancing their own products or services, or sharing your data with other users or third parties.
An evolving litigation landscape
A recent lawsuit filed against Navy Federal Credit Union (NFCU) and Verint Systems Inc.1 illustrates some of the potential legal implications of data sharing for AI training. In the complaint, consumers accuse NFCU and Verint of recording their calls without their knowledge or consent and of using those recordings to train Verint’s AI models. The plaintiffs claim they were told only that their calls “may be recorded for quality assurance purposes”, not that Verint, a third-party vendor, would make the recordings, or that Verint would use them (including the information shared, the specific words and phrases used, and the tone, pitch, and pace of speech) to enhance, improve, or develop Verint’s own service offerings and to train its own AI models. The plaintiffs also allege that NFCU knew Verint uses the data it collects “…to advance its own business interests, because Verint’s contract states that it can do so.”
This lawsuit raises important questions about the transparency, accountability, and responsibility of data sharing and highlights the need to consider how your vendors are utilizing your company data.
Take action now
Data sharing for AI training can offer many benefits for both businesses and customers, such as improved services, an enhanced customer experience, and increased efficiency. But it also carries significant challenges and risks to data privacy, security, and ownership. You should know how your vendors are using your data to train their AI tools, and what measures they are taking to protect your data from unauthorized use.
As highlighted by the complaint against NFCU and Verint Systems, we can anticipate more litigation over the improper use of data, and more companies will find themselves facing lawsuits. Do not let your vendors use your company data in ways you are unaware of and have not approved. Be proactive: now is the time to review and update your legacy contracts to control how your vendors use your company data.
1Paulino v. Navy Federal Credit Union et al., Docket No. 3:24-cv-03298 (N.D. Cal. May 31, 2024)