The Security of Consumer Data
Trust: The Currency of AI, Part II
In our last newsletter, we started a series called “Trust: The Currency of AI.” We shared how companies building and implementing AI consumer products will soon be held to regulatory compliance standards, and we encouraged these companies to start implementing processes to cultivate trust with consumers today. We defined “The Three Pillars of Trust” as: (1) The Sanctity of Truth, (2) The Security of Consumer Data, and (3) The Ethics of Companies and Their Leaders. In Part I, we discussed The Sanctity of Truth, emphasizing that to build trust with customers, you have to consistently deliver on quality and accuracy.
Today’s newsletter continues the series with Part II: The Security of Consumer Data.
The Security of Consumer Data
High-quality data is core to building useful, accurate, and trustworthy AI. Some executives believe access to data is running out, and whether that is hyperbole or not, one thing is true: data is the new oil. New licensing deals are being struck between major AI companies and media companies; for example, OpenAI recently inked a licensing deal with Vox Media and The Atlantic to train ChatGPT. Outside of major deals, many consumer apps are rewriting their Terms of Service so that they can use consumer data to train their models by default. Adobe experienced notable backlash over its terms for AI training and the licensing of consumer content, largely sparking concerns about trust.
AI has, in many ways, turned consumers of the internet into data workers. From web scraping to tracking the intimate details of how consumers use the internet, behavior online is being captured to optimize application performance. In the age of AI, it is also being captured for sale. Suddenly, the ability to engage online hinges on agreeing that user images, audio, videos, messages, stories, shopping behavior, and more can be sold, licensed, and sub-licensed. How do we maintain trust for the everyday consumer to share their data and information on the internet, without the threat of that data being used in ways they didn’t agree to or expect?
I just cancelled my Adobe licence after many years as a customer.
The new terms give Adobe "worldwide royalty-free licence to reproduce, display, distribute" or do whatever they want with any content I produce using their software.
This is beyond insane. No creator in their… x.com/i/web/status/1…
— Sasha Yanshin (@sashayanshin)
4:37 PM • Jun 7, 2024
There is a responsibility for companies to protect consumer data by:
(1) Allowing consumers to freely use the internet without becoming default data workers.
(2) Protecting and securing consumer data from fraud and exploitation.
(3) Maintaining consumers’ right to choose how, where and when their data is used for profit.
At Everi, we’re building a data platform that manages trust between consumers and businesses to accelerate AI while championing consumer rights. Given we’re still learning about AI, we want to equip businesses with the best practices they need to build trust around protecting and securing consumer data.
Prioritizing Consumer Data Rights
For enterprise companies and scaling AI startups, it is imperative that you cultivate trust in your business by protecting and securing consumer data. Below are three best practices to help you start earning trust today:
Be a Security-First Company. In 2023, data breaches increased by 20%, affecting 353 million victims of data leaks. In a world where consumers speak to chatbots and automated voice agents more than to people, how can we ensure personal data is safe and secure? The most obvious best practice is investing in security training for your internal teams, especially team members who handle customer data. Sales and customer support teams are often overlooked in data privacy training, yet they regularly work with personal data. Maintain compliance training with customized, company-specific use cases. Security is typically measured on a spectrum from PII to NPI, but something as simple as an email leak can break trust given the rise of AI-powered spam.
Categories of Consumer Data (source)
Security also relates to what you do with consumer data once it is logged. In Q4 2023, there were 7.3 billion unwanted spam calls globally. As it turns out, not all of these calls stem from security breaches; many result from data being sold as a revenue stream without consumers ever knowing it is being shared. Security goes hand in hand with transparency. If consumer data will be used or shared, tell consumers why and how, and make sure sharing that data provides value beyond the revenue stream for your company. Will sharing this data improve personalization? Will it help you verify a customer’s eligibility for features or rewards? Make the value proposition of how data is shared worthwhile.
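To make the email-leak point above concrete, here is a minimal sketch, in Python, of redacting email addresses and phone numbers from a support transcript before it is logged or shared internally. The patterns and names are illustrative assumptions, not a complete PII solution or a description of any particular product.

```python
import re

# Illustrative only: redact common PII (email addresses, US-style phone numbers)
# from a support transcript before it is logged or shared with other teams.
# These patterns are assumptions and will not catch every format of PII.
EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")
PHONE_RE = re.compile(r"\b(?:\+?1[-.\s]?)?\(?\d{3}\)?[-.\s]?\d{3}[-.\s]?\d{4}\b")

def redact_pii(text: str) -> str:
    """Replace email addresses and phone numbers with placeholder tokens."""
    text = EMAIL_RE.sub("[REDACTED_EMAIL]", text)
    text = PHONE_RE.sub("[REDACTED_PHONE]", text)
    return text

transcript = "Customer: my email is jane.doe@example.com, call me at 555-123-4567."
print(redact_pii(transcript))
# Customer: my email is [REDACTED_EMAIL], call me at [REDACTED_PHONE].
```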
Prioritize Consumer Consent. Companies have a responsibility to maintain consumers’ right to choose. There is a difference between introducing products that consumers don’t yet know they need and ignoring the feedback and rights of consumers. Many consumers are complaining about AI, but that doesn’t mean AI has no place across industries. That said, how consumers participate in that economy should be a choice. As you transition your company into the age of AI, consider consumer consent first, before anything else. Publicize changes to your Terms of Service and provide opt-in and opt-out choices for how data is used. Let consumers choose. If you’re already doing this well, we’d love to hear from you and share your story!
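As a small illustration of consent-first data use, the sketch below gates a record’s inclusion in model training on an explicitly recorded opt-in and defaults to “no” when no choice has been captured. The schema and purpose labels are assumptions made for the example, not a real Everi schema.

```python
from dataclasses import dataclass

# Illustrative only: gate secondary uses of customer data on an explicit,
# recorded consent flag. Field names and purpose labels are assumptions.
@dataclass
class ConsentRecord:
    customer_id: str
    purpose: str        # e.g. "model_training", "personalization"
    granted: bool
    recorded_at: str    # when the customer's choice was captured

def may_use_for(records: list[ConsentRecord], customer_id: str, purpose: str) -> bool:
    """Default to 'no': data is usable only if the customer explicitly opted in."""
    return any(
        r.customer_id == customer_id and r.purpose == purpose and r.granted
        for r in records
    )

consents = [ConsentRecord("cust-42", "model_training", True, "2024-06-01T10:00:00Z")]
print(may_use_for(consents, "cust-42", "model_training"))  # True: explicit opt-in
print(may_use_for(consents, "cust-42", "marketing"))       # False: never asked, so excluded
```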
Ethically Source Training Data. One of the largest concerns today is how companies procure training data. The definition of “publicly available” has been exploited in many ways, leading to major controversy around copyright law. Cara, a new social platform for artists, has seen usage leap over the past month thanks to its commitment to protecting creatives from data mining. To maintain trust with consumers, companies need to be transparent about their training data and ensure it does not infringe on copyright. If you source data through third parties, ask them about their licensing and procurement processes. Arm yourself with information so that your customers can hold you accountable. Publicize your training data sources, if possible.
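One lightweight way to stay accountable is to keep a provenance record for every training-data source. The sketch below shows one possible shape for such a manifest; the fields and values are assumptions for illustration, not a prescribed standard.

```python
import json
from dataclasses import dataclass, asdict

# Illustrative only: record where each training-data source came from and under
# what terms, so licensing questions can be answered (and published) later.
@dataclass
class TrainingSource:
    name: str          # e.g. "Example news archive"
    supplier: str      # who provided the data
    license_type: str  # e.g. "direct license", "public domain", "opt-in user content"
    license_ref: str   # contract ID or URL documenting the terms
    collected_on: str  # date the data was procured

sources = [
    TrainingSource("Example news archive", "Example Media Co.",
                   "direct license", "contract-2024-017", "2024-05-15"),
]

# A manifest like this can back up public statements about training-data provenance.
print(json.dumps([asdict(s) for s in sources], indent=2))
```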
Securing and protecting consumer data positions companies as trustworthy and ethical. With the best practices we shared, you’ll take tangible steps in the right direction to maintain and cultivate trust with your end customers, and to drive engagement and brand loyalty that translate into revenue growth. In the next post, we’ll talk about the third pillar of trust: “The Ethics of Companies and Their Leaders.”
The key building block of truthful, accurate AI is clean, accurate data. At Everi AI, we help companies curate high quality data to train accurate and precise AI models. We partner with companies to build AI products that work for Everi one, Everi where.
If you have a story about how you are prioritizing consumer data rights, we’d love to hear from you and share your story. Email us at [email protected] today!