Welcome to AI Confidential

A newsletter from Opaque Systems

Greetings!

Collectively, we’re amassing data points faster than ever. By some estimates, we created 120 zettabytes of data in 2023, up from roughly 9 zettabytes just a decade earlier. That up-and-to-the-right trajectory of data creation has spawned the recent AI revolution, providing the fuel that propels the AI engine forward. Over the last year, though, enterprises have come to see this less as a vehicle they want to get into and more as a runaway train. Their nagging question: once our data enters an AI system, how does it remain secure?

More than two-thirds of IT and security leaders believe the accelerated data growth across their infrastructure is outpacing their ability to secure data and manage risk. On top of that, common AI tools aren’t inspiring much faith: one study of enterprise ChatGPT use found that an organization with 10,000 users entering 660 prompts a day can expect 158 incidents of sensitive data exposure per month, with source code the most common type of exposed data, followed by regulated data, intellectual property, and passwords and keys.

Things are evolving quickly, but security is clearly the missing layer for enterprises that want to harness the promise of AI. Our goal at Opaque Systems is to provide that layer, allowing companies to put their sensitive data to work with peace of mind. I’m thrilled to launch this newsletter not only to keep you up to date on what’s happening at the intersection of data security and AI, but also to showcase the work we’re doing to help enterprises securely realize that promise.

AI Confidential will be in your inbox every two weeks, so be on the lookout. We’re glad you’re here.

—Aaron Fulkerson, CEO, Opaque Systems

Moving Beyond Data Anonymization

Data anonymization has been a go-to security measure for many enterprises over the years. But as AI systems improve (and with the advent of generative AI), data anonymization is no longer enough. Below is an excerpt from an article written by our co-founder Rishabh Poddar, in which he unpacks the impact of advancing AI on data security and the importance of a solution like confidential computing.

Enterprise leaders should not have to delay opportunities to work with important clients and partners for fear of disclosing sensitive data. Aware of the challenges that come with current data anonymization practices, technology leaders came together to launch the Confidential Computing Consortium to accelerate the development and adoption of new approaches that secure data in use, wherever it is stored and however it is processed.

Confidential computing enables companies to secure sensitive data in trusted execution environments (TEEs). However, they need to extend that security with end-to-end encryption covering data at rest, in transit, and in use. That’s because data in use is typically left unencrypted in memory, making it more vulnerable to exploitation.
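
To make the “data in use” gap concrete, here is a minimal Python sketch (an editorial illustration, not code from the article or from Opaque’s platform). It uses the third-party cryptography package to show that encrypting data at rest and in transit is routine, while conventional processing still requires decrypting into plain memory, which is exactly the step confidential computing moves inside a TEE.

```python
# Minimal sketch of the gap confidential computing closes.
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()
fernet = Fernet(key)

record = b"patient_id=123, diagnosis=..."  # hypothetical sensitive record

# Easy to protect at rest and in transit: store or send the ciphertext.
ciphertext = fernet.encrypt(record)

# To actually compute on the data, a conventional system must decrypt it
# into ordinary memory, where the plaintext is visible to the host OS,
# administrators, or an attacker.
plaintext = fernet.decrypt(ciphertext)

# Confidential computing moves this decrypt-and-process step inside a
# hardware-isolated TEE, so the plaintext is never exposed outside the enclave.
print(plaintext.decode())
```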

With a purpose-built environment, companies can securely share and analyze data while maintaining complete confidentiality. They can augment current enterprise data pipelines with a confidential layer that protects sensitive and regulated data while enabling internal teams and partners to process encrypted data sets. Companies can also guarantee governance of their data with a hardware root of trust that provides verifiable privacy and control.

With this technology, teams and partners can:

  • Enable high-performance analytics and AI on encrypted data using familiar tools: With a trustworthy platform for secure data sharing, business teams can isolate sensitive data in TEEs and perform collaborative, scalable analytics and ML directly on encrypted data with tools such as Apache Spark and notebooks (see the sketch after this list).

  • Allow approved internal and external collaborative analytics, AI, and data sharing: Internal and external teams can share encrypted data or blended data sets with set policies, streamlining collaboration while keeping encrypted analytics results specific to each party.

  • Scale across enclaves, data sources, and multiple parties: A sensitive data-sharing tool can secure access across enclave clusters and automate cluster orchestration, monitoring, and management across workspaces without creating operational disruption, simplifying data management responsibilities.
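
The snippet below is an editorial sketch of the “familiar tools” point, not code from the article or from Opaque’s platform: it is ordinary PySpark computing shared aggregates over a hypothetical joint data set. On a confidential computing platform, code like this would run inside a TEE, so each party’s raw rows stay encrypted outside the enclave and only the agreed aggregates are shared.

```python
# Ordinary PySpark, shown only to illustrate the "familiar tools" idea.
# The partner names, columns, and values below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("confidential-analytics-sketch").getOrCreate()

# Hypothetical joint data set contributed by two partners.
claims = spark.createDataFrame(
    [("acme", "cardiology", 1200.0),
     ("acme", "oncology", 8800.0),
     ("globex", "cardiology", 950.0)],
    ["partner", "department", "claim_amount"],
)

# Each party sees only aggregate results, never the other's raw rows.
summary = claims.groupBy("department").agg(
    F.count("*").alias("num_claims"),
    F.avg("claim_amount").alias("avg_claim"),
)
summary.show()

spark.stop()
```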

Read the full article here

In the Lab

The latest happenings at Opaque Systems

Opaque CEO Aaron Fulkerson to speak at The Artificially Intelligent Enterprise virtual event

Many enterprises are on the precipice of transitioning AI systems from pilot projects to full-scale deployments. Our CEO Aaron Fulkerson will speak about this transition and how companies can ensure the security of sensitive data while moving from pilot to production. Join Fulkerson’s virtual presentation at the online event hosted by TechStrong on May 21. Register for free here. And in this recent interview with theCube, watch him talk about how Opaque can solve the problem for enterprises stuck in pilot.

Confidential Computing Summit™: June 5-6

Join the confidential computing community for a two-day event in June. Not only will you hear from our co-founder and CEO, but there will also be speakers from the likes of Meta, NVIDIA, Microsoft, and more. Over the two days, speakers will take the stage to explore the latest in confidential data and AI, data privacy, and building trustworthy AI. Learn more and register here.

Join Opaque co-founder Raluca Ada Popa at GenAI Summit in San Francisco

Public and private sector leaders will convene at San Francisco’s Palace of Fine Arts for three days (May 29-31) of discussion and workshops around the promise of next-gen AI. Join our co-founder Raluca Ada Popa at the event here.

Product update: Opaque Gateway

Leverage your LLMs, privately. Opaque Gateway serves as a privacy layer around your LLM of choice. With Opaque Gateway, you can seamlessly sanitize LLM prompts to hide sensitive data from external parties and LLM providers. Opaque’s Confidential Computing technology ensures that no third party, not even Opaque Gateway, sees the underlying prompt. To join the waitlist and get early access to the product, sign up here.
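
To give a rough feel for what prompt sanitization means in practice, here is a toy Python sketch, not Opaque Gateway’s actual API: it redacts a couple of obvious patterns (email addresses and key-like tokens) before a prompt leaves your boundary, with a hypothetical forward_to_llm placeholder standing in for the provider call. Opaque Gateway goes further by running this step inside confidential computing hardware, so the raw prompt is never exposed, even to the gateway itself.

```python
# Toy illustration of prompt sanitization; not Opaque Gateway's API.
import re

# Patterns that commonly indicate sensitive content in prompts.
REDACTIONS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "<EMAIL>"),                  # email addresses
    (re.compile(r"\b(?:sk|key|tok)[-_][A-Za-z0-9]{16,}\b"), "<SECRET>"),  # key-like tokens
]

def sanitize(prompt: str) -> str:
    """Replace sensitive-looking substrings before the prompt leaves your boundary."""
    for pattern, placeholder in REDACTIONS:
        prompt = pattern.sub(placeholder, prompt)
    return prompt

def forward_to_llm(prompt: str) -> str:
    # Hypothetical placeholder for a call to your LLM provider of choice.
    return f"(LLM response to: {prompt!r})"

raw = "Summarize the ticket from jane.doe@example.com; API key sk-abcdef1234567890XYZ"
print(forward_to_llm(sanitize(raw)))
```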

Rishabh Poddar Hits the Stage at SW2Con

Opaque co-founder and CTO Rishabh Poddar is set to speak tomorrow at AI developer conference SW2Con in Broomfield, CO. His talk will focus on securing generative AI in the enterprise. If you’re in the area, you can grab a ticket to the event here. If you’re not, we’ll be sure to share a recap following the talk.

Code for Thought

Worthwhile reads

🏛 Lawmakers push for data limits to mitigate security risks: The AI evolution means more personal data, which could easily lead to more cyberattacks. At a Senate subcommittee hearing last week, U.S. lawmakers including Sen. John Hickenlooper, D-Colo., stressed the importance of passing stricter user privacy frameworks that limit the amount of data companies can collect and give consumers more control over how those companies use their personal information.

⚙️ Innovation over security leaves businesses vulnerable: A new study finds that 70% of business executives care more about innovation than the security of their generative AI projects. In fact, fewer than a quarter reported that those projects are being secured at all, leaving them vulnerable to privacy risks.

⛓ Data breach regulations put pressure on cyber leaders: The CISO seat isn't a coveted one these days. Recent government regulations around the disclosure of data breaches hold CISOs liable for hacking incidents, sparking a need for cyber teams to conduct consistent security audits and develop effective response plans.

🔐 Certain third-party providers lag in protecting customer privacy: The Australian privacy commissioner warns that third-party data providers are failing to protect customer information in the wake of new data breaches. Recently, more than one million people had their personal information exposed after data owned by IT provider Outabox was published online. 
