Secure Sensitive Data with Privacy-Preserving (Enhancing) Computation

Jorge Hernández | January 18, 2023

Privacy-Preserving (Enhancing) Computation (PPEC) helps ensure that no party gains access to another party's data. As a result, multiple parties can collaborate to monetize data without revealing it to each other. Since data remains encrypted during computation, the risk of leakage or theft is significantly reduced.

Jorge Hernandez, Senior Machine Learning Research Engineer, is one of the expert engineers who presented their unique perspectives and thought leadership on the top 10 software engineering trends shaping the next generation of technology. He spoke to us about PPEC, a game-changing trend that will help organizations meet the pace and quality demanded of today's and tomorrow's software solutions.

 


 

What is Privacy-preserving (enhancing) Computation and what is it used for?  

Data is the new uranium: powerful, yes, but also dangerous to hold and handle without care. As the "UN Handbook on Privacy-Preserving Computation Techniques" [1] tells us: "... data is often sensitive, including details about individuals or organizations that can be used to identify them, localize their whereabouts, and draw conclusions about their behavior, health, and political and social agendas. In the wrong hands, such data can be used to cause social, economic, or physical harm."

Privacy-preserving (enhancing) Computation is a general name for a set of techniques and technologies that help to maintain privacy and ensure the safety of personal data, while still allowing the data to be used for other purposes. 

 

Why is Privacy-preserving (enhancing) Computation a trend that will shape business in 2023?  

Three main reasons: the increasing cost of data breaches, the increasing amount of regulation in the space, and the maturing of Privacy-preserving (enhancing) Computation technologies and techniques. 

Currently, per IBM's "Cost of a Data Breach 2022 Report" [2], the average total cost of a data breach in the United States is $9.44M USD, and for some industries the cost is even higher: the average total cost of a breach in the healthcare sector is $10.10M USD. Furthermore, these costs climb every year, making mitigating technologies and techniques an attractive investment.

Due to the amount of damage, governments around the world have seen fit to regulate the use of data. Laws like the European Union’s General Data Protection Regulation (GDPR) carry hefty fines for non-compliance, turning data privacy and security into a critical obligation for all data processors.

Lastly, recent advances in Privacy-preserving (enhancing) Computation now make it viable to apply these technologies and techniques in a wider range of systems and scenarios, particularly in the fields of Artificial Intelligence and Machine Learning (AI/ML).

 

Why did Encora select it as a rising trend in 2023?  

At Encora, we have a firm commitment to both privacy and security. Seeing the rising cost of improper use of data coupled with technological advances, we feel that the time has come for the widespread adoption of Privacy-preserving (enhancing) Computation as an integral part of all computing systems that handle sensitive personal data. 

 

What makes Privacy-preserving (enhancing) Computation different from traditional data privacy technologies/techniques?  

Unlike traditional data privacy technologies and techniques, which center on protecting data-in-transit and data-at-rest, Privacy-preserving (enhancing) Computation tackles the problems of protecting data-in-use and of protecting data that must be publicly shared (where encryption alone is inapplicable).
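As a toy illustration of protecting data-in-use, additive secret sharing (one secure multi-party computation technique under the PPEC umbrella) lets parties jointly compute a sum while no single party ever sees another's input. The following is a minimal Python sketch with made-up values, not a production protocol:

```python
import secrets

# Toy additive secret sharing over a prime field: each private value is
# split into random shares that individually reveal nothing about it.

P = 2**61 - 1  # field modulus

def share(value, n_parties=3):
    # First n-1 shares are uniformly random; the last makes the sum work out.
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

def reconstruct(shares):
    return sum(shares) % P

# Two parties' salaries (hypothetical numbers), summed without either
# salary ever being revealed to the other party:
a_shares = share(70_000)
b_shares = share(55_000)
sum_shares = [(x + y) % P for x, y in zip(a_shares, b_shares)]
print(reconstruct(sum_shares))  # 125000
```

Each participant holds only one share of each input, so the computation on shares is the "data-in-use" protection: inputs stay hidden, yet the aggregate result is exact.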

 

Can you speak about your experience with Privacy-preserving (enhancing) Computation? 

Working in the AI/ML field, there is a range of Privacy-preserving (enhancing) Computation technologies and techniques that are attractive to implement. Personally, I’ve explored the application of both Federated Learning and Synthetic Data. For the latter, I worked on a system operating in the low-shot/few-shot learning regime that benefited greatly from synthetic data.
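The core idea of Federated Learning is that raw data stays on each client's device and only model parameters are shared. Here is a minimal sketch in pure Python, using a trivial "mean" model and hypothetical client datasets to show the federated-averaging step (a simplification, not a real training loop):

```python
# Minimal federated-averaging sketch. Each "client" fits a trivial model
# (the mean) on its private data; only the local parameter, never the raw
# data, is sent to the server for weighted aggregation.

client_data = [          # private datasets that never leave their owners
    [1.0, 2.0, 3.0],
    [10.0, 20.0],
    [5.0],
]

def local_update(data):
    # Client-side step: compute the local model parameter.
    return sum(data) / len(data)

def federated_average(updates, weights):
    # Server-side step: combine parameters, weighted by local data size.
    total = sum(weights)
    return sum(u * w for u, w in zip(updates, weights)) / total

updates = [local_update(d) for d in client_data]
weights = [len(d) for d in client_data]
global_param = federated_average(updates, weights)
print(global_param)  # 41/6 ≈ 6.833, the mean over all data combined
```

Because the aggregation is weighted by each client's sample count, the result matches what centralized training on the pooled data would give, without the data ever being pooled.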

 

In your experience, where do clients stand on the topic?   

In my experience (working in the AI/ML field), clients are naturally very concerned that all customer data is properly handled, so as to minimize the risk of harm in case of a security incident. This makes new technologies and techniques that help manage or reduce that risk incredibly attractive. Clients are willing to invest both time and money into the effort of implementing and integrating Privacy-preserving (enhancing) Computation into both current and future projects. 

 

How can Encora help clients evolve? 

At Encora, our teams have experience using the full spectrum of Privacy-preserving (enhancing) Computation technologies and techniques and integrating them into every phase of the Software Development Lifecycle. Given the nature of these technologies, this is a necessity for their proper use and deployment.

 

How will Privacy-preserving (enhancing) Computation impact software & digital product engineering?

While it may at first seem that Privacy-preserving (enhancing) Computation simply gives engineers and product managers a new set of items to check off their list, in truth the privacy and security assurances these technologies bring allow product and engineering teams to branch out and try new approaches to old problems, and can lead to the creation of products that were not business-viable before!

 

Can you provide real-world examples of how organizations are benefiting from Privacy-preserving (enhancing) Computation today?

There are two main benefits: de-risking the use of data and enabling new kinds of analysis on data that was previously not available due to privacy concerns. 

For the first benefit, the widespread adoption of Privacy-preserving (enhancing) Computation by the major cloud vendors has brought security benefits to all kinds of companies that handle consumer data and thus risk breaching one of the many regulations that govern its use (such as the GDPR).

For the second benefit, we have seen new products in both the blockchain space (e.g., Zerocash, Dfinity, QED-it, and R3) and the ML space (e.g., Alexa, Siri, Waymo, Tesla, and clinical research at Roche) appear or be significantly improved thanks to the adoption of Privacy-preserving (enhancing) Computation, as well as its use for novel data analysis in projects like the United Nations’ UN Global Platform.

 

Who benefits from Privacy-preserving (enhancing) Computation the most?  

We all do! As consumers of digital and computing products, having our data handled in a more secure manner is a clear win. These technologies are not just more hurdles for teams to jump through; they actually open new doors in both product development and engineering, since they allow new uses for data and give access to data that previously could not be used without risk.

 

How does Privacy-preserving (enhancing) Computation fit into larger IT and business initiatives?  

Privacy-preserving (enhancing) Computation will become an integral part of any system that handles potentially sensitive user data. This makes the integration of such technologies and techniques a clear legal and business necessity, particularly for organizations working in AI/ML, blockchain, fintech, and healthcare.

Companies will need to make the best use of these technologies to meet both regulatory and business objectives by fully integrating them as a necessary point of analysis during both technical and product planning and design. This will also require training members of the organization in the benefits, drawbacks, and correct use and implementation of these technologies and techniques.

 

What does Privacy-preserving (enhancing) Computation mean for privacy and compliance?

Privacy-preserving (enhancing) Computation is a clear win in both areas: it helps keep data in well-protected systems (or helps perform computation locally without sharing data), lowers the amount and kinds of data that need to be shared and revealed, and allows full encryption during many kinds of computation. These technologies and techniques make user data far safer and make compliance with data protection laws much easier, even for small organizations.
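One concrete form of "encryption during computation" is additively homomorphic encryption, such as the Paillier cryptosystem: multiplying two ciphertexts yields a ciphertext of the sum of the plaintexts, so a server can aggregate encrypted values it cannot read. The following is a toy Python implementation with small hardcoded primes, for illustration only (real deployments use keys thousands of bits long):

```python
import math
import secrets

# Toy Paillier keypair (illustration only; insecure key size).
p, q = 293, 433
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)
# mu = (L(g^lam mod n^2))^-1 mod n, where L(x) = (x - 1) // n
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)

def encrypt(m):
    # Fresh randomness r coprime to n makes each ciphertext unique.
    while True:
        r = secrets.randbelow(n - 1) + 1
        if math.gcd(r, n) == 1:
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return ((pow(c, lam, n2) - 1) // n * mu) % n

a, b = 17, 25
csum = (encrypt(a) * encrypt(b)) % n2  # multiply ciphertexts = add plaintexts
print(decrypt(csum))  # 42
```

The party doing the multiplication never needs the private key (`lam`, `mu`); only the holder of the private key can decrypt the final aggregate, which is exactly the compliance-friendly property described above.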

 

We sincerely thank Jorge Hernandez, Senior Machine Learning Research Engineer. The focus of this piece, PPEC, is one of ten technology trends featured in Encora’s 2023 Technology Trends eBook. You can read the eBook in its entirety by visiting Encora’s 2023 Technology Trends. 

 

"Due to the amount of damage and the number of people affected, governments around the world have, in recent years, seen fit to regulate the use of data. Laws like the European Union’s General Data Protection Regulation (GDPR), which carry hefty fines for non-compliance, have turned ensuring data privacy and security into a critical obligation for all data processors." 

Jorge Hernandez, Senior Machine Learning Research Engineer 

 


 

References: 

[1] UN Handbook on Privacy-Preserving Computation Techniques: https://unstats.un.org/bigdata/task-teams/privacy/index.cshtml 
[2] IBM Cost of a Data Breach 2022 Report: https://www.ibm.com/reports/data-breach 

 

About Encora 

Encora is a digital engineering services company specializing in next-generation software and digital product development. Fast-Growing Tech organizations trust Encora to lead the full Product Development Lifecycle because of our expertise in translating our clients’ strategic innovation roadmap into differentiated capabilities and accelerated bottom-line impacts. 

Please let us know if you would ever like to have a conversation with a client partner and/or one of our Innovation Leaders about accelerating next-generation product engineering within your organization. 

 

Contact us
