What started as an experiment less than two years ago is growing up and moving into its own home! TF Encrypted has seen tremendous growth thanks to partner contributions and with this move we want to further cement its community nature.
The Paillier homomorphic encryption scheme not only allows computation on encrypted data; it also provides an excellent illustration of modern security assumptions and a beautiful application of abstract algebra. In this first part of a series we cover the basics and the homomorphic operations it supports.
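As a taste of what the post covers, here is a toy sketch of Paillier encryption and its homomorphic addition, using tiny hard-coded primes purely for illustration (a real deployment would use primes of roughly 2048 bits and a vetted library):

```python
from math import gcd
from random import randrange

# Toy parameters -- assumption: illustrative only, far too small to be secure.
p, q = 17, 19
n = p * q
n2 = n * n
g = n + 1                                  # standard choice of generator
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)  # lcm(p-1, q-1)
mu = pow(lam, -1, n)                       # modular inverse (Python 3.8+)

def encrypt(m):
    # pick a random r coprime to n, then c = g^m * r^n mod n^2
    r = randrange(1, n)
    while gcd(r, n) != 1:
        r = randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    # L(c^lam mod n^2) * mu mod n, where L(x) = (x - 1) / n
    return ((pow(c, lam, n2) - 1) // n * mu) % n

# Homomorphic addition: multiplying ciphertexts adds plaintexts.
c = (encrypt(5) * encrypt(7)) % n2
assert decrypt(c) == 12
```

Note how neither ciphertext reveals its plaintext, yet their product decrypts to the sum of the two inputs.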
In this series of blog posts we go through how modern cryptography can be used to perform secure aggregation for private analytics and federated learning. As we will see, the right approach depends on the concrete scenario, and in this first part we start with the simpler case consisting of a network of stable parties.
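One classic building block for the stable-parties case is pairwise additive masking: every pair of parties agrees on a random mask that one adds and the other subtracts, so individual reports look random but the masks cancel in the sum. A minimal sketch, with an assumed modulus and in-memory "parties" standing in for real networking:

```python
from random import randrange

Q = 2**32                      # assumption: a modulus large enough for the sums
values = [10, 20, 30]          # each party's private input
n = len(values)

# Each pair (i, j) with i < j agrees on a shared random mask.
masks = {(i, j): randrange(Q) for i in range(n) for j in range(i + 1, n)}

# Party i adds masks it shares with higher-indexed parties
# and subtracts masks it shares with lower-indexed parties.
masked = []
for i in range(n):
    x = values[i]
    for j in range(n):
        if i < j:
            x = (x + masks[(i, j)]) % Q
        elif j < i:
            x = (x - masks[(j, i)]) % Q
    masked.append(x)

# The masks cancel pairwise, so the aggregate is exact.
assert sum(masked) % Q == sum(values) % Q
```

Each masked value is individually uniformly random, which is why this approach needs stable parties: a dropout leaves uncancelled masks in the sum.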
We apply TF Encrypted to a typical deep learning example, providing a good starting point for anyone wishing to get into this rapidly growing field. As shown, using state-of-the-art secure computation techniques to serve predictions on encrypted data requires nothing more than a basic familiarity with deep learning and TensorFlow.
Using TensorFlow as a distributed computation framework for dataflow programs we give a full implementation of a secure computation protocol with networking, in turn enabling optimised machine learning on encrypted data.
We take a typical CNN deep learning model and go through a series of steps that enable both training and prediction to instead be done on encrypted data using the SPDZ protocol.
First part in our series on the SPDZ secure computation protocol.
We have previously seen that redundancy in secret sharing can be used to recover from lost shares. In this third part of the series we use Reed-Solomon decoding methods to see that it can also be used to detect when some shares have been manipulated.
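The intuition is that honest shares all lie on one low-degree polynomial, so redundant shares give us something to cross-check. As a toy stand-in for the Reed-Solomon decoding developed in the post, here is a naive consistency check over an assumed small prime field: interpolate from the first `t+1` shares and verify the rest agree:

```python
P = 433  # assumption: a small prime field, for illustration only

def interp_at(points, x):
    # Lagrange interpolation of the polynomial through `points`, evaluated at x
    total = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (x - xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

def consistent(shares, t):
    # honest shares of a degree-t sharing must all lie on one polynomial
    base = shares[:t + 1]
    return all(interp_at(base, x) == y for x, y in shares)

# degree-1 sharing of the secret 42 via f(x) = 42 + 7x, with redundancy
shares = [(x, (42 + 7 * x) % P) for x in range(1, 5)]
assert consistent(shares, 1)

# manipulating any share breaks consistency
tampered = shares[:]
tampered[3] = (4, (tampered[3][1] + 1) % P)
assert not consistent(tampered, 1)
```

This naive check only detects manipulation; the Reed-Solomon decoding methods in the post go further and can also correct it.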
Overview of work done at Snips, a start-up building privacy-aware machine learning systems for mobile devices, on applying privacy-enhancing technologies. Mainly centered around secure aggregation for federated learning from user data, with some discussion of privacy from a broader perspective.
Efficient secret sharing requires fast polynomial evaluation and interpolation. In the second part of the series we go through how the well-known Fast Fourier Transform can be used for this.
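To give a flavour, here is a sketch of a radix-2 FFT over an assumed toy prime field (the finite-field setting is often called the number-theoretic transform): it evaluates a polynomial at all powers of a root of unity in O(n log n) operations instead of O(n²):

```python
P = 337  # assumption: toy prime with 8 dividing P - 1
n = 8

# brute-force an n-th root of unity of exact order n (fine for a toy field)
omega = next(w for w in range(2, P)
             if pow(w, n, P) == 1 and pow(w, n // 2, P) != 1)

def ntt(coeffs, w):
    # evaluate the polynomial at 1, w, w^2, ... via recursive radix-2 FFT
    if len(coeffs) == 1:
        return coeffs
    even = ntt(coeffs[0::2], pow(w, 2, P))
    odd = ntt(coeffs[1::2], pow(w, 2, P))
    half = len(coeffs) // 2
    out = [0] * len(coeffs)
    for i in range(half):
        t = pow(w, i, P) * odd[i] % P
        out[i] = (even[i] + t) % P          # f(w^i)
        out[i + half] = (even[i] - t) % P   # f(w^(i+half)), since w^half = -1
    return out

coeffs = [5, 1, 2, 0, 0, 0, 0, 0]  # f(x) = 5 + x + 2x^2, padded to length 8
evals = ntt(coeffs, omega)

# cross-check against direct evaluation at each power of omega
assert evals == [(5 + pow(omega, k, P) + 2 * pow(omega, 2 * k, P)) % P
                 for k in range(n)]
```

Running the same transform with the inverse root of unity (and a final scaling by `n^-1 mod P`) gives interpolation, which is what makes packed secret sharing efficient.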
First part in a series where we look at secret sharing schemes, including the lesser known packed variant of Shamir's scheme, and give full and efficient implementations. We start in this post by looking at the more typical textbook approaches.
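The textbook starting point is Shamir's scheme: hide the secret as the constant term of a random polynomial, hand out evaluations as shares, and recover by Lagrange interpolation at zero. A minimal sketch over an assumed small prime field:

```python
from random import randrange

P = 433     # assumption: a small prime field, for illustration only
T, N = 2, 5 # degree-T polynomial, N shares; any T+1 shares reconstruct

def share(secret):
    # random polynomial with the secret as constant term
    coeffs = [secret] + [randrange(P) for _ in range(T)]
    return [(x, sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P)
            for x in range(1, N + 1)]

def reconstruct(shares):
    # Lagrange interpolation evaluated at x = 0
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = share(123)
assert reconstruct(shares[:T + 1]) == 123  # any T+1 shares suffice
```

Any T or fewer shares reveal nothing about the secret, since they are consistent with every possible constant term.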
We build a simple secure computation protocol from scratch and use it to train small neural networks for basic boolean functions.
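The core primitive behind such a protocol is additive secret sharing: split each value into random shares that sum to it, and let the parties add their shares locally so no single party ever sees an input. A two-party sketch over an assumed small modulus:

```python
from random import randrange

Q = 2**16  # assumption: modulus chosen large enough for the values

def share(x):
    # split x into two shares that are individually uniformly random
    a = randrange(Q)
    return (a, (x - a) % Q)

def reconstruct(s):
    return sum(s) % Q

# each party adds its shares locally; neither sees 12 or 30 in the clear
x0, x1 = share(12)
y0, y1 = share(30)
z = ((x0 + y0) % Q, (x1 + y1) % Q)
assert reconstruct(z) == 42
```

Addition is free like this; multiplication is where the real protocol work begins, which is what the post builds up to.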
In this repost of a blog post from my early academic years, we toy with a small non-deterministic programming language where we can simply ask the computer to “guess” solutions.
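To illustrate the idea, here is a hypothetical `guess` helper sketched in Python: operationally, “guessing” is just exhaustive search over candidate values until a check passes (assuming small, finite domains):

```python
from itertools import product

def guess(*domains, check):
    # "guess" a tuple of values by exhaustive search over the domains
    # (assumption: domains are small enough to enumerate)
    for candidate in product(*domains):
        if check(*candidate):
            return candidate
    raise ValueError("no solution")

# ask the computer to guess a factorisation of 35
x, y = guess(range(2, 10), range(2, 10), check=lambda a, b: a * b == 35)
assert (x, y) == (5, 7)
```

A real non-deterministic language hides this search behind the semantics; the sketch just makes the backtracking explicit.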