UK researchers have developed a new PyTorch framework for preserving privacy in deep learning

  • 3 min read
  • 13 Nov 2018


UK professors and researchers have developed the first general framework for safeguarding privacy in deep learning, built on PyTorch. They report their findings in the paper "A generic framework for privacy preserving deep learning."

Using constructs that preserve privacy


The paper introduces a transparent framework for privacy-preserving deep learning in PyTorch. The framework puts a premium on data ownership and secure processing, and introduces a value representation based on chains of commands and tensors. This abstraction makes it possible to implement complex privacy-preserving constructs such as federated learning, secure multiparty computation, and differential privacy.
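
To make the chain idea concrete, here is a hypothetical, heavily simplified sketch of how such a representation could work. ChainTensor and its methods are illustrative names of ours, not the paper's API:

```python
import torch

class ChainTensor:
    """Toy model of a 'chain of commands and tensors': each privacy
    construct is a link wrapped around the raw torch.Tensor."""

    def __init__(self, tensor, chain=()):
        self.tensor = tensor    # the underlying data
        self.chain = chain      # e.g. (('pointer', 'alice'),)

    def send(self, worker_id):
        # Appending a 'pointer' link records that the data now lives remotely.
        return ChainTensor(self.tensor, self.chain + (("pointer", worker_id),))

    def __add__(self, other):
        # Operations are forwarded down the chain to the raw tensors.
        return ChainTensor(self.tensor + other.tensor, self.chain)

x = ChainTensor(torch.tensor([1.0, 2.0])).send("alice")
y = ChainTensor(torch.tensor([3.0, 4.0])).send("alice")
z = x + y
print(z.chain, z.tensor)   # the chain of links, plus the forwarded result
```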

Early results are demonstrated on the Boston Housing and Pima Indian Diabetes datasets. Apart from differential privacy, the privacy features do not affect prediction accuracy. The current implementation does introduce a significant overhead, which the authors plan to address in a later stage of development.
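
To see why differential privacy, unlike the other constructs, costs accuracy, consider the core step of DP training: gradients are clipped and perturbed with random noise before the model is updated. The sketch below is a generic illustration of that step, not code from the paper; the function name and parameter values are ours:

```python
import torch

def dp_noisy_grad(grad, clip_norm=1.0, noise_std=0.5):
    # Clip the gradient to a maximum L2 norm, then add Gaussian noise.
    # The noise masks any single example's contribution, but it also
    # perturbs the update, which is why DP reduces prediction accuracy.
    clipped = grad / max(1.0, (grad.norm() / clip_norm).item())
    return clipped + noise_std * torch.randn_like(clipped)

g = torch.tensor([3.0, 4.0])      # gradient with L2 norm 5.0
print(dp_noisy_grad(g))           # rescaled to norm 1.0, then perturbed
```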

Deep learning operations in untrusted environments


Secure multiparty computation (SMPC) is a popular approach for performing operations in untrusted environments without disclosing the data. In machine learning, SMPC can protect model weights while multiple worker nodes participate in training on their own datasets, a setup known as federated learning (FL). Even securely trained models remain vulnerable to reverse-engineering attacks; differentially private (DP) methods address this vulnerability.
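
As a concrete illustration of the SMPC idea, here is a minimal sketch of additive secret sharing, one standard building block: a value is split into random shares so that no single worker learns anything, yet the shares sum back to the original. This is a generic textbook construction, not the paper's implementation:

```python
import torch

FIELD = 2**32  # all arithmetic is modulo this field

def share(x, n_workers=3):
    # Split an integer tensor into n additive shares: the first n-1 are
    # uniformly random, the last one makes everything sum back to x.
    shares = [torch.randint(0, FIELD, x.shape) for _ in range(n_workers - 1)]
    shares.append((x - sum(shares)) % FIELD)
    return shares

def reconstruct(shares):
    return sum(shares) % FIELD

x = torch.tensor([42, 1000])
shares = share(x)                        # each share alone is pure noise
assert torch.equal(reconstruct(shares), x)
```

Because the scheme is linear, two shared values can be added share-wise on each worker, without ever reconstructing the inputs.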

The proposed PyTorch framework contains:

  • A chain structure in which transformations applied to a tensor, or the act of sending it to another worker, are represented as a chain of operations.
  • To move from a virtual to a real federated learning context, a concept called Virtual Workers is introduced. Virtual Workers reside on the same machine and do not communicate over the network (see the sketch after this list).
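
For illustration, this is roughly how Virtual Workers appear in PySyft, the open-source library released alongside the paper. The snippet assumes the early 0.x tutorial API, which has since changed substantially, so treat it as a sketch rather than current usage:

```python
import torch
import syft as sy  # PySyft, the library released with the paper

hook = sy.TorchHook(torch)                  # extend torch tensors with .send()/.get()
alice = sy.VirtualWorker(hook, id="alice")  # a worker living in this same process

x = torch.tensor([1.0, 2.0]).send(alice)    # x is now a pointer; data sits "on" alice
y = torch.tensor([3.0, 4.0]).send(alice)
z = x + y                                   # the chain forwards the op to alice
print(z.get())                              # tensor([4., 6.]) fetched back locally
```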

Results and conclusion


A reasonably small overhead is observed when Web Socket workers are used in place of Virtual Workers; it stems from communication between local browser tabs, where network latency is low. The same performance overhead is observed on the Pima Indian Diabetes dataset. The design relies on chains of tensors exchanged between local and remote workers. Reducing training time remains an open issue, as does securing MPC against malicious attempts to corrupt the data or the model.

For more details, read the research paper.
