dacongy/distributedMachineLearning

This work combines differential privacy and multi-party computation protocols to achieve distributed machine learning. It is based on the paper "Distributed Learning without Distress: Privacy-Preserving Empirical Risk Minimization" (link -- To be added), which was accepted at NIPS 2018.
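To illustrate how multi-party computation and differential privacy can fit together, the sketch below shows one common pattern: each party additively secret-shares a local quantity (here, a scalar standing in for a gradient) so that only the aggregate is revealed, and calibrated noise is added before release. This is a minimal illustration of the general idea, not the exact protocol of the paper; the field modulus, the per-party values, and the noise scale are all illustrative assumptions.

```python
import random

PRIME = 2**61 - 1  # illustrative field modulus for additive sharing


def share(value, n_parties):
    """Split an integer into n_parties additive shares mod PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares


def reconstruct(shares):
    """Recover the secret by summing all shares mod PRIME."""
    return sum(shares) % PRIME


# Hypothetical per-party local values (e.g., scaled gradient entries).
local_grads = [3, 5, 7]
all_shares = [share(g, 3) for g in local_grads]

# Party i holds column i of all_shares and publishes only its column sum,
# so no single party's input is revealed -- only the aggregate is.
partial_sums = [sum(col) % PRIME for col in zip(*all_shares)]
aggregate = reconstruct(partial_sums)  # 3 + 5 + 7 = 15

# For differential privacy, calibrated noise (scale chosen from the
# sensitivity and privacy budget) would be added before releasing the result.
noisy_aggregate = aggregate + round(random.gauss(0, 1.0))
```

Sharing over a finite field keeps individual inputs information-theoretically hidden from any single party; the noise step then bounds what the released aggregate itself leaks.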

The code contains privacy-preserving implementations of L2-regularized logistic regression and linear regression models.
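For reference, the non-private baseline that such implementations protect is ordinary gradient descent on the regularized empirical risk. The sketch below shows one gradient step for L2-regularized logistic regression in plain Python; the toy data, learning rate, and regularization strength are illustrative assumptions, not values from the repository.

```python
import math


def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))


def grad_step(w, X, y, lam, lr):
    """One gradient-descent step for the L2-regularized logistic loss.

    Labels y are in {0, 1}; lam is the regularization strength,
    i.e. the gradient of (lam/2)*||w||^2 is lam*w.
    """
    n, d = len(X), len(w)
    g = [lam * w[j] for j in range(d)]  # regularizer gradient
    for xi, yi in zip(X, y):
        err = sigmoid(sum(wj * xj for wj, xj in zip(w, xi))) - yi
        for j in range(d):
            g[j] += err * xi[j] / n     # average logistic-loss gradient
    return [wj - lr * gj for wj, gj in zip(w, g)]


# Toy data: two separable points.
X = [[1.0, 0.0], [0.0, 1.0]]
y = [1, 0]
w = [0.0, 0.0]
for _ in range(200):
    w = grad_step(w, X, y, lam=0.01, lr=0.5)
# After training, the first weight is positive and the second negative,
# so the model classifies both toy points correctly.
```

In the distributed setting, the per-example gradient sums in a loop like this are what would be computed under MPC, with differentially private noise added to the released model.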
