Microsoft introduces WhiteNoise toolkit for differential privacy
Microsoft's WhiteNoise toolkit will be available to developers via GitHub.
Earlier this month, Microsoft hosted Build 2020 as an online event, announcing a host of updates to products and services such as Windows 10, Microsoft 365 and Edge. The company also announced tools such as Project Cortex and the Fluid Framework that are aimed at helping developers build better products more easily. The slew of updates announced at the annual developer conference also included a toolkit called WhiteNoise.
The company's WhiteNoise toolkit brings differential privacy to the artificial intelligence (AI) models that developers are working on.
For some clarity on the subject: differential privacy techniques allow developers to derive meaningful insights from "private data while providing statistical assurances that private information such as names or dates of birth can be protected." Simply put, these techniques ensure that users' personal information remains safe even as developers use the data to build predictive models, and even if attackers attempt to extract it.
The company's WhiteNoise toolkit achieves this by inserting a small amount of statistical noise into the raw data that developers feed to their machine learning models. This makes it difficult for hackers and malicious actors to reconstruct the original data from the AI model.
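The core idea behind this noise insertion is often illustrated with the Laplace mechanism, a standard differential privacy technique. The sketch below is a generic illustration of that mechanism, not WhiteNoise's actual API; the function name and parameters are hypothetical, and a real deployment would need careful choices of sensitivity and privacy budget.

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Illustrative (hypothetical) helper: release true_value with Laplace
    noise whose scale is calibrated to sensitivity / epsilon.

    - sensitivity: the most one individual's data can change the result
    - epsilon: the privacy budget (smaller means more noise, more privacy)
    """
    rng = rng or np.random.default_rng()
    scale = sensitivity / epsilon
    return true_value + rng.laplace(0.0, scale)

# Example: privately answer "how many people are over 30?"
# Adding or removing one person changes the count by at most 1,
# so the query's sensitivity is 1.
ages = [34, 29, 41, 55, 23, 38]
true_count = sum(1 for a in ages if a > 30)
private_count = laplace_mechanism(true_count, sensitivity=1, epsilon=0.5)
```

The released `private_count` is close to the true count on average, but any single individual's presence in the data is masked by the noise, which is what makes reconstructing the raw records difficult.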
Now, the company has announced that its WhiteNoise toolkit, which was developed in collaboration with researchers at the Harvard Institute for Quantitative Social Science and School of Engineering, will be available to developers via GitHub. Developers can also access this toolkit via Azure Machine Learning.