Microsoft Research
Published February 8, 2022, 17:09
Speaker: Melissa Chase, Principal Researcher, Microsoft Research Redmond
Training modern machine learning models requires large amounts of data, and often that data may be private or confidential. The area of privacy-preserving machine learning studies to what extent this private data may be exposed in the resulting model, and how this leakage can be reduced or prevented. This talk will first introduce the area of privacy-preserving machine learning, then give an overview of how we have been thinking about this problem at Microsoft Research. It will briefly summarize some of our work on different aspects of this problem, and then discuss in more depth one project that considers to what extent text models store recognizable information about users in the training data. Specifically, we will describe a new black-box membership inference attack that works on models that include a word embedding layer and takes advantage of the inherent structure in word embeddings.
Learn more about the 2021 Microsoft Research Summit: Aka.ms/researchsummit
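For context, a membership inference attack asks whether a specific example was part of a model's training data. The sketch below illustrates only the simplest, generic version of this idea (thresholding the model's loss on a candidate example); it is not the embedding-based attack described in the talk, and all names (the model callable, the candidate text, the threshold value) are hypothetical placeholders.

```python
# Minimal illustrative sketch of a loss-threshold membership inference test.
# Intuition: models tend to assign lower loss to examples they were trained on,
# so an unusually low loss is weak evidence that the example was a training member.
# This is a generic textbook-style baseline, not the attack from the talk.

import math
from typing import Callable, Sequence


def membership_score(predict_proba: Callable[[str], Sequence[float]],
                     text: str, label: int) -> float:
    """Cross-entropy loss the target model assigns to (text, label).

    Lower loss suggests the example is more likely to have been in the training set.
    """
    probs = predict_proba(text)      # model's class probabilities for the text
    p = max(probs[label], 1e-12)     # clamp to avoid log(0)
    return -math.log(p)


def infer_membership(predict_proba: Callable[[str], Sequence[float]],
                     text: str, label: int, threshold: float = 0.5) -> bool:
    """Guess 'member' when the model's loss on the example falls below a threshold."""
    return membership_score(predict_proba, text, label) < threshold


if __name__ == "__main__":
    # Stand-in "black-box" model that always predicts [0.9, 0.1] over two classes.
    fake_model = lambda text: [0.9, 0.1]
    print(infer_membership(fake_model, "some candidate sentence", label=0))
```

The attack presented in the talk goes further: rather than relying only on output confidence, it exploits the structure of the model's word embedding layer in a black-box setting, which the simple baseline above does not capture.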