Privacy and Learning for Collaborative Science
Privacy concerns often preclude the sharing of "raw" data, resulting in complex legal arrangements before scientific collaboration can begin. One example is in the field of neuroimaging, where researchers at different sites studying the same mental health condition would like to learn jointly from their locally collected brain scans. We propose an approach to this problem that uses distributed algorithms guaranteeing differential privacy. Differential privacy has emerged as a de facto standard for measuring privacy risk when performing computations on sensitive data and disseminating the results. Many machine learning algorithms can be made differentially private through the judicious introduction of randomization, usually in the form of added noise, into the computation. In this talk I will describe this setting, algorithms for differentially private decentralized learning, and potential applications for collaborative research in neuroimaging.
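As a minimal illustrative sketch (not part of the talk itself), the Python snippet below shows one standard way noise yields differential privacy: the Laplace mechanism applied to a bounded mean, as a single site might use before releasing a summary statistic. The function name dp_mean, the bounds, and the example values are hypothetical choices for illustration only.

    import numpy as np

    def dp_mean(values, lower, upper, epsilon, rng=None):
        """Release an epsilon-differentially private mean of bounded values.

        Each value is clipped to [lower, upper], so changing one record can
        move the mean of n values by at most (upper - lower) / n (the
        sensitivity). Adding Laplace noise with scale sensitivity / epsilon
        to the true mean gives epsilon-differential privacy for this release.
        """
        rng = rng or np.random.default_rng()
        x = np.clip(np.asarray(values, dtype=float), lower, upper)
        sensitivity = (upper - lower) / len(x)
        noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
        return x.mean() + noise

    # Hypothetical usage: one site releases a noisy mean of a local feature.
    local_feature = [0.42, 0.37, 0.51, 0.44, 0.39]
    print(dp_mean(local_feature, lower=0.0, upper=1.0, epsilon=0.5))

In a decentralized setting of the kind described above, each site could release such noisy statistics (or noisy gradients) and an aggregator could combine them without ever seeing the raw scans; the talk's specific algorithms are not reproduced here.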
Author: Anand Sarwate
Joint work with: H. Imtiaz, J. Mohammadi, B. Baker, A. Abrol, R.F. Silva, E. Damaraju, V.D. Calhoun, and S.M. Plis
Biography: https://www.ece.rutgers.edu/~asarwate/bio.php