Methods for Reducing Disclosure Risks When Sharing Data

Statistical agencies, survey organizations, academic researchers, and business establishments often collect data that they seek to share with others. Typically, groups that share data are ethically or legally obligated to protect the confidentiality of data subjects’ identities and sensitive attributes. This compilation includes a statement and online guidance covering statistical disclosure protection methods, technological solutions, and computer-science approaches to preserving privacy.

Data Privacy and Anonymization in R

With social media and big data everywhere, data privacy has become a growing public concern. Recognizing this, many organizations are promoting stronger privacy techniques, in particular differential privacy, a mathematical condition that quantifies privacy risk. In this course, you will learn to code basic data privacy methods and a differentially private algorithm that builds on key properties of differential privacy. With these tools in hand, you will learn how to generate a basic synthetic (fake) dataset that carries a differential privacy guarantee for public release.
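To make the idea of differential privacy more concrete, here is a minimal sketch of the Laplace mechanism, the standard building block such courses introduce. This is an illustrative example, not material from the course itself (the course uses R; Python is used here for a self-contained sketch), and the function name and parameters are the author's own.

```python
import math
import random

def laplace_mechanism(true_value, sensitivity, epsilon, rng=random):
    """Release true_value plus Laplace(scale = sensitivity / epsilon) noise.

    Smaller epsilon means more noise and a stronger privacy guarantee.
    """
    scale = sensitivity / epsilon
    # Sample Laplace noise by inverse-CDF from a uniform on (-0.5, 0.5).
    u = rng.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_value + noise

# Example: privately release a count query. A count has sensitivity 1,
# because adding or removing one person changes it by at most 1.
random.seed(0)  # fixed seed only so the example is reproducible
true_count = 1234
noisy_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5)
print(noisy_count)
```

The released value is close to the true count but randomized, so no individual's presence or absence can be confidently inferred from the output.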

Anonymisation Decision-making Framework

We present here, in full and for the first time, the Anonymisation Decision-making Framework, which, perhaps with minor modifications of detail, can be applied to just about any data where confidentiality is a concern but sharing is valuable.

The UK Anonymisation Network (UKAN)

The UK Anonymisation Network (UKAN) was set up in 2012 as a means of establishing best practice in anonymisation. It offers practical advice and information to anyone who handles personal data and needs to share it.