Mechanisms exist to refrain from archiving Personal Data (PD) elements if those elements in a dataset will not be needed after the dataset is archived.
Mechanisms exist to remove Personal Data (PD) elements from a dataset prior to its release if those elements in the dataset do not need to be part of the data release.
Mechanisms exist to remove, mask, encrypt, hash or replace direct identifiers in a dataset.
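The control above can be illustrated with a minimal sketch. This is not a prescribed implementation; the salt value, field names and masking policy are hypothetical, and a production deployment would manage the salt as a secret and select techniques per data element.

```python
import hashlib

# Hypothetical salt; in practice, store and rotate this as a managed secret.
SALT = b"example-salt"

def hash_identifier(value: str) -> str:
    """Replace a direct identifier with a salted SHA-256 digest (pseudonymization)."""
    return hashlib.sha256(SALT + value.encode("utf-8")).hexdigest()

def mask_identifier(value: str, keep: int = 2) -> str:
    """Mask all but the first `keep` characters of a direct identifier."""
    return value[:keep] + "*" * (len(value) - keep)

# Illustrative record: hash the SSN, mask the name, retain the non-direct identifier.
record = {"name": "Jane Doe", "ssn": "123-45-6789", "zip": "90210"}
deidentified = {
    "name": mask_identifier(record["name"]),
    "ssn": hash_identifier(record["ssn"]),
    "zip": record["zip"],
}
```

Hashing is deterministic, which preserves the ability to join records, while masking is irreversible; the control leaves the choice of remove, mask, encrypt, hash or replace to the data's sensitivity and intended use.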
Mechanisms exist to manipulate numerical data, contingency tables and statistical findings so that no person or organization is identifiable in the results of the analysis.
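One common manipulation of contingency tables is small-cell suppression. The sketch below, with an assumed threshold of 5, suppresses counts small enough to single out individuals; the threshold and representation are illustrative only.

```python
def suppress_small_cells(table, threshold=5):
    """Replace contingency-table counts below the threshold with None,
    so that small groups of people cannot be singled out in published results."""
    return [[c if c >= threshold else None for c in row] for row in table]

published = suppress_small_cells([[10, 3], [7, 5]])
```

Complementary suppression of additional cells is often needed as well, since row and column totals can otherwise reveal a suppressed count by subtraction.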
Mechanisms exist to prevent disclosure of Personal Data (PD) by adding non-deterministic noise to the results of mathematical operations before the results are reported.
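A standard way to add non-deterministic noise to query results is the Laplace mechanism from differential privacy. The sketch below assumes a counting query (sensitivity 1) and an illustrative epsilon value; it is one possible realization of the control, not a mandated algorithm.

```python
import math
import random

def noisy_count(true_count: int, epsilon: float = 1.0) -> float:
    """Return the count with Laplace noise added before reporting.

    For a counting query the sensitivity is 1, so the noise scale is 1/epsilon.
    The Laplace sample is drawn via the inverse CDF from a uniform draw.
    """
    scale = 1.0 / epsilon
    u = random.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise
```

Because the noise is random, repeated executions of the same query return different results, which is what prevents an analyst from recovering an exact count that could disclose Personal Data.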
Mechanisms exist to perform de-identification of sensitive/regulated data, using validated algorithms and software to implement the algorithms.
Mechanisms exist to perform a motivated intruder test on the de-identified dataset to determine if identifiable data remains or if the de-identified data can be re-identified.
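Part of a motivated intruder test is checking whether quasi-identifier combinations in the de-identified dataset are unique, since unique combinations can be linked to external sources to re-identify individuals. The sketch below is a simplified uniqueness check; field names and records are illustrative, and a full test would also attempt linkage against plausibly available public data.

```python
from collections import Counter

def unique_quasi_identifier_rows(records, quasi_ids):
    """Return records whose quasi-identifier combination is unique in the
    dataset -- each such record is a candidate for re-identification."""
    combos = Counter(tuple(r[q] for q in quasi_ids) for r in records)
    return [r for r in records
            if combos[tuple(r[q] for q in quasi_ids)] == 1]

# Illustrative de-identified records: the third row is unique on (zip, birth_year).
dataset = [
    {"zip": "90210", "birth_year": 1980, "dx": "A"},
    {"zip": "90210", "birth_year": 1980, "dx": "B"},
    {"zip": "10001", "birth_year": 1975, "dx": "C"},
]
at_risk = unique_quasi_identifier_rows(dataset, ["zip", "birth_year"])
```

If any rows are returned, the de-identification should be strengthened (e.g. by generalizing or suppressing quasi-identifiers) before release.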
Mechanisms exist to use aliases to name assets that are mission-critical and/or contain highly-sensitive/regulated data, such that the aliases are unique and not readily associated with a product, project or type of data.
Mechanisms exist to identify and document the location of information and the specific system components on which the information resides.
Automated mechanisms exist to identify data by classification type to ensure adequate cybersecurity & data privacy controls are in place to protect organizational information and individual data privacy.