A Data Privacy Impact Assessment (DPIA) is a process that allows the privacy team to identify, analyze, and minimize the data protection risks of a project or plan. Within GitLab, this applies to new product features, changes to existing product features, and the use of tools included in our Tech Stack.
DPIAs do not require that all risk be eliminated. Rather, a DPIA helps minimize identified risks and determine whether the remaining level of risk is acceptable under the circumstances, taking into account the desired benefits.
Conducting a DPIA is part of our accountability obligations under global data privacy laws and regulations; DPIAs demonstrate to both customers and Team Members a commitment to data protection by design and by default. In circumstances where a DPIA is required, failure to conduct one can result in a regulatory inquiry and financial penalties.
A DPIA can cover a single processing activity or a group of similar processing activities, and it can be used throughout the development and implementation of a project to identify and fix problems early, saving time and money. The review of risks and any mitigation measures may be continuous, especially if how or why a processing activity occurs changes.
A DPIA is intended to identify, analyze, and minimize "risks to the rights and freedoms of natural persons," which includes not only risks to privacy and data protection rights but also the risk of physical, material, or non-material damage. These types of risks may lead to discrimination, identity theft or fraud, financial loss, reputational damage, and other significant economic or social disadvantages. Ultimately, a DPIA is required when the level of risk to the rights and freedoms of natural persons is deemed "high".
In determining whether a DPIA is legally required for a processing activity, GitLab considers the following high-risk criteria:
- Does the processing use automation, including profiling, to make decisions that produce legal effects or could significantly affect an individual?
- Does the processing involve sensitive data or data processed on a large scale?
- Does the processing involve monitoring public areas on a large scale?
- Does the processing match or combine data sets from separate processing operations?
- Does the processing contemplate an innovative use or apply new technological or organizational solutions?
- Does the processing in itself prevent data subjects from exercising a right or using a service?
If you have questions about DPIAs or the process for how a DPIA is launched, reach out to the Privacy Team in the #legal Slack channel.