Webinar Recap – US Quarterly Privacy Update: Consumer Privacy Law

As part of the TrustArc Privacy Insight Series, TrustArc Associate General Counsel – Privacy Intelligence K Royal, and TrustArc Privacy Legal Specialist Christina Fratschko presented the webinar “US Quarterly Privacy Update: Consumer Privacy Law” last week. This blog post will give a brief summary of that webinar; you can listen to the entire webinar and download the slides here.

In this quarterly session, the panelists provided:

An overview of updates to consumer privacy law in each state, noting which legislatures have killed their bills due to substantive issues or slated them for further study. Also discussed were commonalities among the states' bills, including the rights to access, correct, and delete personal information and the right to opt out of the sale of personal information.

A review of three federal bills proposing consumer rights: 1) a United States House of Representatives draft discussion bill establishing new safeguards around how companies can collect and use identifiable consumer data, 2) the Consumer Online Privacy Rights Act (“COPRA”), under which entities subject to U.S. Federal Trade Commission jurisdiction must honor individual rights, and 3) the Consumer Data and Security Act, which would establish a clear federal standard for data privacy protection, giving businesses a uniform standard rather than a patchwork of confusing state laws.

What employers and educational institutions need to know as the novel coronavirus pandemic grows around the world. The panelists recapped several pieces of guidance issued by regulatory authorities. The Office for Civil Rights, which enforces the Health Insurance Portability and Accountability Act (“HIPAA”), published an advisory on telehealth explaining how healthcare providers can communicate with patients and provide telehealth services through communication technologies. The U.S. Department of Education issued guidance on how and when educational institutions may share student personal information if a student has COVID-19. In addition, the U.S. Equal Employment Opportunity Commission published guidance on how employers can handle information about a COVID-19 case among their employees and protect their employees from COVID-19.

Watch this on-demand webinar to stay up to date on consumer privacy laws in the US. TrustArc also has a robust library of on-demand webinars available here.

Join us for the next webinar in the Privacy Insight Series: “COVID-19 – What are the Potential Impacts on Data Privacy?” with TrustArc SVP, Privacy Intelligence and General Counsel, Hilary Wandall on 4/8 at 9am PT. Register for the webinar here.

The TrustArc Privacy Insight Series is a set of live webinars featuring renowned speakers, presenting cutting edge research, tips, and tools. Events are free and feature informative discussions, case studies and practical solutions to today’s tough privacy challenges.

Managing Employee Privacy in the Face of COVID-19

View all TrustArc COVID-19 resources here.

Suddenly, the world came to an almost complete standstill. What few expected to happen in these modern times of continuous global travel and interconnectedness did happen after all. COVID-19, or the coronavirus, has caused governments to close national borders, issue ‘shelter at home’ warnings, and cancel public and private group gatherings and events. Many companies have adopted policies and remote work practices requiring or allowing their employees to work from home where their responsibilities can be managed off-premise.

At TrustArc, we receive a lot of questions about the privacy implications of the COVID-19 pandemic. What are employers allowed to do to control the spread and mitigate the effects of the virus, and what additional data can they process about their employees? How do employers ensure good data protection and governance practices for employees working from home? In this blog, we address the most common challenges organizations currently face.

Health Data on the Work Floor 

Even in times of crisis (perhaps particularly in times of crisis), the law still applies. This is the case for labour laws, for medical legislation, and also for privacy and data protection laws. Safeguards cannot just be thrown out of the window. That said, in many jurisdictions, the law permits organizations to process additional data to assist public health efforts by keeping employees safe and healthy, provided that certain safeguards and requirements are met. 

Guidance from the Regulators

One frequently asked question by both governments and employers relates to the collection and use of medical data, like body temperature. Earlier this week, the Executive Committee of the Global Privacy Assembly (GPA), a worldwide consortium of privacy and data protection regulators, released a statement on this issue:

“We are confident that data protection requirements will not stop the critical sharing of information to support efforts to tackle this global pandemic. The universal data protection principles in all our laws will enable the use of data in the public interest and still provide the protections the public expects. Data protection authorities stand ready to help facilitate swift and safe data sharing to fight COVID-19.”

The GPA also published a special webpage where guidance from national regulators and other authorities on how to deal with COVID-19 related data issues is posted. This guidance is not limited to specific regions or regulators but rather covers GPA members worldwide. 

What Employers Should Know

Even though we recommend you review the specific guidance available for the country where your organization operates, there are a few general rules that can be deduced from the regulator guidance on COVID-19. 

  • A distinction needs to be made between data that governments can collect and use and data that private entities can collect and use and the permitted legal basis for each. Governments in general will have more room to maneuver when processing personal data in the public interest (e.g. to safeguard public health) or even to process personal data in the vital interest of an individual. Under the GDPR and various other laws, these are identified explicitly as grounds to process personal data. For private entities, collection and use of personal data in the public interest can also be possible, but there needs to be a clear, direct and demonstrable link with the public interest. 
  • When processing medical and other health data, which includes noting whether employees have been diagnosed with or show symptoms of COVID-19, organizations should show restraint and process only the minimum personal data necessary to carry out their obligations related to the safety of the workforce, customers, and the public. In general, data protection and labour laws restrict the amount of detail on employee illnesses that employers may register. When processing is necessary and proportional (i.e. there is no other option but to collect data on (suspected) COVID-19 infections in the workplace), data minimization and confidentiality must be respected as a best practice. This means that as little information as possible should be collected and that this information should only be accessible to specific persons (not departments or groups) with a legitimate need to know it. For example, identifying victims of COVID-19 by name generally should not be allowed. Companies should also show restraint when processing data from visitors to their premises. There might be a good reason to measure a visitor's temperature before allowing access, but that doesn't mean the temperature reading or data about whose temperature was read should be retained after the access decision is made. In many jurisdictions, the processing of medical or other health data may require an organization to complete a privacy or data protection impact assessment and implement additional procedural safeguards and security controls.
  • Whatever data is collected and used in the fight against COVID-19, organizations should be upfront and transparent about what data they process for which reasons. Under almost all data protection regulations around the world, the transparency requirement is a key principle. Information should be accessible, easy to understand and include the reasons why (additional) data needs to be processed.

Working from Home 

For many organizations, the coronavirus crisis is the first time they will allow large groups of employees to work from home. In addition to impacting IT resources, it also requires organizations to consider a renewed approach to their data use and data protection practices. Even for organizations whose employees are used to working from home, it is advisable to review and, where relevant, revise policies and procedures to ensure that personal data will remain secure at all times. This review should also include an assessment of the organizational, physical and technical risks involved in working from home and accessing systems and data remotely, and of the security measures that may be advisable, such as using secure Wi-Fi networks and company-authorized VPNs. Though there may not be an alternative to working from home, conducting a privacy or data protection impact assessment of the work-from-home processing may help identify the risks to the rights and freedoms of your employees, customers and business partners. It also allows you to identify mitigation steps, such as specific technical and organizational measures, that your workers at home can implement.

We have created two top-10 lists with recommendations for both employers and employees on what to take into consideration when employees are working from home. Download the tips here.

CCPA Update: March Regulation Proposed Revisions

The California Department of Justice published yet another round of draft CCPA (California Consumer Privacy Act) regulations on March 7, 2020, with comments due March 27, 2020.

As stated in the notice, there were “around 100 comments received in response” to the previous draft regulations.

In the most recent release, the “redlined” version is color-coded to easily identify the original draft regulations, the first set of modifications, and this second set of modifications. The redlined and clean versions are published online.

According to the rule-making process, if changes are made to the proposed regulations, the changes will be published for public comment. Those comments are reviewed and, based on them, the published draft is either revised or accepted. Responses to comments will also be published along with the final regulations. The Office of the Attorney General previously provided guidance that if changes are “substantial and sufficiently related,” the changes will be published with an abbreviated comment period of 15 days (this modification and the last one met these requirements). If changes are not made or are “nonsubstantial and sufficiently related,” no publication for comments will occur. Only “major changes” would require a full 45-day comment period.

Some of the key changes include:

  • Removal of § 999.302, which was added in the previous version and provided that an IP address not otherwise associated with identifying information is not personal information. No sections were added or modified in the newest version to address IP addresses.
  • Addition of § 999.305(d) clarifying that “[a] business that does not collect personal information directly from a consumer does not need to provide a notice at collection to the consumer if it does not sell the consumer’s personal information.”
  • An addition providing that if a business that sells personal information denies a consumer's request to delete, and the consumer has not already made a request to opt out, the business shall ask the consumer whether they would like to opt out of the sale of their personal information and shall include either the contents of, or a link to, the notice of right to opt-out in accordance with section 999.306 (§ 999.313(d)(7)).
  • Clarification that the notice provided at the collection of employment-related information does not need to contain a link to the business's privacy policy.
  • Additional clarifications were added around information provided in response to consumers' requests to know (§ 999.305(f)(2)), what to publish about selling minors' data (§ 999.308(c)(9)), the description of biometric data to be provided where the biometric data itself cannot be provided in response to a request to know (§ 999.314(c)(4)), and descriptions of categories of sources and business purposes in the privacy policy (§ 999.308(c)(1)(e) and (f)).

Where are we now?

The comment period ends on March 27, 2020. Per guidance and history, any changes made to this version will result in publication of a new round of proposed regulations.

Once we reach a version wherein there are no changes made, according to the “Information about the rulemaking process,” the Office of the Attorney General will prepare and submit the final rulemaking record to the Office of Administrative Law (“OAL”) for approval, including the summaries and responses to each public comment received. The OAL has 30 working days to determine if all of the procedural requirements are met and if so, the regulations will be filed with the Secretary of State. 

Will enforcement start July 1, 2020?

At this time, enforcement remains slated to start on July 1, 2020. TrustArc will keep you posted on updates. To speak with a privacy expert about the California Consumer Privacy Act and how to comply, schedule a consultation today.

Tips to Securing a Privacy Budget

Privacy is historically underfunded in company budgets, even as “data privacy” has become a popular topic. Some stakeholders view regulations like the GDPR or CCPA as a one-time, check-the-box project, and therefore fail to fund privacy appropriately. However, those handling privacy management on a day-to-day basis know this is not the case when dealing with numerous complex privacy regulations. Privacy compliance is an ongoing adventure and cannot be approached like a task that will be crossed off the list once compliance has been reached. Developing a mature privacy program is crucial to ongoing risk management and compliance. So how do you do this when the proper resources aren't available? Luckily, there are several ways to get your stakeholders on board the privacy train:

Presenting a Solid Case for Privacy

Be Persuasive. When presenting your case to stakeholders, be ready to make a convincing argument as to why privacy resources are needed. Be prepared. Be firm. And be early – don't wait until the last minute to figure out your compliance plan when an enforcement date is quickly approaching.

Align Visions. Harmonize your privacy vision with the company vision and mission statement. If your company prides itself on its transparency, show that being transparent with your privacy policies and principles syncs with that vision of transparency. 

Case Studies. Nothing gets the point across like cold hard facts. Pull together a list of examples that show the importance of investing in privacy, such as recent regulatory fines, data breaches, and any consumer backlash related to data handling. These tangible cases demonstrate the severe repercussions when privacy is not taken seriously.

Privacy as a Differentiator. Show your stakeholders how privacy can drive innovation and set the company apart from its competitors. At CES 2019, Apple took out a large billboard stating “What happens on your iPhone, stays on your iPhone.” This marketing move focused on Apple's commitment to user privacy and used that commitment as a competitive edge.

Know What’s at Stake. Business leaders need to know how much they have to lose. Regulations, such as the GDPR and the CCPA, come with significant penalties for non-compliance. GDPR fines can total up to 20,000,000 EUR or 4% of total worldwide annual turnover of the preceding year (whichever is higher). Furthermore, stakeholders need to evaluate how potential loss of trust could negatively affect brand equity. 
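
As a rough illustration of how that fine ceiling is computed, here is a minimal sketch; the turnover figure is hypothetical, and this is not legal advice:

```python
# A minimal sketch of the GDPR upper-tier fine ceiling: the greater of
# EUR 20,000,000 or 4% of total worldwide annual turnover of the
# preceding year. The turnover figure below is hypothetical.

def gdpr_max_fine(annual_turnover_eur: float) -> float:
    """Return the upper-tier GDPR fine ceiling for a given turnover."""
    return max(20_000_000, 0.04 * annual_turnover_eur)

# A hypothetical company with EUR 2 billion in annual worldwide turnover:
print(f"EUR {gdpr_max_fine(2_000_000_000):,.0f}")  # EUR 80,000,000
```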

Set Goals and Targets 

Program Maturity Level. Conduct assessments to understand your company's maturity level. Explain to stakeholders the maturity level of the current privacy program, the resources needed, and the value of achieving a higher maturity level.

Compliance Metrics. As mentioned before, cold hard facts get the point across. Compile metrics on where the company stands: the number of privacy incidents, the number of data access requests, and the number of hours dedicated to employee training, for example. Or, conversely, point out that not knowing these key metrics suggests your organization may be at risk if they are requested by a regulator, shareholders, or prospective M&A partners. Review and analyze past privacy incidents to create qualitative metrics. Set goals for the future and explain what is needed to meet them, as in the sketch below.
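
A minimal, hypothetical sketch of tracking such metrics against goals; the metric names and figures are illustrative only, not TrustArc platform functionality:

```python
# Hypothetical current-state metrics versus goals; whether higher or lower
# is better depends on the metric (noted in comments).
current = {
    "privacy_incidents": 7,             # lower is better
    "access_requests_on_time_pct": 82,  # higher is better
    "training_hours": 40,               # higher is better
}
goals = {
    "privacy_incidents": 3,
    "access_requests_on_time_pct": 95,
    "training_hours": 120,
}

for metric, target in goals.items():
    print(f"{metric}: actual={current[metric]}, goal={target}")
```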

Let Technology Help 

Automate. Aim for consistency, repeatability and scalability by using technology to automate and operationalize your privacy processes. For risk assessments, use a tool to complete assessments and generate compliance reports, which saves time, increases accuracy, and improves record keeping. Move away from spreadsheets, which are difficult to update and keep current.

Simplification. Technology can simplify the complex world of privacy regulation and privacy management. Managing data privacy and compliance risk is nearly impossible without specialized technology to streamline the process. A data inventory and mapping solution makes it easy to standardize and operationalize the processes and creates a detailed, up to date inventory of data collected along with visual data flow maps of all business processes.

Visit our website to learn more about how TrustArc can simplify privacy management for the GDPR, CCPA and 500+ other global regulations with our comprehensive technology platform.

Leveraging GDPR ‘Legitimate Interests Processing’ for Data Science


Darren Abernethy, Senior Counsel TrustArc
Ravi Pather, VP Sales CryptoNumerics

The GDPR is not intended to be a compliance overhead for controllers and processors. It is intended to bring higher, consistent standards and processes for the secure treatment of personal data, and fundamentally to protect the privacy rights of individuals. Nowhere is this more true than in emerging data science, analytics, AI and ML environments, where the vast number of data sources creates a higher risk of identifying an individual's personal and sensitive information.

The GDPR requires that personal data be collected for “specified, explicit and legitimate purposes,” and that a data controller define a separate legal basis for each and every purpose for which, e.g., customer data is used. If a bank customer takes out a loan, the bank may only use the collected account and transactional data to manage that customer and fulfill its obligations in offering the loan. This is colloquially referred to as the “primary purpose” for which the data is collected. If the bank now wants to re-use this data for any purpose incompatible with or beyond the scope of the primary purpose, that is referred to as a “secondary purpose” and requires a separate legal basis for each and every such purpose.

For the avoidance of any doubt, if the bank wanted to use that customer's data for profiling in a data science environment, then under the GDPR the bank is required to document a legal basis for each and every separate purpose for which it stores and processes this customer's data. So, for example, ‘cross-sell and up-sell’ is one purpose, while ‘customer segmentation’ is another, separate purpose. If relied upon as the lawful basis, consent must be freely given, specific, informed, and unambiguous, and an additional condition, such as explicit consent, is required when processing special categories of personal data, as described in GDPR Article 9. Additionally, in this example, the loan division of the bank cannot share data with its credit card or mortgage divisions without the informed consent of the customer. This should not be confused with a further, separate legal basis available to the bank: processing necessary for compliance with a legal obligation to which the controller is subject (AML, fraud, risk, KYC, etc.).
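
One way to picture the documentation requirement is a register mapping each purpose to its legal basis. This is a minimal, hypothetical sketch; the purposes and bases are illustrative only and not legal advice:

```python
# A hypothetical record-of-processing register: every purpose needs its own
# documented legal basis; a None entry flags a gap to resolve before processing.
legal_basis_register = {
    "loan_servicing":         "contract (primary purpose)",
    "aml_and_kyc_checks":     "legal obligation",
    "customer_segmentation":  "legitimate interests (LIA on file)",
    "cross_sell_and_up_sell": None,  # secondary purpose, basis not yet documented
}

undocumented = [p for p, basis in legal_basis_register.items() if basis is None]
if undocumented:
    print("purposes missing a documented legal basis:", undocumented)
```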

The challenge arises when selecting a legal basis for secondary purpose processing in a data science environment as this needs to be a separate and specific legal basis for each and every purpose. 

It quickly becomes an impractical exercise for the bank, let alone an annoying one for its customers, to attempt to obtain consent for each and every purpose in a data science use case. In any case, evidence shows a very low rate of positive consent with this approach. Consent management under the GDPR is also tightening up: general, ambiguous, or “blackmail” consent clauses will no longer be deemed acceptable.

GDPR offers controllers a more practical and flexible legal basis for exactly these scenarios and encourages controllers to raise their standards towards protecting the privacy of their customers especially in data science environments. Legitimate interests processing (LIP) is an often misunderstood legal basis under GDPR.  This is in part because reliance on LIP may entail the use of additional technical and organizational controls to mitigate the possible impact or the risk of a given data processing on an individual. Depending on the processing involved, the sensitivity of the data, and the intended purpose, traditional tactical data security solutions such as encryption and hashing methods may not go far enough to mitigate the risk to individuals for the LIP balancing test to come out in favour of the controller’s identified legitimate interest.

If approached correctly, GDPR LIP can provide a framework with defined technical and organisational controls to support controllers’ use of customer data in data science, analytics, AI and ML applications legally. Without it, controllers may be more exposed to possible non-compliance with GDPR and the risks of legal actions as we are seeing in many high profile privacy-related lawsuits.

Legitimate Interests Processing is the most flexible lawful basis for secondary purpose processing of customer data, especially in data science use cases. But you cannot assume it will always be the most appropriate. It is likely to be most appropriate where you use an individual’s data in ways they would reasonably expect and which have a minimal privacy impact, or where there is a compelling justification for the processing.

If you choose to rely on GDPR LIP, you are taking on extra responsibility not only for implementing, where needed, technical and organisational controls to support and defend LIP compliance, but also for demonstrating the ethical and proper use of your customers' data while fully respecting and protecting their privacy rights and interests. This extra responsibility may include implementing enterprise-class, fit-for-purpose systems and processes (not just paper-based processes). Automation-based privacy solutions are available today as examples of such technical and organisational controls; CryptoNumerics CN-Protect, for instance, offers a systems-based (privacy-by-design) risk assessment and scoring capability that detects the risk of re-identification, with integrated privacy protection that retains the analytical value of the data for data science while protecting the identity and privacy of the data subject.

Data controllers first need to perform the GDPR three-part test to validate LIP as a legal basis. You need to:

  • identify a legitimate interest;
  • show that the processing is necessary to achieve it; and
  • balance it against the individual's interests, rights and freedoms.

The legitimate interests can be your own or those of third parties. They can include commercial interests (marketing), individual interests (risk assessments) or broader societal benefits. The processing must be necessary: if you can reasonably achieve the same result in another, less intrusive way, legitimate interests will not apply. You must balance your interests against the individual's: if they would not reasonably expect the processing, or if it would cause unjustified harm, their interests are likely to override your legitimate interests. Conducting such assessments for accountability purposes is now easier than ever, for example with TrustArc's Legitimate Interests Assessment (LIA) and Balancing Test, which identifies the benefits and risks of data processing, assigns numerical values to both sides of the scale, and uses conditional logic and back-end calculations to generate a full report on the use of legitimate interests at the business-process level.
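
To make the balancing idea concrete, here is a minimal sketch of a numerical balancing test. This illustrates the concept only; it is not TrustArc's actual LIA methodology, and the factors and 0-5 scores are hypothetical:

```python
# Hypothetical 0-5 scores for the controller's benefits and the risks to the
# individual; the heavier side of the scale decides the provisional outcome.
benefits = {"fraud_reduction": 4, "service_improvement": 3}
risks = {"reidentification_risk": 2, "unexpected_processing": 1}

benefit_score = sum(benefits.values())
risk_score = sum(risks.values())

if risk_score >= benefit_score:
    print("risks outweigh or equal benefits: LIP likely not appropriate")
else:
    print("benefits outweigh risks: LIP may be appropriate; document the LIA")
print(f"benefits={benefit_score}, risks={risk_score}")
```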

What are the benefits of choosing legitimate interest processing?

Because this basis is particularly flexible, it may be applicable in a wide range of situations, such as data science applications. It can also give you more ongoing control over long-term processing than consent, which an individual can withdraw at any time. Remember, though, that you still have to manage marketing opt-outs independently of whatever legal basis you use to store and process customer data.

It also promotes a risk-based approach to data compliance, as you need to think about the impact of your processing on individuals, which can help you identify risks and take appropriate safeguards. This can also support your obligation to ensure “data protection by design,” by performing risk assessments for re-identification and demonstrating the privacy controls applied to balance privacy against the need to retain the analytical value of the data in data science environments. This in turn contributes to your PIAs (Privacy Impact Assessments), which form part of your DPIA (Data Protection Impact Assessment) requirements and obligations.

LIP as a legal basis, if implemented correctly and supported by the correct organisational and technical controls, also provides a platform for data collaboration and data sharing. However, you may need to demonstrate that the data has been sufficiently de-identified, including by showing that re-identification risk assessments are performed not just on direct identifiers but on all indirect identifiers as well.
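
Re-identification risk over indirect identifiers is often quantified with measures such as k-anonymity. The sketch below is one common illustration of that idea, not any particular vendor's method; the records are hypothetical and real assessments use far richer techniques:

```python
from collections import Counter

# k-anonymity over indirect identifiers (quasi-identifiers): k is the size
# of the smallest group of records that are indistinguishable on those fields.
records = [
    {"zip": "94105", "age_band": "30-39", "gender": "F"},
    {"zip": "94105", "age_band": "30-39", "gender": "F"},
    {"zip": "94110", "age_band": "40-49", "gender": "M"},
]
quasi_identifiers = ("zip", "age_band", "gender")

groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
k = min(groups.values())
print(f"k = {k}")  # k == 1: at least one person is unique on these attributes
```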

Using LIP as a legal basis for processing may help you avoid bombarding people with unnecessary and unwelcome consent requests and can help avoid “consent fatigue.” It can also, if done properly, be an effective way of protecting the individual’s interests, especially when combined with clear privacy information and an upfront and continuing right to object to such processing. Lastly, using LIP not only gives you a legal framework to perform data science it also provides a platform that demonstrates the proper and ethical use of customer data, a topic and business objective of most boards of directors. 

About the Authors  

Darren Abernethy is Senior Counsel at TrustArc in San Francisco.  Darren provides product and legal advice for the company’s portfolio of consent, advertising, marketing and consumer-facing technology solutions, and concentrates on CCPA, GDPR, cross-border data transfers, digital ad tech and EMEA data protection matters. 

Ravi Pather of CryptoNumerics has spent the last 15 years helping large enterprises address data compliance requirements such as GDPR, PIPEDA, HIPAA, PCI DSS, data residency, data privacy and, more recently, CCPA. He has a strong working knowledge of helping large, global companies implement privacy compliance controls, particularly for the more complex secondary-purpose processing of customer data in data lake and warehouse environments.

CCPA Week Series Issue 4 – Training, Metrics, Verification and Minors

Last week, the California AG's office released proposed regulations implementing key provisions of the CCPA (“CCPA Regulations”). In the CCPA Regulations, the California Attorney General (AG) offered businesses clarifications, and in some cases new obligations, around consumers' individual rights requests under the CCPA. In Issue 1, we provided an overview of the themes addressed by the regulations. In Issue 2, we gave a recap of the requirements related to Notices to Consumers. In Issue 3, we provided best practices for handling consumer requests. In today's CCPA Week Issue, we address a range of topics: training, metrics, verification and minors.

Training

Among the proposed provisions of the CCPA Regulations are expanded requirements for training those individuals responsible for handling consumer inquiries regarding the business’s privacy practices or compliance with the CCPA. The CCPA Regulations expand the training scope from specific sections of the CCPA to all of the requirements of the CCPA and the CCPA Regulations. Additionally, businesses that buy, receive for commercial purposes, sell, or share for commercial purposes the personal information of 4,000,000 or more consumers must establish a training policy to govern the implementation of these training requirements.

Record-Keeping

The CCPA Regulations introduce record-keeping requirements for all businesses, and enhanced requirements for businesses that obtain, use, and share larger volumes of data for commercial purposes. Specifically, all businesses must maintain a record of consumer requests under the CCPA, as well as how they responded, for a period of 24 months. These records may not be used for any other purpose. Further, any business that “buys, receives for the business's commercial purposes, sells, or shares for commercial purposes, the personal information of 4,000,000 or more consumers” must maintain annual statistics on the number of requests from consumers (confirmation, deletion, opt-out) and the median response time. These statistics must be made publicly available and accessible from the business's privacy policy.
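
As a minimal, hypothetical sketch of those annual statistics, the snippet below counts requests by type and computes the median response time in days; the request-log structure and dates are illustrative only:

```python
from datetime import date
from statistics import median

# A hypothetical log of consumer requests with received/completed dates.
request_log = [
    {"type": "know",    "received": date(2020, 1, 6),  "completed": date(2020, 1, 20)},
    {"type": "delete",  "received": date(2020, 2, 3),  "completed": date(2020, 2, 28)},
    {"type": "opt_out", "received": date(2020, 2, 10), "completed": date(2020, 2, 12)},
]

counts = {}
response_days = []
for req in request_log:
    counts[req["type"]] = counts.get(req["type"], 0) + 1
    response_days.append((req["completed"] - req["received"]).days)

print("requests by type:", counts)
print("median response time (days):", median(response_days))
```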

Verification of Requests 

CCPA, similar to GDPR, requires businesses to take steps to verify the identity of the consumer making the request to ensure the consumer is indeed the consumer that the data pertains to and to protect against unauthorized access. Notably, the CCPA Regulations do not extend the verification requirement to requests to opt-out of the sale of personal information.

The CCPA Regulations establish general rules for verification, as well as verification rules for three specific scenarios: where consumers have a password-protected account, where consumers are not account-holders, and where a consumer uses an authorized agent to submit a request.

General Rules

Identity verification methods should be reasonable. Two primary approaches are provided: (1) match identity information with personal information already maintained by the business, or (2) use a third-party identity verification service. The proposed approach seeks to address concerns raised in recent months about the risks of responding to access requests by establishing six factors, such as the sensitivity of the data and the potential risk of harm to the consumer, for assessing which methodologies to use. Additionally, the rules establish purpose limitation and security controls to further mitigate the risks associated with verification, and clarify that the requirements for providing access to or deleting personal information do not apply to de-identified information.

Password-Protected Accounts

The approach for identity verification where the personal information is accessible via a password-protected account leverages established authentication practices, such as user authentication to access the account, re-authentication before deleting personal information, and additional checks where fraudulent or suspicious account activity is detected.

Non-Account Holders

The approach for non-account holders uses “degree of certainty” tests based on the risks associated with the request. For example, requests to know the categories of personal information require matching two data points, whereas requests to know specific pieces of personal information require matching three data points. Requests to delete data require a degree of certainty that correlates with the sensitivity of the data to be deleted. The proposed rules also address scenarios in which the requisite degree of certainty cannot be achieved: the business must provide an explanation to the requestor, and where identity verification is not possible across the broader customer base, this must be disclosed in the business's privacy policy and re-evaluated annually. A minimal sketch of this matching logic appears below.
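
This sketch illustrates the data-point matching idea under stated assumptions: the thresholds follow the text above (two points for categories, three for specific pieces), while the field names, values, and helper function are hypothetical:

```python
# Count how many requestor-supplied data points match records on file,
# then decide what level of disclosure that degree of certainty supports.
on_file = {"email": "pat@example.com", "zip": "94105", "phone": "555-0100"}

def matched_points(claimed: dict) -> int:
    return sum(1 for field, value in claimed.items() if on_file.get(field) == value)

claimed = {"email": "pat@example.com", "zip": "94105", "phone": "555-9999"}
points = matched_points(claimed)

if points >= 3:
    print("may disclose specific pieces of personal information")
elif points == 2:
    print("may disclose categories of personal information only")
else:
    print("verification failed: explain to the requestor and do not disclose")
```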

Using an Authorized Agent

The proposed rules address scenarios where one person may submit a request to know or a request to delete on behalf of another person. Where the person making the request on behalf of a consumer holds a power of attorney, those requirements will be honored. For other requests, an authorized agent may be appointed, and the business may require evidence of written permission from the consumer granting the agent this status, as well as verification of the consumer's own identity.

Special Rules for Minors

There is ongoing recognition that information collected from minors requires special protections, and the CCPA is no exception. The United States put protections in place in 1998 by passing the Children's Online Privacy Protection Act (COPPA), which requires verifiable parental consent prior to collecting personal information directly from children under age 13. Article 8 of the General Data Protection Regulation (GDPR) requires parental consent prior to processing personal information collected from children. While COPPA focuses on collection and the GDPR focuses on processing, the CCPA's requirements focus on the sale of children's data and on obtaining the appropriate opt-in consent from either the parent or the minor aged 13-16.

The CCPA Regulations provide specifications on how businesses should handle the two different groups of minors: those under 13 years of age and those 13-16 years of age.

Minors Under 13 Years of Age

In addition to meeting the applicable requirements under COPPA, the CCPA Regulations require additional steps beyond obtaining consent to determine that the person authorizing the sale of the child's personal information is the child's parent or guardian. Acceptable measures include mechanisms similar to those allowed under COPPA for obtaining verifiable parental consent, such as obtaining a signed consent form, using a credit card in conjunction with a transaction, communicating with trained personnel via a toll-free phone line or video conference, or verifying a government-issued ID. The general verification requirements outlined in Article 4 of the CCPA Regulations also apply. Upon receiving authorization from the parent, the business must inform the parent that they can opt out of the sale of the child's personal information at a later date and provide instructions for doing so. This is consistent with COPPA, which gives the parent the right to withdraw consent to the collection and further use of the child's information at any time.

Minors 13-16 Years of Age

If the business has actual knowledge it maintains personal information from children aged 13-16, then the business must establish, document, and implement a reasonable process for obtaining the minor’s opt-in consent for the sale of the minor’s personal information. The business must inform the minor that he or she may withdraw consent to the sale of their personal information at any time and include instructions for doing so.
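
Taken together, the two age groups suggest a simple routing rule for whose opt-in consent is required. This is a minimal sketch of that logic under the rules described above; the helper name and return strings are hypothetical:

```python
# Route opt-in consent for the sale of a minor's personal information
# by age group, per the under-13 and 13-16 rules described above.
def consent_required_from(age: int) -> str:
    if age < 13:
        return "parent or guardian (with verified parental relationship)"
    if age <= 16:
        return "the minor (reasonable, documented opt-in process)"
    return "no minor-specific opt-in; general CCPA rules apply"

for age in (9, 14, 21):
    print(age, "->", consent_required_from(age))
```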

Notices for Minors

The requirements related to the sale of minors' personal information under the CCPA Regulations also extend to the business's privacy policy. Businesses subject to these requirements must include in their privacy policies a description of how parents and minors can exercise their rights relating to the sale of the minor's personal information. Notably, for businesses that only target minors under 16 years of age and do not sell those minors' personal information without their opt-in consent, the CCPA Regulations clarify that a Notice of the Right to Opt Out of the Sale of Personal Information is not required.

To learn more, watch the webinar “Update Your CCPA Plan with Practical Insights into the Proposed Regulations, 2019 Amendments to the Law, and More.” 

This update was provided by the TrustArc Privacy Intelligence News and Insights Service, part of the TrustArc Platform.
