It was a busy but fantastic week for TrustArc in the Belgian capital at the annual International Association of Privacy Professionals Europe Data Protection Congress.
TrustArc began the conference with the announcement of its acquisition of privacy industry heavyweight Nymity. The companies have joined forces to accelerate development of the next generation of technology-driven privacy solutions. The news was received with overwhelming excitement by conference goers and news media, and will usher in incredible new content and product synergies for current and future customers.
In addition to countless conversations with friends old and new from organizations of all sizes, industries and geographies, as well as with officials from the public sector and regulatory bodies, TrustArc and Nymity also participated in the conference’s educational and information sharing efforts.
TrustArc SVP, Privacy Intelligence and General Counsel, Hilary Wandall, shone on the “Little Big Stage” where she unveiled the results of an IAPP-TrustArc survey report entitled “Measuring Privacy Operations in 2019.” The survey gauged what global privacy professionals, from organizations with fewer than 250 employees to those with more than 25,000, have done to meet the increasing data privacy compliance requirements to which their organizations are subject.
Alongside IAPP Research Director, Caitlin Fennessy, Hilary performed a deep dive for a standing-room-only audience, going over the report’s revealing findings and trends with respect to whether companies are adopting a single global privacy strategy (versus more regional or local implementations); what types of privacy impact assessments they conduct; how many privacy laws they currently must comply with; whether the companies have made any privacy-related operational changes within the last 12 months; and much more.
Nymity EU Operations and Strategy Director Paul Breitbarth participated on three separate panels during the conference. Paul stepped onto the Little Big Stage on Wednesday morning and explained how Nymity turns compliance data into knowledge for any team in an organisation. On Wednesday evening, Paul joined his fellow panelists to discuss “Using your Register of Processing Activities to Demonstrate Compliance.” The panel provided and examined real-world examples of the challenges faced by global organisations and what they do to overcome them. As Data Protection Congress was coming to a close, Paul joined the panel on “Artificial Intelligence: From Principles to Practice” and spoke on how companies get from overarching ethical principles to a robust legal framework.
Darren Abernethy, TrustArc Senior Counsel, also led a session entitled “Winning with Privacy: Implementing Consent and DSARs to Comply AND Win Customers.” The panel first overviewed the basics of digital advertising; then addressed the various “cookie”- and ePrivacy Directive-related guidance documents released in the last year, including by EU privacy regulators from the U.K., France, Germany and Spain; then discussed the role of third-party cookies and consent under the California Consumer Privacy Act, as well as the importance of website cookie audits (including to help make determinations as to “service provider” vs. “third party” status for vendors); and offered practical tips on how to set up compliant and responsive DSAR/individual rights programs within organizations.
The panel spent time showing real-world examples of how cookie consent and individual rights implementations look on actual digital properties “in the wild,” providing the audience with collective insights from the panelists’ extensive experience with exactly these matters, including their use of scalable, automated technology solutions across privacy programs to account for local variations in legal requirements.
If you would like copies of slides from the above presentations, or would like to discuss how TrustArc’s Cookie Consent Manager or Individual Rights Manager may be leveraged to facilitate your company’s privacy compliance and data value maximization, we welcome you to contact TrustArc at any time for more information.
Darren Abernethy, Senior Counsel TrustArc
Ravi Pather, VP Sales CryptoNumerics
The GDPR is not intended to be a compliance overhead for controllers and processors. It is intended to bring higher and consistent standards and processes for the secure treatment of personal data. It is fundamentally intended to protect the privacy rights of individuals. This could not be more true than in emerging data science, analytics, AI and ML environments, where, given the vast number of data sources involved, there is a higher risk of identifying an individual’s personal and sensitive information.
The GDPR requires that personal data be collected for “specified, explicit and legitimate purposes,” and also that a data controller must define a separate legal basis for each and every purpose for which, e.g., customer data is used. If a bank customer took out a bank loan, then the bank can only use the collected account data and transactional data for managing and processing that customer for the purpose of fulfilling its obligations for offering a bank loan. This is colloquially referred to as the “primary purpose” for which the data is collected. If the bank now wanted to re-use this data for any other purpose incompatible with or beyond the scope of the primary purpose, then this is referred to as a “secondary purpose” and will require a separate legal basis for each and every such secondary purpose.
For the avoidance of any doubt, if the bank wanted to use that customer’s data for profiling in a data science environment, then under GDPR the bank is required to document a legal basis for each and every separate purpose for which it stores and processes this customer’s data. So, for example, ‘cross sell and up sell’ is one purpose, while ‘customer segmentation’ is another and separate purpose. If relied upon as the lawful basis, consent must be freely given, specific, informed, and unambiguous, and an additional condition, such as explicit consent, is required when processing special categories of personal data, as described in GDPR Article 9. Additionally, in this example, the Loan division of the bank cannot share data with its credit card or mortgage divisions without the informed consent of the customer. This should not be confused with a further and separate legal basis available to the bank: processing necessary for compliance with a legal obligation to which the controller is subject (AML, fraud, risk, KYC, etc.).
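The per-purpose documentation requirement described above can be pictured as a simple processing register that maps each purpose to its documented legal basis. The purpose names, basis labels, and structure below are illustrative assumptions for this bank example, not a prescribed GDPR format or any vendor's schema:

```python
# Hypothetical per-purpose processing register for the bank example.
# Each entry records one purpose and the single legal basis documented
# for it; purposes and labels are illustrative assumptions only.
register = {
    "loan_management": ("contract", "Primary purpose: fulfilling the loan agreement"),
    "aml_screening": ("legal_obligation", "AML/KYC screening required by law"),
    "cross_sell": ("consent", "Secondary purpose: marketing additional products"),
    "customer_segmentation": ("legitimate_interests", "Secondary purpose: analytics, subject to balancing test"),
}

def lawful_basis_for(purpose):
    """Return the documented legal basis for a purpose, or None when no
    basis is recorded and the processing must not proceed."""
    entry = register.get(purpose)
    return entry[0] if entry else None
```

Under this model, a proposed use such as `lawful_basis_for("cross_sell")` resolves to `"consent"`, while an undocumented purpose like profiling returns `None`, flagging that a separate legal basis must be established first.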
The challenge arises when selecting a legal basis for secondary purpose processing in a data science environment as this needs to be a separate and specific legal basis for each and every purpose.
It quickly becomes an impractical exercise for the bank, and an annoying one for its customers, to attempt to obtain consent for each and every purpose in a data science use case. In any event, evidence shows that this approach yields a very low level of positive consent. Consent management under GDPR is also tightening up: blackmail clauses and general, ambiguous consent clauses will no longer be deemed acceptable.
GDPR offers controllers a more practical and flexible legal basis for exactly these scenarios and encourages controllers to raise their standards towards protecting the privacy of their customers especially in data science environments. Legitimate interests processing (LIP) is an often misunderstood legal basis under GDPR. This is in part because reliance on LIP may entail the use of additional technical and organizational controls to mitigate the possible impact or the risk of a given data processing on an individual. Depending on the processing involved, the sensitivity of the data, and the intended purpose, traditional tactical data security solutions such as encryption and hashing methods may not go far enough to mitigate the risk to individuals for the LIP balancing test to come out in favour of the controller’s identified legitimate interest.
If approached correctly, GDPR LIP can provide a framework with defined technical and organisational controls to support controllers’ use of customer data in data science, analytics, AI and ML applications legally. Without it, controllers may be more exposed to possible non-compliance with GDPR and the risks of legal actions as we are seeing in many high profile privacy-related lawsuits.
Legitimate Interests Processing is the most flexible lawful basis for secondary purpose processing of customer data, especially in data science use cases. But you cannot assume it will always be the most appropriate. It is likely to be most appropriate where you use an individual’s data in ways they would reasonably expect and which have a minimal privacy impact, or where there is a compelling justification for the processing.
If you choose to rely on GDPR LIP, you are taking on extra responsibility not only for, where needed, implementing technical and organisational controls to support and defend LIP compliance, but also for demonstrating the ethical and proper use of your customers’ data while fully respecting and protecting their privacy rights and interests. This extra responsibility may include implementing enterprise-class, fit-for-purpose systems and processes (not just paper-based processes). Automation-based privacy solutions such as CryptoNumerics CN-Protect, which offer a systems-based (Privacy by Design) risk assessment and scoring capability that detects the risk of re-identification, together with integrated privacy protection that retains the analytical value of the data while protecting the identity and privacy of the data subject, are available today as examples of the technical and organisational controls that support LIP.
Data controllers must first perform the GDPR three-part test to validate LIP as a legal basis. You need to:
- identify a legitimate interest;
- show that the processing is necessary to achieve it; and
- balance it against the individual’s interests, rights and freedoms.
The legitimate interests can be your own interests (controllers) or the interests of third parties (processors). They can include commercial interests (marketing), individual interests (risk assessments) or broader societal benefits. The processing must be necessary. If you can reasonably achieve the same result in another less intrusive way, legitimate interests will not apply. You must balance your interests against the individual’s. If they would not reasonably expect the processing, or if it would cause unjustified harm, their interests are likely to override your legitimate interests. Conducting such assessments for accountability purposes is now easier than ever, for example with TrustArc’s Legitimate Interests Assessment (LIA) and Balancing Test, which identifies the benefits and risks of data processing, assigns numerical values to both sides of the scale, and uses conditional logic and back-end calculations to generate a full report on the use of legitimate interests at the business process level.
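The numerical balancing described above can be sketched as a small scoring routine: weights on each side of the scale plus conditional logic for overriding factors. The factor names, weights, and override rule are illustrative assumptions, not the actual logic of TrustArc's LIA tool or any regulator-endorsed methodology:

```python
# Hypothetical legitimate interests balancing test. Weights and the
# override rule are illustrative assumptions, not a real product's logic.
BENEFIT_WEIGHTS = {"commercial_interest": 2, "fraud_prevention": 3, "societal_benefit": 3}
RISK_WEIGHTS = {"unexpected_processing": 4, "sensitive_data": 5, "potential_harm": 4}

def balancing_test(benefits, risks):
    """Score both sides of the scale and return a verdict."""
    benefit_score = sum(BENEFIT_WEIGHTS[b] for b in benefits)
    risk_score = sum(RISK_WEIGHTS[r] for r in risks)
    # Conditional logic: processing the individual would not reasonably
    # expect tips the scale against LIP regardless of the raw totals.
    basis_available = (
        "unexpected_processing" not in risks and benefit_score > risk_score
    )
    return {"basis_available": basis_available,
            "benefit": benefit_score, "risk": risk_score}
```

For instance, fraud prevention plus a societal benefit (score 6) against a moderated harm risk (score 4) would come out in favour of the controller, whereas a bare commercial interest against sensitive data would not.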
What are the benefits of choosing legitimate interest processing?
Because this basis is particularly flexible, it may be applicable in a wide range of different situations, such as data science applications. It can also give you more ongoing control over your long-term processing than consent, since an individual can withdraw consent at any time. Remember, though, that you still have to manage marketing opt-outs independently of whichever legal basis you use to store and process customer data.
It also promotes a risk-based approach to data compliance, as you need to think about the impact of your processing on individuals, which can help you identify risks and take appropriate safeguards. This can also support your obligation to ensure “data protection by design,” performing risk assessments for re-identification and demonstrating the privacy controls applied to balance privacy against the demand for retaining analytical value of the data in data science environments. This in turn contributes towards your PIAs (Privacy Impact Assessments), which form part of your DPIA (Data Protection Impact Assessment) requirements and obligations.
LIP as a legal basis, if implemented correctly and supported by the correct organisational and technical controls, also provides the platform to support data collaboration and data sharing. However, you may need to demonstrate that the data has been sufficiently de-identified, including by showing that the risk assessments for re-identification are performed not just on direct identifiers but also on all indirect identifiers as well.
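One common way to quantify the re-identification risk mentioned above is a k-anonymity check over the indirect (quasi-) identifiers. This is a minimal sketch under the assumption that the dataset fits in memory; production tools use considerably more sophisticated risk models, and the sample records are invented:

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Smallest equivalence-class size over the chosen quasi-identifiers.
    A low k means some individuals remain easy to re-identify by
    combining indirect identifiers, even after direct identifiers
    (names, account numbers) have been removed."""
    groups = Counter(
        tuple(record[q] for q in quasi_identifiers) for record in records
    )
    return min(groups.values())

# Illustrative records: direct identifiers already stripped; age and
# postcode remain as indirect identifiers.
records = [
    {"age": 34, "postcode": "B1", "balance": 1200},
    {"age": 34, "postcode": "B1", "balance": 880},
    {"age": 57, "postcode": "W2", "balance": 40},
]
```

Here `k_anonymity(records, ["age", "postcode"])` returns 1, because the third record is unique on those two indirect identifiers: exactly the kind of residual risk a demonstrable de-identification assessment needs to surface.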
Using LIP as a legal basis for processing may help you avoid bombarding people with unnecessary and unwelcome consent requests and can help avoid “consent fatigue.” It can also, if done properly, be an effective way of protecting the individual’s interests, especially when combined with clear privacy information and an upfront and continuing right to object to such processing. Lastly, using LIP not only gives you a legal framework to perform data science it also provides a platform that demonstrates the proper and ethical use of customer data, a topic and business objective of most boards of directors.
About the Authors
Darren Abernethy is Senior Counsel at TrustArc in San Francisco. Darren provides product and legal advice for the company’s portfolio of consent, advertising, marketing and consumer-facing technology solutions, and concentrates on CCPA, GDPR, cross-border data transfers, digital ad tech and EMEA data protection matters.
Ravi Pather of CryptoNumerics has spent the last 15 years helping large enterprises address data compliance regimes such as GDPR, PIPEDA, HIPAA, PCI/DSS, data residency, data privacy and, more recently, CCPA. He has extensive experience assisting large, global companies implement privacy compliance controls, particularly as they relate to more complex secondary-purpose processing of customer data in data lake and warehouse environments.
TrustArc regularly attends and hosts events around the world and online – please visit us at one or more of the following events.
IAPP CCPA Comprehensive Live 2019
The California Consumer Privacy Act will come into effect on January 1, 2020. That gives you very little time to get a lot of work done to comply with this sweeping legislation expected to carry harsh enforcement and fines.
The IAPP CCPA Comprehensive Live 2019, November 7 in New York, will provide practical, in-depth CCPA-specific training presented by IAPP experts that will help operationalize your commitment to CCPA compliance.
TrustArc will be sponsoring and exhibiting at this event. Stop by the TrustArc table to say hello!
> Learn more here
Privacy Insight Series Webinar
How to Comply with CCPA as Part of a Global Privacy Strategy
November 13 @ 9am PT | 12pm ET | 4pm GMT
With the EU General Data Protection Regulation (GDPR), the California Consumer Privacy Act (CCPA), and other laws such as the Brazilian General Data Protection Law (LGPD), businesses must be prepared to comply with a variety of laws around the world.
Privacy is a complex, multi-level concept that is now regulated in more than 130 countries under more than 500 privacy laws. To be successful in complying with so many laws, businesses must develop a multi-jurisdictional approach to privacy that is consistent and predictable, yet not one-size-fits-all.
This webinar will help answer questions like:
- Which privacy laws beyond the GDPR and CCPA do you need to be aware of?
- How do you manage data privacy to meet all applicable global requirements?
- How do you implement a customized multi-jurisdictional approach that addresses all applicable laws and regulations?
> Register here
IAPP Europe Data Protection Congress
In a year of enforcement action, fines and litigation, the Congress keeps your operation a step ahead.
Europe’s top event in data protection law and policy returns to Brussels, home of the IAPP’s European headquarters, 20-21 Nov. Privacy professionals will gather for wide-ranging discussions of strategic developments in regional and international data protection, plus training classes and a deep-dive workshop day preceding the main conference dates.
TrustArc Senior Counsel Darren Abernethy will be speaking on “Winning with Privacy: Implementing Consent and DSARs to Comply AND Win Customers” on 20 November at 17:00.
TrustArc will be sponsoring, speaking and exhibiting at this event. Stop by the TrustArc booth to say hello!
> Learn more here
Perhaps the only thing higher than temperatures this summer in the European Union is the level of regulatory attention being paid to data-driven advertising and website cookie practices (including similar tracking technologies within mobile applications and other non-browser environments, collectively referred to here as “cookies”). This TrustArc blog post summarizes the major announcements and publications regulators have issued over the last few weeks, including what is expected to follow, and how TrustArc helps.
UK ICO Report on Ad Tech, RTB and Privacy. First, the United Kingdom’s Information Commissioner’s Office (ICO) released on June 20th an “Update Report Into Adtech and Real Time Bidding,” which concluded that advertising technology-related entities and those involved in real time bidding (RTB) should reassess their privacy notices, lawful processing bases, and personal data uses and sharing in light of the GDPR, as many have not yet done so. The ICO is in the midst of evaluating practices within the advertising industry, in keeping with the view announced in its 2018-2021 Technology Strategy that web and cross-device tracking is one of its three “priority areas” for the current period. The report’s findings:
- pointed out deficiencies in publishers’ transparency practices, such as not specifically naming third party recipients of personal data collected on the basis of consent;
- adjudged that “special categories” of personal data included in targeted programmatic auction bid requests (e.g., inferred ethnic, health, sexual orientation or political audience segments associated with specific cookie or other unique identifiers bid on by advertisers) are regularly being processed unlawfully by ad tech companies due to failure to obtain explicit consent from data subjects;
- clarified that consent (rather than legitimate interests) is not only required for the placement or accessing of cookies or similar tracking technologies on an end user’s device (under the U.K.’s PECR rules implementing the EU’s “ePrivacy” Directive), but is also generally the appropriate lawful processing basis for the real-time bidding transactions that underpin the programmatic auctions between buyers and sellers of ad spaces for targeted advertising; and
- noted that “the ICO has published [pursuant to GDPR Article 35(4)] a list of processing operations likely to result in…high risk, for which [Data Protection Impact Assessments] are mandatory, [and] RTB matches a number of examples on this list,” resulting in the conclusion that RTB-involved “organizations are therefore legally required to perform DPIAs.”
The ICO’s report identified areas where it has concerns and expects to see changes, but it also articulated a recognition that the ad tech sector is “an extremely complex environment” that does not change overnight. With this in mind, the ICO indicated that it seeks to “take a measured and iterative approach, before undertaking a further industry review in six months’ time.”
>> Download TrustArc Cookie Consent Privacy Advisory now for free!
CNIL’s Change of Consent Interpretation and Timeline. Next, the French privacy regulator, the CNIL, announced on June 28th that in light of a rise in complaints and requests related to online marketing, it has devised an action plan for the next year making “targeted online advertising a priority topic for 2019.” Part of this plan will be the release this month of new guidelines that will rescind the CNIL’s 2013 interpretation that continued navigation of a website could be understood as an expression of an end user’s consent to the placement of website cookies or similar tracking technologies. The CNIL indicated that it will give stakeholders a transitional period of 12 months during which “scrolling down, browsing or swiping through a website or application will still be considered by the CNIL as acceptable.” Still, the CNIL will regularly investigate matters of transparency, withdrawal of consent, security obligations and more, including instances when cookies are impermissibly set before consent is collected for ePrivacy purposes. The CNIL’s calendar lists its tentative schedule for cookie-related matters as follows:
- May – June 2019: Update of the CNIL standards to align with the GDPR (i.e., update of the CNIL’s 2013 interpretation of consent for cookies);
- June – Sept 2019: Stakeholder working group to test the operational consistency of the guidelines;
- November 2019: Results of work
- End of 2019 – Early 2020: Publication of new guidelines for cookies
- June – July 2020: End of the grace period, entities must comply with the rules of the new guidelines.
Cookie Consent and Transparency. The ICO’s guidance confirms that if using cookies, the operator of an online service must inform users of what cookies will be set, explain what the cookies do, and obtain consent to storing cookies on a device before doing so. Moreover, if using any third party cookies, the operator must clearly and specifically name who the third parties are and explain what they will do with the information. Exempted from these requirements are cookies needed to transmit a communication over an electronic communications network, as well as cookies that are “strictly necessary” to provide a service or site requested by the user.
Lawful Processing Basis. Whereas PECR addresses the storing or accessing of information on users’ browsers and devices by requiring consent as a prerequisite to doing so, the GDPR (and its six possible lawful processing bases under Article 6) governs the processing of any personal data gained from cookies. In its guidance, the ICO recognizes that “it may be possible to rely on an alternative lawful basis for subsequent processing beyond the setting of any cookies,” but separately states that, “trying to apply another lawful basis such as legitimate interests when you already have GDPR-compliant consent would be an entirely unnecessary exercise, and would cause confusion for your users.”
The regulator also noted that any data processing involving analyzing or predicting preferences or behavior, or tracking and profiling for direct marketing and advertising purposes, will in most cases require consent as the lawful processing basis. Also confirmed is that “consent is necessary for first-party analytics cookies, even though they might not appear to be as intrusive as others that might track a user across multiple sites or devices,” although the ICO concedes that the setting of a first party analytics cookie “results in a low level of intrusiveness and low risk of harm to individuals,” and that “it is unlikely that priority for any formal action would be given” to such instances.
Cookie Audits and Banners. The ICO also emphasizes the utility of performing comprehensive “cookie audits” to detail what cookies are being used on a site and to discern which of them comprise “strictly necessary” first and third party cookies versus those which do not. The guidance likewise addresses forms of notice and means of consent, including prominently displayed cookie banners that provide clear information about cookies and user control options to allow or disallow those that are non-essential.
It further notes that the blanket use of “cookie walls,” which require users to agree or accept the setting of non-strictly necessary cookies before the user can access the rest of the site’s content, will generally amount to invalid consent because the user lacks a genuine choice other than to acquiesce in order to use the site. Lastly, the ICO declined to specify how often consent should be obtained from users, noting that this is dependent on a number of factors such as frequency of visitors or updates of content or functionality.
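A first pass at the cookie audit the ICO describes can be as simple as classifying a site's cookie inventory into the consent-exempt "strictly necessary" bucket and everything else. The cookie names and the exempt set below are hypothetical examples for illustration, not a definitive taxonomy:

```python
# Hypothetical cookie audit sketch. Cookie names and the exempt set are
# illustrative assumptions; real audits must assess each cookie's
# actual function against the PECR "strictly necessary" exemption.
STRICTLY_NECESSARY = {"session", "load_balancer", "csrf_token"}

inventory = [
    {"name": "session", "party": "first"},
    {"name": "csrf_token", "party": "first"},
    {"name": "analytics_id", "party": "first"},
    {"name": "ad_tracker", "party": "third"},
]

def audit(cookies):
    """Split a cookie inventory into consent-exempt and
    consent-required lists by name."""
    exempt = [c["name"] for c in cookies if c["name"] in STRICTLY_NECESSARY]
    needs_consent = [c["name"] for c in cookies if c["name"] not in STRICTLY_NECESSARY]
    return exempt, needs_consent
```

Note that, consistent with the ICO's position above, the first-party analytics cookie lands on the consent-required side: being first-party, or low-risk, does not by itself make a cookie "strictly necessary."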
How TrustArc Helps
TrustArc offers the leading technology solutions in the cookie consent space with our Website Monitoring Manager and Cookie Consent Manager.
For product demonstrations or more information on how we can help your organization, contact TrustArc today!
As previously described on the TrustArc Blog (“Privacy Shield Approaching Its 3 Year Anniversary”), the European Union (EU)-U.S. Privacy Shield Framework has received two successive annual approvals from the European Commission (EC) since its July 2016 adoption, and currently serves as an EU-to-U.S. personal data transfer mechanism for more than 4,700 U.S. organizations.
Separately, pre-approved standard contractual clauses (SCCs), the most recent version of which was issued in 2010, are also recognized by the EC as valid transfer mechanisms to non-European Economic Area “third countries.” On June 13th, the European Commissioner for Justice and Consumers confirmed in a speech that SCCs are in the process of being updated for the post-GDPR world: “We are already working to modernise standard contractual clauses. This will make it easier for companies to share data when they contract processing services, within the EU or abroad.”
This update to SCCs is occurring concurrently with a legal action challenging the validity of SCCs as a transfer mechanism to the United States, in a case brought against Facebook Ireland by Austrian privacy advocate Maximillian Schrems. The case, dubbed Schrems II, proceeds to oral arguments before the ECJ on July 9th. Its name follows the 2015 decision of the European Court of Justice (ECJ) that invalidated the EU-U.S. Safe Harbor Agreement on grounds that it did not provide EU citizens with protections “essentially equivalent” to those of the EU, given U.S. intelligence agencies’ surveillance practices, and that any EU-to-U.S. personal data transfers made on that basis were therefore not lawful. In this case, the Irish High Court has referred eleven questions to the ECJ relating to whether entering into SCCs, by itself, provides an adequate level of data protection for EU personal data transferred to the U.S. The Irish Supreme Court recently dismissed Facebook’s appeal of the Irish High Court’s decision to refer these items to the ECJ.
Meanwhile, the EU-U.S. Privacy Shield Framework is similarly undergoing a legal challenge on grounds that the United States does not adequately protect EU citizens’ personal data by virtue of U.S. intelligence agencies’ activities. The case, brought by three French non-governmental organizations, seeks to revoke Privacy Shield as a valid EU-to-U.S. personal data transfer mechanism as occurred with Safe Harbor in Schrems I. On July 1-2, the NGOs will argue before the General Court of the EU that Privacy Shield is not “essentially equivalent” to EU data protection law, even if it is more protective than Safe Harbor was. The losing party in this matter could then appeal to the ECJ for a final determination.
Decisions in both matters are expected within a year or less. It is unclear what effect, if any, the entry into force of new European Commission-approved SCCs would have on the ripeness of the case if introduced prior to the ECJ’s Schrems II ruling. Moreover, in the event the ECJ were to eventually invalidate both SCCs and Privacy Shield (the latter of which was specifically drafted by EU and U.S. officials to withstand judicial scrutiny), it is uncertain what course of action most organizations, small and medium-sized enterprises in particular, would undertake to effectuate their data transfers. With binding corporate rules (BCRs) and reliance on derogations such as explicit consent for cross-border data transfers being expensive, time-consuming or disfavored options for many businesses, it remains to be seen what effect such legal actions would have on digital commerce in practice (including with respect to data transfers to the U.K., in the event of an eventual “Brexit”). TrustArc will continue to follow developments closely and will provide regular updates.
This update was provided by the TrustArc Privacy Intelligence News and Insights Service, part of the TrustArc Platform. To learn how you can get full access to the daily newsfeed, contact us today!
May 25, 2019 marked the one year anniversary of the EU General Data Protection Regulation enforcement deadline. In the last twelve months, companies across the globe have been working diligently to achieve and maintain compliance under the regulation. The GDPR significantly increased the requirements on how businesses address consumer individual rights. Companies have been tasked with putting processes and systems in place in order to receive, escalate, and accommodate consumer requests. Failure to comply with the GDPR can result in fines, loss of reputation, and expenses associated with responding to any compliance investigations. During the IAPP Global Privacy Summit in DC earlier this month, Ireland’s Data Protection Commissioner, Helen Dixon, shared that over 6,000 complaints have been lodged since May 25, 2018, and eighteen large scale investigations are underway and will be reviewed by the European Data Protection Board this summer.
TrustArc has announced new findings from an online study conducted by Ipsos MORI, a global research and consulting firm, on behalf of TrustArc. The survey polled individuals aged 16-75 in the UK about a number of issues surrounding the GDPR one year since it went into effect on 25 May 2018.
A summary of the key findings follows.
- Trusting Companies With Personal Data Is Increasing
36% of respondents trust companies and organisations with their personal data more since the GDPR privacy regulation came into effect one year ago (rising to 44% among 16 to 24-year-olds and 41% among 25-34 year olds). Only one-third or less of those aged 35-44, 45-54, and 55-75 report being more trusting. Women, at 38%, expressed more trust in companies and organisations than men at 33%.
- Understanding GDPR Compliance Is Challenging
25% of respondents are confident they can tell if a company or organisation is GDPR compliant versus 33% who are not confident. There were some differences based on geographic location with respondents in Northern Ireland indicating they are confident at 33%, which was significantly higher than those in Wales at 17%.
- Privacy Certifications Can Influence Behavior
57% of respondents would be more likely to use websites that have a certification mark or seal to demonstrate GDPR compliance, versus 9% who would not. There were notable differences based on geographic location; respondents in Scotland were significantly more likely to agree (67%) than those in England.
56% are more likely to do business with companies and organisations that have a certification mark or seal to demonstrate GDPR compliance, versus 8% who disagree. Agreement is higher among households with larger incomes (65%).
- Young Adults Are Most Positive About How Well GDPR Enforcement Has Worked
34% of respondents agree that the regulatory enforcement of the GDPR privacy regulation has worked well versus 14% who disagree this is the case. There were significant differences based on age with younger respondents more likely than older respondents to agree this regulation has worked well (46% of 16 to 24-year-olds agree compared with 28% of those aged 55-75).
- Respondents Are Exercising GDPR Privacy Rights
47% of respondents have exercised their GDPR privacy rights by sending one or more of eight requests to a website, company or organisation. When asked which of these eight rights respondents had exercised in the past 12 months, the results were:
- Opting out of/unsubscribing from email marketing: 35%
- Opting out of/not consenting to install cookies: 23%
- Restricting use of personal data: 13%
- Erasing personal data: 10%
- Correcting personal data: 6%
- Requesting access to personal data: 5%
- Requesting transfer of personal data: 3%
- Making a privacy complaint to a regulator: 3%
There were significant differences based on respondent gender with 52% of females and 42% of males exercising their rights in this respect. 43% of respondents claimed not to have exercised these privacy rights in the past 12 months. Exercising these rights increases progressively with age: 32% of 16-24-year-olds, 33% of 25-34 year-olds, 45% of 35-44 year-olds, 46% of 45-54 year-olds, and 52% of 55-75 year-olds.
For more information, download the research infographic at GDPR Consumer Research Infographic and a summary of the survey questions and responses at GDPR Consumer Research Survey.
On behalf of TrustArc, Ipsos MORI interviewed 2,230 adults aged 16-75 across the United Kingdom between 15-17 May 2019. Interviews were carried out online on Ipsos MORI’s i:Omnibus. A quota sample of respondents was interviewed, with quotas set by age, gender, and geographic region of residence. The final survey data were weighted to the known population of this audience at the analysis stage.
TrustArc can help with all phases of GDPR compliance – from building a plan to implementing processes and controls to demonstrating and managing ongoing compliance. Learn more here.