In April 2017 we convened a workshop of stakeholders from across academia, business and regulation to consider developments and directions related to "consentful" use of personal data. This report summarises and synthesises our discussions from those two days as a means of encouraging conversation and collaboration, and acting as a roadmap for the development of a digital economy that puts individuals in control.
Five Pillars

Five 'pillars' are crucial to the development of a consentful digital economy; developments are required in all five, and they reinforce (or weaken) one another.
- Law & Regulation: Regulation is a major driver for industrial change. Not surprisingly, the EU’s General Data Protection Regulation dominated discussion in this area. Future law and regulation should be informed by a deeper scientific understanding of consent and behaviour, as some of the assumptions made in the past have led to unsuccessful attempts to deploy "consent", such as the much-maligned 'cookie law'.
- Citizens: Consent should put citizens in control of what their personal data is, or isn’t, used for. The attitudes, knowledge and beliefs of citizens will affect how information and choices are delivered, and even whether there is any demand for more consentful products and services. Citizens' attention is often captured by failures of security or consent, but it does not necessarily follow that attention translates into increased literacy and capability that would aid citizens in making decisions about their personal data.
- Business: Personal data is a multi-billion euro business. Consent has to make business sense in order to be effective, and businesses will need to dedicate effort to implementing more consentful interactions with their customers, perhaps even shifting to new business models entirely. Businesses are currently focussed on complying with the GDPR, often using the advice and guidelines produced by data protection regulators, but there is considerable opportunity to use "consent" - meaningful transparency and control - as a means of improving consumer trust and enabling greater use of personal data for commercial and social benefit.
- Consent Science: Prior attempts at consent, such as the much-maligned “cookie law”, have shown that consent is non-trivial to design and implement. The science of how people think, reason, and process information needs to be combined with a principled understanding of what consent means, in order to provide methods and evidence for judging what is working and what is not, and for challenging the considerable received wisdom that pervades the consent space. Better evaluation tools, and more unified theory, could translate into more effective engineering practices, unlocking innovation in a space that is currently dominated by lawyers rather than designers.
- Technology: The widespread adoption of consent depends on the availability of tools and technology for collecting, querying and changing consent. Particularly in novel scenarios like the Internet of Things, an infrastructure for consent is required to inform citizens and enact their preferences in the digital environment around them. Deeper scientific understanding is required to move beyond current consent patterns, which usually assume that a screen is present and citizens are actively engaged in using a device.
The concept of consent is well-established in many areas of law and ethics. Intuitively, consent provides people with the opportunity to exercise control over important aspects of their life or personhood, for instance when undergoing medical procedures. Consent is now recognised as an important element of privacy and data protection, notably in the European Union’s General Data Protection Regulation (GDPR).
However, efforts to build consent points (meaningful consent interactions between data subjects and data controllers) have been unsuccessful. The Meaningful Consent in the Digital Economy (MCDE) Project was established in 2014 to tackle the issue of consent head-on. In April 2017 we convened our third workshop, bringing policy makers, data controllers, researchers, consumer advocates, and the emerging consent technology industry together to ask, simply: how do we translate our notions of consent - from within MCDE and beyond - into real changes in how the digital economy operates? This report distills our discussions into something that, we hope, will facilitate collaboration between different stakeholders, and catalyse the development of a more ‘consentful’ digital economy in the United Kingdom, Europe, and beyond.
Consent, as a regulatory concept in data protection, has some benefits. It is the only mechanism that allows consumer preference to be directly exercised by data subjects, and so in theory exposes data handling practices to market forces. It sidesteps the need for regulators to make decisions about specific data uses, and the need to introduce many pieces of sector-specific legislation, by deferring decisions about what is allowed, and what is not, to citizens themselves.
Of course, consent also has some drawbacks. In practice, it has been hard to implement in a way that genuinely gives people an understanding of, and choice about, how their personal data is used. Consent is more frequently a source of annoyance to service users than a source of empowerment.
We cannot - nor should we - give up on the idea of consent, though. For two reasons: First, consent is part of the law, and will be for the foreseeable future. The fact that consent can be used to justify the processing of people’s personal data gives all of us, as citizens, a stake in making the best of it. Second, consent is fundamentally about individuals having an understanding of the world around them, and a choice about which aspects of it they engage in. Broadly and meaningfully construed, consent is the embodiment of liberal values like personal autonomy and empowerment. In reading this document, we urge a focus on consent in broad terms, as the embodiment of genuine understanding and choice for citizens of a digital world, and not just as it is construed as a legal concept or as a lawful basis for data processing.
Fundamentally, when we talk of a consentful digital economy, we mean a digital world in which citizens understand how their personal data is, or will be, used; and where they get to choose what happens to it.
Based on the outputs from our workshop, we identified five ‘pillars’ of a consentful economy. These pillars represent elements that are required in order for effective and meaningful consent to become widespread within the digital economy. Developments in one pillar will be more effective in conjunction with developments in the others, and efforts in one pillar may be hampered or entirely ineffective without adequate developments in the other pillars.
The pillars are:
- Law & Regulation: Regulation is a major driver for industrial change. Not surprisingly, the EU’s General Data Protection Regulation dominated discussion in this area.
- Citizens: Consent should put citizens in control of what their personal data is, or isn’t, used for. The attitudes, knowledge and beliefs of citizens will affect how information and choices are delivered, and even whether there is any demand for more consentful products and services.
- Business: Personal data is a multi-billion euro business. Consent has to make business sense in order to be effective, and businesses will need to dedicate effort to implementing more consentful interactions with their customers, perhaps even shifting to new business models entirely.
- Consent Science: Prior attempts at consent, such as the much-maligned “cookie law”, have shown that consent is non-trivial to design and implement. The science of how people think, reason, and process information needs to be combined with a principled understanding of what consent means, in order to provide methods and evidence for judging what is working and what is not, and for challenging the considerable received wisdom that pervades the consent space.
- Technology: The widespread adoption of consent depends on the availability of tools and technology for collecting, querying and changing consent.
Each pillar is explained in greater detail in the respective section.
Law & Regulation
The biggest factor in regulation is currently the EU’s General Data Protection Regulation (GDPR). The requirement to be able to demonstrate consent, and for data subjects to be able to revoke their consent, is a major factor driving the development of consent management systems in industry.
However, the translation by national Data Protection Authorities (DPAs) of the GDPR into guidelines that industry - particularly SMEs that do not have dedicated legal resources to draw upon - can follow is also an important aspect of regulation. The ICO is currently drafting its own advice in this area.
Brexit is something of a wildcard in this area; although most people expect the UK to adopt the GDPR largely unchanged via the Great Repeal Act, there is the possibility that UK and EU law will start to diverge, despite the importance of the UK maintaining “adequacy” for inbound data transfers from the EU. Weakening consent requirements in the UK could hamper efforts to create a more consentful domestic digital economy, but many UK companies will still need to comply with the rules in the EU and so will benefit from consent technology. The ICO may have more freedom to support and encourage innovation in consent interactions if it is not bound by the precedent set by other EU DPAs.
The ePrivacy regulation, currently being drafted by the commission, will also have a major impact on the use of consent within the telecommunications industry, particularly as it is expected to bring “over-the-top” service providers (like WhatsApp or Snapchat) into scope for the first time.
The concept of voluntariness remains an unresolved issue in regulation; some participants interpreted the ICO’s draft guidance as being too restrictive over the benefits that may be offered to data subjects as a result of giving consent. Determining what is technically “necessary” in undertaking the activity that consent has been given for is not straightforward, and a focus on technical necessity ignores what could be equally pressing economic or business reasons for requesting consent to process data. Voluntariness could benefit from additional research about citizens’ expectations and attitudes to help establish what is and isn’t genuinely voluntary.
There is little established case law relevant to consent in a data protection context, although consent is a legal and ethical concept that is well established in, for instance, medical case law. There is the potential for a “coalition of the willing”, which could include lawyers and consumer advocates, to bring test cases with the goal of establishing case law relating to consent. What these precedents might be remains an open question. There is a balance to be struck between providing clarity and direction through legal rulings, and intervening to establish precedent before aspects of consent in practice are sufficiently well understood.
Current and proposed regulations and guidance do not directly recognise consent metrics, nor the possibility of semi-automated consent, and may be drafted in such a way as to preclude the latter. Although a consent metric might relatively easily be incorporated into guidance and enforcement, the ability for data processors to rely on consent that is provided via automated means (such as personal software agents) requires further legal clarification. [See the Consent Science and Technology sections for more information on these developments.]
Some participants mentioned other regulations, such as eIDAS and PSD2; although these are of major importance to some industries that will also be involved in consent-based data processing, we identified no major consent-specific implications from them.
Business

Personal data processing is already a major commercial activity that underpins many of the digital services that citizens rely on, improves public services, and supports jobs. It is also an area that has considerable growth potential. Building a digital economy where consent is more deeply embedded will require businesses to change how they approach personal data processing.
There are economic, business-model and regulatory opportunities to encourage organisations to put greater emphasis on consent - on citizens’ wishes - in their approach to personal data processing.
Regulation was identified as a major driver for organisational change, and the GDPR has provided the opportunity (and necessity) for organisations to re-evaluate their use of personal data. Larger fines, or more frequent enforcement, could provide an economic case for adopting consent-based processing where legally required; but it remains to be seen whether this becomes a major concern in practice.
One area where regulation may help to shift organisational thinking is the requirement for data controllers to appoint a data protection officer (DPO). Although not a board-level appointment, a DPO provides an internal advocate for data protection principles, including (where relevant) consent.
Consent may also provide a business advantage. Two broad opportunities were discussed:
- First, consent may be the only legal basis available for processing some data and realising the value that can be derived from it. In that sense, consent may make new income streams, or even new business models, viable. Conversely, tightening of the rules around consent (for instance via the ePrivacy regulation) could disrupt some existing business models, for instance targeted-advertising based on third-party tracking.
- Second, consent may be a competitive advantage in terms of consumer trust and the associated willingness to engage in business and to share valuable personal data. Some of this benefit may be a result of increased trust, and less resistance to new products and services. Other benefit may be derived as brand value, more akin to the ethical position indicated by a Fair Trade mark.
What about government data users?
Like business, governments make frequent use of personal data to deliver services and make decisions. The statutory basis for that service provision, and the power that governments have over individuals mean that - as the law currently stands - there is less scope for consent as a legal basis for data processing. One cannot meaningfully withhold consent from the government for processing related to taxation, for instance.
However, governments do engage in some data-handling where citizen participation is voluntary, and where seeking consent might be justified and beneficial. One UK example would be the use of NHS medical records for research purposes, where concerns over consent and the uses that data could be put to eventually led to the care.data scheme being scrapped.
A shift towards smart cities also suggests new opportunities for the valid use of consent by government. A smart city, operated or commissioned by local government, has the ability to collect a great deal of personal data about citizens in the course of their daily lives with the promise of delivering better transportation, facilities and public services. However, such data collection raises questions about what limits should be placed on government. In balancing public-interest data collection with the rights of the individual, consent may be a useful mechanism to employ. In this sense, just like private business, government may have recourse to consent as a legal and ethical basis for processing data about citizens in order to undertake activities that would otherwise be impossible on the grounds of individuals’ right to privacy; provided that other factors relevant to voluntariness, such as the universal provision of public services irrespective of consent to personal data processing, are respected.
Some workshop participants suggested that the current lack of consent to personal data use is a result of the structure of business-customer relationships, where businesses stockpile information about customers, and potential customers, in order to deliver advertising and inform them about products and services that are available. Businesses sit in the middle of a network of customers, managing the relationship through their CRM systems.
An alternative model, vendor relationship management (VRM), places citizens in charge, allowing them to enact their own agency in the marketplace. This includes the ability to control and permit the use of personal data, and to assert intentions in ways that can be understood and respected.
Citizens

Citizens are a pillar of the consentful economy because, ultimately, their beliefs and attitudes influence their own behaviour, the businesses and products that they voluntarily engage with, and the consent that they are willing to give. By citizens we mean individuals: people who are acting in their private capacity, rather than on behalf of a business. Our notion of a citizen is therefore very similar to those who might be considered consumers in a business context, or data subjects in the world of data protection.
Consent - meaningfully implemented - is predicted to increase citizens’ trust and feelings of empowerment, leading to better B2C relationships and greater willingness to share data and engage with data-driven products and services. In that sense, consent has a role in enabling and expanding data processing. Conversely, consent also provides a mechanism by which consumers can make choices that reflect their preferences and attitudes and so exert greater market pressure to constrain data-processing practices.
It appears, prima facie, that a more consentful digital world puts more onus on citizens themselves: to engage with the consent decisions that they are asked to make, and to spend some effort on translating their own preferences and beliefs into an eventual choice. The successful deployment of consent therefore requires (at least) two things from citizens: first, the motivation to engage with the consent choices that they are asked to make; and second, the capability to engage with those choices and make decisions.
Motivation refers to the desire to engage with consent, and the feeling that doing so provides some value. Motivation is perhaps most linked to attitudes towards privacy, and beliefs about how personal data is, or could be, used and misused.
Capability refers to having the knowledge and skills that are necessary to make a reasoned consent decision having decided to do so.
Citizens’ experiences will depend hugely on how government, business, regulation and technology develop. We identified some key areas that might catalyse change, though:
The UK rollout of smart utility meters was identified as a major public infrastructure project that has the potential to draw consumer attention to data use (or misuse) and to act as a catalyst to increase motivation. In addition to supporting easier billing by energy companies, smart meters allow greater insight into household energy use thanks to a greater reporting frequency (measurements can be collected daily or hourly, rather than monthly or quarterly as is common with conventional meters). People may be surprised to find that information about household patterns or occupancy can be inferred from this data, and there is the potential for misuse either by energy companies, or by third parties.
Consumer Services and Devices
There have been several examples of consumer devices that received criticism (and associated reputational damage) as a result of inattention to user consent. For instance, Samsung was criticised for not informing consumers that their Smart TVs could be recording conversations and sending them to a third-party company for processing.
Toys are an example of consumer devices that combine several privacy risk factors: vulnerable users (children), an intimate context (the home), and experimentation with “connected” devices. In 2017, the German government recommended the destruction of the “My Friend Cayla” doll, over concerns that its combination of microphone and internet connection rendered it an illegal covert surveillance device. Around the same time, the maker of “CloudPets” stuffed toys exposed two million voice recordings collected by its “creepy IoT teddy bear”. Although these concerns could be framed as security problems, they highlight the potential for reputational damage to brands that arises from breaking the implicit trust that consumers put in technology providers: the parents who purchased CloudPets had presumably assumed that appropriate security measures were in place, and it is the lack of transparency that transforms Cayla from an innocent connected toy into an illegal covert surveillance device, precluding her companions from exercising meaningful choice or control by hiding invasive functionality.
Serendipitous failures that garner people’s attention are not the only thing that can - or should - shift attitudes, knowledge and beliefs. Education and advocacy are other factors of this pillar that hold considerable promise in making consent a more viable mechanism for regulating the use of personal data.
Education

We construe education broadly, as any activity that deliberately helps people to acquire the knowledge and skills that are required to make meaningful consent decisions. There are challenges to using formal education as the route for delivering these skills, though. Formal education won’t reach most of the current population; educating those below the age of digital consent may take many years to have any impact on their actual decisions; and very rapid technological change means it is unclear that what is taught today would remain relevant.
Advocacy & Lobbying
Consumer-advocacy organisations also have an important role to play. Part of this role could be in providing information and education, but consumer rights organisations also have the potential to influence the ways in which consent is sought, to help set expectations about things like voluntariness, and to lobby for regulatory enforcement against particular practices. Groups like Which? already provide some information about digital consumer rights.
Not all advocacy takes the form of consumer rights, though. Groups like the Electronic Frontier Foundation and the UK’s No2ID have taken positions on consent-related issues such as privacy and digital rights. In many cases, such as No2ID, groups coalesce around a specific issue; and there could be scope to build on those efforts through greater collaboration - either across groups, or to reach wider audiences of less-engaged consumers.
Alternative advocacy models were also discussed. For instance, a unionised model could allow data subjects to grant or withhold consent en masse as a means of obtaining more favourable data-handling practices from data controllers. In this model, decisions about what an individual is happy to consent to might be partly or wholly delegated to a trusted third party who can advocate and negotiate on their behalf; perhaps balloting for a “data strike” in response to unfavourable changes in personal data use.
Technology

The degree to which personal data processing is underpinned by technology demands a technology response to the new challenge of consent. Existing technology needs to be changed, and new technology needs to be created, to help weave citizens’ consent (where appropriate) into the technology that is processing their personal data.
Driven largely by the GDPR, there is an emerging set of consent infrastructure components. We define consent infrastructure as technology that helps to transmit, store, query and modify consent signals or records.
The two most relevant regulatory requirements from GDPR are that consent be revocable (suggesting a need for systems that assist with withdrawing consent, as well as giving it) and demonstrable (suggesting more detailed records of how and when consent was obtained).
There are currently examples of both end-to-end systems, that undertake a number of related tasks (such as Consentua), and efforts to standardise specific elements of the infrastructure (such as Consent Receipts).
We have identified several major functions for a consent infrastructure to support:
- Collection: Infrastructure must support the collection of consent signals from data subjects, as the result of “consent interactions”. These interactions must be construed broadly, as consent may be collected through a range of modalities, including paper forms, telephone conversations, on-screen interactions and in future “screenless” Internet-of-Things interactions.
- Storage: Consent signals must be stored in some way, in enough detail to demonstrate regulatory compliance and facilitate decision-making. Specifications such as Consent Receipts and COEL have begun to specify and standardise these records.
- Querying: In order to allow data processing to respond to consent in real-time, it is necessary for the infrastructure to support frequent querying of consent records. In this way, data processing systems (such as bulk mailing or data mining) can check that consent exists prior to carrying out processing on a subject’s data. A higher-level analytics capability to help data controllers to identify trends in consent may help organisations to be more responsive to the privacy (or other) preferences that are signalled via data subjects’ consent decisions.
- Updating: A data subject’s consent must be revocable, and so consent records must be mutable. A mature consent infrastructure would allow citizens to bring together consent records from across services into applications such as dashboards; helping to simplify the task of managing the ways their personal data is used.
- Identification: The ability to identify - and in many cases authenticate - the data subject is a requirement in order to allow mutation and revocation of consent. Consent infrastructure therefore has some dependency on identity infrastructure, although in many cases this will be lightweight, and some applications may require the ability to authenticate that a change is being made by the data subject but retain only a pseudonymous or non-canonical identifier.
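The functions above can be sketched in code. The following is a minimal, illustrative Python sketch of a consent store, not a reference to any real product or standard: the class names, fields and methods are our own assumptions, and a production system (or a Consent Receipts implementation) would record considerably more detail, including purpose taxonomies and authentication of the data subject.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional
import uuid

@dataclass
class ConsentRecord:
    """One consent signal; real schemas (e.g. Consent Receipts) carry far more detail."""
    subject_id: str                 # pseudonymous identifier for the data subject
    purpose: str                    # the processing activity the consent covers
    granted_at: str                 # ISO 8601 timestamp, for demonstrability
    modality: str                   # how consent was collected: web form, phone, ...
    revoked_at: Optional[str] = None
    record_id: str = field(default_factory=lambda: str(uuid.uuid4()))

def _now() -> str:
    return datetime.now(timezone.utc).isoformat()

class ConsentStore:
    """In-memory store supporting collection, storage, querying and updating."""

    def __init__(self) -> None:
        self._records: list[ConsentRecord] = []

    def collect(self, subject_id: str, purpose: str,
                modality: str = "web-form") -> ConsentRecord:
        """Collection: record a consent signal from a consent interaction."""
        record = ConsentRecord(subject_id, purpose, _now(), modality)
        self._records.append(record)
        return record

    def has_consent(self, subject_id: str, purpose: str) -> bool:
        """Querying: check that unrevoked consent exists before processing."""
        return any(r.subject_id == subject_id and r.purpose == purpose
                   and r.revoked_at is None
                   for r in self._records)

    def revoke(self, subject_id: str, purpose: str) -> None:
        """Updating: mark matching records revoked, retaining them for audit."""
        for r in self._records:
            if (r.subject_id == subject_id and r.purpose == purpose
                    and r.revoked_at is None):
                r.revoked_at = _now()

store = ConsentStore()
store.collect("subj-42", "marketing-email")
print(store.has_consent("subj-42", "marketing-email"))  # True
store.revoke("subj-42", "marketing-email")
print(store.has_consent("subj-42", "marketing-email"))  # False
```

Note that revocation does not delete the record; keeping the revoked record (with its timestamp) is what makes consent demonstrable to a regulator after the fact.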
Agents & Automation
There is research interest in the role of Artificial Intelligence (construed broadly) in helping to manage the burden of consent on data subjects. The idea of a “personal data assistant” was developed by a number of workshop groups and is perceived as a way to lower the cognitive load and “annoyance factor” that have resulted from previous attempts to introduce more emphasis on consent (for instance the ePrivacy Directive’s “cookie law”).
In conjunction with a consent infrastructure, agent technology shows considerable promise in managing consent at the sort of scale that would be required to operationalise it in smart cities or other IoT applications. In this regard, agents can help both with scale (frequent consent requests, often while the data subject’s attention is on other tasks) and the often very limited interaction affordances of pervasive and ubiquitous computing devices.
There are existing examples of consent-related technology within specific domains. In medical research, electronic systems for explaining trials and recording consent are available commercially (for instance SecureConsent), and dynamic consent systems have been proposed to help facilitate consent for reuse of medical datasets.
In the domain of web tracking (typically related to online advertising), the “Do Not Track” initiative has produced a technical standard for the communication of consent (and non-consent) to tracking between web browsers and web servers. To date, the impact of DNT has been hindered by a lack of regulatory support, and poor implementation in web browsers. Although reference is made to consent in DNT (and the associated TPS), mechanisms for obtaining consent are specifically excluded from the standard (section 6.3.3), and development of consent infrastructure could therefore complement efforts in this domain.
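The DNT signal itself is a single HTTP request header. As an illustration, here is a hedged sketch of how a server-side handler might interpret it; the function name and the treatment of an absent header are our assumptions (the standard leaves site policy for the no-preference case open), not part of the specification.

```python
def tracking_permitted(headers: dict) -> bool:
    """Interpret the DNT request header.

    Under the W3C Tracking Preference Expression, "DNT: 1" signals that the
    user objects to tracking, while "DNT: 0" signals consent (for example,
    granted via a site-specific exception). An absent header expresses no
    preference; this sketch conservatively treats that case as "do not track".
    """
    dnt = headers.get("DNT")
    if dnt == "0":
        return True   # explicit consent signalled
    return False      # "1", malformed, or no signal: do not track

# Example requests as header dictionaries
print(tracking_permitted({"DNT": "1"}))  # False
print(tracking_permitted({"DNT": "0"}))  # True
print(tracking_permitted({}))            # False
```

As the text above notes, the header only communicates a preference; obtaining the consent that justifies a "0" value is out of scope for the standard, which is exactly the gap a consent infrastructure could fill.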
Consent Science

The Consent Science pillar covers scientific research into consent. We use the term “consent science” to avoid over- or under-generalising. By definition, “Consent Science” should include any scientific enquiry aimed at defining, measuring or implementing consent; and in this sense there may be a blurred distinction between consent science and the consent engineering aspects of the Technology pillar.
A science of consent will likely have at least four components: Theory, Methods, Mechanisms and Analysis.
Theory: Theory refers to a body of knowledge. This knowledge covers both a theory of what consent is and what we know about how it can be effectively sought and obtained. Theories of what consent is range from discursive accounts from (for instance) medical ethics, to principle-based descriptions such as Friedman’s 6-factor model, to more empirical theories such as the more quantitative “surprise management” that we have advocated in the MCDE project.
Methods: In support of evaluating consent interactions, repeatable methods and methodologies will be developed; both for the purposes of conducting generalisable scientific research and, eventually, for use by industry and regulators who want to develop or validate consent interactions in particular contexts. Developing methods includes the development of reusable research instruments; for instance the notional “consentoscope” that measures the consentfulness of a particular user interface.
Mechanisms: Application of consent theory and methods will allow novel consent mechanisms to be developed and compared with more conventional mechanisms. The idea of personal ‘digital housekeepers’ - semi-autonomous agents that act on our behalf to signal consent - is one area that workshop participants were positive about, and which the MCDE project has conducted research into. Designing and validating interaction patterns and modalities will support industry - via the Technology pillar - in delivering empirically-proven consent mechanisms at scale.
Analysis: Science also has a role to play in evaluating the state and development of the consentful economy. Understanding which regulations and guidelines are leading to consentful engagement between business and citizens, and which are not, should be a core goal of research; and so too should be the translation of new discoveries about the nature and engineering of consent into evidence-based policy.
Where we’ll end up is uncertain and hard to predict, but based on the ideas and developments mentioned in this report we’ve sketched some possible consent futures.
The “do nothing” scenario corresponds to the possible future that could transpire without any major developments in the areas that we’ve outlined. In this scenario, consent remains an annoyance for most data subjects, while doing little or nothing to improve their understanding and ability to make decisions. Consent does not help to improve customer trust in how businesses use their data because it is perceived as meaningless or as offering no genuine choice. When consent interactions are encountered, data subjects skip through them out of habit.
Regulation: Subjective and speculative guidelines about implementing consent are provided by DPAs, and there is little innovation or experimentation with alternative ways to present decisions and receive consent.
Citizens: Citizens are unsure of how their data is used, and don’t feel that they can control it once they’ve passed it on. Privacy concerns manifest as a general aversion to sharing data and suspicion about what organisations are doing.
Business: Businesses see consent as a compliance cost; by and large, the business response is to find narratives that justify whatever it was they were doing beforehand, or to spin weak signals from data subjects (like clicking a button, ticking a box, or “continuing to use the website”) as being consent. The relationship between business and consumer is one of trying to minimise the attention that consumers give to their personal data.
Consent Science: Scientific investigation of consent doesn’t really exist.
Tech: “Bare minimum” technology looks similar to today’s cookie notices. Cut-and-paste notices satisfy regulatory concerns but are not integrated with underlying data processing technology. Consent is assumed to have been given during registration, or represented by a lone “1” in the binary “Consent” field of a CRM system.
Major developments that cover all of the pillars have the potential to lead to a personal data economy with markedly different characteristics from both today’s and the “do nothing” scenario’s. In this “do everything” scenario, data subjects can allow or disallow data processing in near real time, with confidence that their expressed preferences will be respected. Many consent decisions are automated at the point of consent, but can be inspected later and are framed in terms that are meaningful to the data subject. Greater overall benefit is extracted from data because data subjects are more willing to engage with services and grant access to their data, and data users can more easily obtain a legal basis for processing it.
Regulation: Evaluation of consent interactions is based on empirical evidence, and encouraged through guidance, best-practice and enforcement. Detailed consent records allow compliance to be demonstrated, and allow organisations to respond to changing recommendations and developments in best-practice.
Citizens: Citizens have an awareness of how their personal data is used, and feel that the limits they set are respected; broadly they are not suspicious of how brands they voluntarily engage with are using their personal data. Citizens reason about possible impacts of giving consent, and sometimes withhold it, without suffering disproportionate hardship as a result.
Business: Customers comfortably share data with businesses in return for genuine value. Businesses can respond to what customers are signalling via consent decisions in order to shape offerings in line with consumer preference, and are economically motivated to alter their products and services to reflect citizens’ wants and needs with regard to personal data.
Consent Science: Established methods for assessing consent interactions are available and packaged in a format that makes them usable by industry (to develop better interactions) and by regulators (to judge compliance and offer guidance). The assessment of consent becomes more objective, and methods can be refined to become more robust over time.
Tech: Consent signals are recorded in a standardised format for auditing purposes, and records can be changed by the data subject at any point using centralised dashboards. In many cases, consent decisions are deferred to semi-autonomous agents that minimise distraction and annoyance. Data processing is tied, in real time, to consent records so that subjects’ preferences are respected and actions (such as emails) can be traced back to the specific consent decision that enabled them. Interactions are built and iterated upon using established methodologies and metrics, and validated reusable libraries & interaction patterns are available for data controllers to use.
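As a concrete illustration of the record-keeping this scenario implies, the sketch below models a standardised consent record and a real-time check that ties a processing action back to the specific decision that enabled it. All field names and the record format here are assumptions chosen for illustration, not any existing standard (the Kantara Consent WG, mentioned in Appendix 2, is one group exploring standardisation in this space).

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional
import uuid

@dataclass
class ConsentRecord:
    """One auditable consent decision. Field names are illustrative, not a standard."""
    subject_id: str
    controller: str
    purpose: str                  # e.g. "email-marketing", framed meaningfully for the subject
    granted: bool                 # True = consent given, False = refused/withdrawn
    timestamp: datetime
    expires: Optional[datetime] = None
    record_id: str = field(default_factory=lambda: str(uuid.uuid4()))

class ConsentLedger:
    """Append-only store: withdrawal is a new record, so the full history is auditable."""
    def __init__(self):
        self._records: list[ConsentRecord] = []

    def record(self, rec: ConsentRecord) -> str:
        self._records.append(rec)
        return rec.record_id

    def current_decision(self, subject_id, controller, purpose, now=None):
        """Latest unexpired record for this subject/controller/purpose, or None."""
        now = now or datetime.now(timezone.utc)
        matching = [
            r for r in self._records
            if (r.subject_id, r.controller, r.purpose) == (subject_id, controller, purpose)
            and (r.expires is None or r.expires > now)
        ]
        return max(matching, key=lambda r: r.timestamp, default=None)

    def may_process(self, subject_id, controller, purpose):
        """Real-time gate on processing. Returns (allowed, record_id) so that an
        action (such as an email) can be traced to the decision that enabled it."""
        rec = self.current_decision(subject_id, controller, purpose)
        return (rec.granted, rec.record_id) if rec else (False, None)
```

A data subject changing their mind is then just another appended record: the gate reflects the newest decision while older records remain available for audit and compliance demonstration.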
The ideas and developments in the “do everything” scenario range from those that are close to commercial availability, like consent management platforms, to those that are still at the research or even conception stage - like AI support. We’ve tried to place the various developments mentioned in the pillars onto a set of readiness levels (Appendix 1 & 2). This intermediate scenario includes developments that have gone past the ‘research’ readiness stage.
Regulation: Regulation is based on the GDPR and whatever is finally agreed in the ePrivacy Regulation. Guidelines from DPAs are still fairly speculative, leaving many uncertain edge cases; it is safest for organisations simply to copy an “approved” interaction pattern.
Citizens: Trust is built between some brands and citizens, and meaningful consent interactions are part of doing so, but general concerns over industry practice persist. Citizens are surprised sometimes by how data has been used, and there is moderate resistance to initiatives like smart meters.
Business: Businesses have some dedicated privacy resources, and some see them as a competitive/brand advantage as well as a compliance issue. Business models are not radically shifted.
Consent Science: Some basic models of consent, and ways of evaluating it, are available, but they have not been translated into tools and materials that support wider adoption outside of research-intensive organisations.
Tech: Consent management tools are available and adopted, allowing individual organisations to store, audit and respond to consent from data subjects. Wider inter- and extra-organisational infrastructure is undeveloped, so personal consent dashboards and AI support are not widely deployed.
Appendix 1: Readiness Levels
We used a set of "readiness levels" to assess the different developments that were mentioned by participants during our workshop. These levels are conceptually similar to the Technology Readiness Levels, but reformulated to allow comparison of a broader range of developments, including law and regulation, which are not technologies. The result is a rough-and-ready, but hopefully broadly useful, means of categorising developments based on their progress from conception to real-world deployment.
| Level | Stage | Technology example | Law & regulation example |
|---|---|---|---|
| 1 | Concept: the idea has been proposed | Suggestion of a piece of technology, “X would be cool” | “Maybe we need new rules on Y” |
| 2 | Specification: we know what the main aims, objectives and success criteria for the idea are | Functional specification for a piece of technology | The new law should accomplish P, Q, R |
| 3 | Research: we are identifying and evaluating ways of implementing the idea | Experimental proof of concept | Examination of existing law, e.g. REFIT exercise, early consultation |
| 4 | Implementation: work has begun on turning the idea into a practical reality | Early industrial implementation | Consultation on proposed legislation |
| 5 | Adoption: an implementation of the idea is in use | Industrial implementations are available | Law agreed |
| 6 | Maturity | Widespread use | In force |
Appendix 2: Horizon Scanning
This table summarises the developments that lie between us and the horizon. We have split them by pillar and assigned each a readiness level (see Appendix 1) to help gauge how developed, or close to practical impact, it is.
| Development | Pillar | Description | RL | Notes |
|---|---|---|---|---|
| GDPR consent guidance | Law | Guidance from DPAs about practical compliance with the GDPR’s consent requirements | 4 | ICO are reviewing consultation responses |
| UK-specific DP legislation | Law | The UK may pursue legislation that differs to some degree from the EU’s | 1 | Little appetite currently, but floated as an idea. The ICO will not be bound by decisions of other DPAs via the consistency mechanism, so there is some potential for divergence in interpretation |
| ePrivacy Regulation | Law | New EU ePrivacy regulation | 4 | Has been proposed and a draft issued (/leaked). Not finalised; there could be major changes yet |
| Coalition of the willing | Law | An alliance between various stakeholders to proactively establish guidance, case law and lobbying | 1 | Lobbying in particular is quite disjointed, mostly focussed on privacy and digital rights. Consumer advocacy differs globally |
| Consent measurement for regulators | Law | The use of objective measures and criteria to support regulation; conceptually, a “ground truth” for consent | 3 | Some research, but not fully validated and not yet translated into a form suitable for general use |
| DPO appointments | Biz | – | 5 | DPOs are accepted |
| CPO appointments | Biz | – | 5 | Many organisations have CPOs, though these are less widespread than other C-level positions |
| New business models | Biz | – | 4 | Interest in more inherently consent-centric business models, e.g. HAT, but still limited real-world adoption. Established consentless models, e.g. behavioural advertising, are still widely deployed and not yet disrupted by new models |
| Brand value | Biz | – | 3 | The brand value of consent-driven trust is still being tested by early adopters |
| VRM; consumer-centric relationships | Biz | – | 3 | VRM is still being researched and prototyped |
| Citizen developments | Citizens | Citizen developments are organic and arise in reaction to other events; how to map readiness to citizen attitudes (e.g. by how widespread they are) remains an open question | – | – |
| Consent management platforms | Tech | Infrastructure for collecting, storing and querying consent records | 4 | Some early commercial examples, but not yet widely adopted and still immature. Domain-specific examples, e.g. DNT, do not implement a full suite of functionality (e.g. storage, audit) |
| Consent vocabularies | Tech | – | 3 | Limited. No standards yet, but interest from e.g. the Kantara Consent WG |
| Consent agents | Tech | AI support for consent decision-making; a “digital butler” | 3 | An area of active research, but not ready for widespread adoption. Depends on infrastructure for communicating consent requests and responses |
| DNT | Tech | – | 5 | DNT is a standard and has some technical implementations. Adoption by service providers is still limited, and perhaps declining |
| Consent metric | Science | – | 3 | Research in progress |
| Consentfulness toolkit | Science | Materials to help regulator/practitioner adoption of a consentfulness metric | 2 | – |
| Validated consent design patterns | Tech / Science | A set of validated design patterns with information about the social, business and consumer contexts in which they are suitable | 2 | We know we need these, but most patterns are still under-validated; a lack of objective validation measures and methods hinders this. There is not currently enough research into contextual factors that could affect the validity of the collected consent. Early examples of cataloguing by e.g. Projects By If |