The Council of Europe Guidelines on AI: Strengthening Rights over Automated Decisions

Kaveh Cope-Lahooti

To celebrate Data Protection Day on 28 January 2019, the Council of Europe released guidelines on data protection measures in relation to artificial intelligence. The guidelines contain recommendations that serve to codify much of the emerging best practice around ensuring artificial intelligence systems are compatible with human and data subject rights, building upon the existing regulation of the sector provided in the GDPR and Convention 108.

Convention 108 (the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data), an international law framework applicable to the (predominantly European) Council of Europe members, was last updated in 2018. Building on the Convention, the guidelines further specify certain core elements that should be included when data is processed in AI systems, mainly focused on ensuring data accuracy, non-bias and a ‘human rights by design’ approach. The guidelines take the latter to mean that all products and services should be “designed in a manner that ensures the right of individuals not to be subject to a decision significantly affecting them…without having their views taken into consideration”.

In practice, this will require organisations to conduct a wider risk assessment of their impacts in advance, and to build in governance methods to generate and consider relevant stakeholder input. One means of ensuring this is to involve, at an early stage in the design process, representatives from the design/development teams, HR, Data Protection and Risk departments and potentially executives and Board members, in addition to seeking the advice of NGOs and other industry bodies already engaged in data ethics. Organisations can rely on both external and internal data ethics committees, both to give their opinion on the potential social or ethical impact of AI systems and to act as a tool for ongoing monitoring of the deployment of those systems.

Most notably, the Guidelines also highlight the right for individuals to obtain information on the reasoning underlying AI data processing operations applied to them. Indeed, this refers to Convention 108’s most recent iteration, which outlines that, in the context of automated decision-making systems, data subjects should be entitled to know the “reasoning underlying the processing of data, including the consequences of such a reasoning, which led to any resulting conclusions”. This goes further than the provisions of the GDPR, which provide for data subjects to receive “meaningful information about the logic involved” in such decisions, as well as the “significance and the envisaged consequences of such processing”.

Rather than simply covering ‘system functionality’ – which processes are performed (i.e. whether any profiling, ranking or data matching occurs) and perhaps which features (e.g. categories of data) are involved in the design of an algorithm – the Convention 108 right is more expansive, extending to understanding why an actual outcome was reached, including after a decision has been made. This would require a company to assess and track the way algorithms are trained, and perhaps even re-run decisions with modified or different criteria, in order to be able to diagnose what “led to any resulting conclusions”.
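By way of illustration only, the sketch below shows what such diagnostic re-running could look like for a simple scoring decision: the decision is re-run with each criterion varied in turn, and the variations that flip the outcome are recorded as the features that “led to” the conclusion. The `predict` function, feature names and threshold are all hypothetical; the Guidelines do not prescribe any particular technique.

```python
def diagnose_decision(predict, record, alternatives):
    """Re-run a decision with modified criteria to find which feature
    changes would have altered the outcome."""
    original = predict(record)
    flips = []
    for feature, values in alternatives.items():
        for value in values:
            # Copy the record with one feature swapped for an alternative.
            modified = dict(record, **{feature: value})
            if predict(modified) != original:
                flips.append((feature, record[feature], value))
    return original, flips

# Illustrative use against a toy credit-scoring rule:
decision, flips = diagnose_decision(
    predict=lambda r: "approve" if r["income"] > 30000 and r["defaults"] == 0 else "decline",
    record={"income": 25000, "defaults": 0},
    alternatives={"income": [35000], "defaults": [1]},
)
print(decision, flips)  # decline [('income', 25000, 35000)]
```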

Not only do the Guidelines refer to this right to information or explanation, but they also allude to the fact that AI systems should allow “meaningful control” by individuals over data processing and its related effects on them and on society. The thinking behind this is that, where provided with the information to do so, data subjects will be able to exercise their other corollary rights under Convention 108 or the GDPR, including the right not to be subject to solely automated decisions or to challenge their reasoning. As such, organisations should put into place mechanisms for challenging and reviewing automated decisions to ensure a fair and equitable outcome. These will have to integrate either elements of human decision-making or, at the least, human intervention in such decisions, on top of organisations’ obligations to notify data subjects of their rights. This will serve to provide sufficient assurance of objectivity of process under the GDPR, and will also help streamline any requests to challenge or query decisions.
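As a rough sketch of one way such a review mechanism might be structured – a contested decision is suspended until a human reviewer records an intervention – the record layout and workflow below are illustrative assumptions, not anything prescribed by the Guidelines or the GDPR.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class DecisionReview:
    """A challenged automated decision, suspended pending human review."""
    subject_id: str
    automated_outcome: str
    challenge_reason: str
    status: str = "pending_human_review"
    reviewer: Optional[str] = None
    final_outcome: Optional[str] = None
    reviewed_at: Optional[datetime] = None

    def resolve(self, reviewer: str, final_outcome: str) -> None:
        # Record the human intervention that confirms or overrides the
        # automated outcome, leaving an auditable trail.
        self.reviewer = reviewer
        self.final_outcome = final_outcome
        self.reviewed_at = datetime.utcnow()
        self.status = "resolved"

review = DecisionReview("subject-42", "decline", "disputes accuracy of income data")
review.resolve(reviewer="case-handler-7", final_outcome="approve")
```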

The Guidelines are another ‘soft law’ mechanism – further elaboration on how organisations can practically comply with data protection and human rights norms in their deployment of AI systems, rather than a further regulatory step following the GDPR. From this perspective, the Guidelines, together with the amended Convention, serve to clarify much of the existing practice on designing transparency into AI processes, which can only make organisations more objective and accountable.

Children’s Data: Consultation on Code of Conduct opens in Ireland

Kaveh Cope-Lahooti

The Irish Data Protection Commission (DPC) has launched a consultation on the processing of children’s personal data with a view to introducing guidance and a Code of Conduct for organisations. Although it will be some time before any steps are taken, it is important that businesses are aware of the key issues involved and consider developing and investing in their own technology solutions to meet legal requirements under the Data Protection Act and General Data Protection Regulation (GDPR).

Background

On 19 December 2018, the DPC opened a consultation and invited public comment on the processing of children’s personal data.

The GDPR’s regulation of children’s data is open-ended. There is no definitive list of information that must be provided to children (or their parents) before children’s data is processed. EU member states may each set the age at which children can themselves consent to the processing of their data by online services, meaning that for children under that age a parent’s consent is required. This threshold varies between 13 and 16 across member states, with Ireland taking 16 – a marker of its commitment to protecting children.
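By way of illustration, a service operating across several member states might keep the applicable thresholds in a simple lookup, as sketched below; the figures reflect the position in early 2019 for a handful of states and should be verified before being relied on.

```python
# Digital age of consent under Article 8 GDPR, by member state (plus the UK).
# Illustrative of the position in early 2019; verify before relying on it.
DIGITAL_AGE_OF_CONSENT = {
    "IE": 16,  # Ireland
    "DE": 16,  # Germany
    "FR": 15,  # France
    "ES": 14,  # Spain
    "AT": 14,  # Austria
    "UK": 13,  # United Kingdom
}

def parental_consent_required(country_code: str, age: int) -> bool:
    # Below the national threshold, a parent's consent is needed; default
    # to the strictest threshold (16) where the state is not listed.
    return age < DIGITAL_AGE_OF_CONSENT.get(country_code, 16)
```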

The DPC intends to publish a Code of Conduct on processing children’s data. This has a basis in the Irish Data Protection Act, which encourages the formation of codes of conduct in specific sectoral areas, such as the protection of children’s rights, transparency of information and the manner in which parents’ consent is to be obtained. The Code will enable the DPC to carry out mandatory monitoring of compliance by the controllers or processors that undertake to apply it.

Key issues raised in the consultation and possible solutions are discussed in the following paragraphs.

Age of Consent

The GDPR’s specific regulation of children’s data is largely based on a similar standard and grounds as the US’ Children’s Online Privacy Protection Rule (COPPA), which, broadly speaking, applies to children’s information (for those under the age of 13) collected, used and disclosed online by websites, mobile apps, games and advertising networks (among others). The GDPR’s requirement is slightly narrower, however, applying to information processed by ‘information society services’ offered to a child – which must ‘normally’ be paid or intended-to-be-paid services. It also covers where these services are merely ‘offered’ – i.e. at an early stage, such as where account creation is initiated. In these circumstances, the online service provider must make “reasonable efforts” to verify that consent is given by the holder of parental responsibility, “taking into consideration available technology”.

As discussed, the fledgling nature of the regulation of children’s data means there are no prescribed methods of collecting parents’ consent – and this is largely what the consultation asks for input on. Many of the attempts to introduce a means of collecting consent have been based on recommendations and practice under COPPA, including those recommended by the US Federal Trade Commission (FTC). This is particularly the case for proof of a parent’s consent, where there is much discussion of what mechanisms organisations must put in place to collect it. In practice, certain information will be needed to verify this, i.e.:

  • Some form of age-selection process for the user, where possible built into the website, which must occur before a registration or payment is made.
  • Where applicable, the parent’s identity may need to be confirmed. This could be achieved by charging a nominal fee to a registered credit card in the parent’s name.
  • Parents will also need to confirm GDPR-compliant consent via an affirmative action such as signing a consent form, replying to an email or calling a number to confirm, which should be evidenced and auditable by the organisation where possible (a minimal version of this flow is sketched below).
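A minimal sketch of that three-step flow follows. The `charge_nominal_fee` and `send_confirmation_email` helpers are hypothetical stand-ins for a payment gateway and an email service, not real APIs.

```python
import secrets

def request_parental_consent(child_age, threshold, parent_email, card_token,
                             charge_nominal_fee, send_confirmation_email):
    """Run the three steps above: age gate, identity check via a nominal
    card charge, then an affirmative, auditable consent action."""
    # 1. Age-selection gate: the parental flow is only needed below the threshold.
    if child_age >= threshold:
        return {"consent_required": False}

    # 2. Confirm the parent's identity by charging a nominal fee to a card
    #    in the parent's name (hypothetical payment-gateway callable).
    if not charge_nominal_fee(card_token, amount_cents=50):
        raise ValueError("could not verify a card in the parent's name")

    # 3. Affirmative action: email a one-time token the parent must return;
    #    the token is kept as auditable evidence of the consent given.
    token = secrets.token_urlsafe(16)
    send_confirmation_email(parent_email, token)
    return {"consent_required": True, "pending_token": token}
```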

It should be remembered that there are various other methods of proving or collecting the required information, but each has its advantages and disadvantages. For example, it has been suggested that collecting a photograph of the parent’s ID (such as a passport copy) would violate data minimisation requirements, even if deleted immediately. Arguably, on top of that risk, collecting ID for verification purposes would impose a far greater practical and administrative burden on organisations than charging a nominal fee to a credit card; in particular, verifying the ID would require sophisticated software or human intervention.

Most notably, organisations should also put in place a method for the parent’s consent to be withdrawn, or the personal data deleted, in accordance with the rights under the GDPR. This would involve saving the parent’s details – such as their email address – along with the fact that they had previously been verified as a parent, in order to process any such request in the future.
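Illustratively, the record this implies might look something like the following; all field names are assumptions for the sketch.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class ParentalConsentRecord:
    child_account_id: str
    parent_email: str             # retained so a future request can be matched
    verified_parent: bool         # outcome of the earlier identity check
    consent_given_at: datetime
    withdrawn_at: Optional[datetime] = None

    def withdraw(self) -> None:
        # Mark consent as withdrawn; processing relying on it must then
        # stop, and the child's data be deleted where required.
        self.withdrawn_at = datetime.utcnow()
```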

Notification and Transparency

As the DPC notes, transparency is particularly important in the context of children’s data – for instance, notices may need to be directed at children. An age-appropriate notice has a broader application and, under the GDPR, is required regardless of whether the requirement for a parent’s consent for paid online services applies. In particular, the DPC asks whether two separate sets of transparency information should be provided, each tailored to the relevant audience, i.e. one for the parent and one for the child (aged under 16). A related issue that has been discussed is that a child aged 15 will have a different capacity for understanding than a child under 10.

One solution would be for the organisation to consider the primary age demographic it expects to access, or be targeted by, its website and draft an appropriate notice. An age-selection step on a website’s homepage (to generate an appropriate notice) may be another method, although collecting such information so early may violate data minimisation requirements, and may be cumbersome both for many visitors and for website owners to implement.
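As a sketch of how an age-selection step could drive the choice of notice, a notice might be chosen per declared age band roughly as follows; the notice texts and names are purely illustrative.

```python
# Purely illustrative notice texts, keyed by audience.
NOTICES = {
    "child": ("We collect your name and email address so you can use our "
              "games. Ask a parent or guardian if anything here is unclear."),
    "adult": ("We process account and usage data as described in our full "
              "Privacy Notice, which also explains your rights."),
}

def notice_for(declared_age: int, threshold: int = 16) -> str:
    # Serve the child-directed notice below the threshold, the standard
    # notice otherwise; the declared age itself need not be stored.
    return NOTICES["child"] if declared_age < threshold else NOTICES["adult"]
```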

In relation to notifying parents, as explained previously, a parent’s consent must meet the GDPR’s requirements, including that the consent be ‘informed’. One solution, such as the FTC recommends in relation to COPPA, could involve providing parents with information on why they are being contacted, what information was collected from the child and the fact that the parent’s consent is required, alongside a link to the organisation’s Privacy Notice.

Children’s Rights

The consultation also touches on the issue of children’s rights, including the new rights under the GDPR, and how these are to be exercised. For example, in Canada, the Office of the Privacy Commissioner has proposed that, where information about children is posted online (by themselves or others), the right to remove it should be ‘as close to absolute as possible’ – a sentiment echoed in Article 17(1)(f) of the GDPR. Ireland has taken a similar approach: under the Irish Data Protection Act, there is a stronger ‘right to be forgotten’ for children whose personal data has been processed for information society services, without the need to prove that the processing is no longer necessary or unlawful, that a legitimate interest is unjustified, and so on. As such, where the right to erasure does not apply absolutely (i.e. where consent is not relied on), organisations should be prepared to make such an objective assessment, considering (for example) whether the child or the guardian would have been aware of the use of the child’s personal data, and whether it was used in particularly invasive processing.

Profiling

Additionally, whilst there has been some discussion among European supervisory authorities that children’s data will be particularly protected under data protection legislation, Ireland has gone further to protect children’s rights. In particular, the Data Protection Act 2018 has made it an offence to process children’s data (children, for this specific section, meaning those aged under 18, not under 13) for direct marketing, profiling or micro-targeting, regardless of consent. This has very wide implications, as profiling could simply be carried out by marketing or retail companies to tailor products and services to their child customers. On the broadest reading, it would also exclude using factors in marketing that are likely to specifically target children, such as an online user’s interest in or browsing for toys, for example.

The consultation considers the situation where profiling involves specifically targeting children, particularly as guidance from supervisory authorities in several jurisdictions has held that automated profiling that specifically targets vulnerable groups should be prohibited. The DPC invites comments on how this can be balanced with an organisation’s legitimate interests. In practice, many organisations are already erring on the side of caution by excluding child-related factors from their profiling.

Next Steps

The consultation touches on several other issues – such as how online service providers should ensure they comply with the different ages of digital consent in different EU states – for which there are various possible legal, policy or technological solutions. The consultation is open for submissions until 1 March 2019, although there is a long way to go after this before businesses have any certainty over their procedures. After publishing the consultation submissions, the DPC will publish guidance and work with industry towards a Code of Conduct for organisations on the measures and methods to be taken to comply with the provisions of the Data Protection Act and GDPR relating to children’s data. However, these will invariably be open to interpretation, meaning there is scope for businesses to develop and invest in their own technology solutions to meet these legal demands.


Safeguards for a No-Deal Brexit: Irish DPC releases guidance on data transfers to the UK

Kaveh Cope-Lahooti

Background

The future of trade between the United Kingdom and the EU, including the Republic of Ireland, post-Brexit has been at the centre of the negotiations between the British government and European institutions. The UK is Ireland’s largest trading partner after the USA[1], and many businesses rely upon the flow of goods, people and information between the two neighbouring countries. With a no-deal Brexit becoming an ever more real possibility as the 30 March 2019 deadline approaches (albeit one both the UK and EU insist they are doing their utmost to prevent), the Irish Data Protection Commission has released guidance on the steps Irish businesses should take if this eventuality materialises (available here).

The difficulties stem from the fact that the UK will be considered, from the EU’s point of view, a ‘third country’ with regard to data transfers (we have elaborated on the meaning of this for the UK here). Suffice to say, in the absence of an EU adequacy decision (similar to what exists for Switzerland, or the Privacy Shield for the USA), which permits personal data to flow freely to such countries without any strenuous procedures, specific measures will have to be followed by businesses. Somewhat obscurely – considering that the UK has effectively implemented the GDPR in full through its Data Protection Act 2018 – the European Commission has outlined that it has no plans to consider an adequacy decision for the UK.

Safeguards

As such, the measures that will be required are essentially the same as those used to transfer personal data to countries that are neither in the EU nor considered adequate – as is the case for the majority of Asian, South American or African countries where Irish businesses may already have outsourcing operations (such as using IT developers in India, for example). These measures include:

  • Binding Corporate Rules (BCRs) – these are applicable for large organisations that have operations across Ireland and the UK (and potentially more jurisdictions). However, as these need approval from the Data Protection Commission and possibly other supervisory authorities, organisations are unlikely to be able to put them in place before the Brexit deadline;
  • Standard Contractual Clauses (SCCs) – these are agreements that can be signed as an annex to any existing contracts with UK-based providers, who will be considered ‘Data Importers’ from Ireland. Although slightly outdated, they are fairly easy to implement where organisations have a limited number of UK-based providers whose contracts they can readily amend. Such clauses can also be signed with an organisation’s subsidiaries abroad.

It should be noted that the Data Protection Commission has elaborated on many of these measures here.

These measures will need to be applied to all of an Irish business’s cross-border data processing operations. Cross-border processing will generally arise where information is outsourced to other companies, or where companies share or jointly perform business operations that involve personal data being used in, or transferred to, the UK from Ireland. The Irish Data Protection Commission gives the example of an Irish company that currently outsources its payroll to a UK processor, or even uses a cloud provider based in the UK.

However, with regard to larger service providers in the UK used by Irish businesses (for example, software or cloud providers), where it may be difficult to agree BCRs or SCCs before 30 March 2019 (due to the difference in bargaining power), Irish firms could consider using EU or even US-based providers. In addition, where there are transfers to an organisation’s own offices in the UK, including Northern Ireland, it may be worth considering whether these operations can be carried out in Ireland or an EU-based office instead.

Other possibilities

As a last resort, to validate data transfers to the UK, businesses could also rely on more ad hoc measures, such as:

  • Seeking consent from individuals directly for data transfers to the UK. This is unlikely to be useful in the payroll example above, as consent cannot generally be used in employment situations.
  • Relying on exceptions, such as where transfers are necessary for the performance of a contract with the data subject or in the interest of the data subject – for example, where an organisation has a global mobility scheme for employees or needs to process payments abroad for customers. However, as ‘necessary’ is to be construed narrowly, this will not extend to outsourced processing.

Another issue the Data Protection Commission considers is the situation where data are transferred from Ireland to the UK, but then sent onward from the UK to other third countries. In these situations, Irish businesses will need to focus on making the transfers lawful from an Irish legislative perspective. Therefore, as above, onward transfers to the EEA will not need any additional measures, but onward transfers to other third countries will need similar safeguards to those described above.
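Reduced to its logic, the resulting check for an Irish exporter might look like the sketch below; the country groupings are abbreviated for illustration and reflect the adequacy landscape of early 2019 only.

```python
# Abbreviated, illustrative country groupings (position as of early 2019).
EEA = {"IE", "FR", "DE", "NL", "SE"}
ADEQUATE = {"CH", "JP", "NZ", "IL", "AR", "UY"}  # EU adequacy decisions

def safeguard_needed(destination: str) -> str:
    """What an Irish exporter needs for a transfer to a given destination."""
    if destination in EEA:
        return "none - free movement within the EEA"
    if destination in ADEQUATE:
        return "none - covered by an EU adequacy decision"
    return "appropriate safeguards required, e.g. SCCs or BCRs"

print(safeguard_needed("GB"))  # after a no-deal Brexit: SCCs or BCRs required
```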

Next Steps for businesses

Irish firms should therefore be evaluating their UK-based suppliers and processors, and whether appropriate measures are in place. Although the best advice for businesses at the current time (particularly with regard to data protection) is to prepare for the worst, it should be noted that these measures may not, in fact, be needed if any of the following occurs:

  • As part of the UK Government’s withdrawal agreement with the EU (which still needs approval by the UK Parliament), the UK and EU agree that a ‘Transition Period’ will run from 30 March 2019 until 31 December 2020, during which EU law, including EU data protection law, will continue to apply to and in the UK. This would mean that special measures are not required vis-à-vis the UK;
  • The UK decides not to leave the EU by the proposed deadline, in which case EU law will continue to apply in full; or
  • A no-deal Brexit does occur, but the European Commission nevertheless makes an adequacy decision with regard to the UK.

Footnotes

[1] Ireland exports, imports and trade balance by country, 2016. World Integrated Trade Solutions (2018). Retrieved 22 December 2018 from: https://wits.worldbank.org/CountryProfile/en/Country/IRL/Year/LTST/TradeFlow/EXPIMP/Partner/by-country