Category Archives: Data Protection

Data Protection Act 2018

Earlier this week the House of Lords and the House of Commons completed their game of ping-pong with the Data Protection Bill, bringing its journey through the parliamentary procedure to an end; a journey which began when the Bill was introduced to the House of Lords by the Department for Digital, Culture, Media and Sport (DCMS) in September 2017. Almost eight months later, and after quite a bit of amendment, the Bill has now received Royal Assent to become the Data Protection Act 2018.

It is expected that the various pieces of secondary legislation which are required to bring the Act into force and make transitional provisions will be signed by a Minister in the DCMS later today or tomorrow to ensure that the Act comes into force on Friday.

The new Data Protection Act 2018 does a number of things: (1) it deals with those areas within the GDPR, such as exemptions, which have been left to Member States to deal with individually; (2) it applies the GDPR (with appropriate modifications) to areas which are not within the competence of the European Union; and (3) it gives effect to the Law Enforcement Directive (which should have been transposed by 6th May 2018, but better late than never).

Data protection law has become much more complex than was the case under the Data Protection Act 1998; it requires individuals to look in many more places to get a proper handle on what the law requires (and that’s before we start to get decisions from the European and domestic courts).

There has been an indication by some campaign groups that there might be an early challenge to the immigration exemption within the Act, which will have an impact upon the information that data subjects can obtain from the Home Office under the subject access provisions of the GDPR. It will certainly be interesting to see whether such a challenge is in fact made and what its outcome is; and of course, we will cover any decision on that point should one be made by a court.

Alistair Sloan

If you require further information in relation to any data protection or privacy law concern then please do contact Alistair Sloan on 0141 229 0880 or by E-mail. You can also follow our dedicated information law account on Twitter for news and updates concerning data protection, privacy and freedom of information.

Data Protection Impact Assessments under the GDPR

Accountability is an important aspect of the General Data Protection Regulation (GDPR).  The accountability principle in Article 5(2) of the GDPR obliges data controllers to be able to demonstrate that they are complying with the data protection principles in Article 5(1) of the GDPR.  Some of the technical requirements placed upon data controllers within the GDPR can be traced back, at least in part, to the accountability principle.  One of the requirements of the GDPR which will assist data controllers to demonstrate compliance with the data protection principles is the requirement to complete, in certain circumstances, a Data Protection Impact Assessment (DPIA).

For a number of years supervisory authorities around the EU, including the UK Information Commissioner, have encouraged organisations to conduct a Privacy Impact Assessment (PIA) as part of their promotion of good data protection practice. DPIAs are simply PIAs by another name. The requirements for DPIAs are set out in Articles 35 and 36 of the GDPR.

When do I need to perform a DPIA?

Article 35(1) of the GDPR requires data controllers to conduct an assessment of the impact of envisaged processing operations on the protection of personal data where a type of processing, in particular one using new technologies, and taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons. The DPIA must be conducted before the envisaged processing is undertaken.

Article 35(3) sets out specific circumstances where a DPIA should be carried out by a data controller and those are:-

(a) a systematic and extensive evaluation of personal aspects relating to natural persons which is based on automated processing, including profiling, and on which decisions are based that produce legal effects concerning the natural person or similarly significantly affect the natural person;

(b) processing on a large scale of special categories of data referred to in Article 9(1), or of personal data relating to criminal convictions and offences referred to in Article 10; or

(c) a systematic monitoring of a publicly accessible area on a large scale.

When deciding whether a DPIA is required it will be important for data controllers to check the ICO’s website. The Information Commissioner is required to publish a list of the kind of processing operations which are subject to the requirement for a DPIA.  If the processing operations envisaged by a controller appear on this list, then they will be required to carry out a DPIA.  Article 35(5) also empowers the Information Commissioner (but does not require her) to establish and publish a list of the kind of processing operations for which no data protection impact assessment is required.

What does a DPIA require?

Article 35(7) sets out what the DPIA must include as a minimum; these are:-

  • a systematic description of the envisaged processing operations and the purposes of the processing, including, where applicable, the legitimate interest pursued by the controller
  • an assessment of the necessity and proportionality of the processing operations in relation to the purposes
  • an assessment of the risks to the rights and freedoms of data subjects (the risk to the rights and freedoms of natural persons, of varying likelihood and severity, may result from personal data processing which could lead to physical, material or non-material damage – see recital 75 of the GDPR for more detail)
  • the measures envisaged to address the risks, including safeguards, security measures and mechanisms to ensure the protection of personal data; and to demonstrate compliance with the GDPR taking into account the rights and legitimate interests of data subjects and other persons concerned

It should be noted that the requirements set out in Article 35(7) for the content of a DPIA are a minimum; there may be situations in which a DPIA needs to go beyond what is set out above.
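The four minimum elements of Article 35(7) can be captured in a simple record structure. The sketch below is purely illustrative (the class and field names are my own, not anything prescribed by the GDPR or the ICO), but it shows how a controller might check that no minimum element has been left blank before signing a DPIA off:

```python
from dataclasses import dataclass, fields

@dataclass
class DPIARecord:
    """Illustrative container for the Article 35(7) minimum elements."""
    processing_description: str  # systematic description of operations and purposes
    necessity_assessment: str    # necessity/proportionality relative to the purposes
    risk_assessment: str         # risks to the rights and freedoms of data subjects
    mitigation_measures: str     # safeguards, security measures and mechanisms

def missing_elements(record: DPIARecord) -> list[str]:
    """Return the names of any Article 35(7) minimum elements left blank."""
    return [f.name for f in fields(record) if not getattr(record, f.name).strip()]

# A draft DPIA with no risk assessment fails the completeness check.
draft = DPIARecord("CCTV rollout across sites", "Proportionate to security aim",
                   "", "Encryption at rest; 30-day retention")
print(missing_elements(draft))  # → ['risk_assessment']
```

This is only a completeness check, of course; a blank-field test says nothing about the quality of the assessment itself.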

The role of the Data Protection Officer

If you have appointed a Data Protection Officer, then Article 35(2) of the GDPR requires that you seek advice from them when carrying out a DPIA. It should be remembered that a DPIA might change a number of times during the process; you should therefore keep your DPO involved throughout and seek their advice at appropriate junctures. Seeking advice from the DPO is not simply a box-ticking exercise and should not be treated as such; if you treat it as one, you could find yourself failing to comply properly with the requirements of the GDPR and could be missing out on valuable advice. Remember that controllers are not obliged to follow the advice of their DPO, but if they elect to act contrary to that advice then they should document this and could be required to defend that decision.

The Role of The Information Commissioner

I have already indicated that the Information Commissioner has a role in the DPIA process, but her role is more extensive than has already been covered above. There are circumstances, set out in Article 36 of the GDPR, in which controllers will be required to consult with the ICO.  This applies where, in the absence of any mitigating measures by the controller, the DPIA indicates that the processing would result in a high risk to the rights and freedoms of data subjects.

Within a period of eight weeks following receipt of the request for consultation (a period which may be extended by a further six weeks in appropriate cases), the Information Commissioner is required to provide written advice to the controller where she is of the opinion that the intended processing would infringe the GDPR. It is therefore important that you consult the Commissioner well in advance of undertaking the envisaged processing, to ensure that you have enough time to receive any written advice from the Commissioner and to consider and apply it.
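The consultation timetable is simple date arithmetic, and it can be worth plotting out before committing to a go-live date. A minimal sketch (the function name is my own, for illustration only):

```python
from datetime import date, timedelta

def consultation_deadlines(request_received: date) -> tuple[date, date]:
    """Illustrative Article 36 timetable: up to eight weeks from receipt of a
    consultation request for the Commissioner's written advice, extendable by
    a further six weeks in appropriate cases."""
    initial = request_received + timedelta(weeks=8)
    extended = initial + timedelta(weeks=6)
    return initial, extended

# A request received on 1 June 2018 could see advice arrive as late as
# 27 July 2018, or 7 September 2018 if the extension is applied.
print(consultation_deadlines(date(2018, 6, 1)))
```

Planned processing should therefore leave a comfortable margin beyond fourteen weeks, to allow time to consider and apply any advice received.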

The Information Commissioner’s role does not end there; she is not simply limited to giving written advice to the controller.  She can also become much more involved by conducting a data protection audit; issuing a formal warning that the intended processing is likely to infringe the provisions of the GDPR and even limit or prohibit (temporarily or indefinitely) a data controller from undertaking the proposed processing.


A failure by a controller to comply with its obligation to conduct a DPIA where one is required can attract an administrative fine of up to €10,000,000 or 2% of annual worldwide turnover (whichever is greater); as can a failure to consult the Information Commissioner where consultation is required under Article 36 of the GDPR. Failure to comply with an order limiting or prohibiting the processing (whether temporary or indefinite) can attract an administrative fine of up to €20,000,000 or 4% of annual worldwide turnover (whichever is greater).
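The "whichever is greater" formulation catches some controllers out: for a large undertaking the percentage figure, not the fixed sum, sets the ceiling. A short sketch of the calculation (the function name and tier labels are my own, for illustration):

```python
def max_admin_fine(annual_turnover_eur: float, higher_tier: bool = False) -> float:
    """Illustrative GDPR administrative fine ceiling.

    Lower tier (e.g. failure to conduct a required DPIA, or to consult where
    required): up to EUR 10m or 2% of annual worldwide turnover, whichever is
    greater. Higher tier (e.g. breach of an order limiting or prohibiting
    processing): up to EUR 20m or 4%, whichever is greater.
    """
    fixed_cap = 20_000_000 if higher_tier else 10_000_000
    turnover_cap = annual_turnover_eur * (0.04 if higher_tier else 0.02)
    return max(fixed_cap, turnover_cap)

# For a business turning over EUR 2bn, 2% of turnover (EUR 40m) exceeds the
# EUR 10m fixed figure, so the lower-tier ceiling is EUR 40m.
print(max_admin_fine(2_000_000_000))  # → 40000000.0
```

These are of course maximum figures; any fine actually imposed would reflect the factors in Article 83 of the GDPR.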

Can I undertake a DPIA when one is not required by the GDPR?

Yes you can; a properly completed DPIA will be of assistance to you in demonstrating that you are complying with the data protection principles. A DPIA on its own will usually be insufficient to comply completely with the Article 5(2) obligations (even where it is required by Article 35), but a properly completed DPIA is certainly something that you can produce to the Information Commissioner to help evidence that you are taking your data protection obligations seriously.

Alistair Sloan

If you require advice or assistance with Data Protection Impact Assessments or any other data protection matter then contact Alistair Sloan on 0141 229 0880 or by E-mail. Alistair can also assist with other aspects of information law.

Data Protection/Privacy Enforcement: April 2018

In April the Information Commissioner’s Office published a number of enforcement measures taken against public and private organisations under both the Data Protection Act 1998 (“DPA”) and the Privacy and Electronic Communications (EC Directive) Regulations 2003 (“PECR”).  The key points to draw from the enforcement action this month should be familiar to anyone who has been reading this series of blog posts since it began in September.

Key Points

  • It is important to keep track of personal data, especially when it is sensitive personal data; if it is to be sent out of the organisation ensure that it is properly secured and that a record of it being sent and received is kept.
  • Before sending out information to your customers it is important to consider whether the information you are sending is properly business information (or information you’re required to give by law), or whether it is actually promotional or marketing material. If it’s promotional or marketing material ensure that you only send it to the E-mail addresses of people who have consented to receive promotional or marketing material from you.
  • Make sure, before you conduct a marketing campaign by telephone, that you do not include numbers listed with the TPS unless you have the consent of the subscriber to contact them by phone for the purposes of direct marketing.
  • When disclosing information to someone, whether under FOI laws or not, ensure that you do not accidentally disclose personal or sensitive personal data of third parties where you do not have legal grounds to do so. Be especially careful with pivot tables; a number of public authorities have found themselves in regulatory hot water over the use of pivot tables. The ICO produced a helpful blog post in 2013 on the issue of pivot tables.
  • If you are an employee it is important that you remember that you should only be accessing personal data where you have a proper business need to do so and should only be disclosing personal data where you need to do so in order to properly perform your role. You can be held personally liable and find yourself being prosecuted in the criminal courts.

Enforcement action published by the ICO in April 2018

Humberside Police
The Information Commissioner, exercising her powers under section 55A of the DPA, served a Monetary Penalty Notice in the amount of £130,000 [pdf] for breaches of the DPA. The force conducted an interview of a person alleging that they had been the victim of rape, on behalf of Cleveland Police. The interview was filmed and three copies of it existed: the master and two copies. The discs were unencrypted. They were to be sent to Cleveland Police, but were never received by Cleveland Police. Humberside Police were unable to locate the discs or to confirm whether they had ever been posted to Cleveland Police. The Commissioner found that Humberside Police had failed to comply with the seventh data protection principle and also paragraph 9 of Schedule 1 to the DPA.

Royal Mail Group Limited
The Information Commissioner served a Monetary Penalty Notice on Royal Mail Group Limited for contravening Regulation 22 of PECR. The Monetary Penalty Notice was in the amount of £12,000 [pdf]. Royal Mail Group is the designated Universal Postal Service Provider in the UK and as such it has statutory responsibilities to disseminate certain information. Royal Mail Group Limited sent E-mails to all of its customers, including those who had opted not to receive electronic marketing, to notify them of a change in price for second class parcels purchased online. The price change was described as being a “promotional” one. The Commissioner found that this amounted to direct marketing rather than information that Royal Mail was obliged to provide under the Postal Services Act 2011 and was therefore in contravention of Regulation 22 of PECR.

The Royal Borough of Kensington and Chelsea
The Information Commissioner served a monetary penalty notice on the Royal Borough of Kensington and Chelsea in the amount of £130,000 [pdf] for breaches of the DPA. The breach arose out of a request for information made to the council pursuant to the Freedom of Information Act 2000. The Council answered the request for information by providing a pivot table to the requesters. The council did not properly redact the underlying information which was then accessible to the requesters without too much difficulty; the underlying information included personal data.

The Energy Saving Centre Limited
The Information Commissioner has served the Energy Saving Centre Limited with a Monetary Penalty Notice in the amount of £250,000 [pdf] and also with an Enforcement Notice [pdf] for contraventions of PECR.  The Commissioner had found that the Energy Saving Centre Limited had made tens of thousands of marketing calls to numbers which were listed with the Telephone Preference Service and where the individual subscribers to those numbers had not given consent to the Energy Saving Centre Limited to be contacted by phone for marketing purposes.  The Enforcement Notice requires the company to stop making unlawful calls – failure to comply with an Enforcement Notice is a criminal offence.

Approved Green Energy Solutions
The Information Commissioner has served a Monetary Penalty Notice [pdf] on an individual who traded as a sole trader under the name Approved Green Energy Solutions.  The amount of the penalty was £150,000. Approved Green Energy Solutions used a public telecommunications service to make in excess of 330,000 unsolicited telephone calls for the purpose of direct marketing where the line subscriber had listed their number with the Telephone Preference Service (“TPS”). The Commissioner and the TPS received 107 complaints directly from individuals affected.

A former receptionist/general assistant at Milton Keynes University Hospital NHS Foundation Trust has been prosecuted by the Information Commissioner after she inappropriately accessed the records of 12 patients when not required to do so in the course of her employment. The defendant entered a plea of guilty to offences of unlawfully accessing personal data and unlawfully disclosing personal data in breach of section 55 of the DPA. The defendant was fined a total of £300 and ordered to pay a £30 victim surcharge.

Alistair Sloan

If you require advice and assistance in connection with any of the data protection/privacy issues above, or any other Information Law matter, please do contact Alistair Sloan on 0141 229 0880 or by sending him an E-mail directly. You can also follow our dedicated information law Twitter account.

The Information Commissioner’s power to compel information

The Information Commissioner is presently undertaking an investigation into the possible unlawful use of personal data, in particular, data analytics, by political parties and political campaigning organisations.  The most high profile activity that the Commissioner has undertaken in respect of that investigation has to be the obtaining and execution of a warrant to search the offices of Cambridge Analytica.  As part of that investigation it has been reported that a number of persons and organisations involved in politics have been served with Information Notices by the Information Commissioner, including the United Kingdom Independence Party (UKIP), Leave.EU and Arron Banks.

An Information Notice is a formal investigative tool which the Information Commissioner can use in order to gather information.  Her power to issue such notices, in respect of the processing of personal data, is to be found in section 43 of the Data Protection Act 1998.  There are two circumstances in which the Commissioner can issue an Information Notice:  (1) when conducting an assessment pursuant to section 42 of the Data Protection Act 1998; and (2) where the Commissioner reasonably requires any information for the purpose of determining whether the data controller has complied or is complying with the data protection principles.  Broadly speaking this means that the Commissioner can issue an Information Notice either when her office is conducting an investigation at the request of a data subject or an investigation undertaken by her office which has been instigated by the Commissioner herself.

An Information Notice is simply a document which requires the data controller concerned to provide the Commissioner with information specified within the notice relating to the section 42 request or the controller’s compliance with the data protection principles.  However, its simplicity obscures its formality.  The issuing of an Information Notice is a formal step, and is a serious one for the recipient of the notice.  There is an automatic right of appeal against the notice or any part of the notice to the First-Tier Tribunal (Information Rights).  The right of appeal exists precisely because of its formality and the consequences for not complying with the notice.  It has been reported that UKIP has appealed the Information Notice served on it to the Tribunal.

An Information Notice is more than a polite request for information; it is a formal demand for information which is backed up by the threat of sanctions. It is a criminal offence to fail to comply with an Information Notice, which can result, on conviction, in a fine. Furthermore, it is a criminal offence to (i) make a statement in response to an Information Notice which is known to be false; or (ii) recklessly make a false statement in response to an Information Notice.

When serving an Information Notice, the Commissioner can specify or describe the information required by her or can be broader and instead specify or describe categories of information that she requires from the data controller.  There are some restrictions though on the information that the Commissioner can require a data controller to provide her with.  A data controller is not required to furnish the Commissioner with (a) “any communication between a professional legal adviser and his client in connection with the giving of legal advice to the client with respect to the person’s obligations, liabilities or rights under [the Data Protection Act 1998]”, or (b) “any communication between a professional legal adviser and his client, or between such an adviser or his client and any other person, made in connection with or in contemplation of proceedings under or arising out of [the Data Protection Act 1998] (including proceedings before the Tribunal) and for the purposes of such proceedings.”

A data controller can also refuse to provide information which would reveal evidence of the commission of any offence.  However, there are some exceptions to this general exception; if the offence is an offence under the Data Protection Act 1998 or offences under certain statutory provisions concerning the giving of false evidence, then the data controller may still be required to provide the Commissioner with that information.

The serving of an Information Notice on a data controller is a significant step by the Commissioner and it is one that data controllers should not take lightly. The consequences of failing to comply with the notice, or of deliberately or recklessly misleading the Commissioner through the provision of false information, can see the data controller facing criminal charges. The Notice can be challenged through the First-Tier Tribunal (Information Rights), which could see part or all of the notice reduced or quashed. The Data Protection Bill contains provisions in relation to Information Notices which are for the most part identical to the powers found within the Data Protection Act 1998, and so the Commissioner will continue to possess this potentially powerful tool once the GDPR becomes a reality next month (subject, of course, to the Data Protection Bill completing its passage through Parliament and receiving Royal Assent in time).

Alistair Sloan

If you are facing an investigation by the Information Commissioner in respect of alleged failures to comply with privacy and data protection law, or if you require advice on any other information law matter, you can contact Alistair Sloan on 0141 229 0880. Alternatively you can contact him directly by E-mail. We also have a dedicated information law Twitter account which you can follow.

NT1 and NT2: Forgetting past misdemeanors

The so-called ‘right to be forgotten’ (hereafter “RTBF”) is an often trumpeted aspect of the GDPR; it is an important right, but one that is rather more restricted in nature than is commonly understood. The RTBF is not a new right within the GDPR, but has its foundation in current data protection law and practice. On 13 May 2014, the Grand Chamber of the Court of Justice of the European Union gave its judgment in Google Spain SL, Google Inc. v Agencia Española de Protección de Datos, Mario Costeja González (“Google Spain”), which is popularly said to have created a ‘right to be forgotten’. The court did not, in fact, grant a right to be forgotten; instead, it required search engines, such as Google, to consider requests from individuals to have links to webpages concerning them de-listed from search results in certain circumstances.

Fast forward to 13th April 2018, when Mr Justice Warby handed down his judgment in NT1 & NT2 v Google LLC [2018] EWHC 799 (QB); cases which both concerned the RTBF. NT1 and NT2 are both businessmen who were convicted of criminal offences. NT1 was involved in a controversial property business in the late 1980s and the early 1990s (while in his thirties). In the late 1990s, while he was in his forties, NT1 was prosecuted and convicted, after trial, of a criminal conspiracy connected with those business activities. He was sentenced to a period of imprisonment and his conviction has since become “spent”. In addition to the matters for which he was convicted, he was also accused of, but never prosecuted for, a separate conspiracy connected with the same business. Some of the business’s former staff were eventually convicted in relation to that separate conspiracy. There was media reporting of these and related matters at the time, and links to that reporting are made available by Google in its search results. On 28 June 2014, not long after the CJEU’s decision in Google Spain, NT1 made a de-listing request to Google in respect of six links. Google agreed to block one link, but not the other five, and stood by its position when NT1 asked it to reconsider. In January 2015, a second de-listing request was made by NT1, this time through his solicitors; Google refused it in April 2015.

NT2’s case is quite separate from that of NT1; the two claims were tried separately, but were heard one after the other and involved the same judge and the same representation. NT2’s case has some similarity in terms of its facts and it raises similar issues of principle to that of NT1. While in his forties, and sometime in the early 21st century, NT2 was involved in a controversial business which experienced public opposition in relation to its environmental practices. NT2 pleaded guilty to two charges of conspiracy in connection with that business. This was “rather more than ten years ago” [para 7]. NT2 received a short prison sentence and spent six weeks in custody before being released; his conviction also became spent. On 14 April 2015, NT2 made a de-listing request to Google in respect of eight links. Google declined to de-list any of the links.

Ultimately, NT2 was successful in obtaining orders requiring Google to de-list while NT1 was unsuccessful.

Journalism, literature and art exemption

Google had, in its defence to these claims, sought to place reliance upon the exemption in section 32 of the Data Protection Act 1998, which relates to “journalism, literature and art”.  Warby J deals with this aspect of Google’s defence to the claims by the claimants in paragraphs 95-102 of the judgment.  Warby J ultimately rejected Google’s reliance upon section 32 holding that the exemption did not apply in the first place; but even if it did, Google would have failed to meet the part of the test which is contained in section 32(1)(b).  Warby J accepted that the EU law concept of journalism was a broad and elastic one which went beyond simply the activities of media undertakings and incorporates other activities which have as their aim the disclosure to the public of information, opinions and ideas. However, Warby J concluded that “the concept [of journalism] is not so elastic that it can be stretched to embrace every activity that has to do with conveying information or opinions. To label all such activity as “journalism” would be to elide the concept of journalism with that of communication.”

In Google Spain the CJEU was sceptical as to whether the exemption in Article 9 of the Directive (which is implemented through section 32 of the Data Protection Act 1998) would apply to an internet search engine such as Google. Warby J noted that this observation by the CJEU was not integral to its decision in Google Spain, but concluded that “it is true”. Internet search engines do not, in the view of Warby J, process personal data “only” for the purposes of journalism, literature or art.

In considering section 32 of the Data Protection Act 1998 Warby J concluded that there is a subjective and an objective element to each of section 32(1)(b) and (c).  In relation to section 32(1)(b) Warby J concluded that the data controller had to have a subjective belief that the publication of the personal data in question would be in the public interest and this belief must be objectively reasonable.  In respect of section 32(1)(c), Warby J considered that the data controller must prove that it had a subjective belief that compliance with the data protection principle(s) engaged would be incompatible with the special purpose and that belief must be one which is objectively reasonable.

Warby J explained in his judgment that if he was wrong in his conclusion that section 32 was not even engaged in this case, he would still have rejected Google’s reliance upon it, concluding that Google would have failed when it came to the test in section 32(1)(b). There was no evidence, Warby J concluded, that “anyone at Google ever gave consideration to the public interest in continued publication of the URLs complained of, at any time before NT1 complained” [para 102].

Schedule 3 of the Data Protection Act 1998

Clearly a great deal of the personal data at issue in these claims, being personal data relating to criminal convictions, is sensitive personal data (see section 2 of the Data Protection Act 1998).  In order for processing of sensitive personal data to be in compliance with the first data protection principle, which requires personal data to be processed fairly and lawfully, the data controller must be able to rely upon one of the conditions in Schedule 3 to the Data Protection Act 1998 (in addition to one of the Schedule 2 conditions).  This is an area where Google had a great deal of difficulty.

Warby J rejected most of the Schedule 3 grounds that Google sought to rely upon (see paras 107-109). However, in paragraph 110 of his decision, Warby J decided that condition 5 in Schedule 3 was satisfied: that “the information contained in the personal data has been made public as a result of steps deliberately taken by the data subject.” In reaching this conclusion, Warby J relied upon the decision of Stephens J in Townsend v Google Inc [2017] NIQB 81. In Townsend, Stephens J concluded that, as a consequence of the principle of open justice, when an offender commits an offence, even in private, he deliberately makes that information public (see para 65 of Townsend). In NT1 and NT2, counsel for the claimants, Hugh Tomlinson QC, took issue with the conclusions of Stephens J; counsel’s arguments are set out briefly by Warby J towards the end of paragraph 110. Warby J concluded that, in his view, the reasoning of Mr Tomlinson was not sound.

I must confess that I have a great deal of difficulty with the reasoning of Warby J and Stephens J on this point. I struggle to see how the commission of an offence by an individual amounts to them taking positive steps to make the information public. The conclusions of Warby J and Stephens J do not seem to me to fit with the statutory language in the Data Protection Act 1998, nor the language of the Directive which it implements. Warby J considered that the language in Article 8.2(e) of the Data Protection Directive is “obscure”. It seems to me that the language of the Directive is the complete antithesis of “obscure”, and that condition 5 of Schedule 3 does not adequately implement the requirements of the Directive in this regard. The only UK jurisdiction yet to grapple with this issue is Scotland. Neither the Northern Irish nor the English and Welsh court decisions are from appellate level courts. For the time being we have two first instance courts in two jurisdictions reaching the same conclusion; that will undoubtedly be considered somewhat persuasive by other first instance judges.

The balancing exercise

The court in Google Spain required a balancing exercise to take place between the rights within the European Convention on Human Rights to a private and family life (Article 8) and freedom of expression (Article 10).  Following Google Spain the ‘Article 29 Working Party’ (soon to become the European Data Protection Board) issued guidance on the Google Spain decision.  These guidelines provide helpful assistance, but do not prescribe the factors which are to be taken into consideration; it is acceptable to go beyond the factors in the guidance [para 135].

In respect of NT1, Warby J attached some weight to the conduct of the Claimant post-conviction; in particular, NT1 had caused misleading statements about his character and integrity to be published about him on the internet (by a reputation management company known in the judgment by the fictitious name of ‘cleanup’):  NT1 had been convicted of a substantial offence of dishonesty and had received a substantial prison sentence for it.  This can be contrasted with NT2, who had not been convicted of an offence of dishonesty, had entered a plea of guilty and had shown remorse.

The contrast is an interesting one because while each case will inevitably turn on its own facts, it shows the kind of issues that the court is likely to take into consideration when balancing the competing Article 8 and 10 rights.

Interaction between the Rehabilitation of Offenders Act and the Data Protection Act 1998

The Rehabilitation of Offenders Act 1974 (“ROA”) differs in Scotland from what is in force in England and Wales; of course, these claims deal with the ROA as it applies in England and Wales.  The differences in the substance of the Act do not, however, affect the principles which are in play when looking at the interaction between the ROA and data protection law.

The ROA creates a, somewhat limited, right to rehabilitation and Warby J concluded that this right to rehabilitation is an aspect of privacy law.  Warby J concluded that “[t]he rights and interests protected include the right to reputation, and the right to respect for family life and private life, including unhindered social interaction with others.” Furthermore, Warby J concluded that “[u]pholding the right [to rehabilitation] also tends to support a public or societal interest in the rehabilitation of offenders.”  Importantly though, the right to rehabilitation is a qualified right.  As with most cases involving rights, the rights of the offender to rehabilitation do come into conflict with the rights of others, in particular their rights to information and freedom of expression.

As a starting point, a person who is party to legal proceedings held in public (such as the accused in a criminal trial) does not have a reasonable expectation of privacy.  However, there may well come a point in time when they can have such an expectation.  The ROA works to prevent the disclosure of certain criminal offences for which a person has been convicted after a specified period of rehabilitation.  It does not, Warby J concluded, mean that in 1974 Parliament legislated for a right to privacy or confidentiality from the point at which the offence became “spent”.

The rehabilitated offender’s right to a family and private life in respect of a spent conviction will normally be a weighty factor against further use or disclosure of that information; however, it is not a conclusive factor.  The “balancing exercise will involve an assessment of the nature and extent of any actual or prospective harm. If the use or disclosure causes, or is likely to cause, serious or substantial interference with private or family life that will tend to add weight to the case for applying the general rule.” [para 166]

Paragraph 166 of Warby J’s judgment is well-worth reading in full for anyone who is involved in balancing exercises of this nature.

At the end of the day, de-indexing (or de-listing) from internet search results does not cause the information to disappear completely.  The effect that it has is to make the information more difficult to find.  It will still be possible for a person, with sufficient determination, to discover and access the information.  In the modern world we are used to being able to put search terms into Google (and other search engines) and have millions, if not billions, of results returned to us in a fraction of a second.  The search engines have developed algorithms which help to bring the content that is seemingly most relevant to the top of those results, with the seemingly least relevant placed at the end of the long list.  Information is much more readily available than it was in 1974; some might argue that cases such as NT1 and NT2 simply return the position to something which more closely resembles 1974.

It is quite probable that we will begin to see cases like NT1 and NT2 arise more frequently.  The qualified right to erasure within the GDPR has attracted a lot of attention and individuals are certainly more aware of ‘the right to be forgotten’.  The GDPR arguably doesn’t take us forward from what was determined in Google Spain, but simply gives it a statutory basis as opposed to one that is derived mostly from case law.  The qualified right to erasure within the GDPR is, as noted above, often overstated and this will inevitably, in the event that people seek to enforce it more frequently, lead to disputes between controllers and data subjects.

Alistair Sloan

Should you require advice or assistance about UK Data Protection and Privacy law then contact Alistair Sloan on 0141 229 0880.  You can also contact him by E-mail.  You can also follow our dedicated Twitter account covering all Information Law matters:  @UKInfoLaw

Data Protection/Privacy Enforcement: March 2018

Probably the most high profile piece of enforcement action taken by the Information Commissioner’s Office in March was its application for, and execution of, a warrant to enter and inspect the offices occupied by Cambridge Analytica as part of the Commissioner’s wider investigation into the use of personal data in politics.  It would seem that data protection warrants get more people excited about data protection than would ordinarily be the case. The Cambridge Analytica warrant was not the only warrant that the Commissioner obtained and executed in March; the Commissioner’s website also published details of a warrant that it executed in Clydebank (near Glasgow).  This warrant was directed towards alleged breaches of the Privacy and Electronic Communications (EC Directive) Regulations 2003 which deal with, insofar as this blog is concerned, the rules concerning direct marketing to individuals by electronic means.

Key Points

  • Care needs to be taken when looking at sharing personal data on a controller-to-controller basis with other companies, including separate companies within the same group of companies. Data controllers need to ensure that they identify what their lawful basis for processing is, provide adequate fair processing information to data subjects in relation to such sharing of personal data and ensure that any changes to their policy in respect of data-sharing do not result in that sharing being for a purpose that is incompatible with those stated at the time of collection.
  • If you, as an individual (whether or not you are yourself a data controller), unlawfully disclose personal data to third parties then you could be liable for prosecution.

Enforcement Action published by the ICO during March 2018

WhatsApp Inc.
An undertaking was given by WhatsApp Inc. In it, WhatsApp undertook, among other things, not to transfer personal data concerning users within the EU to another Facebook-controlled company on a controller-to-controller basis until the General Data Protection Regulation becomes applicable on 25th May 2018.  The undertaking was given after WhatsApp introduced new terms and conditions and a new privacy policy which affected how it processed personal data held by it; in particular, how it would now share personal data with other Facebook-controlled companies.

A former housing worker was convicted at St. Albans Crown Court after he shared a confidential report identifying a potentially vulnerable victim. The defendant was convicted of three charges of unlawfully obtaining and disclosing personal data contrary to section 55 of the Data Protection Act 1998.  He was fined £200 for each charge and was ordered to pay £3,500 in costs.

Alistair Sloan

Should you require advice or assistance about UK Data Protection and Privacy law then contact Alistair Sloan on 0141 229 0880.  You can also contact him by E-mail.  You can also follow our dedicated Twitter account covering all Information Law matters:  @UKInfoLaw

The Law Enforcement Directive: Data Subjects’ Rights (Part 1)

Earlier this month I wrote a blog post providing an introduction to the Law Enforcement Directive (“LED”); in that post I indicated that I would look separately at the rights of data subjects under the LED.  I had anticipated that I would do this earlier on in the month, but then came Cambridge Analytica and the Information Commissioner’s power to obtain a search warrant.  This is part 1 of my look at the rights of data subjects under the LED and will focus on the rights in Articles 13-16 of the LED.

Part 3 of the Data Protection Bill will implement the provisions of the LED in the UK.  Clauses 43 to 54 of the Bill (as the Bill presently stands) make provisions in respect of the rights of data subjects under Part 3.   The rights within the Data Protection Bill are derived from the LED itself, which is very much based upon the rights contained within the General Data Protection Regulation.  Chapter III of the LED sets out the rights which Member States must make available to data subjects where personal data is being processed for the law enforcement purposes.

Information to be made available, or given, to the data subject
Article 13 of the LED makes certain provisions in relation to the information that controllers, who are processing personal data for the law enforcement purposes, should normally make available to data subjects.  The provisions of Article 13 are contained within clause 44 of the Data Protection Bill (although I make reference to the LED’s Articles, it should be kept in mind that the LED is a Directive rather than a Regulation and therefore does not have direct effect.  It will be the domestic provisions upon which data subjects will rely in their dealings with the competent authorities, the Information Commissioner and the domestic courts, rather than the LED’s Articles).

Controllers who are processing personal data for the law enforcement purposes are to make the following information available:

  • The identity and contact details of the controller;
  • The contact details of the data protection officer (where there is one);
  • The purposes for which the controller processes personal data;
  • The existence of the data subject’s rights to (i) subject access; (ii) rectification;  (iii) erasure of personal data or the restriction of its use; and (iv) to make a complaint to the Information Commissioner;
  • Information about the period for which the personal data will be stored or, where that is not possible, about the criteria used to determine that period;
  • Where applicable, information about the categories of recipients of the personal data (including recipients in third countries or international organisations); and
  • Where necessary, further information to enable the exercise of the data subject’s rights under Part 3, in particular where the personal data are collected without the knowledge of the data subject.

Controllers can restrict the level of information that is provided to the data subject in order to: (a) avoid obstructing official or legal inquiries, investigations or procedures; (b) avoid prejudicing the prevention, detection, investigation or prosecution of criminal offences or the execution of criminal penalties; (c) protect public security; (d) protect national security; or (e) protect the rights and freedoms of others.

This right to information will not be unfamiliar to anyone who is familiar with the provisions of the GDPR; however, it’s not surprising that the right is limited to a degree to take account of the nature of the personal data that falls to be dealt with under the LED and Part 3 of the Data Protection Bill.

Subject Access
The right of subject access remains a fundamental aspect of data protection law emanating from the European Union.  I have previously looked at the right of subject access within the General Data Protection Regulation on this blog.  The right is of such fundamental importance that it appears within the LED; Articles 14 and 15 of the LED cover the right of subject access, and this aspect of the LED is to be given effect to by clause 45 of the Data Protection Bill (as it currently stands).

If you are familiar with the right of subject access under the current Data Protection Act 1998 and/or the General Data Protection Regulation, then nothing much within Articles 14 and 15 and clause 45 will surprise you.  The right of subject access within the LED and Part 3 of the Data Protection Bill provides the data subject with the same rights as they have under the GDPR.  A Subject Access Request (SAR) must be complied with within one month and no fee can generally be charged for dealing with it.

The controller can restrict the data subject’s right to subject access and these provisions are presently found within clause 45(4) of the Data Protection Bill.  The controller can restrict the data subject’s right to the extent and for so long as it is a necessary and proportionate measure to: (a) avoid obstructing an official or legal inquiry, investigation or procedure; (b) avoid prejudicing the prevention, detection, investigation or prosecution of criminal offences or the execution of criminal penalties; (c) protect public security; (d) protect national security; or (e) protect the rights and freedoms of others.  In determining whether the restriction is a necessary and proportionate measure the controller must have regard to the fundamental rights and legitimate interests of the data subject.

Where a data subject’s right to subject access under Part 3 of the Data Protection Bill is to be restricted, the Bill (in its current form) requires the data subject to be given information relating to the restriction, except to the extent that providing such information would undermine the purpose of the restriction.  For example, if an individual who was being investigated by the police for fraud made a Subject Access Request, the police would be entitled to restrict the data subject’s rights insofar as they related to that investigation, and the police would be able to do so without telling the data subject that their subject access rights have been restricted.

The next part will look at the right to restriction of processing; the right to erasure and the data subject’s rights in relation to automated processing in the context of the LED and Part 3 of the Data Protection Bill.  Remember, the LED is due to be implemented by 6th May 2018, which is almost 3 weeks before the date upon which the GDPR becomes applicable.

Alistair Sloan

If you require any advice and assistance with matters relating to the Law Enforcement Directive or any other Privacy/Data Protection legal matter then contact Alistair Sloan on 0141 229 0880 or send him an E-mail.  You can follow Inksters’ dedicated Information Law Twitter account:  @UKInfoLaw

The Information Commissioner’s Powers of Entry and Inspection

Yesterday I wrote a blog post looking at data subjects’ rights and lessons for controllers arising out of the Cambridge Analytica and Facebook privacy matter.  In that blog post I briefly mentioned the Information Commissioner’s powers of entry and search, after the Commissioner announced that she was seeking a warrant to enter and search Cambridge Analytica’s premises.   In this blog post I will look at the Commissioner’s powers of entry and search in a bit more detail.

As noted yesterday, the Commissioner’s powers of entry and search are contained in Schedule 9 to the Data Protection Act 1998.  Schedule 9 sets out the circumstances in which a judge can grant a warrant to the Information Commissioner.  The judge considering the application must be satisfied, based on statements made on oath, that there are reasonable grounds for suspecting that (a) a data controller has contravened or is contravening any of the data protection principles, or (b) an offence under the Data Protection Act has been or is being committed, and that evidence of the contravention or of the commission of the offence is to be found on any premises specified in the information supplied by the Commissioner.

The Commissioner is generally required, by the terms of Schedule 9 to the Data Protection Act 1998, to jump through some hoops before the judge considering the warrant application can grant the warrant to the Commissioner.  Paragraph 2 of Schedule 9 requires that the judge considering the application be satisfied of a number of other things:

  1. that the Commissioner has given seven days’ notice in writing to the occupier of the premises in question demanding access to the premises, and
  2. that either (i) access was demanded at a reasonable hour and was unreasonably refused, or (ii) although entry to the premises was granted, the occupier unreasonably refused to comply with a request by the Commissioner or any of the Commissioner’s officers or staff to permit the Commissioner or the officer or member of staff to do any of the things she would be entitled to do if she had a warrant (see below); and
  3. that the occupier, has, after the refusal, been notified by the Commissioner of the application for the warrant and has had an opportunity of being heard by the judge on the question whether or not it should be issued.

Where the judge is satisfied that the case is one of urgency or that compliance with those provisions would defeat the object of the entry, the judge does not need to be satisfied of the three things listed above.  In this case, given that the Commissioner announced her intention to apply for a warrant on national television, it is likely that a judge will require to be satisfied of the three conditions listed above.

Who considers an application by the Commissioner for a warrant depends upon the jurisdiction in which the warrant is being applied for.  In England and Wales a District Judge (Magistrates’ Courts) or a Circuit Judge has the power to grant the warrant; in Scotland it is the Sheriff and in Northern Ireland it is a County Court Judge.

A warrant granted under Schedule 9 of the Data Protection Act 1998 gives the Commissioner the power to do a number of things; these things can be found in paragraph 1(3) of the Schedule and are:

  1. to enter the premises;
  2. to search the premises;
  3. to inspect, examine, operate and test any equipment found on the premises which is used or intended to be used for the processing of personal data;
  4. to inspect and seize any relevant documents or other material found on the premises;
  5. to require any person on the premises to provide an explanation of any document or other material found on the premises;
  6. to require any person on the premises to provide such other information as may reasonably be required for the purpose of determining whether the data controller has contravened, or is contravening, the data protection principles.

The warrant must be executed at a reasonable hour, unless it appears to the person executing it that there are grounds for suspecting that the object of the warrant would be defeated if it were so executed, and within 7 days of the date of issue.  It allows the Commissioner, her officers and staff to use reasonable force to execute the warrant.

There are lots of other really boring and technical requirements which I won’t go into; the last thing I will mention is the terms of paragraph 12 of Schedule 9, which makes it an offence to: (i) intentionally obstruct a person in the execution of a warrant issued under Schedule 9; (ii) fail, without reasonable excuse, to give any person executing such a warrant such assistance as he may reasonably require for the execution of the warrant; (iii) make a statement in response to a requirement to provide information (see 5 and 6 in the list of powers the warrant gives the Commissioner) which that person knows to be false in a material respect; and (iv) recklessly make a statement in response to such a requirement which is false in a material respect.

The Commissioner does get warrants from time to time; for example, earlier this month the ICO executed search warrants in relation to two properties in Greater Manchester as part of an investigation into companies suspected of sending text messages in contravention of the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR).  The provisions of Schedule 9 to the Data Protection Act 1998 apply to PECR by virtue of Regulation 31 of PECR.

Alistair Sloan

If you are a data controller or an individual who is looking for advice and assistance with any aspect of data protection or privacy law, then you can contact Alistair Sloan on 0345 450 0123 or 0141 229 0880.  Alternatively, you can send him an E-mail.

Data Protection, Facebook and Cambridge Analytica

We know that the Information Commissioner is investigating the circumstances surrounding the obtaining of personal data of a considerable number of individuals by Cambridge Analytica.  Cambridge Analytica is a data analytics company that is in the midst of what can only be described as a data protection and privacy scandal.

There are a number of significant allegations being made against Cambridge Analytica about how it obtains and processes personal data.  The Information Commissioner has also revealed that Cambridge Analytica is not cooperating with her investigation, to the extent that she is going to apply for a warrant to enter and search its premises.  This means that, in all probability, the Commissioner has already sought access and it has been refused.  Schedule 9 to the Data Protection Act 1998 sets out the Information Commissioner’s powers of entry and inspection; it permits the Commissioner to obtain a warrant from the court where the court is satisfied that a data controller has contravened or is contravening any of the data protection principles, or that an offence under the Act has been or is being committed, and that evidence of the contravention or of the commission of the offence is to be found on any premises specified.

This story is moving at quite a pace and is constantly changing with new revelations coming to light; it’s also the subject of an investigation by the Information Commissioner and there is the possibility that the company might face prosecution for offences under Section 55 of the Data Protection Act 1998 depending upon what the Commissioner finds during the course of her investigation.  I am therefore going to try and keep this blog post broad and theoretical rather than trample upon the toes of a live regulatory investigation.

A data controller has a duty to comply with the data protection principles in relation to all of the personal data for which they are the controller, subject to certain specified exemptions set out in statute.  The first data protection principle requires that personal data be “processed fairly and lawfully”; this requires the data controller to meet one or more of the conditions set out in Schedule 2 to the Data Protection Act 1998 (and, in respect of sensitive personal data, a condition in Schedule 3 also requires to be satisfied).

What can individuals do if they are concerned about whether Cambridge Analytica has any personal data concerning them and what it has been doing with it?  Data subjects have a number of rights under the Data Protection Act 1998 and the cornerstone of those rights is the right of subject access.  This is currently given effect to in section 7 of the Data Protection Act 1998 and is not simply about getting copies of the personal data being processed by a data controller:  it consists of a whole suite of rights, of which getting a copy of the personal data is only one aspect.  Under the current law, data controllers are entitled to charge a fee up to a prescribed maximum for dealing with such requests; a request of this nature would attract a fee of £10, but many individuals might well think that this is a price worth paying to know if and how they have been affected by this issue.  Data controllers have up to 40 days in which to comply with a subject access request.  Some key changes to the right of subject access will come into effect on 25th May 2018, but for now the law contained within the Data Protection Act 1998 is still applicable.

Once you have the response to your subject access request your rights do not end there; once you’ve established what a data controller is processing about you, what they’re doing with it and where they got it from there are a number of other steps that you might be able to take, such as requiring them to cease processing your personal data, complaining to the Information Commissioner or making a claim for compensation.

For data controllers, what is currently unfolding should be seen as an important lesson.  Data can be a useful tool to a business; whether it is being used for targeted marketing campaigns or to work out what consumers want from products and services in your market.  However, there are laws governing data protection and privacy and at the heart of those laws are the principles of fairness and transparency.  Controllers need to be careful as to how they obtain personal data, where they obtain it from, what they do with it and be certain that they have a lawful basis for processing that personal data in the ways that they want to do so; that may be because you have the consent of the data subject, because you have a legitimate interest in the processing or some other lawful ground for processing.  Don’t forget the Privacy and Electronic Communications (EC Directive) Regulations 2003 when conducting direct marketing by electronic means.

Simply because a person has made their personal data available, for example through social media, does not mean that it is free to be used by whomever and for whatever purpose they want.  The principles of the Data Protection Act 1998 still apply, and the reputational damage that can be suffered may well vastly outweigh any regulatory action taken by the Information Commissioner or by data subjects themselves.

Alistair Sloan

If you are a data controller or an individual who is looking for advice and assistance with any aspect of data protection or privacy law, then you can contact Alistair Sloan on 0345 450 0123 or 0141 229 0880.  Alternatively, you can send him an E-mail.