The Information Commissioner’s power to compel information

The Information Commissioner is presently undertaking an investigation into the possible unlawful use of personal data, and in particular data analytics, by political parties and political campaigning organisations.  The most high-profile step the Commissioner has taken in that investigation has been the obtaining and execution of a warrant to search the offices of Cambridge Analytica.  As part of that investigation it has been reported that a number of persons and organisations involved in politics have been served with Information Notices by the Information Commissioner, including the United Kingdom Independence Party (UKIP), Leave.EU and Arron Banks.

An Information Notice is a formal investigative tool which the Information Commissioner can use in order to gather information.  Her power to issue such notices, in respect of the processing of personal data, is to be found in section 43 of the Data Protection Act 1998.  There are two circumstances in which the Commissioner can issue an Information Notice:  (1) when conducting an assessment pursuant to section 42 of the Data Protection Act 1998; and (2) where the Commissioner reasonably requires any information for the purpose of determining whether the data controller has complied or is complying with the data protection principles.  Broadly speaking, this means that the Commissioner can issue an Information Notice either when her office is conducting an investigation at the request of a data subject or when it is conducting an investigation instigated by the Commissioner herself.

An Information Notice is simply a document which requires the data controller concerned to provide the Commissioner with information specified within the notice relating to the section 42 request or the controller’s compliance with the data protection principles.  However, its simplicity obscures its formality.  The issuing of an Information Notice is a formal step, and a serious one for the recipient of the notice.  There is an automatic right of appeal against the notice, or any part of it, to the First-Tier Tribunal (Information Rights).  The right of appeal exists precisely because of the notice’s formality and the consequences of not complying with it.  It has been reported that UKIP has appealed the Information Notice served on it to the Tribunal.

An Information Notice is more than a polite request for information; it is a formal demand for information which is backed up by the threat of sanctions.  Failing to comply with an Information Notice is a criminal offence which can result, on conviction, in a fine.  Furthermore, it is a criminal offence to (i) make a statement in response to an Information Notice which is known to be false; or (ii) recklessly make a false statement in response to an Information Notice.

When serving an Information Notice, the Commissioner can specify or describe the information she requires, or can be broader and instead specify or describe categories of information that she requires from the data controller.  There are, though, some restrictions on the information that the Commissioner can require a data controller to provide her with.  A data controller is not required to furnish the Commissioner with (a) “any communication between a professional legal adviser and his client in connection with the giving of legal advice to the client with respect to the person’s obligations, liabilities or rights under [the Data Protection Act 1998]”, or (b) “any communication between a professional legal adviser and his client, or between such an adviser or his client and any other person, made in connection with or in contemplation of proceedings under or arising out of [the Data Protection Act 1998] (including proceedings before the Tribunal) and for the purposes of such proceedings.”

A data controller can also refuse to provide information which would reveal evidence of the commission of any offence.  However, there are some exceptions to this general rule: if the offence is an offence under the Data Protection Act 1998, or an offence under certain statutory provisions concerning the giving of false evidence, then the data controller may still be required to provide the Commissioner with that information.

The serving of an Information Notice on a data controller is a significant step by the Commissioner and one that data controllers should not take lightly.  The consequences of failing to comply with the notice, or of deliberately or recklessly misleading the Commissioner through the provision of false information, can see the data controller facing criminal charges.  The notice can be challenged through the First-Tier Tribunal (Information Rights), which could see part or all of it reduced or quashed.  The Data Protection Bill contains provisions in relation to Information Notices which are for the most part identical to the powers found within the Data Protection Act 1998, and so the Commissioner will continue to possess this potentially powerful tool once the GDPR becomes a reality next month (subject, of course, to the Data Protection Bill completing its passage through Parliament and receiving Royal Assent in time).

Alistair Sloan

If you are facing an investigation by the Information Commissioner in respect of alleged failures to comply with privacy and data protection law, or if you require advice on any other information law matter you can contact Alistair Sloan on 0141 229 0880.  Alternatively you can contact him directly by E-mail.  We also have a dedicated information law twitter account which you can follow.

NT1 and NT2: Forgetting past misdemeanours

The so-called ‘right to be forgotten’ (hereafter “RTBF”) is an often trumpeted aspect of the GDPR; it is an important right, but one that is rather more restricted in nature than is commonly understood.  The RTBF is not a new right within the GDPR, but has its foundation in current data protection law and practice.  On 13 May 2014, the Grand Chamber of the Court of Justice of the European Union gave its judgment in Google Spain SL, Google Inc. v Agencia Española de Protección de Datos, Mario Costeja González (“Google Spain”), which has popularly been said to have created a ‘right to be forgotten’.  The court did not, in fact, grant a right to be forgotten; instead, it required search engines, such as Google, to consider requests from individuals to have links to webpages concerning them de-listed from Google search results in certain circumstances.

Fast forward to 13th April 2018, almost four years after the decision in Google Spain, when Mr Justice Warby handed down his judgment in NT1 & NT2 v Google LLC [2018] EWHC 799 (QB); cases which both concerned the RTBF.  NT1 and NT2 are both businessmen who were convicted of criminal offences.  NT1 was involved in a controversial property business in the late 1980s and the early 1990s (while in his thirties).  In the late 1990s, while he was in his forties, NT1 was prosecuted and convicted, after trial, of a criminal conspiracy connected with those business activities.  He was sentenced to a period of imprisonment and his conviction has since become “spent”.  In addition to the matters for which he was convicted, he was also accused of, but never prosecuted for, a separate conspiracy connected with the same business.  Some of the business’s former staff were eventually convicted in relation to that separate conspiracy.  There was media reporting of these and related matters at the time, and links to that reporting are made available by Google in its search results.  On 28 June 2014, not long after the CJEU’s decision in Google Spain, NT1 made a de-listing request to Google in respect of six links.  Google agreed to block one link, but not the other five, and stood by its position when NT1 asked it to reconsider.  In January 2015, a second de-listing request was made by NT1, this time through his solicitors; Google refused that request in April 2015.

NT2’s case is quite separate from that of NT1; the two claims were tried separately, but were heard one after the other, before the same judge and with the same representation.  NT2’s case has some factual similarity to NT1’s and raises similar issues of principle.  Sometime in the early 21st century, while in his forties, NT2 was involved in a controversial business which experienced public opposition in relation to its environmental practices.  NT2 pleaded guilty to two charges of conspiracy in connection with that business; this was “rather more than ten years ago” [para 7].  NT2 received a short prison sentence and spent six weeks in custody before being released; his conviction also became spent.  On 14 April 2015, NT2 made a de-listing request to Google in respect of eight links.  Google declined to de-list any of them.

Ultimately, NT2 was successful in obtaining orders requiring Google to de-list while NT1 was unsuccessful.

Journalism, literature and art exemption

Google had, in its defence to these claims, sought to place reliance upon the exemption in section 32 of the Data Protection Act 1998, which relates to “journalism, literature and art”.  Warby J deals with this aspect of Google’s defence in paragraphs 95-102 of the judgment.  He ultimately rejected Google’s reliance upon section 32, holding that the exemption did not apply in the first place; but that even if it did, Google would have failed to meet the part of the test contained in section 32(1)(b).  Warby J accepted that the EU law concept of journalism was a broad and elastic one which went beyond simply the activities of media undertakings and incorporated other activities which have as their aim the disclosure to the public of information, opinions and ideas.  However, he concluded that “the concept [of journalism] is not so elastic that it can be stretched to embrace every activity that has to do with conveying information or opinions. To label all such activity as “journalism” would be to elide the concept of journalism with that of communication.”

In Google Spain the CJEU was sceptical as to whether the exemption in Article 9 of the Directive (which is implemented through section 32 of the Data Protection Act 1998) would apply to an internet search engine such as Google.  Warby J noted that this observation by the CJEU was not integral to its decision in Google Spain; however, he concluded that “it is true”.  Internet search engines do not, in the view of Warby J, process personal data “only” for the purposes of journalism, literature or art.

In considering section 32 of the Data Protection Act 1998, Warby J concluded that there is a subjective and an objective element to each of section 32(1)(b) and (c).  In relation to section 32(1)(b), he concluded that the data controller must have a subjective belief that the publication of the personal data in question would be in the public interest, and that this belief must be objectively reasonable.  In respect of section 32(1)(c), he considered that the data controller must prove that it had a subjective belief that compliance with the data protection principle(s) engaged would be incompatible with the special purposes, and that this belief must be objectively reasonable.

Warby J explained in his judgment that, even if he was wrong in his conclusion that section 32 was not engaged in this case, he would still have rejected Google’s reliance upon it, because Google would have failed the test in section 32(1)(b).  There was no evidence, Warby J concluded, that “anyone at Google ever gave consideration to the public interest in continued publication of the URLs complained of, at any time before NT1 complained” [para 102].

Schedule 3 to the Data Protection Act 1998

Clearly a great deal of the personal data at issue in these claims, being personal data relating to criminal convictions, is sensitive personal data (see section 2 of the Data Protection Act 1998).  In order for processing of sensitive personal data to be in compliance with the first data protection principle, which requires personal data to be processed fairly and lawfully, the data controller must be able to rely upon one of the conditions in Schedule 3 to the Data Protection Act 1998 (in addition to one of the Schedule 2 conditions).  This is an area where Google had a great deal of difficulty.

Warby J rejected most of the Schedule 3 grounds upon which Google sought to rely (see paras 107-109).  However, in paragraph 110 of his decision, Warby J decided that condition 5 in Schedule 3 was satisfied: that “the information contained in the personal data has been made public as a result of steps deliberately taken by the data subject.”  In reaching this conclusion, Warby J relied upon the decision of Stephens J in Townsend v Google Inc [2017] NIQB 81.  In Townsend, Stephens J concluded that, as a consequence of the principle of open justice, when an offender commits an offence, even in private, he deliberately makes that information public (see para 65 of Townsend).  In NT1 and NT2, counsel for the claimants, Hugh Tomlinson QC, took issue with the conclusions of Stephens J; counsel’s arguments are set out briefly by Warby J towards the end of paragraph 110.  Warby J concluded that, in his view, Mr Tomlinson’s reasoning was not sound.

I must confess that I have a great deal of difficulty with the reasoning of Warby J and Stephens J on this point.  I struggle to see how the commission of an offence by an individual amounts to them taking positive steps to make the information public.  The conclusions of Warby J and Stephens J do not seem to me to fit with the statutory language in the Data Protection Act 1998 nor with the language of the Directive which it implements.  Warby J considered that the language in Article 8.2(e) of the Data Protection Directive is “obscure”.  It seems to me that the language of the Directive is the complete antithesis of “obscure” and that condition 5 in Schedule 3 does not adequately implement the requirements of the Directive in this regard.  The only UK jurisdiction yet to grapple with this issue is Scotland.  Neither the Northern Irish nor the English and Welsh court decisions are from appellate level courts.  For the time being we have two first instance courts in two jurisdictions reaching the same conclusion; that will undoubtedly be considered somewhat persuasive by other first instance judges.

The balancing exercise

The court in Google Spain required a balancing exercise to take place between the rights within the European Convention on Human Rights to a private and family life (Article 8) and freedom of expression (Article 10).  Following Google Spain the ‘Article 29 Working Party’ (soon to become the European Data Protection Board) issued guidance on the Google Spain decision.  These guidelines provide helpful assistance, but do not prescribe the factors which are to be taken into consideration; it is acceptable to go beyond the factors in the guidance [para 135].

In respect of NT1, Warby J attached some weight to the conduct of the claimant post-conviction; in particular, NT1 had caused misleading statements about his character and integrity to be published about him on the internet (by a reputation management company known in the judgment by the fictitious name of ‘cleanup’), despite having been convicted of a substantial offence of dishonesty and having received a substantial prison sentence for it.  This can be contrasted with NT2, who had not been convicted of an offence of dishonesty, had entered a plea of guilty and had shown remorse.

The contrast is an interesting one because while each case will inevitably turn on its own facts, it shows the kind of issues that the court is likely to take into consideration when balancing the competing Article 8 and 10 rights.

Interaction between the Rehabilitation of Offenders Act and the Data Protection Act 1998

The Rehabilitation of Offenders Act 1974 (“ROA”) as it applies in Scotland differs from the version in force in England and Wales; these claims, of course, deal with the ROA as it applies in England and Wales.  The differences in the substance of the Act do not, however, affect the principles which are in play when looking at the interaction between the ROA and data protection law.

The ROA creates a somewhat limited right to rehabilitation, and Warby J concluded that this right to rehabilitation is an aspect of privacy law.  He concluded that “[t]he rights and interests protected include the right to reputation, and the right to respect for family life and private life, including unhindered social interaction with others.”  Furthermore, he concluded that “[u]pholding the right [to rehabilitation] also tends to support a public or societal interest in the rehabilitation of offenders.”  Importantly though, the right to rehabilitation is a qualified right.  As with most cases involving rights, the offender’s right to rehabilitation can come into conflict with the rights of others, in particular their rights to information and freedom of expression.

As a starting point, a person who is party to legal proceedings held in public (such as the accused in a criminal trial) does not have a reasonable expectation of privacy.  However, there may well come a point in time when they can have such an expectation.  The ROA works to prevent the disclosure, after a specified period of rehabilitation, of certain criminal convictions.  It does not, Warby J concluded, mean that in 1974 Parliament legislated for a right to privacy or confidentiality from the point at which the offence became “spent”.

The rehabilitated offender’s right to a family and private life in respect of a spent conviction will normally be a weighty factor against further use or disclosure of that information; however, it is not a conclusive factor.  The “balancing exercise will involve an assessment of the nature and extent of any actual or prospective harm. If the use or disclosure causes, or is likely to cause, serious or substantial interference with private or family life that will tend to add weight to the case for applying the general rule.” [para 166]

Paragraph 166 of Warby J’s judgment is well-worth reading in full for anyone who is involved in balancing exercises of this nature.

At the end of the day, de-indexing (or de-listing) from internet search results does not cause the information to disappear completely.  The effect that it has is to make the information more difficult to find.  It will still be possible for a person, with sufficient determination, to discover and access the information.  In the modern day world we are used to being able to put search terms into Google (and other search engines) and have millions, if not billions, of results returned to us in a fraction of a second.  The search engines have developed algorithms which help to bring the content that is seemingly most relevant to the top of those results with the seemingly least relevant placed at the end of the long list of results.  Information is much more readily available than it was in 1974; some might argue that cases such as NT1 and NT2 simply return the position back to something which more closely resembles 1974.

It is quite probable that we will begin to see cases like NT1 and NT2 arise more frequently.  The qualified right to erasure within the GDPR has attracted a lot of attention and individuals are certainly more aware of ‘the right to be forgotten’.  The GDPR arguably doesn’t take us forward from what was determined in Google Spain, but simply gives it a statutory basis as opposed to one that is derived mostly from case law.  The qualified right to erasure within the GDPR is, as noted above, often overstated and this will inevitably, in the event that people seek to enforce it more frequently, lead to disputes between controllers and data subjects.

Alistair Sloan

Should you require advice or assistance about UK Data Protection and Privacy law then contact Alistair Sloan on 0141 229 0880.  You can also contact him by E-mail.  You can also follow our dedicated Twitter account covering all Information Law matters:  @UKInfoLaw

Data Protection/Privacy Enforcement: March 2018

Probably the most high-profile piece of enforcement action taken by the Information Commissioner’s Office in March was its application for, and execution of, a warrant to enter and inspect the offices occupied by Cambridge Analytica as part of the Commissioner’s wider investigation into the use of personal data in politics.  It would seem that data protection warrants get more people excited about data protection than would ordinarily be the case.  The Cambridge Analytica warrant was not the only warrant that the Commissioner obtained and executed in March; the Commissioner’s website also published details of a warrant that was executed in Clydebank, near Glasgow.  This warrant was directed towards alleged breaches of the Privacy and Electronic Communications (EC Directive) Regulations 2003 which deal with, insofar as this blog is concerned, the rules on direct marketing to individuals by electronic means.

Key Points

  • Care needs to be taken when looking at sharing personal data on a controller-to-controller basis with other companies, including separate companies within the same group of companies. Data controllers need to ensure that they identify what their lawful basis for processing is, provide adequate fair processing information to data subjects in relation to such sharing of personal data and ensure that any changes to their policy in respect of data-sharing do not result in that sharing being for a purpose that is incompatible with those stated at the time of collection.
  • If you, as an individual (whether or not you are yourself a data controller), unlawfully disclose personal data to third parties then you could be liable for prosecution.

Enforcement Action published by the ICO during March 2018

WhatsApp Inc.
An undertaking was given by WhatsApp Inc. In it, WhatsApp undertook, among other things, not to transfer personal data concerning users within the EU to another Facebook-controlled company on a controller-to-controller basis until the General Data Protection Regulation becomes applicable on 25th May 2018.  The undertaking was given after WhatsApp introduced new terms and conditions and a new privacy policy which affected how it processed personal data held by it; in particular, how it would now share personal data with other Facebook-controlled companies.

Prosecutions
A former housing worker was convicted at St. Albans Crown Court after he shared a confidential report identifying a potentially vulnerable victim. The defendant was convicted of three charges of unlawfully obtaining and disclosing personal data contrary to section 55 of the Data Protection Act 1998.  He was fined £200 for each charge and was ordered to pay £3,500 in costs.

Alistair Sloan

Should you require advice or assistance about UK Data Protection and Privacy law then contact Alistair Sloan on 0141 229 0880.  You can also contact him by E-mail.  You can also follow our dedicated Twitter account covering all Information Law matters:  @UKInfoLaw

The Law Enforcement Directive: Data Subjects’ Rights (Part 1)

Earlier this month I wrote a blog post providing an introduction to the Law Enforcement Directive (“LED”); in that post I indicated that I would look separately at the rights of data subjects under the LED.  I had anticipated that I would do this earlier in the month, but then came Cambridge Analytica and the Information Commissioner’s power to obtain a search warrant.  This is part 1 of my look at the rights of data subjects under the LED and will focus on the rights in Articles 13-16 of the LED.

Part 3 of the Data Protection Bill will implement the provisions of the LED in the UK.  Clauses 43 to 54 of the Bill (as the Bill presently stands) make provisions in respect of the rights of data subjects under Part 3.   The rights within the Data Protection Bill are derived from the LED itself, which is very much based upon the rights contained within the General Data Protection Regulation.  Chapter III of the LED sets out the rights which Member States must make available to data subjects where personal data is being processed for the law enforcement purposes.

Information to be made available, or given, to the data subject
Article 13 of the LED makes certain provisions in relation to the information that controllers who are processing personal data for the law enforcement purposes should normally make available to data subjects.  The provisions of Article 13 are contained within clause 44 of the Data Protection Bill (although I make reference to the LED’s Articles, it should be kept in mind that the LED is a Directive rather than a Regulation and therefore does not have direct effect.  It will be the domestic provisions upon which data subjects will rely in their dealings with the competent authorities, the Information Commissioner and the domestic courts, rather than the LED’s Articles).

Controllers who are processing personal data for the law enforcement purposes are to make the following information available:

  • The identity and contact details of the controller;
  • The contact details of the data protection officer (where there is one);
  • The purposes for which the controller processes personal data;
  • The existence of the data subject’s rights to (i) subject access; (ii) rectification;  (iii) erasure of personal data or the restriction of its use; and (iv) to make a complaint to the Information Commissioner;
  • Information about the period for which the personal data will be stored or, where that is not possible, about the criteria used to determine that period;
  • Where applicable, information about the categories of recipients of the personal data (including recipients in third countries or international organisations); and
  • Where necessary, further information to enable the exercise of the data subject’s rights under Part 3, in particular where the personal data are collected without the knowledge of the data subject.

Controllers can restrict the level of information that is provided to the data subject in order to: (a) avoid obstructing official or legal inquiries, investigations or procedures; (b) avoid prejudicing the prevention, detection, investigation or prosecution of criminal offences or the execution of criminal penalties; (c) protect public security; (d) protect national security; or (e) protect the rights and freedoms of others.

This right to information will be familiar to anyone acquainted with the provisions of the GDPR; however, it is not surprising that the right is limited to a degree to take account of the nature of the personal data that falls to be dealt with under the LED and Part 3 of the Data Protection Bill.

Subject Access
The right of subject access remains a fundamental aspect of data protection law emanating from the European Union.  I have previously looked at the right of subject access within the General Data Protection Regulation on this blog.  The right is of such fundamental importance that it also appears within the LED; Articles 14 and 15 of the LED cover the right of subject access, and this aspect of the LED is to be given effect to by clause 45 of the Data Protection Bill (as it currently stands).

If you are familiar with the right of subject access under the current Data Protection Act 1998 and/or the General Data Protection Regulation, then nothing much will surprise you within Articles 14 and 15 and clause 45.  The right of subject access within the LED and Part 3 of the Data Protection Bill provides the data subject with the same rights as they have under the GDPR.  A request must be complied with within one month and no fee can generally be charged for dealing with a Subject Access Request (SAR).

The controller can restrict the data subject’s right to subject access and these provisions are presently found within clause 45(4) of the Data Protection Bill.  The controller can restrict the data subject’s right to the extent and for so long as it is a necessary and proportionate measure to: (a) avoid obstructing an official or legal inquiry, investigation or procedure; (b) avoid prejudicing the prevention, detection, investigation or prosecution of criminal offences or the execution of criminal penalties; (c) protect public security; (d) protect national security; or (e) protect the rights and freedoms of others.  In determining whether the restriction is a necessary and proportionate measure the controller must have regard to the fundamental rights and legitimate interests of the data subject.

Where a data subject’s right to subject access under Part 3 of the Data Protection Bill is to be restricted, the Bill (in its current form) requires the data subject to be given information relating to the restriction, except to the extent that providing such information would undermine the purpose of the restriction.  For example, if an individual who was being investigated by the police for fraud made a Subject Access Request, the police would be entitled to restrict the data subject’s rights insofar as they related to that investigation, and the police would be able to do so without telling the data subject that their subject access rights had been restricted.

The next part will look at the right to restriction of processing; the right to erasure and the data subject’s rights in relation to automated processing in the context of the LED and Part 3 of the Data Protection Bill.  Remember, the LED is due to be implemented by 6th May 2018, which is almost 3 weeks before the date upon which the GDPR becomes applicable.

Alistair Sloan

If you require any advice and assistance with matters relating to the Law Enforcement Directive or any other Privacy/Data Protection legal matter then contact Alistair Sloan on 0141 229 0880 or send him an E-mail.  You can follow Inksters’ dedicated Information Law Twitter account:  @UKInfoLaw

The Information Commissioner’s Powers of Entry and Inspection

Yesterday I wrote a blog post looking at data subjects’ rights and lessons for controllers arising out of the Cambridge Analytica and Facebook privacy matter.  In that blog post I mentioned briefly the Information Commissioner’s powers of entry and search, after the Commissioner announced that she was seeking a warrant to enter and search Cambridge Analytica’s premises.  In this blog post I will look at the Commissioner’s powers of entry and search in a bit more detail.

As noted yesterday, the Commissioner’s powers of entry and search are contained in Schedule 9 to the Data Protection Act 1998.  Schedule 9 sets out the circumstances in which a judge can grant a warrant to the Information Commissioner.  The judge considering the application must be satisfied, based on statements made on oath, that there are reasonable grounds for suspecting that (a) a data controller has contravened or is contravening any of the data protection principles, or (b) an offence under the Data Protection Act has been or is being committed, and that evidence of the contravention or of the commission of the offence is to be found on any premises specified in the information supplied by the Commissioner.

The Commissioner is generally required, by the terms of Schedule 9 to the Data Protection Act 1998, to jump through some hoops before the judge considering the warrant application can grant the warrant to the Commissioner.  Paragraph 2 of Schedule 9 requires that the judge considering the application be satisfied of a number of other things:

  1. that the Commissioner has given seven days’ notice in writing to the occupier of the premises in question demanding access to the premises, and
  2. that either (i) access was demanded at a reasonable hour and was unreasonably refused, or (ii) although entry to the premises was granted, the occupier unreasonably refused to comply with a request by the Commissioner or any of the Commissioner’s officers or staff to permit the Commissioner or the officer or member of staff to do any of the things she would be entitled to do if she had a warrant (see below); and
  3. that the occupier has, after the refusal, been notified by the Commissioner of the application for the warrant and has had an opportunity of being heard by the judge on the question whether or not it should be issued.

Where the judge is satisfied that the case is one of urgency or that compliance with those provisions would defeat the object of the entry, the judge does not need to be satisfied of the three things listed above.  In this case, given that the Commissioner announced her intention to apply for a warrant on national television, it is likely that a judge will require to be satisfied of the three conditions listed above.

Who considers an application by the Commissioner for a warrant depends upon the jurisdiction in which the warrant is being applied for.  In England and Wales a District Judge (Magistrates’ Court) or a Circuit Judge has the power to grant the warrant; in Scotland it is the Sheriff and in Northern Ireland it is a County Court Judge.

A warrant granted under Schedule 9 of the Data Protection Act 1998 gives the Commissioner the power to do a number of things; these things can be found in paragraph 1(3) of the Schedule and are:

  1. to enter the premises;
  2. to search the premises;
  3. to inspect, examine, operate and test any equipment found on the premises which is used or intended to be used for the processing of personal data;
  4. to inspect and seize any relevant documents or other material found on the premises;
  5. to require any person on the premises to provide an explanation of any document or other material found on the premises;
  6. to require any person on the premises to provide such other information as may reasonably be required for the purpose of determining whether the data controller has contravened, or is contravening, the data protection principles.

The warrant must be executed at a reasonable hour, unless it appears to the person executing it that there are grounds for suspecting that the object of the warrant would be defeated if it were so executed, and within 7 days of the date of issue.  It allows the Commissioner, her officers and staff to use reasonable force to execute the warrant.

There are lots of other really boring and technical requirements which I won’t go into; the last thing I will mention is paragraph 12 of Schedule 9, which makes it an offence to: (i) intentionally obstruct a person in the execution of a warrant issued under Schedule 9; (ii) fail, without reasonable excuse, to give any person executing such a warrant such assistance as he may reasonably require for the execution of the warrant; (iii) make a statement in response to a requirement to provide information (see 5 and 6 in the list of powers the warrant gives the Commissioner) which that person knows to be false in a material respect; and (iv) recklessly make a statement in response to such a requirement which is false in a material respect.

The Commissioner does get warrants from time to time; for example, earlier this month the ICO executed search warrants in relation to two properties in Greater Manchester as part of an investigation into companies suspected of sending text messages in contravention of the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR).  The provisions of Schedule 9 to the Data Protection Act 1998 apply to PECR by virtue of Regulation 31 of PECR.

Alistair Sloan

If you are a data controller or an individual who is looking for advice and assistance with any aspect of data protection or privacy law, then you can contact Alistair Sloan on 0345 450 0123 or 0141 229 0880.  Alternatively, you can send him an E-mail.

Data Protection, Facebook and Cambridge Analytica

We know that the Information Commissioner is investigating the circumstances surrounding the obtaining of personal data of a considerable number of individuals by Cambridge Analytica.  Cambridge Analytica is a data analytics company that is in the midst of what can only be described as a data protection and privacy scandal.

There are a number of significant allegations being made against Cambridge Analytica about how it obtains and processes personal data.  The Information Commissioner has also revealed that Cambridge Analytica is not cooperating with her investigation, to the extent that she is going to apply for a warrant to enter and search its premises.  This means that, in all probability, the Commissioner has already sought access and it has been refused.  Schedule 9 to the Data Protection Act 1998 sets out the Information Commissioner’s powers of entry and inspection; it permits the Commissioner to obtain a warrant from the court where the court is satisfied that a data controller has contravened or is contravening any of the data protection principles, or that an offence under the Act has been or is being committed, and that evidence of the contravention or of the commission of the offence is to be found on any premises specified.

This story is moving at quite a pace and is constantly changing with new revelations coming to light; it’s also the subject of an investigation by the Information Commissioner and there is the possibility that the company might face prosecution for offences under Section 55 of the Data Protection Act 1998 depending upon what the Commissioner finds during the course of her investigation.  I am therefore going to try and keep this blog post broad and theoretical rather than trample upon the toes of a live regulatory investigation.

A data controller has a duty to comply with the data protection principles in relation to all of the personal data for which they are the controller, subject to certain specified exemptions set out in statute.  The first data protection principle requires that personal data be “processed fairly and lawfully”; this requires the data controller to meet one or more of the conditions set out in Schedule 2 to the Data Protection Act 1998 (and, in respect of sensitive personal data, a condition in Schedule 3 also requires to be satisfied).

What can individuals do if they are concerned about whether Cambridge Analytica has any personal data concerning them and what it has been doing with it?  Data subjects have a number of rights under the Data Protection Act 1998 and the cornerstone of those rights is the right of subject access.  This is currently given effect to in section 7 of the Data Protection Act 1998 and is not simply about getting copies of the personal data being processed by a data controller:  it consists of a whole suite of rights, of which getting a copy of the personal data is only one aspect.  Under the current law, data controllers are entitled to charge a fee up to a prescribed maximum for dealing with such requests; a request of this nature would attract a fee of £10, but many individuals might well think that this is a price worth paying to know if and how they have been affected by this issue.  Data controllers have up to 40 days in which to comply with a subject access request.  Some key changes to the right of subject access will come into effect on 25th May 2018, but for now the law contained within the Data Protection Act 1998 is still applicable.

Once you have the response to your subject access request your rights do not end there; once you’ve established what a data controller is processing about you, what they’re doing with it and where they got it from there are a number of other steps that you might be able to take, such as requiring them to cease processing your personal data, complaining to the Information Commissioner or making a claim for compensation.

For data controllers, what is currently unfolding should be seen as an important lesson.  Data can be a useful tool to a business; whether it is being used for targeted marketing campaigns or to work out what consumers want from products and services in your market.  However, there are laws governing data protection and privacy and at the heart of those laws are the principles of fairness and transparency.  Controllers need to be careful as to how they obtain personal data, where they obtain it from, what they do with it and be certain that they have a lawful basis for processing that personal data in the ways that they want to do so; that may be because you have the consent of the data subject, because you have a legitimate interest in the processing or some other lawful ground for processing.  Don’t forget the Privacy and Electronic Communications (EC Directive) Regulations 2003 when conducting direct marketing by electronic means.

Simply because a person has made their personal data available, for example through social media, does not mean that it is free to be used by whomever and for whatever purpose they want.  The principles of the Data Protection Act 1998 still apply and the reputational damage that can be suffered may well vastly outweigh any regulatory action taken by the Information Commissioner or by data subjects themselves.

Alistair Sloan

If you are a data controller or an individual who is looking for advice and assistance with any aspect of data protection or privacy law, then you can contact Alistair Sloan on 0345 450 0123 or 0141 229 0880.  Alternatively, you can send him an E-mail.

Data Protection Bill: Committee Day 1

The Data Protection Bill has been winding its way through the legislative process since it was first introduced to the House of Lords in September 2017.  Since then it has completed its passage through the House of Lords and is now being scrutinised by MPs in the House of Commons, having received its second Reading last week.  I made some initial observations on the Bill shortly after it was first published and thought that it was about time that I revisited the general subject of the Bill.

The Bill has now reached the committee stage in the House of Commons and is being considered by a Public Bill Committee, the first sittings of which took place yesterday.  You can read the first sitting, which took place yesterday morning, in Hansard, while the second sitting, which took place yesterday afternoon, can be found in Hansard here.

There was a debate yesterday morning on a proposed amendment (‘new clause 12’) which would insert a new clause into the Bill incorporating Article 8 of the Charter of Fundamental Rights of the European Union.  Article 8 of the Charter makes specific provision for the protection of personal data; the amendment was tabled by MPs from opposition parties and was resisted by the Government.  The source of the Government’s concern, as set out by the Minister of State yesterday, is that new clause 12 would, in the Government’s view, create “a new and free-standing right”.  The Minister went on to say that “[t]he new right in new clause 12 would create confusion if it had to be interpreted by a court.”  This was contested by Liam Byrne MP, who moved the amendment.  Mr Byrne noted that this was a refined version of an amendment that was unsuccessfully moved in the House of Lords, and he described the suggestion that new clause 12 was creating a new and unfettered right as being “nonsense”.  The amendment, while debated yesterday, was not put to a vote; decisions on whether to insert new clauses are not due to be taken until towards the end of the Committee’s consideration of the Bill.  We will therefore need to wait to learn whether it is ultimately included in the Bill or not.

Some amendments were considered and agreed to yesterday, while some others were considered and not agreed to.  In Clause 3 of the Bill, the definition of ‘processing’ has been amended to remove reference to ‘personal data’ and to replace it with ‘information’.  This means that the definition of processing in the Data Protection Bill now reads:  “Processing”, in relation to information, means an operation or set of operations which is performed on information, or on sets of information, such as”.  This means that the definition of processing in Clause 3 of the Data Protection Bill differs from the definition within the GDPR.

The explanation proffered by the Minister in support of these amendments was that they were “designed to improve clarity and consistency of language.”  The Minister argued that “the amendments ensure consistency with terminology in other legislation.”  She also gave her view that the amendments have “no material impact on the use of the term “processing” in parts 2 to 7 of the Bill”.

Clause 7 of the Bill (which deals with the meaning of ‘public authority’ and ‘public body’) has also been amended so as to provide that Ministers, exercising their delegated powers to designate and un-designate (for the purposes of data protection law) public authorities and public bodies, can do so not simply by identifying specific bodies or organisations, but also by way of description.  The changes effectively mean that the provisions in the Data Protection Bill work in the same way as the similar provisions do within the Freedom of Information Act 2000 and the Freedom of Information (Scotland) Act 2002.

The controversial immigration exemption in paragraph 4 of Schedule 2 to the Data Protection Bill saw a great deal of debate in the afternoon’s sitting.  An amendment to remove the immigration exemption entirely from the Bill was moved and a division took place.  The amendment was defeated by 10 votes to 9 and therefore the exemption remains in the Bill.  The split was along party lines, with the Government’s MPs successfully voting down the amendment and all MPs from opposition parties voting in favour of it.

It would not be possible to discuss everything that went on during the course of the committee’s two sittings yesterday, but I have tried to pick out some of the key aspects from yesterday’s proceedings.  The amendment to the definition of processing seems to me to be rather odd and quite frankly unfathomable.  Personal data is a well understood term within the field of data protection and privacy law.  How the courts and Commissioner will interpret “information” is something that we will need to wait and see; if the amendment does in fact make no material change, then it will have been a completely pointless amendment.

I don’t see the controversy over the immigration exemption going away anytime soon.  The Government is satisfied that the exemption strikes the right balance and is one that is permissible in terms of the GDPR.  Campaign groups opposed to the exemption say that it goes too far and, in any event, is unlawful as it is not permitted by the GDPR.  It will certainly be interesting to see where matters go in that regard.

The attempt to replicate Article 8 of the EU Charter is an interesting proposal; one of the Government’s red lines in relation to the EU withdrawal process is that the EU Charter will cease to apply in the United Kingdom, so how the effective inclusion of one article of the Charter would go down with certain members of Parliament remains to be seen.  Whether its inclusion would assist with the issue of ‘adequacy’ following the United Kingdom’s withdrawal from the European Union is debatable (for what it is worth, my initial reaction is that it is unlikely to have any bearing at all upon the question of adequacy).

The Committee’s consideration of the Bill is due to continue tomorrow (Thursday 15th March 2018) with sittings starting at 11:30am and again at 2pm.  This is a large and complex Bill and undertaking line-by-line scrutiny of it is no easy task, especially on a timetable that will see that scrutiny come to an end on 27th March 2018.

Alistair Sloan

If you would like advice on the General Data Protection Regulation, the new Data Protection Bill or any other Information Law concern then contact our Alistair Sloan on 0345 450 0123 or by completing the form on the contact page of this blog.  Alternatively, you can send him an E-mail directly.

An introduction to the Law Enforcement Directive

Among all of the hype surrounding the General Data Protection Regulation (GDPR), some other aspects of information law are being overlooked; I have already written about the Privacy and Electronic Communications (EC Directive) Regulations 2003 and how they are forgotten about. The GDPR is not the only new piece of EU law which is due to take effect in May and which will impact data protection and privacy law in the United Kingdom. The processing of personal data by data controllers for the purpose of law enforcement falls outside of the scope of the GDPR; instead this is dealt with by the Law Enforcement Directive (LED). As the LED is a Directive rather than a Regulation, it does not have direct effect and therefore requires to be transposed into Member States’ domestic law. This is being achieved in the UK through Part 3 of the Data Protection Bill.

The LED is perhaps not as visible as the GDPR because of its much more limited scope. However, this blog aims to cover all information law bases and it would be remiss of me not to write something on it at least. The LED, and therefore the provisions of Part 3 of the Data Protection Bill, applies to what have been termed as “competent authorities” for the purposes of “the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, including the safeguarding against and the prevention of threats to public security”; these purposes are collectively known as the “law enforcement purposes”.

So, who needs to bother about the LED? Obviously, competent authorities have to bother about it because it governs how they process personal data for the law enforcement purposes; however, they are not the only ones. Data subjects should also be concerned about the LED as it governs how their personal data is processed by these competent authorities and sets out what rights they have in relation to personal data processed by them for the law enforcement purposes. The competent authorities are mostly set out in Schedule 7 to the Data Protection Bill; however, clause 30(1)(b) of the Data Protection Bill provides that “any other person if and to the extent that the person has statutory functions for any of the law enforcement purposes” is also a competent authority. The most obvious competent authority is the police; however, there are quite a few others listed within Schedule 7 including Revenue Scotland, the Department for Work and Pensions, the Police Investigations and Review Commissioner and HMRC. Of course, both the Information Commissioner and the Scottish Information Commissioner process personal data for the law enforcement purposes and therefore Part 3 of the Data Protection Bill would apply to them when they are processing personal data in that capacity. In terms of clause 30(1)(b) competent authorities, the most obvious example would be local authorities, which are responsible for things such as Trading Standards provision and also the investigation of fraud concerning benefits administered by them.

One thing that should be noted is that the security and intelligence services (The Security Service, Secret Intelligence Service and GCHQ) are not covered by the LED. National Security falls outside of the scope of EU law and therefore the European Union has no competence to regulate these areas. Therefore, although the Security Services process personal data for law enforcement purposes, the LED does not apply to them. The Data Protection Bill does make provision for the processing of personal data by the security and intelligence agencies; this can be found in Part 4 of the Data Protection Bill (and falls outside of the scope of this blog post).

Chapter 1 of Part 3 of the Data Protection Bill provides the key definitions which require to be used when applying Part 3. The definitions are broadly the same as those to be found in the GDPR with relevant modifications being made. Therefore if you are familiar with data protection law then these definitions will not be too alien to you.

Chapter 2 of Part 3 of the Data Protection Bill sets out the six principles to be complied with when processing personal data under Part 3. Meanwhile, Chapter 3 sets out data subjects’ rights; including the right to subject access, the right to rectification and the right to erasure or restriction of processing.

The rights of data subjects under Part 3 of the Data Protection Bill will be the subject of a separate blog post later in the month; suffice it to say that they have a more limited scope than under the GDPR because of the nature of the processing being dealt with.

There is one final part of the Data Protection Bill to make mention of in this blog post and that is Schedule 8 to the Data Protection Bill. This Schedule sets out the conditions which must be met before a competent authority can carry out sensitive processing of personal data under Part 3. 

The LED is supposed to be transposed into Member States’ domestic law by 6th May 2018; it remains to be seen whether the Data Protection Bill will complete its passage through Parliament and receive Royal Assent in time to allow Part 3 to be commenced by then.

Alistair Sloan

If you require any advice or assistance in connection with the provisions of the Law Enforcement Directive or any other information law concern, please contact Alistair Sloan on 0345 450 0123 or send him an E-mail.

Data Protection and Privacy Enforcement: February 2018

February is a short month, and it did not see the same level of publicity from the Information Commissioner’s Office in respect of action taken to enforce privacy and data protection laws as was seen in January.

Key points 

  • Failing to comply with an Enforcement Notice is a criminal offence (see section 47 of the Data Protection Act 1998); there is a right of appeal to the First-Tier Tribunal (Information Rights) against the terms of an Enforcement Notice and so if you do not agree with the terms of the notice you should seek legal advice about the possibility of making such an appeal.
  • Employees should be careful what they do with personal data; in most cases the enforcement liability will lie with the employer (although your employer might take disciplinary action against you for failing to comply with company policies and procedures).  However, there are circumstances in which employees can be held personally, and indeed criminally, liable for breaches of the Data Protection Act 1998.
  • The right of subject access is a fundamental right of data subjects and data controllers must ensure that they comply with their obligations in respect of a subject access request made by a data subject.  The right of subject access remains a key feature of the new European data protection framework and the GDPR strengthens the right of subject access for data subjects.

Enforcement action published by the ICO during February 2018

Pennine Care NHS Foundation Trust
The ICO has conducted a follow-up assessment [pdf] with Pennine Care NHS Foundation Trust finding that the Trust had complied with the terms of the undertaking which it had previously given [pdf] following a consensual audit [pdf] by the Commissioner’s staff.

Gain Credit LLC
Gain Credit LLC was served with an Enforcement Notice [pdf] by the Information Commissioner for failing to comply with a subject access request made to it.  This came to light after the data subject in question made a request to the Information Commissioner that she carry out an assessment pursuant to section 42 of the Data Protection Act 1998 into whether it was likely or unlikely that the processing by Gain Credit LLC was in accordance with the provisions of the Act.

Direct Choice Home Improvements Limited
In March 2016 Direct Choice Home Improvements Limited was served with a Monetary Penalty Notice in the amount of £50,000 [pdf] and also an Enforcement Notice [pdf] for breaching Regulation 21 of the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR).  The company continued to breach Regulation 21 of PECR and the Commissioner prosecuted it for breaching the Enforcement Notice.  The company was not represented at Swansea Magistrates’ Court and was convicted in its absence.  The company was fined £400 as well as being ordered to pay £364.08 in prosecution costs and a victim surcharge of £40.  (Don’t forget that PECR remains part of the privacy and data protection law landscape when the GDPR becomes applicable in May.)

Other Prosecutions
A former employee of Nationwide Accident Repair Services Limited was prosecuted by the Information Commissioner for unlawfully obtaining personal data contrary to section 55 of the Data Protection Act 1998.  The defendant had sold the personal data of his employer’s customers to a third party, who then made use of the personal data to contact some of those customers concerning their accidents.  The defendant was convicted and fined £500 as well as being ordered to pay costs of £364 and a victim surcharge of £50.  An offence of unlawfully disclosing personal data was admitted and taken into consideration by the court.

A former local authority education worker was prosecuted after she unlawfully disclosed personal data contrary to section 55 of the Data Protection Act 1998.  The defendant had taken a screenshot of a council spreadsheet which concerned the eligibility of named children for free school meals and then sent it on to an estranged parent of one of the children.  She pled guilty to three offences and was fined £850 by Westminster Magistrates’ Court as well as being ordered to pay £713 in costs.

Alistair Sloan

If you require advice or assistance in respect of a data protection or privacy law matter, or any other Information Law matter, then contact Alistair Sloan on 0345 450 0123, or send him an E-mail.