
Regulatory developments in 2014

November 2014

Despite the lack of a new Regulation, it's been a busy year for privacy developments, both within the UK and at an EU level. In this article, we highlight some of the key guidelines and opinions issued by UK and EU regulators.

UK

ICO guidance

Guidance on privacy in mobile apps

In January, the UK Information Commissioner's Office (ICO) issued new guidance on privacy in mobile apps together with recommendations to help consumers protect their privacy when downloading apps.

The guidance, aimed at app developers, highlights the importance of 'privacy by design' and advocates privacy-friendly defaults and a high level of user control over settings.  It also offers suggestions as to how app developers can overcome the constraints of a small screen, for example by breaking privacy notices down into small sections.  The ICO highlights the importance of explaining to users why their information is being processed and of being transparent, not hiding important information or misleading the user.

The ICO says that where apps are funded by advertisers, those advertisers would be considered to be data controllers of any personal data they receive through the relevant app.

Updated Privacy Impact Assessments code of practice

In February, the ICO updated its guidance on Privacy Impact Assessments (PIAs).  In the guidance, the ICO emphasises the importance of PIAs in establishing consumer trust in the way personal data is handled and highlights the need to build PIAs into the development stage of any product in relation to which privacy is an issue.

Organisations are urged to consult with both internal and external stakeholders during the PIA process in order properly to assess and address any privacy risks.  The guidance also contains template questions for organisations to consider when carrying out PIAs.

The intention is that organisations which follow the ICO guidance will be better prepared to deal with the requirements to conduct PIAs under any new European data protection legislation.

Updated code of practice on subject access requests

In March, the ICO updated its code of practice on handling subject access requests (SARs).  Among other things, the new version of the guidance recommends giving individuals access to their personal data through a secure website or sending them their data in electronic form in order to help reduce the costs of complying with SARs. The ICO's good practice suggestions are in line with the government's midata principles.

The guidance also states that while individuals are entitled to submit as many SARs as they want, businesses need not comply with requests made at unreasonable intervals. It points to a number of factors which could help organisations assess whether or not this is the case, such as: the nature of the data; the purposes of the processing and whether it may cause detriment to the data subject; and how often the data is modified.

Where new data has been collected or records have been amended between receipt of SARs, then the most recent SAR must be responded to in full.

Updated DPA and PECR marketing guidance for political campaigns

Also in March, the ICO published guidance to assist political campaigners in complying with the marketing requirements of the Data Protection Act 1998 and the Privacy and Electronic Communications Regulations (PECR).  This was done in light of the then upcoming Scottish independence referendum.

The guidance reminds political organisations of the conditions for sending marketing emails, including consent requirements, the provision of opt-outs and identification requirements.

Guidelines on handling FOI requests through social media

March also saw the ICO publish guidelines for public authorities on responding to freedom of information (FOI) requests made via social media.

The guidelines deal with the issue of responding to an FOI request in the same medium in which it is made. The ICO says that if a request is made via a public website, there can be a reasonable expectation that the response will also be published via the site. Where, however, the public authority has a particular reason to believe it would be inappropriate to publish the information online, it may wish to respond by private message to the requester's account instead. If private messaging is not available, it can contact the individual to obtain an address. The guidelines make it clear that public authorities do not have to respond to FOI requests where the requester's real name is unclear, although there is an expectation that the authority will make further inquiries of the individual to establish their name.

In some cases, there may be technical constraints preventing a response in the same medium, for example, where a request is tweeted. In such cases, the authority should ask the individual to provide an alternative way of responding.

In addition, the guidelines discuss the issue of FOI requests for information containing keyword terms. Where the keyword is so common that the scope of the request is unreasonably broad, the authority should consider refusing the request and contacting the requester to ask them to narrow the request down. If the requester does not do so, the authority may have grounds for considering the request to be vexatious.

Guidance on complaint handling procedures

The ICO published guidance on its complaint handling procedure in April.  The guidance is aimed at data controllers and explains how the ICO handles complaints which allege an organisation is in breach of the Data Protection Act 1998, the discretion it has when considering the issues and the types of action it may take in particular circumstances.  Among the issues the ICO will take into consideration are the way in which the organisation itself has handled any complaint it may have received from a data subject and the context, seriousness and scale of any breach.

The guidance also encourages organisations to deal with complaints promptly to reduce the number escalated to the ICO and stresses that it will only consider a data subject's concerns where the data subject can show they have raised them first with the relevant data controller and the data controller has not addressed them appropriately.

Updated guidance on data protection obligations when disclosing data under TUPE

In June, the ICO updated its guidance on the transfer of personal data under the Transfer of Undertakings (Protection of Employment) Regulations (TUPE) to reflect changes made to TUPE in January.

Under TUPE, certain employee information must now be provided to the new employer at least 28 days before the transfer, including details about pay, working hours, holiday entitlement and any disciplinary proceedings.  The ICO warned businesses to ensure that any response to a request for information does not go beyond what is provided for under TUPE and that no "excessive or irrelevant" information is disclosed.  It suggests anonymisation as a possible means of limiting the amount of personal data disclosed.

ICO blog on wearable tech and data protection

In June, as Google Glass launched in the UK, the ICO published a blog underlining that the use of personal data obtained from wearable tech for anything other than purely personal purposes is likely to be subject to the Data Protection Act 1998. In addition, organisations which allow users of their wearable tech to capture video and still images will need to comply with the ICO's CCTV code of practice (recently updated, see below).

ICO report on Big Data

August saw the ICO publish a report on Big Data and how it can and should operate in the context of the Data Protection Act 1998. The ICO is keen to stress that Big Data can be used within the law, and says that organisations using Big Data must ensure their aim in doing so is clear at the outset and that the data processed is relevant and not excessive for that purpose. In addition, the ICO recommends:

  • anonymising where possible;
  • establishing whether individuals have consented to the use of the data, especially if it was originally collected by a third party;
  • considering whether the proposed processing is compatible with the original purpose for which the information was collected i.e. whether it is fair and the data subject would reasonably expect the data to be used in the proposed way; and
  • ensuring all information about use of data is communicated as clearly as possible and at an appropriate stage.

Guidance on smartphones

In September, the ICO issued guidance, together with Ofcom, Phonepay Plus, the Competition and Markets Authority and the Financial Conduct Authority, to help consumers use apps safely and keep the information on their mobiles secure. The recommended steps include:

  • install apps from recognised stores;
  • consider content ratings;
  • be aware of the permissions you are granting;
  • treat your phone as you would your wallet;
  • be aware of costs, especially for roaming and in-app purchases;
  • regularly delete apps that are not used; and
  • wipe phones before disposal.

Updated code of practice for CCTV and Surveillance Cameras

The ICO published an updated Code of Practice for CCTV and Surveillance Cameras (Code) in October.

Much of the Code is similar to the previous version.  It emphasises the need to conduct privacy impact assessments (PIAs) prior to deciding whether or not to proceed with a surveillance system.  These should look at the pressing need the surveillance system is intended to address and whether its proposed use has a lawful basis and is justified, necessary and proportionate.

In addition, the Code stresses: the requirement to ensure that the system goes no further than necessary for the purposes required; the need to inform data subjects that they are being recorded or monitored, as well as of their other rights; the need to respond to subject access requests; and the requirement to keep data secure and hold it for no longer than necessary. In other words, it stresses the overriding need to comply with the data protection principles.

The Code also emphasises that other legislation, such as the Regulation of Investigatory Powers Act 2000 (RIPA) or the Protection of Freedoms Act 2012, may apply.

The main departure in the Code is the inclusion of a section on newer surveillance methods which the ICO considers particularly intrusive, such as Automatic Number Plate Recognition, body worn video (BWV) and unmanned aerial systems (UAS), e.g. drones.  Both BWV and UAS devices should be capable of being turned off.  Special consideration has to be given to how to inform individuals that these technologies are being used and of their rights, given that it is particularly challenging, especially with UAS, to provide fair processing information.  Continuous recording is discouraged and is said to be highly unlikely to be justifiable.  Operators are also warned that audio recording is likely to be harder to justify than visual recording.

Interestingly, data controllers using automated recognition technologies to identify people's faces are advised to use sufficiently high-quality equipment to ensure that individuals are captured accurately enough to fulfil the purpose and to avoid the risk of misidentification.

Europe

Article 29 Working Party

Article 29 Working Party Opinion on necessity and proportionality in law enforcement

In March, the Article 29 Working Party (WP) published an Opinion on the application of necessity and proportionality concepts and data protection within the law enforcement sector.  The Opinion considers how the European Court of Human Rights has interpreted these principles and how they link to data protection.  It makes practical recommendations to legislators and authorities responsible for tackling issues in the areas of freedom, security and justice, to help ensure they comply with privacy and data protection requirements.

Article 29 Working Party Opinion on anonymisation techniques

The WP adopted an Opinion on anonymisation techniques in March.  The Opinion assesses current anonymisation techniques, which consist mainly of randomisation and generalisation, but warns that it is difficult to create a truly anonymous dataset where underlying information is retained which can be cross-referenced with another set of information to re-identify individuals.  It also underlines that pseudonymisation, while useful as an extra security measure, is not anonymisation.

The WP advises that each case of anonymisation should be planned individually, using a variety of techniques.  Regular risk assessments should be carried out to guard against the risk of re-identification.
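To illustrate the distinction the WP draws, the short Python sketch below (our own illustration, not taken from the Opinion; the salt, names and datasets are entirely hypothetical) shows why pseudonymisation alone does not make a dataset anonymous: the attributes retained alongside the pseudonym can be cross-referenced with an auxiliary source to re-identify individuals without ever reversing the pseudonym.

```python
import hashlib

SALT = "example-salt"  # hypothetical key/salt held by the data controller


def pseudonymise(name: str) -> str:
    """Replace a direct identifier with a keyed hash (pseudonymisation, not anonymisation)."""
    return hashlib.sha256((SALT + name).encode()).hexdigest()[:12]


# Hypothetical pseudonymised dataset: names are replaced, but quasi-identifiers
# (postcode, birth year) are retained because they are useful for analysis.
pseudonymised_records = [
    {"id": pseudonymise("Alice Smith"), "postcode": "EC1A 1BB", "birth_year": 1980, "condition": "asthma"},
    {"id": pseudonymise("Bob Jones"), "postcode": "SW1A 2AA", "birth_year": 1975, "condition": "diabetes"},
]

# Hypothetical auxiliary dataset (e.g. a public register) linking the same
# quasi-identifiers to real names.
auxiliary_register = [
    {"name": "Alice Smith", "postcode": "EC1A 1BB", "birth_year": 1980},
    {"name": "Bob Jones", "postcode": "SW1A 2AA", "birth_year": 1975},
]

# Re-identification by linkage: matching on the retained quasi-identifiers
# recovers the individual behind each pseudonym without reversing the hash.
for record in pseudonymised_records:
    for person in auxiliary_register:
        if (record["postcode"], record["birth_year"]) == (person["postcode"], person["birth_year"]):
            print(f"Pseudonym {record['id']} is likely {person['name']} ({record['condition']})")
```

This linkage risk is why the WP treats pseudonymisation as a useful additional security measure rather than as anonymisation: so long as information capable of singling an individual out remains in the dataset, combining it with other available data can defeat the pseudonym.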

Draft EU processor to non-EU sub-processor clauses

In April, the WP published a working document on draft ad hoc contractual clauses for personal data transfers from an EU data processor to a non-EU sub-processor.  The clauses have not been adopted by the European Commission and so may be further amended.

Opinion on the notion of legitimate interests of the data controller

The WP published an Opinion on the concept of the legitimate interests of the data controller under Article 7(f) of the Data Protection Directive in April.  The Opinion sets out guidance on how to apply the concept under current law and makes recommendations for future improvements.  It stresses that the legitimate interests justification should not be seen as a less onerous standard to meet than the other justifications, nor used as a last resort where none of the other justifications applies.  The Opinion contains a number of practical test cases and lists of factors to take into consideration when assessing whether data processing can be justified on the grounds that it is in the legitimate interests of the controller.

The WP argues for recitals to be added to the proposed EU data protection Regulation which would set out the key issues to consider when carrying out the assessment of the balance of interests.  It also argues for a new article to be included requiring data controllers to explain to data subjects why they consider that their legitimate interests are not overridden by the interests or rights of the data subject.

Opinion on the Internet of Things

The WP published an Opinion on the Internet of Things (IoT) in September. It identifies what the WP considers to be the key risks to privacy posed by the IoT, looking particularly at wearable computing, 'quantified self' devices and home automation (domotics). The WP then considers the application of the Data Protection Directive (DPD) and the e-Privacy Directive to the IoT and finishes with a list of recommendations.

The Opinion highlights a number of key privacy concerns, which include:

  • lack of control over the dissemination of personal data, and information asymmetry;
  • the difficulty in obtaining a valid consent;
  • extrapolation of inferences from data and repurposing of original processing;
  • intrusive user profiling;
  • limitations on the ability to remain anonymous or go unnoticed; and
  • security risks.

The Opinion confirms that Article 4(1) of the DPD applies to data controllers which are not established in the EU but which make use of equipment situated in a Member State. The DPD's notion of 'equipment' is likely to cover objects used to collect and process users' data in the provision of services.

To read more about the Opinion, see our article in Radar.

Statement on Big Data

In October, the WP published a statement to communicate "key messages" on the issue of Big Data.

In particular, the WP defends the continuing application of the purpose limitation and data minimisation principles to Big Data as well as the requirement that data controllers collect personal data only for specified, explicit and legitimate purposes and do not process personal data other than in accordance with those purposes.  The WP is clear that it does not believe these principles need to be reworked in order to facilitate collection of Big Data.  The WP goes on to say that complying with the data protection framework is key to privacy-friendly solutions which inspire consumer confidence and is also important to ensure fair and effective competition in the Big Data market.  Given the global nature of the Big Data industry, the WP also calls for increased co-operation between data protection regulators on a worldwide basis.

EDPS

Preliminary Opinion on big data privacy and competition

In addition to looking at cross-border data transfers (see our article on the mass surveillance scandal fallout), the European Data Protection Supervisor (EDPS) issued a preliminary Opinion on the relationship between big data, competition law and consumer protection in the big data economy.  The EDPS believes that major internet service providers gain market power through their access to and control of big data, which can lead to anti-competitive practices.  The EDPS is particularly critical of organisations which provide services that appear to be free of charge but which, in effect, ask consumers to 'pay' for them by supplying personal data.

The Opinion examines the interplay between data protection, competition law and consumer protection and looks at market power, consumer welfare and enforcement.

The EDPS wants to see greater cooperation on policy in these areas.

See our articles on Big Data and competition for more on this.

If you have any questions on this article or would like to propose a subject to be addressed by the Global Data Hub, please contact us.

Lucy Lyons

Lucy looks at some key regulatory guidance from the UK and the EU.