Oral Evidence to Parliament: Digital Economy Bill

On 11 October 2016, our Chief Executive, Renate Samson, gave evidence to the Public Bill Committee on the Digital Economy Bill about Part 5: Digital Government. You can watch it here or read the Hansard transcript below.

The Chair: Thank you to our next two witnesses for being here promptly. We will now hear evidence from Big Brother Watch and the Open Rights Group. For this session we again have broadly half an hour to 45 minutes. Will the witnesses please read their names into the record?

Jim Killock: I am Jim Killock, executive director of the Open Rights Group.

Renate Samson: I am Renate Samson, chief executive of Big Brother Watch. We were also a member of the open policy making group and the Privacy and Consumer Advisory Group, to which Dr Whitley referred earlier.

The Chair: Thank you. I turn first to Louise Haigh.

Louise Haigh: I will pick up where we left off, if that is okay. You were both involved in the consultation process for part 5 of the Bill. Did the proposals come as a surprise to you? Do they make sense to you as data experts?

Renate Samson: No, they do not make very much sense, if I am honest. As I said, we were a member of the open policy making process and we also submitted to the consultation. I am genuinely surprised that after a two-year process, all of a sudden it felt very rushed. There were conversations and meetings happening right up to the Queen’s Speech; there was still a general lack of clarity, particularly on safeguards, and many questions were still being asked, such as how, why, when and so on. The next thing we knew, it was in the Queen’s Speech and the Bill was published.

Reading through part 5—and I have read through it a lot and scratched my head a great deal, mainly for the reasons given in evidence earlier today—you see that the codes of practice, which would explain an awful lot of what we imagine is meant or may not be meant, just have not been published. I have repeatedly asked for them and been given various expected dates, and we are sitting here today without them but with the Bill already having been laid before Parliament.

We have also done a lot of work on the Investigatory Powers Bill, for which the codes of practice were there right from the start. There was clarity as to what was intended and what was going to be legislated for, straight up. So, I am profoundly disappointed, because data sharing and digital government are hugely important and we seem to be very far away after a very long process.

Jim Killock: It is worth considering why the open policy making process was put in place. Data sharing is known to be potentially controversial. It was knocked out of at least one previous Bill a few years back when proposed by Labour because of the lack of privacy safeguards. Everyone understood that something more solid was needed. Then the Cabinet Office was very keen to ensure it did not raise hackles, that it got the privacy and the safeguards right, that trust was in place. It was therefore a surprise, after that intense process, to get something back that lacked the safeguards everybody had been saying were needed.

We are particularly concerned not only about the lack of codes of practice, but the fact that a lot of these things should be in the Bill. Codes of practice are going to develop over years. We need to know about things like sunsetting, for instance—that these things are brought to a close, that you do not just have zombie data sharing arrangements in place, where everyone has half-forgotten about them and then suddenly they are revived. You need to have Parliament involved in the specifics.

As we have heard, data sharing has a huge range of possibilities, starting with the benign and the relatively uncontroversial: statistics and understanding what is happening to society and Government policy, where privacy is relatively easy to protect. You use the data once, you do the research and that is it. It ranges from that through to the very intrusive: profiling families for particular policy goals might be legitimate, but it also might be highly discriminatory. Getting to the specifics is important.

You need the safeguards in place to say, “These are the kinds of things we will be bringing back; these are the purposes that we may or may not share data for.” That way, you know there is a process in place. At the moment, it feels like once this has passed, the gate is opened and it is not necessarily for Parliament to scrutinise further.

Louise Haigh: We talked earlier about the bulk transfer and bulk sharing of data, and an earlier witness talked about providing data access, rather than data sharing. Should the Government not be pursuing trials on that basis, rather than these enormous powers without any kind of assurances to the public or parliamentarians about how they will be using them?

Renate Samson: It was very specific at the end of the open policy making process—put the bulk to one side for a moment—that, regarding the fraud and debt aspect of the Bill, it had been agreed that three-year pilot projects would take place, with subsequent review and scrutiny potentially by the OPM or by another group. Instead, they are in the Bill as legislation, with the Minister deciding whether or not it is okay and potentially consulting other groups, which are not defined. That is half an answer to half your question. Pilots are an excellent idea if they are pilots, not immediate legislation.

With regards to the bulk powers in the Bill, civil registration documents were a late addition. We are still not clear as to their purpose. The purposes given in the consultation, in the OPM process and in the background documents relating to the Bill are a whole mix of different reasons, none of which, I would argue, are clear and compelling or, indeed, necessary and proportionate. But again, as you have heard a lot today, without detail, how can we properly answer your question?

Jim Killock: I have a quick observation on this. We currently have a data protection framework. The European Union is revising its data protection laws; they are somewhat tougher, which is quite a good thing, but we do not know what the future of data protection legislation is in the UK. It might be the same or it might be entirely different in a few years’ time.

That is a very good reason for ensuring that privacy safeguards are quite specific and quite high in some of these sensitive areas, because we do not know whether the more general rules can be relied on and whether they are going to be the same. That is not to say that we do not need higher safeguards in any case here, because you are not dealing with a consent regime. People have to use Government and Government have to look at the data, so it is not a mutual agreement between people; you have to have higher safeguards around that.

Thangam Debbonaire: My questions are directed at Mr Killock and relate to paragraphs 37 and 38 of your submission, “Definition of pornographic material”. We heard earlier that both the NSPCC and the British Board of Film Classification support a provision to require ISPs to block websites that are non-compliant. There was also discussion of widening the scope to apply the restrictions to other harmful material that we would not allow children access to in the offline world. Here, you seem to be questioning the value of that:

“This extension of the definition…also raises questions as to why violent—but not sexual—materials rated as 18 should then be accessible online.”

I also question this consistency but the solution, to me, seems to be that we should include other material, such as violent material and pro-anorexic websites, as we talked about earlier. Will you tell us a bit more about what your objection is to creating a framework to keep children as safe online as they are offline?

Jim Killock: We have no objection; it is a laudable aim and something we should all be trying to do. The question is, what is effective and what will work and not impinge on people’s general rights? As soon as you look a little beyond pornography, you are talking about much more clear speech issues.

There will be a need to look at any given website and make a judgment about whether it should or should not be legally accessed by various people. That starts needing things like legal processes to be valid. Some of the things you are talking about are things that might not be viewed by anybody, potentially. The problem with all these systems is that they just do not work like that. They are working on bulk numbers of websites, potentially tens of thousands, all automatically identified, as a general rule, when people are trying to restrict this information. That poses a lot of problems.

I also query what the measure of success is here, because I suspect that the number of teenagers accessing pornography will probably not be greatly affected by these measures. There is more of an argument that small numbers of children who are, perhaps, under 12 may be less likely to stumble on pornographic material, but I doubt that the number of teenage boys, for instance, accessing pornographic material will be materially changed. If that is the case, what is the measure of success here? What harm is really being reduced? I just feel that, probably, these are rather expensive and difficult policies which are likely to have impacts on adults. People are saying it is not likely to affect them, but I rather suspect it might, and for what gain?

Thangam Debbonaire: You have mentioned your feelings and your suspicions but, actually, the British Board of Film Classification already has a system for identifying, for instance, pro-anorexic, pro-suicide and violent websites. It already has a system for use on mobile networks.

Jim Killock: No, it does not.

Thangam Debbonaire: Yes, it does. They sat right here this afternoon.

Jim Killock: No, it does not. The mobile providers have a system that the BBFC—

Thangam Debbonaire: So a system exists?

Jim Killock: They have a system, which is not wildly accurate, that people choose to use. To the extent that they are choosing to use it, there is some legitimacy around that. People choose to have websites blocked, and they understand that a certain number of them may be incorrectly blocked; that is okay.

Thangam Debbonaire: Are you saying that that sort of system does not exist, because we were told that it did earlier?

Jim Killock: This is what they are currently doing: they are blocking websites, which are sometimes the right websites, sometimes not; sometimes the right websites are not blocked. It is essentially automated decision making that comes with the problem that you can only really do this by things like keyword search. There are not enough humans available at the right price to do the review, so all kinds of things get blocked for essentially no real reason. For instance, we have had a widget manufacturer—

Thangam Debbonaire: Forgive me for interrupting, Mr Killock, but there is a good reason. You asked about successful outcomes—and if you are going to ask a question, I am going to answer it—the successful outcome is that children are protected in the online world in the same way as they are protected in the offline world. I have to reiterate this to you: I do not understand why you think it is a risk worth taking that some adults may or may not have their own personal preferences infringed, balanced against the harm which we know is done to children. On teenage boys, to say that there is no point because teenage boys may or may not continue to watch pornography seems to be a very sad conclusion to come to.

Jim Killock: The point is that you can help children to be protected; the question is, what is the best way? For instance, I agree with the NSPCC’s calls for the compulsory education of children. Of course that should be happening and it is not. Similarly, Claire Perry’s initiative to have filters available has its merits. Where I have a problem is where adults are forced into that situation, where they are having websites blocked and where there is little redress around that. I caution you around large-scale blocking of websites because we know from our own evidence that a very large number of websites get blocked incorrectly and it has impacts on those people too. The question is, what is effective? I am not sure that age verification will be effective in its own terms in protecting children.

Claire Perry: Mr Killock, it is nice to hear you finally supporting the initiative. Indeed, all of the shroud waving about false blocking was brought out with vigour many times over the past five years—

Jim Killock: We stand by that.

The Chair: Best not to interrupt the questions, Mr Killock. Let the questions be put.

Claire Perry: My point is that it is sad that the campaign once again from your organisation is that the perfect must be the enemy of the good. I am afraid I would also question this issue of false blocking, and I would appreciate written evidence if you have it. It is a tiny fraction. It has never reached anything like the levels your organisation has claimed, and the processes for notification and unblocking have massively improved over the last five years. My question to you is: at what point does your organisation stop dealing with this world where it is, “Hands off our internet” and start accepting that content provision via the internet, which is just another form of provider, should have exactly the same safeguards as exist in the offline world?

Renate, your points around this are also quite disturbing because you are holding up for a perfect world—

Renate Samson: What points?

Claire Perry: Around privacy and data recognition. At what point do we accept that what is proposed in this Bill is actually a good step forward? While it may not be perfect, it is a massive step-change improvement on what we have today.

Jim Killock: The first question is: “What is the impact on everyone?”

Claire Perry: No, the question is: will you provide us with written evidence of this issue of false blocking, in detail, because I happen to think your words on this are completely untrue?

Jim Killock: Yes, we can.

Claire Perry: We would appreciate written evidence by next week. Thank you.

Jim Killock: We have literally hundreds.

Claire Perry: Hundreds? Of the 1.5 billion websites that are out there?

Jim Killock: The error rate does not appear so large; but when you multiply that by the number of providers that have different blocking systems it becomes quite significant.

Claire Perry: I look forward to the evidence.

The Chair: Do not interrupt the questions, or the answers.

Jim Killock: On the wider question of what is effective, the question is how children are protected, versus what the impact on adults is. At the moment we do not know, because the system is not in place, what that effect on adults will be; but we have to be concerned that adults should feel free to access legal material, no matter what it is. They should not feel like they are being snooped on or having their privacy or anonymity removed.

I was encouraged by some of the things that were said earlier, but I have to say that when we sent some technical observers to hear about the systems that are likely to be put in place—the sort of things that vendors want to do—we heard a rather different story. The sorts of things they want to do include harvesting user data, maybe using Facebook and other platforms, to pull in their data to verify people’s age by inference. These things were not privacy friendly. Let us assume that the BBFC has a job, as apparently it does. It would be good if it had clear duties around privacy and anonymity, to make sure that it has to put those things first and foremost when it is choosing and thinking about age verification systems.

Claire Perry: As a supplementary, does your organisation campaign against age verification on gambling sites on the internet?

Jim Killock: No, we do not.

Claire Perry: Even though exactly the same issues of privacy could apply?

Jim Killock: I think they are rather different, are they not?

Claire Perry: Why? They are legal.

Jim Killock: The first thing is that gambling sites are dealing with money. They have to know a little bit about their customers. They need to do that for fraud purposes, for instance. The second thing is, I think, it is much harder to argue that there is a free expression impact for gambling, compared with accessing legal material, whether it is pornographic or not.

Claire Perry: So your interest is not about legality. It is about your interpretation of legal and illegal material.

Jim Killock: It ultimately is about what the courts think is the boundary around free expression, and what sort of things are impacting on people’s free expression and privacy. That is our standpoint. What we are asking for, the same as you, is the same standards online as offline. One of those standards is human rights and what we are entitled to do.

The Chair: Let us hear from Ms Samson; and then we are moving on.

Renate Samson: Just to be clear, we submitted evidence and we have concerns about part 5 of the Bill. The questions you have been asking Mr Killock—I am unclear; are you asking me about the same issues you are asking him?

Claire Perry: No, specifically about the part 5 questions.

Renate Samson: Okay. We have not, in our evidence and our concerns, asked for a perfect Bill, although I do not believe there is any harm in trying to make the best piece of legislation we can. The work that we do with the Privacy and Consumer Advisory Group and as part of the open policy making process is about having engagement, to ensure that we are the leading light in data sharing, but also data protection. As Mr Killock has mentioned, we are currently looking at the Data Protection Act 1998. That will probably expire in May 2018, and we will get the general data protection regulation. Right now the measure in question does not even refer to that, or, indeed, to the Investigatory Powers Bill. It refers to the Regulation of Investigatory Powers Act 2000 and the DPA. Also, it will probably fail on a number of the key points of the GDPR, in relation to potential profiling, consent of the individual, and putting the citizen at the heart of data sharing and data protection.

I am not looking for “perfect”, but I think “perfect” is a good place to head towards.

Nigel Adams: My question is for Mr Killock, with regard to what the Bill is seeking to do in terms of equalisation of copyright offence penalties. I just wondered why your organisation was not in favour of rights holders—the tens of thousands of content creators. Why is your organisation not keen on the idea in the Bill?

Jim Killock: That would be a misrepresentation. We are quite clear in our response. We are worried about the impact of this on people who should not be criminalised and who we thought the Government were not trying to criminalise in this case. Our position is that if the Government are going to extend the sentence and have the same sentence online as offline for criminal copyright infringement—that is to say, 10 years—then they need to be very careful about how the lines are drawn, because the offences are quite different. Offline, in the real world, criminal copyright infringement covers a number of acts. It is all about copying and duplication. Essentially, it is about criminal gangs duplicating DVDs and the like. Online, making that separation is harder, because everything looks like the same act—that is to say, publication. You put something on the internet, it is a publication. So how do you tell who is the criminal and who is the slightly idiotic teenager, or whatever it happens to be? How do you make sure that people who should not be threatened with copyright criminal sentences are not given those threats?

We particularly draw attention to the phenomenon of copyright trolling. For instance, there is a company called Golden Eye International, a pornographer which specialises in sending bulk letters to Sky customers, BT customers and so on, saying, “Please pay us £300 because you downloaded a film that is under copyright.” These are obviously pornographic films and they then wait for people to pay up. They have no specific knowledge that these people are actually the people doing the downloading, all they know is that somebody appears to have downloaded.

Nigel Adams: Sorry to interrupt, but the idea of the Bill is not to go after people who are downloading content, it is purely for those who are uploading content for commercial gain. That is the whole purpose.

Jim Killock: Unfortunately, that is not how the language of the offence reads. The test in the offence is that somebody is “causing a loss”, which is defined as not paying a licence fee, or is “causing the risk of loss”, about which your guess is as good as mine, but it is essentially the same as making available, because if you have made something available and somebody else can then make a copy, and then infringe copyright further and avoid further licence fees, basically that is a criminal act. So file sharers, whether they are small or large, all appear to be criminal copyright thieves. Similarly, people who are publishing things on websites without licence are also potentially criminalised. Those things can be dealt with much better and more simply through civil courts and civil copyright action. What we are calling for is either to get rid of those things which are attacking individuals and wrongly bringing individuals into scope, or to put thresholds of seriousness around the risk of loss and/or causing loss. Something like, “Serious risk of causing significant loss” would be the way to deal with this. Similarly, “Causing serious loss”.

Nigel Adams: But if you are knowingly uploading creative content online for commercial gain, to my mind it does not matter whether it is 50 quid or 50,000 quid, you are knowingly stealing someone’s content.

Jim Killock: The commercial gain is not part of this offence. That is what I am saying. The offence is purely to cause loss—in other words, to not pay a licence fee—or to cause risk of loss. There is no “commercial” in it. So you have to put the threshold somewhere. You have an offence for the commercial activities and, separately, individuals who cause risk of loss or fail to pay a licence fee.

Nigel Adams: What do you think is a reasonable limit? Where would you set the limit?

Jim Killock: In terms of taking someone to court, there is no particular limit. If I cause £20 of damage to somebody where I should have paid it, the small claims court should be available and I should be able to either prosecute someone or be prosecuted in a civil court in the normal way. The question of how much is “serious” is, in all likelihood, something we should probably leave to the discretion of judges. It will not be very easy to fix a particular amount, but I think “serious” is usually the word used.

Nigel Huddleston: As you have recognised, this part of the Bill has already been subject to a consultation. There were 282 responses to that consultation, with the majority of them being broadly supportive. You have raised quite a few perfectly valid concerns, but do you accept that there is broad public support for the sharing of data when there is a clear social upside?

Jim Killock: I think we are all clear that data sharing should be enabled. The question is how you do that without it being a completely wide open process. The principle is not something that anyone has ever objected to.

Renate Samson: On the consultation that you referred to, you just told me that there were 282 submissions and that most of them were broadly supportive, but the Government response did not indicate who was supportive and who was not, and I have not seen the submissions on the website to be able to see for myself who was broadly supportive and who was not.

Having been part of the open policy making process, I would say that several people in that room had a large number of concerns. They were not concerns to prevent data sharing, but concerns to ensure that data sharing could happen in the safest way possible, and not just in terms of privacy. That way, not only can Government benefit from it and clear processes can be established in Government, but the citizen can understand why their data are being shared and can then be supportive of it and can trust that their data are going to be looked after. It is about the citizen being able to feel as though their personal data, which are now part of the air we breathe in a connected, digital society—we cannot function without our data—are safe and secure. It is about not only data being private, because there are varying degrees of privacy, particularly when you are sharing, but the Government understanding that.

Nigel Huddleston: I am not sure whether we got a clear answer there. The Commons Library published a briefing, which includes statistics from an Ipsos MORI survey that you have probably seen before. The things that get public support are things such as:

“Creating a DNA database of cancer patients…Using data from electronic travel cards…to improve the scheduling of buses or trains…Using police and crime data to predict and plan for crimes that might take place in future”.

There is a clear public upside for some of the most vulnerable and hurt people in society; are we ever going to reach a point where you are satisfied with the use of data?

Renate Samson: You took evidence this morning from two witnesses whom you asked a very similar question, and I support the answers that they gave. People are happy to share data if they understand why and are asked. I believe that the answer you were given earlier referred to the individual. If you ask me whether I am happy to share my data to cure cancer, I go away and I make the decision about whether or not I am happy to do that. As you have pointed out, the majority of people are probably going to say, “Yes, of course.” Big Brother Watch has no desire to restrict that. We are asking for information that we feel is lacking from part 5 of the Bill. We are asking for information for the individual so that they can give their consent based on proper guidance. That is going to be a key part of data protection law going forward.

This is about the way the questions are being asked. Similar questions have been asked throughout the day. We are not trying to say no. We have never said no. We are just trying to say, “Please present us with as much information as possible, so that we can see how.”

Jim Killock: It is really in the interests of Government to get this right, because in the long term it is a matter of trust. We know that accidents happen. If at least the safeguards are in place and as many accidents are avoided as possible, and if people are not left embarrassed at either data leaks or programmes that turn out to be intrusive or prejudicial against people, then you have won. That really was the purpose of the open policy process: to ensure that the risks were understood so that the Government could legislate on the basis of dealing with the complex risks rather than heading straight into a situation where they got a huge backlash and/or stored up problems for the future.

Renate Samson: May I add something quickly? The first line of Big Brother Watch’s submission says that we support data sharing across Government. I want to be very clear on that.

My second point is about individuals doing well out of this. The Bill, well, the factsheets accompanying the Bill, refer to wellbeing. I direct you all to the Supreme Court’s review of the named persons scheme in Scotland, where it was deemed that wellbeing was not a high enough bar—it did not meet the bar of “vital”, which the Data Protection Act requires. We want to do this properly so that people can benefit, but let us ensure that it is proper—that is not perfect, but the best it can possibly be.

Matt Hancock: A couple of questions. Would you be happy to share your blood type data to help cure cancer?

Renate Samson: I do not even know what my blood type is. To answer your question, I do not know. I would have to give it serious consideration, just as I would whether I would be prepared to donate organs after I die. It is not something to which I can give you a snap answer.

Matt Hancock: Okay. You referred to the open policy-making process, which was a big process with lots of people involved, and the large majority are content with that process. Have you read all the individual responses to the consultation?

Renate Samson: No, because I do not know where they are published. I looked for them but I could not find them.

Matt Hancock: They are on the internet, so you are very welcome to have a look at them.

Renate Samson: My understanding is that I would have to go into every single organisation’s website separately to look at them. They are not collated on the consultation’s website itself.

Matt Hancock: No, they are all published online.

Renate Samson: On the consultation’s website itself?

Matt Hancock: They are all published online. In an earlier exchange, you talked about the broad purposes of the Bill and the problem with parliamentary scrutiny of those purposes. I would just like to understand a bit more about what you meant.

Renate Samson: Sorry. Could you repeat that?

Matt Hancock: In an earlier exchange with Louise, you talked about the broad purposes of the Bill and how they are defined. You said that those purposes are very broad, and I think you said something like, “and therefore it can mean whatever the Government wants it to mean”. I do not understand that, because any sharing of data must be for purposes very specifically set out, for instance supporting troubled families and supporting families in fuel poverty. I think it would be very hard to be against those goals.

Renate Samson: Forgive me, I do not recall saying quite what you have said; I know that Dr Whitley said something very similar to what you just said. Our concern is that I cannot give an answer, because I do not feel as though the Bill has defined clearly what data sharing is or what personal data are. I cannot give an answer without being able to understand what the Government intend to do with regards to data sharing. Troubled families and the retuning of televisions are not included in the Bill; they are referred to in the factsheet accompanying the Bill.

Matt Hancock: They are referred to in secondary legislation, which will be scrutinised by Parliament.

Renate Samson: I feel—I can only say how I and Big Brother Watch feel—that having looked through the Bill in great detail, we have more questions than answers. If the codes of practice had been published, it might not have been necessary for me to be sitting here, because I would probably know exactly what the intention is. However, based on what has been published so far, I do not feel that it is clear.

Jim Killock: Future secondary legislation is quite a weak way of Parliament safeguarding a process like this, because essentially you then need to ensure that civil society, Parliament and everyone make sure that all the relevant safeguards are included in each statutory instrument.

Matt Hancock: No, the safeguards are in the Bill. It is the purposes that are in the statutory instruments. It is interesting—

Jim Killock: I do not think that the safeguards are in the Bill.

Renate Samson: Could you explain where they are and what they look like? I cannot see them other than the reference to the misuse of data, and we absolutely support the proposal that those guilty of that could be subject to a prison sentence.

Matt Hancock: Okay. I want to refer to another point that I did not understand. You said that the problem with the Bill was that it referred to RIPA and the Data Protection Act 1998.

Renate Samson: Because that is current legislation.

Matt Hancock: But what exactly would you propose?

Renate Samson: My concern, and this is not a telling off, is that a large chunk of RIPA will no longer be applicable by the end of the year, when the Investigatory Powers Bill comes in, and the Data Protection Act is about to be replaced with the general data protection regulations. Of course it cannot say that on the face of the Bill, but none of the supporting documentation even refers to those two pieces of legislation.

Matt Hancock: It just seems a totally odd point, because the Investigatory Powers Bill is not yet law and, as you can see from the screen, it is being debated in the Lords today. GDPR is not in domestic law yet.

Renate Samson: We were trying to be “assistive”—if that is a word—in that there are elements of the Bill about which not just Big Brother Watch but other individuals and organisations are concerned that if it passes, when the general data protection regulations come in, it will not adhere to that law. It was merely a note of what is coming down the line so we have legislation that has longevity.

Matt Hancock: I do not think it is possible to legislate on the basis of other legislation that has not yet passed.

Jim Killock: GDPR is passed; it is just not implemented.

The Chair: Thank you to our two witnesses. Thanks very much indeed for your evidence. We release you.