Discussion Meeting   RECENT DEVELOPMENTS IN CHILD INTERNET SAFETY  Chair: Diana Johnson MP 

Transcription

WEDNESDAY 22 JANUARY 2014, 9.30-11.30AM
Committee Room 5, House of Commons

Programme

Chair: Diana Johnson MP, Shadow Minister for Home Affairs

9.30am-9.35am: Introductions
9.35-9.40: David Miles, Director, Europe, Middle East & Africa, Family Online Safety Institute (FOSI). Overview of current developments.
9.40-9.45: Will Gardner, Chief Executive, Childnet International. Perspective from children and young people.
9.45-9.50: Jim Killock, Executive Director, Open Rights Group. Perspective of civil liberties.
9.50-9.55: Chris Ratcliff, Managing Director, Portland TV. Current developments on age verification.
9.55-10.00: Nick Truman, recently responsible for online child protection in Bahrain. Practical implications from online child protection in Bahrain.
10.00-10.05: Peter Davies, outgoing Head of the Child Exploitation & Online Protection Centre (CEOP). Personal reflections.
10.05-10.10: Comments by Parliamentarians present.
10.10-10.15: Andy Baker, National Crime Agency Deputy Director, CEOP Command. The police perspective.
Discussion

Background

Since the Prime Minister's landmark speech at the NSPCC in July 2013, major and far-reaching changes have been introduced across the UK child internet safety landscape. Online filtering will be extended to 20m UK homes this year and also to public places frequented by children. This discussion meeting aimed to address practical implications through 5-minute presentations followed by wide-ranging discussion. For example, blanket filtering may have unintended consequences such that teenage sexual health education sites could be blocked ("over-blocking"). We will also examine related concerns such as online freedom of expression or more generalised web censorship.

Current Measures

The Child Internet Safety measures already underway include:


- 20m UK households will have whole-home Internet filtering applied as default by the four leading Internet Service Providers (BT, Virgin Media, BSkyB and TalkTalk), backed by a £25m awareness campaign aiming to inform millions of parents of the benefits of filtering.
- 90% of all public Wi-Fi networks will deploy family-friendly filters through an agreement with the 6 leading UK Wi-Fi providers, with a specific focus on networks in public places frequented by children (e.g. coffee shops).
- Since November, Google has blocked over 100,000 child sexual abuse search terms, the Internet Watch Foundation has £1.5m of funding to proactively seek out those who download such content, and the National Crime Agency will prosecute offenders.

TRANSCRIPT OF SPEAKERS, DISCUSSION & ROUNDUP

David Miles

FOSI is an international charity with offices in Washington DC and London and we are made up of big corporates from Amazon to Yahoo. I am on the Executive Board. I am also Chair of the UKCCIS (UK Council for Child Internet Safety) Overblocking Working Group.

The last 18 months have been an extraordinary time. We are going through a lot of changes, so this meeting is very timely. Last summer, with a lot of activity from the media and the good efforts of Claire Perry and others, public consciousness of Internet child sexual abuse, pornography and inappropriate sexual content really came to the fore in the public domain. That was in conjunction with some of the comments from CEOP around the way predators prevailed. This sea-change in public opinion, in many ways, led to the Prime Minister's speech in July where he laid out a number of key objectives. A subsequent summit on 18th November reinforced that, and there were some important deliverables. That intervening four-month period was a period of significant change.

We're seeing all four leading ISPs implementing network-level filtering across 20 million homes across the UK, to be deployed throughout 2014. That is very significant. Six leading wifi providers are providing family-friendly public wifi for the first time in the UK to filter out child sexual content, with the use of splash screens and the blocking of 100,000 terms in that area. Finally, there are initiatives for a more proactive approach to dealing with a range of activity in other Internet areas such as peer-to-peer networks.

FOSI tries to track the individual initiatives. Overblocking (i.e. inadvertent blocking of non-offensive or educational sites) is one. On the international level we are more aware of this than anyone else. There is no other democratic country that we are aware of that is trying to implement so many important online safety initiatives all at once. The UK is the first to fast-track this level of network filtering at this pace. There could be some consequences to that, such as overblocking, and we have to address these concerns.

The technologies of ISP companies such as BT, TalkTalk etc. are very sophisticated. They have levels of customisation unseen in the UK before and they provide a unique opportunity for families to make an "unavoidable choice". This is an important moment. They are doing that along with public wifi mobile operators, who have already had default-on blocking of inappropriate Internet sites for around 5 years.

There may well be an increase in overblocking. The greater efficiency of new filtering products may do this. The working group is very well supported in this activity.
We look forward to working with the games industry and ISPs and a range of organisations, including social media, to make sure filtering is as balanced and age-appropriate as possible. We are in a year of deployment in the UK so it is important to give ISPs and others room to get on with the task they've been set. I admire the level of political leadership in putting this together and am very proud to be part of the UK Council for Child Internet Safety, sitting with Will Gardner and others who have taken a lead to ensure it is really multi-stakeholder and balanced. It is a very exciting time in this debate, which couldn't have come at a better time.

Will Gardner

We (Childnet International) are a children's charity. Our mission is to make the Internet a safer place for children and I'm here to talk about the young people's perspective. We conducted a survey last year of 24,000 children. There were two groups: 7-11 and 11-19 year-olds. In both groups young people were being very creative and using technology to the utmost. We will be releasing very startling figures about the extent to which young people are creating with technology. That is a positive message. We found in the two groups the four "C"s of risks and consequences facing children: content, contact, conduct and commercialism. These were faced in all their different forms by both age groups – the risks of the Internet face all children, both primary and secondary school children. Young people wanted education and clear reporting while using services.

In 2014 we have organised Safer Internet Day to take place on 11th February, working with our partners in the UK Safer Internet Centre, the Internet Watch Foundation and the South West Grid for Learning. I hope you will all be aware of Safer Internet Day and see it when it happens and be involved. There are some big players supporting us – the likes of ISPs, BBC, Disney, Post Office and Tesco. We have a wide range of stakeholders from public, private and civil society who are actively supporting this day and hope we will have incredible reach and incredible impact. Activity will include many schools, who will be running assemblies, bringing parents in, doing sessions for pupils, and getting pupils to run sessions with parents, for example. Statistics tell us that last year's Safer Internet Day reached 1 in 10 of the population and 40% of those who heard our messages changed something about their online behaviour as a result. Hearing about this potential reach and also potential impact, I hope you will be motivated to get involved. There are things everyone can do.

Looking beyond Safer Internet Day, be aware of initiatives such as the current Cyber Streetwise campaign generated by the National Fraud Authority, which is online and offline, the latter including posters and billboards promoting Internet security and messaging around passwords, virus protection and Internet safety.

In 2014 we will also see a choice for the customers of the big ISPs: do you want your parental controls switched on or off? The ISPs will work this year to provide this choice to all their existing as well as new customers. It is a real opportunity for us to communicate with parents and carers on this issue. The choice has to be an informed choice to be an effective choice. We should all support what the ISPs are doing and encourage parents to realise that there is this choice, but that they don't have to wait for this choice to make a choice. The choice is already there.
The filters are there and you can go ahead and use them even now.

The new computing curriculum in September will have e-safety in primary and secondary schools. This is again a real opportunity to get this issue more uniformly covered in the school curriculum. Ofsted has an e-safety briefing for inspectors. That is on their agenda and a really effective lever for school leadership. 2014 has all these things, including the big ISP campaign which is starting in the Spring with £25 million worth of awareness activity to reach out to parents. A lot is happening and we can use that to the effect that we all want to see – to equip young people to use the Internet safely and responsibly, and to support those who support them in their use of technology.

Jim Killock

I have been asked to speak from the civil liberties perspective but will start out with a pragmatic point of view. Firstly, we (Open Rights Group) recognise that the UK has more Internet filtering than any other country. More parents use it than in most other European countries. Secondly, in terms of getting this problem right, it is extremely important that parents are engaged and that they understand the choice they are signing up to, because people will simply switch filters off if the starting settings are set too broadly and they find too much is blocked and it gets in their way. People need to understand how the technology is working. Part of the problem with the conversation up to now is that those mechanics have not been understood. If you get too many people to sign up unnecessarily to filtering you end up with the opposite of the aims government has set itself.

You need to recognise that different types of setting and filtering are needed for different children, and for different parents with different aged children and the kinds of technology they are using. Network filtering has an allure of getting the whole home protected, but we need to recognise that's not true. David Cameron should not be using phrases like "one click to safety". That is downright irresponsible. It's a question of what your children need. You cannot easily target that when you've got one setting for everybody. That may result in some parents finding the level of filtering irritating, such that they feel tempted to switch it off. Older children might pester their parents and that might also be a pressure to reduce filtering. If you can target the filtering to the child you will be better off in terms of the policy goal we have here.

The second thing to emphasise is the problems we currently have with blocking technologies. You've heard about the problem with overblocking – it is a very real one. Small businesses such as small shops or cafés come to us and say "my website is blocked", but it is extremely difficult to report that anywhere. Often they haven't gone to the ISPs, or if they have they find it extremely difficult. If they ring up the ISP they will be asked "are you a customer?" and they reply: "no, I'm just blocked on your network". So the ISPs say that they cannot do anything because you are not a customer. Or you ring up a mobile company to say a website is blocked. You might say: "please remove the block on this site because it is not pornography - it is a church". Those companies would reply: "would you like your filtering off?" and you would reply "No, I don't want the filter switched off – I just want all the people who go to this church to be able to visit this website".

Customer services don't know how to respond.
The problem is that it is impossible to educate customer services easily. Customer services have a lot of staff turnover and they hear about this complaint relatively infrequently compared to other complaints they get, such as "my Internet is broken". They find it difficult to respond and ISPs find it difficult to train their staff.

What if you are an overseas website? What do you do if you are running a site in the US or France and it is being blocked in the UK? How do you even hear about it? When blocked, how do you get the block removed? Will they try to find the network operator in the UK and go through the dreadful processes with customer services? It is unlikely. They will just have to live with the problem, however difficult that is. There is a problem with reporting and with the mechanism for getting errors corrected. O2 is the only service that has provided any checking tool, but it is currently closed because too many people were using it and complaining about the results they were getting.

The impact on children is often regarded as good, but LGBT groups are worried that even their own websites routinely get blocked because of the similarity between the keywords they might use and those used by pornography. It is the same for sex education sites, which use similar keywords: filters identify them automatically, so both categories of material end up being blocked. It is difficult to resolve because the number of sites is enormous and the means to categorise them is automated or partially automated. Some may be identified correctly as adult but are perhaps important for LGBT people to access, because this is a vulnerable group and it is a means of finding out about their sexuality. We need to recognise that this is a powerful technology that restricts information. It could easily lead to concern that religious groups might want to block sex education merely on religious grounds. Is it appropriate to deny access to information? For example, is it appropriate for them to block LGBT sites or even science and Darwinism? Such impacts could be a result.

We are doing a lot of work to document and understand what's going on - see blockstop.org.uk. Thirty voluntary developers met last weekend at Mozilla's offices where we tested the filtering directly. We are doing that because the network operators are not providing these tools to the public. We're doing that to find out what is and what isn't blocked. Concerns about intellectual property rights mean some commercial companies are reluctant to provide access to their major tools. They need to think about the public interest in finding out what's going on.

Chris Ratcliff

Portland TV has operated Ofcom-licensed adult TV channels across the Sky, Virgin and Freeview platforms since 1995 under the brands Television X and Red Hot TV. Our broadcast channels offer soft-core content equivalent to a BBFC 18 classification. In contrast, our two on-demand websites offer full-strength hard-core material which, if classified, would receive an R18 (Restricted 18) certificate. They are fully compliant with the Rules & Guidance of the UK on-demand regulator ATVOD (the Authority for Television on Demand). Portland sits within the broader Northern & Shell media group comprising Express Newspapers, Channel 5 and the Health Lottery. Needless to say, we take the issue of child protection very seriously. In line with ATVOD regulation, all sexually explicit content on our websites is placed behind access control.
This means that until we have verified you are over 18, hard-core imagery and videos remain inaccessible. ATVOD accepts that payment by credit card is one measure of a customer being 18 or over. If you wish to pay by any other means, we use the same methods as the gambling industry to age-verify our customers. There are a number of established providers in the market offering age verification services that have grown up around the gaming sector, such as GB Group, 192 Experian, Call Credit and Intelligent ID, with whom we've partnered. At the point of first registration, we'll take your name, address and date of birth and run a secure real-time check against the edited electoral roll and available credit data. If this fails, we are able to verify against passport and driving licence data. This will extend shortly to a document validation service which will allow you to take a picture of your passport or driving licence on your smartphone and send it to us for instant verification. We are also looking into the extent to which social identity (your social media profile, for example) might be used as a valid indicator of age. Though robust in terms of restricting access, AV match rates remain low, costs per verification remain high and any hopes of users browsing their favourite porn site anonymously are swept aside. And however commendable these endeavours may sound, we are one of three UK adult companies to have adopted age verification and one of only 25 companies to step forward to be regulated and implement the bare minimum. With all these hoops for consumers to jump through, it is not surprising that ours are not the sites which garner the traffic.

Web traffic will follow the path of least resistance and this is where the adult tube sites rise to the fore. The so-called tube sites are porn sites based on the YouTube free-play model which offer hard-core video at the click of a play button, with no warnings, splash pages or any means of restricting kids' access. According to Alexa's global traffic ranks, 6 of these tube sites appear in the top 100 websites visited from the UK. The most popular, xHamster, sits at number 26, one place ahead of Instagram, with xvideos at 36 and Pornhub ranked as the 44th most popular website by UK usage. If we look at some Hitwise Experian stats for UK visits to these 6 tube sites, the figures are staggering. Pornhub gets 66 million monthly UK hits, xHamster 63 million, xnxx 29 million, RedTube 28 million, xvideos 28 million and YouPorn 26 million. That's a total of 240 million hits from the UK in a single month to adult sites with no form of child protection.

These sites sit beyond the reach of the UK regulator. The ISP filters may catch them, but where parents choose not to opt in, or disable the filters because they are disrupting their adult experience of the web, nothing will stand between kids and a Google search for free porn. Clearly, in terms of child protection on the Internet, the current regulatory and legislative framework is not fit for purpose. The tube sites and the majority of offshore adult pay sites would, if UK-based, fall foul of the Communications Act and the Obscene Publications Act. A once heavily controlled sector has become a free-for-all as the power balance has shifted away from broadcast to the Internet.
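As an aside, the verification cascade described above (credit card accepted as an indicator of age; otherwise a real-time check of name, address and date of birth against the edited electoral roll or credit data; with passport or driving licence documents as a fallback) can be sketched roughly as follows. This is a minimal, hypothetical illustration only: the function names, stubbed checks and data sources are placeholders, not any provider's actual API.

```python
from dataclasses import dataclass
from datetime import date
from typing import Callable, Optional

@dataclass
class Applicant:
    name: str
    address: str
    date_of_birth: date
    paid_by_credit_card: bool = False
    document_image: Optional[bytes] = None  # photo of a passport or driving licence

def is_over_18(dob: date, today: Optional[date] = None) -> bool:
    """Return True if someone born on `dob` is at least 18 today."""
    today = today or date.today()
    age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    return age >= 18

def verify_age(applicant: Applicant,
               electoral_roll_check: Callable[[Applicant], bool],
               credit_data_check: Callable[[Applicant], bool],
               document_check: Callable[[Applicant], bool]) -> bool:
    """Illustrative cascade; each callable stands in for a third-party lookup
    (edited electoral roll, credit file, document validation) and returns True
    when the supplied details match a record."""
    # 1. Payment by credit card is accepted as one indicator of being 18+.
    if applicant.paid_by_credit_card:
        return True
    # 2. Real-time check of name, address and date of birth against the
    #    edited electoral roll or available credit data.
    if electoral_roll_check(applicant) or credit_data_check(applicant):
        return is_over_18(applicant.date_of_birth)
    # 3. Fall back to passport / driving licence document validation.
    if applicant.document_image is not None and document_check(applicant):
        return is_over_18(applicant.date_of_birth)
    # Verification failed: explicit content stays behind the access control.
    return False
```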
ATVOD's remit restricts it to scrutinising the activities of a handful of UK companies like Portland, while the market dominance of Internet porn giants like Pornhub operator Manwin (aka Mindgeek) remains unchecked. The situation is untenable. Urgent attention is required to adopt a common-sense approach to regulating adult web content, defend UK business interests and, of course, protect kids.

Nick Truman

I'm going to talk about my experiences working in the Middle East in the Kingdom of Bahrain, which had a national filter under which all pornography was blocked by default regardless, resulting in a huge market in VPNs and proxies. The country also had a portal: if you thought your website should be on there you could submit it and the national censors would decide whether it was appropriate or not.

My remit was to try and implement child Internet safety in a Kingdom with no Internet control or Internet safety beyond the national filter – it was a blank canvas for child Internet security. I decided to put in place a system that I had installed while I was head of customer security at BT. It was quickly apparent that I had three main obstacles:


1. Politics
2. Culture
3. Industry

Politics: The country is run by a Sunni minority. The King and Crown Prince are forward-looking, but between them and the rest of the country was a firewall of staunch Sunni protectorates who fought hard to ensure their way of life was not undermined, objected to washing dirty linen in public and took no responsibility. Sunnis were the biggest adopters of VPNs and proxies to bypass their own firewalls. Meanwhile the Shia majority followed strict Sharia law based on Iran. The result was a two-tier Internet: those who could afford to be safe, and the uneducated who were at risk.

Culture: The Bahraini culture does not fit with the 21st-century Internet. For example, a young girl cannot go out alone and has to be chaperoned when meeting boys. But, with the Internet and Facebook, girls have been talking online to boys outside their parents' control. The researchers in my "State of the Nation" review were confronted by the consequences when parents found out: severe beatings of children, attempted suicide by girls in public schools in the Shia majority, and uneducated people who didn't understand the technology going to great lengths to ensure that, where possible, their children were not involved with it.

Industry: Industry in Bahrain has no sense of corporate social responsibility. Unlike in the UK there is no corporation tax, and indeed no tax whatsoever, so there is no incentive to do anything to look after the consumer. There are four Internet Service Providers, the services are expensive and slow, and urban communities tend to buy one connection and network all their houses together. I tried to propose to industry that they adopt best practices voluntarily but they looked at me as if I had fallen in from another planet.

I carried out the first Middle East State of the Nation Review – a quantitative research review – and interviewed most people in Bahrain from stakeholders to ordinary people to get to grips with the issues. One result was that the Minister for Telecommunications had a eureka moment when it dawned on him that Bahrain could be a destination for sexual predators because of the complete trust people in the community placed in the Internet. We put in place a fantastic technology-neutral Internet safety guide in partnership with Childnet International. That resulted in a national consensus which industry signed up to and a code of practice which was adopted across the Middle East. As I left the Kingdom, change was not going to happen overnight - it is a slow process which is still ongoing.

Peter Davies

I wish to make it clear that Andy Baker is Deputy Director of CEOP and will speak for CEOP and the law enforcement community. I am still connected with the law enforcement community, but I am speaking here for myself. We have so much to be proud of and glad of, and David Miles nailed it when he said the last 18 months have been the most busy, productive and active in child Internet safety, certainly in the UK and, given the position the UK occupies, possibly in any nation in the world, and we should celebrate that.

From my perspective in CEOP and my national policing responsibility for child protection and child abuse investigation, I have observed before that we enjoy probably the best framework for child protection, in terms of legislation, agencies, partnerships and participation, of any country I've seen. So we have much to be glad of. Speaking briefly about CEOP – knowing there are few precedents for outgoing CEOs, I'd like to set my own.
The last three years have been hugely developmental. The size of the centre, the productivity, the level of expertise, the reputation and the impact of our operations have been terrific. I also think that the National Crime Agency has been the right place for the Centre.

There are, however, challenges which can be summarised in two ways. Firstly, for all the excellent work and progress we have achieved in the last 18 months to three years, the level of impact we have had compared to the scale of the threat can't give us any sense of contentment. Secondly, the fluidity of what goes on online and the lack of control of content and access, superbly expressed by Chris Ratcliff, are major challenges going forwards, notwithstanding the progress already made. Threats change, technology changes and the way in which human beings, particularly children and predators, use it constantly changes, and so a set of solutions that was the appropriate state of play three years ago is now by definition obsolete. And the set of solutions we come up with now as a result of these sorts of conversations will be obsolete in three years' time. So we must gear ourselves up for a constantly changing level of threat.

The other important thing is that we have a duty and responsibility continually to improve our understanding of the threat. For all the existing expertise there are aspects of what goes on online about which we don't know enough, not just in terms of child sexual abuse but other forms of harm. So it is our duty to continue on the quest for knowledge. There will never be an endpoint here. In terms of managing the risk, the situation in this country is better than elsewhere in the world, but not enough to protect children either from predators or from access to the wrong type of material, as Chris Ratcliff so startlingly illustrated. The growing swell of evidence is that children are affected by their access to pornographic material and that it affects their perception of relationships. That sets their assumptions, and maybe we are creating a new set of problems to be tackled in terms of young people and adult behaviour in the future.

I think that partnership, and understanding that the protection of children is everybody's responsibility as individual citizens, politicians, agencies and organisations, whether industry or elsewhere, is absolutely essential. There is a duty to understand that. People I've encountered over the last three years absolutely understand that. One of the ways doing a job like CEOP's is made easier is that there are very few people who don't respond at the emotive level to what CEOP does. Child protection is always urgent and always important. Let's not look back in 18 months to three years' time and feel we lost momentum or opportunities. Let's maintain it and move it forward and be more energetic and proactive in the future. This is a new normality.

Baroness Thornton (Shadow Spokesperson, Equalities & Women's Issues)

I have been involved in all legislation over the past 15 years to do with child protection. As Parliamentarians, we in the House of Lords think there is a gigantic new problem to deal with here. I have one question for Will Gardner: what might you be doing for cared-for children? Filtering regimes in the home depend on parents doing something, but what about those children who do not have parents - cared-for children? What are your thoughts?
I have a second question for Jim Killock: you gave a run-down of the problem you believe we are facing but didn't suggest any solutions, apart from the possibility of turning off all the filters. As Shadow Equalities Minister in the House of Lords I am talking to LGBT groups about this issue and can share their feelings because I can see the problems that they are outlining. But I don't think the answer is to say that there shouldn't be any filters. Is it beyond the wit of large, rich ISPs and organisations in this space to come up with solutions? I think not. I welcome what Portland TV had to say about age verification because, certainly in our debates in the Lords, we do think that age verification is very important. If the gambling industry and Portland TV can have proper age verification and really deal with it, then I can't see why the rest can't.

Baroness Howe (Cross-bench Peeress)

What's been said around the table today completely reinforces everything that we have been feeling about the whole area for a long time. Age verification - demonstrating that a person seeking to access adult content is 18 years or over - is absolutely key. It is very interesting that where age verification has been required by law, namely with respect to online gambling, it has been provided very successfully. It is also interesting that before it was required by law it did not really happen. I think that this has been an important gathering, with everyone contributing to the picture, which gets worse by the second. In the next few days we are seeing if we can strengthen some of these areas through an amendment to the Children & Families Bill currently going through the House of Lords. This meeting will certainly reinforce me as we consider what we can do in the next few days.

Julian Huppert MP (Liberal Democrat MP for Cambridge)

Child Internet protection is clearly very important for everybody here. That is the obvious starting point. We all want to achieve that, but it is not to be used to do anything that people want. I suspect that if you ask in Bahrain why they make sure a girl is not allowed to talk to her friends or start texting them, you will get a very moralistic line about it. So the important argument is about what we want to achieve by the measures; they cannot just be used as a trump card for anything you want to do. There is a risk, along the lines of "Yes Minister", that "we must do something; this is something; therefore we must do this", with a "this" which doesn't necessarily achieve the goal of child protection we're trying to get to.

There is a huge split, and not much has been said about it here, between legal and illegal content. The Internet Watch Foundation's work is incredibly effective in dealing with illegal material in this country. That can't be the same approach for legal content. There is an obvious distinction between legal and illegal. The Canadian Project Spade is a case where we were waiting 18 months for CEOP to act. That was a case where Canadian police passed to CEOP 2,345 specific suspects of involvement in illegal activity with children. No action was taken when CEOP was given the information. It is a great shame that opportunity was missed. It seems to me that we don't need more powers; when we get the suspects we should actually have the ability to get them brought to justice.

On the issue of what we can do for children there is, of course, a role for filters.
I have no problem with parents choosing filters and having a level of control with some constraints, but this should not be pushed by government. It should not be enforced, because I think that would actually worsen protection for children. It would give a sense of false protection. If you look at all the surveys on filtering you will see that a lot of parents choose not to have them. Children will believe there is protection that is simply not there. If you look at Ofcom reports you will see that there are lots of ways of getting round filtering. It is better for parents to keep an eye on what their children are up to than to say: "I've set this filtering up so we'll leave them in their rooms and all will be absolutely fine". That is actually worse for the child.

There's also the very serious issue of overblocking. It presents problems for the LGBT community, and I've spoken to children who are struggling to get information but find controls blocking issues such as domestic violence. However hard it may be to set up the filters, there ought to be some sites that we should not allow people to block, for example Childline. There will be a set of areas where adults should not be allowed to prevent children getting access. We want things that work. We do want to get child protection. We have to achieve what will work for child protection, not things that look as if they will work.

Baroness Masham (Cross-bench Peeress)

Will more legislation help? This area is so complicated. There are many parents who really don't mind what their children do. Educating parents is important, but there are some chaotic parents who just leave their children to do their own thing.

Andy Baker

First of all I would like to deal with the point about Operation Spade. Operation Spade was one referral out of 18,887 referrals for that year; it happened to have 5 cases within it and it was a global issue. The images we received in Operation Spade were a patchy picture passed on to us via Interpol from the Toronto police. You would have been challenged to have found images of sexual poses of children, and there were none of contact abuse. On the intelligence front, regarding insight into who those people are - mistakes were made, we've moved on, and some of those people who were passed on to us have now been dealt with. However, others in positions of trust and positions of working with children had bought those images through Amazon. So when you look at Operation Spade as an individual case there is a lot more delving into it, and we need an intelligence response to the intelligence picture. There were no contact abuse images. However, men who had a sexual interest in children had bought most of those images. We've learned from those issues and we've moved on. It was one referral of just short of 19,000 that we dealt with in CEOP.

It is important to say that we're discussing the bad side of the Internet. We all use the Internet – we have no choice nowadays – and 99% is good, used in banking or research for example. It's the bad side we have to police. Up to one third of the imagery we see at CEOP is there because of risky behaviour by children. It is a learning process that those of 12 years and upwards go through. We've all been through it and we often forget, except we are now in the Internet age.
We see many issues, whether through Childline or Childnet or others, where children have threatened serious self-harm as a result of their behaviour with a boyfriend: they have shared images in this learning process, then sadly they break up and the boyfriend takes the images viral. That is a massive issue for parents when it comes to abuse. Do you stand over their shoulders and police it in the nicest way, because children are more at risk in their bedroom than crossing the road? As a child progresses through those tender years we show them how to cross the road. Likewise we should show them how to learn through the Internet, through education and parental guidance. But following the road analogy, even when crossing the road supervised, some children still put their toes over the edge of the kerb.

It is interesting that Bahrain was mentioned – there the policy, particularly about girls, is strict. However, we have five men in custody in Bahrain for abusing children around the world over the Internet - there are no borders in this. There is a culture of men grooming boys that targets English-speaking and British children, both because of their knowledge of the language and because of their perception of our relaxed view of gay rights and homosexuality. So there is an interesting tension there.

We should never separate offline from online. To have an online offence you must have an offline offence, whether that's sexual posing – an individual has taken an image and posted it themselves - or contact. There is a blur between the two and we should never separate them.

We identify three strata of activity in this world of child sexual abuse and exploitation:
1. Open web (with many thousands of viewers).
2. Hidden web (with hundreds of participants).
3. Peer to peer (with thousands of those sharing and distributing).

Regarding the open web, the Prime Minister assisted us greatly with his statements in July and November, and that will help in tackling the open web through measures such as blocking, filtering and blacklisting. Overblocking is an issue – we need to police that and educate people around it, because detection and prevention are as important as pursuit. Regarding the hidden web, that was picked up in November and we now have a UK/US Task Force working on it under the guidance of Joanna Shields, working to the Prime Minister. We are engaged with that and have just appointed a senior officer within CEOP to work on it. And we are conducting a number of operations in all those areas, but we are particularly focussing on peer-to-peer communications.

On the issue of peer to peer (which is one party communicating directly with another, not on the open web), children are really well versed in peer to peer because they download films and music in that arena. So in case we think it is only paedophiles who are in that peer-to-peer world, children are there too. The Prime Minister's statement has helped us. We have a target date of November this year. I agree it is complex in law, but we are working with officials to see where there are benefits in tweaking the law, in travel orders and in asking whether the offence of blackmail in this country covers this, because currently it does not. It goes back to the Theft Act 1968, Section 21, when the existing statute, which regarded the value in blackmail as being solely money, was changed to include drugs. I think sexual imagery of children has a value in the hands of paedophiles and the law needs to move on in that area.

What we have done in the UK, and I'm proud to have been a police officer for many years protecting children and dealing with offenders, is that we have lifted the stone. CEOP is unique in the world – there is no other CEOP worldwide with this discipline and the partnerships, openness and willingness to move forwards. Our partnerships with the Internet Watch Foundation are best in class, and with Childnet and others in the international area too – we have the International Child Protection Network and we provide certificates for people to work abroad to say, as best as we can, that they have no records in the UK. Every day is a school day - a learning day, because paedophiles will find every opportunity to move forward. We've got push factors from the UK and pull factors from poorer countries because of poverty, culture and other reasons. A good example of that is Operation Endeavour (where a UK-led police operation broke up an international Philippines-based web abuse ring), which was in the news last week and is to come to trial. Paedophiles have learned to live-stream from the adult porn industry. There are many facets of child abuse that we have to tackle, and we are not going to give way to them.

GENERAL DISCUSSION

Will Gardner: On the question of cared-for children, this is an important point to raise. What we have coming up is an opportunity that extends across charity carers and schools. There are educational materials available, as well as levers and opportunities within the schooling system. There are broad awareness schemes coming up. It is important that all this is being used to support all children across the spectrum. So that is something we need to be looking at.
Our partners at the UK Safer Internet Centre, the South West Grid for Learning, are delivering training courses for social workers around the country at the moment. It is a big job; important initiatives like this are happening but it is going to take some time.

Jim Killock: On the question of solutions, to be absolutely clear, we are perfectly fine and happy with parents having tools and using them, and find that reasonable. That is a good objective. I am trying to emphasise that they are no panaceas and we cannot assume that filtering is going to solve lots of problems. It just isn't. Filtering is OK if the children are co-operating with it, if they think it is fair enough, and if the filtering is limited so that the children do not suffer from not getting material they're entitled to. Regarding solutions, how do we get filtering to work better? The key thing is that people understand what filters are enabling, that they are closely involved in switching them on so that they know what they are and are not blocking. They also need to know when they've got problems with filters. Wherever they are, people need to know whether they are being blocked or not. Therefore we need a great deal of transparency, we need everyone to be able to report problems, and we need error corrections to be dealt with. There's a great deal of work that can be done in these areas and needs to be done, whatever the set-up processes are.

Baroness Thornton: What about filters in public spaces? Should they be mandatory?

Jim Killock: Mandatory is probably a little strong. I think it is a bit of a red herring.

Baroness Thornton: But what about boys sitting in Starbucks accessing pornography?

Jim Killock: It is extremely unlikely that people would be in Starbucks accessing pornography. Even if it is technologically possible, it is also technologically possible to get round blocks. So block if you like, but you have to remember that you are going to cause certain things not to work. Filtering causes people a lot of inconvenience because lots of services that you might want to use on the Internet break. You don't just break access to websites. Lots of other things go, and that damages innovation. You have to be really, really careful about that. The key point is that you need to understand the harms on both sides. You need to make filtering work in a way that everybody understands what is going on. Remember that any child armed with a computer can get round filtering, so if you really start blocking, all you do is stop access to a range of websites. If you really want children never to access pornography you're going to have to destroy people's technological education and make sure they are incapable of using a computer.

Baroness Thornton: I didn't say that. I am concerned about a cohort of young boys with access to hard-core pornography, not just top-shelf stuff, for hours and days at a time. That is damaging them and their future relationships. We, as adults, have a responsibility to try to mitigate that.

Jim Killock: Everyone agrees with that. The question is whether we are over-relying on filters in this debate. Everybody will also agree that filtering is not going to answer more than a fairly small proportion of that problem. Anyone determined to access material and armed with a computer and an administrator's password is simply going to be able to do that.
Nick Truman: In 2004 I came up with the idea of BT Cleanfeed, which was a selective way of blocking images of child sexual abuse using the Internet Watch Foundation's list. That was black and white: if it was on the list it got blocked, and if it wasn't on the list it didn't get blocked. Contrast that with the experience in the Middle East, where everything is blocked and there is no way round it unless you invest in a VPN or a proxy. There is a national portal so you can ask for websites to be taken off. If you were absolutely adamant about accessing harmful sites, even the filters wouldn't work properly. In our experience, people used Google Translate to translate the search term they wanted into Chinese and then looked up the Chinese expression, which bypassed all the filters. That was always a weak point. The problem is that the good guys are always blocked and the bad guys will always find what they want. Sites like Pornhub make their revenue by ensuring that they are on everybody's desktop and they will do everything they possibly can to bypass and circumvent any filters that are in place. I think that filters are imperative and I think more emphasis should be put on parents and carers to ensure that. You can't buy a car without a seat belt – in the same way, all laptops should come with parental controls and filters built in.

Peter Davies: We went through this discussion when we were dealing with the major search providers at the time of the Prime Minister's speech. It should be phenomenally easy to test how effective filters are in real life, what the level of overblocking is, and whether overblocking represents a highly prevalent risk, a low-frequency minority concern, or a matter of principle. It is very easy to test this. John Carr has been leading the field in doing this, sitting in Starbucks for hours on end. We don't need to theorise about overblocking. We can test this every day online. That is an opportunity that very few issues of this significance have.

Andy Baker: The technology has been created and it must be able to be controlled. Those that are making money out of this must be mature enough to say: we know how to make this happen, we also know how to police it and make it safe. I haven't got an answer to overblocking. This has got to be a step-by-step approach. I don't think we are going to come up with the answer as a whole. I do know that over-filtering, just by taking out the LGBT references, stopped a company in America from trading. In America that means lawsuits. So we have to work it through really closely. But I do think there's an obligation on those who provide access to websites to police that access as well.

David Miles: As Chair of the Overblocking Working Group I've been working over the last few months with all the stakeholders involved, including human rights groups and the gaming industry and those areas one would not have thought would be impacted. It is interesting that the level of overblocking at the moment is relatively low numerically. We are trying to measure that to see if there is a difference as the filters are implemented. It is important to understand where we are now and where we will be in the future. Internet Service Providers (ISPs) are working very closely with us. They are dealing with individual cases of blocking, responding to them and being flexible in dealing with them. They are trying to learn, because not all four ISPs have deployed yet. We are experimenting at the moment.
These technologies are used in other countries, such as Australia and America, but none of these countries are looking at overblocking, so we should be proud that we do want a balanced response and we are doing this over a period of time. So the Overblocking Working Group is breaking new ground, and the involvement of all the stakeholders will come to a pretty good conclusion in terms of response. Some sectors of the industry have already taken a lead. The mobile operators have already implemented a reporting system with the British Board of Film Classification (BBFC) which is starting to work well. We will see how that gets on. There's a start there and we will be talking to them. The final point, which I've been discussing within the European Commission, is: "Why now?" There is a big driver here. Ofcom's research last year said that 37% of 3-4 year olds access the Internet regularly. So technology is redefining childhood. Whereas 3-4 years ago we were dealing with those aged 12 and up, we are now dealing with pre-school children as well, 9% of whom (before Christmas) already have tablets. The urgency is there because a broader range of children at different age groups are coming onto the Internet, and their engagement and fascination with technology make it very urgent that we fix child Internet safety.

Will Gardner: Filters are a useful tool. We've heard filters criticised because they are over-effective and also because they are under-effective. You can attack filters from both sides, but in essence they provide a useful tool to help parents block content from children. I don't think you should imagine they are going to solve everything. It is very important that when we talk about filters we see them as part of doing something. There are still other Internet-related issues such as cyber-bullying, sexting and other areas that present technological risk to children. So there are problems with filtering, and I'm glad there is this Overblocking Working Group, because the worst thing that can happen is that some people come to filtering as a means to help their children but end up switching it off. We want to do everything we can to discourage that from happening. This is the time and opportunity to make the filters more intelligent and more responsive, and to enable child Internet safety that is as effective as we can make it.

Kristof Claesen, Internet Watch Foundation: We take reports from the public about illegal online content, in particular child sexual abuse images and videos, and then work internationally as well as nationally to remove that illegal content and temporarily block it if hosted abroad. The Prime Minister's Summit last year was a focal point for us and we have organised a number of awareness events. This activity resulted in a 30% increase in abuse reports received by us and a 3,000% increase in media coverage during certain months of 2013. The changes last year mean that we are now stronger than we were before. Following a membership review and a new funding structure, our Members have increased our funding. Also, Google donated £1m to employ additional content analysts. That has resulted in us taking on seven more analysts, starting at the beginning of February, to join our existing team of five and enable us to take a more proactive approach. We are now, therefore, able to search for publicly available child sexual abuse content ourselves instead of waiting for the public to come to us before we take action.
Previously we only worked from reports we received from the public. We are also working with the Home Office and with law enforcement on a pilot project to disrupt the pathways to peer-to-peer content that appear on search engines. We're not law enforcement, so we don't go after the people: we try to remove content and work closely with law enforcement for this. Regarding our international work, Mauritius is the first country to use our Online Child Sexual Abuse Reporting Portal (OCSARP), which enables countries without a hotline to give their citizens a way of reporting child sexual abuse content on the Internet. The IWF receives the reports, assesses them and works to disrupt the availability of the content. On blocking, we have introduced with our industry members Splash Pages to provide additional transparency when a URL is being blocked because it is on our list. This Splash Page informs the user why the URL is blocked, how to complain if they deem the blocking incorrect, and where to find help if they are worried about their online behaviour. We also have a Human Rights Review to be published soon, and at the end of March our Annual Report will include the most up-to-date information regarding statistics and recent developments in this area.

Andy Baker: CEOP is working with the IWF and through the Home Office to build a national library for a Child Abuse Images Database.

Sally Leivesley, Newrisk Ltd: My current work is on catastrophic risk and terrorist risk, and I wanted to raise the issue that when we're looking at the child internet sector we're looking at children's preparation for life. We're dealing with trust and the opposite of that, the abuse of trust. Children have absolute trust. I recommend that we look at the use of the Internet for advertising terrorist ideology and at child recognition of international terror. Since 9/11 it has been apparent that terrorism has given a career option to children. Child recognition of 'e-terror', which I define as the use of the Internet for advertising terrorist ideology, is a broader option for Internet safety. E-terror safety would focus on child and parent recognition of this catastrophic risk to the child's future and to the security of society. At a catastrophic level we see it particularly through the recruitment of females into an ideological framework where they can potentially maximise attacks on their home countries. This is a result of early adolescent children searching the Internet and, through terrorist advertising, getting access to scenes of violence and power over people, and to the ideology of doing good for the world through a terrorist career route. It hasn't yet been recognised that terrorism is open to children as an alternative option in their search for a career. There are risks through approaches to non-Muslim nationals to convert them to terrorist ideology and to perpetrate devastating attacks on their country. If you look at that and the options you are considering, there is no reasonable parental objection to this recognition of terrorist influence and role modelling through the Internet being taught in schools, or to parents being involved in this through school, with information and reporting. There is no reason for young children not to understand what terror recruitment is, to recognise it and also to report it - in the same way as they are taught regarding general abuse by strangers.
In this area of terrorism we are considering a different solution from those currently being suggested elsewhere. Maybe there is an important and wide-reaching solution for providers. I agree about the concerns over overblocking and the intrusion that blocking will reveal. I think we are also naïve about children's access to technology because of the range of technology now, from mobile phones to, even in the future, screens on refrigerators! The access routes to children are such that parents will become important for recognising changes in their children and dealing with them through schools and other authorities, in the same way as they currently do for drug abuse. We also have to look at parental controls to help prevent children being reached and groomed or persuaded into a terrorist group. We have the ability to circumvent this and need to ask if the providers can look beyond generic static blocking to look at Big Data. Responsible providers could try, by identifying child abuse signatures, to stop the viewing or transmission of material. By setting up a shared Big Data file they can get to a granular and immediate evidence-based identification of where the Internet is being used in real time for exploitation. This approach is related to that used to prevent terror. We should not rely solely on the police, because there is always a time delay between analysis within government and action. Future technology could use Big Data. Providers know that this has to be dealt with instantly. With terrorism, as soon as there is a terror act the violence and the dead bodies appear on the Internet. So reaction has to be instant. That is something the private sector can do itself if there is a shared Big Data file which gives instant access to e-terror abuse. They can make their choice about what they do or do not transmit.

Simon Milner, Facebook: There are arguments on both sides of the filtering debate, and the work of David Miles and his group will be very important here. I do think this is something where Parliament should take an interest, as it is the body that takes responsibility for the regulation of the communications industries. Parliament is evidence-based and is by far the best organisation to assess these items and look at the evidence. So I would strongly encourage Parliamentarians to consider a potential role for Ofcom in monitoring this year of implementation. I would just like to spotlight the facts. Parliament has the best facility to look at this. Also, we should be very cautious about thinking that what may work for content-based risks for children will also work for behavioural risks. Any suggestion that filtering will tackle issues such as self-harm, suicide or bullying should be scotched. The idea that you can use your Internet filter to prevent these abuses is nonsense. We know from experience that technology is part of the solution: you need the best possible tools to help people to manage their experience of interaction with other people and to report on it. But the solution is not just technology-based. You need experts to look at those reports, real people who understand human relationships and how people behave online, and you also need to work with partners. We work very closely with CEOP and Childnet and other organisations in the UK and around the world to get the messages out to children, teachers and parents on how to deal with these issues.
That's not about technology: it is about the power of real people. It is about empathy, peer support groups, and schools. Finally, one of the biggest mistakes in the UK is the blocking of social media in schools. In our experience many teachers encounter a problem around Facebook. Teachers are told, "do not use that service – you shouldn't be on Facebook", and so they have no idea how to help their pupils. So let us use Ofcom, and let us think about people and not just technology.

Emma Carr, Big Brother Watch: We wrote a piece for Mumsnet recently on the topic of online privacy, and one aspect of the feedback we received was that many parents are clearly advocates of active choice filtering, meaning each household controls what its own children can see online. This is because the cultural sensitivities of each household are different. There appears to be concern from parents about government, at a high level, mandating blocks on what their children can see on the Internet. For instance, some made the relevant point that each household's outlook on what constitutes pornography and adult content is very different. It is important that the adults in each child's life are able to directly address child protection and behavioural issues, such as attempted suicide or LGBT issues, and only they can decide what is and is not helpful for each individual child to see. As an issue, filtering is incredibly complicated and cannot be a tactical solution. Filtering definitely has to be there, but what is clear is that parents want to make decisions for themselves.

Carsten Maple, a Director of the National Centre for Cyberstalking Research: Internet safety is complex, especially when it comes to key behaviours of children such as self-harm, bullying, exploitation and the
self-generation of indecent images. Since it is a very nuanced problem, the solution requires many different components. It has been questioned whether technology can provide a solution: we have heard of problems of overblocking, and that there is an issue around the ability to circumvent technological controls. To address the problem in a meaningful manner requires developing legislation, regulation and educational strategies. However, technology must have its place in any approach. While it is to be acknowledged that a technological solution is unlikely to be error-free, one must weigh the benefits against the drawbacks and shortcomings, and with sufficient effort technological approaches will have fewer drawbacks. Education is a vital component in any proposal to address the problem. The school IT curriculum is changing and this, I would hope, will bring an opportunity to ensure effective child education. While it is recognised that children are becoming adept Internet and technology users, we should not confuse their facility with Internet technologies with their ability to assess risk; therein lies the problem. Children regularly download music from sites that lack authenticity. All sorts of malware can be downloaded with what seems to a child to be an innocuous music or video file. In such cases the children are not looking for sexual or violent content; however, they do not know whether the file they have just downloaded contains such content. It is these behaviours that we need to address. Further, children don't know what actions and comments are acceptable online. They don't know what is legal and they don't know how to address each other online. Due to children's under-developed sense of "netiquette" we have seen many instances of cyberbullying, ranging from unintentionally harmful teasing to the pretty horrific bullying that has had fatal consequences. Parents and many other adults also struggle to assess risk, and only through education can this issue be tackled. We have to understand that if we want to solve the problem we have to employ all four approaches (legislation, regulation, technology and education), with education being a key one. We need to provide parents with the skills and knowledge to make informed decisions.

Adam Kinsley, BSkyB: On age verification, Sky introduced a network-based filter in November 2013 called Sky Broadband Shield. It has ten different categories organised by age-range rating (PG, 15, 18), which makes it quite simple. The key to its success is that it is very easy to configure. One may choose to have a certain set of settings for the afternoon and another set in the evening, and changing from one set to the other is achieved with a simple click. It is therefore flexible for parents to use: they can switch off controls in the evening and switch them back on in the daytime. While it might be a good thing if all content providers and some of the tube sites had age verification, we have already heard that one of the age verification issues is that the cost of the Experian and other checks is quite high. If we were to have such a system at ISP level, the checking would be cumbersome and parents would not be interested in going through that process every time they wanted to make changes to the settings. Simplicity is absolutely crucial to its success, which is why some of the proposals in the Bill are unworkable with the current set of filters we have.
In terms of the overblocking issue, we have ten categories and sexual education is not one of them. So if there is health education on an LGBT site it won't be filtered unless we have mis-categorised it as pornography. There is a process, under David Miles' stewardship as Chairman of the Overblocking Working Group, whereby we are going through a list of sex education sites to make sure they are not being blocked. In practice, the levels of overblocking in this categorisation are incredibly small – we have seen just two or three examples in the first month where genuine errors have been made. So we are not blocking any LGBT or sex education sites. I agree with Jim Killock that it is not a panacea, but some of the concerns he has highlighted on overblocking are, from our service at least, not transpiring in the way that has perhaps been suggested.

Mike Hurst, Centre for Strategic Cyberspace & Security Sciences: I have recently retired from the Metropolitan Police, where I worked on cyber-enabled fraud. We've been talking about blocking and filtering, but how do parents review what is actually going on? My grandson could work an iPad before he could walk and talk! Is there a safe browser that actually gives that information, so parents can go back and see whether or not their seven-year-old has been accessing unsuitable sites? There is a level of inappropriate content that would not be blocked by filtering, and there are cultural issues around that – religious people may not want certain things being seen by their children. What is the reporting structure? I've done a lot of work on the reporting structure for fraud and how it isn't working properly. Also, with that reporting structure, what training do police officers get to understand what the issues are? What action should be taken? How do they refer that to a specialist? Are there specialists in place to deal with it? We hear from the Internet Watch Foundation about the reports they get. Would a parent have an easy route to report an issue through the IWF?

Paul King, Director of Threat Intelligence, Cisco: Cisco offers a commercial filtering service which handles 18 billion web requests per day, so we know a bit about filtering web content. I would welcome another session, because there is so much more to be discussed on Internet safety that does not involve filtering. All we seem to talk about in this context is filtering, and that detracts from the other work, which might be equally effective or maybe even more effective. I also teach a two-hour Internet safety session annually for students hoping to become primary school teachers at the University of Gloucester. This is to give them some grounding in internet safety, both for themselves and their future pupils. I encourage filtering being switched off in schools so that children can report Facebook problems. Of course this should be done with care, and as part of a systematic internet safety policy, but the principle is this: if a school bans Facebook and a pupil then has an issue on Facebook, what will they do? Who will they tell? Not the teachers, because it is banned. The ban on Facebook is also probably ineffective, because most children have internet access on their mobile phones, so how does banning it on the school computers help the children? And how does the school teach pupils how to use Facebook safely?
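[Editor's illustrative note] To make the category-and-age-profile model that Adam Kinsley describes above more concrete, the minimal Python sketch below picks a household age rating by time of day and then allows or blocks a request according to its content category. The category names, schedule and rating thresholds are illustrative assumptions only; they are not Sky Broadband Shield's or Cisco's actual configuration.

from datetime import time

# Ratings ordered from most restrictive to least restrictive, as described in the discussion.
RATING_ORDER = ["PG", "15", "18"]

# Hypothetical mapping of content categories to the rating at which they become viewable.
# None means the category is blocked at every household setting.
CATEGORY_RATING = {
    "news": "PG",
    "social_networking": "15",
    "dating": "18",
    "pornography": None,
}

# A household schedule mirroring the "one click" switch between profiles:
# strict PG settings in the daytime, a relaxed 18 profile in the evening.
SCHEDULE = [
    (time(6, 0), time(21, 0), "PG"),
    (time(21, 0), time(23, 59), "18"),
]

def active_rating(now, default="PG"):
    """Return the household rating in force at the given time of day."""
    for start, end, rating in SCHEDULE:
        if start <= now <= end:
            return rating
    return default

def is_allowed(category, now):
    """Allow a request if its category's rating does not exceed the active profile."""
    required = CATEGORY_RATING.get(category)
    if required is None:
        return False  # blocked outright, or unknown category -> fail closed
    return RATING_ORDER.index(required) <= RATING_ORDER.index(active_rating(now))

print(is_allowed("social_networking", time(10, 0)))   # False: daytime PG profile in force
print(is_allowed("social_networking", time(21, 30)))  # True: evening 18 profile in force

The point of the sketch is only that a per-household, time-aware profile is a small lookup problem once categorisation has been done; the hard part, as the discussion makes clear, is categorising sites correctly in the first place.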
Patrick King, Wave: It is not just the web: simple email services are also being used for cyberbullying threats and for grooming by paedophiles posing as children. It would be a richer discussion if we could move away from the focus on filtering to explore the real dangers of straightforward, simple communication over email, and how people can protect their children without being intrusive in their children's lives. Our organisation has delivered a project in the USA, in association with the Homeland Security Department, to protect young people using not only email but social media messaging sites such as Twitter, Facebook and Yammer. The real threat to vulnerable young people is from proactive predators (sexual, criminal, drug-related, or simply bullies) masquerading as other children. Generic content on the web (even that deemed offensive) is nowhere near as sinister or dangerous. We have developed a non-intrusive, free-to-use privacy service called "SCRAMBLS" to keep messages and email private.

ROUND-UP FROM SPEAKERS
Views on the topics and on what legislation needs to do

Andy Baker: We have a fantastic relationship with ISPs in the UK. Some are better than others, but across the board they report to us and work with us, and we could not have brought a lot of people to custody and then to sentence if we had not had that relationship. Even though all these conversations are going on at a high level, those relationships are developing opportunities whereby we can deal with the issues. They are not against us; they are with us, and very much travelling in the same direction. On legislation: we have specific legislation around the protection of children and Foreign Travel Orders, and we have the Serious and Organised Crime Act, but the judiciary say that because we have such specific laws you cannot use the latter. Consequently the broader powers we could use against organised crime involved in child sexual abuse and exploitation are not available to us, because the specific laws take precedence. I think we need to look at that, and I do think we need to codify the law a little and streamline it. It is all over the place. It would be better to tidy it up – we could say that about many things, of course.

Peter Davies: These are my personal views. Firstly, I would like to note with pleasure that a great deal of work has been done to get a new model for sexual risk prevention through the House of Commons as an amendment to anti-social behaviour legislation. Please let members of the House of Lords know how important it is to continue to support that and to see it put into the hands of child protection professionals, including CEOP, the National Crime Agency and police forces. That will make a difference, not just in tackling child sexual exploitation inflicted by UK nationals in some of the most vulnerable parts of the world, but also the kind of exploitation that we have seen and heard about in Rotherham, Rochdale, Oxford and so many other places. That is a good news story, because the legislature has noticed a problem, has listened, and is doing something about it. There are one or two other areas of development, picking up on Andy's point and reinforcing it. We have a number of cases where children and young people are effectively extorted into performing acts on themselves, and in some cases on others, online for the sexual gratification of others. There is a legislative problem in that this does not fit the definition of blackmail at present.
The issue is that the law of blackmail needs to be extended to accommodate that, because frankly some of the financial loss that does fall within that legislation is as nothing compared to the torment, and occasionally fatal harm, that this form of extortion leads to. We don't want to be in the position where we have to wait to put that into criminal law. There is also a case to be made for making possession and distribution of paedophile written material illegal, in the same way that indecent pictures of children are illegal. I think that is an interesting piece of work. My view is that the scale of impact of the written word – the way in which it can affect behaviour, or develop someone with a theoretical sexual interest in children towards contact offending – is not well enough understood, partly because it is not a crime and so it is not counted. I do think that is an area to look into. Finally, reiterating a theme I have articulated many times before (and I am disappointed that Julian Huppert has had to leave the room), investigators into child abuse online would benefit from a rigorous process for retention of, and access to, communications data for the right reasons, subject to the principles of proportionality, legality, accountability and necessity. At present, victims are not being identified because of the absence of that. If it is impossible to enact such a thing for other reasons – and I'm not naïve about this – then that is one of the prices we are paying. My personal view is that it is an extremely high price to pay.

Q: Are bullying and blackmail linked? Because there is suicide too.

Andy Baker: We only know of one child who has taken their own life following sexual extortion – Daniel Perring in Scotland on 18th July last year. Following an Operation Vocal publicity campaign we found six suicides worldwide and six attempts – and five of those attempts were in the UK. That publicity went out on a Friday, and over that weekend and into the week we had 38 reports from children – and parents – about concerns around suicide due to sexual extortion and bullying. That is really positive. One child, a 15-year-old boy, said that he had himself been a perpetrator against other people at school whom he had bullied. So to say there is no link is a mistake. We are not there to police bullying, but we need to join up in this sort of work as well as we can. Complex is the right word here. It is so complex that it needs a thought-through response and not a knee-jerk reaction, because a knee-jerk reaction leads to another knee-jerk reaction a few months down the line.

Nick Truman: A while ago Al Qaida started putting out execution videos, and I dealt with those of Ken Bigley's execution. I tried to get something done there, so there is a lot more work to be done on peer-to-peer and on intolerance of the distribution of extremely harmful, offensive material – not just that of a pornographic nature. The danger of removing filters from schools was demonstrated when Telnex in Mexico decided to give all school children laptops. There was no filtering in the schools at all at that time, and it ended up with an epidemic of children becoming addicted to hard-core pornography. This is a very complicated area and we need more than two hours to debate it.

Chris Ratcliff: We need to draw a clear line between the issue of child sexual abuse and mainstream pornography, and perhaps leave pornography for a separate session. It is not something that everyone is comfortable with.
Maybe it needs to be taken out of the broader context. Even within the mainstream pornography debate there is the regulated versus the unregulated sector. Parliamentarians and policy makers should engage with us; I think there is a concern about reputational risk in engaging. I applied for membership of UKCCIS and the IWF and was knocked back. We are at the coal face and working on solutions, so do engage with us. I believe the onus is on the industry to find solutions, but industry will not find a solution unless it is obligated to do so. ATVOD (the Authority for TV on Demand) is an interesting experiment, but it is imposed only on UK providers, so perhaps legislation needs to look at broadening ATVOD's remit. But there are no borders here: how do we begin to contain a global problem? The key is to look at what is happening in the gaming sector. Ten years ago the gambling industry was reluctant to look at compliance; now, ten years on, it is thriving and compliant. So perhaps we should look at licensing adult sites rather than simply blocking them, because this problem will not go away.

Jim Killock: To come back to the Communications Data Bill, surveillance can be used to reduce all kinds of crime. Whether that is appropriate for us as a society is a different question. I am just cautious about the use of communications data: it might help, but it could cause a vast number of other problems. I agree very much about separating the debate on child abuse from the much broader question of how we help children with their online existence. I think legislation and filtering are unlikely to help. I would suggest that this is about the way technology evolves: the solutions will change, but legislation is a fixed position, and it might end up doing more harm than good if we expect legislation to answer the problems. We simply need a much more flexible approach. Commenting a bit more broadly on the blocking questions, I know that work is being done and I am very glad it is being done. What I don't see is how somebody who is not a customer of the companies involved can detect and find an error. Nobody is offering any public checking mechanism at the moment, or any kind of transparency, and dealing with blocking errors is virtually impossible at this moment in time.

Will Gardner: I remind you that 11th February is Safer Internet Day. It is not too late if you want to get involved, and I can suggest ways in which you might help support the day; see saferinternetday.org.uk. Education is absolutely key, even in relation to the parental control discussion. It is all very well providing the controls, but unless people are informed and empowered to know what these tools can do, we are not going to make good use of them. Therefore education at a very basic level needs to support parental controls, but it needs to do much more than that. We have been running education outreach work for over 10 years. We started off with primary school children because that was the age at which children's behaviour was being formed in relation to technology. We have had to take that earlier now – we are going into pre-primary schools and talking to very young children, because that is the time they now start engaging with technology, as we have just heard. There is work going on in pre-school as well as in primary and secondary now.
However, I do think that education is an under-supported area and there is a need for more financial support within this educational space. There are some good supporters in this field, but I would like to see more, as this is a very fast-changing environment. The information and advice we give to children and to parents needs to be kept up to date and up to speed, so that we provide people with reliable, current information.

David Miles: I come to a lot of events and it is very easy to talk ourselves into the negative aspects of the Internet, when in fact it is a hugely empowering and creative environment. I personally cannot conceive of a world without the Internet any more; that is the sort of impact it has had, and I am sure you all feel the same. That is why I make a plea for balance. I hope we provide the policing and prosecutions required to tackle the dark side of the Internet. It is about bringing the institutions, structures and legislation up to date to deal with that. The final point is that the UK has a unique opportunity. We seem to have a middle ground of stakeholders and some centres of excellence, such as CEOP and others. We can bring about an incredible result on the safety and security of the Internet by the end of 2014. We do not have the answers to all the questions, but we should give it a very good try, and I think we should be very proud of our efforts.

APPENDIX

The following one-page note was circulated at the meeting on behalf of Raymond Drewry, who was due to speak but was unable to attend.

Tackling Child Internet Protection
A Personal, Unofficial View from Raymond Drewry, Principal Scientist, MovieLabs (a joint venture of 6 major film studios), and DPA member

Summary
Search Engines are not sufficient 
There are several non-Web Internet-based sources (media industries have experience here)
There is no silver bullet – continual adjustment is required 
Solutions require policy/technology co-operation

There has been much discussion of using search engines to reduce the availability of unacceptable content. This content falls into two categories: things that are undesirable in a particular context, e.g. items that are not appropriate for children, and things that are absolutely illegal, e.g. child pornography. Using search engines to filter and warn can help prevent both accidental discovery of inappropriate content and deliberate discovery of illegal content. Even if it does not achieve full prevention, it can decrease the likelihood of such content being discovered. However, there are many ways of acquiring content that are not covered by search engines, or are only partially covered. These include peer-to-peer systems, the link-site/cyberlocker ecosystems, TOR hidden services, and Usenet. In these environments, search and discovery are often more separate from the retrieval mechanism than is typical for content hosted directly on the web. The creative industries have experience with some of these, as they are also sources for illegal or illicit copies of books, music, software, film, and television. It is also the case that illegal content and malware can be hidden in files purporting to be films, music, and the like.

There are existing techniques for finding illegal content on some of these systems. Finding the parties responsible for this content is much harder, and generally requires knowledgeable people to initiate the process, analyse the results, and decide on appropriate actions. This needs familiarity with the particular ecosystem, both at the technical level for the mechanism and at the human/sociological level for the type of content. The individuals responsible for illegal content will usually work very hard to circumvent prevention and detection measures, resulting in an arms race similar to the one seen in the more general cybersecurity realm. The separation between search and retrieval mentioned above can be used to further hide the perpetrators and make the system more resilient.

This does not mean the situation is hopeless. There are existing ways of dealing with some of the problems, but none of them are silver bullets – the real problem must be perpetually kept in view, and the means of detection and prevention must evolve in response to new information. Further, in some cases consistent corporate policies around data and network security can benefit not only individual companies but society in general. There are also existing legal mechanisms that can be brought to bear. Lessons learned from combating media piracy can be applied to the problem, although it is almost certainly the case that illegal content such as child pornography or terrorism-related material will be harder to get a handle on, since it carries higher penalties and more social censure than media piracy. This makes the producers and distributors of this content more cautious and the content harder to find. Therefore, human and technical resources must be invested to solve the problem, with cooperation between policy makers, law enforcement, and technical infrastructure providers.
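[Editor's illustrative note] As a minimal sketch of the known-content detection the note above refers to (and of the shared "Big Data" signature file suggested earlier by Sally Leivesley), the Python example below checks incoming files against a shared list of known-bad hashes. The hash choice (SHA-256), file names and list format are illustrative assumptions; operational systems typically rely on perceptual hashes supplied by recognised bodies such as the IWF rather than plain cryptographic digests, precisely because an exact hash only matches an unmodified copy.

import hashlib
from pathlib import Path

# Assumed shared signature list: one hex digest per line, distributed by a trusted body.
SIGNATURE_FILE = Path("known_bad_sha256.txt")

def load_signatures(path):
    """Load the shared signature list into a set for fast membership checks."""
    return {line.strip().lower() for line in path.read_text().splitlines() if line.strip()}

def sha256_of(path):
    """Hash a file in chunks so large media files need not fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def scan(paths, signatures):
    """Return the files whose hashes appear on the shared list."""
    return [p for p in paths if sha256_of(p) in signatures]

if __name__ == "__main__":
    signatures = load_signatures(SIGNATURE_FILE)
    candidates = [p for p in Path("incoming").glob("*") if p.is_file()]
    for match in scan(candidates, signatures):
        # A real deployment would quarantine the file and route a report to the
        # relevant hotline or law-enforcement contact rather than printing a path.
        print(f"flagged: {match}")

The design point this illustrates is the one made in both the appendix and the earlier discussion: matching against a shared list is cheap and fast, but the list itself, and the handling of re-encoded or deliberately altered copies, is where the real investment and cooperation are needed.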