Monday is the last day to respond and the Open Rights Group makes some suggestions

30th June 2019

See article from openrightsgroup.org
The Government is accepting public feedback on their plan until Monday 1 July. Send a message to their consultation using Open Rights Group tool before the end of Monday!
The Open Rights Group comments on the government censorship plans:

Online Harms: Blocking websites doesn't work -- use a rights-based approach instead

Blocking websites isn't working. It's not
keeping children safe and it's stopping vulnerable people from accessing information they need. It's not the right approach to take on Online Harms. This is the finding from our
recent research into website blocking by mobile and broadband
Internet providers. And yet, as part of its Internet regulation agenda, the UK Government wants to roll out even more blocking. The Government's Online Harms White Paper is focused on making online companies fulfil a "duty
of care" to protect users from "harmful content" -- two terms that remain troublingly ill-defined.
The paper proposes giving a regulator various punitive measures to use against companies that fail to fulfil this duty, including powers to block websites. If this scheme comes into effect, it could lead to
widespread automated blocking of legal content for people in the UK. Mobile and broadband Internet providers have been blocking websites with parental control filters for five years. But through our
Blocked project -- which detects incorrect website blocking -- we know that systems are still blocking far too many sites and far too many types of sites by mistake.
Thanks to website blocking, vulnerable people and under-18s are losing access to crucial information and support from websites including counselling, charity, school, and sexual health websites. Small businesses are
losing customers. And website owners often don't know this is happening. We've seen with parental control filters that blocking websites doesn't have the intended outcomes. It restricts access to legal, useful,
and sometimes crucial information. It also does nothing to stop people who are determined to access material on blocked websites, who often use VPNs to get around the filters. Other solutions, like filters applied by a parent to a child's
account on a device are more appropriate. Unfortunately, instead of noting these problems inherent to website blocking by Internet providers and rolling back, the Government is pressing ahead with website blocking in other areas.
Blocking by Internet providers may not work for long. We are seeing a technical shift towards encrypted website address requests that will make this kind of website blocking by Internet providers much more
difficult. When I type a human-friendly web address such as openrightsgroup.org into a web browser and hit enter, my computer asks a Domain Name System (DNS) for that website's computer-friendly IP address - which will
look something like 46.43.36.233. My web browser can then use that computer-friendly address to load the website. At the moment, most DNS requests are unencrypted. This allows mobile and broadband Internet providers to see which website I want to visit. If a website is on a blocklist, the system won't return the actual IP address to my computer. Instead, it will tell me that the site is blocked, or will tell my computer that the site doesn't exist. That stops me
visiting the website and makes the block effective. Increasingly, though, DNS requests are being encrypted. This provides much greater security for ordinary Internet users. It also makes website blocking by Internet providers
incredibly difficult. Encrypted DNS is becoming widely available through Google's Android devices, on Mozilla's Firefox web browser and through Cloudflare's mobile application for Android and iOS. Other encrypted DNS services are also available.
Our report DNS Security - Getting it Right discusses issues around encrypted DNS in more detail.
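The visibility problem described above is easy to demonstrate. The sketch below (purely illustrative, not any provider's code) hand-builds a minimal DNS query packet in the RFC 1035 wire format: the requested hostname sits in the packet as plaintext labels, which is exactly what lets an ISP read, and block, unencrypted lookups. DNS over HTTPS wraps these same bytes inside an encrypted HTTPS connection, taking that visibility away.

```python
import struct

def build_dns_query(hostname: str) -> bytes:
    """Build a minimal DNS A-record query packet (RFC 1035 wire format)."""
    header = struct.pack(">HHHHHH",
                         0x1234,      # transaction ID (arbitrary)
                         0x0100,      # flags: standard query, recursion desired
                         1, 0, 0, 0)  # one question; no answer/authority/additional
    question = b""
    for label in hostname.split("."):
        question += bytes([len(label)]) + label.encode("ascii")
    question += b"\x00"                   # end of the name
    question += struct.pack(">HH", 1, 1)  # QTYPE=A, QCLASS=IN
    return header + question

packet = build_dns_query("openrightsgroup.org")
# The hostname appears verbatim in the packet, so any ISP relaying an
# unencrypted query can read (and therefore block) the requested domain.
assert b"openrightsgroup" in packet and b"org" in packet
```

The point of the sketch is the last assertion: nothing about plain DNS hides which site is being asked for; only encrypting the transport (as DoH does) changes that.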
Blocking websites may be the Government's preferred tool to deal with social problems on the Internet but it doesn't work, both in policy terms and increasingly at a technical level as well. The Government must accept that website blocking by mobile and broadband Internet providers is not the answer. They should concentrate instead on a rights-based approach to Internet regulation and on educational and social approaches that address the roots of complex societal issues.
Offsite Article: Cyberleagle response to the Online Harms Consultation 30th June 2019. See article from cyberleagle.com

Speech is not a tripping hazard

28th June 2019

Age Verification providers that don't provide a way into Pornhub will only get the crumbs from the AV table

See article from medium.com

24th June 2019
The UK Porn Block's Latest Failure. By David Flint

See article from reprobatepress.com

Maybe it's a good job the government has delayed Age Verification, as there are still a lot of issues to resolve for the AV companies

21st June 2019

See article from telegraph.co.uk
The AV industry is not yet ready

The Digital Policy Alliance (DPA) is a private lobby group connecting digital industries with Parliament. Its industry members include both Age Verification (AV) providers, eg OCL, and adult entertainment, eg
Portland TV. Just before the Government announcement that the commencement of adult verification requirements for porn websites would be delayed, the DPA wrote a letter explaining that the industry was not yet ready to implement AV, and had asked
for a 3 month delay. The letter is unpublished but fragments of it have been reported in news reports about AV. The Telegraph reported: The Digital Policy Alliance called for the scheme to be delayed or
risk nefarious companies using this opportunity to harvest and manipulate user data. The strongly-worded document complains that the timing is very tight, a fact that has put some AVPs [age verification providers] and adult
entertainment providers in a very difficult situation. It warns that unless the scheme is delayed there will be less protection for public data, as it appears that there is an intention for uncertified providers to use this
opportunity to harvest and manipulate user data.
The AV industry is unimpressed by a 6 month delay

See article from news.sky.com
Rowland Manthorpe from Sky News contributed a few interesting snippets
too. He noted that the AVPs were unsurprisingly not pleased by the government delay: Serge Acker, chief executive of OCL, which provides privacy-protecting porn passes for purchase at newsagents, told Sky News: As a
business, we have been gearing up to get our solution ready for July 15th and we, alongside many other businesses, could potentially now be being endangered if the government continues with its attitude towards these delays. Not
only does it make the government look foolish, but it's starting to make companies like ours look it too, as we all wait expectantly for plans that are only being kicked further down the road.
There are still issues with how the AV providers can make money

And interestingly Manthorpe revealed in the accompanying video news report that the AV providers were also distinctly unimpressed by the BBFC stipulating that certified AV providers must not use Identity Data provided by porn users for any other purpose than verifying age. The sensible idea being that the data should not be made available for the likes of targeted advertising. And one particular example of prohibited data re-use has caused particular
problems, namely that ID data should not be used to sign people up for digital wallets. Now AV providers have got to be able to generate their revenue somehow. Some have proposed selling AV cards in newsagents for about £10, but others had been
planning on using AV to generate a customer base for their digital wallet schemes. So it seems that there are still quite a few fundamental issues that have not yet been resolved in how the AV providers get their cut.
Some AV providers would rather not sign up to BBFC accreditation

See article from adultwebmasters.org

Maybe these issues with BBFC AV accreditation requirements are behind a move to use an alternative standard. An AV provider called VeriMe has announced that it is the first AV company to receive a PAS1296 certification. The PAS1296 was developed between the British Standards Institution and the Age Check Certification Scheme (ACCS). PAS stands for Publicly Available Specification, and the standard is designed to define good practice for a product, service or process. The standard was also championed by the
Digital Policy Alliance. Rudd Apsey, the director of VeriMe, said: The PAS1296 certification augments the voluntary standards outlined by the BBFC, which don't address how third-party websites handle consumer data. We believe it fills those gaps and is confirmation that VeriMe is indeed leading the world in the development and implementation of age verification technology and setting best practice standards for the industry.
We are incredibly proud to be the first company to receive the standard and want consumers and service providers to know that come the July 15 roll out date, they can trust VeriMe's systems to provide the most robust solution for age
verification.
This is not a very convincing argument as PAS1296 is not available for customers to read, (unless they pay about 120 quid for the privilege). At least the BBFC standard can be read by anyone for free, and they can then
make up their own minds as to whether their porn browsing history and ID data is safe. However it does seem that some companies at least are planning to give the BBFC accreditation scheme a miss.
The BBFC standard fails to provide safety for porn users' data anyway

See article from medium.com
The AV company 18+ takes issue with the BBFC accreditation standard, noting that it allows AV providers to dangerously log people's porn browsing history: Here's the problem with the
design of most age verification systems: when a UK user visits an adult website, most solutions will present the user with an inline frame displaying the age verifier's website or the user will be redirected to the age verifier's website. Once on the age
verifier's website, the user will enter his or her credentials. In most cases, the user must create an account with the age verifier, and on subsequent visits to the adult website, the user will enter his account details on the age verifier's website
(i.e., username and password). At this point in the process, the age verifier will validate the user and, if the age verifier has a record of the user being at least age 18, will redirect the user back to the adult website. The age verification system will
transmit to the adult website whether the user is at least age 18 but will not transmit the identity of the user. The flaw with this design from a user privacy perspective is obvious: the age verification website will know the
websites the user visits. In fact, the age verification provider obtains quite a nice log of the digital habits of each user. To be fair, most age verifiers claim they will delete this data. However, a truly privacy first design would ensure the data
never gets generated in the first place because logs can inadvertently be kept, hacked, leaked, or policies might change in the future. We viewed this risk to be unacceptable, so we set about building a better system. Almost all
age verification solutions set to roll out in July 2019 do not provide two-way anonymity for both the age verifier and the adult website, meaning there remains some log of -- or potential to log -- which adult websites a UK-based user visits.
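The redirect flow that 18+ describes can be modelled in a few lines. This is an illustrative sketch, not any real provider's code (the class and site names are made up): even though the verifier returns only a yes/no answer to the adult site, per-site verification means the verifier necessarily learns which site the user came from, and a browsing log accumulates as a side effect.

```python
from datetime import datetime, timezone

class NaiveAgeVerifier:
    """Hypothetical model of the redirect-based age check described above."""

    def __init__(self):
        self.accounts = {}   # username -> is_over_18
        self.visit_log = []  # (timestamp, username, adult_site)

    def register(self, username: str, is_over_18: bool):
        self.accounts[username] = is_over_18

    def verify(self, username: str, adult_site: str) -> bool:
        # The redirect hands the verifier the referring site, so each call
        # adds a row to a log of the user's adult browsing.
        self.visit_log.append((datetime.now(timezone.utc), username, adult_site))
        return self.accounts.get(username, False)

verifier = NaiveAgeVerifier()
verifier.register("alice", True)
assert verifier.verify("alice", "adult-site.example") is True
# Only a yes/no answer reached the site, but the verifier now
# holds a record of the visit:
assert len(verifier.visit_log) == 1
```

A "privacy first" design, as the article puts it, would avoid generating `visit_log` at all, for instance by issuing anonymous single-use tokens that the adult site can check without the verifier ever seeing where they are spent.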
In fact one AV provider revealed that up until recently the government demanded that AV providers keep a log of people's porn browsing history, and it was a late concession to practicality that companies were able to opt out if they wanted. Note that the logging capability is kindly hidden by the BBFC by passing it off as being retained only as long as is necessary for fraud prevention. Of course that is just smoke and mirrors: fraud, presumably meaning that passcodes could be given or sold to others, could happen at any time that an age verification scheme is in use, so the time restriction specified by the BBFC may as well be forever.
Jeremy Wright apologises to supporters for an admin cock-up, and takes the opportunity to sneer at the millions of people who just want to keep their porn browsing private and safe

20th June 2019

See parliamentary transcription from hansard.parliament.uk
Jeremy Wright, the Secretary of State for Digital, Culture, Media and Sport, addressed parliament to explain that the start date for the Age Verification scheme for porn has been delayed by about 6 months. The reason is that the Government failed to notify the EU about laws that affect free trade (eg those that allow EU websites to be blocked in the UK). Although the main Digital Economy Act was submitted to the EU, extra bolt-on laws added since have not been submitted. Wright explained:
In autumn last year, we laid three instruments before the House for approval. One of them -- the guidance on age verification arrangements -- sets out standards that companies need to comply with. That should have been notified to the
European Commission, in line with the technical standards and regulations directive, and it was not. Upon learning of that administrative oversight, I instructed my Department to notify this guidance to the EU and re-lay the guidance in Parliament as
soon as possible. However, I expect that that will result in a delay in the region of six months. Perhaps it would help if I explained why I think that six months is roughly the appropriate time. Let me set out what has to happen
now: we need to go back to the European Commission, and the rules under the relevant directive say that there must be a three-month standstill period after we have properly notified the regulations to the Commission. If it wishes to look into this in
more detail -- I hope that it will not -- there could be a further month of standstill before we can take matters further, so that is four months. We will then need to re-lay the regulations before the House. As she knows, under the negative procedure,
which is what these will be subject to, there is a period during which they can be prayed against, which accounts for roughly another 40 days. If we add all that together, we come to roughly six months. Wright apologised profusely to
supporters of the scheme: I recognise that many Members of the House and many people beyond it have campaigned passionately for age verification to come into force as soon as possible to ensure that children are
protected from pornographic material they should not see. I apologise to them all for the fact that a mistake has been made that means these measures will not be brought into force as soon as they and I would like.
However the law has not been received well by porn users. Parliament has generally shown no interest in the privacy and safety of porn users. In fact much of the delay has been down to belatedly realising that the scheme might not get off the ground at all unless they at least pay a little lip service to the safety of porn users. Even now Wright decided to dismiss people's privacy fears and concerns as if they were all just deplorables bent on opposing child safety. He said: However,
there are also those who do not want these measures to be brought in at all, so let me make it clear that my statement is an apology for delay, not a change of policy or a lessening of this Government's determination to bring these changes about. Age
verification for online pornography needs to happen. I believe that it is the clear will of the House and those we represent that it should happen, and that it is in the clear interests of our children that it must.
Wright compounded his point by simply not acknowledging that, given a choice, people would prefer not to hand over their ID. Voluntarily complying websites would have to take a major hit from customers who would prefer to seek out the safety of non-complying sites.
Wright said: I see no reason why, in most cases, they [websites] cannot begin to comply voluntarily. They had expected to be compelled to do this from 15 July, so they should be in a position to comply. There seems to
be no reason why they should not.
In passing Wright also mentioned how the government is trying to counter encrypted DNS, which reduces the capability of ISPs to block websites. The Government will try to press the browser
companies into doing their censorship dirty work for them instead: It is important to understand changes in technology and the additional challenges they throw up, and she is right to say that the so-called D over H
changes will present additional challenges. We are working through those now and speaking to the browsers, which is where we must focus our attention. As the hon. Lady rightly says, the use of these protocols will make it more difficult, if not
impossible, for ISPs to do what we ask, but it is possible for browsers to do that. We are therefore talking to browsers about how that might practically be done, and the Minister and I will continue those conversations to ensure that these provisions
can continue to be effective.
Open Rights Group reports on how the Online Harms Bill will harm free speech, justice and liberty

18th June 2019

See article from openrightsgroup.org
See report [pdf] from openrightsgroup.org
This report follows our research into current Internet content regulation efforts, which found a lack of accountable, balanced and independent procedures governing content removal, both formally and informally by the state. There is a legacy of Internet regulation in the UK that does not comply with due process, fairness and fundamental rights requirements. This includes: bulk domain suspensions by Nominet at police request without prior authorisation; the lack of an independent legal authorisation process for Internet Watch Foundation (IWF) blocking at Internet Service Providers (ISPs) and in the future by the British Board of Film Classification (BBFC), as well as for Counter-Terrorism Internet Referral Unit (CTIRU) notifications to platforms of illegal content for takedown. These were detailed in our previous report.
The UK government now proposes new controls on Internet content, claiming that it wants to ensure the same rules online as offline. It says it wants harmful content removed, while respecting human rights and protecting free
expression. Yet proposals in the DCMS/Home Office White Paper on Online Harms will create incentives for Internet platforms such as Google, Twitter and Facebook to remove content without legal processes. This is not the same rules
online as offline. It instead implies a privatisation of justice online, with the assumption that corporate policing must replace public justice for reasons of convenience. This goes against the advice of human rights standards that government has itself
agreed to and against the advice of UN Special Rapporteurs. The government as yet has not proposed any means to define the harms it seeks to address, nor identified any objective evidence base to show what in fact needs to be
addressed. It instead merely states that various harms exist in society. The harms it lists are often vague and general. The types of content specified may be harmful in certain circumstances, but even with an assumption that some content is genuinely
harmful, there remains no attempt to show how any restriction on that content might work in law. Instead, it appears that platforms will be expected to remove swathes of legal-but-unwanted content, with an as-yet-unidentified regulator given a broad duty
to decide if a risk of harm exists. Legal action would follow non-compliance by a platform. The result is the state proposing censorship and sanctions for actors publishing material that it is legal to publish.
18th June 2019

Porn Block Demonstrates the Government Is More Concerned With Censorship Than Security

See article from gizmodo.co.uk
16th June 2019

Filtering filth won't save the children, but the block could be bad news for you. By Carrie Marshall

See article from techradar.com
15th June 2019

Who'd have thought that a Christian Campaign Group would be calling on its members to criticise the government's internet censorship bill in a consultation

See article from christianconcern.com
Open Rights Group Report: Analysis of BBFC Age Verification Certificate Standard June 2019

14th June 2019

See article from openrightsgroup.org
See article [pdf] from openrightsgroup.org
Executive Summary

The BBFC's Age-verification Certificate Standard ("the Standard") for providers of age verification services, published in April 2019, fails to meet adequate standards of cyber security and
data protection and is of little use for consumers reliant on these providers to access adult content online. This document analyses the Standard and certification scheme and makes recommendations for improvement and remediation.
It sub-divides generally into two types of concern: operational issues (the need for a statutory basis, problems caused by the short implementation time and the lack of value the scheme provides to consumers), and substantive issues (seven problems with
the content as presently drafted). The fact that the scheme is voluntary leaves the BBFC powerless to fine or otherwise discipline providers that fail to protect people's data, and makes it tricky for consumers to distinguish
between trustworthy and untrustworthy providers. In our view, the government must legislate without delay to place a statutory requirement on the BBFC to implement a mandatory certification scheme and to grant the BBFC powers to require reports and
penalise non-compliant providers. The Standard's existence shows that the BBFC considers robust protection of age verification data to be of critical importance. However, in both substance and operation the Standard fails to
deliver this protection. The scheme allows commercial age verification providers to write their own privacy and security frameworks, reducing the BBFC's role to checking whether commercial entities follow their own rules rather than requiring them to
work to a mandated set of common standards. The result is uncertainty for Internet users, who are inconsistently protected and have no way to tell which companies they can trust. Even within its voluntary approach, the BBFC gives
providers little guidance as to what their privacy and security frameworks should contain. Guidance on security, encryption, pseudonymisation, and data retention is vague and imprecise, and often refers to generic "industry standards" without explanation. The supplementary Programme Guide, to which the Standard refers readers, remains unpublished, critically undermining the scheme's transparency and accountability.

Recommendations
Grant the BBFC statutory powers:
- The BBFC Standard should be substantively revised to set out comprehensive and concrete standards for handling highly sensitive age verification data.
- The government should legislate to grant the BBFC statutory power to mandate compliance, and should enable the BBFC to require remedial action or apply financial penalties for non-compliance.
- The BBFC should be given statutory powers to require annual compliance reports from providers and fine those who sign up to the certification scheme but later violate its requirements.
- The Information Commissioner should oversee the BBFC's age verification certification scheme.

Delay implementation and enforcement:
- Delay implementation and enforcement of age verification until both (a) a statutory standard of data privacy and security is in place, and (b) that standard has been implemented by providers.

Improve the scheme content:
- Even if the BBFC certification scheme remains voluntary, the Standard should at least contain a definitive set of precisely delineated objectives that age verification providers must meet in order to say that they process identity data securely.

Improve communication with the public:
- Where a provider's certification is revoked, the BBFC should issue press releases and ensure consumers are individually notified at login.
- The results of all penetration tests should be provided to the BBFC, which must publish details of the framework it uses to evaluate test results, and publish annual trends in results.

Strengthen data protection requirements:
- Data minimisation should be an enforceable statutory requirement for all registered age verification providers.
- The Standard should outline specific and very limited circumstances under which it is acceptable to retain logs for fraud prevention purposes. It should also specify a hard limit on the length of time logs may be kept.
- The Standard should set out a clear, strict and enforceable set of policies describing exactly how providers should "pseudonymise" or "deidentify" data.
- Providers that no longer meet the Standard should be required to provide the BBFC with evidence that they have destroyed all the user data they collected while supposedly compliant.
- The BBFC should prepare a standardised data protection risk assessment framework against which all age verification providers will test their systems. Providers should limit bespoke risk assessments to their specific technological implementation.

Strengthen security, testing, and encryption requirements:
- Providers should be required to undertake regular internal and external vulnerability scanning and a penetration test at least every six months, followed by a supervised remediation programme to correct any discovered vulnerabilities.
- Providers should be required to conduct penetration tests after any significant application or infrastructure change.
- Providers should be required to use a comprehensive and specific testing standard. CBEST or GBEST could serve as guides for the BBFC to develop an industry-specific framework.
- The BBFC should build on already-established strong security frameworks, such as the Center for Internet Security Cyber Controls and Resources, the NIST Cyber Security Framework, or Cyber Essentials Plus.
- At a bare minimum, the Standard should specify a list of cryptographic protocols which are not adequate for certification.
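Two of the recommendations above, pseudonymisation and a hard retention limit on fraud-prevention logs, can be made concrete in a short sketch. This is an illustrative assumption of how such requirements might look in code, not anything mandated by the Standard; the 30-day figure, key, and function names are invented for the example. Identifiers are replaced with a keyed HMAC hash (so the log cannot be linked back to a person without the key), and entries past the retention cap are purged.

```python
import hashlib
import hmac
from datetime import datetime, timedelta, timezone

# Illustrative hard cap; the actual limit would be set by the Standard.
RETENTION_LIMIT = timedelta(days=30)

def pseudonymise(user_id: str, secret_key: bytes) -> str:
    """Replace an identifier with a keyed hash (HMAC-SHA256), so
    fraud-prevention logs cannot be trivially linked back to a person
    without access to the key."""
    return hmac.new(secret_key, user_id.encode(), hashlib.sha256).hexdigest()

def purge_expired(log: list, now: datetime) -> list:
    """Drop log entries older than the hard retention limit."""
    return [(ts, pid) for ts, pid in log if now - ts <= RETENTION_LIMIT]

key = b"rotate-me-regularly"
now = datetime.now(timezone.utc)
log = [(now - timedelta(days=40), pseudonymise("alice", key)),
       (now - timedelta(days=5), pseudonymise("bob", key))]
log = purge_expired(log, now)
assert len(log) == 1  # the 40-day-old entry has been destroyed
```

The design point is that both measures are enforceable and auditable: a regulator can check that stored identifiers are keyed hashes and that no log entry exceeds the cap, which is precisely what a vague "industry standards" reference does not allow.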
The catastrophic impact of DNS-over-HTTPS. The IWF makes its case

10th June 2019

See article from iwf.org.uk by Fred Langford, IWF Deputy CEO and CTO
Here at the IWF, we've created life-changing technology and data sets helping people who were sexually abused as children and whose images appear online. The IWF URL List, or more commonly, the block list, is a list of live webpages that show children
being sexually abused, a list used by the internet industry to block millions of criminal images from ever reaching the public eye. It's a crucial service, protecting children, and people of all ages in their homes and places of
work. It stops horrifying videos from being stumbled across accidentally, and it thwarts some predators who visit the net to watch such abuse. But now its effectiveness is in jeopardy. That block list which has for years stood
between exploited children and their repeated victimisation faces a challenge called DNS over HTTPS which could soon render it obsolete. It could expose millions of internet users across the globe - and of any age -- to the risk
of glimpsing the most terrible content. So how does it work? DNS stands for Domain Name System and it's the phonebook by which you look something up on the internet. But the new privacy technology could hide user requests, bypass
filters like parental controls, and make globally-criminal material freely accessible. What's more, this is being fast-tracked, by some, into service as a default which could make the IWF list and all kinds of other protections defunct.
At the IWF, we don't want to demonise technology. Everyone's data should be secure from unnecessary snooping and encryption itself is not a bad thing. But the IWF is all about protecting victims and we say that the way in which DNS
over HTTPS is being implemented is the problem. If it was set as the default on the browsers used by most of us in the UK, it would have a catastrophic impact. It would make the horrific images we've spent all these years blocking
suddenly highly accessible. All the years of work for children's protection could be completely undermined -- not just busting the IWF's block list but swerving filters, bypassing parental controls, and dodging some counter terrorism efforts as well.
From the IWF's perspective, this is far more than just a privacy or a tech issue, it's all about putting the safety of children at the top of the agenda, not the bottom. We want to see a duty of care placed upon DNS providers so they
are obliged to act for child safety and cannot sacrifice protection for improved customer privacy.
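The blocking mechanism the IWF describes, and why DoH defeats it, can be sketched in a few lines. This is an illustrative model only (the blocklist, names, and addresses are made up, not IWF or ISP code): the ISP's resolver consults a blocklist before answering, and a blocked name gets a "does not exist" response instead of its real address. DNS-over-HTTPS sidesteps this simply by carrying the query past the ISP's resolver entirely.

```python
# Stand-in for a list like the IWF's; real lists cover URLs, not just domains.
BLOCKLIST = {"blocked.example"}

def isp_resolve(hostname: str, dns_table: dict):
    """Model of blocklist-aware resolution at an ISP resolver: blocked
    names return None (NXDOMAIN, 'site does not exist') instead of the
    real IP address."""
    if hostname in BLOCKLIST:
        return None
    return dns_table.get(hostname)

table = {"blocked.example": "192.0.2.1", "allowed.example": "192.0.2.2"}
assert isp_resolve("blocked.example", table) is None
assert isp_resolve("allowed.example", table) == "192.0.2.2"
# With DNS-over-HTTPS the encrypted query goes to a third-party resolver,
# so the check above never runs and the real address comes back.
```

This is why the dispute is about where DoH is switched on by default: the blocking logic itself still works, but only for queries that actually pass through the resolver that applies it.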
10th June 2019

When is a porn film not a porn film?

See article from reprobatepress.com
6th June 2019

Foreign websites will block UK users altogether rather than be compelled to invest time and money into a nigh-impossible compliance process. By Heather Burns

See article from webdevlaw.uk
Internet companies slam the data censor's disgraceful proposal to require age verification for large swathes of the internet

5th June 2019

From the Financial Times
The Information Commissioner's Office has for some bizarre reason been given immense powers to censor the internet. And in an early opportunity to exert its power it has proposed a 'regulation' that would require strict age verification for nearly all mainstream websites that may have a few child readers and some material that may be deemed harmful for very young children, eg news websites that may have glamour articles or perhaps violent news images. In a mockery of 'data protection', such websites would have to implement strict age verification requiring people to hand over identity data to most of the websites in the world. Unsurprisingly much of the internet content industry is unimpressed. A six week consultation on the new censorship rules has just closed and, according to the Financial Times:

Companies and industry groups have loudly pushed back on the plans, cautioning that they could unintentionally quash start-ups and endanger
people's personal data. Google and Facebook are also expected to submit critical responses to the consultation. Tim Scott, head of policy and public affairs at Ukie, the games industry body, said it was an inherent contradiction
that the ICO would require individuals to give away their personal data to every digital service. Dom Hallas, executive director at the Coalition for a Digital Economy (Coadec), which represents digital start-ups in the UK, said
the proposals would result in a withdrawal of online services for under-18s by smaller companies: The code is seen as especially onerous because it would require companies to provide up to six different versions of
their websites to serve different age groups of children under 18. This means an internet for kids largely designed by tech giants who can afford to build two completely different products. A child could access YouTube Kids, but
not a start-up competitor.
Stephen Woodford, chief executive of the Advertising Association -- which represents companies including Amazon, Sky, Twitter and Microsoft -- said the ICO needed to conduct a full technical
and economic impact study, as well as a feasibility study. He said the changes would have a wide and unintended negative impact on the online advertising ecosystem, reducing spend from advertisers and so revenue for many areas of the UK media.
An ICO spokesperson said: We are aware of various industry concerns about the code. We'll be considering all the responses we've had, as well as engaging further where necessary, once the consultation
has finished.
|
| |
The harms will be that British tech businesses will be destroyed so that politicians can look good for 'protecting the children'
|
|
|
 | 2nd June 2019
|
|
| 1st June 2019. See article from cityam.com See
submission to the government [pdf] from uk.internetassociation.org |
A scathing new report, seen by City A.M. and authored by the Internet Association (IA), which represents online firms including Google, Facebook and Twitter, has outlined a string of major concerns with plans laid out in the government Online Harms white
paper last month. The Online Harms white paper outlines a large number of internet censorship proposals hiding under the vague terminology of 'duties of care'. Under the proposals, social media sites could face hefty fines or even a ban if they
fail to tackle online harms such as age-inappropriate content, insults, harassment, terrorist content and of course 'fake news'. But the IA has branded the measures unclear and warned they could damage the UK's booming tech sector, with smaller
businesses disproportionately affected. IA executive director Daniel Dyball said: Internet companies share the ambition to make the UK one of the safest places in the world to be online, but in its current form the online harms white paper will
not deliver that. The proposals present real risks and challenges to the thriving British tech sector, and will not solve the problems identified. The IA slammed the white paper over its
use of the term duty of care, which it said would create legal uncertainty and be unmanageable in practice.
The lobby group also called for a more precise definition of which online services would be covered by regulation and greater
clarity over what constitutes an online harm. In addition, the IA said the proposed measures could raise serious unintended consequences for freedom of expression. And while most internet users favour tighter rules in some areas, particularly
social media, people also recognise the importance of protecting free speech -- which is one of the internet's great strengths. Update: Main points 2nd June 2019. See
article from uk.internetassociation.org The Internet Association
paper sets out five key concerns held by internet companies:
- "Duty of Care" has a specific legal meaning that does not align with the obligations proposed in the White Paper, creating legal uncertainty, and would be unmanageable;
- The scope of the services covered by regulation
needs to be defined differently, and more closely related to the harms to be addressed;
- The category of "harms with a less clear definition" raises significant questions and concerns about clarity and democratic process;
- The proposed code of practice obligations raise potentially dangerous unintended consequences for freedom of expression;
- The proposed measures will damage the UK digital sector, especially start-ups, micro-businesses and
small- and medium-sized enterprises (SMEs), and slow innovation.
|
| |
Well perhaps if the UK wasn't planning to block legal websites then people wouldn't need to seek out circumvention techniques, so allowing the laudable IWF blocking to continue
|
|
|
 | 1st June 2019
|
|
| See article from techworld.com
|
A recent internet protocol allows websites to be located without using the traditional approach of asking your ISP's DNS server, evading website blocks implemented by the ISP. Because the new protocol is encrypted, the ISP is restricted in
its ability to monitor which websites are being accessed. This very much impacts the ISP's ability to block illegal child abuse material identified in a block list maintained by the IWF. Over the years the IWF has been very good at sticking to its universally
supported remit. Presumably it has realised that extending its blocking capabilities to other, less critical areas could degrade its effectiveness, as it would then lose that universal support. Now of course the government has stepped in and will use
the same mechanism as used for the IWF blocks to block legal and very popular adult porn websites. The inevitable interest in circumvention options will very much diminish the IWF's ability to block child abuse. So the IWF has taken to campaigning to
support its capabilities. Fred Langford, the deputy CEO of IWF, told Techworld about the implementation of encrypted DNS: Everything would be encrypted; everything would be dark. For the last 15 years, the IWF have
worked with many providers on our URL list of illegal sites. There's the counterterrorism list as well and the copyright infringed list of works that they all have to block. None of those would work. We put the entries onto our
list until we can work with our international stakeholders and partners to get the content removed in their country, said Langford. Sometimes that will only be on the list for a day. Other times it could be months or years. It just depends on the regime
at the other end, wherever it's physically located.
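The blocking mechanism Langford describes -- an ISP resolver consulting a blocklist before answering lookups -- can be sketched roughly as follows. The hostnames, addresses and resolver shape are invented for illustration; real deployments filter full URLs and serve block pages rather than consulting a dictionary:

```python
# Rough sketch of ISP-side DNS blocking: each cleartext lookup is checked
# against a blocklist (in the UK, lists such as the IWF's) before an answer
# is returned. Encrypted DNS (DoH) bypasses exactly this choke point,
# because the ISP resolver never sees the requested name.
# All hostnames and addresses below are invented examples.
BLOCKLIST = {"blocked.example"}

def resolve(hostname: str, upstream: dict) -> str:
    """Answer a DNS query, sinkholing anything on the blocklist."""
    if hostname in BLOCKLIST:
        return "0.0.0.0"  # sinkhole address: the site appears unreachable
    return upstream.get(hostname, "NXDOMAIN")

upstream_dns = {"allowed.example": "203.0.113.7"}
print(resolve("allowed.example", upstream_dns))
print(resolve("blocked.example", upstream_dns))
```

The point of the article is that once lookups travel inside HTTPS to a third-party resolver, this check simply never runs.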
The IWF realises the benefit of universal support, so it has generally acknowledged the benefits of the protocol for privacy and security while focusing on the need for it to be deployed with
the appropriate safeguards in place. It is calling for the government to insert a censorship rule that includes the IWF URL List in the forthcoming online harms regulatory framework, to ensure that service providers comply with current UK laws and
security measures. Presumably the IWF would like its block list to be implemented by encrypted DNS servers worldwide. IWF's Fred Langford said: The technology is not bad; it's how you implement it. Make sure your
policies are in place, and make sure there's some way that if there is an internet service provider that is providing parental controls and blocking illegal material that the DNS over HTTPS server can somehow communicate with them to redirect the traffic
on their behalf.
Given the respect the IWF commands, this could be a possibility, but if the government then steps in and demands that adult porn sites be blocked too, this approach would surely stumble, as every world dictator and
international moralist campaigner would expect the same. |
| |
Pointing out that it is crazy for the data protection police to require internet users to hand over their private identity data to all and sundry (all in the name of child protection of course)
|
|
|
 | 31st May 2019
|
|
| See article from
indexoncensorship.org |
Elizabeth Denham, Information Commissioner Information Commissioner's Office, Dear Commissioner Denham, Re: The Draft Age Appropriate Design Code for Online Services We write to
you as civil society organisations who work to promote human rights, both offline and online. As such, we are taking a keen interest in the ICO's Age Appropriate Design Code. We are also engaging with the Government in its White Paper on Online Harms,
and note the connection between these initiatives. Whilst we recognise and support the ICO's aims of protecting and upholding children's rights online, we have severe concerns that as currently drafted the Code will not achieve
these objectives. There is a real risk that implementation of the Code will result in widespread age verification across websites, apps and other online services, which will lead to increased data profiling of both children and adults, and restrictions
on their freedom of expression and access to information. The ICO contends that age verification is not a silver bullet for compliance with the Code, but it is difficult to conceive how online service providers could realistically
fulfil the requirement to be age-appropriate without implementing some form of onboarding age verification process. The practical impact of the Code as it stands is that either all users will have to access online services via a sorting age-gate or adult
users will have to access the lowest common denominator version of services with an option to age-gate up. This creates a de facto compulsory requirement for age-verification, which in turn puts in place a de facto restriction for both children and
adults on access to online content. Requiring all adults to verify they are over 18 in order to access everyday online services is a disproportionate response to the aim of protecting children online and violates fundamental
rights. It carries significant risks of tracking, data breach and fraud. It creates digital exclusion for individuals unable to meet requirements to show formal identification documents. Where age-gating also applies to under-18s, this violation and
exclusion is magnified. It will put an onerous burden on small-to-medium enterprises, which will ultimately entrench the market dominance of large tech companies and lessen choice and agency for both children and adults -- this outcome would be the
antithesis of encouraging diversity and innovation. In its response to the June 2018 Call for Views on the Code, the ICO recognised that there are complexities surrounding age verification, yet the draft Code text fails to engage
with any of these. It would be a poor outcome for fundamental rights and a poor message to children about the intrinsic value of these for all if children's safeguarding was to come at the expense of free expression and equal privacy protection for
adults, including adults in vulnerable positions for whom such protections have particular importance. Mass age-gating will not solve the issues the ICO wishes to address with the Code and will instead create further problems. We
urge you to drop this dangerous idea. Yours sincerely, Open Rights Group Index on Censorship Article19 Big Brother Watch Global Partners Digital
|
| |
A new proposal forcing people to brainlessly hand over identity data to any Tom, Dick or Harry website that asks. Open Rights Group suggests we take a stand
|
|
|
 | 30th May 2019
|
|
| From action.openrightsgroup.org See ICO's
Age-Appropriate Design: Code of Practice for Online Services |
New proposals to safeguard children will require everyone to prove they are over 18 before accessing online content. These proposals - from the Information Commissioner's Office (ICO) - aim at protecting children's privacy,
but look likely to sacrifice the free expression of adults and children alike. They are still just plans, though: we believe and hope you can help the ICO strike the right balance, and abandon compulsory age gates, by making your voice heard. The
rules cover websites (including social media and search engines), apps, connected toys and other online products and services. The ICO is requesting public feedback on its proposals until Friday 31 May 2019. Please urgently write
to the consultation to tell them their plan goes too far! You can use these bullet points to help construct your own unique message:
- In its current form, the Code is likely to result in widespread age verification across everyday websites, apps and online services for children and adults alike. Age checks for everyone are a step too far.
- Age checks for everyone could result in online content being removed or services withdrawn.
- Data protection regulators should stick to privacy. It's not the Information Commissioner's job to restrict adults' or children's access to content.
- With no scheme to certify which providers can be trusted, third-party age verification technologies will lead to fakes and scams, putting people's personal data at risk.
- Large age verification providers will seek to offer single-sign-in across a wide variety of online services, which could lead to intrusive commercial tracking of children and adults with devastating personal impacts in the event of a data breach.
|
| |
Presumably GCHQ would rather not have half the population using technology that makes surveillance more difficult
|
|
|
 | 30th May 2019
|
|
| From dailystar.co.uk |
The authorities have admitted for the first time they will be unable to enforce the porn block law if browsers such as Firefox and Chrome roll out DNS over HTTPS encryption. The acknowledgement comes as senior representatives of ISPs privately told
Daily Star Online they believe the porn block law could be delayed. Earlier this month, this publication revealed Mozilla Firefox is thought to be pushing ahead with the roll out of DNS encryption, despite government concerns they and ISPs will be
unable to see what website we are looking at and block them. Speaking at the Internet Service Providers Association's Annual Conference last week, Mark Hoe, from the government's National Cyber Security Centre (NCSC), said they would not be able
to block websites that violate the porn block and enforce the new law. He said: The age verification -- although those are not directly affected [by DNS encryption] it does affect enforcement of access to non-compliant
websites. So, whereas we had previously envisaged that ISPs would be able to block access to non-compliant sites, [those] using DNS filtering techniques don't provide a way around that.
Hoe said that the
browsers were responding to legitimate concerns after the Daily Star reported Google Chrome was thought to have changed its stance on the roll out of encrypted DNS. However, industry insiders still think Firefox will press ahead, potentially
leading to people who want to avoid the ban switching to their browser. In an official statement, a government spokesman told Daily Star Online the law would come into force in a couple of months as planned, but did not explain how it would be
enforced. Meanwhile a survey reveals three quarters of British parents are worried the porn block could leave them open to ID theft because they will be forced to hand over details to get age verified. AgeChecked surveyed 1,500 UK parents and found
73% would be apprehensive about giving personal information as verification online, for fear of how the data would be used. |
| |
Ofcom does its bit to support state internet censorship suggesting that this is what people want
|
|
|
 | 30th May 2019
|
|
| See Online Nation report [pdf] from ofcom.org.uk
|
Ofcom has published a wide-ranging report about internet usage in Britain. Of course Ofcom takes the opportunity to bolster the UK government's push to censor the internet. Ofcom writes: When prompted, 83% of adults expressed
concern about harms to children on the internet. The greatest concern was bullying, abusive behaviour or threats (55%) and there were also high levels of concern about children's exposure to inappropriate content including pornography (49%), violent /
disturbing content (46%) and content promoting self-harm (42%). Four in ten adults (39%) were concerned about children spending too much time on the internet. Many 12 to 15-year-olds said they have experienced potentially harmful
conduct from others on the internet. More than a quarter (28%) said they had had unwelcome friend or follow requests or unwelcome contact, 23% had experienced bullying, abusive behaviour or threats, 20% had been trolled and 19% had experienced someone
pretending to be another person. Fifteen per cent said they had viewed violent or disturbing content. Social media sites, and Facebook in particular, are the most commonly-cited source of online harm for most of the types of
potential harm we asked about. For example, 69% of adults who said they had come across fake news said they had seen it on Facebook. Among 12 to 15-year-olds, Facebook was the most commonly-mentioned source of most of the potentially harmful experiences.
Most adults say they would support more regulation of social media sites (70%), video sharing sites (64%) and instant messenger services (61%). Compared to our 2018 research, support for more online regulation appears to have
strengthened. However, just under half (47%) of adult internet users recognised that websites and social media sites have a careful balance to maintain in terms of supporting free speech, even where some users might find the content offensive.
|
| |
|
|
|
 |
23rd May 2019
|
|
|
The government is quietly creating a digital ID card without us noticing See article
from news.sky.com |
| |
Tom Watson asks in parliament about which internet browsers plan to implement censor busting DNS Over HTTPS technology
|
|
|
 | 22nd May 2019
|
|
| 19th May 2019. See Parliamentary transcription from 16th May from
theyworkforyou.com |
Tom Watson asked a parliamentary question about the censor-busting technology of DNS over HTTPS. Up until now, ISPs have been able to intercept website address look-ups (via a DNS server) and block the ones that they, or the state, don't like. This latest internet protocol allows browsers and applications to bypass ISPs' censored DNS servers and use encrypted alternatives that cannot then be intercepted by ISPs, and so cannot be censored by the state. (Note that browsers can still offer a censored service, such as an option for a family-friendly feed, but this is on their own terms and not the state's.)
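As a rough illustration of what such an encrypted lookup involves, the sketch below builds the kind of HTTPS request a DoH client makes and parses the JSON answer format used by public resolvers. The endpoint follows Cloudflare's published DoH JSON interface; the sample response body is a trimmed, illustrative example rather than a live reply:

```python
# Sketch of a DNS-over-HTTPS lookup. Instead of a cleartext UDP query to
# the ISP's resolver, the client fetches an ordinary HTTPS URL from a DoH
# resolver and parses a JSON answer, so the ISP sees only an encrypted
# connection to the resolver, not the name being looked up.
import json
from urllib.parse import urlencode

DOH_ENDPOINT = "https://cloudflare-dns.com/dns-query"  # Cloudflare's public JSON API

def doh_query_url(hostname: str, rrtype: str = "A") -> str:
    """Build the URL a DoH client would fetch over HTTPS."""
    return f"{DOH_ENDPOINT}?{urlencode({'name': hostname, 'type': rrtype})}"

def parse_doh_answer(body: str) -> list[str]:
    """Extract the resolved addresses from a DoH JSON response body."""
    return [rec["data"] for rec in json.loads(body).get("Answer", [])]

# Trimmed, illustrative example of the JSON such a resolver returns
sample_response = '{"Status": 0, "Answer": [{"name": "example.com", "type": 1, "data": "93.184.216.34"}]}'

print(doh_query_url("example.com"))
print(parse_doh_answer(sample_response))
```

Because the query and answer ride inside an ordinary TLS connection to the resolver, an ISP sitting in the middle has nothing to match against its blocklist.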
Anyway, the Labour Deputy Leader has been enquiring about whether browsers intend to implement the new protocol, perhaps revealing an idea to pressurise browsers into not offering options to circumvent the state's blocking list.
Tom Watson Deputy Leader of the Labour Party, Shadow Secretary of State for Digital, Culture, Media and Sport To ask the Secretary of State for Digital, Culture, Media and Sport, how many internet
browser providers have informed his Department that they will not be adopting the Internet Engineering Task Force DNS over HTTPS ( DOH ) protocol. Margot James The Minister of State, Department for Culture, Media and Sport
How DOH will be deployed is still a subject of discussion within the industry, both for browser providers and the wider internet industry. We are aware of the public statements made by some browser providers on deployment and we
are seeking to understand definitively their rollout plans. DCMS is in discussions with browser providers, internet industry and other stakeholders and we are keen to see a resolution that is acceptable for all parties. Update: Speaking of government pressure
22nd May 2019. See article from edinburghnews.scotsman.com
Here's another indication that the government is trying to preserve its internet censorship capabilities by pressurising browser companies: The Internet Service Providers Association (ISPA) - representing firms
including BT, Virgin, and Sky - has expressed concerns over the implications the encryption on Firefox could have on internet safety. A spokesperson said, We remain concerned about the consequences these proposed changes will have
for online safety and security, and it is therefore important that the Government sends a strong message to the browser manufacturers such as Mozilla that their encryption plans do not undermine current internet safety standards in the UK.
|
| |
|
|
|
 | 22nd May 2019
|
|
|
Proposed controversial online age verification checks could increase the risk of identity theft and other cyber crimes, warn security experts See
article from computerweekly.com |
| |
Firefox has a research project to integrate with TOR to create a Super Private Browsing mode
|
|
|
 | 21st May 2019
|
|
| See article from mozilla-research.forms.fm |
Age verification for porn is pushing internet users into areas of the internet that provide more privacy, security and resistance to censorship. I'd have thought that security services would prefer internet users to remain in the more open areas
of the internet for easier snooping. So I wonder whether protecting kids from stumbling across porn is worth the increased difficulty in monitoring terrorists and the like? Or perhaps GCHQ can already see through the encrypted internet.
RQ12: Privacy & Security for Firefox Mozilla has an interest in potentially integrating more of Tor into Firefox, for the purposes of providing a Super Private Browsing (SPB) mode for our users.
Tor offers privacy and anonymity on the Web, features which are sorely needed in the modern era of mass surveillance, tracking and fingerprinting. However, enabling a large number of additional users to make use of the Tor network
requires solving for inefficiencies currently present in Tor so as to make the protocol optimal to deploy at scale. Academic research is just getting started with regards to investigating alternative protocol architectures and route selection protocols,
such as Tor-over-QUIC, employing DTLS, and Walking Onions. What alternative protocol architectures and route selection protocols would offer acceptable gains in Tor performance? And would they preserve Tor properties? Is it truly
possible to deploy Tor at scale? And what would the full integration of Tor and Firefox look like? |
| |
House of Lords: Questions about DNS over HTTPS
|
|
|
 | 15th May
2019
|
|
| See article from theyworkforyou.com |
At the moment when internet users want to view a page, they specify the page they want in the clear. ISPs can see the page requested and block it if the authorities don't like it. A new internet protocol has been launched that encrypts the specification
of the page requested so that ISPs can't tell what page is being requested, so can't block it. This new DNS Over HTTPS protocol is already available in Firefox which also provides an uncensored and encrypted DNS server. Users simply have to change the
settings in about:config (being careful of the dragons, of course). Questions have been
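For reference, the about:config settings involved are Firefox's trusted recursive resolver (TRR) preferences; the values below are an illustrative configuration rather than a recommendation:

```
network.trr.mode    2    # resolve via DoH, falling back to normal DNS on failure (3 = DoH only)
network.trr.uri     https://mozilla.cloudflare-dns.com/dns-query
```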
raised in the House of Lords about the impact on the UK's ability to censor the internet. House of Lords, 14th May 2019, Internet Encryption Question Baroness Thornton Shadow Spokesperson (Health)
2:53 pm, 14th May 2019 To ask Her Majesty's Government what assessment they have made of the deployment of the Internet Engineering Task Force's new "DNS over HTTPS" protocol and its implications for the blocking of content by internet service providers and the Internet Watch Foundation; and what steps they intend to take in response. Lord Ashton of Hyde The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport
My Lords, DCMS is working together with the National Cyber Security Centre to understand and resolve the implications of DNS over HTTPS, also referred to as DoH, for the blocking of content online. This involves liaising across government and engaging with industry at all levels -- operators, internet service providers, browser providers and pan-industry organisations -- to understand rollout options and influence the way ahead. The rollout of DoH is a complex commercial and
technical issue revolving around the global nature of the internet. Baroness Thornton Shadow Spokesperson (Health) My Lords, I thank the Minister for that Answer, and I apologise to the House for
this somewhat geeky Question. This Question concerns the danger posed to existing internet safety mechanisms by an encryption protocol that, if implemented, would render useless the family filters in millions of homes and the ability to track down
illegal content by organisations such as the Internet Watch Foundation. Does the Minister agree that there is a fundamental and very concerning lack of accountability when obscure technical groups, peopled largely by the employees of the big internet
companies, take decisions that have major public policy implications with enormous consequences for all of us and the safety of our children? What engagement have the British Government had with the internet companies that are represented on the Internet
Engineering Task Force about this matter? Lord Ashton of Hyde The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport My Lords, I thank the noble Baroness for discussing this
with me beforehand, which was very welcome. I agree that there may be serious consequences from DoH. The DoH protocol has been defined by the Internet Engineering Task Force. Where I do not agree with the noble Baroness is that this is not an obscure
organisation; it has been the dominant internet technical standards organisation for 30-plus years and has attendants from civil society, academia and the UK Government as well as the industry. The proceedings are available online and are not restricted.
It is important to know that DoH has not been rolled out yet and the picture in it is complex -- there are pros to DoH as well as cons. We will continue to be part of these discussions; indeed, there was a meeting last week, convened by the NCSC, with
DCMS and industry stakeholders present. Lord Clement-Jones Liberal Democrat Lords Spokesperson (Digital) My Lords, the noble Baroness has raised a very important issue, and it sounds from the
Minister's Answer as though the Government are somewhat behind the curve on this. When did Ministers actually get to hear about the new encrypted DoH protocol? Does it not risk blowing a very large hole in the Government's online safety strategy set out
in the White Paper? Lord Ashton of Hyde The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport As I said to the noble Baroness, the Government attend the IETF. The
protocol was discussed from October 2017 to October 2018, so it was during that process. As far as the online harms White Paper is concerned, the technology will potentially cause changes in enforcement by online companies, but of course it does not
change the duty of care in any way. We will have to look at the alternatives to some of the most dramatic forms of enforcement, which are DNS blocking. Lord Stevenson of Balmacara Opposition Whip (Lords)
My Lords, if there is obscurity, it is probably in the use of the technology itself and the terminology that we have to use--DoH and the other protocols that have been referred to are complicated. At heart, there are two issues at
stake, are there not? The first is that the intentions of DoH, as the Minister said, are quite helpful in terms of protecting identity, and we do not want to lose that. On the other hand, it makes it difficult, as has been said, to see how the Government
can continue with their current plan. We support the Digital Economy Act approach to age-appropriate design, and we hope that that will not be affected. We also think that the soon to be legislated for--we hope--duty of care on all companies to protect
users of their services will help. I note that the Minister says in his recent letter that there is a requirement on the Secretary of State to carry out a review of the impact and effectiveness of the regulatory framework included in the DEA within the
next 12 to 18 months. Can he confirm that the issue of DoH will be included? Lord Ashton of Hyde The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport Clearly, DoH is on
the agenda at DCMS and will be included everywhere it is relevant. On the consideration of enforcement--as I said before, it may require changes to potential enforcement mechanisms--we are aware that there are other enforcement mechanisms. It is not true
to say that you cannot block sites; it makes it more difficult, and you have to do it in a different way. The Countess of Mar Deputy Chairman of Committees, Deputy Speaker (Lords) My Lords, for the
uninitiated, can the noble Lord tell us what DoH means -- very briefly, please? Lord Ashton of Hyde The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport It is not possible
to do so very briefly. It means that, when you send a request to a server and you have to work out which server you are going to by finding out the IP address, the message is encrypted so that the intervening servers are not able to look at what is in
the message. It encrypts the message that is sent to the servers. What that means is that, whereas previously every server along the route could see what was in the message, now only the browser will have the ability to look at it, and that will put more
power in the hands of the browsers. Lord West of Spithead Labour My Lords, I thought I understood this subject until the Minister explained it a minute ago. This is a very serious issue. I was
unclear from his answer: is this going to be addressed in the White Paper ? Will the new officer who is being appointed have the ability to look at this issue when the White Paper comes out? Lord Ashton of Hyde The
Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport It is not something that the White Paper per se can look at, because it is not within the purview of the Government. The protocol is designed by the
IETF, which is not a government body; it is a standards body, so to that extent it is not possible. Obviously, however, when it comes to regulating and the powers that the regulator can use, the White Paper is consulting precisely on those matters,
which include DNS blocking, so it can be considered in the consultation. |
| |
It couldn't possibly be anything to do with her government's policies to impoverish people through austerity, globalisation, benefits sanctions, universal credit failures and the need for food banks
|
|
|
| 15th May 2019
|
|
| See article from
dailymail.co.uk |
Jackie Doyle-Price is the government's first suicide prevention minister. She seems to believe that this complex and tragic social problem can somehow be cured by censorship and an end to free speech. She said society had come to tolerate behaviour
online which would not be tolerated on the streets. She urged technology giants including Google and Facebook to be more vigilant about removing harmful comments. Doyle-Price told the Press Association: It's
great that we have these platforms for free speech and any one of us is free to generate our own content and put it up there, ...BUT... free speech is only free if it's not abused. I just think in terms of implementing their duty of care to
their customers, the Wild West that we currently have needs to be a lot more regulated by them.
|
| |
|
|
|
 | 15th May 2019
|
|
|
Age verification measures pose a tangible threat to sex workers' income and safety. See article from elle.com
|
| |
Government announces new law to ban watching porn in public places
|
|
|
 | 13th May
2019
|
|
| See article from dailymail.co.uk
|
Watching pornography on buses is to be banned, ministers have announced. Bus conductors and the police will be given powers to tackle those who watch sexual material on mobile phones and tablets. Ministers are also drawing up plans for a
national database of claimed harassment incidents. It will record incidents at work and in public places, and is likely to cover wolf-whistling and cat-calling as well as more serious incidents. In addition, the Government is considering whether
to launch a public health campaign warning of the effects of pornography -- modelled on smoking campaigns.
|
| |
The Channel Islands is considering whether to join the UK in the censorship of internet porn
|
|
|
 | 13th May 2019
|
|
| See article from jerseyeveningpost.com |
As of 15 July, people in the UK who try to access porn on the internet will be required to verify their age or identity online. The new UK Online Pornography (Commercial Basis) Regulations 2018 law does not affect the Channel Islands but the
States have not ruled out introducing their own regulations. The UK Department for Censorship, Media and Sport said it was working closely with the Crown Dependencies to make the necessary arrangements for the extension of this legislation to the
Channel Islands. A spokeswoman for the States said they were monitoring the situation in the UK to inform our own policy development in this area.
|
| |
|
|
|
 | 6th May 2019
|
|
|
Detailed legal analysis of Online Harms white paper does not impress See article from cyberleagle.com |
| |
BBFC warns that age verification should not be coupled with electronic wallets
|
|
|
 | 4th
May 2019
|
|
| See article from
bbfc.co.uk |
The BBFC has re-iterated that its Age Verification certification scheme does not allow for personal data to be used for any purpose beyond age verification. In particular, age verification should not be coupled with electronic wallets. Presumably this is intended to prevent personal data identifying porn users from being dangerously stored in databases used for other purposes.
In passing, this suggests that there may be commercial issues as age verification systems for porn may not be reusable for age verification for social media usage or identity verification required for online gambling. I suspect that several AV
providers are only interested in porn as a way to get established for social media age verification. This BBFC warning may be of particular interest to users of the porn site xHamster. The preferred AV option for that website is the electronic
wallet 1Account. The BBFC write in a press release: The Age-verification Regulator under the UK's Digital Economy Act, the British Board of Film Classification (BBFC), has advised age-verification providers that
they will not be certified under the Age-verification Certificate (AVC) if they use a digital wallet in their solution. The AVC is a voluntary, non-statutory scheme that has been designed specifically to ensure age-verification
providers maintain high standards of privacy and data security. The AVC will ensure data minimisation, and that there is no handover of personal information used to verify an individual is over 18 between certified age-verification providers and
commercial pornography services. The only data that should be shared between a certified AV provider and an adult website is a token or flag indicating that the consumer has either passed or failed age-verification. Murray
Perkins, Policy Director for the BBFC, said: A consumer should be able to consider that their engagement with an age-verification provider is something temporary.
In order to
preserve consumer confidence in age-verification and the AVC, it was not considered appropriate to allow certified AV providers to offer other services to consumers, for example by way of marketing or by the creation of a digital wallet. The AVC is
necessarily robust in order to allow consumers a high level of confidence in the age-verification solutions they choose to use. Accredited providers will be indicated by the BBFC's green AV symbol, which is what consumers should
look out for. Details of the independent assessment will also be published on the BBFC's age-verification website, ageverificationregulator.com, so consumers can make an informed choice between age-verification providers. The
Standard for the AVC imposes limits on the use of data collected for the purpose of age-verification, and sets out requirements for data minimisation. The AVC Standard has been developed by the BBFC and NCC Group - who are experts
in cyber security and data protection - in cooperation with industry, with the support of government, including the National Cyber Security Centre and Chief Scientific Advisors, and in consultation with the Information Commissioner's Office. In order to
be certified, AV Providers will undergo an on-site audit as well as a penetration test. Further announcements will be made on AV Providers' certification under the scheme ahead of entry into force on July 15.
|
| |
Well known security expert does a bit of a hatchet job on the BBFC Age Verification Certificate Standard
|
|
|
 |
27th April 2019
|
|
| See article from threadreaderapp.com Also an
article from twitter.com |
Starting with a little background into the authorship of the document under review. AVSecure CMO Steve Winyard told XBIZ: The accreditation plan appears to have very strict rules and was crafted with significant
input from various governmental bodies, including the DCMS (Department for Culture, Media & Sport), NCC Group plc (an expert security and audit firm), GCHQ (U.K. Intelligence and Security Agency), ICO (Information Commissioner's Office) and of course the
BBFC.
But computer security expert Alec Muffett writes: This is the document which is being proffered to protect the facts & details of _YOUR_ online #Porn viewing. Let's read it together!
What could possibly go wrong? .... This document's approach to data protection is fundamentally flawed. The (considerably) safer approach - one easier to certificate/validate/police -
would be to say that everything is forbidden except for an explicit, reviewable list of permitted items; you would then allow vendors to appeal for exceptions under review. It makes a few passes at
pretending that this is what it's doing, but with subjective holes (highlighted in green in the thread) that you can drive a truck through:
... What we have here is a rehash of quite a lot of reasonable physical/operational security, business continuity & personnel security management thinking -- with digital stuff almost entirely punted.
It's better than #PAS1296 , but it's still not fit for purpose.
Read the full thread
|
| |
Does the BBFC AV kite mark mean that an age verification service is safe?
|
|
|
 | 22nd April 2019
|
|
| See BBFC Age-verification Certificate
Standard [pdf] from ageverificationregulator.com See article from avsecure.com |
The BBFC has published a detailed standard for age verifiers to be tested against to obtain a green AV kite mark, aiming to convince users that their identity data and porn browsing history are safe. I have read through the document and conclude
that it is indeed a rigorous standard that I guess will be pretty tough for companies to obtain. I would say it would be almost impossible for a small or even medium size website to achieve the standard and more or less means that using an age
verification service is mandatory. The standard has lots of good stuff about physical security of data and vetting of staff access to the data. Age verifier AVSecure commented: We received the final
documents and terms for the BBFC certification scheme for age verification providers last Friday. This has had significant input from various Government bodies including DCMS (Dept for Culture, Media & Sport), NCC Group plc (expert security and audit
firm), GCHQ (UK Intelligence & Security Agency) ICO (Information Commissioner's Office) and of course the BBFC (the regulator). The scheme appears to have very strict rules. It is a multi-disciplined scheme
which includes penetration testing, full and detailed audits, operational procedures over and above GDPR and the DPA 2018 (Data Protection Act). There are onerous reporting obligations with inspection rights attached. It is also a very costly scheme when
compared to other quality standard schemes, again perhaps designed to deter the faint of heart or shallow of pocket. Consumers will likely be advised against using any systems or methods where the prominent green AV accreditation
kitemark symbol is not displayed.
But will the age verifier be logging your ID data and browsing history?
And the answer is very hard to pin down from the document. At first read it suggests that minimal data will be retained, but a more sceptical read, connecting a few paragraphs together, suggests that the verifier will be required to keep extensive records
about the user's porn activity. Maybe this is a reflection of a recent change of heart. Comments from AVSecure suggested that the BBFC/Government originally mandated a log of user activity but recently decided that keeping a log or not is down to
the age verifier. As an example of the rather evasive requirements: 8.5.9 Physical Location Personal data relating to the physical location of a user shall not be collected as part of the
age-verification process unless required for fraud prevention and detection. Personal data relating to the physical location of a user shall only be retained for as long as required for fraud prevention and detection.
Here it sounds
like keeping tabs on location is optional, but another paragraph suggests otherwise: 8.4.14 Fraud Prevention and Detection Real-time intelligent monitoring and fraud prevention and detection
systems shall be used for age-verification checks completed by the age-verification provider.
Now it seems that fraud prevention is mandatory, and so a location record is mandatory after all. Also the use of the phrase 'only be retained for as long as required for fraud prevention and detection'
seems a little misleading too, as in reality fraud prevention will be required for as long as the customer keeps on using the service. This may as well be forever. There are other statements that sound good at first read, but don't really offer
anything substantial: 8.5.6 Data Minimisation Only the minimum amount of personal data required to verify a user's age shall be collected.
But if the minimum is to provide
name and address plus, say, a driving licence number or a credit card number, then the minimum is actually pretty much all of it. In fact only the porn pass methods offer any scope for truly minimal data collection. Perhaps minimal data
also applies to the verified mobile phone method: although the phone company probably knows your identity, maybe it won't need to pass it on to the age verifier. What does the porn site get to know?
The rare unequivocal and reassuring statement is 8.5.8 Sharing Results Age-verification providers shall only share the result of an age-verification check (pass or fail) with the requesting
website.
So it seems that identity details won't be passed to the websites themselves. However the converse is not so clear: 8.5.6 Data Minimisation Information about
the requesting website that the user has visited shall not be collected against the user's activity.
Why add the phrase 'against the user's activity'? This is worded such that information about the requesting website could
indeed be collected for another reason, fraud detection maybe. Maybe the scope for an age verifier to maintain a complete log of porn viewing is limited more by the practical requirement for a website to record a successful age verification in a
cookie such that the age verifier only gets to see one interaction with each website. No doubt we shall soon find out whether the government wants a detailed log of porn viewed, as it will be easy to spot if a website queries the age verifier for
every film you watch.
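The cookie mechanism described above can be sketched as follows. This is a hypothetical flow, not AgeChecked's or any real provider's API: `check_age` here simply stands in for the single pass/fail round-trip to the verifier, and the cookie name `av_passed` is invented for illustration.

```python
# Minimal sketch of the flow described above: the site queries the AV
# provider once, then records the pass in a cookie so subsequent page
# views never reach the verifier. Names (check_age, av_passed) are
# illustrative, not a real API.

def handle_request(cookies: dict, check_age) -> tuple[str, dict]:
    """Return (response, cookies). check_age() stands in for one
    pass/fail round-trip to the age-verification provider."""
    if cookies.get("av_passed") == "1":
        return "serve content", cookies        # verifier never contacted again
    if check_age():                            # single pass/fail token, no identity data
        new_cookies = dict(cookies, av_passed="1")
        return "serve content", new_cookies
    return "show AV screen", cookies
```

Under a scheme like this, the age verifier sees at most one interaction per site rather than a log of every film viewed, which is the limit on record-keeping suggested above.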
Fraud Detection And what about all these references to fraud detection? Presumably the BBFC/Government is a little worried that passwords and accounts will be shared by enterprising kids. But on the other hand
it may make life tricky for those using shared devices, or perhaps those who suddenly move from London to New York in an instant, when in fact this is totally normal for someone using a VPN on a PC. Wrap up
The BBFC/Government
have moved on a long way from the early days when the lawmakers created the law without any real protection for porn users and the BBFC first proposed that this could be rectified by asking porn companies to voluntarily follow 'best practice' in keeping
people's data safe. A definite improvement now, but I think I will stick to my VPN. |
| |
It's good to see the internet community pull together to work around censorship via age verification
|
|
|
 | 22nd April 2019
|
|
| Thanks to Jon and Kath 6th April 2019. See
article from prolificlondon.co.uk See also
iwantfourplay.com |
A TV channel, a porn producer, an age verifier and maybe even the government got together this week to put out a live test of age verification. The test was implemented on a specially created website featuring a single porn video. The test
required a well advertised website to provide enough traffic of viewers positively wanting to see the content. Channel 4 obliged with its series Mums Make Porn. The series followed a group of mums making a porn video that they felt would be more
sex positive and less harmful to kids than the more typical porn offerings currently on offer. The mums did a good job and produced a decent video with a more loving and respectful interplay than is the norm. The video however is still proper
hardcore porn and there is no way it could be broadcast on Channel 4. So the film was made available, free of charge, on its own dedicated website complete with an age verification requirement. The website was announced as a live test for
AgeChecked software to see how age verification would pan out in practice. It featured the following options for age verification
- entering full credit card details + email
- entering driving licence number + name and address + email
- mobile phone number + email (the phone must have been verified as 18+ by the service provider and must be ready to receive an
SMS message containing login details)
Nothing has been published in detail about the aims of the test but presumably they were interested in the basic questions such as:
- What proportion of potential viewers will be put off by the age verification?
- What proportion of viewers would be stupid enough to enter their personal data?
- Which options of identification would be preferred by viewers?
The official test 'results' Alastair Graham, CEO of AgeChecked, provided a few early answers, inevitably claiming that: The results of this first mainstream test of our software were hugely
encouraging.
He went on to claim that customers are willing to participate in the process, but noted that the verified phone number method emerged as by far the most popular method of verification. He said that this finding would
be a key part of this process moving forward. Reading between the lines perhaps he was saying that there wasn't much appetite for handing over detailed personal identification data as required by the other two methods. I suspect that we
will never get to hear more from AgeChecked especially about any reluctance of people to identify themselves as porn viewers. The unofficial test results
Maybe they were also interested in other questions too:
- Will people try and work around the age verification requirements?
- if people find weaknesses in the age verification defences, will they pass on their discoveries to others?
Interestingly the age verification requirement was easily sidestepped by those with a modicum of knowledge about downloading videos from websites such as YouTube and PornHub. The age verification mechanism effectively only hid the start button from
view. The actual video remained available for download, whether people age verified or not. All it took was a little examination of the page code to locate the video. There are several tools that allow this: video downloader addons, file downloaders or
just using the browser's built in debugger to look at the page code. Presumably the code for the page was knocked up quickly so this flaw could have been a simple oversight that is not likely to occur in properly constructed commercial websites.
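A minimal illustration of why hiding the play button offers no protection: if the media URL is present in the served HTML, a trivial scan of the page source finds it, whether or not age verification has been passed. The markup and URL below are hypothetical, not taken from the actual test site.

```python
import re

def find_video_sources(html: str) -> list[str]:
    """Collect src URLs from <video> and <source> tags in a page.

    Client-side tricks (hiding the play button, CSS overlays) do not
    remove these URLs from the HTML that the server actually sends.
    """
    pattern = re.compile(
        r'<(?:video|source)[^>]*\bsrc=["\']([^"\']+)["\']',
        re.IGNORECASE,
    )
    return pattern.findall(html)
```

This is essentially what video downloader add-ons and the browser's built-in debugger do; the only real protection is to withhold the media URL server-side until verification has succeeded.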
Or perhaps the vulnerability was deliberately included as part of the test to see if people would pick up on it. However it did identify that there is a community of people willing to stress test age verification restrictions and see if workarounds
can be found and shared. I noted on Twitter that several people had posted about the ease of downloading the video and had suggested a number of tools or methods that enabled this. There was also an interesting article posted on
achieving age verification using an expired credit card. Maybe that is not so catastrophic, as it still identifies a cardholder as over 18, even if it cannot be used to make a payment. But of course it may open new possibilities for misuse of old data. Note
that random numbers are unlikely to work because of security algorithms. Presumably age verification companies could strengthen the security by testing that a small transaction works, but intuitively this would have significant cost implications. I
guess that to achieve any level of take up, age verification needs to be cheap for both websites and viewers. Community Spirit It was very heartening to see how many people were helpfully contributing their thoughts about
testing the age verification software. Over the course of a couple of hours reading, I learnt an awful lot about how websites hide and protect video content, and what tools are available to see through the protection. I suspect that many others
will soon be doing the same... and I also suspect that young minds will be far more adept than I at picking up such knowledge. A final thought I feel a bit sorry for small websites who sell content. It adds a whole new level of
complexity as a currently open preview area now needs to be locked away behind an age verification screen. Many potential customers will be put off by having to jump through hoops just to see the preview material. To then ask them to enter all their
credit card details again to subscribe may be a hurdle too far. Update: The Guardian reports that age verification checks were easily circumvented 22nd April 2019. See
article from theguardian.com
The Guardian reported that the credit card check used by AgeChecked could be easily fooled by entering a completely made-up credit card number. Note that a purely random number will not work, as there is a well known check digit algorithm which invalidates most
random numbers. But anyone who knows or looks up the algorithm would be able to generate acceptable credit card numbers that would at least defeat AgeChecked. Or they would have been able to, had AgeChecked not now removed the credit card check
option from its choice of options. Still, the damage was done: the widely distributed Guardian article established doubts about the age verification process. Of course the workaround is not exactly trivial and will still stop younger kids
from 'stumbling on porn', which seems to be the main fall back position of this entire sorry scheme. |
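The check digit algorithm referred to above is the Luhn checksum, the public check digit scheme used on payment card numbers. A minimal sketch of the validity test (illustrative only; it says nothing about whether a card actually exists or can take a payment, which is exactly the gap the Guardian exploited):

```python
def luhn_valid(number: str) -> bool:
    """Return True if the digit string passes the Luhn checksum.

    Non-digit characters (spaces, dashes) are ignored. From the
    rightmost digit, every second digit is doubled, with 9 subtracted
    when the result exceeds 9; the total must be divisible by 10.
    """
    digits = [int(ch) for ch in number if ch.isdigit()]
    total = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0
```

Because the algorithm is public, generating a number that passes it is trivial, which is why a checksum alone can never prove that a real, valid card is being presented.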
| |
David Flint looks into flimsy porn evidence used to justify government censorship
|
|
|
 | 22nd
April 2019
|
|
| See article from
reprobatepress.com |
|
| |
|
|
|
 | 22nd April
2019
|
|
|
John Carr, a leading supporter of the government's porn censorship regime, is a little exasperated by its negative reception in the media See
article from johnc1912.wordpress.com |
| |
|
|
|
 | 21st April 2019
|
|
|
Politics, privacy and porn: the challenges of age-verification technology. By Ray Allison See article from computerweekly.com
|
| |
VPNCompare reports a significant increase in website visitors in response to upcoming porn censorship. Meanwhile the age verification options announced so far for major websites seem to be app-only
|
|
|
 |
20th April 2019
|
|
| See article from vpncompare.co.uk |
VPNCompare is reporting that internet users in Britain are responding to the upcoming porn censorship regime by investigating the option to get a VPN so as to workaround most age verification requirements without handing over dangerous identity
details. VPNCompare says that the number of UK visitors to its website has increased by 55% since the start date of the censorship scheme was announced. The website also stated that Google searches for VPNs had tripled. Website editor, Christopher
Seward told the Independent: We saw a 55 per cent increase in UK visitors alone compared to the same period the previous day. As the start date for the new regime draws closer, we can expect this number to rise even
further and the number of VPN users in the UK is likely to go through the roof. The UK Government has completely failed to consider the fact that VPNs can be easily used to get around blocks such as these.
Whilst the immediate assumption is that porn viewers will reach for a VPN to avoid handing over dangerous identity information, there may be another reason to take out a VPN, a lack of choice of appropriate options for age validation.
3 companies run the 6 biggest adult websites. Mindgeek owns Pornhub, RedTube and YouPorn; then there is xHamster; and finally Xvideos and xnxx are connected. Now Mindgeek has announced that it will partner with Portes Card for age
verification, which has options for identity verification, giving an age-verified mobile phone number, or else buying a voucher in a shop and showing age ID to the shopkeeper (which is hopefully not copied or recorded). Meanwhile xHamster has
announced that it is partnering with 1Account, which accepts a verified mobile phone, credit card, debit card, or UK driving licence. It does not seem to have an option for anonymous verification beyond a phone being age verified without having to show
ID. Perhaps most interesting is that both of these age verifiers are smartphone-based apps. Perhaps the only option for people without a phone is to get a VPN. I also spotted that most age verification providers that I have looked at seem to be
only interested in UK cards, driving licences or passports. I'd have thought there may be legal issues in not accepting EU equivalents. But foreigners may also be in the situation of not being able to age verify and so need a VPN. And of course,
given that there is no age verification option common to the major porn websites, it may just turn out to be an awful lot simpler just to get a VPN. |
| |
|
|
|
 | 20th April
2019
|
|
|
An interesting look at the government's Online Harms white paper proposing extensive internet censorship for the UK See article from cyberleagle.com
|
| |
Is your identity data and porn browsing history safe with an age verification service sporting a green BBFC AV badge?...Err...No!...
|
|
|
 | 19th April 2019
|
|
| See article from ageverificationregulator.com |
The Interrogator : Is it safe?
The BBFC (on its Age Verification website)...err...no!...: An assessment and accreditation under the AVC is not a
guarantee that the age-verification provider and its solution (including its third party companies) comply with the relevant legislation and standards, or that all data is safe from malicious or criminal interference. Accordingly
the BBFC shall not be responsible for any losses, damages, liabilities or claims of whatever nature, direct or indirect, suffered by any age-verification provider, pornography services or consumers/ users of age-verification provider's services or
pornography services or any other person as a result of their reliance on the fact that an age-verification provider has been assessed under the scheme and has obtained an Age-verification Certificate or otherwise in connection with the scheme.
|
| |
|
|
|
 | 18th April 2019
|
|
|
But it will spell the end of ethical porn. By Girl on the Net See article from theguardian.com
|
| |
The government announces that its internet porn censorship scheme will come into force on 15th July 2019
|
|
|
 | 17th April 2019
|
|
| See press release from gov.uk
|
The UK will become the first country in the world to bring in age-verification for online pornography when the measures come into force on 15 July 2019. It means that commercial providers of online pornography will be required by law to carry out
robust age-verification checks on users, to ensure that they are 18 or over. Websites that fail to implement age-verification technology face having payment services withdrawn or being blocked for UK users. The British Board of Film
Classification (BBFC) will be responsible for ensuring compliance with the new laws. They have confirmed that they will begin enforcement on 15 July, following an implementation period to allow websites time to comply with the new standards. Minister for Digital Margot James said that she wanted the UK to be the most censored place in the world to be online:
Adult content is currently far too easy for children to access online. The introduction of mandatory age-verification is a world-first, and we've taken the time to balance privacy concerns with the need to protect
children from inappropriate content. We want the UK to be the safest place in the world to be online, and these new laws will help us achieve this.
Government has listened carefully to privacy concerns and is clear that
age-verification arrangements should only be concerned with verifying age, not identity. In addition to the requirement for all age-verification providers to comply with General Data Protection Regulation (GDPR) standards, the BBFC have created a
voluntary certification scheme, the Age-verification Certificate (AVC), which will assess the data security standards of AV providers. The AVC has been developed in cooperation with industry, with input from government. Certified age-verification
solutions which offer these robust data protection conditions will be certified following an independent assessment and will carry the BBFC's new green 'AV' symbol. Details will also be published on the BBFC's age-verification website,
ageverificationregulator.com so consumers can make an informed choice between age-verification providers. BBFC Chief Executive David Austin said: The introduction of age-verification to restrict access to
commercial pornographic websites to adults is a ground breaking child protection measure. Age-verification will help prevent children from accessing pornographic content online and means the UK is leading the way in internet safety.
On entry into force, consumers will be able to identify that an age-verification provider has met rigorous security and data checks if they carry the BBFC's new green 'AV' symbol.
The change in law is part of the
Government's commitment to making the UK the safest place in the world to be online, especially for children. It follows last week's publication of the Online Harms White Paper which set out clear responsibilities for tech companies to keep UK citizens
safe online, how these responsibilities should be met and what would happen if they are not. |
| |
When spouting on about keeping porn users data safe the DCMS proves that it simply can't be trusted by revealing journalists' private emails
|
|
|
 | 17th April
2019
|
|
| See article from bbc.com |
|
| Believe us, we can cure all society's ills
|
A government department responsible for data protection laws has shared the private contact details of hundreds of journalists. The Department for Censorship, Media and Sport emailed more than 300 recipients in a way that allowed their
addresses to be seen by other people. The email - seen by the BBC - contained a press release about age verification for adult websites. Digital Minister Margot James said the incident was embarrassing. She added:
It was an error and we're evaluating at the moment whether that was a breach of data protection law. In the email sent on Wednesday, the department claimed new rules would offer robust data protection conditions,
adding: Government has listened carefully to privacy concerns. |
| |
|
|
|
 | 17th April 2019
|
|
|
Instead of regulating the internet to protect young people, give them a youth-net of their own. By Conor Friedersdorf See article
from theatlantic.com |
|
|