MPs nod through the BBFC internet porn censorship guidelines
19th December 2018
See parliamentary transcription from theyworkforyou.com
See TV recording from parliamentlive.tv
The House of Commons approved the upcoming internet porn censorship scheme to be implemented by the BBFC from about Easter 2019. The debate was scheduled in three sections, one to approve each of the three documents defining the BBFC censorship guidelines. Each was allotted 90 minutes for a detailed debate on how the BBFC would proceed. However, following a Brexit debate, proceedings were curtailed to a single 90 minute session covering all three sections. It didn't matter much, as the debate consisted only of MPs with a feminist agenda saying that the scope of the censorship didn't go far enough. Even the government spokeswoman leading the debate didn't understand why the rules didn't go further in extending censorship to social media, and why the range of porn to be banned outright wasn't more extensive. Hardly a word said was relevant to the topic of examining the BBFC guidelines. Issues of practicality, privacy, and the endangerment of porn viewers through fraud, outing and blackmail are clearly of no interest to MPs. The MPs duly nodded their approval of the BBFC regime, so it will soon be announced when the censorship will commence. An age verification service provider was quick to follow up with a press release extolling the virtues of its porn viewing card approach. Several newspapers obligingly published articles using it, eg:

See Porn sites 'will all require proof of age from April 2019' -- here's how it'll work from metro.co.uk
The upcoming porn censorship regime has been approved by the Lords
14th December 2018
See article from xbiz.com
See transcript of the Lords debate from theyworkforyou.com
On Tuesday the House of Lords approved the BBFC's scheme to implement internet porn censorship in the UK. Approval will now be sought from the House of Commons. The debate in the Lords mentioned a few issues in passing, but the lords seemed to be avoiding talking about some of the horrors of the scheme. The Digital Economy Act defining the law behind the scheme imposes no legal requirement on age verification providers to restrict how they can use porn viewers' data. Lords mentioned that the data is protected under GDPR rules, but those rules still let companies do whatever they like with data, with the sole proviso that they ask for consent. Of course, consent is effectively mandatory to sign up for age verification, and some of the biggest internet companies in the world have set the precedent that wide-ranging use of data can be justified by claiming it will be used, say, to 'improve customer experience'.

Even if the lords didn't push very hard, people at the DCMS or BBFC have been considering this deficiency, and have come up with the idea that data use should be voluntarily restricted according to a kitemark scheme. Age verification schemes will have their privacy protections audited by some independent group, and if they pass they can display a gold star. Porn viewers are then expected to trust age verification schemes with a gold star. Unfortunately it sounds a little like the sort of process that decided that cladding was safe for high-rise blocks of flats.

The lords were much more concerned about the age verification requirements for social media and search engines, notably Twitter and Google Images. Clearly schemes for checking that users are 13 will be technically very different from an 18-only check. So the Government explained that these wider issues will be addressed in a new censorship white paper to be published in 2019.
The lords were also a bit perturbed that the definition of banned material wasn't wide enough for their own preferences. Under the current scheme the BBFC will be expected to totally ban any websites with child porn or extreme porn. The lords wondered why this wasn't extended to cartoon porn and beyond-R18 porn, presumably thinking of fisting, golden showers and the like. In reality, if the definition of bannable porn were extended, then every major porn website in the world would have to be banned by the BBFC. In any case, the government is changing its censorship rules such that fisting and golden showers are, or will soon be, allowable at R18 anyway.

The debate revealed that the banks and payment providers have already agreed to block payments to websites banned by the BBFC. The government also confirmed its intention to get the scheme up and running by April. That said, it would seem a little unfair for websites' 3 month implementation period to be set running before their age verification options are accredited with their gold stars; otherwise some websites would waste time and money implementing schemes that may later be declared unacceptable.

Next, a motion to approve draft legislation over the UK's age-verification regulations will be debated in the House of Commons. Stephen Winyard, AVSecure's chief marketing officer, told XBIZ:

We are particularly pleased that the prime minister is set to approve the draft guidance for the age-verification law on Monday. From this, the Department for Digital, Culture, Media and Sport will issue the effective start date and that will be around Easter.

But maybe the prime minister has a few more urgent issues on her mind at the moment.
The government's age verification scheme, which leaves people's sensitive sexual preferences unprotected by law, is to be presented for approval to the House of Lords
10th December 2018
See article from lordsbusiness.parliament.uk
The following four motions are expected to be debated together in the House of Lords on 11th December 2018:

Online Pornography (Commercial Basis) Regulations 2018

Lord Ashton of Hyde to move that the draft Regulations laid before the House on 10 October be approved. Special attention drawn to the instrument by the Joint Committee on Statutory Instruments, 38th Report; 4th Report from the Secondary Legislation Scrutiny Committee (Sub-Committee B).

Guidance on Age-verification Arrangements

Lord Ashton of Hyde to move that the draft Guidance laid before the House on 25 October be approved. Special attention drawn to the instrument by the Joint Committee on Statutory Instruments, 39th Report; 4th Report from the Secondary Legislation Scrutiny Committee (Sub-Committee B).

Lord Stevenson of Balmacara to move that this House regrets that the draft Online Pornography (Commercial Basis) Regulations 2018 and the draft Guidance on Age-verification Arrangements do not bring into force section 19 of the Digital Economy Act 2017, which would have given the regulator powers to impose a financial penalty on persons who have not complied with their instructions to require that they have in place an age verification system which is fit for purpose and effectively managed so as to ensure that commercial pornographic material online will not normally be accessible by persons under the age of 18.

Guidance on Ancillary Service Providers

Lord Ashton of Hyde to move that the draft Guidance laid before the House on 25 October be approved. Special attention drawn to the instrument by the Joint Committee on Statutory Instruments, 39th Report; 4th Report from the Secondary Legislation Scrutiny Committee (Sub-Committee B).

The DCMS and BBFC age verification scheme has been widely panned, as fundamentally the law provides no requirement to actually protect the identity data that can be coupled with people's sexual preferences and sexuality. The scheme only offers voluntary suggestions that age verification services and websites should protect their users' privacy. One only has to look at Google, Facebook and Cambridge Analytica to see how worthless mere advice is. GDPR is often quoted, but that only requires that user consent is obtained. One will simply have to tick the 'improved user experience' consent box to watch the porn, and thereafter the companies can do what the fuck they like with the data.

See criticism of the scheme:

Security expert provides a detailed break down of the privacy and security failures of the age verification scheme

Parliamentary scrutiny committee condemns BBFC Age Verification Guidelines

Parliamentary scrutiny committee condemns as 'defective' a DCMS Statutory Instrument excusing Twitter and Google images from age verification.
8th December 2018
No sex please, we're beholden to our advertisers. By Violet Blue. See article from engadget.com
Uganda blocks 27 internet porn websites
6th December 2018
See article from the-star.co.ke
ISPs in Uganda have blocked 27 pornography websites after a directive was issued by the Uganda Communications Commission. Pornhub, Xvideos and Youporn were among the top 100 most visited websites. The Daily Monitor reports that at least 25 of the 27 banned websites cannot be accessed on mobile phones. However, users of Virtual Private Networks can still access the banned sites.

Annette Kezaabu, chairperson of the Pornography Control Committee, told the Monitor that there has been a drop in the number of people accessing pornography since the prominent porn sites were blocked. She said:

We have a team that is compiling a list of other porn sites that will be blocked. We anticipate that some people will open up new sites, but this is a continuous process.
5th December 2018
The New Zealand film censor, with a keen eye on upcoming UK censorship, publishes a report on porn viewing by the young and inevitably finds that they want porn to be censored. See report [pdf] from classificationoffice.govt.nz
Parliamentary scrutiny committee condemns as 'defective' a DCMS Statutory Instrument excusing Twitter and Google images from age verification. Presumably one of the reasons for the delayed introduction
3rd December 2018
See article from publications.parliament.uk
There's a joint committee to scrutinise laws passed in parliament via Statutory Instruments. These are laws that are not generally presented to parliament for discussion, and are passed by default unless challenged. The committee has now taken issue
with a DCMS law to excuse the likes of social media and search engines from requiring age verification for any porn images that may get published on the internet. The committee reports from a session on 21st November 2018 that the law was defective and
'makes an unexpected use of the enabling power'. Presumably this means that the DCMS has gone beyond the scope of what can be passed without full parliamentary scrutiny.

Draft S.I.: Reported for defective drafting and for unexpected use of powers

Online Pornography (Commercial Basis) Regulations 2018

7.1 The Committee draws the special attention of both Houses to these draft Regulations on the grounds that they are defectively drafted and make an unexpected use of the enabling power.

7.2 Part 3 of the Digital Economy Act 2017 ("the 2017 Act") contains provisions designed to prevent persons under the age of 18 from accessing internet sites which contain pornographic material. An age-verification regulator is given a number of powers to enforce the requirements of Part 3, including the power to impose substantial fines.

7.3 Section 14(1) is the key requirement. It provides:

"A person contravenes [Part 3 of the Act] if the person makes pornographic material available on the internet to persons in the United Kingdom on a commercial basis other than in a way that secures that, at any given time, the material is not normally accessible by persons under the age of 18".
7.4 The term "commercial basis" is not defined in the Act itself. Instead, section 14(2) confers a power on the Secretary of State to specify in regulations the circumstances in which, for the purposes of Part 3, pornographic material is or is not to be regarded as made available on a commercial basis. These draft regulations would be made in exercise of that power. Regulation 2 provides:

"(1) Pornographic material is to be regarded as made available on the internet to persons in the United Kingdom on a commercial basis for the purposes of Part 3 of the Digital Economy Act 2017 if either paragraph (2) or (3) are met.

(2) This paragraph applies if access to that pornographic material is available only upon payment.

(3) This paragraph applies (subject to paragraph (4)) if the pornographic material is made available free of charge and the person who makes it available receives (or reasonably expects to receive) a payment, reward or other benefit in connection with making it available on the internet.

(4) Subject to paragraph (5), paragraph (3) does not apply in a case where it is reasonable for the age-verification regulator to assume that pornographic material makes up less than one-third of the content of the material made available on or via the internet site or other means (such as an application program) of accessing the internet by means of which the pornographic material is made available.

(5) Paragraph (4) does not apply if the internet site or other means (such as an application program) of accessing the internet (by means of which the pornographic material is made available) is marketed as an internet site or other means of accessing the internet by means of which pornographic material is made available to persons in the United Kingdom."
7.5 The Committee finds these provisions difficult to understand, whether as a matter of simple English or as legal propositions. Paragraphs (4) and (5) are particularly obscure.

7.6 As far as the Committee can gather from the Explanatory Memorandum, the policy intention is that a person will be regarded as making pornographic material available on the internet on a commercial basis if:

(A) a charge is made for access to the material; OR

(B) the internet site is accessible free of charge, but the person expects to receive a payment or other commercial benefit, for example through advertising carried on the site.

7.7 There is, however, an exception to (B): in cases in which no access charge is made, the person will NOT be regarded as making the pornographic material available on a commercial basis if the material makes up less than one-third of the content on the internet site--even if the person expects to receive a payment or other commercial benefit from the site. But that exception does not apply in a case where the person markets it as a pornographic site, or markets an "app" as a means of accessing pornography on the site.

7.8 As the Committee was doubtful whether regulation 2 as drafted is effective to achieve the intended result, it asked the Department for Digital, Culture, Media and Sport a number of questions. These were designed to elicit information about the regulation's meaning and effect.

7.9 The Committee is disappointed with the Department's memorandum in response, printed at Appendix 7: it fails to address adequately the issues raised by the Committee.

7.10 The Committee's first question asked the Department to explain why paragraph (1) of regulation 2 refers to whether either paragraph (2) or (3) "are met" rather than "applies". The Committee raised this point because paragraphs (2) and (3) each begin with "This paragraph applies if ...". There is therefore a mismatch between paragraph (1) and the subsequent paragraphs, which could make the regulation difficult to interpret. It would be appropriate to conclude paragraph (1) with "is met" only if paragraphs (2) and (3) began with "The condition in this paragraph is met if ...". The Department's memorandum does not explain this discrepancy. The Committee accordingly reports regulation 2(1) for defective drafting.

7.11 The first part of the Committee's second question sought to probe the intended effect of the words in paragraph (4) of regulation 2 italicised above, and how the Department considers that effect is achieved.

7.12 While the Department's memorandum sets out the policy reasons for setting the one-third threshold, it offers little enlightenment on whether paragraph (4) is effective to achieve the policy aims. Nor does it deal properly with the second part of the Committee's question, which sought clarification of the concept of "one-third of ... material ... on ... [a] means ... of accessing the internet ...".

7.13 The Committee is puzzled by the references in regulation 2(4) to the means of accessing the internet. Section 14(2) of the 2017 Act confers a power on the Secretary of State to specify in regulations circumstances in which pornographic material is or is not to be regarded as made available on the internet on a commercial basis. The means by which the material is accessed (for example, via an application program on a smart phone) appears to be irrelevant to the question of whether it is made available on the internet on a commercial basis. The Committee remains baffled by the concept of "one-third of ... material ... on [a] means ... of accessing the internet".

7.14 More generally, regulation 2(4) fails to specify how the one-third threshold is to be measured and what exactly it applies to. Will the regulator be required to measure one-third of the pictures or one-third of the words on a particular internet site, or both together? And will a single webpage on the site count towards the total if less than one-third of the page's content is pornographic--for example, a sexually explicit picture occupying 32% of the page, with the remaining 68% made up of an article about fishing? The Committee worries that the lack of clarity in regulation 2(4) may afford the promoter of a pornographic website opportunities to circumvent Part 3 of the 2017 Act.

7.15 The Committee is particularly concerned that a promoter may make pornographic material available on one or more internet sites containing multiple pages, more than two-thirds of which are non-pornographic. For every 10 pages of pornography, there could be 21 pages about (for example) gardening or football. Provided the sites are not actively marketed as pornographic, they would not be regarded as made available on a commercial basis. This means that Part 3 of the Act would not apply, and the promoter would be free to make profits through advertising carried on the sites, while taking no steps at all to ensure that they were inaccessible to persons under 18.

7.16 The Committee anticipates that the shortcomings described above are likely to cause significant difficulty in the application and interpretation of regulation 2(4). The Committee also doubts whether Parliament contemplated, when enacting Part 3 of the 2017 Act, that the power conferred by section 14(2) would be exercised in the way provided for in regulation 2(4). The Committee therefore reports regulation 2(4) for defective drafting and on the ground that it appears to make an unexpected use of the enabling power.
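The Committee's complaint about measurement can be made concrete. The following is a hypothetical sketch (the function names, the page-counting rules and all the numbers are illustrative assumptions, not taken from the regulations or from any BBFC tooling) showing how the same site can fall either side of the one-third threshold depending on how "content" is measured:

```python
# Illustrative only: two plausible ways a regulator might measure the
# one-third threshold in regulation 2(4), applied to the Committee's own
# worked example (10 fully pornographic pages plus 21 pages that are 32%
# explicit image and 68% fishing article).

ONE_THIRD = 1 / 3

def porn_fraction_by_pages(pages):
    """Count a page as pornographic only if most of its content is porn,
    then take the fraction of such pages on the site."""
    porn_pages = sum(1 for p in pages if p["porn_share"] > 0.5)
    return porn_pages / len(pages)

def porn_fraction_by_share(pages):
    """Alternatively, average the pornographic share of every page."""
    return sum(p["porn_share"] for p in pages) / len(pages)

site = [{"porn_share": 1.0}] * 10 + [{"porn_share": 0.32}] * 21

# By page count: 10/31, just under one-third, so outside Part 3.
print(porn_fraction_by_pages(site) < ONE_THIRD)
# By averaged share: roughly 0.54, well over one-third, so inside Part 3.
print(porn_fraction_by_share(site) > ONE_THIRD)
```

The point is not that either rule is correct; it is that regulation 2(4) does not say which (if either) applies, which is exactly the gap the Committee says a promoter could exploit.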
26th November 2018
Beyond the massive technical challenge, filters are a lazy alternative to effective sex education. By Lux Alptraum. See article from theverge.com
Further demonstrating how dangerous it is for the government to demand that identity information is handed over before viewers can access the adult web
21st November 2018
See article from bbc.com
The website of an adult video game featuring sexualised animals has been hacked, with the information of nearly half a million subscribers stolen. High Tail Hall is a customisable role-playing game which features what the website describes as sexy furry characters, including buxom zebras and scantily clad lionesses. The compromised information, including email addresses, names and order histories, resurfaced on a popular hacking forum a few months later. HTH Studio has acknowledged the breach and says that it has been fixed. The company added:

Both our internal security and web team security assures us that no financial data was compromised. The security of our users is the highest priority.

It further recommended that all users change their passwords. So although credit card data is safe, users are still at risk of identity fraud, outing and blackmail. It is the latest in a long series of hacks aimed at adult sites, and demonstrates the dangers for UK porn viewers when they are forced to supply identity information to be able to browse the adult web.
20th November 2018
Once, the fight against pornography was the beating heart of the American culture war. Now porn is a ballooning industry with no real opponents. What happened? By Tim Alberta. See article from politico.com
Bangladesh High Court orders the censorship of all internet porn websites for 6 months
19th November 2018
See article from thedailystar.net
The Bangladesh High Court has ordered the country's government to block all pornography websites and the publication of all obscene materials on the internet for the next six months. The court also ordered the authorities concerned to explain in four weeks why pornography websites and the publication of obscene materials should not be declared illegal. The judges issued the orders in response to a writ petition filed by the Law and Life Foundation, which campaigns for internet censorship.
DCMS minister Margot James informs parliamentary committee of the schedule for the age verification internet porn censorship regime
15th November 2018
See article from data.parliament.uk
Age verification and adult internet censorship was discussed by the Commons Science and Technology Committee on 13th November 2018.

Carol Monaghan, Committee Member: The Digital Economy Act made it compulsory for commercial pornography sites to undertake age verification, but implementation has been subject to ongoing delays. When do we expect it to go live?

Margot James MP, Minister for Digital and the Creative Industries: We can expect it to be in force by Easter next year. I make that timetable in the knowledge that we have laid the necessary secondary legislation before Parliament. I am hopeful of getting a slot to debate it before Christmas, before the end of the year. We have always said that we will permit the industry three months to get up to speed with the practicalities and delivering the age verification that it will be required to deliver by law. We have also had to set up the regulator--well, not to set it up, but to establish with the British Board of Film Classification, which has been the regulator, exactly how it will work. It has had to consult on the methods of age verification, so it has taken longer than I would have liked, but I would balance that with a confidence that we have got it right.

Carol Monaghan: Are you confident that the commercial pornography companies are going to engage fully and will implement the law as you hope?

Margot James: I am certainly confident of the majority of large commercial pornography websites and platforms being compliant with the law. They have engaged well with the BBFC and the Department, and want to be on the right side of the law. I have confidence, but I am wary of being 100% confident, because there are always smaller and more underground platforms and sites that will seek ways around the law. At least, that is usually the case. We will be on the lookout for that, and so will the BBFC. But the vast majority of organisations have indicated that they are keen to comply with the legislation.

Carol Monaghan: One concern that we all have is that children can stumble across pornography. We know that on social media platforms, where children are often active, up to a third of the content can be pornographic, but they fall outside the age verification regulation because it is only a third and not the majority. Is that likely to undermine the law? Ultimately the law, as it stands, is there to safeguard our children.

Margot James: I acknowledge that that is a weakness in the legislative solution. I do not think that for many mainstream social media platforms as much as a third of their content is pornographic, but it is well known that certain social media platforms that many people use regularly have pornography freely available. We have decided to start with the commercial operations while we bring in the age verification techniques that have not been widely used to date. But we will keep a watching brief on how effective those age verification procedures turn out to be with commercial providers, and will keep a close eye on how social media platforms develop in terms of the extent of pornographic material, particularly if they are platforms that appeal to children--not all are. You point to a legitimate weakness, on which we have a close eye.
The Lords discuss when age verification internet censorship will start
13th November 2018
See article from theyworkforyou.com
Pornographic Websites: Age Verification -- Question, House of Lords, 5th November 2018.

Baroness Benjamin, Liberal Democrat: To ask Her Majesty's Government what will be the commencement date for their plans to ensure that age-verification to prevent children accessing pornographic websites is implemented by the British Board of Film Classification.

Lord Ashton of Hyde, The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport: My Lords, we are now in the final stages of the process, and we have laid the BBFC's draft guidance and the Online Pornography (Commercial Basis) Regulations before Parliament for approval. We will ensure that there is a sufficient period following parliamentary approval for the public and the industry to prepare for age verification. Once parliamentary proceedings have concluded, we will set a date by which commercial pornography websites will need to be compliant, following an implementation window. We expect that this date will be early in the new year.

Baroness Benjamin: I thank the Minister for his Answer. I cannot wait for that date to happen, but does he share my disgust and horror that social media companies such as Twitter state that their minimum age for membership is 13 yet make no attempt to restrict some of the most gross forms of pornography being exchanged via their platforms? Unfortunately, the Digital Economy Act does not affect these companies because they are not predominantly commercial porn publishers. Does he agree that the BBFC needs to develop mechanisms to evaluate the effectiveness of the legislation for restricting children's access to pornography via social media sites and put a stop to this unacceptable behaviour?

Lord Ashton of Hyde: My Lords, I agree that there are areas of concern on social media sites. As the noble Baroness rightly says, they are not covered by the Digital Economy Act. We had many hours of discussion about that in this House. However, she will be aware that we are producing an online harms White Paper in the winter in which some of these issues will be considered. If necessary, legislation will be brought forward to address these, and not only these but other harms too. I agree that the BBFC should find out about the effectiveness of the limited amount that age verification can do; it will commission research on that. Also, the Digital Economy Act itself made sure that the Secretary of State must review its effectiveness within 12 to 18 months.

Lord Griffiths of Burry Port, Opposition Whip (Lords), Shadow Spokesperson (Digital, Culture, Media and Sport), Shadow Spokesperson (Wales): My Lords, once again I find this issue raising a dynamic that we became familiar with in the only too recent past. The Government are to be congratulated on getting the Act on to the statute book and, indeed, on taking measures to identify a regulator as well as to indicate that secondary legislation will be brought forward to implement a number of the provisions of the Act. My worry is that, under one section of the Digital Economy Act, financial penalties can be imposed on those who infringe this need; the Government seem to have decided not to bring that provision into force at this time. I believe I can anticipate the Minister's answer but--in view of the little drama we had last week over fixed-odds betting machines--we would not want the Government, having won our applause in this way, to slip back into putting things off or modifying things away from the position that we had all agreed we wanted.

Lord Ashton of Hyde: My Lords, I completely understand where the noble Lord is coming from, but what he said is not quite right. The Digital Economy Act included a power that the Government could bring enforcement with financial penalties through a regulator. However, they decided--and this House decided--not to use that for the time being. For the moment, the regulator will act in a different way. But later on, if necessary, the Secretary of State could exercise that power. On timing and FOBTs, we thought carefully--as noble Lords can imagine--before we said that we expect the date will be early in the new year.

Lord Addington, Liberal Democrat: My Lords, does the Minister agree that good health and sex education might be a way to counter some of the damaging effects? Can the Government make sure that is in place as soon as possible, so that this strange fantasy world is made slightly more real?

Lord Ashton of Hyde: The noble Lord is of course right that age verification itself is not the only answer. It does not cover every possibility of getting on to a pornography site. However, it is the first attempt of its kind in the world, which is why not only we but many other countries are looking at it. I agree that sex education in schools is very important, and I believe it is being brought into the national curriculum already.

The Earl of Erroll, Crossbench: Why is there so much wriggle room in section 6 of the guidance from the DCMS to the AV regulator? The ISP blocking probably will not work, because everyone will just get out of it. If we bring this into disrepute then the good guys, who would like to comply, probably will not; they will not be able to do so economically. All that was covered in British Standard PAS 1296, which was developed over three years. It seems to have been totally ignored by the DCMS. You have spent an awful lot of time getting there, but you have not got there.

Lord Ashton of Hyde: One of the reasons this has taken so long is that it is complicated. We in the DCMS, and many others, not least in this House, have spent a long time discussing the best way of achieving this. I am not immediately familiar with exactly what section 6 says, but when the statutory instrument comes before this House--it is an affirmative one to be discussed--I will have the answer ready for the noble Earl.

Lord West of Spithead, Labour: My Lords, does the Minister not agree that the possession of a biometric card by the population would make the implementation of things such as this very much easier?

Lord Ashton of Hyde: In some ways it would, but there are problems with people who either do not want to or cannot have biometric cards.
Analysis of BBFC's Post-Consultation Guidance by the Open Rights Group
8th November 2018
See article from openrightsgroup.org (CC)
Following the conclusion of their consultation period, the BBFC have issued new age verification guidance that has been laid before Parliament.

Summary

The new code has some important improvements, notably the introduction of a voluntary scheme for privacy, close to or based on a GDPR Code of Conduct. This is a good idea, but should not be put in place as a voluntary arrangement. Companies may not want the attention of a regulator, or may simply wish to apply lower or different standards, and ignore it. It is unclear why, if the government now recognises that privacy protections like this are needed, it would leave the requirements as voluntary.

We are also concerned that the voluntary scheme may not be up and running before the AV requirement is put in place. Given that 25 million UK adults are expected to sign up to these products within a few months of launch, this would be very unhelpful.
Parliament should now:
- Ask the government why the privacy scheme is to be voluntary, if the risks of relying on general data protection law are now recognised;
- Ask for assurance from BBFC that the voluntary scheme will cover all of the major operators; and
- Ask for assurance from BBFC and DCMS that the voluntary privacy scheme will be up and running before obliging operators to put Age Verification measures in place.
The draft code can be found here .
Lack of Enforceability of Guidance The Digital Economy Act does not allow the BBFC to judge age verification tools by any standard other than whether or not they sufficiently verify age. We
asked that the BBFC persuade the DCMS that statutory requirements for privacy and security were required for age verification tools. The BBFC have clearly acknowledged privacy and security concerns with age
verification in their response. However, the BBFC indicate in their response that they have been working with the ICO and DCMS to create a
voluntary certification scheme for age verification providers: "This voluntary certification scheme will mean that age-verification providers may choose to be
independently audited by a third party and then certified by the Age-verification Regulator. The third party's audit will include an assessment of an age-verification solution's compliance with strict privacy and data security requirements."
The lack of a requirement for additional and specific privacy regulation in the Digital Economy Act is the cause of this voluntary approach. While the voluntary scheme described above is likely to be of some assistance in promoting better standards among age verification providers, the "strict privacy and data security requirements" which the voluntary scheme mentions are not a statutory requirement, leaving some consumers at greater risk
than others. Sensitive Personal Data The data handled by age verification systems is sensitive personal data. Age verification services must directly identify users in order to
accurately verify age. Users will be viewing pornographic content, and the data about what specific content a user views is highly personal and sensitive. This has potentially disastrous consequences for individuals and families if the data is lost,
leaked, or stolen. Following a hack affecting Ashley Madison -- a dating website for extramarital affairs -- a number of the site's users were driven to suicide as a result of the public exposure of their sexual
activities and interests. For the purposes of GDPR, data handled by age verification systems falls under the criteria for sensitive personal data, as it amounts to "data concerning a natural person's sex life or
sexual orientation". Scheduling Concerns It is of critical importance that any accreditation scheme for age verification providers, or GDPR code of conduct if one is
established, is in place and functional before enforcement of the age verification provisions in the Digital Economy Act commences. All of the major providers who are expected to dominate the age verification market should undergo their audit under the
scheme before consumers will be expected to use the tool. This is especially true when considering the fact that MindGeek have indicated their expectation that 20-25 million UK adults will sign up to their tool within the first few months of operation. A
voluntary accreditation scheme that begins enforcement after all these people have already signed up would be unhelpful. Consumers should be empowered to make informed decisions about the age verification tools that
they choose from the very first day of enforcement. No delays are acceptable if users are expected to rely upon the scheme to inform themselves about the safety of their data. If this cannot be achieved prior to the start of expected enforcement of the
DE Act's provisions, then the planned date for enforcement should be moved back to allow for the accreditation to be completed. Issues with Lack of Consumer Choice It is of vital
importance that consumers, if they must verify their age, are given a choice of age verification providers when visiting a site. This enables users to choose which provider they trust with their highly sensitive age verification data and prevents one
actor from dominating the market and thereby promoting detrimental practices with data. The BBFC also acknowledge the importance of this in their guidance, noting in 3.8: "Although not a requirement under section
14(1) the BBFC recommends that online commercial pornography services offer a choice of age-verification methods for the end-user". This does not go far enough to acknowledge the potential issues that may arise in
a fragmented market where pornographic sites are free to offer only a single tool if they desire. Without a statutory requirement for sites to offer all appropriate and available tools for age verification and log in
purposes, it is likely that a market will be established in which one or two tools dominate. Smaller sites will then be forced to adopt these dominant tools as well, to avoid friction with consumers who would otherwise be required to sign up to a new
provider. This kind of market for age verification tools will provide little room for a smaller provider with a greater commitment to privacy or security to survive and robs users of the ability to choose who they
trust with their data. We have already called for it to be made a statutory requirement that pornographic sites must offer a choice of providers to consumers who must age verify; however, this suggestion has not been taken up. We note that the BBFC has been working with the ICO and DCMS to produce a voluntary code of conduct. A potential alternative solution would be to ensure that a site is only considered compliant if it offers users a number of tools which have been accredited under the additional privacy and security requirements of the voluntary scheme. GDPR Codes of Conduct A GDPR "Code of
Conduct" is a mechanism for providing guidelines to organisations who process data in particular ways, and allows them to demonstrate compliance with the requirements of the GDPR. A code of conduct is voluntary,
but compliance is continually monitored by an appropriate body that is accredited by a supervisory authority. In this case, the "accredited body" would likely be the BBFC, and the "supervisory authority" would be the ICO. The code of
conduct allows for certifications, seals and marks which indicate clearly to consumers that a service or product complies with the code. Codes of conduct are expected to provide more specific guidance on exactly how
data may be processed or stored. In the case of age verification data, the code could contain stipulations on:
- Appropriate pseudonymisation of stored data;
- Data and metadata retention periods;
- Data minimisation recommendations;
- Appropriate security measures for data storage;
- Security breach notification procedures;
- Re-use of data for other purposes.
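To make those stipulations concrete, here is a minimal sketch in Python of what "pseudonymisation", "data minimisation" and "retention periods" could look like in an age verification provider's datastore. This is purely illustrative: the keyed-hash approach, the record fields and the 30-day window are assumptions for the example, not anything specified by the BBFC scheme or the GDPR.

```python
import hashlib
import hmac
import time

# Hypothetical illustration: replace the direct identifier with a keyed,
# non-reversible pseudonym, store only the minimum fields needed, and
# discard records once a fixed retention period has elapsed.

PSEUDONYM_KEY = b"secret-key-held-separately"   # assumed: kept outside the datastore
RETENTION_SECONDS = 30 * 24 * 60 * 60           # assumed 30-day retention window

def pseudonymise(user_id: str) -> str:
    """Turn a direct identifier into a pseudonym that cannot be reversed
    without the secret key (HMAC-SHA256 keyed hash)."""
    return hmac.new(PSEUDONYM_KEY, user_id.encode(), hashlib.sha256).hexdigest()

def make_record(user_id: str, verified: bool) -> dict:
    """Data minimisation: store only a pseudonym, the result and a timestamp.
    Deliberately no name, no document scan, no viewing history."""
    return {
        "subject": pseudonymise(user_id),
        "age_verified": verified,
        "created": time.time(),
    }

def expired(record: dict, now=None) -> bool:
    """True once the record has passed its retention period and must be deleted."""
    now = time.time() if now is None else now
    return now - record["created"] > RETENTION_SECONDS
```

A code of conduct would of course express these as auditable requirements rather than code, but the sketch shows how little data an AV provider actually needs to retain to prove that a check took place.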
The BBFC's proposed "voluntary standard" regime appears to be similar to a GDPR code of conduct, though it remains to be seen how specific the stipulations in the BBFC's standard are. A code of conduct would also
involve being entered into the ICO's public register of UK approved codes of conduct, and the EDPB's public register for all codes of conduct in the EU. Similarly, GDPR Recital 99 notes that "relevant
stakeholders, including data subjects" should be consulted during the drafting period of a code of conduct - a requirement which is not in place for the BBFC's voluntary scheme. It is possible that the BBFC have
opted to create this voluntary scheme for age verification providers rather than use a code of conduct, because they felt they may not meet the GDPR requirements to be considered as an appropriate body to monitor compliance. Compliance must be monitored
by a body that has demonstrated:
- Their expertise in relation to the subject-matter;
- They have established procedures to assess the ability of data processors to apply the code of conduct;
- They have the ability to deal with complaints about infringements; and
- Their tasks do not amount to a conflict of interest.
Parties Involved in the Code of Conduct Process As noted by GDPR Recital 99, a consultation should be a public process which involves stakeholders and data subjects, and their responses should
be taken into account during the drafting period: "When drawing up a code of conduct, or when amending or extending such a code, associations and other bodies representing categories of controllers or processors
should consult relevant stakeholders, including data subjects where feasible , and have regard to submissions received and views expressed in response to such consultations." The code of conduct must be approved
by a relevant supervisory authority (in this case the ICO). An accredited body (BBFC) that establishes a code of conduct and monitors compliance is able to establish their own structures and procedures under GDPR
Article 41 to handle complaints regarding infringements of the code, or regarding the way it has been implemented. BBFC would be liable for failures to regulate the code properly under Article 41(4),
[1] however DCMS appear to have accepted the principle that the government would need to protect BBFC from such
liabilities. [2] GDPR Codes of Conduct and Risk Management Below is
a table of risks created by age verification which we identified during the consultation process. For each risk, we have considered whether a GDPR code of conduct may help to mitigate the effects of it.
Risk | CoC Appropriate? | Details

User identity may be correlated with viewed content. | Partially | This risk can never be entirely mitigated if AV is to go ahead, but a CoC could contain very strict restrictions on what identifying data could be stored after a successful age verification.

Identity may be associated to an IP address, location or device. | No | It would be very difficult for a CoC to mitigate this risk, as the only safe mitigation would be not to collect user identity information.

An age verification provider could track users across all the websites its tool is offered on. | Yes | Strict rules could be put in place about what data an age verification provider may store, and what data it is forbidden from storing.

Users may be incentivised to consent to further processing of their data in exchange for rewards (content, discounts etc.). | Yes | Age verification tools could be expressly forbidden from offering anything in exchange for user consent.

Leaked data creates major risks for identified individuals and cannot be revoked or adequately compensated for. | Partially | A CoC can never fully mitigate this risk if any data is being collected, but it could contain strict prohibitions on storing certain information and specify retention periods after which data must be destroyed, which may mitigate the impacts of a data breach.

Risks to the user of access via shared computers if viewing history is stored alongside age verification data. | Yes | A CoC could specify that any accounts for pornographic websites which may track viewed content must be strictly separate and not in any visible way linked to a user's age verification account or data that confirms their identity.

Age verification systems are likely to trade off convenience for security (no 2FA, auto-login, etc.). | Yes | A CoC could stipulate that login cookies that "remember" a returning user must only persist for a short time period, and should recommend or enforce two-factor authentication.

The need to re-login to age verification services to access pornography in "private browsing" mode may lead people to avoid using this feature and generate much more data which is then stored. | No | A CoC cannot fix this issue. Private browsing by nature will not store any login cookies or other objects and will require the user to re-authenticate with age verification providers every time they wish to view adult content.

Users may turn to alternative tools to avoid age verification, which carry their own security risks (especially "free" VPN services or peer-to-peer networks). | No | Many UK adults, although over 18, will be uncomfortable with the need to submit identity documents to verify their age and will seek alternative means to access content. It is unlikely that many of these individuals will be persuaded by an accreditation under a GDPR code.

Age verification login details may be traded and shared among teenagers or younger children, which could lead to bullying or "outing" if such details are linked to viewed content. | Yes | Strict rules could be put in place about what data an age verification provider may store, and what data it is forbidden from storing.

Child abusers could use their access to age verified content as an adult as leverage to create and exploit relationships with children and teenagers seeking access to such content (grooming). | No | This risk will exist as long as age verification is providing a successful barrier to accessing such content for under-18s who wish to do so.

The sensitivity of content dealt with by age verification services means that users who fall victim to phishing scams or fraud have a lower propensity to report it to the relevant authorities. | Partially | A CoC or education campaign may help consumers identify trustworthy services, but it cannot fix the core issue, which is that users are being socialised into it being "normal" to input their identity details into websites in exchange for pornography. Phishing scams resulting from age verification will appear and will be common, and the sensitivity of the content involved is a disincentive to reporting it.

The use of credit cards as an age verification mechanism creates an opportunity for fraudulent sites to engage in credit card theft. | No | Phishing and fraud will be common. A code of conduct which lists compliant sites and tools externally on the ICO website may be useful, but a phishing site may simply pretend to be another (compliant) tool, or rely on the fact that users are unlikely to check with the ICO every time they wish to view pornographic content.

The rush to get age verification tools to market means they may take significant shortcuts when it comes to privacy and security. | Yes | A CoC could assist in solving this issue if tools are given time to be assessed for compliance before the age verification regime commences.

A single age verification provider may come to dominate the market, leaving users little choice but to accept whatever terms the provider offers. | Partially | Practically, a CoC could mitigate some of the effects of an age verification tool monopoly if the dominant tool is accredited under the Code. However, this relies on users being empowered to demand compliance with a CoC, and it is possible that users will instead be left with a "take it or leave it" situation where the dominant tool is not CoC accredited.

Allowing pornography "monopolies" such as MindGeek to operate age verification tools is a conflict of interest. | Partially | As the BBFC note in their consultation response, it would not be reasonable to prohibit a pornographic content provider from running an age verification service as it would prevent any site from running their own tool. However, under a CoC it is possible that a degree of separation could be enforced that requires age verification tools to adhere to strict rules about the use of data, which could mitigate the effects of a large pornographic content provider attempting to collect as much user data as possible for their own business purposes.
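Several of the risks above (cross-site tracking, shared computers, leaked viewing history) come down to one design question: can a site learn a visitor is over 18 without learning who they are? One privacy-preserving pattern is a short-lived signed token that asserts only "over 18" plus an expiry. The sketch below is an illustrative design under assumed parameters (signing key, 15-minute lifetime, token format), not how any actual UK age verification provider works:

```python
import base64
import hashlib
import hmac
import json
import time

# Illustrative sketch: after verifying age out-of-band, the provider issues
# a signed token containing only an "over 18" claim and an expiry time.
# The token carries no identity and no viewing history, so a site that
# accepts it learns nothing about who the visitor is.

SIGNING_KEY = b"provider-signing-key"   # assumed: held only by the AV provider
TOKEN_LIFETIME = 15 * 60                # assumed 15-minute validity

def issue_token(now=None) -> str:
    """Mint a token: base64(claims) + '.' + HMAC-SHA256 signature."""
    now = time.time() if now is None else now
    claims = json.dumps({"over_18": True, "exp": now + TOKEN_LIFETIME})
    sig = hmac.new(SIGNING_KEY, claims.encode(), hashlib.sha256).hexdigest()
    return base64.urlsafe_b64encode(claims.encode()).decode() + "." + sig

def verify_token(token: str, now=None) -> bool:
    """Check the signature and expiry; reject anything malformed or tampered."""
    now = time.time() if now is None else now
    try:
        payload_b64, sig = token.rsplit(".", 1)
        claims_raw = base64.urlsafe_b64decode(payload_b64.encode())
    except Exception:
        return False
    expected = hmac.new(SIGNING_KEY, claims_raw, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False
    claims = json.loads(claims_raw)
    return claims.get("over_18") is True and claims["exp"] > now
```

Because the token holds no identifier, two visits by the same person are unlinkable to the accepting site, which directly addresses the correlation and tracking rows of the table; the trade-off is that the provider itself still knows who it verified.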
[1] "Infringements of the following provisions shall, in accordance with paragraph 2, be subject to
administrative fines up to 10 000 000 EUR, or in the case of an undertaking, up to 2 % of the total worldwide annual turnover of the preceding financial year, whichever is higher: the obligations of the monitoring body pursuant to Article 41(4)."
[2] "contingent liability will provide indemnity to the British Board of Film Classification
(BBFC) against legal proceedings brought against the BBFC in its role as the age verification regulator for online pornography." |
|
Parliamentary committee of feminists calls for the censorship of pornography for adults
|
|
|
| 22nd October 2018
|
|
| See article from publications.parliament.uk
|
A Parliamentary committee of feminists has issued a document mainly on the subject of harassment according to their definitions. The Women and Equalities Committee have published a document titled: Sexual Harassment of Women and Girls in Public
Places. It contains a section calling for the censorship of pornography for adults: Pornography 92. There is specific concern about the role of pornography in contributing to harmful attitudes to
women and girls and providing a context in which sexual harassment takes place, and that it is increasingly being used by young people as a source of sex education, with negative consequences. One man who participated in our focus groups said, "I
think the problem is that not only has [pornography] become normalised, it is also considered acceptable, even expected." This was worrying, as the research also showed that men in particular--who are far more likely to be regular users of
pornography than women --believed that pornography was harmful because it engendered unrealistic expectations of sex. 93. Our research did not find a strong relationship between attitudes towards pornography and attitudes towards
sexual harassment, although it did suggest some clear trends that need exploring in further research. For example, people who find legal pornography acceptable are generally more likely to find sexual harassment acceptable than people who find legal
pornography unacceptable. However, our research asked about attitudes rather than behaviours (for example, use of pornography or sexual harassment perpetration), and research both internationally and in the UK suggests that there is a relationship
between the consumption of pornography and sexist attitudes and sexually aggressive behaviours, including violence. We asked Dr Maddy Coy whether there is a link between men viewing pornography and the likelihood of them sexually harassing women and
girls. Dr Coy told us: There is a meta-analysis of research that shows that. It was pornography consumption associated with higher levels of attitudes that support violence, which includes things like acceptance of violence, rape
myth acceptance and sexual harassment, yes. [ ... ] The basis of some of those studies can be critiqued [ ... ] but the findings are consistent across individual studies and the meta-analysis that pulled them together that there is a relationship between
pornography consumption, attitudes that support sexual violence and likelihood of committing sexual violence. 94. The BBFC told us that it knows through its work with charities that children report that exposure to pornography,
much of which is accidental, is impacting on their attitudes and their behaviours. A rapid evidence assessment for the Children's Commissioner for England in 2016 found that children's exposure to pornography was linked to unrealistic attitudes about
sex, belief that women are sex objects and less progressive gender role attitudes. 95. One woman told us that the Government should recognise pornography, sexism and objectification as a public health risk and use the media to
inform society of the harms associated with them: "This could be done in the same way the amazing effort by the Government worked in turning people's attitudes around regarding smoking." Our research suggested that, whilst men may believe that
pornography can be harmful, this does not necessarily lead them to think it is socially unacceptable. This has implications for how the Government develops policy to tackle the harms associated with pornography; focusing messaging solely on harms may not
be the most effective approach with men and boys. More research is needed to develop policies that address these issues. 96. The Government is not consistent in its understanding of the research suggesting a relationship between
pornography and sexually harmful behaviour. On the one hand, in a range of ways government policies and media regulation already assume that some media content is sexually harmful. For example, in introducing the new policy of age verification for online
pornography the Government says: "We will help make sure children aren't exposed to harmful sexualised content online by requiring age verification for access to commercial sites containing pornographic material." The Minister told us that she
very much hoped that the policy would have an impact on attitudes towards women and sexual harassment. The draft consultation on the new statutory guidance on Relationships Education, Relationships and Sex Education and Health Education states that: "Some
pupils are [ ... ] exposed to harmful behaviours online, and via other forms of media, which may normalise violent sexual behaviours." Chief Executive David Austin told us that, as a regulator, the BBFC takes into account research evidence about the
effect of men viewing violent pornography when determining classifications: For example, we will not classify depictions of pornography that feature real or simulated lack of consent, encourage an interest in abusive
relationships, such as sex with children or incest, that kind of content. We definitely take that into account. The Government also restricts adults' access to hard copy pornographic films to licensed sex shops and licensed
cinemas. It is therefore clear that government policy and media regulation is already based on an understanding that pornographic content can be harmful. 97. It is odd, therefore, that the Government's written evidence to us
expressed doubt about the strength of research suggesting a relationship between the consumption of pornography and sexually harmful behaviours. It stated that "there is currently limited evidence to suggest a link between the consumption of
pornography and sexual violence". The Minister for Women told us that she was commissioning research on the impact of online pornography on attitudes towards women and girls, saying that: We have to be careful about the
research, which is why I have commissioned this research over and above everything that has gone before. We have to acknowledge the fact that the Crime Survey for England & Wales has shown a reduction in sexual violence since 2004--05, while online
pornography has exploded exponentially. I have to bear that in mind in terms of what we are doing, which is why I want thorough research looking not just at gang criminality, frankly, but also at how this affects people forming healthy relationships in
adult life. [ ... ] I know the Children's Commissioner did some research in 2014 that showed some evidence, but I do not think it could be described as being unequivocal in the links between these things. I would like to be entirely clear on that.
98. The Government's approach to pornography is not consistent. It restricts adults' access to offline pornography to licensed premises and is introducing age verification of commercial pornography online to prevent children's
exposure to it. But the Government has no plans to address adult men's use of mainstream online pornography, despite research suggesting that men who use pornography are more likely to hold sexist attitudes and be sexually aggressive towards women.
99. There are examples of lawful behaviours which the Government recognises as harmful, such as smoking, which are addressed through public health campaigns and huge investment designed to reduce and prevent those harms. The
Government should take a similar, evidence-based approach to addressing the harms of pornography. 100. The BBFC, the regulator for age verification, believes that, as a result of the new policy, "accidental stumbling across
commercial pornography by children online will largely become a thing of the past." However, writer and commentator Melanie Phillips told us she was more sceptical about pornography websites abiding by the new law because the "commercial
impulse is so enormous." Furthermore, pornography accessed through social media is not part of the new regime, because it does not come within the definition of 'commercial pornography' under the draft regulations published in 2017, though not
consulted upon. As pornography is also accessed through social media, this gap could undermine the effectiveness of the policy. 101. The definition of 'commercial pornography services' for the Government's policy on age
verification of pornography websites should be amended to include social media, to ensure that this policy is as effective and comprehensive as possible. 102. BBFC classification guidelines address content related to
discrimination: "Potentially offensive content relating to matters such as race, gender, religion, disability or sexuality may arise in a wide range of works, and the classification decision will take account of the strength or impact of their
inclusion." The BBFC told us that preliminary research to inform new classification guidelines suggests increased public concern about sexual violence. We believe that the new guidelines provide an opportunity to be clearer about normalised sexism
as discrimination, and to name sexual harassment as a form of sexual violence in order to be clear about the regulation of its depiction. 103. British Board of Film Classification policies and guidelines should be explicit about
categorising normalised sexism as discrimination. The policies and guidelines should name sexual harassment as a form of sexual violence in order to be clearer about regulation of its depiction. |
|
|
|
|
| 21st October 2018
|
|
|
The government makes changes such that image hosting sites, not identifying as porn sites, do not need age verification for porn images they carry See
article from theguardian.com |
|
MoneySupermarket survey finds that 25% of customers will take action if their porn is blocked
|
|
|
| 16th October 2018
|
|
| See article from moneysupermarket.com |
In a survey more about net neutrality than porn censorship, MoneySupermarket noted: We conducted a survey of over 2,000 Brits on this and it seems that if an ISP decided to block sites, it could result in increasing
numbers of Brits switching - 64 per cent of Brits would be likely to switch ISP if they put blocks in place. In reality, this means millions could be considering a switch, with nearly six million having tried to access a site that was
blocked in the last week - nearly one in 10 across the country. It's an issue even more pertinent for those aged 18 to 34, with nearly half (45 per cent) having tried to access a site that was blocked at some point.
While ISPs might block sites for various reasons, a quarter of Brits said they would switch ISP if they were blocked from viewing adult sites - with those living with partners the most likely to do so!
Now switching ISPs isn't going to help much if the BBFC, the government-appointed porn censor, has dictated that all ISPs block porn sites. But maybe these 25% of internet users will take up alternatives such as subscribing to a VPN service.
|
|
|
|
|
|
16th October 2018
|
|
|
Race, porn, and education: will the UK's 2020 sex education update teach people to be PC about their choice of porn? See
article from opendemocracy.net |
|
The Government picks up the tab for legal liabilities arising from the BBFC being sued over age verification issues
|
|
|
| 12th
October 2018
|
|
| See article from theyworkforyou.com |
As far as I can see, if a porn website verifies your age with personal data, it will probably also require you to tick a consent box with a whole load of small print that nobody ever reads. Now if that small print lets it forward all personal data, coupled with porn viewing data, to the Kremlin's dirty tricks and blackmail department, then that's OK with the Government's age verification law. So for sure some porn viewers are going to get burnt because of what the government has legislated and
because of what the BBFC have implemented. So perhaps it is not surprising that the BBFC has asked the government to pick up the tab should the BBFC be sued by people harmed by their decisions. After all it was the government who set up the unsafe
environment, not the BBFC. Margot James The Minister of State, Department for Culture, Media and Sport announced in Parliament: I am today laying a Departmental Minute to advise that the Department for Digital, Culture,
Media and Sport (DCMS) has received approval from Her Majesty's Treasury (HMT) to recognise a new Contingent Liability which will come into effect when age verification powers under Part 3 of the Digital Economy Act 2017 enter force.
The contingent liability will provide indemnity to the British Board of Film Classification (BBFC) against legal proceedings brought against the BBFC in its role as the age verification regulator for online pornography.
As you know, the Digital Economy Act introduces the requirement for commercial providers of online pornography to have robust age verification controls to protect children and young people under 18 from exposure to online pornography.
As the designated age verification regulator, the BBFC will have extensive powers to take enforcement action against non-compliant sites. The BBFC can issue civil proceedings, give notice to payment-service providers or ancillary service providers, or
direct internet service providers to block access to websites where a provider of online pornography remains non-compliant. The BBFC expects a high level of voluntary compliance by providers of online pornography. To encourage
compliance, the BBFC has engaged with industry, charities and undertaken a public consultation on its regulatory approach. Furthermore, the BBFC will ensure that it takes a proportionate approach to enforcement and will maintain arrangements for an
appeals process to be overseen by an independent appeals body. This will help reduce the risk of potential legal action against the BBFC. However, despite the effective work with industry, charities and the public to promote and
encourage compliance, this is a new law and there nevertheless remains a risk that the BBFC will be exposed to legal challenge on the basis of decisions taken as the age verification regulator or on grounds of principle from those opposed to the policy.
As this is a new policy, it is not possible to quantify accurately the value of such risks. The Government estimates a realistic risk range to be between £1m and £10m in the first year, based on the likely number and scale of legal
challenges. The BBFC investigated options to procure commercial insurance but failed to do so given difficulties in accurately determining the size of potential risks. The Government therefore will ensure that the BBFC is protected against any legal
action brought against the BBFC as a result of carrying out duties as the age verification regulator. The Contingent Liability is required to be in place for the duration of the period the BBFC remain the age verification
regulator. However, we expect the likelihood of the Contingent Liability being called upon to diminish over time as the regime settles in and relevant industries become accustomed to it. If the liability is called upon, provision for any payment will be
sought through the normal Supply procedure. It is usual to allow a period of 14 Sitting Days prior to accepting a Contingent Liability, to provide Members of Parliament an opportunity to raise any objections. |
|
The BBFC launches a new website
|
|
|
|
11th October 2018
|
|
| See article from ageverificationregulator.com |
There's loads of new information today about the upcoming internet porn censorship regime to be coordinated by the BBFC. The BBFC has launched a new website, ageverificationregulator.com
, perhaps to distance itself a bit from its film censorship work. The BBFC has made a few changes to its approach since the rather ropey document published prior to the BBFC's public consultation. In general the BBFC seems a little more
pragmatic about trying to get adult porn users to buy into the age verification way of thinking. The BBFC seems supportive of the anonymously bought porn access card from the local store, and has taken a strong stance against age verification providers
who reprehensibly want to record people's porn browsing, claiming a need to provide an audit trail. The BBFC has also decided to offer a service to certify age verification providers in the way that they protect people's data. This is again
probably targeted at making adult porn users a bit more confident in handing over ID. The BBFC's tone is a little more acknowledging of people's privacy concerns, but it is the government's law, implemented by the BBFC, that allows the recipients of the data to use it more or less as they like. Once you tick the 'take it or leave it' consent box allowing the AV provider 'to make your user experience better', they can do what they like with your data (although GDPR does kindly let
you later withdraw that consent and see what they have got on you). Another theme that runs through the site is a rather ironic acceptance that, for all the devastation that will befall the UK porn industry, for all the lives ruined by people having their porn viewing outed, and for all the lives ruined by fraud and identity theft, the regime is somehow only about stopping young children 'stumbling on porn'... because the older, more determined, children will still know how to find it anyway. So the BBFC has laid out its stall, and it's a little more conciliatory to porn users, but I for one will never hand over any ID data to anyone connected with servicing porn websites. I suspect that many others will feel the same. If you can't trust the biggest companies in the business with your data, what hope is there for anyone else? There's no word yet on when all this will come into force, but the schedule seems to be 3 months after the BBFC scheme has been approved by
Parliament. This approval seems scheduled to be debated in Parliament in early November, eg on 5th November there will be a House of Lords session: Implementation by the British Board of Film Classification of age-verification to prevent children accessing pornographic websites -- Baroness Benjamin, Oral questions
So the earliest it could come into force is about mid February. |
|
BBFC publishes its sometimes bizarre Guidance on Age-verification Arrangements
|
|
|
| 11th October 2018
|
|
| See article [pdf] from
ageverificationregulator.com |
The BBFC has published its Age Verification Guidance document that will underpin the implementation of internet porn censorship in the UK. Perhaps a key section is: 5. The criteria against which the BBFC
will assess that an age-verification arrangement meets the requirement under section 14(1) to secure that pornographic material is not normally accessible by those under 18 are set out below: a. an effective control
mechanism at the point of registration or access to pornographic content by the end-user which verifies that the user is aged 18 or over at the point of registration or access b. use of age-verification data that cannot be
reasonably known by another person, without theft or fraudulent use of data or identification documents nor readily obtained or predicted by another person c. a requirement that either a user age-verify each visit or access is
restricted by controls, manual or electronic, such as, but not limited to, password or personal identification numbers. A consumer must be logged out by default unless they positively opt-in for their log in information to be remembered
d. the inclusion of measures which authenticate age-verification data and measures which are effective at preventing use by non-human operators including algorithms
It is fascinating to wonder why the BBFC feels that bots need to be banned; perhaps they need to be 18 years old too before they can access porn. I am not sure that porn sites will appreciate Googlebot being banned from their sites. I love the idea that the word 'algorithms' has been elevated
to some sort of living entity. It all smacks of being written by people who don't know what they are talking about. In a quick read I thought the following paragraph was important: 9. In the interests of
data minimisation and data protection, the BBFC does not require that age-verification arrangements maintain data for the purposes of providing an audit trail in order to meet the requirements of the act.
It rather suggests that the
BBFC pragmatically accepts that convenience and buy-in from porn users is more important than making life dangerous for everybody, just in case a few teenagers get hold of an access code.
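The guidance's criterion (c) -- logged out by default, with re-verification unless the user positively opts in to be remembered -- can be sketched in a few lines. This is purely my own illustration of the rule as worded; the class and constant names (AVSession, REMEMBER_TTL) are invented, and no real age-verification provider is being described.

```python
from dataclasses import dataclass
import time

# Assumed opt-in validity window of 30 days; the guidance itself
# does not specify a duration.
REMEMBER_TTL = 30 * 24 * 3600

@dataclass
class AVSession:
    verified_at: float          # time the user last passed age verification
    remember_me: bool = False   # must be an explicit, positive opt-in

    def is_valid(self, now: float) -> bool:
        # Logged out by default: without the opt-in, every new visit
        # requires the user to age-verify again.
        if not self.remember_me:
            return False
        return (now - self.verified_at) < REMEMBER_TTL

session = AVSession(verified_at=time.time())
print(session.is_valid(time.time()))   # False: no opt-in, must re-verify
session.remember_me = True
print(session.is_valid(time.time()))   # True: remembered within the window
```

The point of the sketch is that the default state carries no persistent login; remembering a user is an exception the user must choose, not a convenience the provider may silently apply.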
|
|
BBFC publishes its summary of the consultation responses
|
|
|
| 11th October 2018
|
|
| See BBFC summary of responses [pdf] from
ageverificationregulator.com See Consultation Responses [pdf] from ageverificationregulator.com
|
BBFC Executive Summary The British Board of Film Classification was designated as the age-verification regulator under Part 3 of the Digital Economy Act on 21 February 2018. The BBFC launched its consultation on the
draft Guidance on Age-verification Arrangements and draft Guidance on Ancillary Service Providers on 26 March 2018. The consultation was available on the BBFC's website and asked for comments on the technical aspects on how the BBFC intends to approach
its role and functions as the age-verification regulator. The consultation ran for 4 weeks and closed on 23 April 2018, although late submissions were accepted until 8 May 2018. There were a total of 624 responses to the
consultation. The vast majority of those (584) were submitted by individuals, with 40 submitted by organisations. 623 responses were received via email, and one was received by post. Where express consent has been given for their publication, the BBFC
has published responses in a separate document. Response summaries from key stakeholders are in part 4 of this document. Responses from stakeholders such as children's charities, age-verification providers and internet service
providers were broadly supportive of the BBFC's approach and age-verification standards. Some responses from these groups asked for clarification to some points. The BBFC has made a number of amendments to the guidance as a result. These are outlined in
chapter 2 of this document. Responses to questions raised are covered in chapter 3 of this document. A significant number of responses, particularly from individuals and campaign groups, raised concerns about the introduction of
age-verification, and set out objections to the legislation and regulatory regime in principle. Issues included infringement of freedom of expression, censorship, problematic enforcement powers and an unmanageable scale of operation. The government's
consultation on age-verification in 2016 addressed many of these issues of principle. More information about why age-verification has been introduced, and the considerations given to the regulatory framework and enforcement powers can be found in the
2016 consultation response by the Department for Digital, Culture, Media and Sport.
|
|
Channel 4 commissions a negative sounding documentary series
|
|
|
| 5th October 2018
|
|
| See publicity from channel4.com
|
Generation Porn , a landmark Channel 4 documentary trilogy from the award-winning producers at Story Films, will explore the influence of the modern internet porn epidemic through the people who watch it, star in it and control it.
The cyber porn industry has exploded. Online giant Porn Hub reported that in 2017 it catered to 28.5 billion visitors - or 800 searches per second. In fact, online porn sites create more internet traffic than Netflix and Twitter
combined. How did we get here? And what is the impact of porn on our attitudes and relationships? This bold and timely series weaves present tense narratives from across the world showing the growing impact
that free, easy-to-access porn is having on the lives of adults and kids. As well as those who watch porn, we'll meet those who work in the industry and learn how the new porn world is rapidly changing what producers demand from
porn stars. The series will also meet those trying to cash in on the porn Gold Rush and the on-line giants dominating an industry said to be worth $100bn globally. Series Director, Philippa Robinson said:
Through the characters we meet, the juxtaposition of stories and the key exchanges Generation Porn will illustrate how the porn industry exploded and the consequences for all of us.
|
|
New rules for AudioVisual Media Services approved by Parliament
|
|
|
| 3rd October 2018
|
|
| See press release from europarl.europa.eu
See censorship law text [pdf] from europarl.europa.eu |
New rules on audiovisual media services will apply to broadcasters, and also to video-on-demand and video-sharing platforms. MEPs voted on updated rules on audiovisual media services covering child protection, stricter rules on advertising, and a requirement for 30% European content in video-on-demand. Following the final vote on this agreement, the revised legislation will apply to broadcasters, but also to video-on-demand and video-sharing platforms,
such as Netflix, YouTube or Facebook, as well as to live streaming on video-sharing platforms. The updated rules will ensure:
Audiovisual media services providers should have appropriate measures to combat content inciting violence, hatred and terrorism, while gratuitous violence and pornography will be subject to the strictest rules. Video-sharing platforms
will now be responsible for reacting quickly when content is reported or flagged by users as harmful. The legislation does not include any automatic filtering of uploaded content, but, at the request of the Parliament, platforms
need to create a transparent, easy-to-use and effective mechanism to allow users to report or flag content. The new law includes strict rules on advertising, product placement in children's TV programmes and content available on
video-on-demand platforms. EP negotiators also secured a personal data protection mechanism for children, imposing measures to ensure that data collected by audiovisual media providers are not processed for commercial use, including for profiling and
behaviourally targeted advertising.
Under the new rules, advertising can take up a maximum of 20% of the daily broadcasting period between 6:00 and 18:00, giving broadcasters the flexibility to adjust their advertising periods. A prime-time window between 18:00 and
0:00 was also set out, during which advertising will only be allowed to take up a maximum of 20% of broadcasting time.
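The two caps described above are simple percentages of each window, so the maximum advertising time is easy to verify. The function below is my own back-of-envelope check, not anything from the Directive's text:

```python
def max_ad_minutes(window_hours: int, cap_percent: int = 20) -> float:
    """Maximum advertising minutes allowed in a broadcasting window,
    given the AVMS cap of 20% of the window's duration."""
    return window_hours * 60 * cap_percent / 100

# Daytime window 06:00-18:00 is 12 hours; prime time 18:00-24:00 is 6 hours.
print(max_ad_minutes(12))   # 144.0 minutes of advertising per daytime window
print(max_ad_minutes(6))    # 72.0 minutes of advertising in prime time
```

So a broadcaster may shift advertising freely within each window, but the daily daytime total cannot exceed 144 minutes, and the prime-time total cannot exceed 72 minutes.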
In order to support the cultural diversity of the European audiovisual sector, MEPs ensured that 30% of content in the video-on-demand platforms' catalogues should be European. Video-on-demand platforms are
also asked to contribute to the development of European audiovisual productions, either by investing directly in content or by contributing to national funds. The level of contribution in each country should be proportional to their on-demand revenues in
that country (member states where they are established or member states where they target the audience wholly or mostly). The legislation also includes provisions regarding accessibility, integrity of a broadcaster's signal,
strengthening regulatory authorities and promoting media competences. Next steps The deal still needs to be formally approved by the Council of EU ministers before the revised law can enter into
force. Member States have 21 months after its entry into force to transpose the new rules into national legislation. The text was adopted by 452 votes to 132, with 65 abstentions. Article 6a
A new section has been added to the AVMS rules re censorship
Member States shall take appropriate measures to ensure that audiovisual media services provided by media service providers under their jurisdiction which may impair the physical, mental or moral development of minors are only
made available in such a way as to ensure that minors will not normally hear or see them. Such measures may include selecting the time of the broadcast, age verification tools or other technical measures. They shall be proportionate to the potential harm
of the programme. The most harmful content, such as gratuitous violence and pornography, shall be subject to the strictest measures. Personal data of minors collected or otherwise generated by media service
providers pursuant to paragraph 1 shall not be processed for commercial purposes, such as direct marketing, profiling and behaviourally targeted advertising. Member States shall ensure that media service providers
provide sufficient information to viewers about content which may impair the physical, mental or moral development of minors. For this purpose, media service providers shall use a system describing the potentially harmful nature of the content of an
audiovisual media service. For the implementation of this paragraph, Member States shall encourage the use of co-regulation as provided for in Article 4a(1). The Commission shall encourage media service providers to exchange best practices on co-regulatory codes of conduct. Member States and the Commission may foster self-regulation, for the purposes of this Article, through Union codes of conduct as referred to in Article 4a(2).
Article 4a suggests possible organisation of the censors assigned to the task, eg state censors, state-controlled organisations such as Ofcom, or nominally state-controlled co-regulators like the defunct ATVOD. Article 4a(3) notes that
censorial countries like the UK are free to add further censorship rules of their own: Member States shall remain free to require media service providers under their jurisdiction to comply with more detailed or
stricter rules in compliance with this Directive and Union law, including where their national independent regulatory authorities or bodies conclude that any code of conduct or parts thereof have proven not to be sufficiently effective. Member States shall report such rules to the Commission without undue delay.
|
|
|