The continuing, dangerous campaign to force ALL people to hand over sensitive ID details to porn sites in the name of protecting children from handing over sensitive ID details.
3rd September 2022
See article from ico.org.uk
The UK's data protection censors at the Information Commissioner's Office (ICO) have generated a disgracefully onerous red-tape nightmare called the Age Appropriate Design Code, which requires any internet service that provides any sort of grown-up content to evaluate the age of all users so that under-18s can be protected from handing over sensitive ID data. Of course the age checking usually requires all users to hand over lots of sensitive and dangerous ID data to any website that asks. Now the ICO has decided to apply these requirements to porn sites, given that they are often accessed by under-18s. The ICO writes:

Next steps

We will continue to evolve our approach, listening to others to ensure the code is having the maximum impact. For example, we have seen an increasing amount of research (from the NSPCC, 5Rights, Microsoft and the British Board of Film Classification) showing that children are likely to be accessing adult-only services and that these pose data protection harms, with children losing control of their data or being manipulated to give more data, in addition to content harms. We have therefore revised our position to clarify that adult-only services are in scope of the Children's code if they are likely to be accessed by children.

As well as engaging with adult-only services directly to ensure they conform with the code, we will also be working closely with Ofcom and the Department for Digital, Culture, Media and Sport (DCMS) to establish how the code works in practice in relation to adult-only services and what they should expect. This work is continuing to drive the improvements necessary to provide a better internet for children.
9th February 2022
The House of Lords asks whether the new Information Commissioner will enforce ID/age verification for porn viewing. See article from hansard.parliament.uk
2nd September 2021
Britain tamed Big Tech and nobody noticed. The Age Appropriate Design Code has caused huge global changes, not that tech platforms want to admit it. See article from wired.co.uk
27th August 2021
The trade group for age verification companies is clearly campaigning for its own commercial interests, but it does lay out the practical vagaries of ICO's Age Appropriate Design Code. See article from techmonitor.ai
Facebook and Instagram announce far-reaching changes ready for the start of the UK's Age Appropriate Design Code
27th July 2021
See article from about.instagram.com
See article from about.fb.com
The data protection censors at the Information Commissioner's Office have got into the internet censorship game with a new regime that starts on 2nd September 2021. Its Age Appropriate Design Code very much requires an age-gated internet in the name of data protection for children. The code itself is not law, but the ICO claims that it is an interpretation of the EU's GDPR (General Data Protection Regulation) law and so carries legal weight. The code requires that website users hand over their personal data to anyone that asks, to verify that they are of sufficient age to hand over their personal data. All in the name of preventing children from handing over their personal data. And the most immediate impact is that social media websites need to ensure that their users are over the age of 13 before the internet companies can make hay with their personal data. And in preparation for the new rules Facebook and Instagram have posted substantial blogs laying out new policies on age verification.

Facebook summarised:

Facebook and Instagram weren't designed for people under the age of 13, so we're creating new ways to stop those who are underage from signing up. We're developing AI to find and remove underaged accounts, and new solutions to verify people's ages. We're also building new experiences designed specifically for those under 13. See full article from about.fb.com

Instagram added:

Creating an experience on Instagram that's safe and private for young people, but also fun, comes with competing challenges. We want them to easily make new friends and keep up with their family, but we don't want them to deal with unwanted DMs or comments from strangers. We think private accounts are the right choice for young people, but we recognize some young creators might want to have public accounts to build a following. We want to strike the right balance of giving young people all the things they love about Instagram while also keeping them safe. That's why we're announcing changes we'll make today, including:

- Defaulting young people into private accounts.
- Making it harder for potentially suspicious accounts to find young people.
- Limiting the options advertisers have to reach young people with ads.

See full article from about.instagram.com
Facebook is creating an Instagram for kids
19th March 2021
See article from buzzfeednews.com
Facebook is planning to build a version of the popular photo-sharing app Instagram that can be used by children under the age of 13, according to an internal company post obtained by BuzzFeed News. Vishal Shah, Instagram's vice president of product,
wrote on an employee message board: I'm excited to announce that going forward, we have identified youth work as a priority for Instagram and have added it to our H1 priority list. We will be building a new youth pillar within the
Community Product Group to focus on two things:
- (a) accelerating our integrity and privacy work to ensure the safest possible experience for teens and
- (b) building a version of Instagram that allows people under the age of 13 to safely use Instagram for the first time.
Instagram currently 'forbids' children under the age of 13 from using the service, but it is widely used by children anyway. Maybe this announcement ties in with the UK's requirement for age appropriate data sharing that comes into
force in September 2021.
ICO warns internet companies of the impending, impossible-to-comply-with Age Appropriate Design Code
7th March 2021
See article from ico.org.uk
A survey by the Information Commissioner's Office (ICO) shows that three quarters of businesses surveyed are aware of the impending Children's Code. The full findings will be published in May but initial analysis shows businesses are still in the
preparation stages. And with just six months to go until the code comes into force, the ICO is urging organisations and businesses to make the necessary but onerous changes to their online services and products. The Children's Code sets out 15
standards organisations must meet to ensure that children's data is protected online. The code will apply to all the major online services used by children in the UK and includes measures such as providing default settings which ensure that children have
access to online services whilst minimising data collection and use. Details of the code were first published in June 2018 and UK Parliament approved it last year. Since then, the ICO has been providing support and advice to help organisations
adapt their online services and products in line with data protection law.
15th September 2020
A good summary of some of the unexpected consequences of internet censorship that will arise from ICO's Age Appropriate Design Code. See article from parentzone.org.uk
The ICO publishes its impossible-to-comply-with, business-suffocating Age Appropriate Design Code, with a 12 month implementation period running until 2nd September 2021
12th August 2020
See press release from ico.org.uk
See Age Appropriate Design [pdf] from ico.org.uk
The ICO issued the code on 12 August 2020 and it will come into force on 2 September 2020 with a 12 month transition period. Information Commissioner Elizabeth Denham writes: Data sits at the heart of the digital services
children use every day. From the moment a young person opens an app, plays a game or loads a website, data begins to be gathered. Who's using the service? How are they using it? How frequently? Where from? On what device? That
information may then inform techniques used to persuade young people to spend more time using services, to shape the content they are encouraged to engage with, and to tailor the advertisements they see. For all the benefits the
digital economy can offer children, we are not currently creating a safe space for them to learn, explore and play. This statutory code of practice looks to change that, not by seeking to protect children from the digital world,
but by protecting them within it. This code is necessary. This code will lead to changes that will help empower both adults and children. One in five UK internet users are
children, but they are using an internet that was not designed for them. In our own research conducted to inform the direction of the code, we heard children describing data practices as nosy, rude and a bit freaky. Our recent
national survey into people's biggest data protection concerns ranked children's privacy second only to cyber security. This mirrors similar sentiments in research by Ofcom and the London School of Economics. This code will lead
to changes in practices that other countries are considering too. It is rooted in the United Nations Convention on the Rights of the Child (UNCRC) that recognises the special safeguards children need in all aspects of their life.
Data protection law at the European level reflects this and provides its own additional safeguards for children. The code is the first of its kind, but it reflects the global direction of travel with similar reform being
considered in the USA, Europe and globally by the Organisation for Economic Co-operation and Development (OECD). This code will lead to changes that UK Parliament wants. Parliament and government ensured UK
data protection laws will truly transform the way we look after children online by requiring my office to introduce this statutory code of practice. The code delivers on that mandate and requires information society services to
put the best interests of the child first when they are designing and developing apps, games, connected toys and websites that are likely to be accessed by them. This code is achievable. The code is
not a new law but it sets standards and explains how the General Data Protection Regulation applies in the context of children using digital services. It follows a thorough consultation process that included speaking with parents, children, schools,
children's campaign groups, developers, tech and gaming companies and online service providers. Such conversations helped shape our code into effective, proportionate and achievable provisions. Organisations should conform to the code and demonstrate that their services use children's data fairly and in compliance with data protection law.
The code is a set of 15 flexible standards -- they do not ban or specifically prescribe -- that provides built-in protection to allow children to explore, learn and play online by ensuring that the best interests of the child
are the primary consideration when designing and developing online services. Settings must be high privacy by default (unless there's a compelling reason not to); only the minimum amount of personal data should be collected and
retained; children's data should not usually be shared; geolocation services should be switched off by default. Nudge techniques should not be used to encourage children to provide unnecessary personal data, weaken or turn off their privacy settings. The
code also addresses issues of parental control and profiling. This code will make a difference. Developers and those in the digital sector must act. We have allowed the maximum transition period of
12 months and will continue working with the industry. We want coders, UX designers and system engineers to engage with these standards in their day-to-day work and we're setting up a package of support to help.
But the next step must be a period of action and preparation. I believe companies will want to conform with the standards because they will want to demonstrate their commitment to always acting in the best interests of the child.
Those companies that do not make the required changes risk regulatory action. What's more, they risk being left behind by those organisations that are keen to conform. A generation from now, I believe we
will look back and find it peculiar that online services weren't always designed with children in mind. When my grandchildren are grown and have children of their own, the need to keep children safer online will be as second
nature as the need to ensure they eat healthily, get a good education or buckle up in the back of a car. And while our code will never replace parental control and guidance, it will help people have greater confidence that their
children can safely learn, explore and play online. There is no doubt that change is needed. The code is an important and significant part of that change.
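As a rough illustration of what the code's 'high privacy by default' standard might mean at the level of a service's settings model, here is a minimal sketch. The interface, field names and helper function below are invented for illustration; they are not taken from the code or from any real platform, since the code prescribes outcomes rather than APIs.

```typescript
// Hypothetical settings model for a service likely to be accessed by children.
// Everything optional starts switched off; any relaxation must be an explicit,
// recorded choice rather than something nudged or defaulted on.
interface PrivacySettings {
  behaviouralProfiling: boolean;   // profiling used to tailor content or ads
  geolocationSharing: boolean;     // location visible to the service or others
  thirdPartyDataSharing: boolean;  // passing data on to other organisations
  retainOptionalData: boolean;     // keeping data beyond what the service needs
}

// "High privacy by default" as the code describes it: all off until changed.
const childDefaults: PrivacySettings = {
  behaviouralProfiling: false,
  geolocationSharing: false,
  thirdPartyDataSharing: false,
  retainOptionalData: false,
};

// A deliberate, user-initiated change to a single setting.
function relaxSetting(current: PrivacySettings, key: keyof PrivacySettings): PrivacySettings {
  const next = { ...current };
  next[key] = true;
  return next;
}
```

The point of the sketch is simply that the defaults, not the available options, are what the code regulates: a service can still offer profiling or geolocation, but it has to start from the most private state.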
The ICO's onerous internet censorship measure starts its parliamentary approval stage
12th June 2020
See statement from ico.org.uk
ICO statement in response to the Government laying the Age Appropriate Design Code, also known as the Children's Code, before Parliament. We welcome the news that Government has laid the Age Appropriate Design Code before
Parliament. It's a huge step towards protecting children online especially given the increased reliance on online services at home during COVID-19. The code sets out 15 standards that relevant online services should meet to
protect children's privacy and is the result of wide-ranging consultation and engagement with stakeholders including the tech industry, campaigners, trade bodies and organisations. We are now pulling together our existing work on
the benefits and the costs of the code to assess its impact. This will inform the discussions we have with businesses to help us develop a package of support to help them implement the code during the transition year.
The government seems a bit cagey about the timetable for introducing the internet censorship measures contained in ICO's Age Appropriate Design rules
19th May 2020
See Parliamentary transcript from hansard.parliament.uk
The Age Appropriate Design Code has been written by the Information Commissioner's Office (ICO) to inform websites what they must do to keep ICO internet censors at bay with regards to the government's interpretations of GDPR provisions. Perhaps in the
same way that the Crown Prosecution Service provides prosecution guidance as to how it interprets criminal law. The Age Appropriate Design Code dictates how websites, and in particular social media, make sure that they are not exploiting children's
personal data. Perhaps the most immediate effect is that social media will have to allow a level of usage that simply does not require children to hand over personal data. Requiring more extensive personal data, say in the way that Facebook does,
requires users to provide 'age assurance' that they are old enough to take such decisions wisely. However adult users may not be so willing to age verify, and may in fact also appreciate an option to use such websites without handing over data
into the exploitative hands of social media companies. So one suspects that US internet social media giants may not see Age Appropriate Design and the government's Online Harms model for internet censorship as commercially very desirable for their
best interests. And one suspects that maybe US internet industry pushback may be something that is exerting pressure on UK negotiators seeking a free trade agreement with the US. Pure conjecture of course, but the government does seem very cagey
about its timetable for both the Age Appropriate Design Code and the Online Harms bill. Here is the latest parliamentary debate in the House of Lords very much on the subject of the government's timetable. House of Lords
Hansard: Age-appropriate Design Code, 18 May 2020

Lord Stevenson of Balmacara: To ask Her Majesty's Government when they intend to lay the regulation giving effect to the age-appropriate design code required under section 123 of the Data Protection Act 2018 before Parliament.
The Parliamentary Under-Secretary of State, Department for Digital, Culture, Media and Sport (Baroness Barran) (Con)
The age-appropriate design code will play an important role in protecting children's personal data online. The Government notified the final draft of the age-appropriate design code to the European Commission as part of
our obligations under the technical standards and regulations directive. The standstill period required under the directive has concluded. The Data Protection Act requires that the code is laid in Parliament as soon as is practicably possible.
Lord Stevenson of Balmacara: I am delighted to hear that, my Lords, although no date has been given. The Government have a bit of ground to make up here, so perhaps it will not be
delayed too long. Does the Minister agree that the Covid-19 pandemic is a perfect storm for children and for young people's digital experience? More children are online for more time and are more reliant on digital technology. In light of that, more
action needs to be taken. Can she give us some information about when the Government will publish their final response to the consultation on the online harms White Paper, for example, and a date for when we are likely to see the draft Bill for
pre-legislative scrutiny?
Baroness Barran I spent some time this morning with a group of young people, in part discussing their experience online. The noble Lord is right that the
pandemic presents significant challenges, and they were clear that they wanted a safe space online as well as physical safe spaces. The Government share that aspiration. We expect to publish our response to the online harms consultation this autumn and
to introduce the legislation this Session.
Lord Clement-Jones (LD) My Lords, I was very disappointed to see in the final version of the code that the section dealing with
age-appropriate application has been watered down to leave out reference to age-verification mechanisms. Is this because the age-verification provisions of the Digital Economy Act have been kicked into the long grass at the behest of the pornography
industry so that we will not have officially sanctioned age-verification tools available any time soon?
Baroness Barran There is no intention to water down the code. Its content is
the responsibility of the Information Commissioner, who has engaged widely to develop the code, with a call for evidence and a full public consultation.
Lord Moynihan (Con) My
Lords, is my noble friend the Minister able to tell the House the results of the consultation process with the industry on possible ways to implement age verification online?
Baroness Barran
We believe that our online harms proposals will deliver a much higher level of protection for children, as is absolutely appropriate. We expect companies to use a proportionate range of tools, including age-assurance and
age-verification technologies, to prevent children accessing inappropriate behaviour, whether that be via a website or social media. The Earl of Erroll (CB) May I too push the
Government to use the design code to cover the content of publicly accessible parts of pornographic websites, since the Government are not implementing Part 3 of the Digital Economy Act to protect children? Any online harms Act will be a long time in
becoming effective, and such sites are highly attractive to young teenagers.
Baroness Barran We agree absolutely about the importance of protecting young children online and that is
why we are aiming to have the most ambitious online harms legislation in the world. My right honourable friend the Secretary of State and the Minister for Digital and Culture meet representatives of the industry regularly to urge them to improve their
actions in this area.
Lord Holmes of Richmond (Con) My Lords, does my noble friend agree that the code represents a negotiation vis-à-vis the tech companies and thus there is
no reason for any delay in laying it before Parliament? Does she further agree that it should be laid before Parliament before 10 June to enable it to pass before the summer break? This would enable the Government to deliver on the claim that the UK is
the safest place on the planet to be online.
Baroness Barran
The negotiation is not just with the tech companies. We have ambitions to be not only a commercially attractive place for tech companies but a very safe place to be online, while ensuring that freedom of speech is upheld. The timing
of the laying of the code is dependent on discussions with the House authorities. As my noble friend is aware, there is a backlog of work which needs to be processed because of the impact of Covid-19.
Newspapers realise that the ICO default child protection policy may be very popular with adults too, and so it may prove tough to get them to age verify as required for monetisation
24th January 2020
See article from pressgazette.co.uk
See ICO's FAQ discussing the code's applicability to news websites [pdf] from ico.org.uk
News websites will have to ask readers to verify their age or comply with a new 15-point code from the Information Commissioner's Office (ICO) designed to protect children's online data, the ICO has confirmed. Press campaign groups were hoping news websites would be exempt from the new Age Appropriate Design Code, thereby protecting their vital digital advertising revenues, which are currently enhanced by extensive profiled advertising. Applying the code as standard will mean websites setting privacy settings to high and turning off default data profiling. If they want to continue enjoying revenues from behavioural advertising they will need to get adult readers to verify their age. In its 2019 draft the ICO had previously said such measures must be robust and that simply asking readers to declare their age would not be enough. But it has now confirmed to Press Gazette that for news websites that adhere to an editorial code, such self-declaration measures are likely to be sufficient. This could mean news websites asking readers to enter their date of birth or tick a box confirming they are over 18 (a rough sketch of such a check follows at the end of this entry). An ICO spokesperson said sites using these methods might also want to consider some low level technical measures to discourage false declarations of age, but anything more privacy intrusive is unlikely to be appropriate. But Society of Editors executive director Ian Murray predicted the new demands may prove unpopular even at the simplest level: asking visitors to confirm their age [and hence submit to snooping and profiling] -- even a simple yes or no tick box -- could be a barrier to readers. The ICO has said it will work with the news media industry over a 12-month transition period to enable proportionate and practical measures to be put in place for either scenario. In fact the ICO produced a separate document alongside the code to explain how it could impact news media, which it said would be allowed to apply the code in a risk-based and proportionate way.
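To make the 'low level technical measures' idea above concrete, here is a minimal sketch of the self-declaration, date-of-birth style of check described: a parse of the declared date plus a plausibility test to deter obviously false entries. The function names and the plausibility cut-off are invented for illustration; the ICO has not specified any particular implementation.

```typescript
// Self-declared date-of-birth check: no ID documents, no third parties,
// just the reader's own declaration plus a basic plausibility test.
function ageFromDateOfBirth(dob: Date, now: Date = new Date()): number {
  const yearDiff = now.getFullYear() - dob.getFullYear();
  const hadBirthdayThisYear =
    now.getMonth() > dob.getMonth() ||
    (now.getMonth() === dob.getMonth() && now.getDate() >= dob.getDate());
  return hadBirthdayThisYear ? yearDiff : yearDiff - 1;
}

function checkSelfDeclaredAge(dobInput: string): "adult" | "child" | "retry" {
  const dob = new Date(dobInput);
  if (Number.isNaN(dob.getTime())) return "retry"; // unparseable entry, ask again
  const age = ageFromDateOfBirth(dob);
  if (age < 0 || age > 120) return "retry";        // implausible, discourage false declarations
  return age >= 18 ? "adult" : "child";
}

// Example: checkSelfDeclaredAge("1980-01-01") returns "adult"
```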
ICO backs off a little from an age gated internet but imposes masses of red tape for any website that is likely to be accessed by under 18s
23rd January 2020

22nd January 2020. See press release from ico.org.uk
See Age Appropriate Design [pdf] from ico.org.uk
The Information Commissioner's Office (ICO) has just published its Age Appropriate Design Code. The draft was published last year and was opened to a public consultation which came down heavily against ICO's demand that website users should be age verified so that websites could tailor data protection to the age of the user. Well, in this final release the ICO has backed off from requiring age verification for everything, and instead suggested something less onerous called age 'assurance'. The idea seems to be that age can be ascertained from behaviour, eg if a YouTube user watches Peppa Pig all day then one can assume that they are of primary school age (a toy sketch of what that might look like follows below). However this does seem to lead to a load of contradictions, eg age can be assessed by profiling users' behaviour on the site, but the site isn't allowed to profile people until they are old enough to agree to this. The ICO recognises this contradiction but doesn't really offer much of a solution in practice.
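For illustration only, here is the sort of behaviour-based guess that 'age assurance' seems to imply: inferring a likely age band from usage signals rather than from ID documents. The signal names and thresholds below are invented, not taken from the ICO or any real service, and the sketch also makes the contradiction plain, since gathering these signals is itself a form of profiling.

```typescript
// Toy behaviour-based age assurance: guess an age band from usage signals.
type AgeBand = "under13" | "teen" | "adult" | "unknown";

interface UsageSignals {
  preschoolContentShare: number; // 0..1, share of watch time on pre-school content
  accountAgeDays: number;        // how long the account has existed
  medianSessionHour: number;     // 0..23, typical hour of use
}

function estimateAgeBand(s: UsageSignals): AgeBand {
  if (s.preschoolContentShare > 0.6) return "under13";
  if (s.preschoolContentShare > 0.2 && s.medianSessionHour >= 16 && s.medianSessionHour <= 21) {
    return "teen";
  }
  if (s.accountAgeDays > 365 && s.preschoolContentShare < 0.05) return "adult";
  return "unknown"; // fall back to the most protective (child) defaults
}
```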
The ICO defines the code as only applying to sites likely to be accessed by children (ie websites appealing to all ages are considered caught by the code even though they are not specifically for children). On a wider point the code will be very challenging to monetisation methods for general websites. The code requires websites to default to no profiling, no geo-location, no in-game sales etc. It assumes that adults will identify themselves and so enable all these things to happen. However it may well be that adults will quite like this default setting and end up not opting for more, leaving the websites without income. Note that these rules are in the UK interpretation of GDPR law and are not actually in the European directive. So they are covered by statute, but only in the UK; European competitors have no equivalent requirements.

The ICO press release reads:

Today the Information Commissioner's Office has published its final Age Appropriate Design Code
-- a set of 15 standards that online services should meet to protect children's privacy. The code sets out the standards expected of those responsible for designing, developing or providing online services like apps, connected
toys, social media platforms, online games, educational websites and streaming services. It covers services likely to be accessed by children and which process their data. The code will require digital services to automatically
provide children with a built-in baseline of data protection whenever they download a new app, game or visit a website. That means privacy settings should be set to high by default and nudge techniques should not be used to
encourage children to weaken their settings. Location settings that allow the world to see where a child is, should also be switched off by default. Data collection and sharing should be minimised and profiling that can allow children to be served up
targeted content should be switched off by default too. Elizabeth Denham, Information Commissioner, said: "Personal data often drives the content that our children are exposed to -- what
they like, what they search for, when they log on and off and even how they are feeling. "In an age when children learn how to use an iPad before they ride a bike, it is right that organisations designing and developing
online services do so with the best interests of children in mind. Children's privacy must not be traded in the chase for profit."
The code says that the best interests of the child should be a primary
consideration when designing and developing online services. And it gives practical guidance on data protection safeguards that ensure online services are appropriate for use by children. Denham said:
"One in five internet users in the UK is a child, but they are using an internet that was not designed for them. "There are laws to protect children in the real world -- film ratings, car seats, age
restrictions on drinking and smoking. We need our laws to protect children in the digital world too. "In a generation from now, we will look back and find it astonishing that online services weren't always designed with
children in mind." The standards of the code are rooted in the General Data Protection Regulation (GDPR) and the code was introduced by the Data Protection Act 2018. The ICO submitted the code to the Secretary of
State in November and it must complete a statutory process before it is laid in Parliament for approval. After that, organisations will have 12 months to update their practices before the code comes into full effect. The ICO expects this to be by autumn
2021. This version of the code is the result of wide-ranging consultation and engagement. The ICO received 450 responses to its initial consultation in April 2019 and followed up with dozens of meetings
with individual organisations, trade bodies, industry and sector representatives, and campaigners. As a result, and in addition to the code itself, the ICO is preparing a significant package of support for organisations.
The code is the first of its kind, but it reflects the global direction of travel with similar reform being considered in the USA, Europe and globally by the Organisation for Economic Co-operation and Development (OECD).
Update: The legals

23rd January 2020. See article from techcrunch.com

Schedule: The code now has to be laid before parliament for approval for a period of 40 sitting days -- with the ICO saying it will come into force 21 days after that, assuming no objections. Then there's a further 12 month transition period after it comes into force.

Obligation or codes of practice? Neil Brown, an internet, telecoms and tech lawyer at Decoded Legal, explained:

This is not, and will not be, 'law'. It is just a code of practice. It shows the direction of the ICO's thinking, and its expectations, and the ICO has to have regard to it when it takes enforcement action, but it's not something with which an organisation needs to comply as such. They need to comply with the law, which is the GDPR [General Data Protection Regulation] and the DPA [Data Protection Act] 2018.

Right now, online services should be working out how to comply with the GDPR, the ePrivacy rules, and any other applicable laws. The obligation to comply with those laws does not change because of today's code of practice. Rather, the code of practice shows the ICO's thinking on what compliance might look like (and, possibly, gold-plates some of the requirements of the law too).
Comment: ICO pushes ahead with age gates

23rd January 2020. See article from openrightsgroup.org
The ICO's Age Appropriate Design Code released today includes changes which lessen the risk of widespread age gates, but retains strong incentives towards greater age gating of content. Over 280 ORG supporters wrote to the ICO
about the previous draft code, to express concerns with compulsory age checks for websites, which could lead to restrictions on content. Under the code, companies must establish the age of users, or restrict their use of data. ORG
is concerned that this will mean that adults will only be able to access websites once they have age verified, creating severe restrictions on access to information. The ICO's changes to the Code in response to ORG's concerns suggest that different
strategies to establish age may be used, attempting to reduce the risk of forcing compulsory age verification of users. However, the ICO has not published any assessment to understand whether these strategies are practical or what
their actual impact would be. The Code could easily lead to Age Verification through the backdoor as it creates the threat of fines if sites have not established the age of their users. While the Code has
many useful ideas and important protections for children, this should not come at the cost of pushing all websites to undergo age verification of users. Age Verification could extend through social media, games and news publications.
Jim Killock, Executive Director of Open Rights Group said: The ICO has made some useful changes to their code, which make it clear that age verification is not the only method to determine age.
However, the ICO don't know how their code will change adults' access to content in practice. The new code published today does not include an Impact Assessment. Parliament must produce one and assess implications for free expression
before agreeing to the code. Age Verification demands could become a barrier to adults reaching legal content, including news, opinion and social media. This would severely impact free expression. The
public and Parliament deserve a thorough discussion of the implications, rather than sneaking in a change via parliamentary rubber stamping with potentially huge implications for the way we access Internet content.
But it can't possibly let you read them... because of data protection, y'know
23rd November 2019
See article from ico.org.uk
The Information Commissioner's Office (ICO) earlier in the year presented draft internet censorship laws targeted at the commendable aim of protecting the personal data of younger website users. These rules are legally enforceable under the EU GDPR and are collectively known as The Age Appropriate Design Code. The ICO originally proposed that website designers should consider several age ranges of their users. The youngest users should be presented with no opportunity to reveal their personal data, and then the websites could relent a little on the strictness of the rules as users get older. It all sounds good at first read... until one considers exactly how to know how old users are. And of course the ICO proposed age verification (AV) to prove that people are old enough for the tier of data protection being applied. The ICO did not think very hard about the bizarre contradiction that AV requires people to hand over enough data to give identity thieves an orgasm. So the ICO were going to ask people to hand over their most sensitive ID to any websites that ask... in the name of the better protection of the data that they have just handed over anyway. The draft rules were ridiculous, requiring even a small innocent site with a shopping trolley to require AV before allowing people to type in their details in the shopping trolley. Well the internet industry strongly pointed out the impracticality of the ICO's nonsense ideas. And indeed the ICO released a blog and made a few comments suggesting it would be scaling back on its universal AV requirements. The final censorship rules were delivered to the government on schedule on 23rd November 2019. The industry is surely very keen to know if the ICO has retreated on its stance, but the ICO has now just announced that the publication date will be delayed until the next government is in place. It sounds as though their ideas may still be a little controversial, and they need to hide behind a government minister before announcing the new rules.
ICO seems to have backed off from requiring age verification for nearly all websites
13th August 2019
See article from ico.org.uk
See ICO censorship proposal and consultation document [pdf] from ico.org.uk
Back in April of this year the data protection police of the ICO set about drawing up rules for nearly all commercial websites on how they should deal with children and their personal data. Rather perversely the ICO decided that age verification
should underpin a massively complex regime to require different data processing for several age ranges. And of course the children's data would be 'protected' by requiring nearly all websites to demand everybody's identity defining personal data in order
to slot people into the ICO's age ranges. The ICO consulted on their proposals and it seems that the internet industry forcefully pointed out that it was not a good idea for nearly all websites to have to demand age verification from all website
visitors. The ICO has yet to publish the results of the consultation or its response to the criticism but the ICO have been playing down this ridiculously widespread age verification. This week the Information Commissioner Elizabeth Denham further
hinted at this in a blog. She wrote: Our consultation on the
proposed code began in April, and prompted more than 450 written responses, as well as more
than 40 meetings with key stakeholders. We were pleased with the breadth of views we heard. Parents, schools and children's campaign groups helped us better understand the problems young people can face online, whether using social media services or
popular games, while developers, tech companies and online service providers gave us a crucial insight into the challenges industry faces to make this a reality. ... This consultation has helped us ensure our final code
will be effective, proportionate and achievable. It has also flagged the need for us to be clearer on some standards. We do not want to see an age-gated internet, where visiting any digital service requires
people to prove how old they are. Our aim has never been to keep children from online services, but to protect them within it. We want providers to set their privacy settings to high as a default, and to have strategies in place for how children's data
is handled. We do not want to prevent young people from engaging with the world around them, and we've been quick to respond to concerns that our code would affect news websites. This isn't the case. As we told a DCMS Select
Committee in July, we do not want to create any barriers to children accessing news content. The news media plays a fundamental role in children's lives and the final version of the code will make that very clear. That final
version of the code will be delivered to the Secretary of State ahead of the statutory deadline of 23 November 2019. We recognise the need to allow companies time to implement the standards and ensure they are complying with the
law. The law allows for a transition period of up to a year and we'll be considering the most appropriate approach to this, before making a final decision in the autumn. In addition to the code itself, my office is also preparing a significant package to
ensure that organisations are supported through any transition period, including help and advice for designers and engineers.
But the News Media Association points out that it would force websites to choose between being devoid of audience or stripped of advertising
4th July 2019
See article from newsmediauk.org
See also criticism of ICO plan from newsmediauk.org
For some bizarre reason the ICO seems to have been given powers to make wide ranging internet censorship law on the fly without needing it to be considered by parliament. And with spectacular
incompetence, they have come up with a child safety plan to require nearly every website in Britain to implement strict age verification. Baldrick would have been proud: it is more or less the internet equivalent of making children safe on the roads by
banning all cars. A trade association for news organisations, News Media Association, summed up the idea in a consultation response saying: ICO's Age Appropriate Code Could Wreak Havoc On News Media
Unless amended, the draft code published for consultation by the ICO would undermine the news media industry, its journalism and business innovation online. The ICO draft code would require commercial news media publishers to choose between their online
news services being devoid of audience or stripped of advertising, with even editorial content subject to ICO judgment and sanction, irrespective of compliance with general law and codes upheld by the courts and relevant regulators.
The NMA strongly objects to the ICO's startling extension of its regulatory remit, the proposed scope of the draft code, including its express application to news websites, its application of the proposed standards to all users in the
absence of robust age verification to distinguish adults from under 18-year olds and its restrictions on profiling. The NMA considers that news media publishers and their services should be excluded from scope of the proposed draft Code.
Attracting and retaining audience on news websites, digital editions and online service, fostering informed reader relationships, are all vital to the ever evolving development of successful newsbrands and their services, their
advertising revenues and their development of subscription or other payment or contribution models, which fund and sustain the independent press and its journalism. There is surely no justification for the ICO to attempt by way of
a statutory age appropriate design code, to impose access restrictions fettering adults (and children's) ability to receive and impart information, or in effect impose 'pre watershed' broadcast controls upon the content of all currently publicly
available, free to use, national, regional and local news websites, already compliant with the general law and editorial and advertising codes of practice upheld by IPSO and the ASA. In practice, the draft Code would undermine
commercial news media publishers' business models, as audience and advertising would disappear. Adults will be deterred from visiting newspaper websites if they first have to provide age verification details. Traffic and audience will also be reduced if
social media and other third parties were deterred from distributing or promoting or linking titles' lawful, code compliant, content for fear of being accused of promoting content detrimental to some age group in contravention of the Code. Audience
measurement would be difficult. It would devastate advertising, since effective relevant personalised advertising will be rendered impossible, and so destroy the vital commercial revenues which actually fund the independent media, its trusted journalism
and enable it to innovate and evolve to serve the ever-changing needs of its audience. The draft Code's impact would be hugely damaging to the news industry and wholly counter to the Government's policy on sustaining high quality,
trusted journalism at local, regional, national and international levels. Newspapers online content, editorial and advertising practices do not present any danger to children. The ICO has not raised with the industry any evidence
of harm, necessitating such drastic restrictions, caused by reading news or service of advertisements where these are compliant with the law and the standards set by specialist media regulators.
The Information Commissioner's Office has a 'cunning plan'
Of course the News Media Association is making a strong case for its own exclusion from the ICO's 'cunning plan', but the idea is equally devastating for websites from any other internet sector. Information Commissioner Elizabeth Denham was
called to give evidence to Parliament's DCMS Select Committee this week on related matters, and she spoke of a clearly negative feedback to her age verification idea. Her sidekick backtracked a little, saying that the ICO did not mean Age
Verification via handing over passport details, more like one of those schemes where AI guesses age by scanning what sort of thing the person has been posting on social media. (Which of course requires a massive grab of data that should be best kept
private, especially for children). The outcome seems to be a dictate to the internet industry to 'innovate' and find a solution to age verification that does not require the mass hand over of private data (you know like what the data protection laws
are supposed to be protecting). The ICO put a time limit on this innovation demand of about 12 months. In the meantime the ICO has told the news industry that the age verification idea won't apply to them, presumably because they can kick up a hell of a
stink about the ICO in their mass market newspapers. Denham said: We want to encourage children to find out about the world, we want children to access news sites. So the concern about the
impact of the code on media and editorial comment and journalism I think is unfounded. We don't think there will be an impact on news media sites. They are already regulated and we are not a media regulator.
She did not speak any similar reassuring words to any other sector of the internet industry, which is likely to be equally devastated by the ICO's 'cunning plan'.
6th June 2019
Foreign websites will block UK users altogether rather than be compelled to invest time and money into a nigh-impossible compliance process. By Heather Burns. See article from webdevlaw.uk
Internet companies slam the data censor's disgraceful proposal to require age verification for large swathes of the internet
5th June 2019
From the Financial Times
The Information Commissioner's Office has for some bizarre reason been given immense powers to censor the internet. And in an early opportunity to exert its power it has proposed a 'regulation' that would require strict age verification for nearly all mainstream websites that may have a few child readers and some material that may be deemed harmful for very young children, eg news websites that may have glamour articles or perhaps violent news images. In a mockery of 'data protection', such websites would have to implement strict age verification requiring people to hand over identity data to most of the websites in the world. Unsurprisingly much of the internet content industry is unimpressed. A six week consultation on the
new censorship rules has just closed and according to the Financial Times: Companies and industry groups have loudly pushed back on the plans, cautioning that they could unintentionally quash start-ups and endanger
people's personal data. Google and Facebook are also expected to submit critical responses to the consultation. Tim Scott, head of policy and public affairs at Ukie, the games industry body, said it was an inherent contradiction
that the ICO would require individuals to give away their personal data to every digital service. Dom Hallas, executive director at the Coalition for a Digital Economy (Coadec), which represents digital start-ups in the UK, said
the proposals would result in a withdrawal of online services for under-18s by smaller companies: The code is seen as especially onerous because it would require companies to provide up to six different versions of
their websites to serve different age groups of children under 18. This means an internet for kids largely designed by tech giants who can afford to build two completely different products. A child could access YouTube Kids, but
not a start-up competitor.
Stephen Woodford, chief executive of the Advertising Association -- which represents companies including Amazon, Sky, Twitter and Microsoft -- said the ICO needed to conduct a full technical
and economic impact study, as well as a feasibility study. He said the changes would have a wide and unintended negative impact on the online advertising ecosystem, reducing spend from advertisers and so revenue for many areas of the UK media.
An ICO spokesperson said: We are aware of various industry concerns about the code. We'll be considering all the responses we've had, as well as engaging further where necessary, once the consultation
has finished.
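For context on the 'up to six different versions' figure quoted above: the draft code groups users into developmental age bands (roughly 0-5, 6-9, 10-12, 13-15 and 16-17, plus adults). A hypothetical sketch of what per-band configuration could look like is below; the particular settings shown are invented for illustration and are not taken from the draft code or from any company's submission.

```typescript
// Hypothetical per-age-band configuration behind the "six versions" complaint.
type Band = "0-5" | "6-9" | "10-12" | "13-15" | "16-17" | "adult";

interface ServiceConfig {
  directMessaging: boolean;
  publicProfileAllowed: boolean;
  personalisedAds: boolean;
}

const configByBand: Record<Band, ServiceConfig> = {
  "0-5":   { directMessaging: false, publicProfileAllowed: false, personalisedAds: false },
  "6-9":   { directMessaging: false, publicProfileAllowed: false, personalisedAds: false },
  "10-12": { directMessaging: false, publicProfileAllowed: false, personalisedAds: false },
  "13-15": { directMessaging: true,  publicProfileAllowed: false, personalisedAds: false },
  "16-17": { directMessaging: true,  publicProfileAllowed: true,  personalisedAds: false },
  "adult": { directMessaging: true,  publicProfileAllowed: true,  personalisedAds: true  },
};
```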
Pointing out that it is crazy for the data protection police to require internet users to hand over their private identity data to all and sundry (all in the name of child protection of course)
31st May 2019
See article from indexoncensorship.org
Elizabeth Denham, Information Commissioner, Information Commissioner's Office

Dear Commissioner Denham,

Re: The Draft Age Appropriate Design Code for Online Services

We write to
you as civil society organisations who work to promote human rights, both offline and online. As such, we are taking a keen interest in the ICO's Age Appropriate Design Code. We are also engaging with the Government in its White Paper on Online Harms,
and note the connection between these initiatives. Whilst we recognise and support the ICO's aims of protecting and upholding children's rights online, we have severe concerns that as currently drafted the Code will not achieve
these objectives. There is a real risk that implementation of the Code will result in widespread age verification across websites, apps and other online services, which will lead to increased data profiling of both children and adults, and restrictions
on their freedom of expression and access to information. The ICO contends that age verification is not a silver bullet for compliance with the Code, but it is difficult to conceive how online service providers could realistically
fulfil the requirement to be age-appropriate without implementing some form of onboarding age verification process. The practical impact of the Code as it stands is that either all users will have to access online services via a sorting age-gate or adult
users will have to access the lowest common denominator version of services with an option to age-gate up. This creates a de facto compulsory requirement for age-verification, which in turn puts in place a de facto restriction for both children and
adults on access to online content. Requiring all adults to verify they are over 18 in order to access everyday online services is a disproportionate response to the aim of protecting children online and violates fundamental
rights. It carries significant risks of tracking, data breach and fraud. It creates digital exclusion for individuals unable to meet requirements to show formal identification documents. Where age-gating also applies to under-18s, this violation and
exclusion is magnified. It will put an onerous burden on small-to-medium enterprises, which will ultimately entrench the market dominance of large tech companies and lessen choice and agency for both children and adults -- this outcome would be the
antithesis of encouraging diversity and innovation. In its response to the June 2018 Call for Views on the Code, the ICO recognised that there are complexities surrounding age verification, yet the draft Code text fails to engage
with any of these. It would be a poor outcome for fundamental rights and a poor message to children about the intrinsic value of these for all if children's safeguarding was to come at the expense of free expression and equal privacy protection for
adults, including adults in vulnerable positions for whom such protections have particular importance. Mass age-gating will not solve the issues the ICO wishes to address with the Code and will instead create further problems. We
urge you to drop this dangerous idea.

Yours sincerely,

Open Rights Group
Index on Censorship
Article19
Big Brother Watch
Global Partners Digital
A new proposal forcing people to brainlessly hand over identity data to any Tom, Dick or Harry website that asks. Open Rights Group suggests we take a stand
30th May 2019
From action.openrightsgroup.org
See ICO's Age-Appropriate Design: Code of Practice for Online Services
New proposals to safeguard children will require everyone to prove they are over 18 before accessing online content. These proposals - from the Information Commissioner's Office (ICO) - aim at protecting children's privacy,
but look like sacrificing free expression of adults and children alike. But they are just plans: we believe and hope you can help the ICO strike the right balance, and abandon compulsory age gates, by making your voice heard. The
rules cover websites (including social media and search engines), apps, connected toys and other online products and services. The ICO is requesting public feedback on its proposals until Friday 31 May 2019. Please urgently write
to the consultation to tell them their plan goes too far! You can use these bullet points to help construct your own unique message:
- In its current form, the Code is likely to result in widespread age verification across everyday websites, apps and online services for children and adults alike. Age checks for everyone are a step too far.
- Age checks for everyone could result in online content being removed or services withdrawn.
- Data protection regulators should stick to privacy. It's not the Information Commissioner's job to restrict adults' or children's access to content.
- With no scheme to certify which providers can be trusted, third-party age verification technologies will lead to fakes and scams, putting people's personal data at risk.
- Large age verification providers will seek to offer single-sign-in across a wide variety of online services, which could lead to intrusive commercial tracking of children and adults, with devastating personal impacts in the event of a data breach.
Jeremy Hunt demands that social media companies immediately ban under 13s from using their apps and websites
22nd April 2018
See article from twitter.com
This is so wrong on so many levels. Britain would undergo a mass tantrum. How are parents supposed to entertain their kids if they can't spend all day on YouTube? And what about all the privacy implications of letting social media companies have complete identity details of their users? It will be like Cambridge Analytica on speed. Jeremy Hunt wrote to the social media companies:

Dear Colleagues,
Thank you for participating in the working group on children and young people's mental health and social media with officials from my Department and DCMS. We appreciate your time and engagement, and your willingness to continue discussions and
potentially support a communications campaign in this area, but I am disappointed by the lack of voluntary progress in those discussions. We set three very clear challenges relating to protecting children and young people's mental
health: age verification, screen time limits and cyber-bullying. As I understand it, participants have focused more on promoting work already underway and explaining the challenges with taking further action, rather than offering innovative solutions or
tangible progress. In particular, progress on age verification is not good enough. I am concerned that your companies seem content with a situation where thousands of users breach your own terms and conditions on the minimum user
age. I fear that you are collectively turning a blind eye to a whole generation of children being exposed to the harmful emotional side effects of social media prematurely; this is both morally wrong and deeply unfair on parents, who are faced with the
invidious choice of allowing children to use platforms they are too young to access, or excluding them from social interaction that often the majority of their peers are engaging in. It is unacceptable and irresponsible for you to put parents in this
position. This is not a blanket criticism and I am aware that these aren't easy issues to solve. I am encouraged that a number of you have developed products to help parents control what their children can access online in response
to Government's concerns about child online protection, including Google's Family Link. And I recognise that your products and services are aimed at different audiences, so different solutions will be required. This is clear from the submissions you've
sent to my officials about the work you are delivering to address some of these challenges. However, it is clear to me that the voluntary joint approach has not delivered the safeguards we need to protect our children's mental health. In May, the
Department for Digital, Culture, Media and Sport will publish the Government response to the Internet Safety Strategy consultation, and I will be working with the Secretary of State to explore what other avenues are open to us to
pursue the reforms we need. We will not rule out legislation where it is needed. In terms of immediate next steps, I appreciate the information that you provided our officials with last month but would be grateful if you would set
out in writing your companies' formal responses, on the three challenges we posed in November. In particular, I would like to know what additional new steps you have taken to protect children and young people since November in each of the specific
categories we raised: age verification, screen time limits and cyber-bullying. I invite you to respond by the end of this month, in order to inform the Internet Safety Strategy response. It would also be helpful if you can set out any ideas or further
plans you have to make progress in these areas. During the working group meetings I understand you have pointed to the lack of conclusive evidence in this area — a concern which I also share. In order to address this, I have asked
the Chief Medical Officer to undertake an evidence review on the impact of technology on children and young people's mental health, including on healthy screen time. I will also be working closely with DCMS and UKRI to commission research into all these
questions, to ensure we have the best possible empirical basis on which to make policy. This will inform the Government's approach as we move forwards. Your industry boasts some of the brightest minds and biggest budgets globally.
While these issues may be difficult, I do not believe that solutions on these issues are outside your reach; I do question whether there is sufficient will to reach them. I am keen to work with you to make technology a force for
good in protecting the next generation. However, if you prove unwilling to do so, we will not be deterred from making progress.