Melon Farmers Original Version

ICO News



 

Not thinking hard enough about the risks associated with AI...

ICO data censor harangues Snap with a nonsensically abstract accusation, whilst noting that rules haven't actually been broken yet


Link Here 8th October 2023

UK Information Commissioner issues preliminary enforcement notice against Snap

  • Snap issued with preliminary enforcement notice over potential failure to properly assess the privacy risks posed by its generative AI chatbot 'My AI'

  • Investigation provisionally finds Snap failed to adequately identify and assess the risks to several million 'My AI' users in the UK including children aged 13 to 17.

The Information Commissioner's Office (ICO) has issued Snap Inc with a preliminary enforcement notice over potential failure to properly assess the privacy risks posed by Snap's generative AI chatbot 'My AI'.

The preliminary notice sets out the steps which the Commissioner may require, subject to Snap's representations on the preliminary notice. If a final enforcement notice were to be adopted, Snap may be required to stop processing data in connection with 'My AI'. This means not offering the 'My AI' product to UK users pending Snap carrying out an adequate risk assessment.

Snap launched the 'My AI' feature for UK Snapchat+ subscribers in February 2023, with a roll out to its wider Snapchat user base in the UK in April 2023. The chatbot feature, powered by OpenAI's GPT technology, marked the first example of generative AI embedded into a major messaging platform in the UK. As at May 2023 Snapchat had 21 million monthly active users in the UK.

The ICO's investigation provisionally found the risk assessment Snap conducted before it launched 'My AI' did not adequately assess the data protection risks posed by the generative AI technology, particularly to children. The assessment of data protection risk is particularly important in this context which involves the use of innovative technology and the processing of personal data of 13 to 17 year old children.

The Commissioner's findings in the notice are provisional. No conclusion should be drawn at this stage that there has, in fact, been any breach of data protection law or that an enforcement notice will ultimately be issued. The ICO will carefully consider any representations from Snap before taking a final decision.

John Edwards, Information Commissioner said:

The provisional findings of our investigation suggest a worrying failure by Snap to adequately identify and assess the privacy risks to children and other users before launching 'My AI'.

We have been clear that organisations must consider the risks associated with AI, alongside the benefits. Today's preliminary enforcement notice shows we will take action in order to protect UK consumers' privacy rights.

 

 

Ofcom will demand that all website users hand over dangerous identity data to any website that asks...

And ICO claims that its data protection rules will keep us 'safe'....just like laws against burglary have put an end to break ins


Link Here 26th November 2022

The Information Commissioner's Office (ICO) and Ofcom have set out how we will work together to ensure coherence between the data protection and the new online safety regimes.

Our joint statement builds on our existing cooperative approach to regulation - and on our close working relationship established as co-founders of the Digital Regulation Cooperation Forum.

In anticipation of Ofcom taking on new duties in 2023 under the Online Safety Bill, the statement sets out our shared regulatory aims. We want:

  • people who use online services to have confidence that their safety and privacy will be upheld and that we will take prompt and effective action when providers fail in their obligations; and

  • providers of online services of all sizes to comply with their obligations and to continue to innovate and grow, supported by regulatory clarity and free from undue burden.

To achieve this, the ICO and Ofcom will work closely together to achieve maximum alignment and consistency between the data protection and online safety regimes. We will:

  • maximise coherence by ensuring our policies are consistent with each other's regulatory requirements -- and consult closely when preparing codes and guidance. We will seek solutions that enhance users' safety and preserve their privacy. Where there are tensions between privacy and safety objectives, we will provide clarity on how compliance can be achieved with both regimes; and

  • promote compliance by setting clear expectations for industry on what they must do to meet both their online safety and data protection requirements. That includes particular support through the transition for small and emerging firms to help them thrive and grow. We will take action against services that don't meet their obligations, sharing information and intelligence as appropriate and coordinating approaches to enforcement.

 

 

Data protection censors ICO to go after porn sites to add age verification...

The continuing and dangerous campaign to force ALL people to hand over sensitive ID details to porn sites in the name of protecting children from handing over sensitive ID details.


Link Here 3rd September 2022
Full story: ICO Age Appropriate Design...ICO calls for age assurance for websites accessed by children
The UK's data protection censors at the Information Commissioner's Office (ICO) have generated a disgracefully onerous red tape nightmare called the Age Appropriate Design Code, which requires any internet service that provides any sort of grown up content to evaluate the age of all users so that under 18s can be protected from handing over sensitive ID data. Of course the age checking usually requires all users to hand over lots of sensitive and dangerous ID data to any website that asks.

Now the ICO has decided to make these requirements of porn sites given that they are often accessed by under 18s. ICO writes:

Next steps

We will continue to evolve our approach, listening to others to ensure the code is having the maximum impact.

For example, we have seen an increasing amount of research (from the NSPCC, 5Rights, Microsoft and British Board of Film Classification), that children are likely to be accessing adult-only services and that these pose data protection harms, with children losing control of their data or being manipulated to give more data, in addition to content harms. We have therefore revised our position to clarify that adult-only services are in scope of the Children's code if they are likely to be accessed by children.

As well as engaging with adult-only services directly to ensure they conform with the code, we will also be working closely with Ofcom and the Department for Digital, Culture, Media and Sport (DCMS) to establish how the code works in practice in relation to adult-only services and what they should expect. This work is continuing to drive the improvements necessary to provide a better internet for children.

 

 

Gambling on surveillance...

ICO called on to investigate the massive scale of data mining and snooping at the online betting company Sky Bet.


Link Here 8th August 2022
The internet censors of the Information Commissioner's Office have been called on to implement a full-scale probe into how the online betting industry is exploiting new technology to profile and target gamblers.

The move follows a complaint by the campaign group Clean Up Gambling. It alleges that Sky Bet and its partners are creating detailed behavioural profiles of customers and sharing thousands of data points with dozens of third parties.

Clean Up Gambling alleges that one advertising partner, Signal, owned by TransUnion, has a dossier of 186 attributes for an individual, including their propensity to gamble, their favourite games and their susceptibility to specific types of marketing.

TransUnion said it assists gambling companies in preventing fraud, confirming age and identity, checking affordability and protecting vulnerable customers, to support responsible gambling.

 

 

Offsite Article: Censorship via data protection...


Link Here 9th February 2022
Full story: ICO Age Appropriate Design...ICO calls for age assurance for websites accessed by children
The House of Lords asks whether the new Information Commissioner will enforce ID/age verification for porn viewing

See article from hansard.parliament.uk

 

 

Offsite Article: Outgoing data protection censor speaks of her tenure at ICO...


Link Here 18th October 2021
Information commissioner Elizabeth Denham: How to be a pro-active censor

See article from bbc.co.uk

 

 

Offsite Article: Britannia still rules the waves...


Link Here 2nd September 2021
Full story: ICO Age Appropriate Design...ICO calls for age assurance for websites accessed by children
Britain tamed Big Tech and nobody noticed. The Age Appropriate Design Code has caused huge global changes. Not that tech platforms want to admit it

See article from wired.co.uk

 

 

Maybe more about data monetisation than data protection...

The government nominates the new Information Commissioner


Link Here 27th August 2021
Culture Secretary Oliver Dowden has announced that John Edwards is the Government's preferred candidate for Information Commissioner.

John Edwards is currently New Zealand's Privacy Commissioner. He will now appear before MPs on the Digital, Culture, Media and Sport Select Committee for pre-appointment scrutiny on 9th September 2021.

It seems that the Government has its eyes on market opportunities related to selling data rather than data protection. Dowden commented:

Data underpins innovation and the global digital economy, everyday apps and cloud computing systems. It allows businesses to trade, drives international investment, supports law enforcement agencies tackling crime, the delivery of critical public services and health and scientific research.

The government is outlining the first territories with which it will prioritise striking data adequacy partnerships now that it has left the EU: the United States, Australia, the Republic of Korea, Singapore, the Dubai International Finance Centre and Colombia. It is also confirming that future partnerships with India, Brazil, Kenya and Indonesia are being prioritised.

Estimates suggest there is as much as £11 billion worth of trade that goes unrealised around the world due to barriers associated with data transfers.

The aim is to move quickly and creatively to develop global partnerships which will make it easier for UK organisations to exchange data with important markets and fast-growing economies.

The government also today names New Zealand Privacy Commissioner John Edwards as its preferred candidate to be the UK's next Information Commissioner, following a global search.

As Information Commissioner and head of the UK regulator responsible for enforcing data protection law, he will be empowered to go beyond the regulator's traditional role of focusing only on protecting data rights, with a clear mandate to take a balanced approach that promotes further innovation and economic growth.

...

It means reforming our own data laws so that they're based on common sense, not box-ticking. And it means having the leadership in place at the Information Commissioner's Office to pursue a new era of data-driven growth and innovation. John Edwards's vast experience makes him the ideal candidate to ensure data is used responsibly to achieve those goals.

 

 

Offsite Article: Verified as the age of self interest...


Link Here 27th August 2021
Full story: ICO Age Appropriate Design...ICO calls for age assurance for websites accessed by children
The trade group for age verification companies is clearly campaigning for its own commercial interests, but it does lay out the practical vagaries of the ICO's Age Appropriate Design Code

See article from techmonitor.ai

 

 

Age Appropriate Censorship...

Facebook and Instagram announce far-reaching changes ready for the start of the UK's Age Appropriate Design Code


Link Here 27th July 2021
Full story: ICO Age Appropriate Design...ICO calls for age assurance for websites accessed by children
The data protection censors at the Information Commissioner's Office have got into the internet censorship game with a new regime that starts on 2nd September 2021. Its Age Appropriate Design Code very much requires an age gated internet in the name of data protection for children. The code itself is not law, but the ICO claims that it is an interpretation of the EU's GDPR (General Data Protection Regulation) law and so carries legal weight.

The code requires that users hand over their personal data to any website that asks, so as to verify that they are of sufficient age to hand over their personal data. All in the name of preventing children from handing over their personal data.

And the most immediate impact is that social media websites need to ensure that their users are over the age of 13 before the internet companies can make hay with their personal data.

And in preparation for the new rules Facebook and Instagram have posted substantial blogs laying out new polices on age verification.

Facebook summarised:

Facebook and Instagram weren't designed for people under the age of 13, so we're creating new ways to stop those who are underage from signing up.

We're developing AI to find and remove underaged accounts, and new solutions to verify people's ages.

We're also building new experiences designed specifically for those under 13.

See full article from about.fb.com

Instagram added:

Creating an experience on Instagram that's safe and private for young people, but also fun comes with competing challenges. We want them to easily make new friends and keep up with their family, but we don't want them to deal with unwanted DMs or comments from strangers. We think private accounts are the right choice for young people, but we recognize some young creators might want to have public accounts to build a following.

We want to strike the right balance of giving young people all the things they love about Instagram while also keeping them safe. That's why we're announcing changes we'll make today, including:

  • Defaulting young people into private accounts.

  • Making it harder for potentially suspicious accounts to find young people.

  • Limiting the options advertisers have to reach young people with ads.

See full article from about.instagram.com

 

 

Cease moralising...

A new anti-porn campaign group proposes to take legal action against the ICO for failing to keep children's data safe from porn sites


Link Here 18th June 2021
CEASE (Centre to End All Sexual Exploitation) is a new morality group campaigning against porn and sex work in the UK.

The group was founded in 2019 and describes itself on its website:

We shine a light on what sexual exploitation is, where it occurs and how it contravenes our human rights. We campaign for new and better laws, advocate for policy change and hold the global sex industry to account.

We're building a UK-wide movement of campaigners against sexual exploitation, and we're amplifying the voices of the very best advocates for change: survivors.

Its latest cunning plan is to hold the Information Commissioner's Office (the UK data protection censor) responsible for failing to prevent the world's porn sites from obtaining usage data from under 18s. The group writes on its website:

We are threatening to take legal action against the Information Commissioner's Office (ICO) for failing to protect children's data from misuse by porn sites.

The excuses the ICO has given for its failure to fulfil its regulatory duties are legally and factually flawed. What's more, it has left children exposed to a profit-hungry industry which is intent on drawing children back again and again to watch violent and abusive pornographic material for its own financial gain.

The group quotes long time porn campaigner John Carr:

I was shocked and dismayed by the Information Commissioner's reply to me in which they refused to act against porn sites which were collecting and processing children's data on a large scale. If the data protection laws weren't designed to protect children ... I am sure a lot of parents will wonder just what they were designed to do.

 

 

Age of nightmares...

ICO warns internet companies of the impending, impossible-to-comply-with Age Appropriate Design Code


Link Here 7th March 2021
Full story: ICO Age Appropriate Design...ICO calls for age assurance for websites accessed by children
A survey by the Information Commissioner's Office (ICO) shows that three quarters of businesses surveyed are aware of the impending Children's Code. The full findings will be published in May but initial analysis shows businesses are still in the preparation stages.

And with just six months to go until the code comes into force, the ICO is urging organisations and businesses to make the necessary but onerous changes to their online services and products.

The Children's Code sets out 15 standards organisations must meet to ensure that children's data is protected online. The code will apply to all the major online services used by children in the UK and includes measures such as providing default settings which ensure that children have access to online services whilst minimising data collection and use.

Details of the code were first published in June 2018 and UK Parliament approved it last year. Since then, the ICO has been providing support and advice to help organisations adapt their online services and products in line with data protection law.

 

 

Non-consensual data exploitation...

ICO tells data broker Experian to seek users permission before selling their personal data


Link Here 27th October 2020
In a landmark decision that shines a light on widespread data protection failings by the entire data broker industry, the UK data protection censor, the ICO, has taken enforcement action against Experian, based in part on a complaint made by Privacy International in 2018.

Privacy International (PI) welcomes the report from the UK Information Commissioner's Office (ICO) into three credit reference agencies (CRAs) which also operate as data brokers for direct marketing purposes. As a result, the ICO has ordered the credit reference agency Experian to make fundamental changes to how it handles people's personal data within its offline direct marketing services.

Experian now has until July 2021 to inform people that it holds their personal data and how it intends to use it for marketing purposes. The ICO also requires Experian to stop using personal data derived from the credit referencing side of its business by January 2021.

The ICO investigation found widespread and systemic data protection failings across the sector, significant data protection failures at each company and that significant invisible processing took place, likely affecting millions of individuals in the UK. As the report underlines, between the CRAs, the data of almost every adult in the UK was, in some way, screened, traded, profiled, enriched, or enhanced to provide direct marketing services.

Moreover, the report notes that all three of the credit referencing agencies investigated were also using profiling to generate new or previously unknown information about people. This can be extremely invasive and can also have discriminatory effects for individuals.

Experian has said it intends to appeal the ICO decisions saying:

We believe the ICO's view goes beyond the legal requirements. This interpretation (of General Data Protection Regulation) also risks damaging the services that help consumers, thousands of small businesses and charities, particularly as they try to recover from the COVID-19 crisis.

 

 

Carry On ICO...

Data censor consults on its fines and sanctions regime for use after the Brexit transition period


Link Here 4th October 2020

ICO consultation on the draft Statutory guidance

We are running a consultation about an updated version of the Statutory guidance on how the ICO will exercise its data protection regulatory functions of information notices, assessment notices, enforcement notices and penalty notices.

This guidance is a requirement of the Data Protection Act 2018 and only covers data protection law under that Act. Our other regulatory activity and the other laws we regulate are covered in our Regulatory action policy (which is currently under review).

We welcome written responses from all interested parties including members of the public and data controllers and those who represent them. Please answer the questions in the survey and also tell us whether you are responding on behalf of an organisation or in a personal capacity.

We will use your responses to this survey to help us understand the areas where organisations and members of the public are seeking further clarity about information notices, assessment notices, enforcement notices and penalty notices. We will only use this information to inform the final version of this guidance and not to consider any regulatory action.

We will publish this guidance after the UK has left the EU and we have therefore drafted it accordingly.

 

 

Offsite Article: Best to banish kids from the grown up internet...


Link Here 15th September 2020
Full story: ICO Age Appropriate Design...ICO calls for age assurance for websites accessed by children
A good summary of some of the unexpected consequences of internet censorship that will arise from ICO's Age Appropriate Design Code.

See article from parentzone.org.uk

 

 

Won't somebody think of the children!...

The ICO publishes its impossible-to-comply-with, business-suffocating Age Appropriate Design Code with a 12 month implementation period until 2nd September 2021


Link Here 12th August 2020
Full story: ICO Age Appropriate Design...ICO calls for age assurance for websites accessed by children
The ICO issued the code on 12 August 2020 and it will come into force on 2 September 2020 with a 12 month transition period.

Information Commissioner Elizabeth Denham writes:

Data sits at the heart of the digital services children use every day. From the moment a young person opens an app, plays a game or loads a website, data begins to be gathered. Who's using the service? How are they using it? How frequently? Where from? On what device?

That information may then inform techniques used to persuade young people to spend more time using services, to shape the content they are encouraged to engage with, and to tailor the advertisements they see.

For all the benefits the digital economy can offer children, we are not currently creating a safe space for them to learn, explore and play.

This statutory code of practice looks to change that, not by seeking to protect children from the digital world, but by protecting them within it.

This code is necessary.

This code will lead to changes that will help empower both adults and children.

One in five UK internet users are children, but they are using an internet that was not designed for them. In our own research conducted to inform the direction of the code, we heard children describing data practices as nosy, rude and a bit freaky.

Our recent national survey into people's biggest data protection concerns ranked children's privacy second only to cyber security. This mirrors similar sentiments in research by Ofcom and the London School of Economics.

This code will lead to changes in practices that other countries are considering too.

It is rooted in the United Nations Convention on the Rights of the Child (UNCRC) that recognises the special safeguards children need in all aspects of their life. Data protection law at the European level reflects this and provides its own additional safeguards for children.

The code is the first of its kind, but it reflects the global direction of travel with similar reform being considered in the USA, Europe and globally by the Organisation for Economic Co-operation and Development (OECD).

This code will lead to changes that UK Parliament wants.

Parliament and government ensured UK data protection laws will truly transform the way we look after children online by requiring my office to introduce this statutory code of practice.

The code delivers on that mandate and requires information society services to put the best interests of the child first when they are designing and developing apps, games, connected toys and websites that are likely to be accessed by them.

This code is achievable.

The code is not a new law but it sets standards and explains how the General Data Protection Regulation applies in the context of children using digital services. It follows a thorough consultation process that included speaking with parents, children, schools, children's campaign groups, developers, tech and gaming companies and online service providers.

Such conversations helped shape our code into effective, proportionate and achievable provisions.

Organisations should conform to the code and demonstrate that their services use children's data fairly and in compliance with data protection law.

The code is a set of 15 flexible standards (they do not ban or specifically prescribe) that provide built-in protection to allow children to explore, learn and play online by ensuring that the best interests of the child are the primary consideration when designing and developing online services.

Settings must be high privacy by default (unless there's a compelling reason not to); only the minimum amount of personal data should be collected and retained; children's data should not usually be shared; geolocation services should be switched off by default. Nudge techniques should not be used to encourage children to provide unnecessary personal data, weaken or turn off their privacy settings. The code also addresses issues of parental control and profiling.

This code will make a difference.

Developers and those in the digital sector must act. We have allowed the maximum transition period of 12 months and will continue working with the industry.

We want coders, UX designers and system engineers to engage with these standards in their day-to-day work and we're setting up a package of support to help.

But the next step must be a period of action and preparation. I believe companies will want to conform with the standards because they will want to demonstrate their commitment to always acting in the best interests of the child. Those companies that do not make the required changes risk regulatory action.

What's more, they risk being left behind by those organisations that are keen to conform.

A generation from now, I believe we will look back and find it peculiar that online services weren't always designed with children in mind.

When my grandchildren are grown and have children of their own, the need to keep children safer online will be as second nature as the need to ensure they eat healthily, get a good education or buckle up in the back of a car.

And while our code will never replace parental control and guidance, it will help people have greater confidence that their children can safely learn, explore and play online.

There is no doubt that change is needed. The code is an important and significant part of that change.

 

 

Age Appropriate Censorship...

The ICO's onerous internet censorship measure starts its parliamentary approval stage


Link Here 12th June 2020
Full story: ICO Age Appropriate Design...ICO calls for age assurance for websites accessed by children
ICO statement in response to the Government laying the Age Appropriate Design Code, also known as the Children's Code, before Parliament.

We welcome the news that Government has laid the Age Appropriate Design Code before Parliament. It's a huge step towards protecting children online especially given the increased reliance on online services at home during COVID-19.

The code sets out 15 standards that relevant online services should meet to protect children's privacy and is the result of wide-ranging consultation and engagement with stakeholders including the tech industry, campaigners, trade bodies and organisations.

We are now pulling together our existing work on the benefits and the costs of the code to assess its impact. This will inform the discussions we have with businesses to help us develop a package of support to help them implement the code during the transition year.

 

 

The UK's opening gambit in US trade negotiations is to stifle US social media giants...

The government seems a bit cagey about the timetable for introducing the internet censorship measures contained in ICO's Age Appropriate Design rules


Link Here 19th May 2020
Full story: ICO Age Appropriate Design...ICO calls for age assurance for websites accessed by children
The Age Appropriate Design Code has been written by the Information Commissioner's Office (ICO) to inform websites what they must do to keep ICO internet censors at bay with regards to the government's interpretations of GDPR provisions. Perhaps in the same way that the Crown Prosecution Service provides prosecution guidance as to how it interprets criminal law.

The Age Appropriate Design Code dictates how websites, and in particular social media, must make sure that they are not exploiting children's personal data. Perhaps the most immediate effect is that social media will have to allow a level of usage that simply does not require children to hand over personal data. Collecting more extensive personal data, say in the way that Facebook does, requires users to provide 'age assurance' that they are old enough to take such decisions wisely.

However adult users may not be so willing to age verify, and may in fact also appreciate an option to use such websites without handing over data into the exploitative hands of social media companies.

So one suspects that US internet social media giants may not see Age Appropriate Design and the government's Online Harms model for internet censorship as commercially very desirable for their best interests. And one suspects that US internet industry pushback may be exerting pressure on UK negotiators seeking a free trade agreement with the US.

Pure conjecture of course, but the government does seem very cagey about its timetable for both the Age Appropriate Design Code and the Online Harms bill. Here is the latest parliamentary debate in the House of Lords very much on the subject of the government's timetable.

House of Lords Hansard: Age-appropriate Design Code, 18 May 2020

Lord Stevenson of Balmacara:

To ask Her Majesty's Government when they intend to lay the regulation giving effect to the age-appropriate design code required under section 123 of the Data Protection Act 2018 before Parliament.

The Parliamentary Under-Secretary of State, Department for Digital, Culture, Media and Sport (Baroness Barran) (Con)

The age-appropriate design code will play an important role in protecting children's personal data online. The Government notified the final draft of the age-appropriate design code to the European Commission as part of our obligations under the technical standards and regulations directive. The standstill period required under the directive has concluded. The Data Protection Act requires that the code is laid in Parliament as soon as is practicably possible.

Lord Stevenson of Balmacara:

I am delighted to hear that, my Lords, although no date has been given. The Government have a bit of ground to make up here, so perhaps it will not be delayed too long. Does the Minister agree that the Covid-19 pandemic is a perfect storm for children and for young people's digital experience? More children are online for more time and are more reliant on digital technology. In light of that, more action needs to be taken. Can she give us some information about when the Government will publish their final response to the consultation on the online harms White Paper, for example, and a date for when we are likely to see the draft Bill for pre-legislative scrutiny?

Baroness Barran

I spent some time this morning with a group of young people, in part discussing their experience online. The noble Lord is right that the pandemic presents significant challenges, and they were clear that they wanted a safe space online as well as physical safe spaces. The Government share that aspiration. We expect to publish our response to the online harms consultation this autumn and to introduce the legislation this Session.

Lord Clement-Jones (LD)

My Lords, I was very disappointed to see in the final version of the code that the section dealing with age-appropriate application has been watered down to leave out reference to age-verification mechanisms. Is this because the age-verification provisions of the Digital Economy Act have been kicked into the long grass at the behest of the pornography industry so that we will not have officially sanctioned age-verification tools available any time soon?

Baroness Barran

There is no intention to water down the code. Its content is the responsibility of the Information Commissioner, who has engaged widely to develop the code, with a call for evidence and a full public consultation.

Lord Moynihan (Con)

My Lords, is my noble friend the Minister able to tell the House the results of the consultation process with the industry on possible ways to implement age verification online?

Baroness Barran

We believe that our online harms proposals will deliver a much higher level of protection for children, as is absolutely appropriate. We expect companies to use a proportionate range of tools, including age-assurance and age-verification technologies, to prevent children accessing inappropriate content, whether that be via a website or social media.

The Earl of Erroll (CB)

May I too push the Government to use the design code to cover the content of publicly accessible parts of pornographic websites, since the Government are not implementing Part 3 of the Digital Economy Act to protect children? Any online harms Act will be a long time in becoming effective, and such sites are highly attractive to young teenagers.

Baroness Barran

We agree absolutely about the importance of protecting young children online and that is why we are aiming to have the most ambitious online harms legislation in the world. My right honourable friend the Secretary of State and the Minister for Digital and Culture meet representatives of the industry regularly to urge them to improve their actions in this area.

Lord Holmes of Richmond (Con)

My Lords, does my noble friend agree that the code represents a negotiation vis-à-vis the tech companies and thus there is no reason for any delay in laying it before Parliament? Does she further agree that it should be laid before Parliament before 10 June to enable it to pass before the summer break? This would enable the Government to deliver on the claim that the UK is the safest place on the planet to be online.

Baroness Barran

The negotiation is not just with the tech companies. We have ambitions to be not only a commercially attractive place for tech companies but a very safe place to be online, while ensuring that freedom of speech is upheld. The timing of the laying of the code is dependent on discussions with the House authorities. As my noble friend is aware, there is a backlog of work which needs to be processed because of the impact of Covid-19.

 

 

Offsite Article: It looks like the UK's data regulator has given up, blaming coronavirus...


Link Here 19th May 2020
Information Commissioner's Office has effectively downed tools as a result of the pandemic, raising concerns about outstanding cases and ongoing privacy issues

See article from wired.co.uk

 

 

In these recessionary times maybe it is not a good idea to deprive websites of their income...

The data censor ICO has suspended its action against adtech, citing coronavirus effects


Link Here 8th May 2020
The Information Commissioner's Office (ICO) has announced:

The ICO recently set out its regulatory approach during the COVID-19 pandemic, where we spoke about reassessing our priorities and resources.

Taking this into account we have made the decision to pause our investigation into real time bidding and the Adtech industry.

It is not our intention to put undue pressure on any industry at this time but our concerns about Adtech remain and we aim to restart our work in the coming months, when the time is right.

 

 

Do not snoop, do not profile, and do not earn any money...

Newspapers realise that the ICO default child protection policy may be very popular with adults too, and so it may prove tough to get adults to age verify, as required for monetisation


Link Here 24th January 2020
Full story: ICO Age Appropriate Design...ICO calls for age assurance for websites accessed by children
News websites will have to either ask readers to verify their age or apply the default protections of a new 15-point code from the Information Commissioner's Office (ICO) designed to protect children's online data, the ICO has confirmed.

Press campaign groups were hoping news websites would be exempt from the new Age Appropriate Design Code so protecting their vital digital advertising revenues which are currently enhanced by extensive profiled advertising.

Applying the code as standard will mean websites putting privacy settings to high and turning off default data profiling. If they want to continue enjoying revenues from behavioural advertising they will need to get adult readers to verify their age.

In its 2019 draft the ICO had previously said such measures must be robust and that simply asking readers to declare their age would not be enough. But it has now confirmed to Press Gazette that for news websites that adhere to an editorial code, such self-declaration measures are likely to be sufficient.

This could mean news websites asking readers to enter their date of birth or tick a box confirming they are over 18. An ICO spokesperson said sites using these methods might also want to consider some low level technical measures to discourage false declarations of age, but anything more privacy intrusive is unlikely to be appropriate.
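As a rough illustration of how lightweight such a self-declaration check is, the date-of-birth variant amounts to a few lines of arithmetic. This is a minimal sketch only, not an implementation drawn from any ICO guidance; the function names are hypothetical.

```python
from datetime import date

def age_from_dob(dob, today=None):
    """Return age in whole years from a self-declared date of birth."""
    today = today or date.today()
    years = today.year - dob.year
    # Subtract one if this year's birthday has not yet passed.
    if (today.month, today.day) < (dob.month, dob.day):
        years -= 1
    return years

def is_adult(dob, today=None):
    # Self-declared, so trivially falsifiable -- which is the ICO's
    # point about needing measures to discourage false declarations.
    return age_from_dob(dob, today) >= 18
```

The tick-box variant is even simpler: a single boolean, with no data collected at all, which is precisely why publishers worry it is both a barrier to readers and easy to lie to.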

But Society of Editors executive director Ian Murray predicted the new demands may prove unpopular even at the simplest level. Asking visitors to confirm their age [and hence submit to snooping and profiling] -- even a simple yes or no tick box -- could be a barrier to readers.

The ICO has said it will work with the news media industry over a 12-month transition period to enable proportionate and practical measures to be put in place for either scenario.

In fact ICO produced a separate document alongside the code to explain how it could impact news media, which it said would be allowed to apply the code in a risk-based and proportionate way.

 

 

And ICO takes a watching brief...

Met Police to make facial recognition cameras a fully operational feature of its arsenal


Link Here 24th January 2020
Full story: CCTV with facial recognition...Police introduce live facial recognition system
The Metropolitan Police has announced it will use live facial recognition cameras operationally for the first time on London streets.

Following earlier pilots in London and deployments by South Wales police, the cameras are due to be put into action within a month. Cameras will be clearly signposted, covering a small, targeted area, and police officers will hand out leaflets about the facial recognition scanning, the Met said.

Trials of the cameras have already taken place on 10 occasions in locations such as Stratford's Westfield shopping centre and the West End of London. The Met said in these trials, 70% of wanted suspects in the system who walked past the cameras were identified, while only one in 1,000 people generated a false alert. But an independent review of six of these deployments found that only eight out of 42 matches were verifiably correct.

Over the past four years, as the Met has trialled facial recognition, opposition to its use has intensified, led in the UK by campaign groups Liberty and Big Brother Watch.

The force also believes a recent High Court judgment, which said South Wales Police did not breach the rights of a man whose face had been scanned by a camera, gives it some legal cover. The case is heading for the Court of Appeal. But the Met is pressing on, convinced that the public at large will support its efforts to use facial recognition to track down serious offenders.

Last year, the Met admitted it supplied images for a database carrying out facial recognition scans on a privately owned estate in King's Cross, after initially denying involvement.

Update: Censored whilst claiming to be uncensored

24th January 2020. See article from ico.org.uk

It seems to be the normal response from the Information Commissioner's Office to turn a blind eye to the actual serious exploitation of people's personal data whilst focusing heavily on generating excessive quantities of red tape requiring small players to be ultra protective of personal data to the point of strangling their businesses and livelihoods. And, just as for unconsented website tracking and profiling by the online advertising industry, the ICO will monitor and observe and comment again later in the year:

In October 2019 we concluded our investigation into how police use live facial recognition technology (LFR) in public places. Our investigation found there was public support for police use of LFR but also that there needed to be improvements in how police authorised and deployed the technology if it was to retain public confidence and address privacy concerns. We set out our views in a formal Opinion for police forces.

The Metropolitan Police Service (MPS) has incorporated the advice from our Opinion into its planning and preparation for future LFR use. Our Opinion acknowledges that an appropriately governed, targeted and intelligence- led deployment of LFR may meet the threshold of strict necessity for law enforcement purposes. We have received assurances from the MPS that it is considering the impact of this technology and is taking steps to reduce intrusion and comply with the requirements of data protection legislation. We expect to receive further information from the MPS regarding this matter in forthcoming days. The MPS has committed to us that it will review each deployment, and the ICO will continue to observe and monitor the arrangements for, and effectiveness of, its use.

This is an important new technology with potentially significant privacy implications for UK citizens. We reiterate our call for Government to introduce a statutory and binding code of practice for LFR as a matter of priority. The code will ensure consistency in how police forces use this technology and to improve clarity and foreseeability in its use for the public and police officers alike. We believe it's important for government to work with regulators, law enforcement, technology providers and communities to support the code.

Facial recognition remains a high priority for the ICO and the public. We have several ongoing investigations. We will be publishing more about its use by the private sector later this year.

Update: Big Brother Watch  Petition

24th January 2020. Sign the petition from you.38degrees.org.uk

To: Priti Patel, Home Secretary and Cressida Dick, Commissioner of the Metropolitan Police

Urgently stop the Metropolitan Police using live facial recognition surveillance.

Why is this important?

The Metropolitan Police has announced it will use live facial recognition across London, despite an independent review finding its previous trials likely unlawful and over 80% inaccurate. The Met is the largest police force in the democratic world to roll out this dangerously authoritarian surveillance. This represents an enormous expansion of the surveillance state and a serious threat to civil liberties in the UK - and it sets a dangerous precedent worldwide. We urge the Home Secretary and Met Commissioner to stop it now.

 

 

Updated: Children likely to prove toxic to a website's monetisation...

ICO backs off a little from an age gated internet but imposes masses of red tape for any website that is likely to be accessed by under 18s


Link Here 23rd January 2020
Full story: ICO Age Appropriate Design...ICO calls for age assurance for websites accessed by children
The Information Commissioner's Office (ICO) has just published its Age Appropriate Design Code:

The draft was published last year and was opened to a public consultation which came down heavily against ICO's demands that website users should be age verified so that the websites could tailor data protection to the age of the user.

Well in this final release ICO has backed off from requiring age verification for everything, and instead suggested something less onerous called age 'assurance'. The idea seems to be that age can be ascertained from behaviour, eg if a YouTube user watches Peppa Pig all day then one can assume that they are of primary school age.

However this does seem to lead to loads of contradictions, eg age can be assessed by profiling users' behaviour on the site, but the site isn't allowed to profile people until they are old enough to agree to this. The ICO recognises this contradiction but doesn't really offer much of a solution in practice.

The ICO defines the code as only applying to sites likely to be accessed by children (ie websites appealing to all ages are considered caught by the code even though they are not specifically for children).

On a wider point the code will be very challenging to monetisation methods for general websites. The code requires websites to default to no profiling, no geo-location, no in-game sales etc. It assumes that adults will identify themselves and so enable all these things to happen. However it may well be that adults will quite like this default setting and end up not opting for more, leaving the websites without income.

Note that these rules are in the UK interpretation of GDPR law and are not actually in the European regulation itself. So they are covered by statute, but only in the UK. European competitors have no equivalent requirements.

The ICO press release reads:

Today the Information Commissioner's Office has published its final Age Appropriate Design Code -- a set of 15 standards that online services should meet to protect children's privacy.

The code sets out the standards expected of those responsible for designing, developing or providing online services like apps, connected toys, social media platforms, online games, educational websites and streaming services. It covers services likely to be accessed by children and which process their data.

The code will require digital services to automatically provide children with a built-in baseline of data protection whenever they download a new app, game or visit a website.

That means privacy settings should be set to high by default and nudge techniques should not be used to encourage children to weaken their settings. Location settings that allow the world to see where a child is, should also be switched off by default. Data collection and sharing should be minimised and profiling that can allow children to be served up targeted content should be switched off by default too.

Elizabeth Denham, Information Commissioner, said:

"Personal data often drives the content that our children are exposed to -- what they like, what they search for, when they log on and off and even how they are feeling.

"In an age when children learn how to use an iPad before they ride a bike, it is right that organisations designing and developing online services do so with the best interests of children in mind. Children's privacy must not be traded in the chase for profit."

The code says that the best interests of the child should be a primary consideration when designing and developing online services. And it gives practical guidance on data protection safeguards that ensure online services are appropriate for use by children.

Denham said:

"One in five internet users in the UK is a child, but they are using an internet that was not designed for them.

"There are laws to protect children in the real world -- film ratings, car seats, age restrictions on drinking and smoking. We need our laws to protect children in the digital world too.

"In a generation from now, we will look back and find it astonishing that online services weren't always designed with children in mind."

The standards of the code are rooted in the General Data Protection Regulation (GDPR) and the code was introduced by the Data Protection Act 2018. The ICO submitted the code to the Secretary of State in November and it must complete a statutory process before it is laid in Parliament for approval. After that, organisations will have 12 months to update their practices before the code comes into full effect. The ICO expects this to be by autumn 2021.

This version of the code is the result of wide-ranging consultation and engagement.

The ICO received 450 responses to its initial consultation in April 2019 and followed up with dozens of meetings with individual organisations, trade bodies, industry and sector representatives, and campaigners.

As a result, and in addition to the code itself, the ICO is preparing a significant package of support for organisations.

The code is the first of its kind, but it reflects the global direction of travel with similar reform being considered in the USA, Europe and globally by the Organisation for Economic Co-operation and Development (OECD).

Update: The legals

23rd January 2020. See article from techcrunch.com

Schedule

The code now has to be laid before parliament for approval for a period of 40 sitting days -- with the ICO saying it will come into force 21 days after that, assuming no objections. Then there's a further 12 month transition period after it comes into force.

Obligation or codes of practice?

Neil Brown, an Internet, telecoms and tech lawyer at Decoded Legal explained:

This is not, and will not be, 'law'. It is just a code of practice. It shows the direction of the ICO's thinking, and its expectations, and the ICO has to have regard to it when it takes enforcement action but it's not something with which an organisation needs to comply as such. They need to comply with the law, which is the GDPR [General Data Protection Regulation] and the DPA [Data Protection Act] 2018.

Right now, online services should be working out how to comply with the GDPR, the ePrivacy rules, and any other applicable laws. The obligation to comply with those laws does not change because of today's code of practice. Rather, the code of practice shows the ICO's thinking on what compliance might look like (and, possibly, goldplates some of the requirements of the law too).

Comment: ICO pushes ahead with age gates

23rd January 2020. See article from openrightsgroup.org

The ICO's Age Appropriate Design Code released today includes changes which lessen the risk of widespread age gates, but retains strong incentives towards greater age gating of content.

Over 280 ORG supporters wrote to the ICO about the previous draft code, to express concerns with compulsory age checks for websites, which could lead to restrictions on content.

Under the code, companies must establish the age of users, or restrict their use of data. ORG is concerned that this will mean that adults only access websites when age verified, creating severe restrictions on access to information.

The ICO's changes to the Code in response to ORG's concerns suggest that different strategies to establish age may be used, attempting to reduce the risk of forcing compulsory age verification of users.

However, the ICO has not published any assessment to understand whether these strategies are practical or what their actual impact would be.

The Code could easily lead to Age Verification through the backdoor as it creates the threat of fines if sites have not established the age of their users.

While the Code has many useful ideas and important protections for children, this should not come at the cost of pushing all websites to undergo age verification of users. Age Verification could extend through social media, games and news publications.

Jim Killock, Executive Director of Open Rights Group said:

The ICO has made some useful changes to their code, which make it clear that age verification is not the only method to determine age.

However, the ICO don't know how their code will change adults' access to content in practice. The new code published today does not include an Impact Assessment. Parliament must produce one and assess implications for free expression before agreeing to the code.

Age Verification demands could become a barrier to adults reaching legal content, including news, opinion and social media. This would severely impact free expression.

The public and Parliament deserve a thorough discussion of the implications, rather than sneaking in a change via parliamentary rubber stamping with potentially huge implications for the way we access Internet content.

 

 

Commented: Floundering...

ICO takes no immediate action against the most blatant examples of people's most personal data being exploited without consent, ie profiled advertising


Link Here 23rd January 2020
Blatant abuse of people's private data has become firmly entrenched in the economic model of the free internet ever since Google recognised the value of analysing what people are searching for.

Now vast swathes of the internet are handsomely funded by the exploitation of people's personal data. But that deep entrenchment clearly makes the issue a bit difficult to put right without bankrupting half of the internet that has come to rely on the process.

The EU hasn't helped with its ludicrous idea of focusing its laws on companies having to obtain people's consent to have their data exploited. A more practical lawmaker would have simply banned the abuse of personal data without bothering with the silly consent games. But the EU seems prone to being lobbied and does not often come up with the most obvious solution.

Anyway enforcement of the EU's law is certainly causing issues for the internet censors at the UK's ICO.

The ICO warned the adtech industry 6 months ago that its approach is illegal and has now announced that it would not be taking any action against the data abuse yet, as the industry has made a few noises about improving a bit over the coming months.

Simon McDougall, ICO Executive Director of Technology and Innovation has written:

The adtech real time bidding (RTB) industry is complex, involving thousands of companies in the UK alone. Many different actors and service providers sit between the advertisers buying online advertising space, and the publishers selling it.

There is a significant lack of transparency due to the nature of the supply chain and the role different actors play. Our June 2019 report identified a range of issues. We are confident that any organisation that has not properly addressed these issues risks operating in breach of data protection law.

This is a systemic problem that requires organisations to take ownership for their own data processing, and for industry to collectively reform RTB. We gave industry six months to work on the points we raised, and offered to continue to engage with stakeholders. Two key organisations in the industry are starting to make the changes needed.

The Internet Advertising Bureau (IAB) UK has agreed a range of principles that align with our concerns, and is developing its own guidance for organisations on security, data minimisation, and data retention, as well as UK-focused guidance on the content taxonomy. It will also educate the industry on special category data and cookie requirements, and continue work on some specific areas of detail. We will continue to engage with IAB UK to ensure these proposals are executed in a timely manner.

Separately, Google will remove content categories, and improve its process for auditing counterparties. It has also recently proposed improvements to its Chrome browser, including phasing out support for third party cookies within the next two years. We are encouraged by this, and will continue to look at the changes Google has proposed.

Finally, we have also received commitments from other UK advertising trade bodies to produce guidance for their members.

If these measures are fully implemented they will result in real improvements to the handling of personal data within the adtech industry. We will continue to engage with industry where we think engagement will deliver the most effective outcome for data subjects.

Comment: Data regulator ICO fails to enforce the law

18th January 2020. See article from openrightsgroup.org

Responding to ICO's announcement today that the regulator is taking minimal steps to enforce the law against massive data breaches taking place in the online ad industry through Real-Time Bidding, complainants Jim Killock and Michael Veale have called on the regulator to enforce the law.

The complainants are considering taking legal action against the regulator. Legal action could be taken against the ICO for failure to enforce, or against the companies themselves for their breaches of Data Protection law.

The Real-Time Bidding data breach at the heart of the RTB market exposes every person in the UK to mass profiling, and the attendant risks of manipulation and discrimination.

As the evidence submitted by the complainants notes, the real-time bidding systems designed by Google and the IAB broadcast what virtually all Internet users read, watch, and listen to online to thousands of companies, without protection of the data once broadcast. Now, sixteen months after the initial complaint, the ICO has failed to act.
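To make the "broadcast" concrete: in real time bidding, each page view generates a bid request sent to many bidders at once. The sketch below is a heavily simplified, illustrative example loosely modelled on the IAB's OpenRTB field naming; all values are invented, and it is not taken from the complainants' evidence.

```python
import json

# Simplified, illustrative OpenRTB-style bid request. Field names follow
# the IAB OpenRTB convention; every value here is made up.
bid_request = {
    "id": "example-auction-id",           # unique auction identifier
    "imp": [{"id": "1", "banner": {"w": 300, "h": 250}}],
    "site": {
        # The page URL reveals what the user is reading.
        "page": "https://news.example.com/article-about-health",
    },
    "device": {
        "ua": "Mozilla/5.0 (example user agent)",
        "ip": "203.0.113.0",              # IP address (sometimes truncated)
        "geo": {"country": "GBR"},
    },
    "user": {
        "id": "cookie-matched-user-id",   # pseudonymous cross-site ID
    },
}

# One request like this goes out to many bidders per page view, which is
# the uncontrolled onward sharing the complaint describes.
print(json.dumps(bid_request, indent=2))
```

The privacy complaint is not about any one field but about the combination: a persistent user ID tied to the URL being read, replayed to thousands of recipients with no control over what they do with it.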

Jim Killock, Executive Director of the Open Rights Group said:

The ICO is a regulator, so needs to enforce the law. It appears to be accepting that unlawful and dangerous sharing of personal data can continue, so long as 'improvements' are gradually made, with no actual date for compliance.

Last year the ICO gave a deadline for an industry response to our complaints. Now the ICO is falling into the trap set by industry, of accepting incremental but minimal changes that fail to deliver individuals the control of their personal data that they are legally entitled to.

The ICO must take enforcement action against IAB members.

We are considering our position, including whether to take legal action against the regulator for failing to act, or individual companies for their breach of data protection law.

Dr Michael Veale said:

When an industry is premised on and profiting from clear and entrenched illegality that breaches individuals' fundamental rights, engagement is not a suitable remedy. The ICO cannot continue to look back at its past precedents for enforcement action, because it is exactly that timid approach that has led us to where we are now.

Ravi Naik, solicitor acting for the complainants, said:

There is no dispute about the underlying illegality at the heart of RTB that our clients have complained about. The ICO have agreed with those concerns yet the companies have not taken adequate steps to address those concerns. Nevertheless, the ICO has failed to take the direct enforcement action needed to remedy these breaches.

Regulatory ambivalence cannot continue. The ICO is not a silo but is subject to judicial oversight. Indeed, the ICO's failure to act raises a question about the adequacy of the UK Data Protection Act. Is there proper judicial oversight of the ICO? This is a critical question after Brexit, when the UK needs to agree data transfer arrangements with the EU that cover all industries.

Dr. Johnny Ryan of Brave said:

The RTB system broadcasts what everyone is reading and watching online, hundreds of billions of times a day, to thousands of companies. It is by far the largest data breach ever recorded. The risks are profound. Brave will support ORG to ensure that the ICO discharges its responsibilities.

Jim Killock and Michael Veale complained about the Adtech industry and Real Time Bidding to the UK's ICO in September 2018. Johnny Ryan of Brave submitted a parallel complaint against Google about their Adtech system to the Irish Data Protection Authority.

Update: Advertising industry will introduce a 'gold standard 2.0' for privacy towards the end of 2020

23rd January 2020. See article from campaignlive.co.uk

The Internet Advertising Bureau UK has launched a new version of what it calls its Gold Standard certification process that will be independently audited by a third party.

In a move to address ongoing privacy concerns with the digital supply chain, the IAB's Gold Standard 2.0 will incorporate the Transparency and Consent Framework, a widely promoted industry standard for online advertising.

The new process will be introduced in the fourth quarter after an industry consultation to agree on the compliance criteria for incorporating the TCF.

 

 

ICO delivers its new internet censorship rules to the government...

But it can't possibly let you read them...because of data protection y'now


Link Here 23rd November 2019
Full story: ICO Age Appropriate Design...ICO calls for age assurance for websites accessed by children
The Information Commissioner's Office (ICO) earlier in the year presented draft internet censorship rules targeted at the commendable aim of protecting the personal data of younger website users. These rules are legally enforceable under the EU GDPR and are collectively known as The Age Appropriate Design Code.

The ICO originally proposed that website designers should consider several age ranges of their users. The youngest users should be presented with no opportunity to reveal their personal data, and then the websites could relent a little on the strictness of the rules as users get older. It all sounds good at first read... until one considers exactly how to know how old users are.

And of course ICO proposed age verification (AV) to prove that people are old enough for the tier of data protection being applied.

The ICO did not think very hard about the bizarre contradiction that AV requires people to hand over enough data to give identity thieves an orgasm. So the ICO were going to ask people to hand over their most sensitive ID to any websites that ask... in the name of the better protection of the data that they have just handed over anyway.

The draft rules were ridiculous, requiring even a small innocent site with a shopping trolley to impose AV before allowing people to type in their details.

Well the internet industry strongly pointed out the impracticality of the ICO's nonsense ideas. And indeed the ICO released a blog and made a few comments that suggest it would be scaling back on its universal AV requirements.

The final censorship rules were delivered to the government on schedule on 23rd November 2019.

The industry is surely very keen to know if the ICO has retreated from its stance, but the ICO has now announced that the publication date will be delayed until the next government is in place. It sounds as if their ideas may still be a little controversial, and they need to hide behind a government minister before announcing the new rules.

 

 

Offsite Article: Breaking the internet but improving privacy...


Link Here 21st November 2019
The AdTech showdown is coming but will the ICO bite?

See article from openrightsgroup.org

 

 

Offsite Article: Those who care about privacy need not apply...


Link Here 7th November 2019
Jobs microsite used for jobs at the ICO sets hundreds of cookies without visitors' consent

See article from theregister.co.uk

 

 

My Ministers will continue to develop proposals to extend internet censorship...

A summary of the Online Harms Bill as referenced in the Queen's Speech


Link Here 15th October 2019

The April 2019 Online Harms White Paper set out the Government's plan for world-leading legislation to make the UK the safest place in the world to be online.

The proposals, as set out in the White Paper were:

  • A new duty of care on companies towards their users, with an independent regulator to oversee this framework.

  • We want to keep people safe online, but we want to do this in a proportionate way, ensuring that freedom of expression is upheld and promoted online, and businesses do not face undue burdens.

  • We are seeking to do this by ensuring that companies have the right processes and systems in place to fulfil their obligations, rather than penalising them for individual instances of unacceptable content.

  • Our public consultation on this has closed and we are analysing the responses and considering the issues raised. We are working closely with a variety of stakeholders, including technology companies and civil society groups, to understand their views.


Next steps:

  • We will publish draft legislation for pre-legislative scrutiny.

  • Ahead of this legislation, the Government will publish work on tackling the use of the internet by terrorists and those engaged in child sexual abuse and exploitation, to ensure companies take action now to tackle content that threatens our national security and the physical safety of children.

  • We are also taking forward additional measures, including a media literacy strategy, to empower users to stay safe online. A Safety by Design framework will help start-ups and small businesses to embed safety during the development or update of their products and services.

 

 

Offsite Article: Censor! Censor Yourself!...


Link Here 12th October 2019
ICO's website found to be allowing video providers to put tracking cookies on your device without obtaining consent

See article from markalanrichards.com

 

 

Offsite Article: Come in Google Adwords, your time is up...


Link Here 2nd October 2019
UK data 'protection' censor reiterates GDPR warning to ad tech companies about the blatant use of people's web browsing history without consent

See article from digiday.com

 

 

The ICO has a Baldrick-inspired 'cunning plan' to require age verification for nearly all websites...

But the News Media Association points out that it would force websites to choose between being devoid of audience or stripped of advertising


Link Here 4th July 2019
Full story: ICO Age Appropriate Design...ICO calls for age assurance for websites accessed by children
For some bizarre reason the ICO seems to have been given powers to make wide-ranging internet censorship law on the fly without needing it to be considered by parliament. And with spectacular incompetence, it has come up with a child safety plan to require nearly every website in Britain to implement strict age verification. Baldrick would have been proud; it is more or less the internet equivalent of making children safe on the roads by banning all cars.

A trade association for news organisations, News Media Association, summed up the idea in a consultation response saying:

ICO's Age Appropriate Code Could Wreak Havoc On News Media 

Unless amended, the draft code published for consultation by the ICO would undermine the news media industry, its journalism and business innovation online. The ICO draft code would require commercial news media publishers to choose between their online news services being devoid of audience or stripped of advertising, with even editorial content subject to ICO judgment and sanction, irrespective of compliance with general law and codes upheld by the courts and relevant regulators.

The NMA strongly objects to the ICO's startling extension of its regulatory remit, the proposed scope of the draft code, including its express application to news websites, its application of the proposed standards to all users in the absence of robust age verification to distinguish adults from under 18-year olds and its restrictions on profiling. The NMA considers that news media publishers and their services should be excluded from scope of the proposed draft Code.

Attracting and retaining audience on news websites, digital editions and online service, fostering informed reader relationships, are all vital to the ever evolving development of successful newsbrands and their services, their advertising revenues and their development of subscription or other payment or contribution models, which fund and sustain the independent press and its journalism.

There is surely no justification for the ICO to attempt by way of a statutory age appropriate design code, to impose access restrictions fettering adults (and children's) ability to receive and impart information, or in effect impose 'pre watershed' broadcast controls upon the content of all currently publicly available, free to use, national, regional and local news websites, already compliant with the general law and editorial and advertising codes of practice upheld by IPSO and the ASA.

In practice, the draft Code would undermine commercial news media publishers' business models, as audience and advertising would disappear. Adults will be deterred from visiting newspaper websites if they first have to provide age verification details. Traffic and audience will also be reduced if social media and other third parties were deterred from distributing or promoting or linking titles' lawful, code compliant, content for fear of being accused of promoting content detrimental to some age group in contravention of the Code. Audience measurement would be difficult. It would devastate advertising, since effective relevant personalised advertising will be rendered impossible, and so destroy the vital commercial revenues which actually fund the independent media, its trusted journalism and enable it to innovate and evolve to serve the ever-changing needs of its audience.

The draft Code's impact would be hugely damaging to the news industry and wholly counter to the Government's policy on sustaining high quality, trusted journalism at local, regional, national and international levels.

Newspapers online content, editorial and advertising practices do not present any danger to children. The ICO has not raised with the industry any evidence of harm, necessitating such drastic restrictions, caused by reading news or service of advertisements where these are compliant with the law and the standards set by specialist media regulators.

The Information Commissioner's Office has a 'cunning plan'

Of course the News Media Association is making a strong case for its own exclusion from the ICO's 'cunning plan', but the idea is equally devastating for websites from any other internet sector.

Information Commissioner Elizabeth Denham was called to give evidence to Parliament's DCMS Select Committee this week on related matters, and she spoke of clearly negative feedback to her age verification idea.

Her sidekick backtracked a little, saying that the ICO did not mean age verification via handing over passport details, more like one of those schemes where AI guesses age by scanning what sort of thing the person has been posting on social media. (Which of course requires a massive grab of data that should be best kept private, especially for children.) The outcome seems to be a dictate to the internet industry to 'innovate' and find a solution to age verification that does not require the mass handover of private data (you know, the very data that data protection laws are supposed to be protecting). The ICO put a time limit on this innovation demand of about 12 months.

In the meantime the ICO has told the news industry that the age verification idea won't apply to them, presumably because they can kick up a hell of a stink about the ICO in their mass market newspapers. Denham said:

We want to encourage children to find out about the world, we want children to access news sites.

So the concern about the impact of the code on media and editorial comment and journalism I think is unfounded. We don't think there will be an impact on news media sites. They are already regulated and we are not a media regulator.

She did not speak any similar reassuring words to any other sector of the internet industry, even though others are likely to be equally devastated by the ICO's 'cunning plan'.

 

 

Clear data abuse proves too entrenched for the ICO to handle...

ICO reports on adtech snooping on, and profiling internet users without their consent


Link Here 25th June 2019

In recent months we've been reviewing how personal data is used in real time bidding (RTB) in programmatic advertising, engaging with key stakeholders directly and via our fact-finding forum event to understand the views and concerns of those involved.

We're publishing our Update report into adtech and real time bidding which summarises our findings so far.

We have prioritised two areas: the processing of special category data, and issues caused by relying solely on contracts for data sharing across the supply chain. Under data protection law, using people's sensitive personal data to serve adverts requires their explicit consent, which is not happening right now. Sharing people's data with potentially hundreds of companies, without properly assessing and addressing the risk of these counterparties, raises questions around the security and retention of this data.

We recognise the importance of advertising to participants in this commercially sensitive ecosystem, and have purposely adopted a measured and iterative approach to our review of the industry as a whole so that we can observe the market's reaction and adapt our thinking. However, we want to see change in how things are done. We'll be spending the next six months continuing to engage with the sector, which will give the industry the chance to start making changes based on the conclusions we've come to so far.

Open Rights Group responds

25th June 2019. See article from openrightsgroup.org

The ICO has responded to a complaint brought by Jim Killock and Dr Michael Veale in Europe's 12 billion euro real-time bidding adtech industry. Killock and Veale are now calling on the ICO to take action against companies that are processing data unlawfully.

The ICO has agreed in substance with the complainants' points about the insecurity of adtech data sharing. In particular, the ICO states that:

  • Processing of non-special category data is taking place unlawfully at the point of collection

  • [The ICO has] little confidence that the risks associated with RTB have been fully assessed and mitigated

  • Individuals have no guarantees about the security of their personal data within the ecosystem

However the ICO is proceeding very cautiously and slowly, and not insisting on immediate changes, despite the massive scale of the data breach.

Jim Killock said:

The ICO's conclusions are strong and very welcome but we are worried about the slow pace of action and investigation. The ICO has confirmed massive illegality on behalf of the adtech industry. They should be insisting on remedies and fast.

Dr Michael Veale said:

The ICO has clearly indicated that the sector operates outside the law, and that there is no evidence the industry will correct itself voluntarily. As long as it remains doing so, it undermines the operation and the credibility of the GDPR in all other sectors. Action, not words, will make a difference, and the ICO needs to act now.

The ICO concludes:

Overall, in the ICO's view the adtech industry appears immature in its understanding of data protection requirements. Whilst the automated delivery of ad impressions is here to stay, we have general, systemic concerns around the level of compliance of RTB:

  • Processing of non-special category data is taking place unlawfully at the point of collection due to the perception that legitimate interests can be used for placing and/or reading a cookie or other technology (rather than obtaining the consent PECR requires).
  • Any processing of special category data is taking place unlawfully as explicit consent is not being collected (and no other condition applies). In general, processing such data requires more protection as it brings an increased potential for harm to individuals.
  • Even if an argument could be made for reliance on legitimate interests, participants within the ecosystem are unable to demonstrate that they have properly carried out the legitimate interests tests and implemented appropriate safeguards.
  • There appears to be a lack of understanding of, and potentially compliance with, the DPIA requirements of data protection law more broadly (and specifically as regards the ICO's Article 35(4) list). We therefore have little confidence that the risks associated with RTB have been fully assessed and mitigated.
  • Privacy information provided to individuals lacks clarity whilst also being overly complex. The TCF and Authorized Buyers frameworks are insufficient to ensure transparency and fair processing of the personal data in question and therefore also insufficient to provide for free and informed consent, with attendant implications for PECR compliance.
  • The profiles created about individuals are extremely detailed and are repeatedly shared among hundreds of organisations for any one bid request, all without the individuals' knowledge.
  • Thousands of organisations are processing billions of bid requests in the UK each week with (at best) inconsistent application of adequate technical and organisational measures to secure the data in transit and at rest, and with little or no consideration as to the requirements of data protection law about international transfers of personal data.
  • There are similar inconsistencies about the application of data minimisation and retention controls.
  • Individuals have no guarantees about the security of their personal data within the ecosystem.

 

 

Offsite Article: Just how bad is the ICO's draft age appropriate design code?...


Link Here 6th June 2019
Full story: ICO Age Appropriate Design...ICO calls for age assurance for websites accessed by children
Foreign websites will block UK users altogether rather than be compelled to invest time and money into a nigh-impossible compliance process. By Heather Burns

See article from webdevlaw.uk

 

 

Strangling UK business and endangering people's personal data...

Internet companies slam the data censor's disgraceful proposal to require age verification for large swathes of the internet


Link Here 5th June 2019
Full story: ICO Age Appropriate Design...ICO calls for age assurance for websites accessed by children
The Information Commissioner's Office has for some bizarre reason been given immense powers to censor the internet.

And in an early opportunity to exert its power it has proposed a 'regulation' that would require strict age verification for nearly all mainstream websites that may have a few child readers and some material that may be deemed harmful to very young children, e.g. news websites that may have glamour articles or perhaps violent news images.

In a mockery of 'data protection' such websites would have to implement strict age verification requiring people to hand over identity data to most of the websites in the world.

Unsurprisingly much of the internet content industry is unimpressed. A six week consultation on the new censorship rules has just closed, and according to the Financial Times:

Companies and industry groups have loudly pushed back on the plans, cautioning that they could unintentionally quash start-ups and endanger people's personal data. Google and Facebook are also expected to submit critical responses to the consultation.

Tim Scott, head of policy and public affairs at Ukie, the games industry body, said it was an inherent contradiction that the ICO would require individuals to give away their personal data to every digital service.

Dom Hallas, executive director at the Coalition for a Digital Economy (Coadec), which represents digital start-ups in the UK, said the proposals would result in a withdrawal of online services for under-18s by smaller companies:

The code is seen as especially onerous because it would require companies to provide up to six different versions of their websites to serve different age groups of children under 18.

This means an internet for kids largely designed by tech giants who can afford to build two completely different products. A child could access YouTube Kids, but not a start-up competitor.

Stephen Woodford, chief executive of the Advertising Association -- which represents companies including Amazon, Sky, Twitter and Microsoft -- said the ICO needed to conduct a full technical and economic impact study, as well as a feasibility study. He said the changes would have a wide and unintended negative impact on the online advertising ecosystem, reducing spend from advertisers and so revenue for many areas of the UK media.

An ICO spokesperson said:

We are aware of various industry concerns about the code. We'll be considering all the responses we've had, as well as engaging further where necessary, once the consultation has finished.

 

 

Does destroying the livelihoods of parents protect the children?...

ICO announces another swathe of internet censorship and age verification requirements in the name of 'protecting the children'


Link Here 15th April 2019
This is the biggest censorship event of the year. It is going to destroy the livelihoods of many. It is framed as if it were targeted at Facebook and the like, to sort out their abuse of user data, particularly for kids.

However the kicker is that the regulations will equally apply to all UK-accessed websites that earn at least some money and process user data in some way or other. Even small websites will then be required to default to treating all their readers as children, and only allow more meaningful interaction with them if they verify themselves as adults. The default kids-only mode bans likes, comments, suggestions, targeted advertising etc, even for non-adult content.

Furthermore the ICO expects websites to formally comply with the censorship rules using market researchers, lawyers, data protection officers, expert consultants, risk assessors and all the sort of people that cost a grand a day.

Of course only the biggest players will be able to afford the required level of red tape, so instead of hitting back at Facebook, Google, Amazon and co for misusing data, the rules will further entrench their monopoly position, as they will be the only companies big enough to jump over the government's child protection hurdles.

Another dark day for British internet users and businesses.

The ICO writes in a press release:

Today we're setting out the standards expected of those responsible for designing, developing or providing online services likely to be accessed by children, when they process their personal data.

Parents worry about a lot of things. Are their children eating too much sugar, getting enough exercise or doing well at school? Are they happy?

In this digital age, they also worry about whether their children are protected online. You can log on to any news story, any day to see just how children are being affected by what they can access from the tiny computers in their pockets.

Last week the Government published its white paper covering online harms.

Its proposals reflect people's growing mistrust of social media and online services. While we can all benefit from these services, we are also increasingly questioning how much control we have over what we see and how our information is used.

There has to be a balancing act: protecting people online while embracing the opportunities that digital innovation brings.

And when it comes to children, that's more important than ever. In an age when children learn how to use a tablet before they can ride a bike, making sure they have the freedom to play, learn and explore in the digital world is of paramount importance.

The answer is not to protect children from the digital world, but to protect them within it.

So today we're setting out the standards expected of those responsible for designing, developing or providing online services likely to be accessed by children, when they process their personal data. Age appropriate design: a code of practice for online services has been published for consultation.

When finalised, it will be the first of its kind and set an international benchmark.

It will leave online service providers in no doubt about what is expected of them when it comes to looking after children's personal data. It will help create an open, transparent and protected place for children when they are online.

Organisations should follow the code and demonstrate that their services use children's data fairly and in compliance with data protection law. Those that don't, could face enforcement action including a fine or an order to stop processing data.

Introduced by the Data Protection Act 2018, the code sets out 16 standards of age appropriate design for online services like apps, connected toys, social media platforms, online games, educational websites and streaming services, when they process children's personal data. It's not restricted to services specifically directed at children.

The code says that the best interests of the child should be a primary consideration when designing and developing online services. It says that privacy must be built in and not bolted on.

Settings must be "high privacy" by default (unless there's a compelling reason not to); only the minimum amount of personal data should be collected and retained; children's data should not usually be shared; geolocation services should be switched off by default. Nudge techniques should not be used to encourage children to provide unnecessary personal data, weaken or turn off their privacy settings or keep on using the service. It also addresses issues of parental control and profiling.

The code is out for consultation until 31 May. We will draft a final version to be laid before Parliament and we expect it to come into effect before the end of the year.

Our Code of Practice is a significant step, but it's just part of the solution to online harms. We see our work as complementary to the current initiatives on online harms, and look forward to participating in discussions regarding the Government's white paper.

The proposals are now open for public consultation:

The Information Commissioner is seeking feedback on her draft code of practice Age appropriate design -- a code of practice for online services likely to be accessed by children (the code).

The code will provide guidance on the design standards that the Commissioner will expect providers of online 'Information Society Services' (ISS), which process personal data and are likely to be accessed by children, to meet.

The code is now out for public consultation and will remain open until 31 May 2019. The Information Commissioner welcomes feedback on the specific questions set out below.

You can respond to this consultation via our online survey, or you can download the document below and email it to ageappropriatedesign@ico.org.uk.

Alternatively, print off the document and post to:

Age appropriate design code consultation
Policy Engagement Department
Information Commissioner's Office
Wycliffe House
Water Lane
Wilmslow
Cheshire
SK9 5AF

 

 

Comments: An unelected quango introducing draconian limitations on the internet...

Responses to the ICO internet censorship proposals


Link Here 15th April 2019

Comment: Entangling start ups in red tape

See article from adamsmith.org

Today the Information Commissioner's Office announced a consultation on a draft Code of Practice to help protect children online.

The code forbids the creation of profiles on children, and bans data sharing and nudges of children. Importantly, the code also requires everyone be treated like a child unless they undertake robust age-verification.

The ASI believes that this code will entangle start-ups in red tape, and inevitably end up with everyone being treated like children, or face undermining user privacy by requiring the collection of credit card details or passports for every user.

Matthew Lesh, Head of Research at free market think tank the Adam Smith Institute, says:

This is an unelected quango introducing draconian limitations on the internet with the threat of massive fines.

This code requires all of us to be treated like children.

An internet-wide age verification scheme, as required by the code, would seriously undermine user privacy. It would require the likes of Facebook, Google and thousands of other sites to repeatedly collect credit card and passport details from millions of users. This data collection risks our personal information and online habits being tracked, hacked and exploited.

There are many potential unintended consequences. The media could be forced to censor swathes of stories not appropriate for young people. Websites that cannot afford to develop 'children-friendly' services could just block children. It could force start-ups to move to other countries that don't have such stringent laws.

This plan would seriously undermine the business model of online news and many other free services by making it difficult to target advertising to viewer interests. This would be both worse for users, who are less likely to get relevant advertisements, and journalism, which is increasingly dependent on the revenues from targeted online advertising.

The Government should take a step back. It is really up to parents to keep their children safe online.

Offsite Comment: Web shake-up could force ALL websites to treat us like children

15th April 2019. See article from dailymail.co.uk

The information watchdog has been accused of infantilising web users, in a draconian new code designed to make the internet safer for children.

Web firms will be forced to introduce strict new age checks on their websites -- or treat all their users as if they are children, under proposals published by the Information Commissioner's Office today.

The rules are so stringent that critics fear people could end up being forced to demonstrate their age for virtually every website they visit, or have the services that they can access limited as if they are under 18.

