
UK Internet Censorship


2020: Jan-March


 

Take your medicine, stay home for 3 months, and don't worry about the depression...

UK government to censor quack cures for coronavirus


Link Here 31st March 2020
The UK government is reported to be actively working with social media to remove coronavirus fake news and harmful content.

Social media companies have responded by introducing several sweeping rule changes that crack down on any dissenting opinions and push users to what they deem to be authoritative or credible sources of information. And now the BBC is reporting that the UK government will be working with these social media companies to remove what it deems to be fake news, harmful content, and misinformation related to the coronavirus.

The report doesn't specify how the UK government will determine what qualifies as fake news or harmful content.

Twitter has updated its rules around the coronavirus, targeting users who deny expert guidance. The company has also forced some users to remove jokes about the virus.

 

 

There are more important harms to be thinking about than Pornhub...

Miserable MPs whinge about an uptick in people entertaining themselves on Pornhub during the coronavirus lockdown


Link Here 27th March 2020
British MPs have claimed that measures to reform and regulate the porn industry have faltered, putting vulnerable people at risk.

Last year attempts to introduce age verification systems into open access porn sites to stop children being able to access extreme online content stalled, and MPs are warning that regulation proposed in a new online harms bill, currently at consultation stage in parliament, does not go far enough.

Tracy Brabin, the shadow culture secretary, whinged:

The online harms bill doesn't go far enough. We have to get control over this industry. We have a duty of care to young people whose videos are being shared who might not want them shared, and ... to potential victims of sex trafficking and rape.

MPs from both sides of the political divide agree. Conservative MP Maria Miller, chair of the women and equalities committee, said: These are hugely important issues and [the online harms bill] is taking too long, we have been talking about this for two years now. She said the promised duty of care should include a way to hold companies to account if unlawful material is posted.

Activist Laila Mickelwait, part of a group of activists at Exodus Cry, told the Guardian: Pornhub handing out 'free' premium content is a way for them to cash in on those around the world impacted by the pandemic. Pornhub is collecting an incredible amount of user data including IP addresses by allowing web beacons and other special information targeting technology on all user devices, and monetising it for their own gain.

 

 

Worthy but blinkered...

Independent report on child abuse material recommends strict age/identity verification for social media


Link Here 14th March 2020
The Independent Inquiry into Child Sexual Abuse, chaired by Professor Alexis Jay, was set up because of serious concerns that some organisations had failed, and were continuing to fail, to protect children from sexual abuse. It describes its remit as:

Our remit is huge, but as a statutory inquiry we have unique authority to address issues that have persisted despite previous inquiries and attempts at reform.

The inquiry has just published its report with the grandiose title: The Internet.

It has considered many aspects of child abuse and come up with the following short list of recommendations:

  1. Pre-screening of images before uploading

    The government should require industry to pre-screen material before it is uploaded to the internet to prevent access to known indecent images of children.
     
  2. Removal of images

    The government should press the WeProtect Global Alliance to take more action internationally to ensure that those countries hosting indecent images of children implement legislation and procedures to prevent access to such imagery.
     
  3. Age verification

    The government should introduce legislation requiring providers of online services and social media platforms to implement more stringent age verification techniques on all relevant devices.
     
  4. Draft child sexual abuse and exploitation code of practice

    The government should publish, without further delay, the interim code of practice in respect of child sexual abuse and exploitation as proposed by the Online Harms White Paper (published April 2019).

But it should be noted that the inquiry gave not even a passing mention to some of the privacy issues that would have far reaching consequences should age verification be required for children's social media access.

Perhaps the authorities should recall that age verification for porn failed because the lawmakers were only thinking of the children, and didn't give even a moment of passing consideration to the privacy of the porn users. The lawmakers' blinkeredness resulted in the failure of their beloved law.

Has anyone even considered the question of what will happen if they ban kids from social media? An epidemic of tantrums? The collapse of social media companies? Kids going back to hanging around on street corners? Kids finding more underground websites to frequent? Or playing violent computer games all day instead?

 

 

Yet another example demonstrating the dangers of identifying yourself as a porn user...

Virgin Media details customer porn access data that it irresponsibly made openly available on the internet


Link Here 6th March 2020
Full story: BBFC Internet Porn Censors...BBFC: Age Verification We Don't Trust
A customer database left unsecured online by Virgin Media contained details linking some customers to pornography and explicit websites.

The researchers who first discovered the database told the BBC that it contained more information than Virgin Media suggested. Such details could be used by cyber-criminals to extort victims.

Virgin revealed on Thursday that one of its marketing databases containing details of 900,000 people was open to the internet and had been accessed on at least one occasion by an unknown user.

On Friday, it confirmed that the database contained details of about 1,100 customers who had used an online form to ask for a particular website to be blocked or unblocked. It said it was in the process of contacting customers again about specific data that may have been stolen.

When it first confirmed the data breach on Thursday, Virgin Media warned the public that the database contained phone numbers, home addresses and emails. However, the company did not disclose that the database contained more intimate details.

A representative of TurgenSec, the research company, said Virgin Media's security had been far from adequate. The information was in plain text and unencrypted, which meant anyone browsing the internet could clearly view and potentially download all of this data without needing any specialised equipment, tools or hacking techniques.

A spokeswoman for the ICO said it was investigating, and added:

People have the right to expect that organisations will handle their personal information securely and responsibly. When that doesn't happen, we advise people who may have been affected by data breaches to be vigilant when checking their financial records.

Virgin Media said it would be emailing those affected, in order to warn them about the risks of phishing, nuisance calls and identity theft. The message will include a reminder not to click on unknown links in emails, and not to provide personal details to unverified callers.

 

 

Extract: Online harms harms trade negotiations...

Eye-watering fines or jailing directors for not protecting kids from perceived online social media harms isn't sitting comfortably with negotiating a free trade deal with the US


Link Here 23rd February 2020

A Times article has popped up under the headline Boris Johnson set to water down curbs on tech giants.

It had all the hallmarks of an insider briefing, opening with the following:

The prime minister is preparing to soften plans for sanctions on social media companies amid concerns about a backlash from tech giants.

There is a very pro-tech lobby in No 10, a well-placed source said. They got spooked by some of the coverage around online harms and raised concerns about the reaction of the technology companies. There is a real nervousness about it.

Read the full article from johncarr.blog

 

 

Offsite Article: This is state censorship of the internet...


Link Here 20th February 2020
UK government plans to tackle online harms pose a huge threat to free speech. By Andrew Tettenborn

See article from spiked-online.com

 

 

The DCMS announces its full censorship team line up...

Captain Oliver Dowden leads centre forward Matt Warman and own goal kicker Nigel Huddleston. Taking offence will be Caroline Dinenage. John Whittingdale is on the right wing and Diana Barran takes the other place


Link Here 18th February 2020
The Department for Digital, Culture, Media and Sport (DCMS) has welcomed a number of new and returning ministers, following appointments made by Prime Minister Boris Johnson.

Oliver Dowden, Secretary of State for Digital, Culture, Media and Sport.

The Secretary of State has overall responsibility for strategy and policy across the department, including its management of Brexit.
Caroline Dinenage, Minister of State for Digital and Culture:
  • Online Harms and Security
  • Digital and Tech Policy including Digital Skills
  • Creative Industries
  • Arts and Libraries
  • Museums and Cultural Property
  • Festival 2022
John Whittingdale, Minister of State for Media and Data:
  • Media
  • Oversight of EU negotiations
  • Overall international strategy including approach to future trade deals
  • Data and the National Archives
  • Public Appointments
Matt Warman, Parliamentary Under Secretary of State for Digital Infrastructure:
  • Broadband Delivery UK (BDUK)
  • Gigabit delivery programme
  • Mobile coverage
  • Telecoms supply chain
  • Cyber Security
Nigel Huddleston, Parliamentary Under Secretary of State for Sport, Tourism and Heritage:
  • Sport
  • Commonwealth Games
  • Gambling and Lotteries
  • Tourism and Heritage
  • Lead Secondary Legislation Minister (including EU Exit SIs)
Diana Barran, DCMS Lords Minister and Parliamentary Under Secretary of State for Civil Society:
  • All DCMS business in the House of Lords
  • Ceremonials
  • Youth and Social Action
  • Office for Civil Society
  • Loneliness

 

 

New government internet censors...

Oliver Dowden takes over as the Culture Secretary, Julian Knight takes over the chair of the DCMS Select Committee and Ofcom is appointed as the AVMS internet censor


Link Here 16th February 2020
Oliver Dowden was appointed Secretary of State for Digital, Culture, Media and Sport on 13 February 2020.

He was previously Paymaster General and Minister for the Cabinet Office, and before that, Parliamentary Secretary at the Cabinet Office. He was elected Conservative MP for Hertsmere in May 2015.

The previous Culture Secretary Nicky Morgan will now be spending more time with her family.

There have been no suggestions that Dowden will diverge from the government path of setting out a new internet censorship regime as outlined in its Online Harms white paper.

Another parliamentary appointment that may be relevant is that Julian Knight has taken over as Chair of the DCMS Select Committee, the parliamentary scrutiny body overseeing the DCMS.

Knight seems quite keen on the internet censorship idea and will surely be spurring on the DCMS.

And finally, one more censorship appointment was announced by the Government: Ofcom will regulate video-sharing platforms under the Audiovisual Media Services Directive.

Matt Warman, the Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport, announced:

We also yesterday appointed Ofcom to regulate video-sharing platforms under the audiovisual media services directive, which aims to reduce harmful content on these sites. That will provide quicker protection for some harms and activities and will act as a stepping stone to the full online harms regulatory framework.

In fact this censorship process is set to start in September 2020, and Ofcom has already produced a solution that shadows the age verification requirements of the Digital Economy Act. It may now need rethinking, as some of the enforcement mechanisms, such as ISP blocking, are no longer on the table. The mechanism also only applies to British-based online adult companies providing online video, of which there are hardly any left after being destroyed by the ATVOD regime.

 

 

Offsite Article: The Porn Block Failed...


Link Here 15th February 2020
Now the Next Ofcom Censorship Bandwagon Begins. By Jerry Barnett

See article from sexandcensorship.org

 

 

The fundamental online harm is for British people to speak freely amongst themselves...

The Government will effectively ban British websites from having forums or comment sections by imposing onerous, vague and expensive censorship requirements on those that defiantly continue.


Link Here 12th February 2020
The Government has signalled its approach to introducing internet censorship in a government response to consultation contributions about the Online Harms white paper. A more detailed paper will follow in the spring.

The Government has outlined onerous, vague and expensive censorship requirements on any British website that lets its users post content including speech. Any website that takes down its forums and comment sections etc will escape the nastiness of the new law.

The idea seems to be to force all speech onto a few US and Chinese social media websites that can handle the extensive censorship requirements of the British Government. No doubt this will give the US and Chinese internet giants a market opportunity to start charging for forcibly moderated and censored interaction.

The Government has more or less committed to appointing Ofcom as the state internet censor who will be able to impose massive fines on companies and their fall guy directors who allow speech that the government doesn't like.

On a slightly more positive note, the government seems to have narrowed down its censorship scope from any conceivable thing that could be considered a harm to someone somewhere into a more manageable set that can be defined as harms to children.

The introductory sections of the document read:

Executive summary

1. The Online Harms White Paper set out the intention to improve protections for users online through the introduction of a new duty of care on companies and an independent regulator responsible for overseeing this framework. The White Paper proposed that this regulation follow a proportionate and risk-based approach, and that the duty of care be designed to ensure that all companies have appropriate systems and processes in place to react to concerns over harmful content and improve the safety of their users - from effective complaint mechanisms to transparent decision-making over actions taken in response to reports of harm.

2. The consultation ran from 8 April 2019 to 1 July 2019. It received over 2,400 responses ranging from companies in the technology industry including large tech giants and small and medium sized enterprises, academics, think tanks, children's charities, rights groups, publishers, governmental organisations and individuals. In parallel to the consultation process, we have undertaken extensive engagement over the last 12 months with representatives from industry, civil society and others. This engagement is reflected in the response.

3. This initial government response provides an overview of the consultation responses and wider engagement on the proposals in the White Paper. It includes an in-depth breakdown of the responses to each of the 18 consultation questions asked in relation to the White Paper proposals, and an overview of the feedback in response to our engagement with stakeholders. This document forms an iterative part of the policy development process. We are committed to taking a deliberative and open approach to ensure that we get the detail of this complex and novel policy right. While it does not provide a detailed update on all policy proposals, it does give an indication of our direction of travel in a number of key areas raised as overarching concern across some responses.

4. In particular, while the risk-based and proportionate approach proposed by the White Paper was positively received by those we consulted with, written responses and our engagement highlighted questions over a number of areas, including freedom of expression and the businesses in scope of the duty of care. Having carefully considered the information gained during this process, we have made a number of developments to our policies. These are clarified in the 'Our Response' section below.

5. This consultation has been a critical part of the development of this policy and we are grateful to those who took part. This feedback is being factored into the development of this policy, and we will continue to engage with users, industry and civil society as we continue to refine our policies ahead of publication of the full policy response. We believe that an agile and proportionate approach to regulation, developed in collaboration with stakeholders, will strengthen a free and open internet by providing a framework that builds public trust, while encouraging innovation and providing confidence to investors.

Our response Freedom of expression

1. The consultation responses indicated that some respondents were concerned that the proposals could impact freedom of expression online. We recognise the critical importance of freedom of expression, both as a fundamental right in itself and as an essential enabler of the full range of other human rights protected by UK and international law. As a result, the overarching principle of the regulation of online harms is to protect users' rights online, including the rights of children and freedom of expression. Safeguards for freedom of expression have been built in throughout the framework. Rather than requiring the removal of specific pieces of legal content, regulation will focus on the wider systems and processes that platforms have in place to deal with online harms, while maintaining a proportionate and risk-based approach.

2. To ensure protections for freedom of expression, regulation will establish differentiated expectations on companies for illegal content and activity, versus conduct that is not illegal but has the potential to cause harm. Regulation will therefore not force companies to remove specific pieces of legal content. The new regulatory framework will instead require companies, where relevant, to explicitly state what content and behaviour they deem to be acceptable on their sites and enforce this consistently and transparently. All companies in scope will need to ensure a higher level of protection for children, and take reasonable steps to protect them from inappropriate or harmful content.

3. Services in scope of the regulation will need to ensure that illegal content is removed expeditiously and that the risk of it appearing is minimised by effective systems. Reflecting the threat to national security and the physical safety of children, companies will be required to take particularly robust action to tackle terrorist content and online child sexual exploitation and abuse.

4. Recognising concerns about freedom of expression, the regulator will not investigate or adjudicate on individual complaints. Companies will be able to decide what type of legal content or behaviour is acceptable on their services, but must take reasonable steps to protect children from harm. They will need to set this out in clear and accessible terms and conditions and enforce these effectively, consistently and transparently. The proposed approach will improve transparency for users about which content is and is not acceptable on different platforms, and will enhance users' ability to challenge removal of content where this occurs.

5. Companies will be required to have effective and proportionate user redress mechanisms which will enable users to report harmful content and to challenge content takedown where necessary. This will give users clearer, more effective and more accessible avenues to question content takedown, which is an important safeguard for the right to freedom of expression. These processes will need to be transparent, in line with terms and conditions, and consistently applied.

Ensuring clarity for businesses

6. We recognise the need for businesses to have certainty, and will ensure that guidance is provided to help businesses understand potential risks arising from different types of service, and the actions that businesses would need to take to comply with the duty of care as a result. We will ensure that the regulator consults with relevant stakeholders to ensure the guidance is clear and practicable.

Businesses in scope

7. The legislation will only apply to companies that provide services or use functionality on their websites which facilitate the sharing of user generated content or user interactions, for example through comments, forums or video sharing. Our assessment is that only a very small proportion of UK businesses (estimated to account for less than 5%) fit within that definition. To ensure clarity, guidance will be provided by the regulator to help businesses understand whether or not the services they provide or functionality contained on their website would fall into the scope of the regulation.

8. Just because a business has a social media page that does not bring it in scope of regulation. Equally, a business would not be brought in scope purely by providing referral or discount codes on its website to be shared with other potential customers on social media. It would be the social media platform hosting the content that is in scope, not the business using its services to advertise or promote their company. To be in scope, a business would have to operate its own website with the functionality to enable sharing of user-generated content, or user interactions. We will introduce this legislation proportionately, minimising the regulatory burden on small businesses. Most small businesses where there is a lower risk of harm occurring will not have to make disproportionately burdensome changes to their service to be compliant with the proposed regulation.

9. Regulation must be proportionate and based on evidence of risk of harm and what can feasibly be expected of companies. We anticipate that the regulator would assess the business impacts of any new requirements it introduces. Final policy positions on proportionality will, therefore, align with the evidence of risk of harm and impact to business. Business-to-business services have very limited opportunities to prevent harm occurring to individuals and as such will be out of scope of regulation.

Identity of the regulator

11. We are minded to make Ofcom the new regulator, in preference to giving this function to a new body or to another existing organisation. This preference is based on its organisational experience, robustness, and experience of delivering challenging, high-profile remits across a range of sectors. Ofcom is a well-established and experienced regulator, recently assuming high profile roles such as regulation of the BBC. Ofcom's focus on the communications sector means it already has relationships with many of the major players in the online arena, and its spectrum licensing duties mean that it is practised at dealing with large numbers of small businesses.

12. We judge that such a role is best served by an existing regulator with a proven track record of experience, expertise and credibility. We think that the best fit for this role is Ofcom, both in terms of policy alignment and organisational experience - for instance, in their existing work, Ofcom already takes the risk-based approach that we expect the online harms regulator will need to employ.

Transparency

13. Effective transparency reporting will help ensure that content removal is well-founded and freedom of expression is protected. In particular, increasing transparency around the reasons behind, and prevalence of, content removal may address concerns about some companies' existing processes for removing content. Companies' existing processes have in some cases been criticised for being opaque and hard to challenge.

14. The government is committed to ensuring that conversations about this policy are ongoing, and that stakeholders are being engaged to mitigate concerns. In order to achieve this, we have recently established a multi-stakeholder Transparency Working Group chaired by the Minister for Digital and Broadband which includes representation from all sides of the debate, including from industry and civil society. This group will feed into the government's transparency report, which was announced in the Online Harms White Paper and which we intend to publish in the coming months.

15. Some stakeholders expressed concerns about a potential 'one size fits all' approach to transparency, and the material costs for companies associated with reporting. In line with the overarching principles of the regulatory framework, the reporting requirements that a company may have to comply with will also vary in proportion with the type of service that is being provided, and the risk factors involved. To maintain a proportionate and risk-based approach, the regulator will apply minimum thresholds in determining the level of detail that an in-scope business would need to provide in its transparency reporting, or whether it would need to produce reports at all.

Ensuring that the regulator acts proportionately

16. The consideration of freedom of expression is at the heart of our policy development, and we will ensure that appropriate safeguards are included throughout the legislation. By taking action to address harmful online behaviours, we are confident that our approach will support more people to enjoy their right to freedom of expression and participate in online discussions.

17. At the same time, we also remain confident that proposals will not place an undue burden on business. Companies will be expected to take reasonable and proportionate steps to protect users. This will vary according to the organisation's associated risk, first and foremost, size and the resources available to it, as well as by the risk associated with the service provided. To ensure clarity about how the duty of care could be fulfilled, we will ensure there is sufficient clarity in the regulation and codes of practice about the applicable expectations on business, including where businesses are exempt from certain requirements due to their size or risk.

18. This will help companies to comply with the legislation, and to feel confident that they have done so appropriately.

Enforcement

19. We recognise the importance of the regulator having a range of enforcement powers that it uses in a fair, proportionate and transparent way. It is equally essential that company executives are sufficiently incentivised to take online safety seriously and that the regulator can take action when they fail to do so. We are considering the responses to the consultation on senior management liability and business disruption measures and will set out our final policy position in the Spring.

Protection of children

20. Under our proposals we expect companies to use a proportionate range of tools including age assurance, and age verification technologies to prevent children from accessing age-inappropriate content and to protect them from other harms. This would achieve our objective of protecting children from online pornography, and would also fulfil the aims of the Digital Economy Act.

 

 

I don't believe the government's new internet harm vaccine will work!...

The UK government has been briefing the press about its upcoming internet censorship bill


Link Here 6th February 2020
The UK government has hinted at its thoughts on its internet censorship plans and has also been giving clues about the schedule.

A first announcement seems to be due this month. It seems that the government is planning a summer bill and implementation within about 18 months.

The plans are set to be discussed in Cabinet on Thursday and are due to be launched to coincide with Safer Internet Day next Tuesday when Baroness Morgan will also publish results of a consultation on last year's White Paper on online harms.

The unelected Nicky Morgan proposes the new regime should mirror regulation in the financial sector, known as senior management liability, where firms have to appoint a fall guy director to take personal responsibility for ensuring they meet their legal duties. Such directors would face fines and criminal prosecution for breaches.

Ofcom will advise on potential sanctions against the directors, ranging from enforcement notices and professional disqualification to fines and criminal prosecution. Under the plans, Ofcom will also draw up legally enforceable codes of practice setting out what the social media firms will be expected to do to protect users from loosely defined online harms that may not even be illegal.

Other legal harms to be covered by codes are expected to include disinformation that causes public harm such as anti-vaccine propaganda, self-harm, harassment, cyberbullying, violence and pornography where there will be tougher rules on age verification to bar children.

Tellingly, proposals to include real and actual financial harms, such as fraud, in the codes have been dropped.

Ministers have yet to decide whether to give the internet censor the power to block website access for UK internet users, but this option seems out of favour, maybe because it would result in massive numbers of people moving to the encrypted internet, making it harder for the authorities to snoop on people's internet activity.

 

 

Extract: Porn is not the root of all evil...

As a sociology professor, I've studied and conducted my own research about the myths parents have accepted. Let me tell you, adult entertainment isn't nearly as damaging as poor sex education


Link Here 5th February 2020
Full story: BBFC Internet Porn Censors...BBFC: Age Verification We Don't Trust

  A new British Board of Film Classification (BBFC) study has found that British teenagers regularly watch porn, and that parents are unaware or in denial about it. Surprising? No. But the results do reveal something interesting about how readily we accept so many misconceptions about porn.

As media reports on the study indicate, young people are watching porn without their parents knowing. My own research found that young people chose to watch porn and found ways around their parents' strategies to stop them. The problems they encountered related to their parents' punishments and not their porn consumption.

The theory of why porn is harmful is that consumers will start to adopt troubling aspects of it in their attitudes and behaviours. But that's a limited and simplistic perspective of how people watch porn. It is a monkey see, monkey do theory of consumption where people are passive consumers of a script that they entirely accept and then act out themselves. It is far easier to blame porn for young people's sexual interests than to recognize that young people are interested in sex.

We are now almost inundated with research that counters the harmful porn narrative. Research is also now focusing on the potential benefits of watching porn. Just as the BBFC study reportedly finds young people watching porn as a form of sex education, research documents several educational benefits for young people: helping them understand their sexual identities, explore sexual fantasies in a safe environment, and educate themselves about sexual health.

...Read the full article from independent.co.uk

 

 

The ethics of censorship...

DCMS group calls for new law in the Online Harms Bill to give the government oversight into algorithms used by social media companies


Link Here 4th February 2020
The Centre for Data Ethics and Innovation (CDEI) is part of the Department for Digital, Culture, Media & Sport. It is tasked by the Government with connecting policymakers, industry, civil society, and the public to develop the 'right' governance regime for data-driven technologies.

The group has just published its final report into the control of social media and its 'algorithms', in time for its suggestions to be incorporated into the government's upcoming internet censorship bill.

Maybe the term 'algorithm' has been used to imply some sort of manipulative menace that secretly drives social media. In fact the algorithm isn't likely to be far away from: Give them more of what they like, and maybe also try them with what their mates like. No doubt the government would prefer something more like: Give them more of what the government likes.
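As a purely illustrative sketch (not anything disclosed by the platforms or described in the CDEI report), the kind of "give them more of what their mates like" logic being referred to can be expressed in a few lines of collaborative filtering; all names and data here are hypothetical:

    from collections import Counter

    # Hypothetical viewing history: user -> topics they engaged with.
    HISTORY = {
        "alice": {"cats", "baking", "politics"},
        "bob":   {"cats", "gaming"},
        "carol": {"baking", "politics", "gardening"},
    }

    def similarity(a, b):
        # Jaccard overlap between two users' tastes (0..1).
        return len(a & b) / len(a | b) if a | b else 0.0

    def recommend(user, top_n=3):
        # Weight topics seen by similar users, skipping topics this
        # user has already seen: "more of what their mates like".
        seen = HISTORY[user]
        scores = Counter()
        for other, topics in HISTORY.items():
            if other == user:
                continue
            weight = similarity(seen, topics)
            for topic in topics - seen:
                scores[topic] += weight
        return [topic for topic, _ in scores.most_common(top_n)]

    print(recommend("bob"))  # e.g. ['baking', 'politics', 'gardening']

Nothing in such a scheme is especially secret or sinister; the point is that recommendation is mostly mechanical popularity-and-similarity weighting.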

Anyway the press release reads:

The CDEI publishes recommendations to make online platforms more accountable, increase transparency, and empower users to take control of how they are targeted. These include:

  • New systemic regulation of the online targeting systems that promote and recommend content like posts, videos and adverts.

  • Powers to require platforms to allow independent researchers secure access to their data to build an evidence base on issues of public concern - from the potential links between social media use and declining mental health, to its role in incentivising the spread of misinformation

  • Platforms to host publicly accessible online archives for 'high-risk' adverts, including politics, 'opportunities' (e.g. jobs, housing, credit) and age-restricted products.

  • Steps to encourage long-term wholesale reform of online targeting to give individuals greater control over how their online experiences are personalised.

The CDEI recommendations come as the government develops proposals for online harms regulation.

The Centre for Data Ethics and Innovation (CDEI), the UK's independent advisory body on the ethical use of AI and data-driven technology, has warned that people are being left in the dark about the way that major platforms target information at their users, in its first report to the government.

The CDEI's year long review of online targeting systems - which use personal information about users to decide which posts, videos and adverts to show them - has found that existing regulation is out of step with the public's expectations.

A major new analysis of public attitudes towards online targeting, conducted with Ipsos MORI, finds that people welcome the convenience of targeting systems, but are concerned that platforms are unaccountable for the way their systems could cause harm to individuals and society, such as by increasing discrimination and harming the vulnerable. The research highlighted most concern was related to social media platforms.

The analysis found that only 28% of people trust platforms to target them in a responsible way, and when they try to change settings, only one-third (33%) of people trust these companies to do what they ask. 61% of people favoured greater regulatory oversight of online targeting, compared with 17% of people who support self-regulation.

The CDEI's recommendations to the government would increase the accountability of platforms, improve transparency and give users more meaningful control of their online experience.

The recommendations strike a balance by protecting users from the potential harms of online targeting, without inhibiting the kind of personalisation of the online experience that the public find useful. Clear governance will support the development and take-up of socially beneficial applications of online targeting, including by the public sector.

The report calls for internet regulation to be developed in a way that promotes human rights-based international norms, and recommends that the online harms regulator should have a statutory duty to protect and respect freedom of expression and privacy.

And from the report:

Key recommendations

Accountability

The government's new online harms regulator should be required to provide regulatory oversight of targeting:

  • The regulator should take a "systemic" approach, with a code of practice to set standards, and require online platforms to assess and explain the impacts of their systems.

  • To ensure compliance, the regulator needs information gathering powers. This should include the power to give independent experts secure access to platform data to undertake audits.

  • The regulator's duties should explicitly include protecting rights to freedom of expression and privacy.

  • Regulation of online targeting should encompass all types of content, including advertising.

  • The regulatory landscape should be coherent and efficient. The online harms regulator, ICO, and CMA should develop formal coordination mechanisms.

The government should develop a code for public sector use of online targeting to promote safe, trustworthy innovation in the delivery of personalised advice and support.

Transparency

  • The regulator should have the power to require platforms to give independent researchers secure access to their data where this is needed for research of significant potential importance to public policy.

  • Platforms should be required to host publicly accessible archives for online political advertising, "opportunity" advertising (jobs, credit and housing), and adverts for age-restricted products.

  • The government should consider formal mechanisms for collaboration to tackle "coordinated inauthentic behaviour" on online platforms.

User empowerment

Regulation should encourage platforms to provide people with more information and control:

  • We support the CMA's proposed "Fairness by Design" duty on online platforms.

  • The government's plans for labels on online electoral adverts should make paid-for content easy to identify, and give users some basic information to show that the content they are seeing has been targeted at them.

  • Regulators should increase coordination of their digital literacy campaigns. The emergence of "data intermediaries" could improve data governance and rebalance power towards users. Government and regulatory policy should support their development.

 

 

Young People, Pornography and Age-verification...

Research commissioned by the BBFC reveals that internet porn is part of normal life for 16 and 17 year olds, just like the over 18s


Link Here 31st January 2020
Full story: BBFC Internet Porn Censors...BBFC: Age Verification We Don't Trust
The most immediately interesting point is that the BBFC has elected not to promote the research that they commissioned and not to publish it on their website. Maybe this simply reflects that the BBFC no longer has the job of internet porn censor. The job looks set to be handed over to Ofcom as part of the government's upcoming online harms bill.

The study by Revealing Reality combined a statistically representative survey of secondary school-age children with in-depth interviews and focus groups with parents. It found that adult material was a prominent feature in British childhood. Almost half of teenagers aged 16 and 17 said they had recently seen pornography, with the researchers believing this figure is substantially lower than the true figure because of respondents' awkwardness when faced with the question.

While 75% of parents did not believe their children would have watched pornography, the majority of these parents' children told the researchers that they had viewed adult material.

The report also found that while parents thought their sons would watch pornography for sexual pleasure, many erroneously believed their daughters would primarily see pornography by accident. It said: This is contrary to the qualitative research findings showing that many girls were also using pornography for sexual pleasure.

The researchers said that one side effect of early exposure to online pornography is that gay, lesbian or bisexual respondents often understood their sexuality at a younger age. It was common for these respondents to start by watching heterosexual pornography, only to realise that they did not find this sexually gratifying and then gradually move to homosexual pornography.

The research very much affirms the government campaign to seek restrictions on porn access for children and notes that such measures as age verification requirements are unsurprisingly supported by parents.

However the research includes a very interesting section on the thoughts of 16 and 17 year olds, who have passed the age of consent and unsurprisingly use porn in just about the same way as adults, having passed the biological and hormonal age of maturity if not the official one.

The report uses the term 'young people' to mean 16 to 18 year olds (included in the survey as speaking about their views and experiences as 16 and 17 year olds). The report notes:

While recognising the benefits of preventing younger children accessing pornography, young people had some concerns about age-verification restrictions. For example, some young people were worried that, in the absence of other adequate sources of sex education, they would struggle to find ways to learn about sex without pornography.

This was felt particularly strongly by LGB respondents in the qualitative research, who believed that pornography had helped them to understand their sexuality and learn about different types of sexual behaviours that they weren't taught in school.

Some young people also felt that the difference between the age of consent for having sex (16) and the age at which age-verification is targeted (18) was contradictory. They also struggled to understand why, for instance, they could serve in the armed forces and have a family and yet be blocked from watching pornography.

Young people also seemed well versed in methods of working around age verification and website blocking:

The majority of parents and young people (aged 16 to 18) interviewed in the qualitative research felt that older children would be able to circumvent age-verification by a range of potential online workarounds. Additionally, many 16- to 18-year-olds interviewed in the qualitative work who could not identify a workaround at present felt they would be able to find a potential method for circumventing age-verification if required.

Some of the most commonly known workarounds that older children thought may potentially negate age-verification included:

  • Using a VPN to appear as if you are accessing adult content from elsewhere in the world
  • Torrenting files by downloading the data in chunks
  • Using Tor (the ‘onion’ router) to disguise the user’s location
  • By accessing the dark web
  • By using proxy websites

Maybe they missed another obvious workaround: sharing porn amongst themselves via internet messaging or memory sticks.

 

 

Offsite Article: Harmful government...


Link Here 26th January 2020
Internet regulation is necessary but an overzealous Online Harms bill could harm our rights. By Michael Drury and Julian Hayes

See article from euronews.com

 

 

A Brexit bounty...

UK Government wisely decides not to adopt the EU's disgraceful Copyright Directive that requires YouTube and Facebook to censor people's uploads if they contain even a snippet of copyrighted material


Link Here 25th January 2020
Full story: Copyright in the EU...Copyright law for Europe
Universities and Science Minister Chris Skidmore has said that the UK will not implement the EU Copyright Directive after the country leaves the EU.

Several companies have criticised the disgraceful EU law, which would hold them accountable for not removing copyrighted content uploaded by users.

EU member states have until 7 June 2021 to implement the new reforms, but the UK will have left the EU by then.

It was Article 13 which prompted fears over the future of memes and GIFs - stills, animated or short video clips that go viral - since they mainly rely on copyrighted scenes from TV and film. Critics noted that Article 13 would make it nearly impossible to upload even the tiniest part of a copyrighted work to Facebook, YouTube, or any other site.

Other articles give the news industry total copyright control of news material that has previously been widely used in people's blogs and posts commenting on the news.

Prime Minister Boris Johnson criticised the law in March, claiming that it was terrible for the internet.

Google had campaigned fiercely against the changes, arguing they would harm Europe's creative and digital industries and change the web as we know it. YouTube boss Susan Wojcicki had also warned that users in the EU could be cut off from the video platform.

 

 

Updated: Children likely to prove toxic to a website's monetisation...

ICO backs off a little from an age gated internet but imposes masses of red tape for any website that is likely to be accessed by under 18s


Link Here 23rd January 2020
Full story: ICO Age Appropriate Design...ICO calls for age assurance for websites accessed by children
The Information Commissioner's Office (ICO) has just published its Age Appropriate Design Code.

The draft was published last year and was opened to a public consultation which came down heavily against ICO's demands that website users should be age verified so that the websites could tailor data protection to the age of the user.

Well in this final release the ICO has backed off from requiring age verification for everything, and instead suggests something less onerous called age 'assurance'. The idea seems to be that age can be ascertained from behaviour, eg if a YouTube user watches Peppa Pig all day then one can assume that they are of primary school age.

However this does seem to lead to a load of contradictions, eg age can be assessed by profiling users' behaviour on the site, but the site isn't allowed to profile people until they are old enough to agree to this. The ICO recognises this contradiction but doesn't really help much with a solution in practice.
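For illustration only, here is a toy version of what behaviour-based age 'assurance' might look like; the categories, thresholds and indeed the whole approach are assumptions, since the ICO code does not prescribe any method:

    # Hypothetical behaviour-based age "assurance": guess an age band
    # from the content categories a user engages with.
    CHILD_SIGNALS = {"preschool_cartoons", "toy_unboxing", "playground_games"}
    ADULT_SIGNALS = {"mortgages", "political_news", "whisky_reviews"}

    def likely_age_band(watched):
        child_hits = len(watched & CHILD_SIGNALS)
        adult_hits = len(watched & ADULT_SIGNALS)
        if child_hits > adult_hits:
            return "likely child: apply high-privacy defaults"
        if adult_hits > child_hits:
            return "likely adult: profiling may be offered as opt-in"
        return "unknown: default to child-level protections"

    # The contradiction discussed above: computing this at all means
    # profiling behaviour before knowing whether the user is old
    # enough to consent to being profiled.
    print(likely_age_band({"preschool_cartoons", "toy_unboxing"}))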

The ICO defines the code as only applying to sites likely to be accessed by children (ie websites appealing to all ages are caught by the code even though they are not specifically for children).

On a wider point the code will be very challenging to monetisation methods for general websites. The code requires websites to default to no profiling, no geo-location, no in-game sales etc. It assumes that adults will identify themselves and so enable all these things to happen. However it may well be that adults will quite like this default setting and end up not opting for more, leaving the websites without income.

Note that these rules are in the UK interpretation of GDPR law and are not actually in the European regulation itself. So they are covered by statute, but only in the UK. European competitors have no equivalent requirements.

The ICO press release reads:

Today the Information Commissioner's Office has published its final Age Appropriate Design Code -- a set of 15 standards that online services should meet to protect children's privacy.

The code sets out the standards expected of those responsible for designing, developing or providing online services like apps, connected toys, social media platforms, online games, educational websites and streaming services. It covers services likely to be accessed by children and which process their data.

The code will require digital services to automatically provide children with a built-in baseline of data protection whenever they download a new app, game or visit a website.

That means privacy settings should be set to high by default and nudge techniques should not be used to encourage children to weaken their settings. Location settings that allow the world to see where a child is, should also be switched off by default. Data collection and sharing should be minimised and profiling that can allow children to be served up targeted content should be switched off by default too.

Elizabeth Denham, Information Commissioner, said:

"Personal data often drives the content that our children are exposed to -- what they like, what they search for, when they log on and off and even how they are feeling.

"In an age when children learn how to use an iPad before they ride a bike, it is right that organisations designing and developing online services do so with the best interests of children in mind. Children's privacy must not be traded in the chase for profit."

The code says that the best interests of the child should be a primary consideration when designing and developing online services. And it gives practical guidance on data protection safeguards that ensure online services are appropriate for use by children.

Denham said:

"One in five internet users in the UK is a child, but they are using an internet that was not designed for them.

"There are laws to protect children in the real world -- film ratings, car seats, age restrictions on drinking and smoking. We need our laws to protect children in the digital world too.

"In a generation from now, we will look back and find it astonishing that online services weren't always designed with children in mind."

The standards of the code are rooted in the General Data Protection Regulation (GDPR) and the code was introduced by the Data Protection Act 2018. The ICO submitted the code to the Secretary of State in November and it must complete a statutory process before it is laid in Parliament for approval. After that, organisations will have 12 months to update their practices before the code comes into full effect. The ICO expects this to be by autumn 2021.

This version of the code is the result of wide-ranging consultation and engagement.

The ICO received 450 responses to its initial consultation in April 2019 and followed up with dozens of meetings with individual organisations, trade bodies, industry and sector representatives, and campaigners.

As a result, and in addition to the code itself, the ICO is preparing a significant package of support for organisations.

The code is the first of its kind, but it reflects the global direction of travel with similar reform being considered in the USA, Europe and globally by the Organisation for Economic Co-operation and Development (OECD).

Update: The legals

23rd January 2020. See article from techcrunch.com

Schedule

The code now has to be laid before parliament for approval for a period of 40 sitting days -- with the ICO saying it will come into force 21 days after that, assuming no objections. Then there's a further 12 month transition period after it comes into force.

Obligation or codes of practice?

Neil Brown, an Internet, telecoms and tech lawyer at Decoded Legal explained:

This is not, and will not be, 'law'. It is just a code of practice. It shows the direction of the ICO's thinking, and its expectations, and the ICO has to have regard to it when it takes enforcement action but it's not something with which an organisation needs to comply as such. They need to comply with the law, which is the GDPR [General Data Protection Regulation] and the DPA [Data Protection Act] 2018.

Right now, online services should be working out how to comply with the GDPR, the ePrivacy rules, and any other applicable laws. The obligation to comply with those laws does not change because of today's code of practice. Rather, the code of practice shows the ICO's thinking on what compliance might look like (and, possibly, goldplates some of the requirements of the law too).

Comment: ICO pushes ahead with age gates

23rd January 2020. See article from openrightsgroup.org

The ICO's Age Appropriate Design Code released today includes changes which lessen the risk of widespread age gates, but retains strong incentives towards greater age gating of content.

Over 280 ORG supporters wrote to the ICO about the previous draft code, to express concerns with compulsory age checks for websites, which could lead to restrictions on content.

Under the code, companies must establish the age of users, or restrict their use of data. ORG is concerned that this will mean that adults only access websites when age verified, creating severe restrictions on access to information.

The ICO's changes to the Code in response to ORG's concerns suggest that different strategies to establish age may be used, attempting to reduce the risk of forcing compulsory age verification of users.

However, the ICO has not published any assessment to understand whether these strategies are practical or what their actual impact would be.

The Code could easily lead to Age Verification through the backdoor as it creates the threat of fines if sites have not established the age of their users.

While the Code has many useful ideas and important protections for children, this should not come at the cost of pushing all websites to undergo age verification of users. Age Verification could extend through social media, games and news publications.

Jim Killock, Executive Director of Open Rights Group said:

The ICO has made some useful changes to their code, which make it clear that age verification is not the only method to determine age.

However, the ICO don't know how their code will change adults' access to content in practice. The new code published today does not include an Impact Assessment. Parliament must produce one and assess implications for free expression before agreeing to the code.

Age Verification demands could become a barrier to adults reaching legal content, including news, opinion and social media. This would severely impact free expression.

The public and Parliament deserve a thorough discussion of the implications, rather than sneaking in a change via parliamentary rubber stamping with potentially huge implications for the way we access Internet content.

 

 

Unsafe law...

Elspeth Howe introduces a House of Lords Private Member's Bill to resurrect the deeply flawed and unsafe age verification requirements for adult porn websites


Link Here 22nd January 2020
Full story: BBFC Internet Porn Censors...BBFC: Age Verification We Don't Trust
The age verification for porn requirements included in the 2017 Digital Economy Act were formally cancelled in October 2019. The 2017 Act was deeply flawed in its omission of any effective requirements to keep porn users' identity and browsing data safe. In addition, the regime of enforcing the rules through BBFC censorship and ISP blocking was proving troublesome and expensive.

It is also interesting to note that the upcoming Online Harms bill has been stripped of its ISP blocking enforcement options. I suspect that the police and security forces would rather not see half the population hiding their internet usage behind Tor and VPNs just so they can continue accessing porn.

For whatever reasons, the government quite rightly considered that it would be a whole lot easier just to fine companies when they get it wrong and leave all the expensive technical details of how to do this to the websites themselves. (This approach has worked well for gambling websites, where AV has been achieved without having to employ censors to make them toe the line.)

So I don't think the government will be interested in supporting the virtue-signalling lords, and the bill will not be given time to get anywhere.

Elspeth Howe's short bill was introduced in the House of Lords on 21st January 2020 and reads:

Digital Economy Act 2017 (Commencement of Part 3) Bill

A bill to bring into force the remaining sections of Part 3 of the Digital Economy Act 2017.

1 The Secretary of State must make regulations under section 118(6) (commencement) of the Digital Economy Act 2017 to ensure that all provisions under Part 3 (online pornography) of that Act have come into force before 4 January 2021.

2 Extent, commencement and short title:

  1. This Act extends to England and Wales, Scotland and Northern Ireland.

  2. This Act comes into force on the day on which it is passed.

  3. This Act may be cited as the Digital Economy Act 2017 (Commencement of Part 3) Act 2020.

 

 

Commented: Verified as out of pocket...

Four companies hoping to profit from cancelled porn age verification go to court seeking compensation from the government


Link Here18th January 2020
Full story: BBFC Internet Porn Censors...BBFC: Age Verification We Don't Trust
Four age verification companies have launched legal action to challenge the government's decision to cancel the censorship scheme requiring age verification for access to internet porn. The companies lodged a judicial review at the High Court on Thursday.

The Telegraph understands the companies are arguing the decision was an abuse of power as the move had been approved by parliament. They are also claiming damages, understood to be in the region of £3 million, for losses sustained developing age verification technology.

The four companies behind the judicial review - AgeChecked Ltd, VeriMe, AVYourself and AVSecure - are arguing that the secretary of state only had the power to choose when the scheme came into force, not to scrap it in the form passed by Parliament.

The legal action has been backed by campaigners from the Children's Charities' Coalition for Internet Safety (CCCIS), which represents organisations including the NSPCC and Barnardo's.

The CEO of AVSecure, Stuart Lawley, a British tech entrepreneur who made his fortune in the dotcom boom, said he had personally lost millions creating the technology. He said the company, which is behind other parental control apps such as Ageblock, had been preparing for up to 10 million people signing up for the service on day one.

Comment: Age Verification Judicial Review endangers UK citizens' privacy

18th January 2020. See article from openrightsgroup.org

Reacting to the Judicial Review launched by tech companies to force Age Verification for adult content to be implemented, Jim Killock, Executive Director of the Open Rights Group, said:

These companies are asking us to trust them with records of millions of people's sexual preferences, with huge commercial reasons to use that data for profiling and advertising.

The adult industry has a terrible record on data security. We're being asked to hope they don't repeat the many, many times they have lost personal data, with the result that blackmail scams and worse proliferate.

The government did the responsible thing when it admitted its plans were not ready to proceed. Age Verification must not be pushed forward until there is compulsory privacy regulation put in place.

The companies behind the legal action are not subject to tight privacy regulations. Instead, the government can only ask for voluntary privacy commitments.

General data protection law is not sufficient for this industry as data breaches of this nature cannot be fixed by fines. They need to be prevented by the toughest and most specific regulation available.

 

 

Harmful haste when previous law failures suggest more careful consideration is required...

Sky boss supports a bill just launched in the House of Lords to hasten the appointment of Ofcom as the UK's internet censor


Link Here17th January 2020
Sky's chief executive Jeremy Darroch has urged the Government to speed up its plans for an online censor as a bill to appoint Ofcom to the job was introduced in the House of Lords on Tuesday.

Darroch has written to all MPs asking for their support in establishing an internet censor that will tackle supposed online harms.

Darroch's beef seems to be what he described as the prolific spread of misinformation, online abuse and fake news in last month's general election. He claimed it had shown the damage that unregulated online platforms are doing to our society.

A DCMS spokesperson declined to say how soon it may be before a draft bill is introduced, but Culture Secretary Nicky Morgan pledged in a speech yesterday to develop a media literacy strategy to be published in the summer, which is expected to come before the online harms legislation.

Johnson plans to precede the online harms bill with interim codes of practice ordering tech companies to clamp down on use of their platforms by terrorists and those engaged in child sexual abuse and exploitation.

Tom McNally's Online Harms Reduction Regulator (Report) Bill started its journey in the House of Lords on Tuesday. He has said he prepared the bill to keep up a momentum I fear may be lost and to provide a platform for wider public debate. The bill appoints Ofcom as the UK's internet censor and tasks it with preparing for the introduction of a statutory duty of care obligation for online platforms. Ofcom would have to prepare a report with recommendations on how this should be done, and the Government would be forced to produce its draft bill within a year of the publication of that report.

 

 

But the idea failed because politicians like her didn't give a damn about the safety of adults...

Elspeth Howe threatens a House of Lords private members bill to reactivate the recently cancelled age verification censorship scheme for internet porn


Link Here12th January 2020
Full story: BBFC Internet Porn Censors...BBFC: Age Verification We Don't Trust
Elspeth Howe announced her intentions in a House of Lords debate about the Queen's Speech. She said:

My Lords, I welcome the Government's commitment to introduce its online harms Bill to improve internet safety for all, but, equally, stress that I remain deeply concerned by their failure to implement Part 3 of the Digital Economy Act. The rationale for focusing on the new Bill instead seems to be a desire to put attempts to protect children from pornographic websites on the same footing as attempts to protect them on social media platforms. It is entirely right to seek to promote safety in both contexts, but a basic error to suggest that both challenges should be addressed in the same way. The internet is complicated and one-size-fits-all policies simply will not work.

The focus of what I have read about the Government's plans for online harms revolves around social media companies and fining them if they do not do what they are supposed to do under a new legal duty of care. An article in the Times on 31 December suggested that Ofcom is going to draw up legally enforceable codes of practice that will include protecting children from accessing pornography. This approach may work for social media platforms if they have bases in the UK but it will be absolutely useless at engaging with the challenge of protecting children from pornographic websites.

Initially when the Digital Economy Bill was introduced in another place, the proposal was that statutory age-verification requirements should be enforced through fines, but a cross-party group of MPs pointed out that this would never work because the top 50 pornographic websites accessed in the UK are all based in other jurisdictions. One could certainly threaten fines but it would be quite impossible to enforce them in a way that would concentrate the minds of website providers because, based in other jurisdictions, they could simply ignore them.

Because of that, MPs amended the Bill to give the regulator the option of IP blocking. This would enable the regulator to tell a site based in say, Russia, that if it failed to introduce robust age-verification checks within a certain timeframe, the regulator would block it from accessing the UK market. Children would be protected either by the site being blocked after the specified timeframe or, more probably, by the site deciding that it would make more sense for it to introduce proper age-verification checks rather than risk disruption of its UK income stream. The Government readily accepted the amendment because the case for it was unanswerable.

I say again that I welcome the fact that the Government want to address online safety with respect to social media platforms through their new Bill. This must not, however, be used as an excuse not to proceed with implementing Part 3 of the Digital Economy Act, which provides the very best way of dealing with the different challenge of protecting children from pornographic websites.

The failure to implement this legislation is particularly concerning because, rather than being a distant aspiration, it is all there on the statute book. The only thing standing in the way of statutory age verification with respect to pornographic websites is the Government's delay in laying the BBFC age-verification guidance before Parliament and setting an implementation date. Having the capacity to deal with this problem, thanks to Part 3 of the Digital Economy Act, yet not bothering to avail ourselves of it does not reflect at all well on either the Government or British society as a whole. The Government must stop procrastinating over child safety with respect to pornographic websites and get on with implementing Part 3.

Mindful of that, on 21 January I will introduce my Digital Economy Act 2017 (Commencement of Part 3) Bill, the simple purpose of which will be to implement Part 3 of the Digital Economy Act.

I hope that that will not be necessary and that the Minister will today confirm that, notwithstanding the new online safety Bill, the Government will now press ahead with implementation themselves. I very much look forward to hearing what the Minister has to say.

 

 

Offsite Article: Britain's Digital Nanny State...


Link Here7th January 2020
The way in which the UK is approaching the regulation of social media will undermine privacy and freedom of expression and have a chilling effect on Internet use by everyone in Britain. By Bill Dutton

See article from billdutton.me

