UK Online Censorship Bill set to continue after 'tweaks'
16th September 2022
See article from techdirt.com
After a little distraction for the royal funeral, the UK's newly elected prime minister has said she will be continuing with the Online Censorship Bill. She said: We will be proceeding with the Online Safety Bill. There
are some issues that we need to deal with. What I want to make sure is that we protect the under-18s from harm and that we also make sure free speech is allowed, so there may be some tweaks required, but certainly he is right that we need to protect
people's safety online.
TechDirt comments: This is just so ridiculously ignorant and uninformed. The Online Safety Bill is a disaster in waiting and I wouldn't be surprised if some websites chose to
exit the UK entirely rather than continue to deal with the law. It won't actually protect the children, of course. It will create many problems for them. It won't do much at all, except make internet companies question whether
it's even worth doing business in the UK.
Former UK Supreme Court judge savages the government's censorship bill
18th August 2022
See article from spectator.co.uk by Jonathan Sumption
Weighing in at 218 pages, with 197 sections and 15 schedules, the Online Safety Bill is a clunking attempt to regulate content on the internet. Its internal contradictions and exceptions, its complex paper chase of definitions, its weasel language
suggesting more than it says, all positively invite misunderstanding. Parts of it are so obscure that its promoters and critics cannot even agree on what it does. The real vice of the bill is that its provisions are not limited to
material capable of being defined and identified. It creates a new category of speech which is legal but harmful. The range of material covered is almost infinite, the only limitation being that it must be liable to cause harm to some people.
Unfortunately, that is not much of a limitation. Harm is defined in the bill in circular language of stratospheric vagueness. It means any physical or psychological harm. As if that were not general enough, harm also extends to anything that may increase
the likelihood of someone acting in a way that is harmful to themselves, either because they have encountered it on the internet or because someone has told them about it. This test is almost entirely subjective. Many things which
are harmless to the overwhelming majority of users may be harmful to sufficiently sensitive, fearful or vulnerable minorities, or may be presented as such by manipulative pressure groups. At a time when even universities are warning adult students
against exposure to material such as Chaucer with his rumbustious references to sex, or historical or literary material dealing with slavery or other forms of cruelty, the harmful propensity of any material whatever is a matter of opinion. It will vary
from one internet user to the next. If the bill is passed in its current form, internet giants will have to identify categories of material which are potentially harmful to adults and provide them with options to cut it out or
alert them to its potentially harmful nature. This is easier said than done. The internet is vast. At the last count, 300,000 status updates are uploaded to Facebook every minute, with 500,000 comments left that same minute. YouTube adds 500 hours of
videos every minute. Faced with the need to find unidentifiable categories of material liable to inflict unidentifiable categories of harm on unidentifiable categories of people, and threatened with criminal sanctions and enormous regulatory fines (up to 10 per cent of global revenue), what is a media company to do? The only way to cope will be to take the course involving the least risk: if in doubt, cut it out. This will involve a huge measure of regulatory overkill. A new era
of intensive internet self-censorship will have dawned. See full article from spectator.co.uk
British Computer Society experts are not impressed by the Online Censorship Bill
15th August 2022
See article from bcs.org
See BCS report [pdf] from bcs.org
Plans to compel social media platforms to tackle online harms are not fit for purpose according to a new poll of IT experts. Only 14% of tech professionals believed the Online Harms Bill was fit for purpose, according to the
survey by BCS, The Chartered Institute for IT. Some 46% said the bill was not workable, with the rest unsure. The legislation would have a negative effect on freedom of speech, most IT specialists (58%)
told BCS. Only 19% felt the measures proposed would make the internet safer, with 51% saying the law would not make it safer to be online. There were nearly 1,300 responses from tech professionals to the
survey by BCS. Just 9% of IT specialists polled said they were confident that legal but harmful content could be effectively and proportionately removed. Some 74% of tech specialists said they felt the bill
would do nothing to stop the spread of disinformation and fake news.
WhatsApp would rather be blocked in Britain than submit to UK demands for encryption backdoors
31st July 2022
See article from bbc.co.uk
The boss of WhatsApp says it will not lower the security of its messenger service. Will Cathcart told the BBC that, if asked by the government to weaken encryption, it would be very foolish to accept. We
continue to work with the tech sector to support the development of innovative technologies that protect public safety without compromising on privacy. End-to-end encryption (E2EE) provides the most robust level of security, because - by
design - only the intended recipient holds the key to decrypt the message, which is essential for private communication. The technology underpins the online exchanges on apps including WhatsApp and Signal and - optionally - on Facebook Messenger and
Telegram. Only the sender and receiver can read those messages - not law enforcement or the technology giants. The UK government wants software on people's phones to scan messages for banned material before they are encrypted and sent. Cathcart explained:
Client-side scanning cannot work in practice. Because millions of people use WhatsApp to communicate across the world, it needs to maintain the same standards of privacy across every country. If
we had to lower security for the world, to accommodate the requirement in one country, that...would be very foolish for us to accept, making our product less desirable to 98% of our users because of the requirements from 2%. What's being proposed is that we - either directly or indirectly through software - read everyone's messages. I don't think people want that.
Ella Jakubowska, policy adviser at campaign group European Digital Rights, said: Client-side scanning is almost like putting spyware on every person's phone. It also creates a backdoor for malicious actors
to have a way in to be able to see your messages.
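The objection Cathcart and Jakubowska raise can be illustrated with a toy Python sketch. The XOR one-time pad and the banned-phrase list below are illustrative stand-ins, not WhatsApp's actual Signal-protocol implementation: the point is only that client-side scanning inspects the plaintext on the device before encryption, so the encryption itself is untouched while the privacy guarantee is not.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy one-time-pad XOR, standing in for a real E2EE cipher."""
    return bytes(b ^ k for b, k in zip(data, key))

# Hypothetical scanner state, for illustration only.
BANNED_PHRASES = [b"forbidden"]
reports = []

def send_e2ee(plaintext: bytes, key: bytes) -> bytes:
    # Pure end-to-end encryption: only ciphertext leaves the device,
    # and only the holder of `key` can recover the plaintext.
    return xor_cipher(plaintext, key)

def send_with_client_side_scanning(plaintext: bytes, key: bytes) -> bytes:
    # Client-side scanning inspects the plaintext *before* encryption.
    # The cipher is unchanged, but a third party now sees the message
    # on the device itself -- the "backdoor" critics describe.
    for phrase in BANNED_PHRASES:
        if phrase in plaintext:
            reports.append(plaintext)  # stand-in for a reporting channel
    return xor_cipher(plaintext, key)
```

In both cases the ciphertext round-trips identically; the difference is entirely in what happens to the plaintext before it is encrypted.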
whilst we still can!
31st July 2022
Offsite Comment: Fixing the UK's Online Safety Bill, part 1: We need answers. 31st July 2022. See article from webdevlaw.uk by Heather Burns
Offsite Comment: The delay to the online safety bill won't make it any easier to please everyone. 17th July 2022. See article from theguardian.com by Alex Hern
Offsite Comment: It’s time to kill the Online Safety Bill for good. Not only is it bad for business, bad for free speech, and, by attacking encryption, bad for online safety. 16th July 2022. See article from spectator.co.uk by Sam Ashworth-Hayes
22nd July 2022
GCHQ boss calls for snooping into people's phones as a backdoor to strong encryption. See article from theregister.com
The government decides against introducing laws to ban loot boxes in video games
19th July 2022
See article from theguardian.com
The video game monetisation method of loot boxes will not be banned in the UK, despite a government consultation claiming evidence of an association between the features and problem gambling. Loot boxes have attracted comparison with gambling because
they allow players to spend money to unlock in-game rewards, such as special characters, weapons or outfits, without knowing exactly what they will get. The features, popular in games such as Call of Duty and the Fifa football series, were
effectively banned in Belgium in 2018, but the censorship culture minister, Nadine Dorries, said the UK would not follow suit. Instead, after a 22-month consultation, she said the government would discuss tougher industry-led protections with the
UK's gaming trade. Dorries explained the decision, saying that legislating to impose curbs or a prohibition on loot boxes as part of an expected overhaul of the UK's gambling laws could have unintended consequences.
For example, legislation to introduce an outright ban on children purchasing loot boxes could have the unintended effect of more children using adult accounts, and thus having more limited parental oversight of their play and
spending, the government said, in a response to the consultation published in the early hours of Sunday morning. While the Department for Digital, Culture, Media and Sport (DCMS) stopped short of proposing legislation, Dorries said:
Children and young people should not be able to purchase loot boxes without parental approval.
7th July 2022
...er the same sleazy party-loving people that gave you one rule for them and one rule for us! See article from reprobatepress.com
Legal analysis of UK internet censorship proposals
5th July 2022
Offsite Article: French lawyers provide the best summary yet. 15th June 2022. See article from taylorwessing.com
Offsite Article: Have we opened Pandora's box? 20th June 2022. See article from tandfonline.com
Abstract: In thinking about the developing online harms regime (in the UK and elsewhere) it is forgivable to think only of how laws placing responsibility on social media platforms to prevent hate speech may benefit
society. Yet these laws could have insidious implications for free speech. By drawing on Germany's Network Enforcement Act I investigate whether the increased prospect of liability, and the fines that may result from breaching the duty of care in the
UK's Online Safety Act - once it is in force - could result in platforms censoring more speech, but not necessarily hate speech, and using the imposed responsibility as an excuse to censor speech that does not conform to their objectives. Thus, in
drafting a Bill to protect the public from hate speech we may unintentionally open Pandora's Box by giving platforms a statutory justification to take more control of the message. See full article from tandfonline.com
Offsite Article: The Online Safety Act - An Act of Betrayal. 5th July 2022. See article from ukcolumn.org by Iain Davis
The Online Safety Bill (OSB) has been presented to the public as an attempt to protect children from online grooming and abuse and to limit the reach of terrorist propaganda. This, however, does not seem to be its primary focus.
The real objective of the proposed Online Safety Act (OSA) appears to be narrative control.