|
Anti-porn campaigner and president of the BBFC calls (disgracefully in a paywalled article) for further censorship of porn
|
|
|
| 25th March 2023
|
|
| From the Telegraph |
Natasha Kaplinsky, anti-porn campaigner and president of the British Board of Film Classification (BBFC), said in an exclusive paywalled article for the Sunday Telegraph that the Government needed not only to introduce tough age verification to
protect children but also to take action to restrict young adults from accessing the welter of illegal violent and abusive porn available online. Kaplinsky, who is also president of the UK's biggest children's charity Barnardo's, is calling for
amendments to the Online Safety Bill, currently before the House of Lords, that would bring the legislative treatment of porn on the internet in line with the restrictions that the BBFC polices in the offline world. Kaplinsky cited a recent
parliamentary report which revealed illegal porn was readily accessible online including depictions of rape, incest and sexual violence. She said: This was because the offline regulation of legitimate porn overseen by
the BBFC was not mirrored online and the Government's Bill as written did not plug this loophole. This meant content that would be illegal to distribute offline would continue to be legally available online.
She claimed that this
attempt at further internet censorship was not an attempt at censorship: To be clear: this is not about limiting the freedom of adults to access legal pornographic material. This is about the
regulation of appalling content that eroticises rape and the violent abuse of women, or which promotes an interest in abusive relationships. There is a big difference. It is only logical that where content is unacceptable offline, we as a society should
say it is unacceptable online too.
Presumably her reference to promoting an interest in abusive relationships alludes to the plethora of 'step family' porn, but it must be noted that the BBFC has passed such material
R18, eg see That's Right, She's My Step Sister...so What from bbfc.co.uk. Attempts to extend the
censorship of porn online are expected next month when the bill comes before the Lords. |
|
UK Government considers banning TikTok over fears of Chinese snooping on users or else controlling their newsfeed
|
|
|
| 15th March 2023
|
|
| See article from lancs.live |
Tom Tugendhat, the UK security minister, said he is awaiting a report from the National Cyber Security Centre (NCSC) before deciding on whether TikTok should be banned or restricted. Under pressure from some senior MPs, Rishi Sunak has hinted that
Britain could follow the US and the EU by banning the social media app from government phones and devices. The Prime Minister said the UK will look at what our allies are doing, with Washington and the European Commission having banned TikTok on staff
phones. Tugendhat was asked if he would go further and order a fully-fledged ban on the app, like those ordered by India and former US president Donald Trump. He responded: Looking at the various different apps people
have on their phones and the implications for them is a hugely important question and I've asked the National Cyber Security Centre to look into this. What certainly is clear is for many young people TikTok is now a news source
and, just as it's quite right we know who owns the news sources in the UK... it's important we know who owns the news sources that are feeding into our phones.
|
|
The government is set to grant itself an 18-week extension to the parliamentary time available to force through its unsafe Internet Censorship Bill
|
|
|
| 13th
March 2023
|
|
| |
The government's Internet 'Safety' Bill is coming under a lot of pressure for its disgraceful intention to compromise internet security for all British people by removing secure encrypted communication used to keep out hackers, blackmailers, scammers and
thieves. Perhaps acknowledging the opposition from security experts, the government is giving itself another 18 weeks to push it through parliament. Otherwise the bill would be in danger of being timed out. The extension will be presented to
parliament tomorrow. |
|
Although the French decision to deem an internet censorship law unconstitutional has passed into internet history, the Constitutional Council's decision provides some instructive comparisons when we examine the UK's Online Safety
Bill.
|
|
|
| 13th March 2023
|
|
| See article from cyberleagle.com by Graham Smith |
|
|
The British censors allow wealthy US streaming giants to self-certify at a very reduced cost, while still bleeding physical media distributors dry
|
|
|
| 12th
March 2023
|
|
| See article from reprobatepress.com See
press release from bbfc.co.uk
|
The British censors allow wealthy streaming giants to self-certify at a reduced cost, while still bleeding physical media distributors dry. The British Board of Film Classification has just issued another self-congratulatory
press release about how they have convinced yet another platform -- this time Amazon Prime -- to take on their ratings rather than having content either unrated or else using non-BBFC standard age classifications. For the BBFC to
allow huge, wealthy corporations to self-certify and use BBFC assets for a small fee (free for up to 100 titles, then from £573.90 plus VAT -- less than it would cost to certify one feature film on disc -- for up to 250 titles a year through to a maximum
of £4,591.22 plus VAT for anyone releasing 5000+ titles a year) while still charging much smaller distributors through the nose and making them pay for every element of a film including all the extras -- well, that seems outrageous. See full
article from reprobatepress.com The BBFC press release reads:
The British Board of Film Classification (BBFC) has announced that it has signed an agreement to enable Prime Video to build on their existing Trust & Safety tools, in order to move towards the in-house production of BBFC age ratings that are in line
with the BBFC's Classification Guidelines. This marks an important next step in the BBFC's long-standing content classification relationship with Prime Video, which aims to provide families across the UK with the information they need to make safe
viewing decisions. Through enhanced dialogue and processes, the BBFC will support Prime Video as they adapt their rating methodologies in the UK to fully reflect the BBFC's classification standards. This will extend the presence
of the BBFC's trusted guidance on the streaming service in the UK. As part of the agreement, the BBFC will share additional expertise and insight into the standards they apply when classifying film, video and TV content. The
BBFC's classification standards are underpinned by a transparent set of published guidelines, which are the result of wide-scale consultations with over 10,000 people across the UK, extensive research, and more than 100 years of experience. The BBFC also
works closely with young people, child psychologists and charities so as to ensure that standards continue to reflect the views and expectations of parents and families across the UK. The guidelines are updated every 4-5 years and the BBFC will consult
on its guidelines this year, with any changes required by the research coming into force in early 2024. The announcement comes as recent BBFC research, conducted by We Are Family, reveals that 90% of parents/caregivers of 4- to
15-year-olds and 80% of teenagers aged 16-19 consider age ratings and content advice to be of equal importance on streaming services as they are for films in the cinema. More generally, the research shows a high demand for both age ratings and content
advice on streaming services, particularly amongst parents and caregivers. Young people also see the value of such guidance: 51% of teens aged 16-19 check content advice before choosing what to watch, and 89% said that they pay more attention to content
advice if choosing for a person younger than them, such as siblings or other family members.
|
|
Internet porn censorship marches across many US states
|
|
|
| 12th March 2023
|
|
| See article
from henricocitizen.com See article from xbiz.com |
The Arkansas House has approved an amendment, prompted by confusing language, to SB 66, a Republican bill that would require age verification before entering a website offering pornography. SB 66 was introduced in January by state Senator Tyler Dees, who later
admitted that his state initiative is only a steppingstone toward the ultimate goal of a federal mandate. A vote in the Arkansas House sent the amended bill back to the Committee on House Rules for further consideration, the Northwest Arkansas
Democrat Gazette reported. Representative Mindy McAlindon told the paper that the amendment was needed to clarify distinctions between 'corporate entities' and 'third party vendors' in the bill. SB 66 is a copycat version of Louisiana's Act 440, a
new law enacted in January after being championed by a religious anti-porn activist Republican legislator. Meanwhile Virginia lawmakers recently passed a bill with near-unanimous support that would require pornography websites to more stringently
verify whether a person is at least 18 before allowing them access to the site. Websites would have to implement more advanced methods of their choosing to verify age, such as requiring users to submit copies of government-issued identification or biometric scans,
or to use other forms of commercial age verification software. Under the bill, a civil cause of action, or a lawsuit, could be brought on behalf of a minor who suffered damages from access to pornographic websites that didn't use age verification
measures. No one spoke in opposition when the bill was debated during the session, but some people took to social media to express their concerns. The bill now heads to Gov. Glenn Youngkin's desk for his signature. |
|
|
|
|
| 12th March 2023
|
|
|
Yes the government can demand that tech companies compromise the security of encrypted communications for all users See article from
untidy.substack.com |
|
YouTube unpicks recent attempts to censor strong language
|
|
|
| 9th March 2023
|
|
| Thanks to Nick See
article from techcrunch.com See
article from support.google.com |
YouTube has been attempting to force its creators into sanitising their strong language. Not because of some sort of moral highgrounding, but because the company wants to maximise its appeal to advertisers who would prefer not to advertise around content
with strong language. Well it seems that the attempt has wound up creators, and now YouTube is rolling back some of its attempts to sanitise strong language. YouTube explains: Updated
Inappropriate language Ad-Friendly Guidelines Our update last November aimed to improve the clarity and enforcement of our Advertiser-friendly content guidelines and make it easier for Creators to monetize brand safe content.
However, we heard concerns from Creators that the new profanity policy actually resulted in a stricter approach than we intended. Effective March 7, we are making the following changes:
Usage of moderate profanity at any time in the video is now eligible for green icons. Usage of stronger profanity, like the f-word, in the first 7 seconds or repeatedly throughout the majority of the
video can now receive limited ads (under the November update, this would have received no ad revenue). See specific examples of moderate and stronger profanity in our Help Center article. Video content using profanity,
moderate or strong, after the first 7 seconds will now be eligible for green icons, unless used repetitively throughout the majority of the video (under the November update, this would have received no ad revenue). We've also
clarified our guidance on how profanity in music is treated; moderate or strong profanity used in background music, backing tracks, intro/outro music can now earn full ad revenue (previously this would have received no ad revenue). -
Use of any profanity (moderate or stronger) in titles and thumbnails will still be demonetized and cannot run ads, as was the case before the update in November.
|
|
Anti-porn campaigners from Parliament call for an end to commercial pornography
|
|
|
| 6th March
2023
|
|
| See article from appg-cse.uk See
report [pdf] from appg-cse.uk |
The grandly named All-Party Parliamentary Group on Commercial Sexual Exploitation (APPG-CSE) is just a group of anti-porn MPs. It is not an official parliamentary committee tasked with monitoring policy on behalf of parliament. Of course it does try to
present itself as something more important than it really is by referring to the people it listens to as 'witnesses' in 'evidence' sessions and publishing its biased opinions as 'reports'. Of course it never listens to any opposing views from sex workers,
film makers, or of course from people who enjoy sex entertainment. Anyway it has just published a moralist diatribe against porn titled: Pornography Regulation: The case for Parliamentary Reform. Predictably its observations and
recommendations are simply to destroy the entire adult pornography business. The campaigners write: On 27 February 2023, the APPG on Commercial Sexual Exploitation launched the findings of its Inquiry on Pornography.
The report, Pornography Regulation: The case for Parliamentary Reform, concludes that the epidemic of male violence against women and girls cannot be ended unless the Government confronts the role pornography plays in fuelling sexual violence.
The report highlights the scale and nature of contemporary online pornography, finding that the user base of pornography is highly gendered, with significantly more men watching pornography than women. Violence against women is
prolific in mainstream pornography, and illegal content -- including videos of child sexual exploitation, rape and sex trafficking victims -- is freely accessible on mainstream pornography websites. The report evidences a
multiplicity of harms connected with the pornography industry. Pornography is found to fuel sexual violence and social and political harms against women and girls, as well as perpetuating racist stereotypes. Children continue to be exposed to online
pornography on an alarming scale, which is an egregious violation of child safeguarding. Meanwhile, sexual coercion is found to be inherent to the commercial production of pornography, with producers commonly adopting exploitative and abusive tactics to
coerce women into being filmed for pornography videos. The inquiry concludes that existing legislation relating to pornography is piecemeal and wholly inadequate with respect to preventing and providing redress for harms
perpetuated as part of the trade. As a result of its inquiry, the APPG on Commercial Sexual Exploitation has made the following recommendations to Government:
Make the regulation of pornography consistent across different online platforms, and between the online and offline spheres. Criminalise the supply of pornography online to children, and legally
require age verification for accessing pornography online. Address pornography as commercial sexual exploitation, and a form of violence against women, in legislation and policy. Legally require
online platforms to verify that every individual featured in pornographic content on their platform is an adult and gave permission for the content to be published there. Give individuals who feature in pornographic material
the legal right to withdraw their consent to material in which they feature being published and/or distributed. Hold exploiters to account by making it a criminal offence to enable or profit from the commercial sexual
exploitation of others. Conduct a comprehensive review of laws on pornography and obscenity.
Offsite Comment: Moral Coercion And Twisted Facts From The UK Parliament's Censorial Fanatics 5th March 2023. See
article from reprobatepress.com
Johnson's group took evidence during their inquiry from the usual suspects -- not just NCOSE but also Gail Dines, the anti-porn academic writer who is like Andrea Dworkin without the writing ability, and Laila Mickelwait, head of anti-sex work Christian
lobbyists Exodus Cry. They did not, you'll be unsurprised to hear, take any evidence from current sex workers or anyone else who might contradict their pre-existing beliefs -- because let's face it, anyone who is part of a group looking at commercial
sexual exploitation led by a notorious anti-porn politician is not exactly going in with an open mind. See full
article from reprobatepress.com
|
|
Children's campaigners claim that EU proposals for responding to child abuse don't go far enough and call for all internet communications to be open to snooping regardless of the safety of internet users from hackers, fraudsters and
thieves
|
|
|
| 6th March 2023
|
|
| See article from ec.europa.eu |
The European Commission proposed new EU rules to prevent and combat child sexual abuse (CSA) in May 2022. Complementing existing frameworks to fight online CSA, the EU proposal would introduce a new, harmonised European structure for assessing and
mitigating the spread of child sexual abuse material (CSAM) online. The thrust of the proposal is to react in a unified way, either to CSAM detected, or else to systems identified most at risk of being used to disseminate such material. However
as is always the case with campaigners, this is never enough. The campaigners basically want everybody's communications to be open to snooping and surveillance without the slightest consideration for people's safety from hackers, identity thieves,
scammers, blackmailers and fraudsters. The European Commission wrote: The Commission is
proposing new EU legislation to prevent and combat child sexual abuse online. With
85 million pictures and videos depicting child sexual abuse reported worldwide in 2021 alone, and many more going unreported, child sexual abuse is pervasive. The COVID-19 pandemic has exacerbated the issue, with the Internet Watch Foundation noting a
64% increase in reports of confirmed child sexual abuse in 2021 compared to the previous year. The current system based on voluntary detection and reporting by companies has proven to be insufficient to adequately protect children and, in any case, will
no longer be possible once the interim solution currently in place expires. Up to 95% of all reports of child sexual abuse received in 2020 came from one company, despite clear evidence that the problem does not only exist on one platform.
To effectively address the misuse of online services for the purposes of child sexual abuse, clear rules are needed, with robust conditions and safeguards. The proposed rules will oblige providers to detect, report and remove child
sexual abuse material on their services. Providers will need to assess and mitigate the risk of misuse of their services and the measures taken must be proportionate to that risk and subject to robust conditions and safeguards. A
new independent EU Centre on Child Sexual Abuse (EU Centre) will facilitate the efforts of service providers by acting as a hub of expertise, providing reliable information on identified material, receiving and analysing reports from providers to
identify erroneous reports and prevent them from reaching law enforcement, swiftly forwarding relevant reports for law enforcement action and by providing support to victims. The new rules will help rescue children from further
abuse, prevent material from reappearing online, and bring offenders to justice. Those rules will include:
Mandatory risk assessment and risk mitigation measures: Providers of hosting or interpersonal communication services will have to assess the risk that their services are misused to disseminate child sexual abuse material or
for the solicitation of children, known as grooming. Providers will also have to propose risk mitigation measures. Targeted detection obligations, based on a detection order: Member States will need to designate
national authorities in charge of reviewing the risk assessment. Where such authorities determine that a significant risk remains, they can ask a court or an independent national authority to issue a detection order for known or new child sexual abuse
material or grooming. Detection orders are limited in time, targeting a specific type of content on a specific service. Strong safeguards on detection: Companies having received a detection order will only be able to
detect content using indicators of child sexual abuse verified and provided by the EU Centre. Detection technologies must only be used for the purpose of detecting child sexual abuse. Providers will have to deploy technologies that are the least
privacy-intrusive in accordance with the state of the art in the industry, and that limit the error rate of false positives to the maximum extent possible. Clear reporting obligations: Providers that have detected
online child sexual abuse will have to report it to the EU Centre. Effective removal: National authorities can issue removal orders if the child sexual abuse material is not swiftly taken down. Internet access
providers will also be required to disable access to images and videos that cannot be taken down, e.g., because they are hosted outside the EU in non-cooperative jurisdictions. Reducing exposure to grooming: The rules
require app stores to ensure that children cannot download apps that may expose them to a high risk of solicitation of children. Solid oversight mechanisms and judicial redress: Detection orders will be issued by
courts or independent national authorities. To minimise the risk of erroneous detection and reporting, the EU Centre will verify reports of potential online child sexual abuse made by providers before sharing them with law enforcement authorities and
Europol. Both providers and users will have the right to challenge any measure affecting them in Court.
The new EU Centre will support:
Online service providers, in particular in complying with their new obligations to carry out risk assessments, detect, report, remove and disable access to child sexual abuse online, by providing indicators to detect child sexual
abuse and receiving the reports from the providers; National law enforcement and Europol, by reviewing the reports from the providers to ensure that they are not submitted in error, and channelling them quickly to law
enforcement. This will help rescue children from situations of abuse and bring perpetrators to justice. Member States, by serving as a knowledge hub for best practices on prevention and assistance to victims, fostering an
evidence-based approach. Victims, by helping them to take down the materials depicting their abuse.
Next steps It is now for the European Parliament and the Council to agree on the proposal. Once adopted, the new Regulation will replace the current
interim Regulation .
Feedback from members of the public on
the proposals is open for a minimum of 8 weeks.
According to child campaigners: On 8 February 2023, the European Parliament's Committee on the Internal Market and Consumer Protection (IMCO)
published its draft report on the European Commission's proposal to prevent and combat child sexual abuse. The draft report seeks a vastly reduced scope for the Regulation. It prioritises the anonymity of perpetrators of abuse over the rights of victims
and survivors of sexual abuse and seeks to reverse progress made in keeping children safe as they navigate or are harmed in digital environments that were not built with their safety in mind. The letter also criticises the removal of age
verification and claims that technology can meet high privacy standards, explaining that the new legislation adds additional safeguards to already effective measures to prevent the spread of this material online. And of course the campaigners
demand that technology companies allow the surveillance of all messages via backdoors to encryption, or perhaps just a ban on encryption. See the letter from the likes of the NSPCC. See
article from iwf.org.uk |
|
|