
Free Speech & Cancel Culture


2023: Oct-Dec


 

Extract: Kind Hearts and Coronets bowdlerised when broadcast on Talking Pictures...

Trigger warnings are bad enough -- but this butchery of classic films is unforgivable. By Simon Heffer


Link Here 23rd November 2023

The word ['nigger'] has been excised from the film I regard as the pinnacle of British cinema, Kind Hearts and Coronets. In a crucial scene, the about-to-be-hanged murderer and his mistress quote the traditional version of the rhyme Eeny, meeny, miny, moe: it is the moment when he realises that she knows he has murdered several members of his family, and might just murder his wife, too, if the mistress can have him reprieved. A TV channel on which I recently rewatched the film eliminated this exchange, rather than show a warning that it includes racially offensive language.

It is patronising to assume most of us don't know that the past is a foreign country; that they did things then that we are enlightened enough not to do now. Otherwise, Bowdler-like, we shall, according to the latest obsession, falsify whatever parts of our heritage culture warriors take exception to. We can't go through life without being offended; but we can and should present our cultural past honestly and in context, and explain the importance of its integrity.

 

 

Offsite Article: The madness of Bob Stewart's hate crime conviction...


Link Here 6th November 2023
When did trading insults become a police matter? By Fraser Myers

See article from spiked-online.com

 

 

No longer appropriate...

The Simpsons drops scenes showing Homer strangling his son


Link Here 4th November 2023
The Simpsons has retired one of its recurring scenes, saying Homer strangling his son Bart is no longer appropriate because times have changed.

The decision was revealed in the 3rd episode of the long-running show's 35th series, in which Homer indicates he is a changed man.

Homer strangling Bart had been a regular feature of the animated comedy sitcom ever since it first aired in 1989. Why you little..., a hapless Homer would often yell if his son had angered him, while squeezing his neck until Bart's eyes bulged out.

Homer's behaviour was addressed during an earlier episode in series 22, when a therapist attempted to get him to see the error of his ways.

 

 

Wrong type of bias...

Ofcom criticises GB News for deviating from the 'right think' line


Link Here 24th October 2023
Full story: Ofcom vs Free Speech...Ofcom's TV censorship extended to criticism of woke political ideas
Martin Daubney (standing in for Laurence Fox)
GB News, 16 June 2023, 19:00

The above current affairs programme dealt with the topic of immigration and asylum policy, in particular in the context of controversy over small boats crossing the English Channel. The presenter, Martin Daubney, gave his own views on this topic and interviewed the leader of the Reform Party, Richard Tice.

Ofcom received a complaint about the programme.

We considered that immigration and asylum policy constituted a matter of major political controversy and a major matter relating to current public policy. When dealing with major matters, all Ofcom licensees must comply with the heightened special impartiality requirements in the Code. These rules require broadcasters to include and give due weight to an appropriately wide range of significant views.

We found that Mr Tice presented his position on a matter of major political controversy and a major matter of current public policy with insufficient challenge, and the limited alternative views presented were dismissed. The programme therefore did not include and give due weight to an appropriately wide range of significant views, as required by the Code.

The Licensee accepted that the content was not compliant with the heightened special impartiality requirements in the Code.

GB News failed to preserve due impartiality, in breach of Rules 5.11 and 5.12 of the Code.

Ofcom recognises that, in accordance with the right to freedom of expression, broadcasters have editorial freedom and can offer audiences innovative forms of discussion and debate ... However... in light of the likely similarity of the views of the participants in this programme on the major matter being discussed, the Licensee should have taken additional steps to ensure that due impartiality was preserved.

We expect GB News to take careful account of this Decision in its compliance of future programming.

 

 

More censors...

Ofcom investigates BitChute as an early test case of internet censorship


Link Here 7th October 2023
Full story: Ofcom Video Sharing Censors...Video on Demand and video sharing
BitChute is a British based video sharing platform that is particularly well known for hosting content that has been banned from more censorial websites, notably YouTube.

The name was conceived from a portmanteau of the words bit, a unit of information in computing, and parachute. At the time of the site's launch, founder Ray Vahey described BitChute as an alternative to mainstream platforms; he believed these platforms had demonstrated increased levels of censorship over the previous few years by banning and demonetising users (barring them from receiving advertising revenue), and tweaking algorithms to send certain content into obscurity. In 2018, the creators of BitChute described themselves as a small team making a stand against Internet censorship because we believe it is the right thing to do.

Of course, right-leaning opinion does not sit well with the British establishment, so it isn't a surprise to see BitChute as an early example of interest from the UK internet censor, Ofcom. Note that censorship powers have already been granted to Ofcom for video sharing platforms stupid enough to be based in Britain. So these platforms are an interesting forerunner to how Ofcom will censor the wider internet when powers from the Online Censorship Bill are enacted.

Ofcom writes:

Investigation into BitChute Limited
Case considered: 3 October 2023

Summary

Compliance assurances from BitChute regarding its obligations under Part 4B of the Communications Act 2003: Improvements to the measures BitChute has in place to protect users from videos containing harmful material.

Ofcom's role is to ensure video-sharing platforms (VSPs) based in the UK have appropriate systems and processes in place as required under Part 4B of the Act to effectively protect their users from harmful video content in scope of the VSP regime.

In May 2022, a far-right extremist carried out a racially motivated attack in Buffalo, New York. Ofcom conducted analysis of the measures in place to protect users from harmful material on several VSPs, including BitChute, in light of this incident.

Our analysis of BitChute's platform raised concerns that some of its measures were not effectively protecting users from encountering videos related to terrorism and other harmful material prohibited under the VSP regime.

Following a period of close engagement with BitChute to discuss its compliance with its obligations under Part 4B of the Communications Act 2003, it has made some important changes and also committed to further improvements to protect users from harmful material.

Background

On 14 May 2022, a far-right extremist carried out a racially motivated attack in Buffalo, New York, killing ten people and wounding three others. The attacker livestreamed the shooting online, and versions of the footage were distributed on multiple online services, including BitChute and other UK-based VSPs that we currently regulate. This resulted in UK users being potentially exposed to harmful material related to terrorism and material likely to incite violence and hatred.

Ofcom's role is to ensure VSPs have appropriate systems and processes in place as required under Part 4B of the Act to effectively protect their users from harmful video content in scope of the VSP regime. Our approach to securing compliance focuses on oversight, accountability, and transparency, working with the industry where possible to drive improvements, as well as taking formal enforcement action where appropriate.

Our concerns

In the weeks following the Buffalo attack, we engaged with relevant VSPs and the wider industry to learn more about how platforms can set up internal systems and processes to prevent the livestreaming of such attacks and protect users from the sharing of associated video content. In October 2022, Ofcom published a report on the Buffalo attack to explore how footage of the attack, and related material, came to be disseminated online and to understand the implications of this for platforms' efforts to keep people safe online.

Our analysis raised concerns that BitChute's reporting and flagging measures were not effectively protecting users from encountering videos related to terrorism and other harmful material prohibited under the VSP regime. In particular, Ofcom was concerned that BitChute's reporting function was not open to non-registered users, and that the capacity and coverage of BitChute's content moderation team was insufficient to enable it to respond promptly to reports of harmful content.

BitChute's commitments

In response to our concerns, BitChute has made some important changes to its reporting and flagging mechanisms and to the content moderation processes which underpin these, as well as committing to further changes.

1. Coverage and capacity of content moderation

In our 2022 VSP Report, published in October, we found that all VSPs, including BitChute, have adequate terms and conditions that prohibit material that would come within the scope of laws relating to terrorism, racism, and xenophobia, as well as material likely to incite violence or hatred.

However, the Buffalo attack exposed key deficiencies in BitChute's ability to effectively enforce its terms and conditions relating to hate and terror content: footage of the shooting was easily accessible on the platform in the days after the attack, and we learnt that the platform's content moderation team was modest in size and limited to certain working hours. This restricted BitChute's ability to respond quickly to reports that footage was on the platform following the attack.

BitChute has committed to triple the capacity of its moderation team by taking on more human moderators. It is also extending the coverage of its moderation team by increasing the number of hours that moderators are available to review reports and has committed to having a safety team operational 24/7 in autumn 2023.

2. User reporting and flagging mechanisms

Prior to the Buffalo attack, BitChute had reporting and flagging mechanisms in place to allow users to report potentially harmful content. However, on-platform flagging was only available to users who had a registered BitChute account. While all users (registered or unregistered) were able to report content by sending an email to BitChute, we were concerned that requiring non-registered users to email the platform, rather than click a reporting button next to the video, introduces a layer of friction to the reporting process that could disincentivise the user from making a report and increase the time taken to respond to reports.

As a result of our remediation work, BitChute has changed the design of its platform to allow non-registered users to directly report potentially harmful content. It has also updated its user-facing guidelines to set out more clearly what registered and non-registered users can expect from the flagging and reporting process.

3. Measuring effectiveness

BitChute has also committed to collecting additional metrics to measure the impact of changes made to its systems and processes, including the volume of content review reports raised each day and average response time in minutes for content reports. These metrics will help BitChute and Ofcom to evaluate the effectiveness of the platform's measures more easily.

We have also encouraged BitChute to implement additional reporting on risk metrics, which measure the risk of harmful material being encountered on the platform, and process metrics, which measure the effectiveness of BitChute's moderation systems.

Our response

Taking into account BitChute's willingness to make timely improvements to its systems and processes to directly address the concerns we identified following the Buffalo incident, and our desire to work with industry to secure changes that protect users, we have decided not to open an investigation against BitChute into its compliance with its duties under Part 4B of the Communications Act 2003 at this time. We will, however, closely monitor the implementation of the proposed changes and the impact these changes have on user safety.

We also note that, on 14 June 2023, BitChute became a member of the Global Internet Forum to Counter Terrorism (GIFCT). GIFCT is a cross-industry initiative designed to prevent terrorists and violent extremists from exploiting digital platforms. Whilst we do not consider this an indicator of compliance, it is an encouraging step -- GIFCT has rigorous standards for membership, including demonstrating "a desire to explore new technical solutions to counter terrorist and violent extremist activity online" and "support for expanding the capacity of civil society organisations to challenge terrorism and violent extremism".

While we welcome BitChute's commitments to further improvements and measuring their effectiveness, we are aware of reports -- some of which have been communicated to us directly -- alleging that content likely to incite violence and hatred continues to be uploaded to BitChute, can be accessed easily, and may pose significant risks to users.

It is important to note that the VSP regime is a systems and processes regime, meaning the presence of harmful video content on a service is not in itself a breach of the rules. Accordingly, Ofcom's focus is to drive improvements to platforms' systems and processes to minimise the risks of users encountering harmful videos online in the first place.

However, such content can be indicative of an underlying issue with the user protections in place, and we will therefore continue to monitor BitChute closely to assess whether the changes it has made to its user reporting and content moderation systems result in tangible improvements to user safety. If we find that, despite BitChute's assurances and improvements, users are not being adequately protected from the categories of harmful material covered by the VSP regime, we will not hesitate to take further action, including formal enforcement action if necessary.

