
Ofcom Watch


2023: Oct-Dec


The 2023 Ofcom Top 10...

Ofcom publishes its list of the most complained-about TV programmes


Link Here 28th December 2023
Ofcom has published an end of year review. Ofcom writes:

Over the course of the last year, we received 69,236 complaints about 9,638 cases. That's nearly twice as many complaints as we dealt with in 2022.

In 2023, we published 23 Broadcast and On Demand Bulletins which announced 57 new broadcast standards investigations, as well as the outcome of 46 investigations. We found a total of 35 programmes in breach of our broadcasting rules and are working to conclude the others as quickly as possible. We also published 15 adjudications on complaints from individuals and organisations that complained to us that they had been treated unfairly and/or had their privacy unwarrantably infringed in TV and radio programmes.

We imposed sanctions on four broadcasters for content breaches, including a £40,000 fine to the Islam Channel and £10,000 to Ahlebait TV, both for broadcasting antisemitic content.

We also found GB News in breach of our rules on five occasions: our investigations found it broke our rules that protect audiences from harm twice, and our due impartiality rules three times.

Most complained about programmes of 2023

  • Dan Wootton Tonight, GB News, 26 September 2023 -- 8,867 complaints

    Viewers objected to the misogynistic comments made by Laurence Fox about journalist Ava Evans.

    Ofcom's investigation of this programme under our rules on offence is ongoing.
     

  • King Charles III: The Coronation, ITV1, 6 May 2023 -- 8,421 complaints

    The majority of complaints related to a comment made by actress Adjoa Andoh during the live broadcast, which focused on the 'whiteness' of the Royal Family on the balcony of Buckingham Palace.

    While we understand some viewers had strong feelings about this comment, after careful consideration we concluded that the comment was a personal observation which was part of a wide-ranging panel discussion which also touched on other diversity-related topics, and which contained a range of viewpoints.
     

  • Good Morning Britain, ITV1, 17 October 2023 -- 2,391 complaints

    We carefully assessed complaints about the presenter's line of questioning towards MP Layla Moran.

    We considered his live, unscripted remarks were potentially offensive. However, taking the entire interview into account, and in particular a preceding discussion about Hamas using civilians as human shields, we considered the question sought to explore whether civilians were aware of a potential escalation in hostilities, rather than suggesting that Ms Moran or her family were aware of specific plans for the Hamas attack on 7 October 2023. In her response, Ms Moran spoke about her surprise at the scale and sophistication of the attack. In light of this, we will not be pursuing further.
     

  • Jeremy Vine, Channel 5, 13 March 2023 -- 2,302 complaints

    We carefully considered complaints from viewers about a discussion on the junior doctors' pay dispute.

    While we recognise that some references about progression timelines and corresponding pay-scales were not strictly accurate, we do not consider that the errors were sufficient to have materially misled viewers so as to cause harm.
     

  • Breakfast with Kay Burley, Sky News, 23 November 2023 -- 1,880 complaints

    We carefully considered complaints about the presenter's line of questioning during an interview with Israeli spokesperson, Eylon Levy.

    Taking account of Mr Levy's forceful challenge to the premise of the question about the value of Israeli versus Palestinian lives, and the context of the wider discussion about the terms of the temporary ceasefire, we will not be pursuing further.
     

  • Lee Anderson's Real World, GB News, 29 September 2023 -- 1,697 complaints

    Complaints related to Lee Anderson's interview with Suella Braverman, on the grounds that they are both Conservative MPs.

    We published our assessment of this programme which found that it included an appropriately wide range of significant views on immigration and border control which were given due weight.
     

  • Breakfast with Kay Burley, Sky News, 10 October 2023 -- 1,640 complaints

    Complainants alleged Kay Burley misrepresented comments made by the Palestinian ambassador.

    We are assessing the complaints, before we decide whether or not to investigate.
     

  • Naked Education, Channel 4, 4 April 2023 -- 1,285 complaints

    We understand that some viewers were concerned about this programme, which included pre-watershed nudity.

    In our view, the programme had a clear educational focus, and the young participants reflected positively on their involvement. We also took into account that there were warnings to the audience before the programme aired.
     

  • This Morning, ITV, 18 December 2023 -- 1,092 complaints

    Complaints related to comments made by Vanessa Feltz about coeliac disease.

    We are assessing the complaints, before we decide whether or not to investigate.
     

  • Love Island, ITV2, 9 July 2023 -- 992 complaints

    The majority of complaints about this episode related to bullying against Scott.

    We carefully assessed complaints about this series on a range of issues including alleged bullying, homophobia and racism.

    We recognise that emotionally charged or confrontational scenes can upset some viewers. But, in our view, negative behaviour in the villa was not shown in a positive light. We also took into account that the format of this reality show is well-established and viewers would expect to see highs and lows as couples' relationships are tested.

    Viewers also complained about a contestant being voted off and returning to the programme, but this was an editorial decision for the broadcaster.

 

 

Updated: Sanctioned...

A suicide forum seems to be the first in the crosshairs of the new UK internet censor Ofcom


Link Here 11th November 2023
Ofcom is threatening to block a suicide website linked to 50 UK deaths after the site said it would refuse to abide by the new online censorship laws.

The website, Sanctioned Suicide, is described by Wikipedia as an internet forum known for its open discussion and encouragement of suicide and suicide methods. The forum was created in 2018 after the subreddit r/SanctionedSuicide was banned by Reddit. As of September 2022, the forum had over 25,000 members, receiving nearly 10 million page views that same month.

The BBC has been investigating the forum and reported:

We have discovered that at least six coroners have written to government departments demanding action to shut the forum down. Collating inquest reports, press articles and posts on the forum itself, we have identified at least 50 UK victims. We have learned that at least five police forces are aware of the forum, and have investigated deaths linked to it, but have been unable to take action.

The Online 'Safety' Bill, passed by Parliament last month, is due to get royal assent this week, investing Ofcom with immediate powers to take action against errant social media firms. Ofcom is due to set out its legally enforced code of practice for firms to combat illegal harms, including the promotion of suicide, next month.

An Ofcom spokesman said:

Sites that failed to prevent users coming across such illegal material would face fines of up to 10% of their global turnover, and bosses who persistently ignored warnings and requests for information could face up to two years in jail.

Operators of the site could also face up to 14 years in jail under laws against encouraging or assisting suicide including through online platforms. Because there are victims in the UK, the company bosses could be prosecuted in the UK and brought to the UK to face trial through an extradition request to the US.

Ofcom will also have powers to take out court orders that would enable it to prevent the company from gaining any access to UK users. ISPs would then be required by law to block access to the service in the UK.

It could also order platforms hosting the site to no longer do so and require search engines and social networks to deny it any presence when users look for it.

We expect tech companies to be fully prepared to comply with their new duties when the time comes. It's a serious concern if companies say they are going to ignore the law. If services don't comply, we'll have a broad range of enforcement powers at our disposal to ensure they're held fully accountable for the safety of their users.

The forum responded to UK criticism from the BBC and Ofcom by displaying a front-page message:

Hello Guest,

We will not be following or complying with the Online Safety Bill that was recently signed into law in the UK. This bill will not affect the operations of the site, nor do we have a presence in the UK to receive notice or fines that the UK Government may impose.

We would highly recommend that all users from the UK get some sort of VPN, and you should petition your lawmakers to let them know how you feel about this piece of draconian legislation.

 

Update: Blocked by Sky

31st October 2023. See article from bbc.co.uk

Sky's broadband service has added the Sanctioned Suicide website to its voluntary blocking list. It is not clear what level of blocking applies, or which blocking category the website falls under.

Sky vaguely says the forum will automatically be barred if home users are using its standard filters. The company said it had moved as quickly as possible and blocked the online forum with immediate effect.

A second ISP, TalkTalk, said the website had now been added to its list of inappropriate content and could also be blocked by users. TalkTalk told the BBC the site would now be blocked for any customer with its HomeSafe safety filter activated. It said it was unable to automatically block the site.

 

Update: Self-blocked

11th November 2023. See article from bbc.co.uk

A pro-suicide forum has decided to block itself from users in the UK following pressure from the British internet censor, Ofcom.

The Sanctioned Suicide forum was previously available online without any restrictions. But the forum can now only be viewed by UK users already signed up as members.

Anyone visiting the site is now met with a banner saying content that violates the UK's new Online Safety Act will not be viewable to the British public.

It is unclear whether new users from the UK can still apply for membership. Existing members in the UK do still have access.

It will be interesting to see how many sites respond to British internet censorship by blocking themselves to British users.

 

 

Offsite Article: "You Don't Belong Here!"...


Link Here 11th November 2023
Full story: Online Safety Act...UK Government legislates to censor social media
With 1,500 pages outlining a mountain of suffocating red tape in the name of internet regulation, Ofcom delivers a message to small British internet companies

See article from webdevlaw.uk

 

 

Online Censorship Act...

The Online Unsafety Bill gets Royal Assent and so becomes law


Link Here 29th October 2023
Full story: Online Safety Act...UK Government legislates to censor social media
The Online Safety Bill received Royal Assent on 26th October 2023, heralding a new era of internet censorship.

The new UK internet censor Ofcom was quick off the mark to outline its timetable for implementing the new censorship regime.

Ofcom has set out its plans for putting the online safety laws into practice, and what it expects from tech firms, now that the Online Safety Act has passed. Ofcom writes:

The Act makes companies that operate a wide range of online services legally responsible for keeping people, especially children, safe online. These companies have new duties to protect UK users by assessing risks of harm, and taking steps to address them. All in-scope services with a significant number of UK users, or targeting the UK market, are covered by the new rules, regardless of where they are based.

While the onus is on companies to decide what safety measures they need given the risks they face, we expect implementation of the Act to ensure people in the UK are safer online by delivering four outcomes:

  • stronger safety governance in online firms;

  • online services designed and operated with safety in mind;

  • choice for users so they can have meaningful control over their online experiences; and

  • transparency regarding the safety measures services use, and the action Ofcom is taking to improve them, in order to build trust.

We are moving quickly to implement the new rules

Ofcom will give guidance and set out codes of practice on how in-scope companies can comply with their duties, in three phases, as set out in the Act.

Phase one: illegal harms duties

We will publish draft codes and guidance on these duties on 9 November 2023, including:

  • analysis of the causes and impacts of online harm, to support services in carrying out their risk assessments;

  • draft guidance on a recommended process for assessing risk;

  • draft codes of practice, setting out what services can do to mitigate the risk of harm; and

  • draft guidelines on Ofcom's approach to enforcement.

We will consult on these documents, and plan to publish a statement on our final decisions in Autumn 2024. The codes of practice will then be submitted to the Secretary of State for Science, Innovation and Technology, and subject to their approval, laid before Parliament.

Phase two: child safety, pornography and the protection of women and girls

Child protection duties will be set out in two parts. First, online pornography services and other interested stakeholders will be able to read and respond to our draft guidance on age assurance from December 2023. This will be relevant to all services in scope of Part 5 of the Online Safety Act.

Secondly, regulated services and other interested stakeholders will be able to read and respond to draft codes of practice relating to protection of children, in Spring 2024.

Alongside this, we expect to consult on:

  • analysis of the causes and impacts of online harm to children; and

  • draft risk assessment guidance focusing on children's harms.

We expect to publish draft guidance on protecting women and girls by Spring 2025, when we will have finalised our codes of practice on protection of children.

Phase three: transparency, user empowerment, and other duties on categorised services

A small proportion of regulated services will be designated Category 1, 2A or 2B services if they meet certain thresholds set out in secondary legislation to be made by Government. Our final stage of implementation focuses on additional requirements that fall only on these categorised services. Those requirements include duties to:

  • produce transparency reports;

  • provide user empowerment tools;

  • operate in line with terms of service;

  • protect certain types of journalistic content; and

  • prevent fraudulent advertising.

We now plan to issue a call for evidence regarding our approach to these duties in early 2024 and a consultation on draft transparency guidance in mid 2024.

Ofcom must produce a register of categorised services. We will advise Government on the thresholds for these categories in early 2024, and Government will then make secondary legislation on categorisation, which we currently expect to happen by summer 2024. Assuming this is achieved, we will:

  • publish the register of categorised services by the end of 2024;

  • publish draft proposals regarding the additional duties on these services in early 2025; and

  • issue transparency notices in mid 2025.

 

 

Wrong type of bias...

Ofcom criticises GB News for deviating from the 'right think' line


Link Here 24th October 2023
Full story: Ofcom vs Free Speech...Ofcom's TV censorship extended to criticism of woke political ideas
Martin Daubney (standing in for Laurence Fox)
GB News, 16 June 2023, 19:00

The above current affairs programme dealt with the topic of immigration and asylum policy, in particular in the context of controversy over small boats crossing the English Channel. The presenter, Martin Daubney, gave his own views on this topic and interviewed the leader of the Reform Party, Richard Tice.

Ofcom received a complaint about the programme.

We considered that immigration and asylum policy constituted a matter of major political controversy and a major matter relating to current public policy. When dealing with major matters, all Ofcom licensees must comply with the heightened special impartiality requirements in the Code. These rules require broadcasters to include and give due weight to an appropriately wide range of significant views.

We found that Mr Tice presented his position on a matter of major political controversy and a major matter of current public policy with insufficient challenge, and the limited alternative views presented were dismissed. The programme therefore did not include and give due weight to an appropriately wide range of significant views, as required by the Code.

The Licensee accepted that the content was not compliant with the heightened special impartiality requirements in the Code.

GB News failed to preserve due impartiality, in breach of Rules 5.11 and 5.12 of the Code.

Ofcom recognises that, in accordance with the right to freedom of expression, broadcasters have editorial freedom and can offer audiences innovative forms of discussion and debate ... However... in light of the likely similarity of the views of the participants in this programme on the major matter being discussed, the Licensee should have taken additional steps to ensure that due impartiality was preserved.

We expect GB News to take careful account of this Decision in its compliance of future programming.

 

 

More censors...

Ofcom investigates BitChute as an early test case of internet censorship


Link Here 7th October 2023
Full story: Ofcom Video Sharing Censors...Video on Demand and video sharing
BitChute is a British-based video-sharing platform that is particularly well known for hosting content that has been banned from more censorial websites, notably YouTube.

The name was conceived as a portmanteau of the words 'bit', a unit of information in computing, and 'parachute'. At the time of the site's launch, founder Ray Vahey described BitChute as an alternative to mainstream platforms; he believed these platforms had demonstrated increased levels of censorship over the previous few years by banning and demonetising users (barring them from receiving advertising revenue), and tweaking algorithms to send certain content into obscurity. In 2018, the creators of BitChute described themselves as 'a small team making a stand against Internet censorship because we believe it is the right thing to do'.

Of course right-leaning opinion does not sit well with the British establishment, so it isn't a surprise to see BitChute as an early example of interest from the UK internet censor, Ofcom. Note that censorship powers have already been granted to Ofcom for video sharing platforms stupid enough to be based in Britain. So these platforms are an interesting forerunner to how Ofcom will censor the wider internet when powers from the Online Censorship Bill are enacted.

Ofcom writes:

Investigation into BitChute Limited

Case considered: 3 October 2023

Summary

Compliance assurances from BitChute regarding its obligations under Part 4B of the Communications Act 2003: Improvements to the measures BitChute has in place to protect users from videos containing harmful material.

Ofcom's role is to ensure video-sharing platforms (VSPs) based in the UK have appropriate systems and processes in place as required under Part 4B of the Act to effectively protect their users from harmful video content in scope of the VSP regime.

In May 2022, a far-right extremist carried out a racially motivated attack in Buffalo, New York. Ofcom conducted analysis of the measures in place to protect users from harmful material on several VSPs, including BitChute, in light of this incident.

Our analysis of BitChute's platform raised concerns that some of its measures were not effectively protecting users from encountering videos related to terrorism and other harmful material prohibited under the VSP regime.

Following a period of close engagement with BitChute to discuss its compliance with its obligations under Part 4B of the Communications Act 2003, it has made some important changes and also committed to further improvements to protect users from harmful material.

Background

On 14 May 2022, a far-right extremist carried out a racially motivated attack in Buffalo, New York, killing ten people and wounding three others. The attacker livestreamed the shooting online, and versions of the footage were distributed on multiple online services, including BitChute and other UK-based VSPs that we currently regulate. This resulted in UK users being potentially exposed to harmful material related to terrorism and material likely to incite violence and hatred.

Ofcom's role is to ensure VSPs have appropriate systems and processes in place as required under Part 4B of the Act to effectively protect their users from harmful video content in scope of the VSP regime. Our approach to securing compliance focuses on oversight, accountability, and transparency, working with the industry where possible to drive improvements, as well as taking formal enforcement action where appropriate.

Our concerns

In the weeks following the Buffalo attack, we engaged with relevant VSPs and the wider industry to learn more about how platforms can set up internal systems and processes to prevent the livestreaming of such attacks and protect users from the sharing of associated video content. In October 2022, Ofcom published a report on the Buffalo attack that explored how footage of the attack, and related material, came to be disseminated online, and the implications of this for platforms' efforts to keep people safe online.

Our analysis raised concerns that BitChute's reporting and flagging measures were not effectively protecting users from encountering videos related to terrorism and other harmful material prohibited under the VSP regime. In particular, Ofcom was concerned that BitChute's reporting function was not open to non-registered users, and that the capacity and coverage of BitChute's content moderation team was insufficient to enable it to respond promptly to reports of harmful content.

BitChute's commitments

In response to our concerns, BitChute has made some important changes to its reporting and flagging mechanisms and to the content moderation processes which underpin these, as well as committing to further changes.

1. Coverage and capacity of content moderation

In our 2022 VSP Report, published in October, we found that all VSPs, including BitChute, have adequate terms and conditions that prohibit material that would come within the scope of laws relating to terrorism, racism, and xenophobia, as well as material likely to incite violence or hatred.

However, the Buffalo attack exposed key deficiencies in BitChute's ability to effectively enforce its terms and conditions relating to hate and terror content: footage of the shooting was easily accessible on the platform in the days after the attack, and we learnt that the platform's content moderation team was modest in size and limited to certain working hours. This restricted BitChute's ability to respond quickly to reports that footage was on the platform following the attack.

BitChute has committed to triple the capacity of its moderation team by taking on more human moderators. It is also extending the coverage of its moderation team by increasing the number of hours that moderators are available to review reports and has committed to having a safety team operational 24/7 in autumn 2023.

2. User reporting and flagging mechanisms

Prior to the Buffalo attack, BitChute had reporting and flagging mechanisms in place to allow users to report potentially harmful content. However, on-platform flagging was only available to users who had a registered BitChute account. While all users (registered or unregistered) were able to report content by sending an email to BitChute, we were concerned that requiring non-registered users to email the platform, rather than click a reporting button next to the video, introduces a layer of friction to the reporting process that could disincentivise the user from making a report and increase the time taken to respond to reports.

As a result of our remediation work, BitChute has changed the design of its platform to allow non-registered users to directly report potentially harmful content. It has also updated its user-facing guidelines to set out more clearly what registered and non-registered users can expect from the flagging and reporting process.

3. Measuring effectiveness

BitChute has also committed to collecting additional metrics to measure the impact of changes made to its systems and processes, including the volume of content review reports raised each day and average response time in minutes for content reports. These metrics will help BitChute and Ofcom to evaluate the effectiveness of the platform's measures more easily.

We have also encouraged BitChute to implement additional reports on risk metrics, which measure the risk of harmful material being encountered on the platform, and process metrics, which measure the effectiveness of BitChute's moderation systems.
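As an illustration of the kind of process metrics described above, here is a minimal sketch that computes the volume of content reports raised each day and the average response time in minutes. It assumes a hypothetical list of moderation report records (time raised, time resolved); the data and field layout are invented for illustration and are not BitChute's or Ofcom's actual reporting format.

```python
from collections import Counter
from datetime import datetime

# Hypothetical moderation report records: (time the report was raised, time it was resolved).
# Purely illustrative data; not taken from BitChute's or Ofcom's systems.
reports = [
    ("2023-10-01 09:15", "2023-10-01 09:40"),
    ("2023-10-01 13:02", "2023-10-01 14:10"),
    ("2023-10-02 08:30", "2023-10-02 08:55"),
]

fmt = "%Y-%m-%d %H:%M"
parsed = [(datetime.strptime(raised, fmt), datetime.strptime(resolved, fmt))
          for raised, resolved in reports]

# Volume of content reports raised each day.
daily_volume = Counter(raised.date().isoformat() for raised, _ in parsed)

# Average response time in minutes across all reports.
avg_minutes = sum((resolved - raised).total_seconds() / 60
                  for raised, resolved in parsed) / len(parsed)

print(daily_volume)           # Counter({'2023-10-01': 2, '2023-10-02': 1})
print(round(avg_minutes, 1))  # 39.3
```

Metrics of this kind would let both the platform and the regulator see whether report volumes are rising faster than the moderation team can respond.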

Our response

Taking into account BitChute's willingness to make timely improvements to its systems and processes to directly address the concerns we identified following the Buffalo incident, and our desire to work with industry to secure changes that protect users, we have decided not to open an investigation against BitChute into its compliance with its duties under Part 4B of the Communications Act 2003 at this time. We will, however, closely monitor the implementation of the proposed changes and the impact these changes have on user safety.

We also note that, on 14 June 2023, BitChute became a member of the Global Internet Forum to Counter Terrorism (GIFCT). GIFCT is a cross-industry initiative designed to prevent terrorists and violent extremists from exploiting digital platforms. Whilst we do not consider this an indicator of compliance, it is an encouraging step -- GIFCT has rigorous standards for membership, including demonstrating "a desire to explore new technical solutions to counter terrorist and violent extremist activity online" and "support for expanding the capacity of civil society organisations to challenge terrorism and violent extremism".

While we welcome BitChute's commitments to further improvements and measuring their effectiveness, we are aware of reports -- some of which have been communicated to us directly -- alleging that content likely to incite violence and hatred continues to be uploaded to BitChute, can be accessed easily, and may pose significant risks to users.

It is important to note that the VSP regime is a systems and processes regime, meaning the presence of harmful video content on a service is not in itself a breach of the rules. Accordingly, Ofcom's focus is to drive improvements to platforms' systems and processes to minimise the risks of users encountering harmful videos online in the first place.

However, such content can be indicative of an underlying issue with the user protections in place, and we will therefore continue to monitor BitChute closely to assess whether the changes it has made to its user reporting and content moderation systems result in tangible improvements to user safety. If we find that, despite BitChute's assurances and improvements, users are not being adequately protected from the categories of harmful material covered by the VSP regime, we will not hesitate to take further action, including formal enforcement action if necessary.

