BitChute is a British-based video-sharing platform that is particularly well known for hosting content that has been banned from more censorial websites, notably YouTube. The name is a portmanteau of "bit", a unit of information in computing, and "parachute". At the time of the site's launch, founder Ray Vahey described BitChute as an alternative to mainstream platforms; he believed these platforms had demonstrated increased levels of censorship over the previous few years by banning and demonetising users (barring them from receiving advertising revenue), and by tweaking algorithms to push certain content into obscurity. In 2018, the creators of BitChute described themselves as a small team making "a stand against Internet censorship because we believe it is the right thing to do".
Of course, right-leaning opinion does not sit well with the British establishment, so it is no surprise to see BitChute as an early example of interest from the UK internet censor, Ofcom. Note that censorship powers have already been granted to Ofcom over video-sharing platforms stupid enough to be based in Britain. So these platforms are an interesting forerunner of how Ofcom will censor the wider internet once powers from the Online Censorship Bill are enacted.
Ofcom writes:
Investigation into BitChute Limited

Case considered: 3 October 2023
Summary
Compliance assurances from BitChute regarding its obligations under Part 4B of the Communications Act 2003: improvements to the measures BitChute has in place to protect users from videos containing harmful material.
Ofcom's role is to
ensure video-sharing platforms (VSPs) based in the UK have appropriate systems and processes in place as required under Part 4B of the Act to effectively protect their users from harmful video content in scope of the VSP regime.
In May 2022, a far-right extremist carried out a racially motivated attack in Buffalo, New York. Ofcom conducted analysis of the measures in place to protect users from harmful material on several VSPs, including BitChute, in light of this incident.
Our analysis of BitChute's platform raised concerns that some of its measures were not effectively protecting users from encountering videos related to terrorism and other harmful material prohibited under the VSP regime.
Following a period of close engagement with BitChute to discuss its compliance with its obligations under Part 4B of the Communications Act 2003, BitChute has made some important changes and has also committed to further improvements to protect
users from harmful material.
Background
On 14 May 2022, a far-right extremist carried out a racially motivated attack in Buffalo, New York, killing ten people and wounding three others. The attacker
livestreamed the shooting online, and versions of the footage were distributed on multiple online services, including BitChute and other UK-based VSPs that we currently regulate. This resulted in UK users being potentially exposed to harmful material
related to terrorism and material likely to incite violence and hatred.
Ofcom's role is to ensure VSPs have appropriate systems and processes in place as required under Part 4B of the Act to effectively protect their users from
harmful video content in scope of the VSP regime. Our approach to securing compliance focuses on oversight, accountability, and transparency, working with the industry where possible to drive improvements, as well as taking formal enforcement action
where appropriate.
Our concerns
In the weeks following the Buffalo attack, we engaged with relevant VSPs and the wider industry to learn more about how platforms can set up internal systems and
processes to prevent the livestreaming of such attacks and protect users from the sharing of associated video content. In October 2022, Ofcom published a report on the Buffalo attack that explored how footage of the attack, and related material, came to be disseminated online, and examined the implications of this for platforms' efforts to keep people safe online.
Our analysis raised concerns that BitChute's reporting and flagging measures were not effectively protecting
users from encountering videos related to terrorism and other harmful material prohibited under the VSP regime. In particular, Ofcom was concerned that BitChute's reporting function was not open to non-registered users, and that the capacity and coverage
of BitChute's content moderation team was insufficient to enable it to respond promptly to reports of harmful content.
BitChute's commitments
In response to our concerns, BitChute has made some
important changes to its reporting and flagging mechanisms and to the content moderation processes which underpin these, as well as committing to further changes.
1. Coverage and capacity of content moderation
In our 2022 VSP Report, published in October, we found that all VSPs, including BitChute, have adequate terms and conditions that prohibit material that would come within the scope of laws relating to terrorism, racism, and xenophobia, as well as material likely to incite violence or hatred.
However, the Buffalo attack exposed key deficiencies in BitChute's ability to effectively enforce its terms and conditions relating to hate and terror content:
footage of the shooting was easily accessible on the platform in the days after the attack, and we learnt that the platform's content moderation team was modest in size and limited to certain working hours. This restricted BitChute's ability to respond
quickly to reports that footage was on the platform following the attack.
BitChute has committed to triple the capacity of its moderation team by taking on more human moderators. It is also extending the coverage of its moderation
team by increasing the number of hours that moderators are available to review reports and has committed to having a safety team operational 24/7 in autumn 2023.
2. User reporting and flagging mechanisms
Prior to the Buffalo attack, BitChute had reporting and flagging mechanisms in place to allow users to report potentially harmful content. However, on-platform flagging was only available to users who had a registered BitChute
account. While all users (registered or unregistered) were able to report content by sending an email to BitChute, we were concerned that requiring non-registered users to email the platform, rather than click a reporting button next to the video,
introduced a layer of friction into the reporting process that could disincentivise users from making a report and increase the time taken to respond to reports.
As a result of our remediation work, BitChute has changed the
design of its platform to allow non-registered users to directly report potentially harmful content. It has also updated its user-facing guidelines to set out more clearly what registered and non-registered users can expect from the flagging and
reporting process.
3. Measuring effectiveness
BitChute has also committed to collecting additional metrics to measure the impact of changes made to its systems and processes, including the volume of
content review reports raised each day and average response time in minutes for content reports. These metrics will help BitChute and Ofcom to evaluate the effectiveness of the platform's measures more easily.
We have also encouraged BitChute to implement additional reporting on risk metrics, which measure the risk of harmful material being encountered on the platform, and on process metrics, which measure the effectiveness of BitChute's moderation systems.
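
As a rough illustration of what such process metrics involve, the sketch below computes reports raised per day and the average response time in minutes from a toy moderation log. The data layout and field names are assumptions made for illustration, not anything published by BitChute or Ofcom.

# Illustrative sketch only: computes the two process metrics mentioned above
# (reports raised per day, average response time in minutes) from a simple
# moderation log. The record layout is an assumption, not a real BitChute schema.
from datetime import datetime
from collections import Counter

reports = [
    # (time the report was raised, time moderators responded)
    (datetime(2023, 9, 1, 9, 15), datetime(2023, 9, 1, 9, 47)),
    (datetime(2023, 9, 1, 22, 3), datetime(2023, 9, 2, 6, 30)),
    (datetime(2023, 9, 2, 14, 0), datetime(2023, 9, 2, 14, 21)),
]

# Volume metric: number of content reports raised each day.
per_day = Counter(raised.date() for raised, _ in reports)

# Process metric: average response time in minutes across all reports.
avg_minutes = sum(
    (responded - raised).total_seconds() / 60 for raised, responded in reports
) / len(reports)

for day, count in sorted(per_day.items()):
    print(f"{day}: {count} report(s)")
print(f"average response time: {avg_minutes:.1f} minutes")

Metrics like these only become meaningful when tracked over time: a falling average response time after the moderation team is expanded would be the kind of evidence Ofcom says it wants to see.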
Our response
Taking into account BitChute's willingness to make timely improvements to its systems and processes to directly address the concerns we identified following the Buffalo incident, and our desire to work with
industry to secure changes that protect users, we have decided not to open an investigation against BitChute into its compliance with its duties under Part 4B of the Communications Act 2003 at this time. We will, however, closely monitor the
implementation of the proposed changes and the impact these changes have on user safety.
We also note that, on 14 June 2023, BitChute became a member of the Global Internet Forum to Counter Terrorism (GIFCT). GIFCT is a
cross-industry initiative designed to prevent terrorists and violent extremists from exploiting digital platforms. Whilst we do not consider this an indicator of compliance, it is an encouraging step -- GIFCT has rigorous standards for membership,
including demonstrating "a desire to explore new technical solutions to counter terrorist and violent extremist activity online" and "support for expanding the capacity of civil society organisations to challenge terrorism and violent
extremism".
While we welcome BitChute's commitments to further improvements and measuring their effectiveness, we are aware of reports -- some of which have been communicated to us directly -- alleging that content likely to
incite violence and hatred continues to be uploaded to BitChute, can be accessed easily, and may pose significant risks to users.
It is important to note that the VSP regime is a systems and processes regime, meaning the presence
of harmful video content on a service is not in itself a breach of the rules. Accordingly, Ofcom's focus is to drive improvements to platforms' systems and processes to minimise the risk of users encountering harmful videos online in the first place.
However, such content can be indicative of an underlying issue with the user protections in place, and we will therefore continue to monitor BitChute closely to assess whether the changes it has made to its user reporting and content
moderation systems result in tangible improvements to user safety. If we find that, despite BitChute's assurances and improvements, users are not being adequately protected from the categories of harmful material covered by the VSP regime, we will not
hesitate to take further action, including formal enforcement action if necessary.