Ofcom picks RevealMe.com, seemingly as a starting point, to enforce ID verification requirements for adult content

30th September 2022

See article from ofcom.org.uk
RevealMe.com is a streaming service, along the lines of OnlyFans, that allows models to provide adult streaming videos and other content to subscribing fans. Ofcom has announced that it is investigating the company for failing to provide Ofcom with information on how it is implementing age/ID verification ('protecting users' in Ofcom speak). Ofcom writes: Ofcom has been the regulator of UK-established video sharing platforms (VSPs) since
November 2020. Earlier this year, Ofcom issued a number of information requests to VSPs to obtain information on the measures taken by VSPs to protect users. On 29 September 2022, Ofcom opened an investigation into Tapnet Ltd,
which provides the VSP RevealMe. This investigation concerns Tapnet's compliance with an information request notice, issued on 6 June 2022 under section 368Z10 of the Communications Act 2003. Tapnet was required to respond to the
Notice by no later than 4 July 2022. As of 29 September 2022, Tapnet had not provided a response to the Notice. The Notice explained that the reason for requesting the information was to understand and monitor the measures VSPs
have in place to protect users and to publish a report under section 368Z11 of the Act. Ofcom's investigation will examine whether there are reasonable grounds for believing that Tapnet has failed to comply with its statutory
duties in relation to Ofcom's information request. Ofcom will provide updates on this page as we progress this investigation.
Offsite article: DoHoT...

30th September 2022
Better security, privacy, and integrity via load-balanced DNS over HTTPS over Tor. By Alec Muffett. See article from blog.apnic.net
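The linked piece describes DoHoT: DNS over HTTPS (DoH), with the HTTPS leg routed over Tor. As background, here is a minimal sketch of how a DoH query is encoded as an RFC 8484 GET URL; the `build_doh_url` helper name and the Cloudflare resolver URL are illustrative assumptions, not taken from the article.

```python
import base64
import struct

def build_doh_url(hostname: str,
                  resolver: str = "https://cloudflare-dns.com/dns-query") -> str:
    """Build an RFC 8484 DNS-over-HTTPS GET URL carrying an A-record query."""
    # DNS header: ID=0 (RFC 8484 suggests 0 for HTTP cache friendliness),
    # flags=0x0100 (recursion desired), 1 question, 0 answer/authority/additional.
    header = struct.pack("!HHHHHH", 0, 0x0100, 1, 0, 0, 0)
    # QNAME: each label is length-prefixed; the name ends with a zero byte.
    qname = b"".join(bytes([len(label)]) + label.encode("ascii")
                     for label in hostname.split(".")) + b"\x00"
    question = qname + struct.pack("!HH", 1, 1)  # QTYPE=A, QCLASS=IN
    wire = header + question
    # RFC 8484 requires unpadded base64url in the `dns` query parameter.
    dns_param = base64.urlsafe_b64encode(wire).rstrip(b"=").decode("ascii")
    return f"{resolver}?dns={dns_param}"
```

Fetching such a URL over plain HTTPS is ordinary DoH; the "over Tor" part of DoHoT would route that HTTPS request through Tor's local SOCKS proxy (conventionally 127.0.0.1:9050) using a SOCKS-capable HTTP client, so the resolver never sees the client's real IP address.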
India is considering mandatory ID verification to use internet messaging

27th September 2022

See article from reclaimthenet.org
The Indian government is considering new telecom legislation that would cover social media platforms and messaging services such as Telegram, WhatsApp, and Signal. Among the things included in the bill is a broader definition of telecommunications services, covering broadcasting services and internet-based communication. Industry experts claim that the broader definition might mean communication and messaging platforms will have to follow the same rules as telecoms, obtaining licences and sharing revenue with the government. Telecom Minister Ashwini Vaishnaw said that the bill will require KYC (know your customer) data for call and messaging apps to identify callers. Collecting KYC data is not currently possible for internet-based calls, for example, so the government is holding consultations to find a solution. The bill would also give the government the power to intercept messages on internet-based communication services. The government is currently considering input from stakeholders.
Ofcom publishes report seemingly trying to categorise or classify online 'harms' and associated risks with a view to its future censorship role

25th September 2022

See article from ofcom.org.uk
See report [pdf] from ofcom.org.uk
Ofcom writes: The Online Safety Bill, as currently drafted, will require Ofcom to assess, and publish its findings about the risks of harm arising from content that users may encounter on in-scope services, and will require in-scope
services to assess the risks of harm to their users from such content, and to have systems and processes for protecting individuals from harm. Online users can face a range of risks online, and the harms they may experience are
wide-ranging, complex and nuanced. In addition, the impact of the same harms can vary between users. In light of this complexity, we need to understand the mechanisms by which online content and conduct may give rise to harm, and use that insight to
inform our work, including our guidance to regulated services about how they might comply with their duties. This report sets out a generic model for understanding how online harms manifest. This research aimed to test a
framework, developed by Ofcom, with real-life user experiences. We wanted to explore if there were common risks and user experiences that could provide a single framework through which different harms could be analysed. There are a couple of important
considerations when reading this report:
The research goes beyond platforms' safety systems and processes to help shed broader light on what people are experiencing online. It therefore touches on issues that are beyond the scope of the proposed online safety regime.
The research reflects people's views and experiences of their online world: it is based on people self-identifying as having experienced 'significant harm', whether caused directly or indirectly, or 'illegal content'.
Participants' definitions of harmful and illegal content may differ and do not necessarily align with how the Online Safety Bill, Ofcom or others may define them.
Decision by US payments company is a sinister form of 'cancel culture'

25th September 2022

21st September 2022. See article from telegraph.co.uk
PayPal has shut down the account of the Free Speech Union, an organisation which defends people who have lost work for expressing opinions. The US payments company censors were clearly offended by free speech and decided to shut down the accounts of the Free Speech Union, its founder Toby Young, and his opinion and news website the Daily Sceptic, with no clear explanation. PayPal merely spouted the bollox explanation that the union had 'violated PayPal's Acceptable Use Policy'. The Telegraph
reports that a likely explanation is that the organisation has helped to defend people who claim they have lost work for expressing opinions, for example Gillian Philip, the author who said her contract was terminated because she stood up for JK Rowling
on Twitter amid a row over transgender rights. It has also challenged universities that have no-platformed gender-critical academics. Toby Young said: I suspect it's because in reality PayPal doesn't value free
expression and open dialogue or the people and organisations that stand up for those principles. Withdrawing financial services from dissidents and non-conformists and those who dare to defend them is the new frontline in the ongoing war against free
speech.
The Free Speech Union will be lobbying the Government to put new laws in place to prevent companies like PayPal demonetising organisations and individuals because their employees disapprove of their views.
Offsite article: Why has PayPal cancelled the Free Speech Union? See article from spectator.co.uk by Toby Young

It's left me wanting to do something about this insidious new way of cancelling people. As the switch to a cashless society gathers speed, we need to put some laws in place to protect people from being punished by companies like PayPal for saying something their employees disapprove of.

Offsite article: Big Tech is waging financial war on dissenters. See article from spiked-online.com by Tom Slater

PayPal's banning of the Free Speech Union is its most sinister move yet.

Offsite article: PayPal is trying to silence us. See article from spiked-online.com by Molly Kingsley

The co-founder of UsForThem speaks out against Big Tech censorship.
25th September 2022

But we should be careful what we wish for. Several senior lawyers are now raising concerns about this conviction. By Kate Maltby. See article from inews.co.uk
Now It's Upset That Airbnb Is Banning People With Criminal Records

19th September 2022

See article from techdirt.com by Mike Masnick
For years, the media has hyped up the idea that Airbnbs may be dangerous and used by criminals. At some point, it's no wonder that the company would start to just cut off people with criminal records, because of the PR problems it causes. The company even went so far as to buy a background check company that it had used. Either way, it seems clear that Airbnb is going too far in banning people like Hallam, but it really shouldn't be a surprise. If we keep pushing moral panic style stories about the dangers of criminals using services like Airbnb, we shouldn't then be surprised when the company says 'okay, no more people with a criminal record', no matter how totally unfair that might be. See full article from techdirt.com

Can't we find a way to recognise good people from bad? Are we paving the way for the acceptability of Chinese-style social scoring?
19th September 2022

New state laws claiming to protect children will infantilise us all. By Norman Lewis. See article from spiked-online.com
California Governor Signs Disastrously Stupid Age Appropriate Design Code

16th September 2022

See article from techdirt.com by Mike Masnick
Gavin Newsom, who wants to be President some day, and thus couldn't risk misleading headlines that he didn't protect the children, has now signed AB 2273 into law. At this point there's not much more I can say about why AB 2273 is
so bad. I've explained why it's literally impossible to comply with (and why many sites will just ignore it). I've explained how it's pretty clearly unconstitutional. I've explained how the whole idea was pushed for and literally sponsored by a Hollywood
director / British baroness who wants to destroy the internet. I've explained how it won't do much, if anything, to protect children, but will likely put them at much greater risk. I've explained how the company it will likely benefit most is the world's
largest porn company -- not to mention COVID disinfo peddlers and privacy lawyers. I've explained how the companies supporting the law insist that we shouldn't worry because websites will just start scanning your face when you visit.
None of that matters, though. Because, in this nonsense political climate where moral panics and culture wars are all that matter in politics, politicians are going to back laws that claim to protect the children, no matter how much
of a lie that is. The bill doesn't go into effect until the middle of 2024 and I would assume that someone will go to court to challenge it, meaning that what this bill is going to accomplish in the
short run is California wasting a ton of taxpayer dollars (just as Texas and Florida did) to try to pretend they have the power to tell companies how to design their products. See full
article from techdirt.com

UK Online Censorship Bill set to continue after 'tweaks'

16th September 2022

See article from techdirt.com
After a little distraction for the royal funeral, the UK's newly elected prime minister has said she will be continuing with the Online Censorship Bill. She said: We will be proceeding with the Online Safety Bill. There
are some issues that we need to deal with. What I want to make sure is that we protect the under-18s from harm and that we also make sure free speech is allowed, so there may be some tweaks required, but certainly he is right that we need to protect
people's safety online.
TechDirt comments: This is just so ridiculously ignorant and uninformed. The Online Safety Bill is a disaster in waiting and I wouldn't be surprised if some websites chose to
exit the UK entirely rather than continue to deal with the law. It won't actually protect the children, of course. It will create many problems for them. It won't do much at all, except make internet companies question whether
it's even worth doing business in the UK.

The Fight to Overturn FOSTA, an Unconstitutional Internet Censorship Law, Continues

16th September 2022

See CC article from eff.org by Aaron Mackey
More than four years after its enactment, FOSTA remains an unconstitutional law that broadly censored the internet and harmed sex workers and others by chilling their ability to speak, organize, and access information online. And the fight to overturn FOSTA continues. Last week, two human rights organizations, a digital library, a sex worker activist, and a certified massage therapist filed their opening brief in a case that seeks to strike down the law for its many constitutional violations.
Their brief explains to a federal appellate court why FOSTA is a direct regulation of people's speech that also censors online intermediaries that so many rely upon to speak--classic First Amendment violations. The brief also
details how FOSTA has harmed the plaintiffs, sex workers, and allies seeking to decriminalize the work and make it safer, primarily because of its vague terms and its conflation of sex work with coercive trafficking. "FOSTA
created a predictable speech-suppressing ratchet leading to 'self-censorship of constitutionally protected material' on a massive scale," the plaintiffs, Woodhull Freedom Foundation, Human Rights Watch, The Internet Archive, Alex Andrews, and Eric
Koszyk, argue. "Websites that support sex workers by providing health-related information or safety tips could be liable for promoting or facilitating prostitution, while those that assist or make prostitution easier--i.e., 'facilitate' it--by
advocating for decriminalization are now uncertain of their own legality." FOSTA created new civil and criminal liability for anyone who "owns, manages, or operates an interactive computer service" and creates
content (or hosts third-party content) with the intent to "promote or facilitate the prostitution of another person." The law also expands criminal and civil liability to classify any online speaker or platform that allegedly assists, supports,
or facilitates sex trafficking as though they themselves were participating "in a venture" with individuals directly engaged in sex trafficking. FOSTA doesn't just seek to hold platforms and hosts criminally responsible
for the actions of sex-traffickers. It also introduces significant exceptions to the civil immunity provisions of one of the internet's most important laws, 47 U.S.C. § 230. These exceptions create new state law criminal and civil liability for online
platforms based on whether their users' speech might be seen as promoting or facilitating prostitution, or as assisting, supporting or facilitating sex trafficking. The plaintiffs are not alone in viewing FOSTA as an overbroad
censorship law that has harmed sex workers and other online speakers. Four friend-of-the-court briefs filed in support of their case this week underscore FOSTA's disastrous consequences. The Center for Democracy & Technology's
brief argues that FOSTA negated the First Amendment's protections for online intermediaries and thus undercut the vital role those services provide by hosting a broad and diverse array of users' speech online. "Although
Congress may have only intended the laudable goal of halting sex trafficking, it went too far: chilling constitutionally protected speech and prompting online platforms to shut down users' political advocacy and suppress communications having nothing to
do with sex trafficking for fear of liability," CDT's brief argues. A brief from the Transgender Law Center describes how FOSTA's breadth has directly harmed lesbian, gay, transgender, and queer people.
"Although FOSTA's text may not name gender or sexual orientation, FOSTA's regulation of speech furthers the profiling and policing of LGBTQ people, particularly TGNC people, as the statute's censorial effect has resulted in the
removal of speech created by LGBTQ people and discussions of sexuality and gender identity," the brief argues. "The overbroad censorship resulting from FOSTA has resulted in real and substantial harm to LGBTQ people's First Amendment rights as
well as economic harm to LGBTQ people and communities." Two different coalitions of sex worker advocacy and harm reduction groups filed briefs in support of the plaintiffs that show FOSTA's direct impact on sex workers and
how the law's conflation of consensual sex work with coercive trafficking has harmed both victims of trafficking and sex workers. A brief led by Call Off Your Old Tired Ethics (COYOTE) of Rhode Island published data from its
recent survey of sex workers showing that FOSTA has made sex trafficking more prevalent and harder to combat. "Every kind of sex worker, including trafficking survivors, have been impacted by FOSTA precisely because its broad
terms fail to distinguish between different types of sex work and trafficking," the brief argues. The brief goes on to argue that FOSTA's First Amendment problems have "made sex work more dangerous by curtailing the ability to screen clients on
trusted online databases, also known as blacklists." A brief led by Decriminalize Sex Work shows that "FOSTA is part of a legacy of federal and state laws that have wrongfully conflated human trafficking and adult
consensual sex work while overlooking the realities of each." "The limitations on free speech caused by FOSTA have essentially censored harm reduction and safety information sharing, removed tools that sex workers used
to keep themselves and others safe, and interrupted organizing and legislative endeavors to make policies that will enhance the wellbeing of sex workers and trafficking survivors alike," the brief argues. "Each of these effects has had a
devastating impact on already marginalized and vulnerable communities; meanwhile, FOSTA has not addressed nor redressed any of the issues cited as motivation for its enactment." The plaintiffs' appeal marks the second time
the case has gone up to the U.S. Court of Appeals for the District of Columbia. The plaintiffs previously prevailed in the appellate court when it ruled in 2020 that they had the legal right, known as standing, to challenge FOSTA, reversing an earlier
district court ruling.

French courts try to find an acceptable method of age verification for porn

11th September 2022

See article from 20minutes.fr
A Paris court has offered to organize a mediation to find a way to prevent minors from accessing pornography on the internet. Campaigners had asked telecom operators to immediately block several porn websites, but French law does not actually specify how this should be achieved. The decision to order the parties to negotiate will be formally taken soon, as the French internet censor ARCOM does not seem willing or able to specify how age verification should be done. The Cypriot company MG Freesites, publisher of the Pornhub platform, one of the five sites targeted in this procedure, for its part filed a priority question of constitutionality (QPC) which calls into question the legitimacy of Arcom to act. The court will decide on October 4 whether or not to send this QPC to the Court of Cassation. If it does not, a new civil hearing would be organized to examine the case on the merits. During the debates, the lawyers representing Pornhub, Tukif, XHamster, Xvideos and Xnxx, targeted in December 2021 by a formal notice from Arcom and which are among the most visited sites in France, tried to prove their goodwill: 'None of the technical solutions that we have tested have proven to be satisfactory.' Recent reports from the Cnil and Peren, the government's centre of expertise in data protection, have recalled the risks associated with existing age verification solutions offered by the industry, while paving the way for a system based on trusted third parties.

Gulf states threaten legal action about gay characters in Netflix shows

11th September 2022

See article from theguardian.com
Six Gulf states have warned Netflix over content violating Islamic values. The states have threatened Netflix with legal action if it continues broadcasting content that contradicts Islam, while Saudi state media indicated that the offending material
centred on shows depicting sexual minorities. A statement issued jointly by the Saudi media regulator and the six-member Gulf Cooperation Council (GCC), headquartered in the Saudi capital, Riyadh, did not specifically identify material, referring only to content that contradicts Islamic and societal values. The statement said:

Regional authorities will follow up on the platform's compliance with the directives, and in the event that the infringing content continues to be broadcast, the necessary legal measures will be taken.

The Gulf Cooperation Council includes Bahrain, Kuwait, Oman, Qatar, Saudi Arabia, and the United Arab Emirates.

The continuing dangerous campaign to force ALL people to hand over sensitive ID details to porn sites in the name of protecting children from handing over sensitive ID details

3rd September 2022

See article from ico.org.uk
The UK's data protection censors at the Information Commissioner's Office (ICO) have generated a disgracefully onerous red tape nightmare called the Age Appropriate Design Code, which requires any internet service that provides any sort of grown-up content to evaluate the age of all users so that under-18s can be protected from handing over sensitive ID data. Of course, the age checking usually requires all users to hand over lots of sensitive and dangerous ID data to any website that asks. Now the ICO has decided to apply these requirements to porn sites, given that they are often accessed by under-18s. ICO writes:

Next steps

We will continue to evolve our approach, listening to others to ensure the code is having the maximum impact. For example, we have seen an increasing amount of research (from the NSPCC, 5Rights, Microsoft and the British Board of Film Classification) that children are likely to be accessing adult-only services and that these pose data protection harms, with children losing control of their data or being manipulated to give more data, in addition to content harms. We have therefore revised our position to clarify that adult-only services are in scope of the Children's code if they are likely to be accessed by children.

As well as engaging with adult-only services directly to ensure they conform with the code, we will also be working closely with Ofcom and the Department for Digital, Culture, Media and Sport (DCMS) to establish how the code works in practice in relation to adult-only services and what they should expect. This work is continuing to drive the improvements necessary to provide a better internet for children.