YouTube has banned Erika Lust's series In Conversation with Sex Workers.
There was NO explicit content, NO sex, NO naked bodies, NO (female) nipples or anything else that breaks YouTube's strict guidelines in the series, Lust wrote on her website. It was simply sex workers speaking about their work and experiences.
Presumably the censorship is inspired by the US FOSTA internet censorship law, under which YouTube could be held liable for content that facilitates sex trafficking. It is cheaper and easier for YouTube to take down any content that could in any way be connected to sex trafficking than to spend time checking it out.
Erika Lust, a Barcelona-based erotic filmmaker, wrote in a blog post on Wednesday that YouTube terminated her eponymous channel on July 4, when it had around 11,000 subscribers. The ban came after an interviewee for the company's series In Conversation With Sex Workers, which had been on YouTube for about a week, tweeted to promote her involvement in the film. Within hours of that tweet, YouTube terminated the channel, citing a violation of community guidelines.
Sharon White, the CEO of Ofcom, has put her case to be the British internet news censor, disgracefully from behind the paywall of The Times website.
White says Ofcom has done research showing how little users trust what they read on social media. She said that only 39% consider social media to be a trustworthy news source, compared with 63% for newspapers, and 70% for TV.
But then again many people don't much trust the biased moralising from the politically correct mainstream media, including the likes of Ofcom.
White claims social media platforms need to be more accountable in how they curate and police content on their platforms, or face regulation.
In reality, Facebook's algorithm seems pretty straightforward: it just gives readers more of what they have liked in the past. But of course the powers that be don't like people choosing their own media sources; they would much prefer that the BBC, or the Guardian, or Ofcom do the choosing.
Sharon White wrote in The Times:
The argument for independent regulatory oversight of [large online players] has never been stronger.
In practice, this would place much greater scrutiny on how effectively the online platforms respond to harmful content to protect consumers, with powers for a regulator to enforce standards, and act if these are not met.
She continued, disgracefully revealing her complete contempt of the British people:
Many people admit they simply don't have the time or inclination to think critically when engaging with news, which has important implications for our democracy.
White joins a growing number of the establishment elite arguing that social media needs censorship. The government has frequently suggested as much, with Matt Hancock, then digital, culture, media and sport secretary, telling Facebook in April:
Social media companies are not above the law and will not be allowed to shirk their responsibilities to our citizens.
A paper has been published on the effects of network-level website blocking in trying to prevent adolescents from seeking out porn.
Internet Filtering and Adolescent Exposure to Online Sexual Material
By Andrew K. Przybylski and Victoria Nash
Early adolescents are spending an increasing amount of time online, and a significant share of caregivers now use Internet filtering tools to shield this population from online sexual material. Despite wide use, the efficacy of filters is poorly
understood. In this article, we present two studies: one exploratory analysis of secondary data collected in the European Union, and one preregistered study focused on British adolescents and caregivers to rigorously evaluate their utility. In
both studies, caregivers were asked about their use of Internet filtering, and adolescent participants were interviewed about their recent online experiences.
Analyses focused on the absolute and relative risks of young people encountering online sexual material and the effectiveness of Internet filters.
Results suggested that caregivers' use of Internet filtering had inconsistent and practically insignificant links with young people's reports of encountering online sexual material.
The struggle to shape the experiences young people have online is now part of modern parenthood. This study was conducted to address the value of industry, policy, and professional advice concerning the appropriate role of Internet filtering in
this struggle. Our preliminary findings suggested that filters might have small protective effects, but evidence derived from a more stringent and robust empirical approach indicated that they are entirely ineffective. These findings highlight
the need for a critical cost-benefit analysis in light of the financial and informational costs associated with filtering and age verification technologies such as those now being developed in some European countries like the United Kingdom.
Further, our results highlight the need for registered trials to rigorously evaluate the effectiveness of costly technological solutions for social and developmental goals.
The write-up doesn't really place its conclusions in any real context as to what is actually happening, beyond the kids still being able to get hold of porn. The following paragraph gives the best clue of what is going on:
We calculated absolute risk reduction of exposure to online sexual material associated with caregivers using filtering technology in practical terms. These results were used to calculate the number of households which would have to be
filtered to prevent one young person, who would otherwise see sexual material online, from encountering it over a 12-month period. Depending on the form of content, results indicated that between 17 and 77 households would need to be
filtered to prevent a young adolescent from encountering online sexual material. A protective effect lower than we would consider practically significant.
This seems to suggest that if one kid has a censored internet then he just goes round to a mate's house that isn't censored, and downloads from there. He wouldn't actually be blocked from viewing porn until his whole circle of friends is similarly censored. It only takes one kid to be able to download porn, as it can then be loaded onto a memory stick to be passed around.
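For readers unfamiliar with the statistic, the paper's "households needed to filter" figure is a number-needed-to-treat style calculation: the reciprocal of the absolute risk reduction. A minimal sketch, using illustrative exposure rates that are assumptions rather than the paper's actual data:

```python
def households_needed(risk_unfiltered, risk_filtered):
    """Number-needed-to-treat style figure: 1 / absolute risk reduction."""
    arr = risk_unfiltered - risk_filtered
    if arr <= 0:
        raise ValueError("filtering shows no protective effect")
    return 1 / arr

# The reported range of 17-77 households implies absolute risk reductions
# between roughly 1/77 (about 1.3%) and 1/17 (about 5.9%).
# Exposure rates below are made up for illustration only.
print(round(households_needed(0.50, 0.44)))   # ARR 6%   -> about 17 households
print(round(households_needed(0.50, 0.487)))  # ARR 1.3% -> about 77 households
```

The tiny implied risk reductions are why the authors call the protective effect practically insignificant.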
Russia's interior minister says he wants citizens to scour the internet for banned information.
The Russian internet censor Roskomnadzor has an ever-expanding list of banned sites featuring material that Russian authorities don't like. The list takes in everything from LGBT sites to critics of the Kremlin and sites that allegedly carry
terrorist propaganda, the main justification for many of Russia's online censorship and surveillance laws.
Free-speech activists reckon the number of blocked websites now tops 100,000, but how best to keep adding to that list?
Russia's interior minister, Vladimir Kolokoltsev, says volunteers should step up to aid the search for banned information. Whilst speaking about the challenges faced by search and rescue volunteers, he said volunteers could help public authorities
in preventing drug abuse, combating juvenile delinquency, and monitoring the internet networks to search for banned information.
Uganda has just introduced a significant tax on social media usage. It is set at 200 shillings a day, which adds up to about 3% of the average annual income if used daily.
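As a quick sanity check of that 3% figure (assuming, purely for illustration, an average annual income of about 2.4 million Ugandan shillings; the article does not state the figure it used):

```python
# Rough check of the claimed tax burden. The income figure below is an
# assumption for illustration, not a number taken from the article.
DAILY_TAX_UGX = 200
ASSUMED_ANNUAL_INCOME_UGX = 2_400_000

annual_tax = DAILY_TAX_UGX * 365  # 73,000 UGX if the tax is paid every day
share_of_income = annual_tax / ASSUMED_ANNUAL_INCOME_UGX
print(f"{share_of_income:.1%}")  # prints 3.0%
```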
Use of a long list of websites and apps, including Facebook, WhatsApp, Twitter and Tinder, triggers the daily tax, which is billed by ISPs.
And as you may expect, Ugandan internet users are turning to VPNs so that ISPs can't detect access to taxed apps and websites.
In response, the government says it has ordered local ISPs to begin blocking VPNs. In a statement, Uganda Communications Commission Executive Director Godfrey Mutabazi said that Internet service providers would be ordered to block VPNs to prevent citizens from avoiding the social media tax.
Mutabazi told Dispatch that ISPs are already taking action to prevent VPNs from being accessible but since there are so many, it won't be possible to block them all. In the meantime, the government is trying to portray VPNs as more expensive to
use than the tax. In a post on Facebook this morning, Mutabazi promoted the tax as the sensible economic option.
It appears that many Ugandans are outraged at the prospect of yet another tax and see VPN use as a protest, despite any additional cost. Opposition figures have already called for a boycott, with support coming in from all corners of society. The government appears unmoved, however. Frank Tumwebaze, Minister of Information Technology and Communications, said:
If we tax essentials like water, why not social media?
Uganda is reviewing its decision to impose taxes on the use of social media and on money transactions by mobile phone, following a public backlash.
Prime Minister Ruhakana Rugunda made the announcement soon after police broke up a protest against the taxes.
President Yoweri Museveni had pushed for the taxes to boost government revenue and to restrict criticism via WhatsApp, Facebook and Twitter.
The social media tax is 6,000 Ugandan shillings a month (£1.25), but this represents about 3% of the average wage. Activists argue that while the amount may seem little, it is a significant slice of what poorer people pay to get online. There is also a 1% levy on the total value of mobile phone money transactions, affecting poorer Ugandans who rarely use banking services.
In a statement to parliament, Rugunda said:
Government is now reviewing the taxes taking into consideration the concerns of the public and its implications on the budget.
A revised budget is due to be tabled in parliament on 19 July.
Supporters of the US internet censorship law FOSTA were supposedly attempting to target pimps and traffickers, but of course their target was the wider sex work industry. Hence they weren't really interested in the warning that the law would make it harder to target pimps and sex traffickers, as their activity would be driven underground.
Anyway it seems that the police at least have started to realise that the warning is coming true, but I don't suppose this will bother the politicians much.
Over in Indianapolis, the police have just arrested their first pimp in 2018, and it involved an undercover cop being approached by the pimp. The reporter asks why there have been so few such arrests, and the police point the finger right at the
shutdown of Backpage:
The cases, according to Sgt. John Daggy, an undercover officer with IMPD's vice unit, have just dried up. The reason for that is pretty simple: the feds closed police's best source of leads, the online personals site Backpage, earlier this year.
We've been a little bit blinded lately because they shut Backpage down. I get the reasoning behind it, and the ethics behind it, however, it has blinded us. We used to look at Backpage as a trap for human traffickers and pimps.
With Backpage, we would subpoena the ads and it would tell a lot of the story. Also, with the ads we would catch our victim at a hotel room, which would give us a crime scene. There's a ton of evidence at a crime scene. Now, since [Backpage] has
gone down, we're getting late reports of them and we don't have much to go by.
Jeremy Wright has been appointed as the new Secretary of State for Digital, Culture, Media and Sport.
He is the government minister in charge of the up 'n' coming regime to censor internet porn. He will also be responsible for several government initiatives attempting to censor social media.
He is a QC and was previously the government's Attorney General. His parliamentary career to date has not really given any pointers to his views and attitudes towards censorship.
The previous culture minister, Matt Hancock, has moved upwards to become minister for health. Perhaps in his new post he can continue to whinge about what he considers to be the excessive amount of screen time being enjoyed by children.
Sky, TalkTalk and Virgin Media would back the creation of an internet censor to set out a framework for internet
companies in the UK, the House of Lords Communications Committee was told.
The three major UK ISPs were giving evidence to the House of Lords' ongoing inquiry into internet censorship. The companies' policy heads pushed for a new censor, or the expansion of the responsibilities of a current censor, to set the rules for content censorship and to better equip children using the internet amid safety concerns.
At the moment the Information Commissioner's Office has responsibility for data protection and privacy; Ofcom censors internet TV; the Advertising Standards Authority censors adverts; and the BBFC censors adult porn.
Citing a report by consultancy Communications Chambers, Sky's Adam Kinsley said that websites and internet providers are making decisions, but in an unstructured way. Speaking about the current state of internet regulation, Kinsley said:
Companies are already policing their own platforms. There is no accountability of what they are doing and how they are doing it. The only bit of transparency is when they decide to do it on a global basis and at a time of their choosing. Policy
makers need to understand what is happening, and at the moment they don't have that.
The 13-strong House of Lords committee, chaired by Lord Gilbert of Panteg, launched an inquiry earlier this year to explore how the censorship of the internet should be improved. The committee will consider whether there is a need for new laws to
govern internet companies. This inquiry will consider whether websites are sufficiently accountable and transparent, and whether they have adequate governance and provide behavioural standards for users.
The committee is hearing evidence from April to September 2018 and will launch a report at the end of the year.
We are asking a court to declare the Allow States and Victims to Fight Online Sex Trafficking Act of 2017 ("FOSTA") unconstitutional and prevent it from being enforced. The law was written so poorly that it actually criminalizes a
substantial amount of protected speech and, according to experts, actually hinders efforts to prosecute sex traffickers and aid victims.
In our lawsuit, two human rights organizations, an individual advocate for sex workers, a certified non-sexual massage therapist, and the Internet Archive, are challenging the law as an unconstitutional violation of the First and Fifth Amendments.
Although the law was passed by Congress for the worthy purpose of fighting sex trafficking, its broad language makes criminals of those who advocate for and provide resources to adult, consensual sex workers and actually hinders efforts to
prosecute sex traffickers and aid victims.
FOSTA made three major changes to existing law. The first two involved changes to federal criminal law:
First, it created an entirely new federal crime by adding a new section to the Mann Act. The new law makes it a crime to "own, manage or operate" an online service with the intent to "promote or facilitate" "the
prostitution of another person." That crime is punishable by up to 10 years in prison. The law further makes it an "aggravated offense," punishable by up to 25 years in prison and also subject to civil lawsuits if
"facilitation" was of the prostitution of 5 or more persons, or if it was done with "reckless disregard" that it "contributed to sex trafficking." An aggravated violation may also be the basis for an individual's
civil lawsuit. The prior version of the Mann Act only made it illegal to physically transport a person across state lines for the purposes of prostitution.
Second, FOSTA expanded existing federal criminal sex trafficking law. Before SESTA, the law made it a crime to knowingly advertise sexual services of a minor or any person doing so only under force, fraud, or coercion, and also criminalized
several other modes of conduct. The specific knowledge requirement for advertising (that one must know the advertisement was for sex trafficking) was an acknowledgement that advertising was entitled to some First Amendment protection. The prior
law additionally made it a crime to financially benefit from "participation in a venture" of sex trafficking. FOSTA made seemingly a small change to the law: it defined "participation in a venture" extremely broadly to
include "assisting, supporting, or facilitating." But this new very broad language has created great uncertainty about liability for speech other than advertising that someone might interpret as "assisting" or
"supporting" sex trafficking, and what level of awareness of sex trafficking the participant must have.
As is obvious, these expansions of the law are fraught with vague and ambiguous terms that have created great uncertainty about what kind of online speech is now illegal. FOSTA does not define "facilitate," "promote," "contribute to sex trafficking," "assisting," or "supporting" -- but the inclusion of all of these terms shows that Congress intended the law to apply expansively. Plaintiffs thus reasonably fear it will be applied to them.
Plaintiffs Woodhull Freedom Foundation and Human Rights Watch advocate for the decriminalization of sex work, both domestically and internationally. It is unclear whether that advocacy is considered "facilitating" prostitution under
FOSTA. Plaintiffs Woodhull and Alex Andrews offer substantial resources online to sex workers, including important health and safety information. This protected speech, and other harm reduction efforts, can also be seen as "facilitating"
prostitution. And although each of the plaintiffs vehemently opposes sex trafficking, Congress's expressed
sense in passing the law was that sex trafficking and sex work were "inextricably linked." Thus, plaintiffs are legitimately concerned that their advocacy on behalf of sex workers will be seen as being done in reckless disregard of some
"contribution to sex trafficking," even though all plaintiffs vehemently oppose sex trafficking.
The third change significantly undercut the protections of one of the Internet's most important laws, 47 U.S.C. § 230, originally a provision of the Communications Decency Act, commonly known simply as Section 230 or CDA 230:
FOSTA significantly undermined the legal protections intermediaries had under 47 U.S.C. § 230, commonly known simply as Section 230. Section 230 generally immunized intermediaries from liability arising from content created by others -- it was thus the chief protection that allowed Internet platforms for user-generated content to exist without having to review every piece of content posted to them for potential legal liability. FOSTA undercut this immunity in three
significant ways. First, Section 230 already had an exception for violations of federal criminal law, so the expansion of criminal law described above also automatically expanded the Section 230 exception. Second, FOSTA nullified the immunity
also for state criminal lawsuits for violations of state laws that mirror the violations of federal law. And third, FOSTA allows for lawsuits by individual civil litigants.
The possibility of these state criminal and private civil lawsuits is very troublesome. FOSTA vastly magnifies the risk an Internet host bears of being sued. Whereas federal prosecutors typically carefully pick and choose which violations of law they pursue, the far more numerous state prosecutors may be prone to less selective prosecutions. And civil litigants often do not carefully consider the legal merits of an action before pursuing it in court. Past experience teaches us that
they might file lawsuits merely to intimidate a speaker into silence -- the cost of defending even a meritless lawsuit being quite high. Lastly, whereas with federal criminal prosecutions, the US Department of Justice may offer clarifying
interpretations of a federal criminal law that addresses concerns with a law's ambiguity, those interpretations are not binding on state prosecutors and the millions of potential private litigants.
FOSTA Has Already Censored The Internet
As a result of these hugely increased risks of liability, many platforms for online speech have shuttered or restructured. The following are just two examples:
Two days after the Senate passed FOSTA, Craigslist eliminated its Personals section, including non-sexual subcategories such as "Missed Connections" and "Strictly Platonic." Craigslist attributed this change to FOSTA, explaining: "Any tool or service can be misused. We can't take such risk without jeopardizing all our other services, so we are regretfully taking craigslist personals offline. Hopefully we can bring them back some day." Craigslist also shut down its Therapeutic Services section and will not permit ads that were previously listed in Therapeutic Services to be re-listed in other sections, such as Skilled Trade Services or Beauty Services.
VerifyHim formerly maintained various online tools that helped sex workers avoid abusive clients. It described itself as "the biggest dating blacklist database on earth." One such resource was JUST FOR SAFETY, which had screening tools
designed to help sex workers check to see if they might be meeting someone dangerous, create communities of common interest, and talk directly to each other about safety. Following passage of FOSTA, VerifyHim took down many of these tools,
including JUST FOR SAFETY, and explained that it is "working to change the direction of the site."
Plaintiff Eric Koszyk is a certified massage therapist running his own non-sexual massage business as his primary source of income. Prior to FOSTA he advertised his services exclusively in Craigslist's Therapeutic Services section. That forum is
no longer available and he is unable to run his ad anywhere else on the site, thus seriously harming his business. Plaintiff the Internet Archive fears that it can no longer rely on Section 230 to bar liability for content created by third parties
and hosted by the Archive, which comprises the vast majority of material in the Archive's collection, on account of FOSTA's changes to Section 230. The Archive is concerned that some third-party content hosted by the Archive, such as archives of
particular websites, information about books, and the books themselves, could be construed as promoting or facilitating prostitution, or assisting, supporting, or facilitating sex trafficking under FOSTA's expansive terms. Plaintiff Alex Andrews
maintains the website RateThatRescue.org, a sex worker-led, public, free, community effort to share information about both the organizations and services on which sex workers can rely, and those they should avoid. Because the site is largely
user-generated content, Andrews relies on Section 230's protections. She is concerned that FOSTA now exposes her to potentially ruinous civil and criminal liability. She has also suspended moving forward with an app that would offer harm
reduction materials to sex workers. Human Rights Watch relies heavily on individuals spreading its reporting and advocacy through social media. It is concerned that social media platforms and websites that host, disseminate, or allow users to
spread their reports and advocacy materials may be inhibited from doing so because of FOSTA.
And many, many others are experiencing the same uncertainty and fears of prosecution that have plagued advocates, service providers, platforms, and platform users since FOSTA became law.
We have asked the court to preliminarily enjoin enforcement of the law so that the plaintiffs and others can exercise their First Amendment rights until the court can issue a final ruling. But there is another urgent reason to halt enforcement of
the law. Plaintiff Woodhull Freedom Foundation is holding its annual Sexual Freedom Summit August 2-, 2018. Like past years, the Summit features a track on sex work, this year titled "Sex as Work," that seeks to advance and promote the
careers, safety, and dignity of individuals engaged in professional sex work. In presenting and promoting the Sexual Freedom Summit, and the Sex Work Track in particular, Woodhull operates and uses interactive computer services in numerous ways:
Woodhull uses online databases and cloud storage services to organize, schedule and plan the Summit; Woodhull exchanges emails with organizers, volunteers, website developers, promoters and presenters during all phases of the Summit; Woodhull has
promoted the titles of all workshops on its Summit website; Woodhull also publishes the biographies and contact information for workshop presenters on its website, including those for the sex workers participating in the Sex Work Track and other tracks. Is publishing the name and contact information for
a sex worker "facilitating the prostitution of another person"? If it is, FOSTA makes it a crime.
Moreover, most, if not all, of the workshops are also promoted by Woodhull on social media such as Facebook and Twitter; and Woodhull wishes to stream the Sex Work Track on Facebook, as it does other tracks, so that those who cannot attend can
benefit from the information and commentary.
Without an injunction, the legality under FOSTA of all of these practices is uncertain. The preliminary injunction is necessary so that Woodhull can conduct the Sex as Work track without fear of prosecution.
It is worth emphasizing that Congress was repeatedly warned that it was passing a law that would censor far more speech than was necessary to address the problem of sex trafficking, and that the law would indeed hinder law enforcement efforts and
pose great dangers to sex workers. During the Congressional debate on FOSTA and SESTA, anti-trafficking groups such as the Freedom Network and the International Women's Health Coalition issued statements warning that the laws would hurt efforts to aid trafficking victims, not help them.
Even Senator Richard Blumenthal, an original cosponsor of SESTA (the Senate bill), criticized the new Mann Act provision when it was proposed in the House bill, saying "there is no good reason to proceed with a proposal that is opposed by the very survivors it claims to support." Nevertheless, Senator Blumenthal ultimately voted to pass FOSTA.
In support of the preliminary injunction, we have submitted the declarations of several experts who confirm the harmful effects FOSTA is having: sex workers are being driven back to far more dangerous street-based work as online classified sites disappear; online "bad date lists" that informed sex workers of risks associated with certain clients have been lost; and sex trafficking has become less visible to law enforcement, which can no longer scour and analyze formerly public websites where it had been advertised.
For more information see the Declarations of Dr. Alexandra Lutnick, Prof. Alexandra Frell Levy, and Dr. Kimberly Mehlman-Orozco.
One moment Facebook's algorithms are expected to be able to automatically distinguish terrorism support from news reporting or satire; the next moment, they demonstrate exactly how crap they are by failing to distinguish hate speech from a profound, nation-establishing statement of citizens' rights.
Facebook's algorithms removed parts of the US Declaration of Independence from the social media site after determining they represented hate speech.
The issue came to light when a local paper in Texas began posting excerpts of the historic text on its Facebook page each day in the run up to the country's Independence Day celebrations on July 4.
However, when The Liberty County Vindicator attempted to post its tenth excerpt, which refers to "merciless Indian savages", on its Facebook page, the paper received a notice saying the post went against Facebook's standards on hate speech.
Facebook later 'apologised' as it has done countless times before and allowed the posting.
As we have been covering in the last couple of articles, a controversial EU Copyright Directive has been under discussion at the European Parliament, and in a surprising turn of events, it voted to reject fast-tracking the proposal tabled by the JURI Committee, which contained controversial provisions, particularly in Art 11 and Art 13. The proposed Directive will now get a full discussion and debate in plenary in September.
I say surprising because for those of us who have been witnesses (and participants) to the Copyright Wars for the last 20 years, such a defeat of copyright maximalist proposals is practically unprecedented, perhaps with the exception of
. For years we've had a familiar pattern in the passing of copyright legislation: a proposal has been made to enhance protection and/or restrict liberties, a small group of ageing millionaire musicians would be paraded supporting the changes in
the interest of creators. Only copyright nerds and a few NGOs and digital rights advocates would complain, their opinions would be ignored and the legislation would pass unopposed. Rinse and repeat.
But something has changed, and a wide coalition has managed to defeat powerful media lobbies for the first time in Europe, at least for now. How was this possible?
The main change is that the media landscape is very different thanks to the Internet. In the past, the creative industries were monolithic in their support for stronger protection, and they included creators, corporations, collecting societies,
publishers, and distributors; in other words the gatekeepers and the owners were roughly on the same side. But the Internet brought a number of new players, the tech industry and their online platforms and tools became the new gatekeepers.
Moreover, as people no longer buy physical copies of their media and the entire industry has moved towards streaming, online distributors have become more powerful. This has created a perceived imbalance, where the formerly dominant industries need to negotiate with the new gatekeepers for access to users. This is why creators complain about a value gap between what they perceive they should be getting, and what they actually receive from the giants.
The main result of this change, from a political standpoint, is that there are now two lobbying sides in the debate, which makes all the difference when it comes to this type of legislation. In the past, policymakers could ignore experts and digital rights advocates because these never had the means to reach them: letters and articles by academics were not taken into account, or were given lip service during some obscure committee discussion just to be hidden away. Tech giants such as Google have provided lobbying access in Brussels, which has at least levelled the playing field when it comes to presenting evidence to legislators.
As a veteran of the Copyright Wars, I have to admit that it has been very entertaining reading the reaction from the copyright industry lobby groups and their individual representatives, some almost going apoplectic with rage at Google's
intervention. These tend to be the same people who spent decades lobbying legislators to get their way unopposed, representing large corporate interests unashamedly and passing laws that would benefit only a few, usually to the detriment of users.
It seems like lobbying must be decried when you lose.
But to see this as a victory for Google and other tech giants completely ignores the large coalition that shares the view that the proposed Articles 11 and 13 are very badly thought out, and could represent a real danger to existing rights. Some of us have been fighting this fight since before Google even existed, when it was but a small competitor of AltaVista, Lycos, Excite and Yahoo!
At the same time that more restrictive copyright legislation came into place, we also saw the rise of free and open source software, open access, Creative Commons and open data. All of these are legal hacks that allow sharing, remixing and
openness. These were created precisely to respond to restrictive copyright practices. I also remember how they were opposed as existential threats by the same copyright industries, and treated with disdain and animosity. But something wonderful happened: eventually open source software started winning (we used to buy operating systems), and Creative Commons became an important part of the Internet's ecosystem by propping up valuable common spaces such as Wikipedia.
Similarly, the Internet has allowed a great diversity of actors to emerge. Independent creators, small and medium enterprises, online publishers and startups love the Internet because it gives them access to a wider audience, and often they can
bypass established gatekeepers. Lost in this idiotic "Google v musicians" rhetoric has been the threat that both Art 11 and 13 represent to small entities. Art 11 proposes a new publishing right that has already been shown to hurt smaller players in Germany and Spain, while Art 13 would impose potentially crippling economic restrictions on smaller companies, as they would have to put in place automated filtering systems AND redress mechanisms against mistakes. In fact, it has often been remarked that Art 13 would benefit existing dominant forces, as they already have filtering in place (think ContentID).
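As a rough illustration of why this favours incumbents, consider the simplest possible upload filter: checking every upload against a database of rights-holder fingerprints. This is a hypothetical sketch with invented data, nothing like ContentID's perceptual audio/video matching at scale, but the structure -- every upload checked against a rights database before publication -- is what Art 13 would require of every platform, however small.

```python
import hashlib

# Hypothetical upload filter: exact-hash matching against a database
# of copyrighted works. The works and the matching method are invented
# for illustration; real systems use fuzzy perceptual fingerprints.
rights_db = {
    hashlib.sha256(work).hexdigest()
    for work in [b"<copyrighted song A>", b"<copyrighted film B>"]
}

def allow_upload(data: bytes) -> bool:
    """Reject any upload whose hash appears in the rights database."""
    return hashlib.sha256(data).hexdigest() not in rights_db

# An exact copy is blocked...
assert not allow_upload(b"<copyrighted song A>")
# ...but a trivially altered copy slips straight through.
assert allow_upload(b"<copyrighted song A> remix")
```

The second assertion shows why exact matching is useless in practice and why real filters must use fuzzy matching instead -- which is precisely where they start misfiring on quotation, parody and other fair uses, and why the redress mechanisms matter.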
Similarly, Internet advocates and luminaries see the proposals as a threat to the Internet; the people who know the Web best think that this is a bad idea. If you can stomach it, read this thread featuring a copyright lobbyist attacking Neil Gaiman, one of the Internet celebrities who have voiced their concerns about the Directive.
Even copyright experts who almost never intervene in digital rights affairs have been vocal in their opposition to the changes.
And finally we have political representatives from various parties and backgrounds who have been vocally opposed to the changes. The political opposition has been led by the amazing Julia Reda, who has managed to bring together a variety of voices from other parties and countries. The vitriol launched at her has been unrelenting, but futile. It has been quite a sight to see her opponents try to dismiss her as just another clueless young Pirate commanded by Google, while at the same time portraying her as a powerful enemy in charge of mindless and uninformed online troll masses ready to do her bidding.
All of the above managed to do something wonderful, which was to convey the threat in easy-to-understand terms so that users could contact their representatives and make their voice heard. The level of popular opposition to the Directive has been
a great sight to behold.
Tech giants did not create this alliance; they just gave various voices access to the table. To dismiss this as Google's doing completely ignores the very real and rich tapestry of those defending digital rights. It is quite clearly patronising and insulting, and precisely the reason why they lost: it was only very late that they finally realised they were losing the debate with the public, and not even the last-minute deployment of musical dinosaurs could save the day.
But the fight continues: keep contacting your MEPs and keep applying pressure.
So who supported internet censorship in the EU parliamentary vote?
Mostly the EU Conservative group, plus half of the Social Democrat MEPs and half of the Far Right MEPs.
In May, Tanzanian bloggers lost an appeal that had temporarily suspended a new set of regulations granting the country's Communication
Regulatory Authority discretionary powers to censor online content.
Officially dubbed the Electronic and Postal Communications (Online Content) Regulations, 2018, the statute requires online content creators -- traditional media websites, online TV and radio channels, but also individual bloggers and podcasters -- to pay roughly two million Tanzanian shillings (930 US dollars) in registration and licensing fees.
They must store contributors' details for 12 months and have means to identify their sources and disclose financial sponsors. Cyber cafes must install surveillance cameras, and all owners of electronic mobile devices, including phones, have to
protect them with a password. Failure to comply with the regulations -- which also forbid online content that is indecent, annoying, or that leads to public disorder -- will result in a fine of five million Tanzanian shillings (2,202 US dollars), a jail term of not less than a year, or both.
These new regulations are already forcing young content creators -- often the poorer ones -- offline. For a country like Tanzania, whose GDP per capita is 879 US dollars, and where approximately 70% of the population lives on less than two dollars a day, the financial burden of these new laws makes it impossible for many to continue blogging.
Today we're releasing our latest desktop browser Brave 0.23 which features Private Tabs with Tor, a technology for defending against network surveillance. This new functionality, currently in beta, integrates Tor into the browser and gives users a
new browsing mode that helps protect their privacy not only on device but over the network. Private Tabs with Tor help protect Brave users from ISPs (Internet Service Providers), guest Wi-Fi providers, and visited sites that may be watching their
Internet connection or even tracking and collecting IP addresses, a device's Internet identifier.
Private Tabs with Tor are easily accessible from the File menu by clicking New Private Tab with Tor. The integration of Tor into the Brave browser makes enhanced privacy protection conveniently accessible to any Brave user directly within the
browser. At any point in time, a user can have one or more regular tabs, session tabs, private tabs, and Private Tabs with Tor open.
The Brave browser already automatically blocks ads, trackers, cryptocurrency mining scripts, and other threats in order to protect users' privacy and security, and Brave's regular private tabs do not save a user's browsing history or cookies.
Private Tabs with Tor improve user privacy in several ways. It makes it more difficult for anyone in the path of the user's Internet connection (ISPs, employers, or guest Wi-Fi providers such as coffee shops or hotels) to track which websites a
user visits. Also, web destinations can no longer easily identify or track a user arriving via Brave's Private Tabs with Tor by means of their IP address. Users can learn more about how the Tor network works by watching this video.
Private Tabs with Tor default to DuckDuckGo as the search engine, but users have the option to switch to one of Brave's other nineteen search providers. DuckDuckGo does not collect or share users' personal information, and welcomes anonymous users without impacting their search experience -- unlike Google, which challenges anonymous users to prove they are human and makes their search less seamless.
In addition, Brave is contributing back to the Tor network by running Tor relays. We are proud to be adding bandwidth to the Tor network, and intend to add more bandwidth in the coming months.
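The network-level protection described above is what any Tor-aware application gets by sending its traffic through Tor's local SOCKS proxy, so the destination site sees a Tor exit node's IP address rather than the user's. A minimal sketch, assuming a standalone Tor daemon on its default port 9050 (Brave bundles its own embedded Tor client rather than using this):

```python
# Hypothetical sketch: pointing an HTTP client at a local Tor SOCKS
# proxy. Assumes a Tor daemon listening on 127.0.0.1:9050, the
# standalone default. The socks5h scheme means hostnames are also
# resolved through Tor, so DNS lookups don't leak to the local ISP.
TOR_SOCKS = "socks5h://127.0.0.1:9050"

proxies = {"http": TOR_SOCKS, "https": TOR_SOCKS}

# With the `requests` library (and its SOCKS extra) installed and Tor
# running, a request routed over Tor would look like:
#   requests.get("https://check.torproject.org", proxies=proxies)

assert all(url.startswith("socks5h://") for url in proxies.values())
```

The choice of `socks5h` over plain `socks5` is the detail that matters for privacy: with the latter, the DNS query still goes out over the ordinary connection, revealing which site is being visited.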
Instagram has apologised for censoring a photo of two men kissing for violating community guidelines.
The photo - featuring Jordan Bowen and Luca Lucifer - was taken down from photographer Stella Asia Consonni's Instagram.
A spokesperson for the image sharing site regurgitated the usual apology for shoddy censorship saying
This post was removed in error and we are sorry. It has since been reinstated.
The photo was published in i-D magazine as part of a series of photos by Stella exploring modern relationships, which she plans to exhibit later this year. It only reappeared after prominent people in fashion and LGBT+ rights raised awareness
about the removal of the photo.
Google, Facebook, YouTube and other sites would be required by law to take down extremist material within 24 hours of receiving an
official complaint under an amendment put forward for inclusion in new counter-terror legislation.
The Labour MP Stephen Doughty's amendment echoes censorship laws that came into effect in Germany last year. However, the effect of the German law was to enable no-questions-asked censorship of anything the government doesn't like. Social media companies have no interest in challenging unfair censorship and find that the easiest and cheapest way to comply is to err on the side of the government, taking down anything asked regardless of the merits of the case.
The counter-terrorism strategy unveiled by the home secretary, Sajid Javid, this month, said the Home Office would place a renewed emphasis on engagement with internet providers and work with the tech industry to seek more investment in
technologies that automatically identify and remove terrorist content before it is accessible to all.
But Doughty, a member of the home affairs select committee, said his amendment was needed because the voluntary approach was failing. He said a wide variety of extremist content remained online despite repeated warnings.
If these companies can remove copyrighted video or music content from companies like Disney within a matter of hours, there is no excuse for them to be failing to do so for extremist material.
Doughty's amendment would also require tech companies to proactively check content for extremist material and take it down within six hours of it being identified.
The proactive check of content alludes to the censorship machines being introduced by the EU to scan uploads for copyrighted material. Extending these to detect terrorist material, coupled with the err-on-the-side-of-caution approach, would inevitably lead to the automatic censorship of any content that even uses the vocabulary of terrorism, regardless of whether it is news reporting, satire or criticism.
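The overblocking problem is structural, not a matter of tuning. A toy sketch (invented blocklist, nothing like a production classifier) shows why vocabulary-based filtering cannot distinguish incitement from reporting:

```python
# Hypothetical naive keyword filter. Real moderation systems are far
# more sophisticated, but the failure mode shown here -- flagging
# reporting and commentary that merely mention the vocabulary of
# terrorism -- is the one critics of automated takedown point to.
BLOCKLIST = {"bomb", "attack", "terrorist"}  # invented example terms

def is_flagged(text: str) -> bool:
    """Flag any text containing a blocklisted word, regardless of intent."""
    words = {w.strip(".,!?\"'").lower() for w in text.split()}
    return not BLOCKLIST.isdisjoint(words)

# Genuine incitement is flagged...
assert is_flagged("Plant the bomb at dawn")
# ...but so is straightforward news reporting about the same event.
assert is_flagged("Police safely defused the bomb found yesterday")
# Only text that avoids the vocabulary entirely passes.
assert not is_flagged("The new drama is a quiet family story")
```

With companies incentivised to take down anything flagged rather than pay for human review, the second case -- lawful reporting -- is exactly what gets removed.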
Two human rights organizations, a digital library, an activist for sex workers, and a certified massage therapist have filed a lawsuit asking a federal court to block enforcement of FOSTA, the new federal law that silences online speech by forcing
speakers to self-censor and requiring platforms to censor their users. The plaintiffs are represented by the Electronic Frontier Foundation (EFF), Davis, Wright Tremaine LLP, Walters Law Group, and Daphne Keller.
In Woodhull Freedom Foundation et al. v. United States, the plaintiffs argue that FOSTA is unconstitutional, muzzling online speech that protects and advocates for sex workers and forcing well-established, general-interest community forums offline for fear of criminal charges and heavy civil liability for things their users might share.
FOSTA, or the Allow States and Victims to Fight Online Sex Trafficking Act, was passed by Congress in March. But instead of focusing on the perpetrators of sex trafficking, FOSTA goes after online speakers, imposing harsh penalties for any website
that might facilitate prostitution or contribute to sex trafficking. The vague language and multiple layers of ambiguity are driving constitutionally protected speech off the Internet at a rapid pace.
For example, plaintiff the Woodhull Freedom Foundation works to support the health, safety, and protection of sex workers, among other things. Woodhull wanted to publish information on its website to help sex workers understand what FOSTA meant to
them. But instead, worried about liability under FOSTA, Woodhull was forced to censor its own speech and the speech of others who wanted to contribute to their blog. Woodhull is also concerned about the impact of FOSTA on its upcoming annual
summit, scheduled for next month.
FOSTA chills sexual speech and harms sex workers, said Ricci Levy, executive director of the Woodhull Freedom Foundation. It makes it harder for people to take care of and protect themselves, and, as an organization working to protect people's
fundamental human rights, Woodhull is deeply concerned about the damaging impact that this law will have on all people.
FOSTA calls into serious question the legality of online speech that advocates for the decriminalization of sex work, or provides health and safety information to sex workers. Human Rights Watch (HRW), an international organization that is also a
plaintiff, advocates globally for ways to protect sex workers from violence, health risks, and other human rights abuses. The group is concerned that its efforts to expose abuses against sex workers and decriminalize voluntary sex work could be
seen as facilitating prostitution, or in some way assisting sex trafficking.
HRW relies heavily on individuals spreading its reporting and advocacy through social media, said Dinah Pokempner, HRW General Counsel. We are worried that social media platforms and websites may block the sharing of this information out of
concern it could be seen as demonstrating a reckless disregard of sex trafficking activities under FOSTA. This law is the wrong approach to the scourge of sex trafficking.
But FOSTA doesn't just impede the work of sex educators and activists. It also led to the shutdown of Craigslist's Therapeutic Services section, which has imperiled the business of a licensed massage therapist who is another plaintiff in this
case. The Internet Archive joined this lawsuit against FOSTA because the law might hinder its work of cataloging and storing 330 billion web pages from 1996 to the present.
Because of the critical issues at stake, the lawsuit filed today asks the court to declare that FOSTA is unconstitutional, and asks that the government be permanently enjoined from enforcing the law.
FOSTA is the most comprehensive censorship of Internet speech in America in the last 20 years, said EFF Civil Liberties Director David Greene. Despite good intentions, Congress wrote an awful and harmful law, and it must be struck down.
What is the mysterious hold that US Big Music has over Euro politicians?
Article 13, the proposed EU legislation that aims to restrict safe harbors for online platforms, was crafted to end the so-called "Value Gap" on YouTube.
Music piracy was traditionally viewed as an easy to identify problem, one that takes place on illegal sites or via largely uncontrollable peer-to-peer networks. In recent years, however, the lines have been blurred.
Sites like YouTube allow anyone to upload potentially infringing content which is then made available to the public. Under the safe harbor provisions of US and EU law, this remains legal -- provided YouTube takes content down when told to do so.
It complies constantly but there's always more to do.
This means that in addition to being one of the greatest legal platforms ever created, YouTube is also a goldmine of unlicensed content, something unacceptable to the music industry.
They argue that the existence of this pirate material devalues the licensed content on the platform. As a result, YouTube maintains a favorable bargaining position with the labels and the best licensing deal in the industry.
The difference between YouTube's rates and those the industry would actually like is now known as the "Value Gap", and it has become one of the hottest topics in recent years.
In fact, it is so controversial that new copyright legislation, currently weaving its way through the corridors of power in the EU Parliament, is specifically designed to address it.
If passed, Article 13 will require platforms like YouTube to pre-filter uploads to detect potential infringement. Indeed, the legislation may as well have been named the YouTube Act, since it's the platform that provoked this entire debate and
whole Value Gap dispute.
With that in mind, it is of interest to consider the words of YouTube's global head of music, Lyor Cohen, this week. In an interview with Music Week, Cohen pledges that his company's new music service, YouTube Music, will not only match the rates the industry achieves from Apple Music and Spotify, but that the company's ad-supported free tier will soon be delivering more cash to the labels too. "Of course [rights holders are] going to get more money," he told Music Week.
If YouTube lives up to its pledge, a level playing field will not only be welcomed by the music industry but also YouTube competitors such as Spotify, who currently offer a free tier on less favorable terms.
While there's still plenty of room for YouTube to maneuver, peace breaking out with the labels may be coming a little too late for those deeply concerned about the implications of Article 13.
YouTube's business model and its reluctance to pay full market rate for music is what started the whole Article 13 movement in the first place and with the Legal Affairs Committee of the Parliament (JURI)
adopting the proposals last week
, time is running out to have them overturned.
Behind the scenes, however, the labels and their associates are going flat out to ensure that Article 13 passes, whether YouTube decides to "play fair" or not. Their language suggests that force is the best negotiating tactic with the platform.
Yesterday, UK Music CEO Michael Dugher led a delegation to the EU Parliament in support of Article 13. He was joined by deputy Labour leader Tom Watson and representatives from the BPI, PRS, and Music Publishers Association, who urged MEPs to
support the changes.
California is considering a bill that would require the state's attorney general to create a board of internet censors that would target fake news.
The group would include at least one person from the Department of Justice, representatives from social media providers, civil liberties advocates, and First Amendment scholars, according to CBS13. They would theoretically study how fake stories
spread through social media and then advise platforms on how to stop them.
The nonprofit Electronic Frontier Foundation is already taking a stand against the measure, noting that it violates the First Amendment and makes the government responsible for deciding whether news is true or false.
Litigation can always take twists and turns, but when EFF filed a lawsuit against Universal Music Group in 2007 on behalf of Stephanie Lenz, few would have anticipated it would be ten years until the case was finally resolved. But
, at last, it is. Along the way, Lenz v. Universal contributed to strengthening fair use law, bringing nationwide attention to the issues of copyright and fair use in new digital movie-making and sharing technologies.
It all started when Lenz posted a YouTube video
of her then-toddler-aged son dancing while Prince's song Let's Go Crazy played in the background, and Universal used copyright claims to get the link disabled. We brought the case hoping to get some clarity from the courts on a simple but important issue: can a rightsholder use the Digital Millennium Copyright Act to take down an obvious fair use, without consequence?
Congress designed the DMCA
to give rightsholders, service providers, and users relatively precise rules of the road for policing online copyright infringement. The center of the scheme is the notice and takedown process. In exchange for substantial protection from liability
for the actions of their users, service providers must promptly take down content on their platforms that has been identified as infringing, and follow several other prescribed steps. Copyright owners, for their part, are given an expedited,
extra-judicial procedure for obtaining redress against alleged infringement, paired with explicit statutory guidance regarding the process for doing so, and provisions designed to deter and ameliorate abuse of that process.
Without Section 512, the risk of crippling liability for the acts of users would have prevented the emergence of most of the social media outlets we use today. Instead, the Internet has become the most revolutionary platform for the creation and
dissemination of speech that the world has ever known.
But Congress also knew that Section 512's powerful incentives could also result in lawful material being censored from the Internet, without prior judicial scrutiny--much less advance notice to the person who posted the material--or an opportunity
to contest the removal. To inhibit abuse, Congress made sure that the DMCA included a series of checks and balances, including Section 512(f), which gives users the ability to hold rightsholders accountable if they send a DMCA notice in bad faith.
In this case, Universal Music Group claimed to have a good faith belief that Ms. Lenz's video of her child dancing to a short segment of barely-audible music infringed copyright. Yet the undisputed facts showed Universal never considered whether
Ms. Lenz's use was lawful under the fair use doctrine. If it had done so, it could not reasonably have concluded her use was infringing. On behalf of Stephanie Lenz, EFF argued that this was a misrepresentation in violation of Section 512(f).
In response, Universal argued that rightsholders have no obligation to consider fair use at all. The U.S. Court of Appeals for the Ninth Circuit rejected
that argument, correctly holding that the DMCA requires a rightsholder to consider whether the uses she targets in a DMCA notice are actually lawful under the fair use doctrine. However, the court also held that a rightsholder's determination on
that question passes muster as long as she subjectively believes it to be true. This leads to a virtually incoherent result: a rightsholder must consider fair use, but has no incentive to actually learn what such a consideration should entail.
After all, if she doesn't know what the fair use factors are, she can't be held liable for not applying them thoughtfully.
We were disappointed in that part of the ruling, but it came with a big silver lining: the court also held that fair use is not simply a narrow defense to copyright infringement but an affirmative public right. For decades, rightsholders and scholars had
debated the issue, with many preferring to construe fair use as narrowly as possible. Thanks to the Lenz decision, courts will be more likely to think of fair use, correctly, as a crucial vehicle for achieving the real purpose of copyright
law: to promote the public interest in creativity and innovation. And rightsholders are on notice: they must at least consider fair use before sending a takedown notice.
Lenz and Universal filed petitions requesting
that the Supreme Court review the Ninth Circuit's ruling. The Supreme Court denied both petitions. This meant that the case returned to the district court for trial on the question of whether Universal's takedown was a misrepresentation under the
Ninth Circuit's subjective standard. Rather than go to trial, the parties have agreed to a settlement.
Lenz v. Universal helped make some great law on fair use and also played a role in leading to better takedown processes at Universal. EFF congratulates Stephanie Lenz for fighting the good fight, and we thank our co-counsel at
Keker, Van Nest & Peters LLP
and Kwun Bhansali Lazarus LLP
for being our partners through this long journey.
An item mocking China, Xi Jinping and Trump on John Oliver's HBO show Last Week Tonight seems to have wound up China's censors.
HBO's website has been blocked in China and social media censors have been working hard to eliminate comments about the show.
According to the anti-censorship and monitoring group Greatfire.org, HBO's website was completely blocked within China as of Saturday, days after media reports emerged that Weibo had censored new posts mentioning Oliver or his HBO show Last Week Tonight.
In the show, Oliver made fun of the Chinese president's apparent sensitivity over comparisons of his figure with that of Winnie the Pooh. Images of the AA Milne character, used to mock Xi, have been censored in China. Oliver also took a serious
tone in the show, criticising Xi for the removal of term limits from the Chinese constitution, the use of political re-education camps in the Muslim province of Xinjiang, and a crackdown on civil society. Oliver noted the continued house arrest of
Liu Xia, wife of Chinese dissident and Nobel laureate Liu Xiaobo, who died last year while serving an 11-year prison sentence.
Video-on-demand streaming providers in Asean (Association of Southeast Asian Nations) countries, including ASTRO, dimsum, Fox+, HOOQ,
iflix, Netflix, tonton, TVB and Walt Disney, have joined forces to launch a self-censorship Subscription Video-on-Demand Industry Content Code.
The censorship rules ensure that the content offered on these platforms is authentic, free from hate speech, pornography and other forms of inappropriate content.
Furthermore, the Code also aims to provide users with age-appropriate content advice.
Companies participating in the Code said in a statement:
We share a mutual objective of putting consumer well-being at the heart of our services. This Code demonstrates our commitment to making sure that the consumer is able to make content viewing choices that are right for them and their families.
They also welcome other video-on-demand services to work under their rules.
On July 1, the Ugandan government began enforcing a new law that imposes a 200 shilling [US$0.05, £0.04] daily levy on people using internet messaging platforms, despite protests from local and international online free speech advocates.
This move, according to Ugandan President Yoweri Museveni, has the dual purpose of strengthening the national budget and also curtailing gossip by Ugandans on social media. It was also popular among local telecom providers, who do not directly
benefit from the use of foreign-based over-the-top services such as Facebook, Twitter, and WhatsApp.
The policy was preceded by an order to register all new mobile SIM cards with the National Biometric Data Centre. The measure also forces Ugandans to use only mobile money accounts to recharge their SIM cards and makes it mandatory to pay a one percent levy on the total value of any mobile money transaction.
These new policies make it more costly for Ugandans -- especially those living in poverty -- to communicate and perform everyday tasks using their mobile devices.
On July 2, civil society and legal advocates in Uganda filed a court challenge against the law, arguing that it violates the country's constitution.
A protester demonstrates his opposition to Uganda's social media tax at a gathering on July 6, 2018.
On July 6, concerned citizens and civil society advocates issued a joint press statement [see below] calling on Ugandans to avoid paying the tax by using alternate methods to exchange money and access social media, and to join a National Day of
Peaceful Protest Against Unfair Taxation on Wednesday, July 11, 2018.
The Global Voices community and our network of friends and allies wish to support this and other efforts to demand an end to the tax. We believe that this tax is simply a ploy to censor Ugandans and gag dissenting voices.
We believe social media should be freely accessible for all people, including Ugandans. The Ugandan social media tax must go!
On Monday, July 9, beginning at 14:00 East Africa Time, we plan to tweet at community leaders, government and diplomatic actors, and media influencers to increase awareness and draw public attention to the issue. We especially encourage fellow
bloggers and social media users all over the world to join us.
Social media censor announced to tackle gang-related online content
The Home Secretary Sajid Javid has announced £1.38 million to strengthen the police's response to violent and gang-related online content.
Funding from the government's £40 million Serious Violence Strategy will be used to create a 20-strong team of police staff and officers tasked with disrupting and removing overt and covert gang-related online content.
The social media censor will proactively flag illegal and harmful online content for social media companies to take down. Hosted by the Metropolitan Police, the new capability will also prevent violence on our streets by identifying gang-related
messages generating the most risk and violence.
The move follows the Serious Violence Taskforce chaired by the Home Secretary urging social media companies to do more to take down these videos. The Home Secretary invited representatives from Facebook and Google to Monday's meeting to explain
the preventative action they are already taking against gang material hosted on their platforms.
Home Secretary Sajid Javid said:
Street gangs are increasingly using social media as a platform to incite violence, taunt each other and promote crime.
This is a major concern and I want companies such as Facebook and Google to do more.
We are taking urgent action and the new social media hub will improve the police's ability to identify and remove this dangerous content.
Duncan Ball, Deputy Assistant Commissioner of the Metropolitan Police Service and National Policing lead for Gangs, said:
Police forces across the country are committed to doing everything we can to tackle violent crime and the impact that it has on our communities. Through this funding we can develop a team that is a centre of expertise and excellence that will
target violent gangs and those plotting and encouraging violence online.
By working together with social media companies we will ensure that online material that glamourises murder, lures young people into a dangerous, violent life of crime, and encourages violence is quickly dealt with to cut off this outlet for
gangs and criminals.
Looking to the future we aim to develop a world class capability that will tackle the type of dangerous social media activity that promotes or encourages serious violence.
It is already an offence to incite, assist, or encourage violence online, and the Home Office is focused on building on the relationships made with social media providers to identify where we can take action relevant to tackling serious violence.
Comment: Making music videos is not a criminal activity -- no matter what genre
West London music group 1011 has recently been banned from recording or performing music without police permission.
On June 15, the Metropolitan police issued the group, which has been the subject of a two-year police investigation, with a Criminal Behaviour Order .
For the next three years, five members of the group -- which creates and performs a UK version of drill, a genre of hip-hop that emerged from Chicago -- must give 24 hours notice of the release of any music video, and 48 hours notice of any live
performance. They are also banned from attending Notting Hill Carnival and wearing balaclavas.
This is a legally unprecedented move, but it is not without context. A recent Amnesty UK report on the Metropolitan Police Gangs Matrix -- a risk assessment tool that links individuals to gang related crime -- stated that:
The sharing of YouTube videos and other social media activity are used as potential criteria for adding names to the Matrix, with grime music videos featuring gang names or signs considered a particular possible indicator of likely gang affiliation.
Furthermore, recent research indicates that almost 90% of those on the Matrix are black or ethnic minority.
For young people who make music, video is a key way to share their work with a wider audience. Online platforms such as SBTV, LinkUp TV , GRM daily and UK Grime are all popular sites. Often using street corners and housing estates as a location,
these videos are a central component of the urban music scene. But the making of these music videos appears to feed into a continuing unease about youth crime and public safety.
Fifteen years ago, ministers were concerned about rap lyrics; in 2007 some MPs demanded to have videos banned after a shooting in Liverpool. UK drill music is only the focus of the most recent crackdown by the Metropolitan police, which has
requested YouTube to remove any music videos with violent content.
The production and circulation of urban music videos has become a contested activity -- and performance in the public sphere is presented as a cause for concern. This is leading to the criminalisation of everyday pursuits. Young people from poor
backgrounds are now becoming categorised as troublemakers through the mere act of making a music video.
Acid Software, the developer of a shooting simulator recently removed from Steam, will now struggle to sell its products
online thanks to censorship by PayPal.
The Active Shooter developer said this week that purchases of its highly controversial game were temporarily disabled while it tried to resolve issues with PayPal.
PayPal has confirmed it has banned the account, saying:
PayPal has a longstanding, well-defined and consistently enforced Acceptable Use Policy, and regardless of the individual or organisation in question, we work to ensure that our services are not used to accept payments for activities that promote
violence, PayPal said in a statement.
Acid Software spokesperson Ata Berdyev told the Associated Press the future of the game is now in doubt.
The European Parliament's Committee on Legal Affairs (JURI) has officially approved Articles 11 and 13 of a Digital Single
Market (DSM) copyright proposal, mandating censorship machines and a link tax.
Articles 11 and 13 of the Directive of the European Parliament and of the Council on Copyright in the Digital Single Market have been the subject of considerable campaigning from pro-copyleft groups including the Open Rights Group and Electronic
Frontier Foundation of late.
Article 11, as per the final version of the proposal, discusses the implementation of a link tax -- the requirement that any site citing third-party materials do so in a way that adheres to the exemptions and restrictions of a total of 28 separate
copyright laws, or pays for a licence to use and link to the material;
Article 13, meanwhile, requires any site which allows users to post text, sound, program code, still or moving images, or any other work which can be copyrighted to automatically scan all such uploads against a database of copyright works -- a
database which they will be required to pay to access.
Neither Article 11 nor Article 13 will become official legislation until passed by the entire European Parliament in a plenary vote. There's no definite timetable for when such a vote might take place, but it would likely happen sometime between
December of this year and the first half of 2019.
On June 20, the EU's legislative committee will vote on the new Copyright Directive, and decide whether it will include the controversial "Article 13" (automated censorship of anything an algorithm identifies as a copyright violation) and "Article 11" (no linking to news stories without paid permission from the publisher).
These proposals will make starting new internet companies effectively impossible -- Google, Facebook, Twitter, Apple, and the other US giants will be able to negotiate favourable rates and build out the infrastructure to comply with these
proposals, but no one else will. The EU's regional tech success stories -- say Seznam.cz
, a successful Czech search competitor to Google -- don't have $60-100,000,000 lying around to build out their filters, and lack the leverage to extract favourable linking licences from news sites.
If Articles 11 and 13 pass, American companies will be in charge of Europe's conversations, deciding which photos and tweets and videos can be seen by the public, and who may speak.
So far, the focus in the debate has been on the intended consequences of the proposals: the idea that a certain amount of free expression and competition must be sacrificed to enable rightsholders to force Google and Facebook to share their profits.
But the unintended -- and utterly foreseeable -- consequences are even more important. Article 11's link tax allows news sites to decide who gets to link to them, meaning that they can exclude their critics. With election cycles dominated by
hoaxes and fake news, the right of a news publisher to decide who gets to criticise it is carte blanche to lie and spin.
Article 13's copyright filters are even more vulnerable to attack: the proposals contain no penalties for false claims of copyright ownership, but they do mandate that the filters must accept copyright claims in bulk, allowing
rightsholders to upload millions of works at once in order to claim their copyright and prevent anyone from posting them.
That opens the doors to all kinds of attacks. The obvious one is that trolls might sow mischief by uploading millions of works they don't hold the copyright to, in order to prevent others from quoting them: the works of Shakespeare, say, or
everything ever posted to Wikipedia, or my novels, or your family photos.
More insidious is the possibility of targeted strikes during crisis: stock-market manipulators could use bots to claim copyright over news about a company, suppressing its sharing on social media; political actors could suppress key articles
during referendums or elections; corrupt governments could use arms-length trolls to falsely claim ownership of footage of human rights abuses.
It's asymmetric warfare: falsely claiming a copyright will be easy (because the rightsholders who want this system will not tolerate jumping through hoops to make their claims) and instant (because rightsholders won't tolerate delays when their
new releases are being shared online at their moment of peak popularity). Removing a false claim of copyright will require that a human at an internet giant looks at it, sleuths out the truth of the ownership of the work, and adjusts the database
-- for millions of works at once. Bots will be able to pollute the copyright databases much faster than humans can possibly clear them.
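The asymmetry described above can be sketched as a toy model. Everything here is hypothetical -- no real filter such as Content ID exposes this interface -- but it shows why bulk claims are cheap for the claimant while clearing false claims is expensive for the platform:

```python
# Toy model of an upload filter's blocklist: filing claims is a cheap
# bulk insert, but clearing a false claim needs one human review each.
# All names and functions here are illustrative, not any real API.

def file_claims(blocklist: set, fingerprints: list) -> None:
    """A claimant (honest or not) registers fingerprints in bulk -- near O(1) each."""
    blocklist.update(fingerprints)

def upload_allowed(blocklist: set, fingerprint: str) -> bool:
    """The filter blocks any upload whose fingerprint matches a registered claim."""
    return fingerprint not in blocklist

def human_review(blocklist: set, fingerprint: str, claim_is_valid: bool) -> None:
    """Removing one false claim requires a person to investigate it."""
    if not claim_is_valid:
        blocklist.discard(fingerprint)

# A troll claims a million public-domain works in a single batch...
blocklist: set = set()
file_claims(blocklist, [f"work-{i}" for i in range(1_000_000)])
print(upload_allowed(blocklist, "work-42"))   # False: the work is now blocked

# ...but each false claim must be un-blocked one human review at a time.
human_review(blocklist, "work-42", claim_is_valid=False)
print(upload_allowed(blocklist, "work-42"))   # True, after one of a million reviews
```

The point of the sketch is the cost mismatch: one bot-driven batch poisons a million entries instantly, while the platform's remedy works one adjudicated entry at a time.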
I spoke with Wired UK's KG Orphanides about this, and their excellent article
on the proposal is the best explanation I've seen of the uses of these copyright filters to create unstoppable disinformation campaigns.
Doctorow highlighted the potential for unanticipated abuse of any automated copyright filtering system to make false copyright claims, engage in targeted harassment and even silence public discourse at sensitive times.
"Because the directive does not provide penalties for abuse -- and because rightsholders will not tolerate delays between claiming copyright over a work and suppressing its public display -- it will be trivial to claim copyright over key
works at key moments or use bots to claim copyrights on whole corpuses.
"The nature of automated systems, particularly if powerful rightsholders insist that they default to initially blocking potentially copyrighted material and then releasing it if a complaint is made, would make it easy for griefers to use copyright
claims over, for example, relevant Wikipedia articles on the eve of a Greek debt-default referendum or, more generally, public domain content such as the entirety of Wikipedia or the complete works of Shakespeare.
"Making these claims will be MUCH easier than sorting them out -- bots can use cloud providers all over the world to file claims, while companies like Automattic (WordPress) or Twitter, or even projects like Wikipedia, would have to marshall
vast armies to sort through the claims and remove the bad ones -- and if they get it wrong and remove a legit copyright claim, they face unbelievable copyright liability."
Politicians, about to vote in favor of mandatory upload filtering in Europe, get channel deleted by
YouTube's upload filtering.
French politicians of the former Front National are furious: their entire YouTube channel was just taken down by automatic filters at YouTube for alleged copyright violations. Perhaps this will cause them to reconsider next week's vote, which they
have announced they will support: the bill that will make exactly this arbitrary, political, and unilateral upload filtering mandatory all across Europe.
The French party Front National, now renamed Rassemblement National (National Rally), which is one of the biggest parties in France, has had its YouTube channel disappeared on grounds of alleged copyright violations. In an interview with
French Europe 1, their leader Marine Le Pen calls the takedown arbitrary, political, and unilateral.
Europe is about to vote on new copyright law next week, next Wednesday or Thursday. So let's disregard for a moment that this happened to a party normally described as far-right, and observe that if it can happen to one of France's biggest
parties regardless of their policies, then it can happen to anyone, for political reasons or any other reason.
The broadcast named TVLibertés is gone, with YouTube explaining: YouTube has blocked the broadcast of the newscast of Thursday, June 14 for copyright infringement.
Marine Le Pen was quoted as saying, This measure is completely false; we can easily assert a right of quotation [to illustrate why the material was well within the law to broadcast].
She's right. Automated upload filters do not take into account when you have a legal right to broadcast copyrighted material for one of the myriad of valid reasons. They will just assume that such reasons never exist; if nothing else, to make
sure that the hosting platform steers clear of any liability. Political messages will be disappeared on mere allegations by a political opponent, just as might have happened here.
And yet, the Rassemblement National is going to vote in favor of exactly this mandatory upload filtering: the very horror they just described on national TV as arbitrary, political, and unilateral.
It's hard to illustrate more clearly that Europe's politicians have absolutely no idea about the monster they're voting on next week.
The decisions to come will be unilateral, political, and arbitrary. Freedom of speech will be unilateral, political, and arbitrary. Just as Marine Le Pen says. Just as YouTube's Content ID filtering is today, as has just been illustrated.
The article mandating this unilateral, political, and arbitrary censorship is called Article 13 of the upcoming European Copyright bill, and it must be removed entirely. There is no fixing of automated censorship machines.
Privacy remains your own responsibility. So do your freedoms of speech, information, and expression.
David Kaye, the UN's Special Rapporteur on freedom of expression has now chimed in with a very thorough report, highlighting how Article 13 of the
Directive -- the part about mandatory copyright filters -- would be a disaster for free speech and would violate the UN's Declaration on Human Rights, and in particular Article 19 which says:
Everyone has the right to freedom of opinion and expression; this right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers.
As Kaye's report notes, the upload filters of Article 13 of the Copyright Directive would almost certainly violate this principle.
Article 13 of the proposed Directive appears likely to incentivize content-sharing providers to restrict at the point of upload user-generated content that is perfectly legitimate and lawful. Although the latest proposed versions of Article 13 do
not explicitly refer to upload filters and other content recognition technologies, it couches the obligation to prevent the availability of copyright protected works in vague terms, such as demonstrating best efforts and taking effective and
proportionate measures. Article 13(5) indicates that the assessment of effectiveness and proportionality will take into account factors such as the volume and type of works and the cost and availability of measures, but these still leave
considerable leeway for interpretation.
The significant legal uncertainty such language creates does not only raise concern that it is inconsistent with the Article 19(3) requirement that restrictions on freedom of expression should be provided by law. Such uncertainty would also raise
pressure on content sharing providers to err on the side of caution and implement intrusive content recognition technologies that monitor and filter user-generated content at the point of upload. I am concerned that the restriction of
user-generated content before its publication subjects users to restrictions on freedom of expression without prior judicial review of the legality, necessity and proportionality of such restrictions. Exacerbating these concerns is the reality
that content filtering technologies are not equipped to perform context-sensitive interpretations of the valid scope of limitations and exceptions to copyright, such as fair comment or reporting, teaching, criticism, satire and parody.
Kaye further notes that copyright is not the kind of thing that an algorithm can readily determine, and the fact-specific and context-specific nature of copyright requires much more than just throwing algorithms at the problem -- especially when a
website may face legal liability for getting it wrong.
The designation of such mechanisms as the main avenue to address users' complaints effectively delegates content blocking decisions under copyright law to extrajudicial mechanisms, potentially in violation of minimum due process guarantees under
international human rights law. The blocking of content -- particularly in the context of fair use and other fact-sensitive exceptions to copyright -- may raise complex legal questions that require adjudication by an independent and impartial
judicial authority. Even in exceptional circumstances where expedited action is required, notice-and-notice regimes and expedited judicial process are available as less invasive means for protecting the aims of copyright law.
In the event that content blocking decisions are deemed invalid and reversed, the complaint and redress mechanism established by private entities effectively assumes the role of providing access to remedies for violations of human rights law. I
am concerned that such delegation would violate the State's obligation to provide access to an effective remedy for violations of rights specified under the Covenant. Given that most of the content sharing providers covered under Article 13 are
profit-motivated and act primarily in the interests of their shareholders, they lack the qualities of independence and impartiality required to adjudicate and administer remedies for human rights violations. Since they also have no incentive to
designate the blocking as being on the basis of the proposed Directive or other relevant law, they may opt for the legally safer route of claiming that the upload was a terms of service violation -- this outcome may deprive users of even the
remedy envisioned under Article 13(7). Finally, I wish to emphasize that unblocking, the most common remedy available for invalid content restrictions, may often fail to address financial and other harms associated with the blocking of content.
He goes on to point out that while large platforms may be able to deal with all of this, smaller ones are going to be in serious trouble:
I am concerned that the proposed Directive will impose undue restrictions on nonprofits and small private intermediaries. The definition of an online content sharing provider under Article 2(5) is based on ambiguous and highly subjective criteria
such as the volume of copyright protected works it handles, and it does not provide a clear exemption for nonprofits. Since nonprofits and small content sharing providers may not have the financial resources to establish licensing agreements with
media companies and other right holders, they may be subject to onerous and legally ambiguous obligations to monitor and restrict the availability of copyright protected works on their platforms. Although Article 13(5)'s criteria for effective
and proportionate measures take into account the size of the provider concerned and the types of services it offers, it is unclear how these factors will be assessed, further compounding the legal uncertainty that nonprofits and small providers
face. It would also prevent a diversity of nonprofit and small content-sharing providers from potentially reaching a larger size, and result in strengthening the monopoly of the currently established providers, which could be an impediment to the
right to science and culture as framed in Article 15 of the ICESCR.
Several video downloading and MP3 conversion tools have thrown in the towel this week, disabling all functionality following
legal pressure. Pickvideo.net states that it received a cease and desist order, while Video-download.co and EasyLoad.co reference the lawsuit against YouTube-MP3 as the reason for their decision.
The music industry sees stream ripping as one of the largest piracy threats. The RIAA, IFPI, and BPI showed that they're serious about the issue when they filed legal action against YouTube-MP3, the largest stream ripping site at the time.
This case eventually resulted in a settlement where the site, once good for over a million daily visitors, agreed to shut down voluntarily last year.
YouTube-MP3's demise was a clear victory for the music groups, which swiftly identified their next targets, putting them under pressure, both in public and behind the scenes.
This week this appears to have taken its toll on several stream ripping sites, which allowed users to download videos from YouTube and other platforms, with the option to convert files to MP3s. The targets include Pickvideo.net, Video-download.co
and EasyLoad.co, which all inform their users that they've thrown in the towel.
With several million visits per month, Pickvideo is the largest of the three. According to the site, it took the drastic measure following a cease-and-desist letter.
The UK Supreme Court has today ruled that trade mark holders are not able to compel ISPs to bear the cost of implementing orders to block websites
selling counterfeit goods.
[Photo: Jim, Alex and Myles at the Supreme Court]
Open Rights Group acted as an intervener in this case. We argued that Internet service providers (ISPs), as innocent parties, should not bear the costs of website blocking, and that this was a long-standing
principle of English law.
Jim Killock, Executive Director of Open Rights Group said:
This case is important because if ISPs paid the costs of blocking websites, the result would be an increasing number of blocks for relatively trivial reasons and the costs would be passed to customers.
While rights holders may want websites blocked, it needs to be economically rational to ask for this.
Solicitor in the case David Allen Green said:
I am delighted to have acted, through my firm Preiskel, successfully for the Open Rights Group in their intervention.
We intervened to say that those enforcing private rights on the internet should bear the costs of doing so, not others. This morning, the UK Supreme Court held unanimously that the rights holders should bear the costs.
The main party to the case was BT, which opposed being forced to pay the costs incurred in blocking websites. Now rights-holders must reimburse ISPs for the costs of blocking rights-infringing material.
Supreme Court judge Lord Sumption, one of five on the panel, ruled:
There is no legal basis for requiring a party to shoulder the burden of remedying an injustice if he has no legal responsibility for the infringement and is not a volunteer but is acting under the compulsion of an order of the court.
It follows that in principle the rights-holders should indemnify the ISPs against their compliance costs. Section 97A of the Copyright, Designs and Patents Act 1988 allows rights-holders to go to court and get a blocking order -- the question in
the current case is who stumps up for the costs of complying with that order?
Of course this now raises the question of who should pay for the mass porn website blocking that will be needed when the BBFC porn censorship regime starts its work.