There were supposed to be two big announcements this week regarding censorship. The first announcement was supposed to come from Valve regarding the recent ban-spree of anime games on Steam. That announcement has been postponed until further notice.
The second big announcement was to come from Sony's U.K. division, addressing the current PS4 censorship policies handed down by Sony Interactive Entertainment America's office. Well, that announcement has been delayed because the relevant meeting has been postponed.
On Tuesday the House of Lords approved the BBFC's scheme to implement internet porn censorship in the UK. Approval will now be sought from the House of Commons.
The debate in the Lords mentioned a few issues in passing, but peers seemed to avoid talking about some of the horrors of the scheme.
The Digital Economy Act, which defines the law behind the scheme, imposes no legal requirement on age verification providers to restrict how they use porn viewers' data. Peers noted that the data is protected under GDPR rules, but those rules still let companies do more or less whatever they like with data, provided they ask for consent. Of course, consent is effectively mandatory to sign up for age verification, and some of the biggest internet companies in the world have set the precedent that wide-ranging use of data can be justified by claiming it will be used, say, to improve the customer experience.
Even if the lords didn't push very hard, people at the DCMS or BBFC have been considering this deficiency and have come up with the idea that data use should be voluntarily restricted under a kitemark scheme. Age verification schemes will have their privacy protections audited by some independent group, and if they pass they can display a gold star. Porn viewers are then expected to trust age verification schemes with a gold star. Unfortunately, it sounds a little like the sort of process that decided that cladding was safe for high-rise blocks of flats.
The lords were much more concerned about the age verification requirements for social media and search engines, notably Twitter and Google Images. Clearly, schemes for checking whether users are 13 will be technically very different from an 18-only check. The Government explained that these wider issues will be addressed in a new censorship white paper to be published in 2019.
The lords were also a bit perturbed that the definition of banned material wasn't wide enough for their own preferences. Under the current scheme the BBFC will be expected to totally ban any websites with child porn or extreme porn. The lords wondered why this wasn't extended to cartoon porn and beyond-R18 porn, presumably thinking of fisting, golden showers and the like. In reality, however, if the definition of bannable porn were extended, then every major porn website in the world would have to be banned by the BBFC. And anyway, the government is changing its censorship rules such that fisting and golden showers are, or will soon be, allowable at R18.
The debate revealed that the banks and payment providers have already agreed to ban payments to websites banned by the BBFC. The government also confirmed its intention to get the scheme up and running by April. That said, it would seem a little unfair for websites' 3-month implementation period to be set running before their age verification options are accredited with their gold stars. Otherwise some websites would waste time and money implementing schemes that may later be denied accreditation.
Next, a motion to approve draft legislation over the UK's age-verification regulations will be debated in the House of Commons. Stephen Winyard, AVSecure's chief marketing officer, told XBIZ:
We are particularly pleased that the prime minister is set to approve the draft guidance for the age-verification law on Monday. From this, the Department for Digital, Culture, Media and Sport will issue the effective start date and that will be the start of the 3-month implementation period.
But maybe the prime minister has a few more urgent issues on her mind at the moment.
4,000,000 Europeans have signed a petition opposing Article 13 of the new Copyright in the Single Market Directive. They oppose it for two main reasons: because it will inevitably lead to the creation of algorithmic copyright filters that
only US Big Tech companies can afford (making the field less competitive and thus harder for working artists to negotiate better deals in) and because these filters will censor enormous quantities of legitimate material, thanks to inevitable
algorithmic errors and abuse.
On Monday, a delegation from the signatories officially presented the Trilogue negotiators with the names of 4,000,000+ Europeans who oppose Article 13. These 4,000,000 are in esteemed company: Article 13 is also opposed by the father of the
Internet, Vint Cerf, and the creator of the Web, Tim Berners-Lee and more than 70 of the Internet's top technical experts, not to mention Europe's largest sports leagues and film studios. Burgeoning movements opposing the measure have sprung up
in Italy and Poland.
But no matter how much damage the proposed EU law will do to European businesses and creators, it does not go far enough for the large corporates. This leaves a tricky negotiation for the EU power brokers of the EU Commission and EU Council of Ministers. The law is widely opposed by European people, but now the US corporates are whingeing that they don't like a few concessions made to get the bill through the European Parliament. They want the full horror of censorship machines
resurrected. The EFF reports on a delay to proceedings:
This week EU negotiators in Strasbourg struggled to craft the final language of the Copyright in the Single Digital Market Directive, in their last possible meeting for 2018. They failed, thanks in large part to the Directive's two most
controversial clauses: Article 11, which requires paid licenses for linking to news stories while including more than a word or two; and Article 13, which will lead to the creation of error-prone copyright censorship algorithms that will block
users from posting anything that has been identified as a copyrighted work -- even if that posting is lawful. This means that the Directive will not be completed, as was expected, under Austria's presidency of the European Union. The negotiations
between the European Parliament, representatives of the member states, and the European Commission (called "trilogues") will continue under the Romanian presidency, in late January.
The controversy over Article 13 and Article 11 has not diminished since millions of Europeans voiced their opposition to the proposals and their effect on the Internet earlier this year. Even supporters and notional beneficiaries have now grown
critical of the proposals. An open letter signed by major rightsholder groups, including movie companies and sports leagues,
asks the EU to exempt their products from Article 13 altogether, and suggests it should only apply to the music industry's works. Meanwhile, the music industry wrote their own open letter, saying that the latest proposed text on Article 13
won't solve their problems. These rightsholders join the world's most eminent computer scientists, including the inventors of the Internet and the Web,
who denounced the whole approach and warned of the irreparable harm it will do to free expression and the hope of a fair, open Internet.
The collective opposition is unsurprising. Months of closed-door negotiations and corporate lobbying
have actually made the proposals worse: even less coherent, and more riddled with irreconcilable contradictions. The way that the system apportions liability (with stiff penalties for allowing a user to post something that infringes
copyright, and no consequences for censoring legitimate materials)
leads inexorably to filters. And as recent experiences
with Tumblr's attempt to filter adult material have shown, algorithms are simply not very good at figuring out when a user has broken a rule, let alone a rule as technical and fact-intensive as copyright.
What is worse, the Directive will only reinforce the power of US Big Tech companies by inhibiting the emergence of European competitors. That's because only the biggest tech companies have the millions of euros it will cost to deploy the filters
Article 13 requires. Proponents of Article 13 stress that the dominance of platforms like Google and Facebook leaves them with insufficient bargaining leverage and say this leads to a systematic undervaluing of their products. But Article 13 will
actually reduce that leverage even further by preventing the emergence of alternative platforms.
Compromises suggested by the negotiators to limit the damage are proving unlikely to help. Prior to the Trilogue, Article 13 was imposed on all online platforms save those businesses with less than 10 million euros in annual turnover. Some
parties, realising that this will limit the EU tech sector, have suggested changing the figure, but doubling that figure to 20 million doesn't help. If you own a European tech company that you hope will compete with Google someday, you will have
to do something Google never had to face: the day you make the leap from 20 million euros in annual turnover to 20,000,001 euros, you will have to find hundreds of millions of euros to implement an Article 13 copyright filter.
Others have proposed a "notice-and-staydown" system to reassure rightsholders that they will not have to invest their own resources in maintaining the copyright filters. But creating this model for copyright complaints extinguishes any
hope of moderating the harms Article 13 will do to small European companies. Earlier drafts of Article 13 spoke of case-by-case assessments for mid-sized platforms, which would exempt them from implementing filters if they were judged to be
engaged in good faith attempts to limit infringement. But notice-and-staydown (the idea that once a platform has been notified of a user's copyright violation, it must prevent every other user from making such a violation, ever) necessarily
requires filters. Others in the negotiation are now arguing that even microenterprises should have to bear the burden, and are pressing for even these small and mid-sized business exemptions to be deleted from the text.
With European internet users, small business people, legal experts, technical experts, human rights and free speech experts all opposed to these proposals, we had hoped that they would be struck from the Trilogue's final draft. Now, they are
blocking the passage of other important copyright reforms. Even Article 13 and 11's original advocates are realising how much they depend on a working Internet, and a remuneration system that might have a chance of working.
Still, the lobbying will continue over the holiday break. Some of the world's biggest entertainment and Internet companies will be throwing their weight around the EU to find a "compromise" that will keep no-one happy, and will exclude
the needs and rights of individual Internet users, and European innovators.
Read more about the Directive, and contact your MEPs and national governments at Save Your Internet.
Earlier this month, the Chinese government moved forward with its new Online Ethics Review Committee, a government censor that exists solely to review online games and determine whether or not they are acceptable according to Chinese standards. The creation of the new censor was in response to government concerns that Chinese citizens were playing online games that weren't being directly regulated by China. The censor was tasked with considering twenty online games in its first round of reviews.
As a result, two major video games, Fortnite and PlayerUnknown's Battlegrounds, have been banned in China altogether. Both games were big fixtures of the online multiplayer communities in China, but may not be permitted to
return since they have not been designated as needing corrective action but rather appear to have been banned outright.
According to online reports , those reviews have found both Fortnite and PUBG to be in direct violation of the new online ethical rules. According to reports, these two titles were both banned for their gratuitous depictions of blood and gore.
Other titles, like League of Legends , Overwatch , and Diablo were noted as needing corrective action but are not actually banned as of yet.
Dmitry Kuznetsov, better-known by his stage name Husky, was a minor star on Russia's flourishing hip hop scene until police arrested him last month for staging an impromptu concert from the roof of a parked car.
A brief brush with the law has boosted the rapper's profile and turned his I'll Sing My Music single into a national battle cry against arts censorship.
Husky is by no means the only artist feeling the heat as Russia cracks down on alternative music. But the public outcry about his case has highlighted the risks the Kremlin faces as it moves to exert control over Russian youth's favourite form of music.
Husky had leapt on to the roof of a car to perform in the southern city of Krasnodar on November 21st after a local club, citing concern about Russian anti-extremist laws, abruptly cancelled a gig he had planned. The following day he was
sentenced to 12 days in police detention on twin charges of petty hooliganism and refusing to take a drink and drugs test.
In a surprise development, Husky was released a few hours before his next performance, having served less than half of his sentence. Opposition leader Alexei Navalny, who attended the Moscow concert with his family, said the authorities had let the rapper out not just
because they are scared but because they know they are in the wrong.
Lord Ashton of Hyde to move that the draft Regulations laid before the House on 10 October be approved. Special attention drawn to the instrument by the Joint Committee on Statutory Instruments, 38th Report, 4th Report from the Secondary
Legislation Scrutiny Committee (Sub-Committee B)
Guidance on Age-verification Arrangements
Lord Ashton of Hyde to move that the draft Guidance laid before the House on 25 October be approved. Special attention drawn to the instrument by the Joint Committee on Statutory Instruments, 39th Report, 4th Report from the Secondary Legislation
Scrutiny Committee (Sub-Committee B)
Lord Stevenson of Balmacara to move that this House regrets that the draft Online Pornography (Commercial Basis) Regulations 2018 and the draft Guidance on Age-verification Arrangements do not bring into force section 19 of the Digital Economy
Act 2017, which would have given the regulator powers to impose a financial penalty on persons who have not complied with their instructions to require that they have in place an age verification system which is fit for purpose and effectively
managed so as to ensure that commercial pornographic material online will not normally be accessible by persons under the age of 18.
Guidance on Ancillary Service Providers
Lord Ashton of Hyde to move that the draft Guidance laid before the House on 25 October be approved. Special attention drawn to the instrument by the Joint Committee on Statutory Instruments, 39th Report, 4th Report from the Secondary Legislation
Scrutiny Committee (Sub-Committee B)
The DCMS and BBFC age verification scheme has been widely panned, as fundamentally the law provides no requirement to actually protect people's identity data, which can be coupled with their sexual preferences and sexuality. The scheme only offers voluntary suggestions that age verification services and websites should protect their users' privacy. But one only has to look to Google, Facebook and Cambridge Analytica to see how worthless mere advice is. GDPR is often quoted, but that only requires that user consent is obtained. One will simply have to tick the 'improved user experience' consent box to watch the porn, and thereafter the companies can do what the fuck they like with the data.
The UK's intelligence agencies are to significantly increase their use of large-scale data hacking after claiming that more targeted operations are being rendered obsolete by technology.
The move will see an expansion in what is known as the bulk equipment interference (EI) regime -- the process by which GCHQ can target entire communication networks overseas in a bid to identify individuals who pose a threat to national security.
[Note that the idea this is somehow only targeted at foreigners is misleading. Five countries cooperate so that they can mutually target each others users to work round limits on snooping on one's own country].
A letter from the security minister, Ben Wallace, to the head of the intelligence and security committee, Dominic Grieve, quietly filed in the House of Commons library last week, states:
Following a review of current operational and technical realities, GCHQ have ... determined that it will be necessary to conduct a higher proportion of ongoing overseas focused operational activity using the bulk EI regime than was originally envisaged.
As the controversy over the EU's Article 13 censorship machines continues, Twitter appears to be the communications weapon of choice for parties on both sides.
As one of the main opponents of Article 13 and in particular its requirement for upload filtering, Julia Reda MEP has been a frequent target for proponents. Accused of being a YouTube/Google shill (despite speaking out loudly against YouTube's
maneuvering), Reda has endured a lot of criticism. As an MEP, she's probably used to that.
However, a recent response to one of her tweets from music giant IFPI opens up a somewhat ironic can of worms that deserves a closer look.
Since kids will be affected by Article 13, largely due to their obsession with YouTube, Reda recently suggested that they should lobby their parents to read up on the legislation. In tandem with pop-ups from YouTube advising users to oppose
Article 13, that seemed to irritate some supporters of the proposed law.
As the response from IFPI's official account shows, Reda's advice went down like a lead balloon with the music group, a key defender of Article 13. The IFPI tweeted:
Shame on you: Do you really approve of minors being manipulated by big tech companies to deliver their commercial agenda?
It's pretty ironic that IFPI has called out Reda for informing kids about copyright law to further the aims of big tech companies. As we all know, the music and movie industries have been happily doing exactly the same to further their own aims
for at least ten years and probably more.
Digging through the TF archives, there are way too many articles detailing how big media has directly targeted kids with their message over the last decade. Back in 2009, for example, a former anti-piracy consultant for EMI lectured kids as young
as five on anti-piracy issues.
Ofcom has appointed Stephen Nuttall to its Content Board.
Ofcom's Content Board is a committee of the main Ofcom Board. It has advisory responsibility for a wide range of content issues, including the regulation of television, radio and video-on-demand quality and standards.
Stephen Nuttall has more than thirty years' experience working as a senior executive and a consultant in the sports, media and digital industries. Stephen's previous positions include Senior Director at YouTube EMEA and Group Commercial Director
Video game developer Ubisoft recently made the censorship news by deciding that their Rainbow Six Siege, a tactical shooter, will be unified into a single worldwide version. This meant that the game would have to be
heavily censored to the standards of the lowest common denominator, China.
This caused a little bit of a stink amongst fans, so Ubisoft announced a U-turn on the China-friendly censorship policy.
Now it seems that Ubisoft is still keen on a heavily censored version that can be played in China, but this time the company is changing tack on its reasoning. Ubisoft now reports that parents and consumer groups are complaining that the game has too many references to sex, many violent scenes, and allusions to gambling. It adds that parents say these issues are troubling in a game intended for teenagers.
'After listening to criticism', the company decided to make some changes to the game. It will remove some of the sexual references and violent content, and make the loot boxes easier to come by. Ubisoft is hoping the changes will be enough to
satisfy the critics and make the customers happy as well. (Especially those in China).
Online games distributor Steam recently relaxed its previous prohibitions on adult gaming, but it still draws the line at games it considers illegal.
Now, according to some developers, Valve, the company behind Steam, is going after games that feature themes of child exploitation, which it seems to define, at least in part, as games with sex scenes or nudity where the characters are in high school.
Over the past few weeks, the company has removed the store pages of several visual novels, including cross-dressing yaoi romance Cross Love, catholic school adult visual novel Hello Goodbye, a story about the love between siblings Imolicious, and cat girl game MaoMao Discovery Team. The developers of these games all claim to have received similar emails stating that their games could not be released on Steam.
There are common threads that link the games in question: 1) Cross Love, Hello Goodbye, and Imolicious feature school settings, and 2) all four of the aforementioned games contain adult elements and centre around anime-styled characters who
appear young -- in some cases uncomfortably so.
A bill that would force ISPs in Israel to censor pornographic sites by default has been amended after heavy criticism from lawmakers over privacy concerns.
An earlier version of the bill was unanimously approved by the Ministerial Committee for Legislation in late October, but now a new version of the legislation has been passed, sponsored by Likud MK Miki Zohar and Jewish Home MK Shuli Moalem-Refaeli. The differences seem subtle: whether customers opt in or opt out of network-level website blocking.
Customers will have to confirm their preferences for website blocking every 3 months but may change their settings at any time.
The bill will incentivize internet companies to actively market existing website blocking software to families. ISPs will receive NIS 0.50 ($0.13) for every subscriber who opts to block adult sites.
In a refreshing divergence from UK internet censorship, ISPs will be legally required to delete all data related to their users' surfing habits, to prevent creating de facto -- and easily leaked -- black lists of pornography consumers.
In comparison, internet companies are allowed to use or sell UK customer data for any purpose they so desire as long as customers tick a consent box with some woolly text about improving the customer's experience.
Israeli Prime Minister Benjamin Netanyahu moved to halt the adoption of a new law aimed at curbing pornographic content on the Internet and possibly keeping tabs on people who watch porn. Netanyahu inquired:
We don't want our children to be exposed to harmful content, but my concern is about inserting regulation into a space in which there is no government regulation. Who will decide which content is permitted and which is forbidden?
Facebook has added a new category of censorship: sexual solicitation. It added the update on 15th October, but no one really noticed until recently.
The company has quietly updated its content-moderation policies to censor implicit requests for sex. The expanded policy specifically bans sexual slang; hints of sexual roles, positions or fetish scenarios; and erotic art when mentioned with a sex act. Vague but suggestive statements such as "looking for a good time tonight" when soliciting sex are also no longer allowed.
The new policy reads:
15. Sexual Solicitation Policy
Do not post:
Content that attempts to coordinate or recruit for adult sexual activities, including but not limited to:
- Filmed sexual activities
- Pornographic activities, strip club shows, live sex performances, erotic dances
- Sexual, erotic, or tantric massages

Content that engages in explicit sexual solicitation by, including but not limited to the following, offering or asking for:
- Sex or sexual partners
- Sex chat or conversations
- Nude images

Content that engages in implicit sexual solicitation, which can be identified by offering or asking to engage in a sexual act and/or acts identified by other suggestive elements such as any of the following:
- Vague suggestive statements, such as "looking for a good time tonight"
- Sexualized slang
- Using sexual hints such as mentioning sexual roles, sex positions, fetish scenarios, sexual preference/sexual partner preference, state of arousal, act of sexual intercourse or activity (sexual penetration or self-pleasuring), commonly sexualized areas of the body such as the breasts, groin, or buttocks, state of hygiene of genitalia or buttocks
- Content (hand drawn, digital, or real-world art) that may depict explicit sexual activity or suggestively posed person(s)

Content that offers or asks for other adult activities such as:
- Commercial pornography
- Partners who share fetish or sexual interests

Sexually explicit language that adds details and goes beyond mere naming or mentioning of:
- A state of sexual arousal (wetness or erection)
- An act of sexual intercourse (sexual penetration, self-pleasuring or exercising fetish scenarios)
Comment: Facebook's Sexual Solicitation Policy is a Honeypot for Trolls
Facebook just quietly adopted a policy that could push thousands of innocent people off of the platform. The new "sexual solicitation" rules forbid pornography and other explicit sexual content (which was already functionally banned under a different statute), but they don't stop there: they also ban "implicit sexual solicitation", including the use of sexual slang, the solicitation of nude images, discussion of "sexual partner preference," and even expressing interest in sex. That's not an exaggeration: the new policy bars "vague suggestive statements, such as 'looking for a good time tonight.'" It wouldn't be a stretch to think that asking "Netflix and chill?" could run afoul of this policy.
The new rules come with a baffling justification, seemingly blurring the line between sexual exploitation and plain old doing it:
[P]eople use Facebook to discuss and draw attention to sexual violence and exploitation. We recognize the importance of and want to allow for this discussion. We draw the line, however, when content facilitates, encourages or coordinates sexual
encounters between adults.
In other words, discussion of sexual exploitation is allowed, but discussion of consensual, adult sex is taboo. That's a classic censorship model: speech about sexuality being permitted only when sex is presented as dangerous and shameful. It's
especially concerning since healthy, non-obscene discussion about sex--even about enjoying or wanting to have sex--has been a component of online communities for as long as the Internet has existed, and has for almost as long been the target of
governmental censorship efforts.
Until now, Facebook has been a particularly important place for groups who aren't well represented in mass media to discuss their sexual identities and practices. At very least, users should get the final say about whether they want to see such
speech in their timelines.
Overly Restrictive Rules Attract Trolls
Is Facebook now a sex-free zone? Should we be afraid of meeting potential partners on the platform or even disclosing our sexual orientations?
Maybe not. For many users, life on Facebook might continue as it always has. But therein lies the problem: the new rules put a substantial portion of Facebook users in danger of violation. Fundamentally, that's not how platform moderation
policies should work--with such broadly sweeping rules, online trolls can take advantage of reporting mechanisms to punish groups they don't like.
Combined with opaque and one-sided flagging and reporting systems, overly restrictive rules can incentivize abuse from bullies and other bad actors. It's not just individual trolls either: state actors have systematically abused Facebook's flagging process to censor political enemies. With these new rules, organizing that type of attack just became a lot easier. A few reports can drag a user into Facebook's labyrinthine enforcement regime, which can result in having a group page deactivated or even being banned from Facebook entirely. This process gives the user no meaningful opportunity to appeal a bad decision.
Given the rules' focus on sexual interests and activities, it's easy to imagine who would be the easiest targets: sex workers (including those who work lawfully), members of the LGBTQ community, and others who congregate online to discuss issues
relating to sex. What makes the policy so dangerous to those communities is that it forbids the very things they gather online to discuss.
Even before the recent changes at Facebook and Tumblr, we'd seen trolls exploit similar policies to target the LGBTQ community and censor sexual health resources. Entire harassment campaigns have been organized to use payment processors' reporting systems to cut off sex workers' income. When online platforms adopt moderation policies and reporting processes, it's essential that they consider how those policies and systems might be weaponized against marginalized groups.
A recent Verge article quotes a Facebook representative as saying that people sharing sensitive information in private Facebook groups will be safe, since Facebook relies on reports from users. If there are no tattle-tales in your group, the reasoning goes, then you can speak freely without fear of punishment. But that assurance rings rather hollow: in today's world of online bullying and brigading, there's no question of if your private group will be infiltrated by trolls; it's a question of when.
Did SESTA/FOSTA Inspire Facebook's Policy Change?
The rule change comes a few months after Congress passed the Stop Enabling Sex Traffickers Act and the Allow States and Victims to Fight Online Sex Trafficking Act (SESTA/FOSTA), and it's hard not to wonder if the policy is the direct result of
the new Internet censorship laws.
SESTA/FOSTA opened online platforms to new criminal and civil liability at the state and federal levels for their users' activities. While ostensibly targeted at online sex trafficking, SESTA/FOSTA also made it a crime for a platform to
"promote or facilitate the prostitution of another person." The law effectively blurred the distinction between adult, consensual sex work and sex trafficking. The bill's supporters argued that forcing platforms to clamp down on all
sex work was the only way to curb trafficking -- never mind the growing chorus of trafficking experts arguing the very opposite.
As SESTA/FOSTA was debated in Congress, we repeatedly pointed out that online platforms would have little choice but to over-censor : the fear of liability would force them not just to stop at sex trafficking or even sex work, but to take much
more restrictive approaches to sex and sexuality in general, even in the absence of any commercial transaction. In EFF's ongoing legal challenge to SESTA/FOSTA, we argue that the law unconstitutionally silences lawful speech online.
While we don't know if the Facebook policy change came as a response to SESTA/FOSTA, it is a perfect example of what we feared would happen: platforms would decide that the only way to avoid liability is to ban a vast range of discussions of sex.
Wrongheaded as it is, the new rule should come as no surprise. After all, Facebook endorsed SESTA/FOSTA. Regardless of whether one caused the other or not, both reflect the same vision of how the Internet should work -- a place where certain
topics simply cannot be discussed. Like SESTA/FOSTA, Facebook's rule change might have been made to fight online sexual exploitation. But like SESTA/FOSTA, it will do nothing but push innocent people offline.
Facebook has been fined €10m (£8.9m) by Italian authorities for misleading users over its data practices.
The two fines issued by Italy's competition watchdog are some of the largest levied against the social media company for data misuse.
The Italian regulator found that Facebook had breached the country's consumer code by:
Misleading users in the sign-up process about the extent to which the data they provide would be used for commercial purposes.
Emphasising only the free nature of the service, without informing users of the "profitable ends that underlie the provision of the social network", and so encouraging them to make a decision of a commercial nature that they would not
have taken if they were in full possession of the facts.
Forcing an "aggressive practice" on registered users by transmitting their data from Facebook to third parties, and vice versa, for commercial purposes.
The company was specifically criticised for the default setting of the Facebook Platform services which, in the words of the regulator, prepares the transmission of user data to individual websites/apps without express consent from users.
Although users can disable the platform, the regulator found that its opt-out nature did not provide a fully free choice.
The authority has also directed Facebook to publish an apology to users on its website and on its app.
Cuba has passed a new law that gives government inspectors the power to close down any exhibition or performance that is considered a violation of the socialist revolutionary values of the country.
The law, known as decree 349, published in July, allows so-called 'supervisory inspectors' to censor cultural events ranging from art exhibitions to concerts, and to immediately close any of them if they see it as denigrating the values of the country. They also have the right to revoke the business licence of any restaurant or bar hosting an 'undesirable' event.
The decree applies to obscene speech, vulgarity, sexism, excessive use of force and more.
Despite the claim that the authorities are trying to reduce the degree of resentment in society, cultural representatives still call the law fascist.
Image hosting service Tumblr is banning all adult images of sex and nudity from 17th December 2018. This seems to have been sparked by the app being banned from Apple's App Store after a child porn image was detected being hosted by Tumblr. Tumblr explained the censorship process in a blog post:
Starting Dec 17, adult content will not be allowed on Tumblr, regardless of how old you are. You can read more about what kinds of content are not allowed on Tumblr in our Community Guidelines. If you spot a post that you don't think belongs on
Tumblr, period, you can report it: From the dashboard or in search results, tap or click the share menu (paper airplane) at the bottom of the post, and hit Report.
Adult content primarily includes photos, videos, or GIFs that show real-life human genitals or female-presenting nipples, and any content -- including photos, videos, GIFs and illustrations -- that depicts sex acts.
Examples of exceptions that are still permitted are exposed female-presenting nipples in connection with breastfeeding, birth or after-birth moments, and health-related situations, such as post-mastectomy or gender confirmation surgery. Written content such as erotica, nudity related to political or newsworthy speech, and nudity found in art, such as sculptures and illustrations, can also be freely posted on Tumblr.
Any images identified as adult will be set as unviewable by anyone except the poster. There will be an appeals process to contest decisions held to be incorrect.
Inevitably Tumblr algorithms are not exactly accurate when it comes to detecting sex and nudity. The Guardian noted that ballet dancers, superheroes and a picture of Christ have all fallen foul of Tumblr's new pornography ban, after the images
were flagged up as explicit content by the blogging site's artificial intelligence (AI) tools.
The actor and Tumblr user Wil Wheaton posted one example:
An image search for beautiful men kissing, which was flagged as explicit within 30 seconds of me posting it.
These images are not explicit. These pictures show two adults, engaging in consensual kissing. That's it. It isn't violent, it isn't pornographic. It's literally just two adult humans sharing a kiss.
Other users chronicled flagged posts, including historical images of (clothed) women of colour, a photoset of the actor Sebastian Stan wearing a selection of suits with no socks on, an oil painting of Christ wearing a loincloth, a still of ballet
dancers and a drawing of Wonder Woman carrying fellow superhero Harley Quinn. None of the images violate Tumblr's stated policy.
Tumblr, after years of being a space for nsfw artists to reach a community of like-minded individuals to enjoy their work, has decided to close its metaphorical doors to adult content.
Solution: Stop it. Let people post porn, it's 90% of the reason anybody is on the site in the first place. Or, if you really want a non-18+ Tumblr, start a new one with that specific goal in mind. Don't rip down what people have spent years building.
The Free Speech Coalition [representing the US adult trade] released the following statement regarding the recent announcement about censorship at Tumblr:
The social media platform Tumblr has announced that on December 17, it will effectively ban all adult content. Tumblr follows the lead of Facebook, Instagram, YouTube and other social media platforms, who over the past few years have meticulously
scrubbed their corners of the internet of adult content, sex, and sexuality, in the name of brand protection and child protection.
While some in the adult industry may cheer the end of Tumblr as a never-ending source of free content, specifically pirated content, it is concerning that of the major social media platforms, only Twitter and Reddit remain in any way tolerant of
adult workers -- and there are doubts as to how much longer that will last.
As legitimate platforms ban or censor adult content -- having initially benefited from traffic that adult content brought them -- illegitimate platforms for distribution take their place. The closure of Tumblr only means more piracy, more
dispersal of community, and more suffering for adult producers and performers.
Free Speech Coalition was founded to fight government censorship -- set raids and FBI entrapment, bank seizures and jail terms. The internet gave us freedom from much that had plagued us, particularly local ordinances and overzealous prosecutors.
But now, when corporate censors suspend your account, the only choice is to abandon the platform -- there is no opportunity for arbitration or appeal.
When companies like Google and Facebook (and subsidiaries like YouTube and Instagram) control over 70% of all web traffic, adult companies are denied a market as effectively as a state-level sex toy ban. And when sites like Tumblr and Twitter can
close an account with millions of followers without warning, the effect is the same on a business -- particularly a small, performer-run one -- as an FBI seizure.
As social media companies become more powerful, we must demand recourse, but we also must look beyond our industry and continue to build alliances -- with women, with LGBTQ groups, with sex workers and sex educators, with artists -- who
implicitly understand the devastating effect of this new form of censorship.
These communities have seen the devastation wreaked when platforms use purges of adult content as a sledgehammer, broadly banning sexual health information, vibrant communities based around non-normative genders and sexualities, resources for sex
workers, and political and cultural commentary that engages with such topics.
The loss of these platforms isn't just about business, it's about the loss of vital communities and education -- and organizing. We use these platforms not only to grow our reach, but to communicate with one another, to rally, to drive awareness
of issues of sex and sexuality. They have become a central source of power. And today, we're one step closer to losing that as well.
Poland stands up to the EU to champion the livelihoods of thousands of Europeans against the disgraceful EU that wants to grant large, mostly American, companies dictatorial copyright control of the internet
In 2011, Europeans rose up over ACTA, the misleadingly named "Anti-Counterfeiting Trade Agreement," which created broad surveillance and censorship regimes for the internet. They were successful in large part thanks to the Polish
activists who thronged the streets to reject the plan, which had been hatched and exported by the US Trade Representative.
The Poles aren't having any of it:
a broad coalition of Poles from the left and the right have come together to oppose the new Directive, dubbing it "ACTA2," which should give you an idea of how they feel about the matter.
There are now enough national governments opposed to the Directive to constitute a "blocking minority" that could stop it dead. Alas, the opposition is divided on whether to reform the offending parts of the Directive, or eliminate them
outright (this division is why the Directive squeaked through the last vote, in September), and unless they can work together, the Directive still may proceed.
A massive coalition of 15,000 Polish creators, whose videos, photos and text are enjoyed by over 20,000,000 Poles, has signed an open letter supporting the idea of a strong, creator-focused copyright and rejecting the new Copyright Directive as a direct path to censoring filters that will deprive them of their livelihoods.
The coalition points out that online media is critical to the lives of everyday Poles for purposes that have nothing to do with the entertainment industry: education, the continuation of Polish culture, and connections to the global Polish diaspora.
Polish civil society and its ruling political party are united in opposing ACTA2; Polish President Andrzej Duda vowed to oppose it.
Early next month, the Polish Internet Governance Forum will host a roundtable on the question; they have invited proponents of the Directive to attend and publicly debate the issue.
The Daily Mail reports on large-scale harvesting of user data, and notes that PayPal has been passing on passport photos used for account verification to Microsoft for its facial recognition database.
Parliament's fake news inquiry has published a cache of seized Facebook documents including internal emails sent between Mark Zuckerberg and the social network's staff. The emails were obtained from the chief of a software firm that is suing the
tech giant. About 250 pages have been published, some of which are marked highly confidential.
Facebook had objected to their release.
Damian Collins MP, the chair of the parliamentary committee involved, highlighted several key issues in an introductory note. He wrote that:
Facebook allowed some companies to maintain "full access" to users' friends data even after announcing changes to its platform in 2014/2015 to limit what developers could see. "It is not clear that there was any user consent for this, nor how Facebook decided which companies should be whitelisted," Mr Collins wrote
Facebook had been aware that an update to its Android app that let it collect records of users' calls and texts would be controversial. "To mitigate any bad PR, Facebook planned to make it as hard as possible for users to know that this
was one of the underlying features," Mr Collins wrote
Facebook used data provided by the Israeli analytics firm Onavo to determine which other mobile apps were being downloaded and used by the public. It then used this knowledge to decide which apps to acquire or otherwise treat as a threat
there was evidence that Facebook's refusal to share data with some apps caused them to fail
there had been much discussion of the financial value of providing access to friends' data
In response, Facebook said that the documents had been presented in a very misleading manner and required additional context.
Mastercard and Microsoft are collaborating in an identity management system that promises to remember users' identity verification and passwords between sites and services.
Mastercard highlights four particular areas of use: financial services, commerce, government services, and digital services (eg social media, music streaming services and rideshare apps). This means the system would let users manage their data
across both websites and real-world services.
However, the inclusion of government services is an eyebrow-raising one. Microsoft and Mastercard's system could link personal information including taxes, voting status and criminal record, with consumer services like social media accounts,
online shopping history and bank accounts.
As well as the stifling level of tailored advertising you'd receive if the system knew everything you did, this sets a dangerous precedent for every byte of users' information to be stored under one roof -- perfect for an opportunistic hacker or businessman. Mastercard mentions it is working closely with players like Microsoft, suggesting that many businesses would have access to the data.
Neither Microsoft nor Mastercard has slated a release date for the system, only promising that additional details on these efforts will be shared in the coming months.
Defending equal access to the free and open internet is core to Reddit's ideals, and something that redditors have told us time and again they hold dear too, from the SOPA/PIPA battle to the fight for Net Neutrality. This is why even though
we are an American company with a user base primarily in the United States, we've nevertheless spent a lot of time this year
warning about how an overbroad EU Copyright Directive could restrict Europeans' equal access to the open Internet--and to Reddit.
Despite these warnings, it seems that EU lawmakers still don't fully appreciate the law's potential impact, especially on small and medium-sized companies like Reddit. So we're stepping things up to draw attention to the problem. Users in the EU
will notice that when they access Reddit via desktop, they are greeted by a modal informing them about the Copyright Directive and referring them to
detailed resources on proposed fixes.
The problem with the Directive lies in Articles 11 (link licensing fees) and 13 (copyright filter requirements), which set sweeping, vague requirements that create enormous liability for platforms like ours. These requirements eliminate the
previous safe harbors that allowed us the leeway to give users the benefit of the doubt when they shared content. But under the new Directive, activity that is core to Reddit, like sharing links to news articles, or the use of existing content
for creative new purposes (r/photoshopbattles, anyone?) would suddenly become questionable under the law, and it is not clear right now that there are feasible mitigating actions that we could take while preserving core site functionality. Even
worse, smaller but similar attempts in various countries in Europe in the past have shown that
such efforts have actually harmed publishers and creators .
Accordingly, we hope that today's action will drive the point home that there are grave problems with Articles 11 and 13, and that the current trilogue negotiations will choose to remove both entirely. Barring that, however, we have a number of
suggestions for ways to improve both proposals. Engine and the Copia Institute have compiled them at https://dontwreckthe.net/. We hope you will read them and consider calling your Member of European Parliament (look yours up here). We also hope that EU lawmakers will listen to those who use and understand the internet the most, and reconsider these problematic articles. Protecting rights holders need not come at the cost of silencing Europeans.
Parliamentary scrutiny committee condemns as 'defective' a DCMS Statutory Instrument excusing Twitter and Google images from age verification. Presumably this is one of the reasons for the delayed introduction of age verification.
There's a joint committee to scrutinise laws passed in parliament via Statutory Instruments. These are laws that are not generally presented to parliament for discussion, and are passed by default unless challenged.
The committee has now taken issue with a DCMS law to excuse the likes of social media and search engines from requiring age verification for any porn images that may get published on the internet. The committee reports from a session on 21st
November 2018 that the law was defective and 'makes an unexpected use of the enabling power'. Presumably this means that the DCMS has gone beyond the scope of what can be passed without full parliamentary scrutiny.
Draft S.I.: Reported for defective drafting and for unexpected use of powers Online Pornography (Commercial Basis) Regulations 2018
7.1 The Committee draws the special attention of both Houses to these draft Regulations on the grounds that they are defectively drafted and make an unexpected use of the enabling power.
7.2 Part 3 of the Digital Economy Act 2017 ("the 2017 Act") contains provisions designed to prevent persons under the age of 18 from accessing internet sites which contain pornographic material. An age-verification regulator is given a number of powers to enforce the requirements of Part 3, including the power to impose substantial fines.
7.3 Section 14(1) is the key requirement. It provides:
"A person contravenes [Part 3 of the Act] if the person makes pornographic material available on the internet to persons in the United Kingdom on a commercial basis other than in a way that secures that, at any given time, the material
is not normally accessible by persons under the age of 18".
7.4 The term "commercial basis" is not defined in the Act itself. Instead, section 14(2) confers a power on the Secretary of State to specify in regulations the circumstances in which, for the purposes of Part 3, pornographic material
is or is not to be regarded as made available on a commercial basis. These draft regulations would be made in exercise of that power. Regulation 2 provides:
"(1) Pornographic material is to be regarded as made available on the internet to persons in the United Kingdom on a commercial basis for the purposes of Part 3 of the Digital Economy Act 2017 if either paragraph (2) or (3) are met.
(2) This paragraph applies if access to that pornographic material is available only upon payment.
(3) This paragraph applies (subject to paragraph (4)) if the pornographic material is made available free of charge and the person who makes it available receives (or reasonably expects to receive) a payment, reward or other benefit in
connection with making it available on the internet.
(4) Subject to paragraph (5), paragraph (3) does not apply in a case where it is reasonable for the age-verification regulator to assume that pornographic material makes up less than one-third of the content of the material made available on
or via the internet site or other means (such as an application program) of accessing the internet by means of which the pornographic material is made available.
(5) Paragraph (4) does not apply if the internet site or other means (such as an application program) of accessing the internet (by means of which the pornographic material is made available) is marketed as an internet site or other means of
accessing the internet by means of which pornographic material is made available to persons in the United Kingdom."
7.5 The Committee finds these provisions difficult to understand, whether as a matter of simple English or as legal propositions. Paragraphs (4) and (5) are particularly obscure.
7.6 As far as the Committee can gather from the Explanatory Memorandum, the policy intention is that a person will be regarded as making pornographic material available on the internet on a commercial basis if:
(A) a charge is made for access to the material; OR
(B) the internet site is accessible free of charge, but the person expects to receive a payment or other commercial benefit, for example through advertising carried on the site.
7.7 There is, however, an exception to (B): in cases in which no access charge is made, the person will NOT be regarded as making the pornographic material available on a commercial basis if the material makes up less than one-third of the
content on the internet site--even if the person expects to receive a payment or other commercial benefit from the site. But that exception does not apply in a case where the person markets it as a pornographic site, or markets an "app"
as a means of accessing pornography on the site.
7.8 As the Committee was doubtful whether regulation 2 as drafted is effective to achieve the intended result, it asked the Department for Digital, Culture, Media and Sport a number of questions. These were designed to elicit information about
the regulation's meaning and effect.
7.9 The Committee is disappointed with the Department's memorandum in response, printed at Appendix 7: it fails to address adequately the issues raised by the Committee.
7.10 The Committee's first question asked the Department to explain why paragraph (1) of regulation 2 refers to whether either paragraph (2) or (3) "are met" rather than "applies". The Committee raised this point because
paragraphs (2) and (3) each begin with "This paragraph applies if ...". There is therefore a mismatch between paragraph (1) and the subsequent paragraphs, which could make the regulation difficult to interpret. It would be appropriate
to conclude paragraph (1) with "is met" only if paragraphs (2) and (3) began with "The condition in this paragraph is met if ...". The Department's memorandum does not explain this discrepancy. The Committee accordingly
reports regulation 2(1) for defective drafting.
7.11 The first part of the Committee's second question sought to probe the intended effect of the words in paragraph (4) of regulation 2 italicised above, and how the Department considers that effect is achieved.
7.12 While the Department's memorandum sets out the policy reasons for setting the one-third threshold, it offers little enlightenment on whether paragraph (4) is effective to achieve the policy aims. Nor does it deal properly with the second
part of the Committee's question, which sought clarification of the concept of "one-third of ... material ... on ... [a] means .... of accessing the internet ...".
7.13 The Committee is puzzled by the references in regulation 2(4) to the means of accessing the internet. Section 14(2) of the 2017 Act confers a power on the Secretary of State to specify in regulations circumstances in which pornographic
material is or is not to be regarded as made available on the internet on a commercial basis. The means by which the material is accessed (for example, via an application program on a smart phone) appears to be irrelevant to the question of
whether it is made available on the internet on a commercial basis. The Committee remains baffled by the concept of "one-third of ... material ... on [a] means ... of accessing the internet".
7.14 More generally, regulation 2(4) fails to specify how the one-third threshold is to be measured and what exactly it applies to. Will the regulator be required to measure one-third of the pictures or one-third of the words on a particular
internet site or both together? And will a single webpage on the site count towards the total if less than one-third of the page's content is pornographic--for example, a sexually explicit picture occupying 32% of the page, with the remaining 68%
made up of an article about fishing? The Committee worries that the lack of clarity in regulation 2(4) may afford the promoter of a pornographic website opportunities to circumvent Part 3 of the 2017 Act.
7.15 The Committee is particularly concerned that a promoter may make pornographic material available on one or more internet sites containing multiple pages, more than two-thirds of which are non-pornographic. For every 10 pages of pornography,
there could be 21 pages about (for example) gardening or football. Provided the sites are not actively marketed as pornographic, they would not be regarded as made available on a commercial basis. This means that Part 3 of the Act would not
apply, and the promoter would be free to make profits through advertising carried on the sites, while taking no steps at all to ensure that they were inaccessible to persons under 18.
7.16 The Committee anticipates that the shortcomings described above are likely to cause significant difficulty in the application and interpretation of regulation 2(4). The Committee also doubts whether Parliament contemplated, when enacting
Part 3 of the 2017 Act, that the power conferred by section 14(2) would be exercised in the way provided for in regulation 2(4). The Committee therefore reports regulation 2(4) for defective drafting and on the ground that it appears to make
an unexpected use of the enabling power.
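The committee's reading of regulation 2 -- including the loophole arithmetic in its example of 10 pages of pornography per 21 pages of gardening or football -- can be sketched as a simple decision procedure. This is only one illustrative interpretation of the draft regulation, not a legal reading; the function name and parameters are invented for the sketch, and, as the committee notes, the regulation itself never says how the one-third threshold is actually to be measured:

```python
def made_available_on_commercial_basis(porn_pages: int, total_pages: int,
                                       charges_for_access: bool,
                                       expects_payment: bool,
                                       marketed_as_porn: bool) -> bool:
    """One possible reading of draft regulation 2 of the Online
    Pornography (Commercial Basis) Regulations 2018."""
    if charges_for_access:
        return True          # paragraph (2): access available only upon payment
    if not expects_payment:
        return False         # paragraph (3) requires an expected payment or benefit
    if marketed_as_porn:
        return True          # paragraph (5) disapplies the one-third exception
    # Paragraph (4): the exception applies when pornographic material
    # makes up less than one-third of the content.
    return porn_pages * 3 >= total_pages

# The committee's loophole example: 10 pages of pornography alongside
# 21 pages about gardening or football -- 10/31 is just under one-third,
# so the site escapes Part 3 despite earning advertising revenue.
print(made_available_on_commercial_basis(10, 31, False, True, False))  # False
print(made_available_on_commercial_basis(11, 31, False, True, False))  # True
```

Even this crude sketch shows why the committee worries: a promoter only has to pad a site with enough non-pornographic filler to slip under the threshold, provided the site is not marketed as pornographic.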
Sony president Atsushi Morita has made the first official comments about his company's new-found enthusiasm for video game censorship. According to Japanese website Ebitsu.net (no official translation is available), he purportedly told attendees at a Japan Studio event that expression restrictions have been adjusted to global standards. He apparently concluded:
Considering the balance between freedom of expression and safety to children, I think that it is a difficult problem.
One video game series that's been affected by Sony's censorship is Senran Kagura. The producer of the latest game, Kenichiro Takaki, commented that the next title in the series is going to take time as it deals with these new regulations. He said:
We have to make games in a way that they aren't misunderstood. Certain things are harder than they've ever been before. Given that, I think [the game] is going to take some time.
Kingdom Hearts 3 is an upcoming video game that features Winnie the Pooh.
Now China's president Xi Jinping has taken offence at his gait and pot belly being likened to Pooh bear so Chinese censors have to spend hours ensuring that images of the bear are airbrushed out of Chinese life.
A Chinese website sharing images of the upcoming game revealed the game's interesting form of censorship. The iconic Winnie the Pooh is censored out with a gigantic white light.
Chinese internet companies have started keeping detailed records of their users' personal information and online activity. The new rules from China's internet censor went into effect Friday.
The new requirements apply to any company that provides online services which can influence public opinion or mobilize the public to engage in specific activities, according to a notice posted on the Cyber Administration of China's website.
Citing the need to safeguard national security and social order, the Chinese internet censor said companies must be able to verify users' identities and keep records of key information such as call logs, chat logs, times of activity and network addresses.
Officials will carry out inspections of companies' operations to ensure compliance. But the Cyber Administration didn't make clear under what circumstances the companies might be required to hand over logs to authorities.