Vodafone ISP starts blocking pirate websites without waiting for a court order

22nd December 2018. See article from torrentfreak.com

ISP Vodafone has begun blocking a pair of illicit streaming portals in unusual circumstances. Burning Series and Serial Stream were rendered inaccessible on Tuesday, but not as the result of a specific blocking injunction. The ISP says that following a decision by the Federal Court of Justice in the summer, it felt compelled to block the sites after a request from a copyright holder.

The fact that ISPs around the world are blocking pirate sites to prevent copyright infringement is nothing new. Aside from voluntary arrangements, such as the one currently playing out in Portugal, ISPs tend to wait for courts to hand down an injunction before blocking a site. In Germany, however, a new situation has reared its head. On Tuesday, subscribers to Vodafone discovered that they could no longer access streaming portals Burning Series (BS.to) and Serial Stream (S.to). Rather than accessing the thousands of TV shows usually on offer, they were instead met by a blocking message presented by their ISP. Both sites currently have messages on their main pages, explaining that Vodafone has chosen to block their platforms.
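
The article does not say which blocking technique Vodafone used; ISP-level blocks of this kind are commonly applied at the DNS resolver, which answers queries for blocked domains with the address of a block page instead of the real site. The sketch below is purely illustrative of that general idea -- the domain list reflects the sites named above, but the block-page address, the resolver logic and all function names are assumptions, not a description of Vodafone's actual system.

```python
# Illustrative sketch only: a toy DNS-style resolver that consults a blocklist
# before answering. The block-page address and the lookup logic are invented
# for the example; this is not how Vodafone's system is documented to work.

BLOCKED_DOMAINS = {"bs.to", "s.to"}   # the two portals named in the article
BLOCK_PAGE_IP = "192.0.2.1"           # example address from the RFC 5737 range


def resolve(domain: str, real_lookup) -> str:
    """Return the block page address for blocked domains, else the real answer."""
    name = domain.lower().rstrip(".")
    # Match the domain itself or any subdomain of a blocked entry.
    if any(name == d or name.endswith("." + d) for d in BLOCKED_DOMAINS):
        return BLOCK_PAGE_IP
    return real_lookup(name)


if __name__ == "__main__":
    fake_upstream = {"example.com": "93.184.216.34"}.get
    print(resolve("bs.to", fake_upstream))        # -> 192.0.2.1 (block page)
    print(resolve("example.com", fake_upstream))  # -> 93.184.216.34
```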

Copyright lobbyists whinge about Julia Reda's campaign methods

10th December 2018. See Creative Commons article from torrentfreak.com

As the controversy over the EU's Article 13 censorship machines continues, Twitter appears to be the communications weapon of choice for parties on both sides. As one of the main opponents of Article 13, and in particular its requirement for upload filtering, Julia Reda MEP has been a frequent target for proponents. Accused of being a YouTube/Google shill (despite speaking out loudly against YouTube's maneuvering), Reda has endured a lot of criticism. As an MEP, she's probably used to that. However, a recent response to one of her tweets from music giant IFPI opens up a somewhat ironic can of worms that deserves a closer look.

Since kids will be affected by Article 13, largely due to their obsession with YouTube, Reda recently suggested that they should lobby their parents to read up on the legislation. In tandem with pop-ups from YouTube advising users to oppose Article 13, that seemed to irritate some supporters of the proposed law. As the response from IFPI's official account shows, Reda's advice went down like a lead balloon with the music group, a key defender of Article 13. The IFPI tweeted: Shame on you: Do you really approve of minors being manipulated by big tech companies to deliver their commercial agenda?

It's pretty ironic that IFPI has called out Reda for informing kids about copyright law to further the aims of big tech companies. As we all know, the music and movie industries have been happily doing exactly the same to further their own aims for at least ten years and probably more. Digging through the TF archives, there are way too many articles detailing how big media has directly targeted kids with its message over the last decade. Back in 2009, for example, a former anti-piracy consultant for EMI lectured kids as young as five on anti-piracy issues.

Yes, the EU's New #CopyrightDirective is All About Filters. By Cory Doctorow

30th November 2018. See CC article from eff.org

When the EU started planning its new Copyright Directive (the "Copyright in the Digital Single Market Directive"), a group of powerful entertainment industry lobbyists pushed a terrible idea: a mandate that all online platforms would have
to create crowdsourced databases of "copyrighted materials" and then block users from posting anything that matched the contents of those databases. At the time, we, along with academics and technologists, explained why this would undermine the Internet, even as it would prove unworkable. The filters would be incredibly expensive to create, would erroneously block whole libraries' worth of legitimate materials, would let libraries' worth more of infringing materials slip through, and would not be capable of sorting out "fair dealing" uses of copyrighted works from infringing ones. The Commission nonetheless included it in its original draft.

Two years later, the European Parliament went back and forth on whether to keep the loosely-described filters, with German MEP Axel Voss finally squeezing out a narrow victory in his own committee and then in an emergency vote of the whole Parliament. Now, after a lot of politicking and
lobbying, Article 13 is potentially only a few weeks away from becoming officially an EU directive, controlling the internet access of more than 500,000,000 Europeans. The proponents of Article 13 have a problem, though: filters
don't work, they cost a lot, they underblock, they overblock, they are ripe for abuse (basically, all the objections the Commission's experts raised the first time around). So to keep Article 13 alive, they've spun, distorted and obfuscated its
intention, and now they can be found in the halls of power, proclaiming to the politicians who'll get the final vote that "Article 13 does not mean copyright filters." But it does. Here's a list
of Frequently Obfuscated Questions and our answers. We think that after you've read them, you'll agree: Article 13 is about filters, can only be about filters, and will result in filters.
Article 13 is about filtering, not "just" liability

Today, most of the world (including the EU) handles copyright infringement with some sort of takedown process. If you provide the public
with a place to publish their thoughts, photos, videos, songs, code, and other copyrightable works, you don't have to review everything they post (for example, no lawyer has to watch 300 hours of video every minute at YouTube before it goes live).
Instead, you allow rightsholders to notify you when they believe their copyrights have been violated and then you are expected to speedily remove the infringement. If you don't, you might still not be liable for your users' infringement, but you lose
access to the quick and easy 'safe harbor' provided by law in the event that you are named as part of any copyright lawsuit (and since the average internet company has a lot more money than the average internet user, chances are you will be named
in that suit). What you're not expected to be is the copyright police. And in fact, the EU has a specific Europe-wide law that stops member states from forcing Internet services to play this role: the same rule that defines the limits
of their liability, the E-Commerce Directive, in the very next article, prohibits a "general obligation to monitor." That's to stop countries from saying "you should know that your users are going to break some law, some time, so
you should actively be checking on them all the time -- and if you don't, you're an accomplice to their crimes." The original version of Article 13 tried to break this deal by re-writing that second part. Instead of a prohibition on monitoring, it required
it, in the form of a mandatory filter. When the European Parliament rebelled against that language, it was because millions of Europeans had warned them of the dangers of copyright filters. To bypass this outrage, Axel Voss
proposed an amendment to the Article that removed the explicit mention of filters but rewrote the other part of the E-Commerce Directive. By claiming this "removed the filters", he got his amendment passed -- including by gaining votes from MEPs who thought they were striking down Article 13.

Voss's rewrite says that sharing sites are liable unless they take steps to stop infringing content before it goes online. So yes, this is about liability, but it's also
about filtering. What happens if you strip liability protections from the Internet? It means that services are now legally responsible for everything on their site. Consider a photo-sharing site where millions of photos are posted every hour. There
are not enough lawyers -- let alone copyright lawyers -- let alone copyright lawyers who specialise in photography -- alive today to review all those photos before they are permitted to appear online. Add to that all the
specialists who'd have to review every tweet, every video, every Facebook post, every blog post, every game mod and livestream. It takes a fraction of a second to take a photograph, but it might take hours or even days to ensure that everything the photo
captures is either in the public domain, properly licensed, or fair dealing. Every photo represents as little as an instant's work, but making it comply with Article 13 represents as much as several weeks' work. There is no way that Article 13's purpose
can be satisfied with human labour. It's strictly true that Axel Voss's version of Article 13 doesn't mandate filters -- but it does create a liability system that can only be satisfied with filters.
But there's more: Voss's stripping of liability protections has Big Tech like YouTube scared, because if the filters aren't perfect, they will be potentially liable for any infringement that gets past them -- and given their billions,
that means anyone and everyone might want to get a piece of them. So now, YouTube has started lobbying for the original text, copyright filters and all. That text is still on the table, because the trilogue uses both Voss' text (liability to get filters)
and member states' proposal (all filters, all the time) as the basis for the negotiation.
Most online platforms cannot have lawyers review all the content they make available

The only online services that can have lawyers review their content are services for delivering relatively small
libraries of entertainment content, not the general-purpose speech platforms that make the Internet unique. The Internet isn't primarily used for entertainment (though if you're in the entertainment industry, it might seem that way): it is a digital
nervous system that stitches together the whole world of 21st Century human endeavor. As the UK Champion for Digital Inclusion discovered when she commissioned a study of the impact of Internet access on personal life, people use the Internet to do
everything, and people with Internet access experience positive changes across their lives: in education, political and civic engagement, health, connections with family, employment, etc. The job we ask, say, iTunes and Netflix
to do is a much smaller job than we ask the online companies to do. Users of online platforms do sometimes post and seek out entertainment experiences on them, but as a subset of doing everything else: falling in love, getting and keeping a job,
attaining an education, treating chronic illnesses, staying in touch with their families, and more. iTunes and Netflix can pay lawyers to check all the entertainment products they make available because that's a fraction of a slice of a crumb of all the
material that passes through the online platforms. That system would collapse the instant you tried to scale it up to manage all the things that the world's Internet users say to each other in public.
It's impractical for users to indemnify the platforms

Some Article 13 proponents say that online companies could substitute click-through agreements for filters, getting users to pay them back for
any damages the platform has to pay out in lawsuits. They're wrong. Here's why. Imagine that every time you sent a tweet, you had to click a box that said, "I promise that this doesn't infringe copyright and I will pay
Twitter back if they get sued for this." First of all, this assumes a legal regime that lets ordinary Internet users take on serious liability in a click-through agreement, which would be very dangerous given that people do not have enough hours in
the day to read all of the supposed 'agreements' we are subjected to by our technology. Some of us might take these agreements seriously and double-triple check everything we posted to Twitter but millions more wouldn't, and they
would generate billions of tweets, and every one of those tweets would represent a potential lawsuit. For Twitter to survive those lawsuits, it would have to ensure that it knew the true identity of every Twitter user (and how to
reach that person) so that it could sue them to recover the copyright damages they'd agreed to pay. Twitter would then have to sue those users to get its money back. Assuming that the user had enough money to pay for Twitter's legal fees and the fines it
had already paid, Twitter might be made whole... eventually. But for this to work, Twitter would have to hire every contract lawyer alive today to chase its users and collect from them. This is no more sustainable than hiring every copyright lawyer alive
today to check every tweet before it is published.
Small tech companies would be harmed even more than large ones

It's true that the Directive exempts "Microenterprises and small-sized enterprises" from Article 13, but that doesn't mean
that they're safe. The instant a company crosses the threshold from "small" to "not-small" (which is still a lot smaller than Google or Facebook), it has to implement Article 13's filters. That's a multi-hundred-million-dollar tax on
growth, all but ensuring that the small Made-in-the-EU competitors to American Big Tech firms will never grow to challenge them. Plus, those exceptions are controversial in the Trilogue, and may disappear after yet more rightsholder lobbying.
Existing filter technologies are a disaster for speech and innovation

ContentID is YouTube's proprietary copyright filter. It works by allowing a small, trusted cadre of rightsholders to claim works
as their own copyright, and limits users' ability to post those works according to the rightsholders' wishes, which are more restrictive than what the law's user protections would allow. ContentID then compares the soundtrack (but not the video
component) of any user uploads to the database to see whether it is a match. Everyone hates ContentID. Universal and the other big rightsholders complain loudly and frequently that ContentID is too easy for infringers to bypass.
YouTube users point out that ContentID blocks all kinds of legit material, including silence, birdsong, and music uploaded by the actual artist for distribution on YouTube. In many cases, this isn't a 'mistake,' in the sense that Google has agreed to
let the big rightsholders block or monetize videos that do not infringe any copyright, but instead make a fair use of copyrighted material. ContentID does a small job, poorly: filtering the soundtracks of videos to check for
matches with a database populated by a small, trusted group. No one (who understands technology) seriously believes that it will scale up to blocking everything that anyone claims as a copyrighted work (without having to show any proof of that claim or
even identify themselves!), including videos, stills, text, and more.
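
ContentID itself is proprietary and far more sophisticated, but the basic shape of any Article 13-style filter is the same: claimants register reference works, every upload is fingerprinted and checked against that database, and matches are blocked before publication. The sketch below illustrates only that matching step, under stated assumptions: a plain SHA-256 hash stands in for a real perceptual fingerprint, and ClaimsDatabase, register, match and moderate_upload are invented names, not any platform's actual API.

```python
# Minimal sketch of an Article 13-style upload filter. A real system uses
# perceptual fingerprints that survive re-encoding; a plain SHA-256 over the
# raw bytes is used here only to keep the illustration short.
import hashlib


def fingerprint(data: bytes) -> str:
    # Stand-in for a perceptual audio/video fingerprint.
    return hashlib.sha256(data).hexdigest()


class ClaimsDatabase:
    """Reference works registered by (unverified) claimants."""

    def __init__(self):
        self._claims = {}  # fingerprint -> claimant name

    def register(self, claimant: str, work: bytes) -> None:
        # Note: nothing here checks that the claimant actually owns the work --
        # one of the abuse problems the article describes.
        self._claims[fingerprint(work)] = claimant

    def match(self, upload: bytes):
        return self._claims.get(fingerprint(upload))


def moderate_upload(db: ClaimsDatabase, upload: bytes) -> str:
    claimant = db.match(upload)
    if claimant is not None:
        return f"blocked (matches work claimed by {claimant})"
    return "published"


if __name__ == "__main__":
    db = ClaimsDatabase()
    db.register("SomeLabel", b"four seconds of birdsong")
    print(moderate_upload(db, b"four seconds of birdsong"))  # blocked
    print(moderate_upload(db, b"my own home video"))         # published
```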
Online platforms aren't in the entertainment business

The online companies most impacted by Article 13 are platforms for general-purpose communications in every realm of human endeavor, and if we try
to regulate them like a cable operator or a music store, that's what they will become.
The Directive does not adequately protect fair dealing and due process

Some drafts of the Directive do say that EU nations should have "effective and expeditious complaints and redress
mechanisms that are available to users" for "unjustified removals of their content. Any complaint filed under such mechanisms shall be processed without undue delay and be subject to human review. Right holders shall reasonably justify their
decisions to avoid arbitrary dismissal of complaints." What's more, "Member States shall also ensure that users have access to an independent body for the resolution of disputes as well as to a court or another relevant
judicial authority to assert the use of an exception or limitation to copyright rules." On their face, these look like very good news! But again, it's hard (impossible) to see how these could work at Internet scale. One of
EFF's clients had to spend ten years in court when a major record label insisted -- after human review, albeit a cursory one -- that the few seconds' worth of tinny background music in a video of her toddler dancing in her kitchen infringed copyright. But
with Article 13's filters, there are no humans in the loop: the filters will result in millions of takedowns, and each one of these will have to receive an "expeditious" review. Once again, we're back to hiring all the lawyers now alive -- or
possibly, all the lawyers that have ever lived and ever will live -- to check the judgments of an unaccountable black box descended from a system that thinks that birdsong and silence are copyright infringements. It's pretty clear
the Directive's authors are not thinking this stuff through. For example, some proposals include privacy rules: "the cooperation shall not lead to any identification of individual users nor the processing of their personal data." Which is
great: but how are you supposed to prove that you created the copyrighted work you just posted without disclosing your identity? This could not be more nonsensical if it said, "All tables should weigh at least five tonnes and also be easy to lift
with one hand."
The speech of ordinary Internet users matters

Eventually, arguments about Article 13 end up here: "Article 13 means filters, sure. Yeah, I guess the checks and balances won't scale. OK, I guess filters will catch a lot of legit material. But so what? Why should I have to tolerate copyright infringement just because you can't do the impossible? Why are the world's cat videos more important than my creative labour?"

One thing about this argument: at least it's honest. Article 13 pits the free speech rights of every Internet user against a speculative theory of income maximisation for creators and the entertainment companies they ally themselves with: that filters will create revenue for them.
It's a pretty speculative bet. If we really want Google and the rest to send more money to creators, we should create a Directive that fixes a higher price through collective licensing. But let's take a
moment here and reflect on what "cat videos" really stand in for here. The personal conversations of 500 million Europeans and 2 billion global Internet users matter: they are the social, familial, political and educational discourse of
a planet and a species. They have worth, and thankfully it's not a matter of choosing between the entertainment industry and all of that -- both can peacefully co-exist, but it's not a good look for arts groups to advocate that everyone else shut up and
passively consume entertainment product as a way of maximising their profits.

India ups the ante with 12,500 websites blocked to protect a local blockbuster movie

30th November 2018. See CC article from torrentfreak.com

The Madras High Court has handed down one of the most aggressive site-blocking orders granted anywhere in the world. Following an application by Lyca Productions, more than 12,500 sites will be preemptively blocked by 37 Indian ISPs to prevent 2.0 - India's most expensive film ever - being leaked following its premiere. What we're looking at here is a preemptive blocking order on a truly huge scale against sites that have not yet made the movie available and may never do so. In the
meantime, however, a valuable lesson about site-blocking is already upon us. Within hours of the blocks being handed down, a copy of 2.0 appeared online and is now available via various torrent and streaming sites labeled as a 1080p PreDVDRip. Forums
reviewed by TF suggest users aren't having a problem obtaining it. With a reported budget of US$76 million, 2.0 is the most expensive Indian film. The sci-fi flick is attracting huge interest and at one stage it was reported that Arnold
Schwarzenegger had been approached to play a leading role in the flagship production.

Australian parliament passes new law with wide-ranging blocking of copyright infringing websites

28th November 2018. See article from torrentfreak.com

The Australian Parliament has passed controversial amendments to copyright law. There will now be a tightened site-blocking regime that will tackle mirrors and proxies more effectively, restrict the appearance of blocked sites in Google search, and
introduce the possibility of blocking dual-use cyberlocker type sites. Section 115a of Australia's Copyright Act allows copyright holders to apply for injunctions to force ISPs to prevent subscribers from accessing pirate sites. While rightsholders
say that it's been effective to a point, they have lobbied hard for improvements. The resulting Copyright Amendment (Online Infringement) Bill 2018 contained proposals to close the loopholes. After receiving endorsement from the Senate earlier
this week, the legislation was today approved by Parliament. Once the legislation comes into force, proxy and mirror sites that appear after an injunction against a pirate site has been granted can be blocked by ISPs without the parties having to
return to court. Assurances have been given, however, that the court will retain some oversight. Search engines, such as Google and Bing, will also be affected. Accused of providing backdoor access to sites that have already been blocked, search
providers will now have to remove or demote links to overseas-based infringing sites, along with their proxies and mirrors. The Australian Government will review the effectiveness of the new amendments in two years' time.

Julia Reda outlines amendments to censorship machines and link tax as the upcoming internet censorship law gets discussed by the real bosses of the EU

15th November 2018. See article from juliareda.eu

The closed-door trilogue efforts to finalise the EU Copyright Directive continue. The Presidency of the Council, currently held by Austria, has now circulated among the EU member state governments a new proposal for a compromise between the differing
drafts currently on the table for the controversial Articles 11 and 13. Under this latest proposal, both upload filters and the link tax would be here to stay -- with some changes for the better, and others for the worse.
Upload filters / Censorship machines

Let's recall: In its final position, the European Parliament had tried its utmost to avoid specifically mentioning upload filters, in order to avoid the massive public
criticism of that measure. The text they ended up with, however, was even worse: It would make online platforms inescapably liable for any and all copyright infringement by their users, no matter what action they take. Not even the strictest upload
filter in the world could possibly hope to catch 100% of unlicensed content. This is what prompted YouTube's latest lobbying efforts in favour of upload filters and against the EP's proposal of inescapable liability. Many have
mistaken this as lobbying against Article 13 as a whole -- it is not. In Monday's Financial Times, YouTube spelled out that they would be quite happy with a law that forces everyone else to build (or, presumably, license from them) what they already have
in place: Upload filters like Content ID. In this latest draft, the Council Presidency sides with YouTube, going back to rather explicitly prescribing upload filters. The Council proposes two alternative options on how to phrase
that requirement, but they match in effect: Platforms are liable for all copyright infringements committed by their users, EXCEPT if they cooperate with rightholders by implementing effective and proportionate steps to prevent works they've been informed about from ever going online. Determining which steps those are must take into account suitable and effective technologies.

Under this text, wherever upload filters are possible, they must be implemented: All your uploads will require prior approval by error-prone copyright bots.
On the good side, the Council Presidency seems open to adopting the Parliament's exception for platforms run by small and micro businesses. It also takes on board the EP's better-worded exception for open source code sharing
platforms like GitHub. On the bad side, Council rejects Parliament's efforts for a stronger complaint mechanism requiring reviews by humans and an independent conflict resolution body. Instead it takes on board the EP's insistence that licenses taken out by a platform don't even necessarily have to cover uses of these works by the users of that platform. So, for example, even if YouTube takes out a license to show a movie trailer, that license could still prevent you as an individual YouTuber from using that trailer in your own uploads.

Article 11: Link tax

On the link tax, the Council is mostly sticking to its position: It wants the requirement to license even short
snippets of news articles to last for one year after an article's publication, rather than five, as the Parliament proposed. In a positive development, the Council Presidency adopts the EP's clarification that at least the facts
included in news articles as such should not be protected. So a journalist would be allowed to report on what they read in another news article, in their own words. Council fails to clearly exclude hyperlinks -- even those that
aren't accompanied by snippets from the article. It's not uncommon for the URLs of news articles themselves to include the article's headline. While the Council wants to exclude insubstantial parts of articles from requiring a license, it's not certain
that headlines count as insubstantial. (The Council's clause allowing acts of hyperlinking when they do not constitute communication to the public would not apply to such cases, since reproducing the headline would in fact constitute such a communication
to the public.) The Council continues to want the right to only apply to EU-based news sources -- which could in effect mean fewer links and listings in search engines, social networks and aggregators for European sites, putting
them at a global disadvantage. However, it also proposes spelling out that news sites may give out free licenses if they so choose -- contrary to the Parliament, which stated that listing an article in a search engine should not
be considered sufficient payment for reproducing snippets from it.

Satanic temple sues Netflix with a copyright claim over a statue of Baphomet

8th November 2018. Thanks to Nick. See article from eu.usatoday.com

The Satanic Temple in Salem, Massachusetts is suing Netflix and producers Warner Brothers over a statue of the goat-headed deity Baphomet that appears in the TV series Chilling Adventures of Sabrina. The temple is claiming that Netflix and Warners are violating the copyright and trademark of the temple's own Baphomet statue, which it built several years ago. Historically, the androgynous deity has been depicted with a goat's head on a female body, but The Satanic Temple created this statue with Baphomet having a male chest, an idea that was picked up by Netflix.

The Temple is seeking damages of at least $50 million for copyright infringement, trademark violation and injury to business reputation. In the Sabrina storyline, the use of the statue as the central focal point of the school associated with evil, cannibalism and possibly murder is injurious to TST's business, the Temple says in its suit.

An Italian change of heart against the EU's disgraceful policy to put corporate interests above free speech means that there could yet be a challenge to the upcoming copyright law

24th October 2018. See article from boingboing.net. CC by Cory Doctorow

When the EU voted for mandatory copyright censorship of the internet in September, Italy had a different government; the ensuing Italian elections empowered a new government, which opposes the filters. Once states totalling 35% of the EU's population oppose the new Copyright Directive, they can form a "blocking minority" and kill it or cause it to be substantially refactored. With the Italians opposing the Directive because of its draconian new internet rules (rules introduced at the last moment, which have been hugely controversial), the reputed opponents of the Directive have now crossed the 35% threshold, thanks to Germany, Finland, the Netherlands, Slovenia, Belgium and Hungary.

Unfortunately, the opponents of Article 11 (the "link tax") and Article 13 (the copyright filters) are not united in their opposition -- they have different ideas about what they would like to see done with these provisions. If they pull together, that could be the end of these provisions. If you're a European, this form will let you contact your MEP quickly and painlessly and let them know how you feel about the proposals.

That's where matters stand now: a growing set of countries who think copyright filters and link taxes go too far, but no agreement yet on rejecting or fixing them. The trilogues are not a process designed to resolve such large rifts when both the EU states and the parliament are so deeply divided. What happens now depends entirely on how the member states decide to go forward: and how hard they push for real reform of Articles 13 and 11. The balance in that discussion has changed, because Italy changed its position. Italy changed its position because Italians spoke up. If you reach out to your country's ministry in charge of copyright, and tell them that these Articles are a concern to you, they'll start paying attention too. And we'll have a chance to stop this terrible directive from becoming terrible law across Europe.

Australian Government Looks To Creep Site Censorship Into Search Censorship

21st October 2018. See article from techdirt.com


EU Internet Censorship Will Censor the Whole World's Internet

10th October 2018. See article from eff.org. CC by Cory Doctorow

As the EU advances the new Copyright Directive towards becoming law in its 28 member-states, it's important to realise that
the EU's plan will end up censoring the Internet for everyone, not just Europeans.

A quick refresher: Under Article 13 of the new Copyright Directive, anyone who operates a (sufficiently large) platform where people can post works that might be copyrighted (like text, pictures, videos, code, games, audio, etc.) will have to crowdsource a database of "copyrighted works" that users aren't allowed to post, and block anything that seems to match one of the database
entries. These blacklist databases will be open to all comers (after all, anyone can create a copyrighted work): that means that billions of people around the world will be able to submit anything to the blacklists, without
having to prove that they hold the copyright to their submissions (or, for that matter, that their submissions are copyrighted). The Directive does not specify any punishment for making false claims to a copyright, and a platform that decided to block
someone for making repeated fake claims would run the risk of being liable to the abuser if a user posts a work to which the abuser does own the rights. The major targets of this censorship plan are the social media platforms, and it's the "social" that should give us all pause. That's because the currency of social media is social interaction between users. I post something, you reply, a third person chimes in, I reply
again, and so on. Now, let's take a hypothetical Twitter discussion between three users: Alice (an American), Bob (a Bulgarian) and Carol (a Canadian). Alice posts a picture of a political march: thousands
of protesters and counterprotesters, waving signs. As is common around the world, these signs include copyrighted images, whose use is permitted under US "fair use" rules that permit parody. Because Twitter enables users to communicate significant amounts of user-generated content, they'll fall within the ambit of Article 13. Bob lives in Bulgaria, an EU member-state whose copyright law does not permit parody. He might want to reply to Alice with a quote from the Bulgarian dissident Georgi Markov, whose works were translated into English in the late 1970s and are still in copyright. Carol, a Canadian who met Bob and Alice through their shared love of Doctor Who, decides to post a witty meme from "The Mark of the Rani," a 1985
episode in which Colin Baker travels back to witness the Luddite protests of the 19th Century. Alice, Bob and Carol are all expressing themselves through use of copyrighted cultural works, in ways that might not be lawful in the
EU's most speech-restrictive copyright jurisdictions. But because (under today's system) the platform typically is only required to respond to copyright complaints when a rightsholder objects to the use, everyone can see everyone else's posts and
carry on a discussion using tools and modes that have become the norm in all our modern, digital discourse. But once Article 13 is in effect, Twitter faces an impossible conundrum. The Article 13 filter will be tripped by Alice's
lulzy protest signs, by Bob's political quotes, and by Carol's Doctor Who meme, but suppose that Twitter is only required to block Bob from seeing these infringing materials. Should Twitter hide Alice and Carol's messages from
Bob? If Bob's quote is censored in Bulgaria, should Twitter go ahead and show it to Alice and Carol (but hide it from Bob, who posted it)? What about when Bob travels outside of the EU and looks back on his timeline? Or when Alice goes to visit Bob in
Bulgaria for a Doctor Who convention and tries to call up the thread? Bear in mind that there's no way to be certain where a user is visiting from, either. The dangerous but simple option is to subject all Twitter messages
to European copyright censorship, a disaster for online speech. And it's not just Twitter, of course: any platform with EU users will have to solve this problem. Google, Facebook, Linkedin, Instagram, Tiktok, Snapchat, Flickr,
Tumblr -- every network will have to contend with this. With Article 13, the EU would create a system where copyright complainants get a huge stick to beat the internet with, where people who abuse this power face no penalties,
and where platforms that err on the side of free speech will get that stick right in the face. As the EU's censorship plan
works its way through the next steps on the way to becoming binding across the EU, the whole world has a stake -- but only a
handful of appointed negotiators get a say. If you are a European, the rest of the world would be very grateful indeed if you would take a moment to contact
your MEP and urge them to protect us all in the new Copyright Directive. (Image: The World Flag, CC-BY-SA)
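
To make the Alice/Bob/Carol conundrum described above concrete: if a platform tried to apply copyright rules per jurisdiction rather than globally, every post would need a visibility decision for every viewer's guessed location. The sketch below only illustrates the shape of that problem; the POSTS and EXCEPTIONS_BY_COUNTRY tables, the exception labels and the visible_to function are all invented for the example and are not real copyright rules or any platform's logic. In practice a viewer's location cannot be determined reliably, which is what pushes platforms toward the "block it everywhere" option the article warns about.

```python
# Illustration only: per-jurisdiction visibility checks for the Alice/Bob/Carol
# thread. The "rules" here are invented for the example, not real copyright law.

# Which copyright exception each post relies on (hypothetical labels).
POSTS = {
    "alice_protest_photo": "parody",     # lawful under US fair use
    "bob_markov_quote": "quotation",
    "carol_doctor_who_meme": "parody",
}

# Exceptions each jurisdiction is assumed to recognise (deliberately oversimplified).
EXCEPTIONS_BY_COUNTRY = {
    "US": {"parody", "quotation"},
    "CA": {"parody", "quotation"},
    "BG": {"quotation"},                 # the article notes Bulgaria has no parody exception
}


def visible_to(post: str, viewer_country: str) -> bool:
    """Would this post be shown to a viewer in this (guessed) country?"""
    needed = POSTS[post]
    allowed = EXCEPTIONS_BY_COUNTRY.get(viewer_country, set())
    return needed in allowed


if __name__ == "__main__":
    # The same thread looks different depending on where the filter thinks you
    # are, so the conversation no longer works as a conversation.
    for country in ("US", "BG", "CA"):
        view = [p for p in POSTS if visible_to(p, country)]
        print(country, "sees:", view)
```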

Canadian TV music group calls for a levy on broadband use over 15 GBytes per month to compensate them for declining income

7th October 2018. See article from torrentfreak.com

In Canada, there have been ongoing discussions and proposals about new levies and fees to compensate creators for supposed missed revenue. There have been calls to levy a tax on mobile devices such as iPhones, for example. This week the Screen Composers
Guild of Canada took things up a notch, calling for a copyright levy on all broadband data use above 15 gigabytes per month. A proposal from the Screen Composers Guild of Canada (SCGC), put forward during last week's Government hearings, suggests simply adding a levy on Internet use above 15 gigabytes per month. The music composers argue that this is warranted because composers miss out on public performance royalties. One of the reasons for this is that online streaming services are not paying as much as terrestrial broadcasters.

The composers SCGC represents are not the big music stars. They are the people who write music for TV shows and other broadcasts. Increasingly these are also shown on streaming services where the
compensation is, apparently, much lower. SCGC writes: With regard to YouTube, which is owned by the advertising company Alphabet-Google, minuscule revenue distribution is being reported by our members. Royalties from
the large streaming services, like Amazon and Netflix, are 50 to 95% lower when compared to those from terrestrial broadcasters. Statistics like this indicate that our veteran members will soon have to seek employment elsewhere
and young screen-composers will have little hope of sustaining a livelihood, the guild adds, sounding the alarm bell.
SCGC's solution to this problem is to make every Canadian pay an extra fee when they use over 15 gigabytes of data per month. This money would then be used to compensate composers and fix the so-called value gap. As a result, all Internet users who go over the cap will have to pay more, even those who don't watch any of the programs where the music is used. However, SCGC doesn't see the problem and believes that 15 gigabytes is enough. People who want to avoid paying can still use email and share photos, they argue, while those who go over the cap are likely streaming videos for which creators are not properly compensated. SCGC notes:
An ISP subscription levy that would provide a minimum or provide a basic 15 gigabytes of data per Canadian household a month that would be unlevied. Lots of room for households to be able to do Internet transactions,
business, share photos, download a few things, emails, no problem. [W]hen you're downloading and consuming over 15 gigabytes of data a month, you're likely streaming Spotify. You're likely streaming YouTube. You're likely
streaming Netflix. So we think because the FANG companies will not give us access to the numbers that they have, we have to apply a broad-based levy. They're forcing us to.
The last comment is telling. The composers guild believes
that a levy is the only option because Netflix, YouTube, and others are not paying their fair share. That sounds like a licensing or rights issue between these services and the authors. Dragging millions of Canadians into this dispute seems questionable,
especially when many people have absolutely nothing to do with it.
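
The guild's submission, as reported, names the 15 gigabyte exemption but not a per-gigabyte rate, so the numbers below are purely hypothetical. The sketch only shows the arithmetic of a levy that exempts the first 15 GB of monthly household use and charges for everything above it.

```python
# Hypothetical arithmetic for the proposed levy: the first 15 GB per household
# per month would be exempt, everything above it levied. The rate is invented
# for illustration -- the SCGC proposal does not specify one.

EXEMPT_GB = 15.0
RATE_PER_GB = 0.25   # hypothetical rate, in Canadian dollars


def monthly_levy(usage_gb: float) -> float:
    return max(0.0, usage_gb - EXEMPT_GB) * RATE_PER_GB


if __name__ == "__main__":
    for usage in (10, 15, 100, 400):   # from a light user to a heavy streaming household
        print(f"{usage:>4} GB -> ${monthly_levy(usage):.2f}")
```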

NAFTA Replacement Extends Canada's Copyright Term to Life +70 years and agrees policies on ISPs and copyright enforcement

2nd October 2018. See article from torrentfreak.com