After More Than a Decade of Litigation, the Dancing Baby Has Done His Part to Strengthen Fair Use for Everyone
28th June 2018
See article from eff.org
Litigation can always take twists and turns, but when EFF filed a lawsuit against Universal Music Group in 2007 on behalf of Stephanie Lenz, few would have anticipated that it would take more than ten years for the case to be finally resolved. But
today, at last, it is. Along the way, Lenz v. Universal contributed to strengthening
fair use law, bringing nationwide attention to the issues of copyright and fair use in new digital movie-making and sharing technologies. It all started when Lenz posted a YouTube
video of her then-toddler-aged son dancing while Prince's song Let's Go Crazy played in the background, and Universal used copyright claims to get the link
disabled. We brought the case hoping to get some clarity from the courts on a simple but important issue: can a rightsholder use the Digital Millennium Copyright Act
to take down an obvious fair use, without consequence? Congress designed the DMCA to give rightsholders, service providers, and users relatively precise
rules of the road for policing online copyright infringement. The center of the scheme is the notice and takedown process. In exchange for substantial protection from liability for the actions of their users, service providers must promptly take offline
content on their platforms that has been identified as infringing, and follow several other prescribed steps. Copyright owners, for their part, are given an expedited, extra-judicial procedure for obtaining redress against alleged infringement, paired
with explicit statutory guidance regarding the process for doing so, and provisions designed to deter and ameliorate abuse of that process. Without Section 512, the risk of crippling liability for the acts of users would have
prevented the emergence of most of the social media outlets we use today. Instead, the Internet has become the most revolutionary platform for the creation and dissemination of speech that the world has ever known. But Congress
also knew that Section 512's powerful incentives could also result in lawful material being censored from the Internet, without prior judicial scrutiny--much less advance notice to the person who posted the material--or an opportunity to contest the
removal. To inhibit abuse, Congress made sure that the DMCA included a series of checks and balances, including Section 512(f), which gives users the ability to hold rightsholders accountable if they send a DMCA notice in bad faith.
In this case, Universal Music Group claimed to have a good faith belief that Ms. Lenz's video of her child dancing to a short segment of barely-audible music infringed copyright. Yet the undisputed facts showed Universal never
considered whether Ms. Lenz's use was lawful under the fair use doctrine. If it had done so, it could not reasonably have concluded her use was infringing. On behalf of Stephanie Lenz, EFF argued that this was a misrepresentation in violation of Section
512(f). In response, Universal argued that rightsholders have no obligation to consider fair use at all. The U.S. Court of Appeals for the Ninth Circuit
rejected that argument, correctly holding that the DMCA requires a rightsholder to consider whether the uses she targets in a DMCA notice are
actually lawful under the fair use doctrine. However, the court also held that a rightsholder's determination on that question passes muster as long as she subjectively believes it to be true. This leads to a virtually incoherent result: a rightsholder
must consider fair use, but has no incentive to actually learn what such a consideration should entail. After all, if she doesn't know what the fair use factors are, she can't be held liable for not applying them thoughtfully. We
were disappointed in that part of the ruling, but it came with a big silver lining: the court also held that fair use is not simply a narrow defense to copyright infringement but an affirmative public right. For decades, rightsholders and scholars had debated the
issue, with many preferring to construe fair use as narrowly as possible. Thanks to the Lenz decision, courts will be more likely to think of fair use, correctly, as a crucial vehicle for achieving the real purpose of copyright law: to promote the
public interest in creativity and innovation. And rightsholders are on notice: they must at least consider fair use before sending a takedown notice. Lenz and Universal filed
petitions requesting that the Supreme Court review the Ninth Circuit's ruling. The Supreme Court denied both
petitions. This meant that the case returned to the district court for trial on the question of whether Universal's takedown was a misrepresentation under the Ninth Circuit's subjective standard. Rather than go to trial, the parties have agreed to a
settlement. Lenz v. Universal helped make some great law on fair use and also played a role in leading to better takedown processes at Universal. EFF congratulates Stephanie Lenz for fighting the good fight, and we thank
our co-counsel at Keker, Van Nest & Peters LLP and Kwun Bhansali Lazarus LLP for being our partners through
this long journey.
European Parliament committee passes vote to hand censorship of the internet to US corporate giants
20th June 2018
See article from bit-tech.net
The European Parliament's Committee on Legal Affairs (JURI) has officially approved Articles 11 and 13 of a Digital Single Market (DSM) copyright proposal, mandating censorship machines and a link tax. Articles 11 and 13 of the Directive of the
European Parliament and of the Council on Copyright in the Digital Single Market have been the subject of considerable campaigning from pro-copyleft groups including the Open Rights Group and Electronic Frontier Foundation of late. Article 11, as
per the final version of the proposal, discusses the implementation of a link tax - the requirement that any site citing third-party materials do so in a way that adheres to the exemptions and restrictions of a total of 28 separate copyright laws or pays
for a licence to use and link to the material; Article 13, meanwhile, requires any site which allows users to post text, sound, program code, still or moving images, or any other work which can be copyrighted to automatically scan all such uploads
against a database of copyright works - a database which they will be required to pay to access. Neither Article 11 nor Article 13 will become law until passed by the entire European Parliament in a plenary vote. There's no definite
timetable for when such a vote might take place, but it would likely happen sometime between December of this year and the first half of 2019.
Nascent censorship machines already rise up against the stupid politicians who support their genesis
18th June 2018
See article from privateinternetaccess.com CC by Rick Falkvinge
Politicians, about to vote in favor of mandatory upload filtering in Europe, get channel deleted by YouTube's upload filtering. French politicians of the former Front National are furious: their entire YouTube channel was just
taken down by automatic filters at YouTube for alleged copyright violations. Perhaps this will cause them to reconsider next week's vote, which they have announced they will support: the bill that will make exactly this arbitrary, political, and
unilateral upload filtering mandatory all across Europe. The French party Front National, now renamed Rassemblement National (National Rally), which is one of the biggest parties in France, has had its YouTube channel
disappeared on grounds of alleged copyright violations. In an interview with the French broadcaster Europe 1, their leader Marine Le Pen calls the takedown arbitrary, political, and unilateral. Europe is about to vote on new copyright law next
week. Next Wednesday or Thursday. So let's disregard here for a moment that this happened to a party normally described as far-right, and observe that if it can happen to one of France's biggest parties regardless of their policies, then it can happen to
anyone, for political reasons or any other reason. The broadcast named TVLibertés is gone, with YouTube stating: YouTube has blocked the broadcast of the newscast of Thursday, June 14 for copyright infringement.
Marine Le Pen was quoted as saying, This measure is completely false; we can easily assert a right of quotation [to illustrate why the material was well within the law to broadcast]. She's right. Automated
upload filters do not take into account when you have a legal right to broadcast copyrighted material for one of the myriad of valid reasons. They will just assume that such reasons never exist; if nothing else, to make sure that the hosting
platform steers clear of any liability. Political messages will be disappeared on mere allegations by a political opponent, just as might have happened here. And yet, the Rassemblement National is going to vote in favor of exactly
this mandatory upload filtering. The very horror they just described on national TV as arbitrary, political, and unilateral. It's hard to illustrate more clearly that Europe's politicians have absolutely no idea about the monster they're
voting on next week. The decisions to come will be unilateral, political, and arbitrary. Freedom of speech will be unilateral, political, and arbitrary. Just as Marine Le Pen says. Just as YouTube's Content ID filtering is today,
as has just been illustrated. The article mandating this unilateral, political, and arbitrary censorship is called Article 13 of the upcoming European Copyright bill, and it must be removed entirely. There is no fixing
automated censorship machines. Privacy remains your own responsibility. So do your freedoms of speech, information, and expression.
The UN's free speech rapporteur condemns the EU's censorship machines that will violate human rights
17th June 2018
See article from techdirt.com
David Kaye, the UN's Special Rapporteur on freedom of expression, has now chimed in with a very thorough report, highlighting how Article 13 of the Directive -- the part about mandatory copyright filters -- would be a disaster for free speech and would
violate the UN's Universal Declaration of Human Rights, and in particular Article 19, which says: Everyone has the right to freedom of opinion and expression; this right includes freedom to hold opinions without interference and to
seek, receive and impart information and ideas through any media and regardless of frontiers.
As Kaye's report notes, the upload filters of Article 13 of the Copyright Directive would almost certainly violate this principle.
Article 13 of the proposed Directive appears likely to incentivize content-sharing providers to restrict at the point of upload user-generated content that is perfectly legitimate and lawful. Although the latest proposed
versions of Article 13 do not explicitly refer to upload filters and other content recognition technologies, it couches the obligation to prevent the availability of copyright protected works in vague terms, such as demonstrating best efforts and taking
effective and proportionate measures. Article 13(5) indicates that the assessment of effectiveness and proportionality will take into account factors such as the volume and type of works and the cost and availability of measures, but these still leave
considerable leeway for interpretation. The significant legal uncertainty such language creates does not only raise concern that it is inconsistent with the Article 19(3) requirement that restrictions on freedom of expression
should be provided by law. Such uncertainty would also raise pressure on content sharing providers to err on the side of caution and implement intrusive content recognition technologies that monitor and filter user-generated content at the point of
upload. I am concerned that the restriction of user-generated content before its publication subjects users to restrictions on freedom of expression without prior judicial review of the legality, necessity and proportionality of such restrictions.
Exacerbating these concerns is the reality that content filtering technologies are not equipped to perform context-sensitive interpretations of the valid scope of limitations and exceptions to copyright, such as fair comment or reporting, teaching,
criticism, satire and parody.
Kaye further notes that copyright is not the kind of thing that an algorithm can readily determine, and the fact-specific and context-specific nature of copyright requires much more than just throwing
algorithms at the problem -- especially when a website may face legal liability for getting it wrong. The designation of such mechanisms as the main avenue to address users' complaints effectively delegates content
blocking decisions under copyright law to extrajudicial mechanisms, potentially in violation of minimum due process guarantees under international human rights law. The blocking of content -- particularly in the context of fair use and other
fact-sensitive exceptions to copyright -- may raise complex legal questions that require adjudication by an independent and impartial judicial authority. Even in exceptional circumstances where expedited action is required, notice-and-notice regimes and
expedited judicial process are available as less invasive means for protecting the aims of copyright law. In the event that content blocking decisions are deemed invalid and reversed, the complaint and redress mechanism
established by private entities effectively assumes the role of providing access to remedies for violations of human rights law. I am concerned that such delegation would violate the State's obligation to provide access to an effective remedy for
violations of rights specified under the Covenant. Given that most of the content sharing providers covered under Article 13 are profit-motivated and act primarily in the interests of their shareholders, they lack the qualities of independence and
impartiality required to adjudicate and administer remedies for human rights violations. Since they also have no incentive to designate the blocking as being on the basis of the proposed Directive or other relevant law, they may opt for the legally safer
route of claiming that the upload was a terms of service violation -- this outcome may deprive users of even the remedy envisioned under Article 13(7). Finally, I wish to emphasize that unblocking, the most common remedy available for invalid content
restrictions, may often fail to address financial and other harms associated with the blocking of time-sensitive content.
He goes on to point out that while large platforms may be able to deal with all of this, smaller ones are going to be
in serious trouble: I am concerned that the proposed Directive will impose undue restrictions on nonprofits and small private intermediaries. The definition of an online content sharing provider under Article 2(5) is
based on ambiguous and highly subjective criteria such as the volume of copyright protected works it handles, and it does not provide a clear exemption for nonprofits. Since nonprofits and small content sharing providers may not have the financial
resources to establish licensing agreements with media companies and other right holders, they may be subject to onerous and legally ambiguous obligations to monitor and restrict the availability of copyright protected works on their platforms. Although
Article 13(5)'s criteria for effective and proportionate measures take into account the size of the provider concerned and the types of services it offers, it is unclear how these factors will be assessed, further compounding the legal uncertainty that
nonprofits and small providers face. It would also prevent a diversity of nonprofit and small content-sharing providers from potentially reaching a larger size, and result in strengthening the monopoly of the currently established providers, which could
be an impediment to the right to science and culture as framed in Article 15 of the ICESCR.
More YouTube video and audio download sites closed down following legal pressure from the music industry
15th June 2018
See article from torrentfreak.com
Several video downloading and MP3 conversion tools have thrown in the towel this week, disabling all functionality following legal pressure. Pickvideo.net states that it received a cease and desist order, while Video-download.co and EasyLoad.co reference
the lawsuit against YouTube-MP3 as the reason for their decision. The music industry sees stream ripping as one of the largest piracy threats. The RIAA, IFPI, and BPI showed that they're serious about the issue when they filed legal action against
YouTube-MP3, the largest stream ripping site at the time. This case eventually resulted in a settlement where the site, once good for over a million daily visitors, agreed to shut down voluntarily last year. YouTube-MP3's demise was a clear
victory for the music groups, which swiftly identified their next targets, putting them under pressure, both in public and behind the scenes. This week this appears to have taken its toll on several stream ripping sites, which allowed users to
download videos from YouTube and other platforms, with the option to convert files to MP3s. The targets include Pickvideo.net, Video-download.co and Easyload.co, which all inform their users that they've thrown in the towel. With several million
visits per month, Pickvideo is the largest of the three. According to the site, they took the drastic measures following a cease-and-desist letter.
UK Supreme Court rules that the cost of website blocking should not be borne by ISPs and, indirectly, internet users
14th June 2018
See article from theregister.co.uk
See article from openrightsgroup.org
The UK Supreme Court has today ruled that trade mark holders are not able to compel ISPs to bear the cost of implementing orders to block websites selling counterfeit goods. Open Rights Group acted as an
intervener in this case. We argued that Internet service providers (ISPs), as innocent parties, should not bear the costs of website blocking, and that this was a long-standing principle of English law. Jim Killock, Executive Director of Open Rights
Group said: This case is important because if ISPs paid the costs of blocking websites, the result would be an increasing number of blocks for relatively trivial reasons and the costs would be passed to customers.
While rights holders may want websites blocked, it needs to be economically rational to ask for this.
Solicitor in the case David Allen Green said: I am delighted to have acted,
through my firm Preiskel, successfully for the Open Rights Group in their intervention. We intervened to say that those enforcing private rights on the internet should bear the costs of doing so, not others. This morning, the UK
Supreme Court held unanimously that the rights holders should bear the costs.
The main party to the case was BT, which opposed being forced to pay for costs incurred in blocking websites. Now rights-holders must reimburse ISPs for the
costs of blocking rights-infringing material. Supreme Court judge Lord Sumption, one of five on the panel, ruled: There is no legal basis for requiring a party to shoulder the burden of remedying an injustice if
he has no legal responsibility for the infringement and is not a volunteer but is acting under the compulsion of an order of the court. It follows that in principle the rights-holders should indemnify the ISPs against their
compliance costs. Section 97A of the Copyright, Designs and Patents Act 1988 allows rights-holders to go to court and get a blocking order -- the question in the current case is who stumps up for the costs of complying with that order?
Of course this now raises the question of who should pay for the mass porn website blocking that will be needed when the BBFC porn censorship regime starts its work.
Vint Cerf, Tim Berners-Lee, and Dozens of Other Computing Experts Oppose Article 13 of the EU's new internet censorship law
13th June 2018
See article from eff.org
See joint letter that was released today [pdf]
As Europe's latest copyright proposal heads to a critical vote on June 20-21, more than 70 Internet and
computing luminaries have spoken out against a dangerous provision, Article 13, that would require Internet platforms to automatically filter uploaded content. The group, which includes Internet pioneer Vint Cerf, the inventor of the World Wide Web Tim
Berners-Lee, Wikipedia co-founder Jimmy Wales, co-founder of the Mozilla Project Mitchell Baker, Internet Archive founder Brewster Kahle, cryptography expert Bruce Schneier, and net neutrality expert Tim Wu, wrote in a
joint letter that was released today: By requiring Internet platforms to perform automatic filtering of all of the
content that their users upload, Article 13 takes an unprecedented step towards the transformation of the Internet, from an open platform for sharing and innovation, into a tool for the automated surveillance and control of its users.
The prospects for the elimination of Article 13 have continued to worsen. Until late last month, there was hope that Member States (represented by the Council of the European Union) would find a compromise. Instead, their
final negotiating mandate doubled down on it. The last hope for defeating the proposal now lies with the European Parliament. On June 20-21 the Legal Affairs (JURI) Committee will vote on the proposal. If it votes against upload
filtering, the fight can continue in the Parliament's subsequent negotiations with the Council and the European Commission. If not, then automatic filtering of all uploaded content may become a mandatory requirement for all user content platforms that
serve European users. Although this will pose little impediment to the largest platforms such as YouTube, which already uses its Content ID system to filter content, the law will create an expensive barrier to entry for smaller platforms and startups, which may choose to establish or move their operations overseas in order to avoid the European law.
For those platforms that do establish upload filtering, users will find that their contributions--including video, audio, text, and even source code--will be
monitored and potentially blocked if the automated system detects what it believes to be a copyright infringement. Inevitably, mistakes will happen. There is no
way for an automated system to reliably determine when the use of a copyright work falls within a copyright limitation or exception under European law, such as quotation or parody. Moreover, because these exceptions are not
consistent across Europe, and because there is no broad fair use right as in the United States, many harmless uses of copyright works in memes, mashups, and remixes probably are technically infringing even if no reasonable copyright owner would
object. If an automated system monitors and filters out these technical infringements, then the permissible scope of freedom of expression in Europe will be radically curtailed, even without the need for any substantive changes in copyright law.
The upload filtering proposal stems from a misunderstanding about the purpose of copyright. Copyright isn't designed to compensate creators for each and every use of their works. It is meant to incentivize creators as part of an effort to promote the public interest in innovation and expression. But that public interest isn't served
unless there are limitations on copyright that allow new generations to build and comment on the previous contributions. Those limitations are both legal, like fair dealing, and practical, like the zone of tolerance for harmless uses. Automated upload
filtering will undermine both. The authors of today's letter write: We support the consideration of measures that would improve the ability for creators to receive fair remuneration for the use
of their works online. But we cannot support Article 13, which would mandate Internet platforms to embed an automated infrastructure for monitoring and censorship deep into their networks. For the sake of the Internet's future, we urge you to vote for
the deletion of this proposal.
What began as a bad idea offered up to copyright lobbyists as a solution to an imaginary "value gap" has now become an outright crisis for the future of the Internet as we know it. Indeed, if
those who created and sustain the operation of the Internet recognize the scale of this threat, we should all be sitting up and taking notice. If you live in Europe or have European friends or family, now could be your last
opportunity to avert the upload filter. Please take action by clicking the button below, which will take you to a campaign website where you can phone, email, or Tweet at your representatives, urging them to stop this threat to the global Internet before
it's too late. Take Action at saveyourinternet.eu
TorrentFreak explains the grave threat to internet users and European small businesses
6th June 2018
See article from torrentfreak.com CC
See also saveyourinternet.eu
The EU's plans to modernize copyright law in Europe are moving ahead. With a crucial vote coming up later this month, protests from various opponents are on the rise as well. They warn that the proposed plans will result in Internet filters which
threaten people's ability to freely share content online. According to Pirate Party MEP Julia Reda, these filters will hurt not only regular Internet users, but also creators and businesses. In September 2016, the European Commission
published its proposal for a modernized copyright law. Among other things, it proposed measures to require online services to do more to fight piracy. Specifically, Article 13 of the proposed Copyright Directive will require
online services to track down and delete pirated content, in collaboration with rightsholders. The Commission stressed that the changes are needed to support copyright holders. However, many legal scholars , digital activists ,
politicians , and members of the public worry that they will violate the rights of regular Internet users. Last month the EU Council finalized the latest version of the proposal. This means that the matter now goes to the Legal
Affairs Committee of the Parliament (JURI), which must decide how to move ahead. This vote is expected to take place in two weeks. Although the term filter is commonly used to describe Article 13, it is not directly mentioned in
the text itself. According to Pirate Party Member of Parliament (MEP) Julia Reda, the filter keyword is avoided in the proposal to prevent a possible violation of EU law and the Charter of Fundamental Rights. However, the
outcome is essentially the same. In short, the relevant text states that online services are liable for any uploaded content unless they take effective and proportionate action to prevent copyright infringements, identified by
copyright holders. That also includes preventing these files from being reuploaded. The latter implies some form of hash filtering and continuous monitoring of all user uploads. Several companies, including Google Drive, Dropbox,
and YouTube already have these types of filters, but many others don't. A main point of critique is that the automated upload checks will lead to overblocking, as they are often ill-equipped to deal with issues such as fair use.
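The hash-filtering idea mentioned above can be sketched very roughly as follows. This is a minimal illustration only, not how Content ID or any real system works; the function name and the sample data are invented for this example. It also shows, in miniature, the overblocking critique: an exact match cannot distinguish a pirate copy from a lawful quotation of the same bytes, while any trivially altered copy slips through entirely.

```python
import hashlib

# Hypothetical database of hashes of known copyrighted files,
# as it might be supplied by rightsholders.
BLOCKED_HASHES = {
    hashlib.sha256(b"copyrighted song data").hexdigest(),
}

def is_blocked(upload: bytes) -> bool:
    """Reject an upload if its hash matches a known copyrighted work.

    Note the limitation critics point to: the check knows nothing
    about context, so it cannot tell a verbatim pirate copy apart
    from a quotation or parody using the same material, and it
    misses any re-encoded or trimmed copy completely.
    """
    return hashlib.sha256(upload).hexdigest() in BLOCKED_HASHES

# A verbatim copy is caught...
print(is_blocked(b"copyrighted song data"))   # True
# ...but a one-byte change evades the filter.
print(is_blocked(b"copyrighted song data!"))  # False
```

Real filters therefore use fuzzy perceptual fingerprints rather than exact hashes, which is precisely what makes them prone to flagging lawful uses such as quotation and parody.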
The proposal would require platforms to filter all uploads by their users for potential copyright infringements -- not just YouTube and Facebook, but also services like WordPress, TripAdvisor, or even Tinder. We know from
experience that these algorithmic filters regularly make mistakes and lead to the mass deletion of legal uploads, Julia Reda tells TF. Especially small independent creators frequently see their content taken down because others
wrongfully claim copyright on their works. There are no safeguards in the proposal against such cases of copyfraud. Besides affecting uploads of regular Internet users and smaller creators, many businesses will also be hit. They
will have to make sure that they can detect and prevent infringing material from being shared on their systems. This will give larger American Internet giants, who already have these filters in place, a competitive edge over
smaller players and new startups, the Pirate Party MEP argues. It will make those Internet giants even stronger, because they will be the only ones able to develop and sell the filtering technologies necessary to comply with the
law. A true lose-lose situation for European Internet users, authors and businesses, Reda tells us. Based on the considerable protests in recent days, the current proposal is still seen as a clear threat by many.
In fact, the saveyourinternet.eu campaign, backed by prominent organizations such as Creative Commons, EFF, and Open Media, is ramping up again. They urge the
European public to reach out to their Members of Parliament before it's too late. Should Article 13 of the Copyright Directive proposal be adopted, it will impose widespread censorship of all the content you share online. The
European Parliament is the only one that can step in and Save your Internet, they write. The full Article 13 text includes some language to limit its scope. The nature and size of online services must be taken into account, for
example. This means that a small and legitimate niche service with a few dozen users might not be directly liable if it operates without these anti-piracy measures. Similarly, non-profit organizations will not be required to
comply with the proposed legislation, although there are calls from some member states to change this. In addition to Article 13, there is also considerable pushback from the public against Article 11, which is regularly referred
to as the link tax. At the moment, several organizations are planning a protest day next week, hoping to mobilize the public to speak out. A week later, following the JURI vote, it will be judgment day. If
they pass the Committee, the plans will progress towards the final vote on copyright reform next spring. This also means that they'll become much harder to stop or change. That has been done before, such as with ACTA, but achieving that type of momentum
will be a tough challenge.
The Open Rights Group finds that nearly 40% of court order blocks are in error
5th June 2018
See article from openrightsgroup.org
Open Rights Group today released figures that show that High Court injunctions are being improperly administered by ISPs and rights holders. Using a new tool added to its
blocked.org.uk project to examine over 1,000 domains blocked under the UK's 30 injunctions against over 150 services, ORG found that 37% of those domains are blocked in
error, or without any legal basis. The majority of the domains blocked are parked domains, or domains no longer used by infringing services. One Sci-Hub domain is blocked without an injunction, and a likely trademark-infringing site is also blocked without an
injunction. However, the full list of blocked domains is believed to run to around 2,500 domains and is not made public, so ORG is unable to check for all possible mistakes. Jim Killock, Executive Director of Open
Rights Group said: It is not acceptable for a legal process to result in nearly 40% maladministration. These results show a great deal of carelessness. We expect ISPs and rights holders to
examine our results and remove the errors we have found as swiftly as possible. We want ISPs to immediately release lists of previously blocked domains, so we can check blocks are being removed by everyone.
Rights holders must make public exactly what is being blocked, so we can ascertain how else these extremely wide legal powers are being applied.
ORG's conclusions are:
- The administration process of adding and removing domains to be blocked is very poor.
- Keeping the lists secret makes it impossible to check for errors.
- Getting mistakes corrected is opaque; the ISP pages suggest you go to court.
Examples: Some domains are potentially subject to an injunction which has not been sought, for instance: http://www.couchtuner.es . One redirects to a personal blog: http://kat.kleisauke.nl
Full results and statistical breakdowns: https://www.blocked.org.uk/legal-blocks/errors
Export full results: https://www.blocked.org.uk/legal-blocks
For a list of UK injunctions, see: https://wiki.451unavailable.org.uk/wiki/Main_Page . The UK has 30 copyright and trademark injunctions, blocking over 150 websites.
|
|
Sesame Street sues puppet movie over reference to their characters
|
|
|
| 31st May 2018
|
|
| 26th May 2018. See article from xbiz.com
See trailer from YouTube |
The creators of Sesame Street are suing the production company behind The Happytime Murders, claiming the mainstream comedy, which features ejaculating puppets and other sexual puppetry routines, is appropriating its brand. Sesame Workshop, creator of
the kids' show, alleges that the misuse of its brand is intended to confuse the public and infringes on its intellectual property rights. The company has initiated a lawsuit as a result of a trailer with explicit, profane, drug-using, misogynistic,
violent, copulating and even ejaculating puppets, along with the tagline 'NO SESAME. ALL STREET'. The Happytime Murders, set for an August release, is a murder mystery revolving around puppets who exhibit raunchy behavior.
Update: Judge not impressed by Sesame Street claims 31st May 2018. See article from pagesix.com
Manhattan federal Judge Vernon Broderick has rejected a request by the Sesame Workshop for a temporary restraining order to halt ads for the upcoming comedy Happytime Murders, including a YouTube trailer with the tagline, No Sesame.
All Street. Broderick ruled that the STX film -- directed by Brian Henson, the son of the late Jim Henson, whose Muppets have been central characters in the children's mainstay since its inception in 1969 -- was geared toward an entirely
different audience than Sesame Street. He also found that the trailer's No Sesame. All Street tagline was intended to differentiate the raunchy adult film from the wholesome educational show featuring Big Bird and Oscar the Grouch. The judge added
that the tagline serves as a disclaimer, albeit in a short and pithy manner.
|
|
Music industry is quick to lobby for Hancock's safe internet plans to be hijacked for their benefit
|
|
|
| 24th May 2018
|
|
| See article from torrentfreak.com
|
This week, Matt Hancock, Secretary of State for Digital, Culture, Media and Sport, announced the launch of a consultation on new legislative measures to clean up the Wild West elements of the Internet. In response, music group BPI says the government
should use the opportunity to tackle piracy with advanced site-blocking measures, repeat infringer policies, and new responsibilities for service providers. This week, the Government published its response to the Internet Safety Strategy green paper ,
stating unequivocally that more needs to be done to tackle online harm. As a result, the Government will now carry through with its threat to introduce new legislation, albeit with the assistance of technology companies, children's charities and other
stakeholders. While emphasis is being placed on hot-button topics such as cyberbullying and online child exploitation, the Government is clear that it wishes to tackle the full range of online harms. That has been greeted by UK music group BPI
with a request that the Government introduces new measures to tackle Internet piracy. In a statement issued this week, BPI chief executive Geoff Taylor welcomed the move towards legislative change and urged the Government to encompass the music
industry and beyond. He said: This is a vital opportunity to protect consumers and boost the UK's music and creative industries. The BPI has long pressed for internet intermediaries and online platforms to take
responsibility for the content that they promote to users. Government should now take the power in legislation to require online giants to take effective, proactive measures to clean illegal content from their sites and services.
This will keep fans away from dodgy sites full of harmful content and prevent criminals from undermining creative businesses that create UK jobs.
The BPI has published four initial requests, each of which provides food for thought.
The demand to establish a new fast-track process for blocking illegal sites is not entirely unexpected, particularly given the expense of launching applications for blocking injunctions at the High Court. The BPI has taken a large number of
actions against individual websites -- 63 injunctions are in place against sites that are wholly or mainly infringing and whose business is simply to profit from criminal activity, the BPI says. Those injunctions can be expanded fairly easily to
include new sites operating under similar banners or facilitating access to those already covered, but it's clear the BPI would like something more streamlined. Voluntary schemes, such as the one in place in Portugal , could be an option but it's unclear
how troublesome that could be for ISPs. New legislation could solve that dilemma, however. Another big thorn in the side for groups like the BPI is people and entities that post infringing content. The BPI is very good at taking these listings
down from sites and search engines in particular (more than 600 million requests to date) but it's a game of whac-a-mole the group would rather not engage in. With that in mind, the BPI would like the Government to impose new rules that would
compel online platforms to stop content from being re-posted after it's been taken down while removing the accounts of repeat infringers. Thirdly, the BPI would like the Government to introduce penalties for online operators who do not provide
transparent contact and ownership information. The music group isn't any more specific than that, but the suggestion is that operators of some sites have a tendency to hide in the shadows, something which frustrates enforcement activity. Finally,
and perhaps most interestingly, the BPI is calling on the Government to legislate for a new duty of care for online intermediaries and platforms. Specifically, the BPI wants effective action taken against businesses that use the Internet to encourage
consumers to access content illegally. While this could easily encompass pirate sites and services themselves, this proposal has the breadth to include a wide range of offenders, from people posting piracy-focused tutorials on monetized YouTube
channels to those selling fully-loaded Kodi devices on eBay or social media. Overall, the BPI clearly wants to place pressure on intermediaries to take action against piracy when they're in a position to do so, and particularly those who may not
have shown much enthusiasm towards industry collaboration in the past. Legislation in this Bill, to take powers to intervene with respect to operators that do not co-operate, would bring focus to the roundtable process and ensure that
intermediaries take their responsibilities seriously, the BPI says. |
|
|
|
|
|
7th May 2018
|
|
|
YouTube has 'how to' videos for pretty much everything. See article from torrentfreak.com |
|
The EU's latest copyright proposal is so bad, it even outlaws Creative Commons licenses
|
|
|
| 12th April 2018
|
|
| See article from boingboing.net CC by Cory Doctorow |
The EU is mooting a new copyright regime for the largest market in the world, and the Commissioners who are drafting the new rules are completely captured by the entertainment industry, to the extent that they have ignored their own experts and produced
a farcical Big Content wishlist that includes the most extensive internet censorship regime the world has ever seen, perpetual monopolies for the biggest players, and a ban on European creators using Creative Commons licenses to share their works.
Under the new rules, anyone who allows the public to post material will have to maintain vast databases of copyrighted works
claimed by rightsholders, and any public communication that matches anything in these databases has to be blocked. These databases have been tried at much more modest scales -- Youtube's Content ID is a prominent example -- and they're a mess.
Because rightsholders are free to upload anything and claim ownership of it, Content ID is a font of garbagey, sloppy, fraudulent copyright abuse:
five different companies claim to own the rights to white noise ; Samsung claims to
own any drawing of its phones ; Nintendo claims it
owns gamers' animated mashups ; Sony
claims it owns stock footage it stole from a filmmaker whose work it had censored; the biggest music companies in the world
all claim to own the rights to "Silent Night" , a rogues' gallery of sleazy copyfraudsters
claim to own NASA's spacecraft landing footage -- all in all,
these systems benefit the large and the unethical at the cost of the small and nimble. That's just for starters.
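To make the structural flaw concrete, here is a deliberately naive sketch of the claim-then-filter scheme the proposal mandates. All names are hypothetical, and real systems such as Content ID use perceptual fingerprinting rather than exact hashes; but even this toy version shows the core problem: nothing verifies that a claimant actually owns what they register, which is exactly how the copyfraud examples above arise.

```python
# Toy sketch of an upload filter built on a rightsholder claims database.
# Illustrative only: real filters use fuzzy perceptual fingerprints,
# not exact hashes, which makes false matches even more likely.
import hashlib

claims = {}  # fingerprint -> claimed owner; anyone may add entries


def fingerprint(data: bytes) -> str:
    """Stand-in for a perceptual fingerprint; here just an exact hash."""
    return hashlib.sha256(data).hexdigest()


def register_claim(owner: str, work: bytes) -> None:
    # Note: nothing here checks that 'owner' actually holds the rights.
    claims[fingerprint(work)] = owner


def check_upload(data: bytes):
    """Return the claimant if the upload matches a claimed work, else None."""
    return claims.get(fingerprint(data))


# A bogus claimant registers public-domain material...
register_claim("CopyfraudCo", b"Silent night, holy night")

# ...and every later upload of the same material is blocked on their behalf.
print(check_upload(b"Silent night, holy night"))  # -> CopyfraudCo
```

The design choice the legislation forces is visible in `register_claim`: the platform has no practical way to adjudicate ownership at scale, so the database trusts whoever files first, and disputes fall on the uploader after the block.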
Since these filter systems are incredibly expensive to create and operate, anyone who wants to get into business competing with the companies that grew large without having to create systems like these will have to source hundreds of
millions in capital before they can even enter the market. Youtube 2018 can easily afford Content ID; Youtube 2005 would have been bankrupted if they'd had to build it. And then there's the matter of banning Creative Commons
licenses. In order to bail out the largest newspapers in the EU, the Commission is proposing a Link Tax -- a fee that search engines and sites like Boing Boing will have to pay just for the right to link to news stories on the
web. This idea has been tried before in Spain and Germany and the newspapers who'd called for it quickly admitted it wasn't working and stopped using it. But the new, worse-than-ever Link Tax contains a new wrinkle: rightsholders
will not be able to waive the right to be compensated under the Link Tax. That means that European creators -- who've released hundreds of millions of works under Creative Commons licenses that allow for free sharing without fee or permission -- will no
longer be able to choose the terms of a Creative Commons license; the inalienable, unwaivable right to collect rent any time someone links to your creations will invalidate the core clause in these licenses. Europeans can write to
their MEPs and the European Commission using this joint Action Centre ; please act before it's too late. The European Copyright Directive
was enacted in 2001 and is now woefully out of date. Thanks in large part to the work of Pirate Party MEP Julia Reda, many good ideas for updating European copyright law were put forward in a report of the European Parliament in July 2015. The European
Commission threw out most of these ideas, and instead released a legislative proposal in October 2016 that focused on giving new powers to publishers. That proposal was referred to several of the committees of the European Parliament, with the
Parliament's Legal Affairs (JURI) Committee taking the lead. As the final text must also be accepted by the Council of the European Union (which can be considered as the second part of the EU's bicameral legislature), the Council
Presidency has recently been weighing in with its own "compromise" proposals (although this is something of a misnomer, as they do little to improve the Commission's original text, and in some respects make it worse). Not to be outdone, German
MEP (Member of the European Parliament) Axel Voss last month introduced a new set of his own proposals [PDF] for "compromise," which are somehow worse still. Since Voss leads the JURI committee, this is a big problem.
|
|
|
|
|
| 8th April 2018
|
|
|
YouTube-MP3 was the world's largest YouTube-ripping service, but last year it shut down following a lawsuit filed by the world's largest record labels. But what about companies that supply rip-it-yourself downloading tools? See
article from torrentfreak.com |
|
|