Child protection campaigners and companies form trade group to oppose free speech and privacy arguments

27th April 2020. See article from ostia.org.uk
Online technology companies have joined to form a trade group to promote the case for child protection as the Government's Online Harms internet censorship bill works its way through parliament. The Online Safety Tech Industry Association (OSTIA) launched at Leeds Digital Festival on April 27, 2020 and brings together companies that operate in the field of online safety. Initiated by Edinburgh-based Cyan Forensics and PUBLIC, 14 separate tech companies have joined. Members include Yoti, Crisp, Securium, SuperAwesome and SafeToNet.

The group says that one of its aims is to counter free speech and privacy arguments that struck a chord with the public in press coverage of the aborted age verification censorship measures included in the government's Digital Economy Act. The group states on its website:

Too often the debate about Online Safety is focused around what cannot be done, what is technically impossible, and the conflict with other rights such as privacy and freedom of speech. We seek to provide an alternative voice in the debate.

The UK government has welcomed the launch, and the Internet Watch Foundation (IWF) and NSPCC are also supporting the group. Caroline Dinenage, Minister of State for Digital and Culture in the Department for Digital, Culture, Media and Sport, said:

We are determined to make the UK the safest place in the world to be online and have set out world-leading proposals to put a duty of care on online companies, enforced by an independent regulator. We are backing the industry to support our work by developing new products to improve online security and drive growth in the digital economy. This new association will help bring together relevant organisations to collaborate, innovate and create a safer online world.
Inevitably with baked-in state central control

24th April 2020

4th April 2020. See CC article from privateinternetaccess.com by Caleb Chen
The Chinese government and the Chinese telecommunications companies under its control, such as Huawei, are proposing a New IP addressing system for the internet to replace TCP/IP. The New IP system includes top-down checks and balances and such features as a shut up command that would allow a central controller to stop packets from being received or sent by a target New IP address. The China-led proposal was first unveiled at the International Telecommunication Union (ITU) meeting in September 2019. The associated PowerPoint presentation and formal proposal have been made available by the Financial Times.

In it, the Chinese government and its state-controlled telecommunications service and hardware providers (i.e. Huawei) make the case that TCP/IP is broken and won't scale for use in the future internet, which will include things like holograms and space-terrestrial communications. China argues that these new technologies on the old system would require complex translators and increase the overall cost to society. The New IP proposal admits that TCP/IP has achieved relatively good security. However, China feels that this is still far away from what we actually require in the future. If the security is admittedly relatively good, what could possibly be missing? Apparently, the answer to that question is trust. The proposal reads:

As universal connectivity develops, a better security and trust model need to be designed and deployed to provide a stable, trustworthy, and long-term environment for people to use.

Let's be clear: Trust should have no part in this. Especially this type of absolute trust in centralized institutions -- that have repeatedly proven to be unworthy of such trust -- which is exactly what China is trying to force down the internet world's throat. Let's not forget that China is the same country that already forces real name and identification to be tied to all internet or phone services and also runs a censorship campaign against the open internet so large that it's called the Great Firewall.

NATO report warns against China's New IP system and its proposed Splinternet

Oxford Information Labs (Oxil) has prepared a research report for the North Atlantic Treaty Organization (NATO) that does not look kindly on the New IP proposal or the breakneck pace at which it is being rushed through the approval process. The report's authors at Oxil spoke with Infosecurity and provided it with an advance copy of the NATO report. Oxil summed up the problem with New IP concisely:

New IP would centralize control over the network into the hands of telecoms operators, all of which are either state run or state-controlled in China. So, internet infrastructure would become an arm of the Chinese state.
The New IP model also takes pot shots
at current centralized parts of the internet, such as the Domain Name System (DNS), and offers Distributed Ledger Technology (DLT) solutions under the guise of promoting a Decentralized Internet Infrastructure (DII) to address them. While that may sound
like the holy grail of blockchain technology and true decentralization that real public blockchain technologies such as Handshake provide, what is being proposed by China is absolutely not that. Oxil notes that the proposed DLTs would undoubtedly be
under Chinese government control -- bringing about that call for trust again. Oxil explained to Infosecurity: It is not uncommon for language of 'trust' to replace 'security' in Chinese DII-related discussions. This is
concerning because it indicates that the principle of 'security by design' -- at least in the Western context -- is not being adopted in DII's development. In the long-term this could negatively impact cybersecurity globally.
It doesn't matter how distributed or decentralized parts of a protocol seem on the surface: if there is a centralized command at the top that can issue shut up commands to devices supposedly connected to an open internet, then those devices aren't actually connected to an open internet at all.

China will move towards using New IP with or without ITU approval

Huawei is apparently already building internet infrastructure that utilizes New IP as opposed to
TCP/IP with partner countries, likely in Africa. Besides that, the Chinese proposal for a more top-down controlled internet has also seen support from Russia, Saudi Arabia, and Iran. While Huawei claims that this is an open process, and is open to
scientists and engineers worldwide to participate in and contribute to, the fact that nobody really knows what's going on besides those involved in the process is telling. Robert Clark, writing for LightReading, calls New IP Huawei's real threat to
networking and describes the situation aptly: Huawei's important additional role here is as the major supplier to telcos in many developing countries. It is these governments that are likely the biggest enthusiasts for
a manageable Internet without being hectored by Western governments about openness and freedom. And Huawei staff are on hand to help them build it.
That is to say, Huawei is already going ahead and building New IP
systems with shut up commands and all -- in effect creating the very network islands that they want to use as a reason that TCP/IP won't work. In reality, those seeking to expand network functionality to new types of devices and services such as
holograms or satellite comms and more internet of things devices have all the incentive in the world to make something that works with the existing TCP/IP world. In contrast, China and other countries that do not want true freedom on the internet are all
too eager to create a form of the internet that gives them ultimate, centralized control. That China is proffering this New IP model to the free world as an improvement should be expected, and thoroughly ignored and lambasted.
Update: Opposed by European internet industry

24th April 2020. See article from zdnet.com
The RIPE NCC is the Regional Internet Registry for Europe, the Middle East and parts of Central Asia. It allocates and registers blocks of Internet number resources to Internet service providers (ISPs) and other organisations. Its membership consists mainly of Internet service providers, telecommunications organisations and large corporations.

RIPE is opposing the proposal to remodel core internet protocols that is backed by the Chinese government, Chinese telecoms and Chinese networking equipment vendor Huawei. Named New IP, the proposal consists of a revamped version of the TCP/IP standards to accommodate new technologies, a shutoff protocol to cut off misbehaving parts of the internet, and a new top-to-bottom governance model that centralizes the internet and puts it into the hands of a few crucial node operators. The proposal received immediate criticism from the general public and privacy advocates due to its obvious attempt to hide internet censorship features behind a technical redesign of the TCP/IP protocol stack. Millions of eyebrows were raised when authoritarian countries like Iran, Russia and Saudi Arabia expressed support for the proposal.

In a blog post this week, the RIPE NCC formally expressed a public opinion against China's New IP proposal. Marco Hogewoning, acting Manager of Public Policy and Internet Governance at the RIPE NCC, said:
Do we need New IP? I don't think we do. Although certain technical challenges exist with the current Internet model, I do not believe that we need a whole new architecture to address them. Any endeavors to
revamp internet protocols should be left to the Internet Engineering Task Force (IETF), the international body that has been in charge of defining internet standards for decades. Such issues should not be left to the ITU, which is the United Nations' telecommunications body, and an agency where political influence rules, rather than technically sound arguments. In addition, RIPE is also concerned with the attempt to change the internet's current decentralized nature.
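To make the shutoff concern concrete, here is a minimal illustrative sketch (not based on any published New IP specification, whose technical details have not been made public in full) of what protocol-level central control would mean: a single authority keeps a list of silenced addresses that every node must consult before forwarding traffic.

```python
# Toy illustration only: a hypothetical central "shut up" registry of the kind
# critics fear New IP would enable. Nothing here reflects the actual proposal.

class CentralShutoffRegistry:
    """A single authority that can order the whole network to silence an address."""

    def __init__(self):
        self._silenced = set()

    def shut_up(self, address):
        # One command from the central controller silences an address network-wide.
        self._silenced.add(address)

    def is_silenced(self, address):
        return address in self._silenced


def forward_packet(registry, src, dst, payload):
    """Every node consults the central registry before forwarding, so a silenced
    address can neither send nor receive."""
    if registry.is_silenced(src) or registry.is_silenced(dst):
        return False  # packet silently dropped everywhere on the network
    # ... normal routing and delivery would happen here ...
    return True


registry = CentralShutoffRegistry()
print(forward_packet(registry, "node-a", "node-b", b"hello"))  # True: delivered
registry.shut_up("node-a")
print(forward_packet(registry, "node-a", "node-b", b"hello"))  # False: dropped
```

Contrast this with TCP/IP, where no such protocol-level kill switch exists: blocking has to be imposed piecemeal at individual routers and firewalls, each under separate administrative control.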
Australia accelerates its link tax proposals

23rd April 2020

20th April 2020. See article from theguardian.com
The Australian Competition and Consumer Commission (ACCC) is accelerating its proposals to require social media companies to share revenue obtained from sharing or linking to Australian media sources. A mandatory code being developed by the ACCC will include penalties for Google, Facebook and other media platforms that share news content. The code, originally scheduled for November 2020, is being brought forward as newspapers struggle for income during the coronavirus lockdown.

The code originally required internet companies to negotiate in good faith on how to pay news media for use of their content, advise news media in advance of algorithm changes that would affect content rankings, favour original source news content in search page results, and share data with media companies. But of course limited success in early negotiations between the platforms and the news industry has led to more of a mandatory imposition approach. Now a draft code will be finalised by the end of July.

Update: Britain too

23rd April 2020. See article from dailymail.co.uk

Facebook and Google should be made to pay for news content generated by the UK media to avoid the death of the industry, UK ministers were told today. Ex-Culture Committee chair Damian Collins is urging the government to follow the example of Australia, where new rules are being brought in to help prop up publications amid coronavirus turmoil.
The EFF responds to a petition calling for EU censorship machines to be required in the US too. By Katharine Trendacosta and Corynne McSherry

22nd April 2020. See article from eff.org
Right now, we really are living our everyday lives online. Teachers are trying to teach classes online, librarians are trying to host digital readings, and trainers are trying to offer home classes. With more people entering
the online world, more people are encountering the barriers created by copyright. Now is no time to make those barriers higher, but a
new petition directed at tech companies does exactly that, and in the
process tries to do for the US what Article 17 of last year's European Copyright Directive is doing for Europe--create a rule requiring online service providers to send everything we post to the Internet to black-box machine learning filters that will block anything that the filters classify as "copyright infringement." The petition, from musical artists, calls on companies to "reduce copyright infringement by establishing 'standard technical measures.'" The
argument is that, because of COVID-19, music labels and musicians cannot tour and, therefore, are having a harder time making up for losses due to online copyright infringement. So the platforms must do more to prevent that infringement.
Musical artists are certainly facing grave financial harms due to COVID-19, so it's understandable that they'd like to increase their revenue wherever they can. But there are at least three problems with this approach, and each
creates a situation which would cause harm for Internet users and wouldn't achieve the ends musicians are seeking. First, the Big Tech companies targeted by the petition already employ a wide variety of technical measures
in the name of blocking infringement, and long experience with these systems has proven them to be utterly toxic to lawful online speech.
YouTube even warned that this current crisis would prompt even more mistakes, since human review and appeals were
going to be reduced or delayed. It has, at least, decided not to issue strikes except where it has "high confidence" that there was some violation of YouTube policy. In a situation where more people than before are relying on these platforms to
share their legally protected expression, we should, if anything, be looking to lessen the burden on users, not increase it. We should be looking to make them fairer, more transparent, and appeals more accessible, not adding more technological barriers.
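As a purely illustrative aside (Content ID's actual matching is proprietary audio and video fingerprinting, nothing this simple), a toy filter shows the structural problem shared by all such systems: matching can only say that uploaded content resembles a reference work, never whether the use is licensed, fair, public domain, or not copyrightable at all.

```python
# Deliberately simplified sketch of a context-blind copyright filter. The reference
# entries and claimant names are hypothetical; real systems use fingerprinting
# rather than exact hashes, but share the same blind spot: no notion of context.

import hashlib

# Hypothetical reference database: content digest -> claimant.
REFERENCE_DB = {
    hashlib.sha256(b"recording of a public-domain Bach partita").hexdigest(): "Label A",
    hashlib.sha256(b"ten hours of static").hexdigest(): "Label B",
}

def filter_upload(upload: bytes) -> str:
    digest = hashlib.sha256(upload).hexdigest()
    claimant = REFERENCE_DB.get(digest)
    if claimant is None:
        return "published"
    # The filter cannot ask whether this is the uploader's own performance,
    # a review, a parody, or a government document. It only knows the bytes matched.
    return f"blocked or monetized on behalf of {claimant}"

print(filter_upload(b"recording of a public-domain Bach partita"))  # flagged anyway
print(filter_upload(b"my original song"))                           # published
```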
YouTube's Content ID tool has flagged everything from someone speaking into a mic to check
the audio to a synthesizer test. Scribd's filter caught and removed a duplicate upload of the Mueller report, despite the fact that anything created by a federal government employee as part of their work can't even be copyrighted. Facebook's
Rights Manager keeps flagging its users'
performances of classical music composed hundreds of years ago. Filters can't distinguish lawful from unlawful content. Human beings need to
review these matches. But they don't. Or if they do, they aren't trained to distinguish lawful uses. Five rightsholders were happy to monetize ten hours of static because Content ID matched it. Sony refused the dispute by one Bach
performer, who only got his video unblocked after leveraging public outrage. A video explaining how musicologists determine whether one song infringes on another was taken down by Content ID, and the system was so confusing that law professors who are
experts in intellectual property couldn't figure out what effect disputing the claim would have on their account. They only got the video restored because they were
able to get in touch with YouTube via their connections. Private connections, public outrage, and press coverage often get these bad matches undone, but they are not a substitute for a fair system.

Second, adding more restrictions will make creating and sharing our common culture harder at a time when, if anything, it needs to be easier.
We should not require everyone online to become an expert in law and the specific labyrinthine policies of a company or industry just when whole new groups of people are transferring their lives, livelihoods, and communities to the Internet.
If there's one lesson recent history has taught us, it's that "temporary, emergency measures" have a way of sticking around after the crisis passes, becoming a new normal. For the same reason that we should be worried about
contact tracing apps becoming a permanent means for governments to track and control whole populations, we should be alarmed at the thought that all our online lives (which, during the quarantine, are almost our whole lives) will be subjected to
automated surveillance, judgment and censorship by a system of unaccountable algorithms operated by massive corporations where it's impossible to get anyone to answer an email. Third, this petition appears to be a dangerous step
toward the content industry's Holy Grail: manufacturing an industry consensus on standard technical measures (STMs) to police copyright infringement. According to Section 512 of the Digital Millennium Copyright Act (DMCA), service providers must
accommodate STMs in order to receive the safe harbor protections from the risk of crippling copyright liability. To qualify as an STM, a measure must (1) have been developed pursuant to a broad consensus in an "open, fair, voluntary, multi-industry
standards process"; (2) be available on reasonable and nondiscriminatory terms; and (3) cannot impose substantial costs on
service providers
. Nothing has ever met all three requirements, not least because no "open, fair, voluntary, multi-industry standards process" exists. Many in the content industries would like to change that, and some would like to
see the U.S. follow the EU in adopting mandatory copyright filtering. The EU's Copyright Directive -- best known for its most controversial part, Article 17 -- passed a year ago, but only one country has made progress towards implementing it [pdf]. Even before the current crisis, countries were having trouble reconciling the rights of users, the rights of copyright holders, and the obligations of platforms into workable law. The United Kingdom took Brexit as a chance not to implement it. And requiring automated filters in the EU runs into the problem that the EU has recognized the danger
of algorithms by giving users the right not to be subject to decisions made by automated tools. Put simply, the regime envisioned by Article 17 would end up being too complicated and expensive for most platforms to build and
operate. YouTube's Content ID alone has cost $100,000,000 to date, and it just filters videos for one service. Musicians are 100 percent right to complain about the size and influence of YouTube and Facebook, but mandatory filtering creates a world in
which only YouTube and Facebook can afford to operate. Cementing Big Tech's dominance is not in the interests of musicians or users. Mandatory copyright filters aren't a way to control Big Tech: they're a way for Big Tech to buy Perpetual Internet
Domination licenses that guarantee that they need never fear a competitor. Musicians are under real financial stress due to COVID-19, and they are not incorrect to see something wrong with just how much of the world is in the
hands of Big Tech. But things will not get better for them or for users by entrenching its position or making it harder to share work online.
Facebook censors anti-lockdown protests if prohibited by the state

21st April 2020. See article from dailymail.co.uk
Facebook says it will consult with state governments on their lockdown orders and will shut down pages planning anti-quarantine protests accordingly. Events that defy government guidance on social distancing have also been banned from Facebook.

The move has been opposed by Donald Trump Jr and the Missouri Senator Josh Hawley, who argue that Facebook is violating Americans' First Amendment rights. Facebook said it has already removed protest messages in California, New Jersey and Nebraska. However, protests are still being organized on Facebook. A massive protest took place in Harrisburg, Pennsylvania on Monday afternoon that was organized on the Facebook group Pennsylvanians against Excessive Quarantine Orders.
20th April 2020

The internet industry is still scratching its head about an upcoming EU copyright law requiring social media to block uploads of illegal content whilst requiring that they do not over-block legal content. See article from euractiv.com
18th April 2020

Why there's a danger in allowing a single entity to influence what our society deems decent. By Katie Wheeler. See article from theguardian.com
Government set to launch an NHSX coronavirus contact tracing app

17th April 2020

6th April 2020. See article from cloudpro.co.uk
The UK government is reportedly preparing to launch an app that will warn users if they are in close proximity to someone who has tested positive for coronavirus. The contact-tracing app will be released just before the lockdown is lifted or in its immediate aftermath and will use short-range Bluetooth signals to detect other phones in close vicinity and then store a record of those contacts on the device. If somebody tests positive for COVID-19, they will be able to upload those contacts, who can then be alerted via the app. It is reported that this data will not generally be shared with a central authority, potentially easing concerns that the app could snitch on users to the police for going jogging twice a day, or spending the night at your girlfriend's place.

NHSX, the innovation arm of the UK's National Health Service, will reportedly appoint an ethics board to oversee the development of the app, with its board members set to be announced over the coming weeks. It is a bit alarming that the government is envisaging such a long development schedule, suggesting perhaps that the end to the lockdown will be months away. The NHS is reportedly counting on the app being downloaded by more than 50% of the population.
Offsite Comment: The government must explain its approach to mobile contact tracing

7th April 2020. See article from openrightsgroup.org by Jim Killock
The idea is for some 60% of the population to use an app which will look for people with the same app to record proximity. This data is then stored centrally. Health officials then add data of people who have been positively tested for COVID-19.
Finally, persons who may be at risk because of their proximity to someone with the virus are alerted to this and asked to self-isolate. This approach is likely to work best late on, when people are out of the full lock down and
meeting people more than they were. It may be a key part of the strategy to move us out of lockdown and for dealing with the disease for some time afterwards. At the current time, during lockdown, it would not be so useful, as people are avoiding risk
altogether. Of course, it will be a huge challenge to persuade perhaps 75% or more of smartphone users (80% of adults have a smartphone) to install such an app, and keep it running for however long it is needed. And there are
limitations: for instance a window or a wall may protect you while the app produces a false positive for risky contact. The clinical efficacy of any approach needs to be thoroughly evaluated, or any app will risk making matters worse.
Getting users to install and use an application like this, and share location information, creates huge privacy and personal risks. It is an enormous ask for people to trust such an app -- which explains why both the UK and EU are
emphasising privacy in the communications we have heard, albeit the EU project is much more explicit. It has a website, which explains: PEPP-PT was explicitly created to adhere to strong European privacy and data protection laws
and principles. The idea is to make the technology available to as many countries, managers of infectious disease responses, and developers as quickly and as easily as possible. The technical mechanisms and standards provided by PEPP-PT fully protect
privacy and leverage the possibilities and features of digital technology to maximize speed and real-time capability of any national pandemic response. There are plenty of other questions that arise from this approach. The
European project and the UK project share the same goals; the companies, institutions and governments involved must be talking with each other, but there is no sign of any UK involvement on the European project's website. The
European project has committed to producing its technology in the open, for the world to share, under a Mozilla licence. This is the only sane approach in this crisis: other countries may need this tool. It also builds trust as people can evaluate how
the technology works. We don't know if the UK will share technology with this project, or if it will develop its own. On the face of it, sharing technology and resources would appear to make sense. This needs clarifying. In any
event, the UK should be working to produce open source, freely reusable technology. We urgently need to know how the projects will work together. This is perhaps the most important question. People do, after all, move across
borders; the European project places a strong emphasis on interoperability between national implementations. In the UK, at the Irish border, it would make no sense for systems lacking interoperability to exist in the North and Eire.
Thus the UK and Europe will need to work together. We need to know how they will do this. We are in a crisis that demands we share resources and technology, but respect the privacy of millions of people as best
as we can. These values could easily flip -- allowing unrestricted sharing of personal data but failing to share technologies. The government has already made a number of communications mis-steps relating to data, including
statements that implied data protection laws do not apply in a health crisis; using aggregate mobile data without explaining why and how this is done; and employing the surveillance company Palantir without explaining or stating that it would be kept
away from further tasks involving personal data. These errors may be understandable, but to promote a mobile contact tool using massive amounts of personal location data, that also relies on voluntary participation, the UK
government will have to do much better. PEPP-PT is showing how transparency can be done; while it too is not yet at a point where we understand their full approach, it is at least making a serious effort to establish trust. We
need the UK government to get ahead, as Europe is doing, and explain its approach to this massive, population-wide project, as soon as possible.

Offsite Comment: The EU also has an app

7th April 2020. See article from politico.eu

Years of efforts to safeguard personal data are running headlong into calls for drastic action to counter the pandemic.

Update: The Challenge of Proximity Apps For COVID-19 Contact Tracing

11th April 2020. See article from eff.org
Developers are rapidly coalescing around applications for proximity tracing, which measures Bluetooth signal strength to determine whether two smartphones were close enough together for their users to transmit the virus. In this
approach, if one of the users becomes infected, others whose proximity has been logged by the app could find out, self-quarantine, and seek testing. Just today, Apple and Google announced joint application programming interfaces (APIs) using these
principles that will be rolled out in iOS and Android in May. A number of similarly designed applications are now available or will launch soon.
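To illustrate the general shape of these proximity designs (a simplified sketch only, with assumed threshold and retention values, not the actual Apple/Google APIs), each phone logs the anonymous identifiers it hears over Bluetooth, and exposure is later checked against identifiers published by users who test positive.

```python
# Simplified illustration of decentralised Bluetooth proximity tracing. This is not
# the actual Apple/Google Exposure Notification protocol, which adds rotating
# identifiers, cryptographic key derivation and strict retention rules.

import time

RSSI_THRESHOLD_DBM = -65        # assumed signal strength implying "close contact"
RETENTION_SECONDS = 14 * 86400  # keep contact records for 14 days, then discard


class ProximityLogger:
    """Runs on each phone: records anonymous identifiers heard nearby over Bluetooth."""

    def __init__(self):
        self._contacts = {}  # identifier -> last time heard

    def on_bluetooth_sighting(self, identifier, rssi_dbm):
        # Only log sightings strong enough to suggest the phones were close together.
        if rssi_dbm >= RSSI_THRESHOLD_DBM:
            self._contacts[identifier] = time.time()

    def prune(self):
        cutoff = time.time() - RETENTION_SECONDS
        self._contacts = {i: t for i, t in self._contacts.items() if t >= cutoff}

    def check_exposure(self, published_infected_ids):
        # Matching happens on the device; nothing leaves the phone unless its owner
        # tests positive and chooses to share their own identifiers.
        return any(i in published_infected_ids for i in self._contacts)


phone = ProximityLogger()
phone.on_bluetooth_sighting("id-1234", rssi_dbm=-60)  # a nearby phone
phone.on_bluetooth_sighting("id-9999", rssi_dbm=-90)  # too weak: probably not close
print(phone.check_exposure({"id-1234"}))              # True: alert this user
```

The centralised variant described by Jim Killock above differs mainly in where this matching happens: the contact log is uploaded to a health authority server rather than checked on the phone.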
Update: Confirmed

13th April 2020. See article from bbc.co.uk

The UK has confirmed plans for an app that will warn users if they have recently been in close proximity to someone suspected to be infected with the coronavirus. The health secretary Matt Hancock announced the move at the government's daily pandemic press briefing. He said the NHS was working closely with the world's leading tech companies on the initiative. The BBC has learned that NHSX - the health service's digital innovation unit - will test a pre-release version of the software with families at a secure location in the North of England next week.

Update: Can your smartphone crack Covid?

14th April 2020. See article from unherd.com by Timandra Harkness

I write constantly about the threat to privacy of letting our smartphones share data that reveals where we go, what we do, and who shares our personal space. And although these are exceptional circumstances, we should not stop valuing our privacy. Emergency measures have a habit of becoming the new normal. And information about who we've been close to could be of interest to all sorts of people, from blackmailers to over-enthusiastic police officers enforcing their own interpretation of necessary activities.

Update: NHS in standoff with Apple and Google over coronavirus tracing

17th April 2020. See article from theguardian.com

Tech firms place limitations on how tracing apps may work in an effort to protect users' privacy.
US court rules that posting images on Instagram effectively grants third party websites copyright permission to embed those images

15th April 2020. See article from theverge.com
A US court ruled yesterday that Mashable can embed a professional photographer's photo without breaking copyright law, thanks to Instagram's terms of service. The New York district court determined that Stephanie Sinclair offered a valid sublicense to
use the photograph when she posted it publicly on Instagram. The case stems from a 2016 Mashable post on female photographers, which included Sinclair and embedded an image from her Instagram feed. Mashable had previously failed to license the
image directly, and Sinclair sued parent company Ziff Davis for using Instagram embedding as a workaround. The judge noted that Instagram reserves a fully paid and royalty-free, transferable, sub-licensable right to photos on its service. If a
photo is posted publicly, it also offers embedding as an option -- which, in Judge Kimba Wood's estimation, effectively grants a sublicense to display the picture. The user who initially uploaded the content has already granted Instagram the authority to sublicense the use of 'public' content to users who share it, Wood wrote. That makes copyright questions moot. By posting the photograph to her public Instagram account, Plaintiff made her choice.
OnlyFans, an Instagram-like website that allows adult content

15th April 2020. See article from spectator.co.uk
Julie Bindel is a British anti-porn campaigner. She has had a whinge about the OnlyFans website in the Spectator. She writes: OnlyFans.com (OF) is the latest kid on the block to be billed as a safe, consequence-free way of
selling sex and home-grown porn that empowers women. The social media site is similar to Instagram, but users pay to subscribe to creators' feeds. The top earners on OF are women whose subscribers are male. These men pay between
£5 to £20 a month to view images considered too pornographic for Instagram. Subscribers can also direct message women and pay tips to get personalised videos or photos, depending on his individual sexual tastes. OF is a huge money
machine and is doing extremely well during the Covid-19 lockdown. It now has around 17.5 million users worldwide and over 70,000 content creators, who have received over $150 million (£119 million) since its launch. Content providers keep 80% of their
income, while the company takes the remaining 20%. OnlyFans' subscription-based model has led some to claim that it is somehow empowering women. Outlets like the New York Times say it has put X-rated entertainment in the hands of
its entertainers and means content creators perform fewer sex acts. Others think that because OF has reduced physical sexual exploitation, it does not put women in danger.
But of course Bindel disagrees; should you want to read her argument, see the article from spectator.co.uk
Turkish government proposes new internet censorship law requiring social media companies to identify their users on request

11th April 2020. See article from ahvalnews.com
Turkish President Recep Tayyip Erdogan's government has proposed a draft law which seeks to attach a series of online censorship measures to an economic aid package aimed at stabilising an economy hit by the coronavirus crisis. The new law defines
social media platforms very widely, as the people or legal entities who allow users to create, view or share data like text, images, voice, location online with the purpose of social interaction, and states that they will be held responsible for any
inappropriate content that their users post on their platforms. The law will apply to any platform with more than 1 million users in Turkey. The draft law states: A foreign based social network provider that has access to more than 1 million
people in Turkey is responsible for assigning at least one authorised person as a representative in Turkey to register the notifications, declarations or requests sent by institutions, associations, legal and administrative offices and also to be
responsible for sharing the identity and communication information of this person [who has posted inappropriate content] with the institution. The Turkish government is aiming to effectively end anonymity on social media platforms. This is very
similar to what Western governments have attempted and failed to do already, because anonymity has always been a core part of the internet, and it is unrealistic to expect all social media sites to implement systems to confirm the ID of their users.
This part of the law seems intended to make it easier for the Turkish government to access data about social media users based in Turkey. Presumably, this would make it easier for them to obtain data on anonymous users of social media who are heavily
critical of the Turkish government. Turkey's Interior Ministry reported that 2,000 social media users had been identified and arrested for provocative social media posts related to the coronavirus outbreak at the end of March. The law also seeks to
impose fines on social media providers who do not respond to takedown requests. Such fines can be from as little as 100 Turkish lira ($15) to as much as 5 million lira ($746,500). One of the problems with this law will be how the Turkish government is
going to force foreign social media companies to set up legally responsible offices in Turkey when it is already threatening them with substantial fines. In 2016, the Turkish government asked PayPal to move its server operations to Turkey, but instead of complying, PayPal simply abandoned the market.
10th April 2020

Twitter Removes Privacy Option, and Shows Why We Need Strong Privacy Laws. See article from eff.org

10th April 2020

France reports that its implementation of the EU Copyright Directive requires Google to pay for links to French news sources. See article from politico.eu
Twitch reworks its dress code for games streamers

9th April 2020. See article from blog.twitch.tv
Twitch has evolved from streamers commentating on games they are playing towards something more about entertainment, personalities, and sexiness. The website has been trying to rein in these latter attributes and its latest move is to rework its dress
code. Twitch explains in a blog post: We are shifting from a garment-specific policy to one based on a standard level of coverage, with exceptions for certain situations. We've outlined these minimum levels of coverage
to increase clarity on expectations, so you're not left guessing what is or is not acceptable. We don't permit streamers to be fully or partially nude, including exposing genitals or buttocks. We do not permit the visible outline
of genitals, even when covered. Broadcasting nude or partially nude minors is always prohibited, regardless of context. For those who present as women, we ask that you cover your nipples. We do not permit exposed underbust.
Cleavage is unrestricted as long as these coverage requirements are met. For all streamers, you must cover the area extending from your hips to the bottom of your pelvis and buttocks. For those areas of the
body where coverage is required, the coverage must be fully opaque - sheer or partially see-through clothing does not constitute coverage.
India has quietly unblocked Pornhub

9th April 2020. See article from avn.com
After Indian Prime Minister Narendra Modi ordered the country into a nationwide shutdown, reports out of India indicated that at least some of the hundreds of porn sites blocked there since October 2018 were quietly coming back online. AVN.com reported
that though there was no official lifting of the porn ban, PornHub quietly became accessible to Indian internet users just a day after the stay-at-home order went into effect, albeit via the site's .org address. Locked-down Indian citizens have
been accessing PornHub content at a record rate since Modi's shutdown order, according to a report by India's Free Press Journal. Pornhub has reported a staggering 95% increase in traffic from India as of late last week. The tube site xHamster
reported a 20% rise in Indian traffic over the first three weeks in March. Some online commenters theorized that the government had quietly relaxed the national porn ban as an added incentive to keep Indians in their homes during the scheduled 21-day lockdown.
The government calls in social media companies for a meeting about quashing rumours about a link between coronavirus contagion and 5G

5th April 2020. See article from bbc.co.uk
The UK culture secretary is to order social media companies to be more aggressive in their response to conspiracy theories linking 5G networks to the coronavirus pandemic. Oliver Dowden plans to hold virtual meetings with representatives from several
tech firms next week to discuss the matter. It follows a number of 5G masts apparently being set on fire. A spokeswoman for the Department for Digital, Culture, Media and Sport told the BBC: We have received
several reports of criminal damage to phone masts and abuse of telecoms engineers apparently inspired by crackpot conspiracy theories circulating online. Those responsible for criminal acts will face the full force of the law. We
must also see social media companies acting responsibly and taking much swifter action to stop nonsense spreading on their platforms which encourages such acts.
Several platforms have already taken steps to address the problem but
have not banned discussion of the subject outright. It is not really very clear what the rumours are based upon beyond a correlation between big cities becoming SARS-CoV-2 hotspots and big cities being selected for the initial roll-out of 5G. But surely denser housing and the larger households found in big cities provide a more compelling reason for big cities being the hotspots. One could ask why western countries seem to be being hit hardest when the housing density argument would seem to make mega cities in the developing world more logical centres for the largest contagions, which doesn't seem to be happening so far.

Ofcom's unevidenced refutation

5th April 2020. See article from ofcom.org.uk
Ofcom has imposed a sanction on Uckfield Community Radio Limited after a discussion about the causes and origins of Covid-19 on its community radio station Uckfield FM was found to have breached broadcasting rules. The broadcaster must broadcast a
summary of our findings to its listeners. On 28 February 2020, Uckfield FM broadcast a discussion which contained potentially harmful claims about the coronavirus, including unfounded claims that the virus outbreak in Wuhan,
China was linked to the roll out of 5G technology. Ofcom's investigation concluded that the broadcaster failed to adequately protect listeners and had breached Rule 2.1 of the Ofcom Broadcasting Code. Given the seriousness of this
breach, Ofcom has directed the Licensee to broadcast a statement of Ofcom's findings on a date and in a form to be determined by Ofcom.

Disney continues to make the news with cut versions on Disney+

5th April 2020. See article from movie-censorship.com
Google is accused in court of downranking competitors in searches. A judge says that it must reveal its algorithms to prove its unlikely denial of the accusation

5th April 2020. See article from theregister.co.uk
Disney elects to use the cut UK version for its Disney+ channel

4th April 2020. See article from independent.co.uk
Lilo & Stitch is a 2002 USA children's cartoon comedy by Dean DeBlois and Chris Sanders. Starring Daveigh Chase, Chris Sanders and Tia Carrere.
The film follows an extra-terrestrial who impersonates
a dog and finds himself adopted by a young girl after arriving on Earth. Several moments in the film see the trouble-making pair get up to no good, with one particular scene showing Lilo hiding from her sister in a tumble dryer.
Disney+ users in the US have been left confused by an edit in the children's cartoon
Lilo & Stitch. Those re-watching the film, though, will see Lilo no longer hides in a dryer but within a piece of furniture that is bizarrely blocked by a pizza box. According to reports, the reason for the change was to avoid
the chance for children to emulate Lilo's dangerous behaviour, especially considering they're going to be cooped up indoors due to the coronavirus lockdown. In fact this is the UK version with cuts as demanded by the BBFC at the time of original
release. Presumably the international reach of the channel means that it must comply with multiple territories. And in this case maybe there was a sensible basis to the BBFC cuts anyway. The BBFC commented in 2002:
Distributor chose to remove sight of dangerous activity which might be copied by young children (child character emerging from a hiding place inside a washing machine or tumble drier) in order to achieve a U. A 12 uncut was
available to the distributor.
In response the producers reworked the cartoon so that the dryer was converted to a wooden cupboard.

4th April 2020

How EFF Evaluates Government Demands for New Surveillance Powers. By Adam Schwartz. See article from eff.org

3rd April 2020

The secret tech that lets government agencies collect masses of data from your apps. See article from privacyinternational.org
Opera introduces major updates to its blockchain browser on Android

2nd April 2020. See article from press.opera.com

See also Chinese Netizens Use Ethereum To Avoid China's COVID-19 Censorship from forbes.com
Web 3 is about rethinking the way we access data online. One of the important new Web 3 protocols which makes this possible is IPFS. IPFS is a protocol which allows you to store data on the web without having to rely on a single server or specific cloud service.

How does it work? Instead of asking the network for a file using its location, the browser can ask the network for a file using its cryptographic hash (unique to the file). IPFS then takes care of delivering the file to the browser, wherever it is stored. Each network node stores only the content it is interested in, plus some indexing information which helps figure out which node is storing what. When looking up a file to view or download, one asks the network to find the nodes that are storing the content behind a given file's hash. One doesn't, however, need to remember the hash as every file can be found by human-readable names using a decentralized naming system like Unstoppable Domains or the Ethereum Name System (ENS). This means that files, as well as websites, can be stored in a decentralized and secure way and accessed without relying on a single server -- a truly cloudless form of storage similar to BitTorrent.

Opera has worked directly with Protocol Labs, the main actor behind the development of the IPFS protocol, to integrate this experience into Opera for Android. Charles Hamel, Head of Crypto at Opera, commented:

Browsers have a critical role to play in Web 3 and we believe that integrating these new protocols into our popular browser will accelerate their adoption.
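As a rough sketch of the content-addressing idea described above (a simplified toy, not the real IPFS implementation, which chunks files, links them in a Merkle DAG and encodes hashes as CIDs): the key used to request a file is derived from the file's own bytes, so any node holding a matching copy can serve it.

```python
# Toy content-addressed store illustrating the IPFS idea: files are requested by the
# hash of their content, not by the location of a server. Real IPFS splits files into
# chunks, builds a Merkle DAG and encodes the hash as a CID; this sketch just uses a
# flat SHA-256 digest.

import hashlib

class ToyContentStore:
    def __init__(self):
        self._blocks = {}

    def add(self, data: bytes) -> str:
        """Store data and return its content address (the hash of the bytes)."""
        address = hashlib.sha256(data).hexdigest()
        self._blocks[address] = data
        return address

    def get(self, address: str) -> bytes:
        """Any node holding a block with this hash can answer the request; the
        retrieved bytes can be re-hashed to verify they really are the right file."""
        data = self._blocks[address]
        assert hashlib.sha256(data).hexdigest() == address
        return data


node = ToyContentStore()
cid = node.add(b"<html>my decentralized site</html>")
print(cid)             # the same bytes always hash to the same address
print(node.get(cid))   # fetch by content, not by server location
```

A naming layer such as ENS or Unstoppable Domains then simply maps a human-readable name to the latest content hash, so visitors never have to handle the hash directly.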
2nd April 2020

Everyone seems to be writing an app for coronavirus surveillance, and the EU is no exception. See article from politico.eu