French court rejects request to force ISP blocking of major porn websites
25th May 2022

See article from xbiz.com
See article from hitechwiki.com
A Paris appeals court has rejected requests to block the most popular adult tube sites in the country. After months of statements and threats pressuring tube sites to implement undefined age verification schemes, the French media censor ARCOM went to court in March and April to demand that French ISPs block Pornhub, xHamster, xVideos, Cliphunter and Xnxx. Meanwhile two morality campaign groups are also pushing for ISP blocking through the courts, and their blocking requests add YouPorn, RedTube and Tukif to the list. ARCOM had sent formal notices to the tube site operators demanding that they find a more robust solution than a simple declaration of age. The sites did not comply, seemingly because ARCOM had never defined which age verification methods would be acceptable. French reports now reveal that the Council of State, an appeals court, has issued a ruling rejecting ARCOM's blocking requests. The reasons for the rejection are not yet wholly clear, but the most plausible is that blocking should be a last resort rather than a first option. It seems that ARCOM should be doing more work to define what age verification measures the porn websites should be taking.
Europe's proposed laws could undermine end-to-end encryption for billions of people.
11th May 2022

See article from wired.com
See EU snooping law [pdf] from alecmuffett.com
An upcoming EU law has been leaked that would require big tech companies to scan the private messages of all their users, regardless of any end-to-end encryption technology in use. Of course the EU cites child porn and grooming as the nominal justification, but once messages are being scanned, I am sure that governments will demand that the tech companies hand them over for a much wider range of reasons than those claimed. Under the plans, tech companies -- ranging from web hosting services to messaging platforms -- can be ordered to detect both new and previously discovered child sexual abuse material (CSAM) as well as potential instances of grooming. The detection could take place in chat messages, files uploaded to online services, or on websites that host abusive material. The plans echo an effort by Apple last year to scan photos on people's iPhones for abusive content before upload to iCloud; Apple paused its efforts after a widespread backlash. If passed, the European legislation would require tech companies to conduct risk assessments for their services, gauging the levels of CSAM on their platforms and their existing prevention measures. If necessary, regulators or courts could then issue detection orders requiring tech companies to install and operate technologies to detect CSAM. The draft legislation doesn't specify what technologies must be installed or how they will operate -- these will be vetted by a new EU Centre -- but says they should be used even when end-to-end encryption is in place. Read the full details in the article from wired.com.
Finland prosecutors get wound up about hateful passages in the Bible
4th May 2022

See article from reclaimthenet.org
Finland's top prosecutor has said she will appeal a court ruling that rejected her hate speech allegation against a politician who posted anti-gay Bible verses on Twitter. The Prosecutor General argued that Päivi Räsänen, a member of parliament, broke Finland's hate speech laws by using Christian theology in a public debate on radio and in tweets. In a tweet to the leaders of the Evangelical Lutheran Church of Finland, Räsänen posted a passage from Romans (1:24-27) saying:
Because of this, God gave them over to shameful lusts. Even their women exchanged natural sexual relations for unnatural ones. In the same way the men also abandoned natural relations with women and were inflamed with lust
for one another. Men committed shameful acts with other men, and received in themselves the due penalty for their error.
Räsänen had also been accused over her criticism of Finland's state church for hosting an LGBT event, and over a booklet she wrote titled Male and Female He Created Them.
After the prosecutor announced she would appeal the ruling, Räsänen said: The prosecutor's decision to appeal the acquittal verdict may lead to the case going all the way to the Supreme Court, giving the
possibility of securing precedent protecting freedom of speech and religion for all Finnish people. Also I am happy that this decision will lead to the discussion of the teaching of the Bible continuing in Finnish society. I am ready to defend freedom of
speech and religion in all necessary courts, and as far as the European Court of Human Rights.
4th May 2022

The Digital Services Act will be the envy of autocrats the world over. By Andrew Tettenborn
See article from spiked-online.com
Council of Europe calls for porn blocking software to be installed on all personal devices
27th April 2022

See article from xbiz.com
The Council of Europe is the pan-European organisation (wider than the EU) best known for running the European Court of Human Rights. Now the Parliamentary Assembly of the Council of Europe has issued a resolution urging European nations to mandate online filters for pornographic materials on all devices, to be systematically activated in public spaces such as schools, libraries and youth clubs. The parliamentarians expressed deep concern at: The unprecedented exposure of children to pornographic imagery, which is detrimental to their psychological and physical development. This exposure brings increased risks of harmful gender stereotyping, addiction to pornography and early, unhealthy sex.
The parliamentarians didn't offer any definitions of what they consider unhealthy sex or pornographic materials, nor did they explain how these mandatory filters would be coded and by whom. The Council's statement invites member states to examine the existing means and provisions to combat children's exposure to pornographic content and to address the gaps in relevant legislation and practice with a view to better protecting children. It calls for relevant legislation to ensure that both dedicated websites hosting adult content, and mainstream and social media which include adult content, are obliged to use age verification tools. The resolution also advocates the introduction of an alert button or similar solutions for children to report accidental access to pornographic content, and envisages follow-up actions such as warnings or penalties for relevant websites.
Don't hold ordinary social media users responsible for other users' responses
27th April 2022

See CC article from eff.org
Courts and legislatures around the globe are hotly debating to what degree online intermediaries--the chain of entities that facilitate or support speech on the internet--are liable for the content they help publish. One thing they should not be
doing is holding social media users legally responsible for comments posted by others to their social media feeds, EFF and Media Defence told the European Court of Human Rights (ECtHR). Before the court is the case Sanchez v. France, in which a politician argued that his right to freedom of expression was violated when he was subjected to a criminal fine for not promptly deleting hateful comments posted on the "wall" of his Facebook account by others. The
ECtHR's Chamber, a judicial body that hears most of its cases, found there was no violation of freedom of expression, extending its rules for online intermediaries to social media users. The politician is seeking review of this decision by ECtHR's Grand
Chamber, which only hears its most serious cases. EFF and Media Defence, in an amicus brief submitted to the Grand Chamber, asked it to revisit the Chamber's expansive interpretation of how intermediary liability rules should
apply to social media users. Imposing liability on them for third-party content will discourage social media users, especially journalists, human rights defenders, civil society actors, and political figures, from using social media platforms, as they
are often targeted by governments seeking to suppress speech. Subjecting these users to liability would make them vulnerable to coordinated attacks on their sites and pages meant to trigger liability and removal of speech, we told the court.
Further, ECtHR's current case law does not support and should not apply to social media users who act as intermediaries, we said. The ECtHR laid out its intermediary liability rules in Delfi A.S. v. Estonia, which concerned
the failure of a commercial news media organization to monitor and promptly delete "clearly unlawful" comments online. The ECtHR rules consider whether the third-party commenters can be identified, and whether they have any control over their
comments once they submit them. In stark contrast, Sanchez concerns the liability of an individual internet user engaged in non-commercial activity. The politician was charged with incitement to hatred or violence against a
group of people or an individual on account of their religion based on comments others posted on his Facebook wall. The people who posted the comments were convicted of the same criminal offence, and one of them later deleted the allegedly unlawful
comments. What's more, the decision about what online content is "clearly unlawful" is not always straightforward, and generally courts are best placed to assess the lawfulness of the online content. While social media
users may be held responsible for failing or refusing to comply with a court order compelling them to remove or block information, they should not be required to monitor content on their accounts to avoid liability, nor should they be held liable simply
when they get notified of allegedly unlawful speech on their social media feeds by any method other than a court order. Imposing liability on an individual user, without a court order, to remove the allegedly unlawful content in question will be
disproportionate, we argued. Finally, the Grand Chamber should decide whether imposing criminal liability for third party content violates the right to freedom of expression, given the peculiar circumstances in this case. Both the
applicant and the commenters were convicted of the same offence a decade ago. EFF and Media Defence asked the Grand Chamber to assess the quality of the decades-old laws--one dating back to 1881--under which the politician was convicted, saying criminal
laws should be adapted to meet new circumstances, but these changes must be precise and unambiguous to enable someone to foresee what conduct would violate the law. Subjecting social media users to criminal responsibility for
third-party content will lead to over-censorship and prior restraint. The Grand Chamber should limit online intermediary liability, and not chill social media users' right to free expression and access to information online. You can read our
amicus brief here: https://www.eff.org/document/sanchez-v-france-eff-media-defence-ecthr-brief
The EU is moving towards the conclusion of its new internet censorship law, the Digital Services Act
22nd April 2022

See article from nytimes.com
The European Union is nearing the conclusion of internet censorship legislation that would force Facebook, YouTube and other internet services to censor 'misinformation', disclose how their services' algorithms work, and stop targeting online ads based on a person's ethnicity, religion or sexual orientation. The law, called the Digital Services Act, is intended to make platforms police social media more aggressively for content deemed unacceptable, or risk billions of dollars in fines. Tech companies would be compelled to set up new policies and procedures to remove flagged hate speech, terrorist propaganda and other material defined as illegal by countries within the European Union. The Digital Services Act is part of a one-two punch by the European Union to address the societal and economic effects of the tech giants. Last month, the 27-nation bloc agreed to a different sweeping law, the Digital Markets Act, to counter what regulators see as anticompetitive behavior by the biggest tech firms, including their grip over app stores, online advertising and internet shopping. The new law is ambitious, but the EU is noted for producing crap legislation that doesn't work in practice. Lack of enforcement of the European Union's data privacy law, the General Data Protection Regulation (GDPR), has cast a shadow over the new law. Like the Digital Services Act and Digital Markets Act, GDPR was hailed as landmark legislation. But since it took effect in 2018, there has been little action against Facebook, Google and others over their data-collection practices. Many have sidestepped the rules by bombarding users with consent windows on their websites.
Macron continues to call for ID verification before people are allowed to use social media
20th April 2022

See article from reclaimthenet.org
Two weeks before the French presidential election, President Emmanuel Macron reopened the debate on ending online anonymity. The president is open to the idea of dismantling foreign platforms if they do not require users to verify their identity before they can post. Macron told Le Point last week: In a democratic society, there should be no anonymity. You can't walk around in the street wearing a hood. On the Internet, people allow themselves, because they are hooded behind a pseudonym, to say the most abject things.
Macron began his campaign against online anonymity in January 2019, saying it was time to move towards a gradual lifting of all forms of anonymity. In the latest interview, Macron attacked US Big Tech platforms, claiming: They come to use our ancient or post-revolutionary freedoms to divert them from their essence. We need to create a public order, like in the street. This is not the state of nature. On social media networks, you can kill reputations, spread false news, drive people to suicide.
Macron hopes that the Digital Markets Act and Digital Services Act will be a solution to the problems of online anonymity and Big Tech antitrust practices.