And finding it in draft Australian censorship codes
27th October 2024

See article from theguardian.com
The Australian internet industry has produced draft censorship rules related to age/ID verification, scheduled to come into force in 2025. One rule that has caught attention is that search engines will be required to age/ID verify users before links to porn or gambling sites can be provided.

The draft codes will apply to websites, social media, video games, search engines, gaming companies, app developers and internet service providers, among others. As is the case in most other countries, the authorities are refusing to specify exactly which age/ID verification mechanisms will be acceptable, leaving companies to take enormous commercial risks in guessing what will be accepted. Examples of options include checking photo ID, facial age estimation, credit card checks, digital ID wallets or systems, or attestation by a parent or guardian.

The codes have been developed by the Australian Mobile Telecommunications Association (Amta), the Communications Alliance, the Consumer Electronics Suppliers Association (CESA), the Digital Industry Group Inc. (Digi), and the Interactive Games and Entertainment Association (IGEA). Dr Jennifer Duxbury, Digi's director for policy, regulatory affairs, and research, told Guardian Australia that the group doesn't speak for the porn industry, and added: I can't predict what their reaction might be, whether they would withdraw from the market, or what's the likely outcome.
French court gives porn websites 15 days to implement censorship via age verification
20th October 2024

See article from avn.com
A French Court of Appeals in Paris has ruled that certain porn websites are subject to a national age verification requirement adopted under a 2020 French law. The ruling applies to websites that don't operate in European Union member states. Tukif, xHamster, MrSexe, and IciPorno, all non-EU platforms, must adopt more rigorous age verification measures within 15 days or else they will be blocked by French ISPs.

Porn websites are meanwhile under pressure from the EU and are currently challenging rules under the Digital Services Act. The Court of Appeal ruled that: Children's general interest was an overriding consideration which may justify infringement of other rights such as freedom of expression or communication. Giving priority to the protection of the private lives of adult consumers, by ruling out age verification, is incompatible with the protection of minors.

AVN also reported on a national age verification requirement granting Arcom, the audiovisual and internet censor for France, the right to enforce age verification rules.
Ofcom announces a timetable for UK age verification censorship rules and implementation for porn websites
17th October 2024

See article from ofcom.org.uk
Ofcom writes: Parliament set us a deadline of April 2025 to finalise our codes and guidance on illegal harms and children's safety. We will finalise our illegal harms codes and guidance ahead of this deadline. Our expected timing for key milestones over the next year -- which could change -- includes:

December 2024: Ofcom will publish first edition illegal harms codes and guidance. Platforms will have three months to complete illegal harms risk assessments.

January 2025: Ofcom will finalise children's access assessment guidance and guidance for pornography providers on age assurance. Platforms will have three months to assess whether their service is likely to be accessed by children.

February 2025: Ofcom will consult on best practice guidance on protecting women and girls online, earlier than previously planned.

March 2025: Platforms must complete their illegal harms risk assessments and implement appropriate safety measures.

April 2025: Platforms must complete children's access assessments. Ofcom to finalise children's safety codes and guidance. Companies will have three months to complete children's risk assessments.

Spring 2025: Ofcom will consult on additional measures for second edition codes and guidance.

July 2025: Platforms must complete children's risk assessments and make sure they implement appropriate safety measures.

We will review selected risk assessments to ensure they are suitable and sufficient, in line with our guidance, and seek improvements where we believe firms have not adequately mitigated the risks they face.

Ready to take enforcement action. Ofcom has the power to take enforcement action against platforms that fail to comply with their new duties, including imposing significant fines where appropriate. In the most serious cases, Ofcom will be able to seek a court order to block access to a service in the UK, or limit its access to payment providers or advertisers. We are prepared to take strong action if tech firms fail to put in place the measures that will be most impactful in protecting users, especially children, from serious harms such as those relating to child sexual abuse, pornography and fraud.
A new Californian law will have far reaching effects, censoring social media and requiring widespread age verification
3rd October 2024

See article from latimes.com
California's governor Gavin Newsom has signed a wide-ranging bill to limit the ability of social media companies to provide feeds to minors that politicians claim to be addictive. Newsom signed Senate Bill 976, named the Protecting Our Kids From Social Media Addiction Act and introduced by state Senator Nancy Skinner. Of course the fundamental social media 'algorithm' is to provide a user with more of the content that they have shown they enjoy. Politicians and campaigners would clearly prefer that users instead get a feed of what they 'should' be enjoying.

The legislation was widely opposed by groups including the American Civil Liberties Union of California, Equality California and associations representing the industry giants that own TikTok, Instagram and Facebook. The California Chamber of Commerce argued that the legislation unconstitutionally burdens access to lawful content, setting up the potential for another lawsuit in an ongoing court battle between the state and social media companies over use of the platforms by children.

The bill, which will take effect Jan. 1, 2027, with Newsom's signature, prohibits internet services and applications from providing addictive feeds, defined as media curated based on information gathered on or provided by the user, to minors without parental consent. SB 976 also bans companies from sending notifications to users identified as minors between midnight and 6 a.m., or during the school day from 8 a.m. to 3 p.m., unless parents give the OK. The bill will effectively require companies to make posts from people children know and follow appear in chronological order on their social media feeds instead of in an arrangement to maximize engagement.

The bill doesn't specifically mandate age verification, but the policies outlined above do require the internet companies to know the age of a user, whether specified or not.