Labour government inevitably continues with creation of online ID system
3rd November 2024

See article from reclaimthenet.org
The UK government has launched the Office for Digital Identities and Attributes (OfDIA), a group within the Department for Science, Innovation and Technology tasked with encouraging the growth of the digital ID market under the leadership of chief executive Hannah Rutter. In fact the Labour government is continuing with a Tory idea first announced by the previous government in 2022. Rutter claimed that digital identity can make people's lives easier and unlock billions of pounds of economic growth. Rutter made sure to address one of the criticisms regarding the security of such schemes, centralization, by saying that the system her office is working on does not have a centralized digital database. Currently, OfDIA is working to create a trusted and secure digital identity market. This work focuses on five areas, starting with developing and maintaining the digital identity and attributes trust framework, and then being in charge of a register of accredited organizations that meet the framework's requirements.
And finding it in draft Australian censorship codes
27th October 2024

See article from theguardian.com
The Australian internet industry has produced draft censorship rules related to age/ID verification, scheduled to come into force in 2025. One of the rules that has caught attention is that search engines will be required to age/ID verify users before links to porn or gambling sites can be provided. The draft codes will apply to websites, social media, video games, search engines, gaming companies, app developers and internet service providers, among others. As is the case in most other countries, the authorities are refusing to specify exactly which age/ID verification mechanisms will be acceptable, leaving companies to take enormous commercial risks in guessing. Examples of options include checking photo ID, facial age estimation, credit card checks, digital ID wallets or systems, or attestation by a parent or guardian. The codes have been developed by the Australian Mobile Telecommunications Association (Amta), the Communications Alliance, the Consumer Electronics Suppliers Association (CESA), the Digital Industry Group Inc. (Digi), and the Interactive Games and Entertainment Association (IGEA). Dr Jennifer Duxbury, Digi's director for policy, regulatory affairs and research, told Guardian Australia that the group doesn't speak for the porn industry, adding: I can't predict what their reaction might be, whether they would withdraw from the market, or what's the likely outcome.
French court gives porn websites 15 days to implement censorship via age verification
20th October 2024

See article from avn.com
The Paris Court of Appeal has ruled that certain porn websites are subject to a national age verification requirement adopted under a 2020 French law. The ruling applies to websites that are not based in European Union member states. Tukif, xHamster, MrSexe and IciPorno, all non-EU platforms, must adopt more rigorous age verification measures within 15 days or else they will be blocked by French ISPs.
Porn websites are also under pressure from the EU, and are currently challenging rules under the Digital Services Act. The Court of Appeal ruled that: Children's general interest was an overriding consideration which may justify infringement of other rights such as freedom of expression or communication. Giving priority to the protection of the private lives of adult consumers, by ruling out age verification, is incompatible with the protection of minors.

AVN also reported on a national age verification law granting Arcom, France's audiovisual and internet censor, the power to enforce age verification rules.
Ofcom announces a timetable for UK age verification censorship rules and implementation for porn websites
17th October 2024

See article from ofcom.org.uk
Ofcom writes: Parliament set us a deadline of April 2025 to finalise our codes and guidance on illegal harms and children's safety. We will finalise our illegal harms codes and guidance ahead of this deadline. Our expected timings for key milestones over the next year (which could change) include:

- December 2024: Ofcom will publish first edition illegal harms codes and guidance. Platforms will have three months to complete illegal harms risk assessments.
- January 2025: Ofcom will finalise children's access assessment guidance and guidance for pornography providers on age assurance. Platforms will have three months to assess whether their service is likely to be accessed by children.
- February 2025: Ofcom will consult on best practice guidance on protecting women and girls online, earlier than previously planned.
- March 2025: Platforms must complete their illegal harms risk assessments, and implement appropriate safety measures.
- April 2025: Platforms must complete children's access assessments. Ofcom to finalise children's safety codes and guidance. Companies will have three months to complete children's risk assessments.
- Spring 2025: Ofcom will consult on additional measures for second edition codes and guidance.
- July 2025: Platforms must complete children's risk assessments, and make sure they implement appropriate safety measures.

We will review selected risk assessments to ensure they are suitable and sufficient, in line with our guidance, and seek improvements where we believe firms have not adequately mitigated the risks they face.

Ready to take enforcement action

Ofcom has the power to take enforcement action against platforms that fail to comply with their new duties, including imposing significant fines where appropriate. In the most serious cases, Ofcom will be able to seek a court order to block access to a service in the UK, or limit its access to payment providers or advertisers. We are prepared to take strong action if tech firms fail to put in place the measures that will be most impactful in protecting users, especially children, from serious harms such as those relating to child sexual abuse, pornography and fraud.