Parliamentary committee whinges about the lack of age verification in games and their monetisation via loot boxes
12th September 2019

See article from parliament.uk
Call to regulate video game loot boxes under gambling law and ban their sale to children among measures needed to protect players, say MPs. Lack of honesty and transparency reported among representatives of some games and social media companies in giving
evidence. The wide-ranging report calls upon games companies to accept responsibility for addictive gaming disorders, protect their players from potential harms due to excessive play-time and spending, and along with social media
companies introduce more effective age verification tools for users. The immersive and addictive technologies inquiry investigated how games companies operate across a range of social media platforms and other technologies,
generating vast amounts of user data and operating business models that maximise player engagement in a lucrative and growing global industry.

- Sale of loot boxes to children should be banned
- Government should regulate loot boxes under the Gambling Act
- Games industry must face up to responsibilities to protect players from potential harms
- Industry levy to support independent research on long-term effects of gaming
- Serious concern at lack of effective system to keep children off age-restricted platforms and games

MPs on the Committee have previously called for a new Online Harms regulator to hold social media platforms accountable for content or activity that harms individual users. They say the new
regulator should also be empowered to gather data and take action regarding addictive games design from companies and behaviour from consumers. E-sports, competitive games played to an online audience, should adopt and enforce the same duty of care
practices enshrined in physical sports. Finally, the MPs say social media platforms must have clear procedures to take down misleading deep-fake videos, an obligation they want to be enforced by a new Online Harms regulator. In
a first for Parliament, representatives of major games including Fortnite maker Epic Games and social media platforms Snapchat and Instagram gave evidence on the design of their games and platforms. DCMS Committee Chair Damian
Collins MP said: Social media platforms and online games makers are locked in a relentless battle to capture ever more of people's attention, time and money. Their business models are built on this, but it's time for
them to be more responsible in dealing with the harms these technologies can cause for some users. Loot boxes are particularly lucrative for games companies but come at a high cost, particularly for problem gamblers, while
exposing children to potential harm. Buying a loot box is playing a game of chance and it is high time the gambling laws caught up. We challenge the Government to explain why loot boxes should be exempt from the Gambling Act. Gaming contributes to a global industry that generates billions in revenue. It is unacceptable that some companies, with millions of users and children among them, should be so ill-equipped to talk to us about the potential harm of their products.
Gaming disorder based on excessive and addictive game play has been recognised by the World Health Organisation. It's time for games companies to use the huge quantities of data they gather about their players, to do more to
proactively identify vulnerable gamers. Both games companies and the social media platforms need to establish effective age verification tools. They currently do not exist on any of the major platforms which rely on
self-certification from children and adults. Social media firms need to take action against known deepfake films, particularly when they have been designed to distort the appearance of people in an attempt to maliciously damage
their public reputation, as was seen with the recent film of the Speaker of the US House of Representatives, Nancy Pelosi.
Regulate 'loot boxes' under the Gambling Act: Loot box
mechanics were found to be integral to major games companies' revenues, with further evidence that they facilitated profits from problem gamblers. The Report found current gambling legislation that excludes loot boxes because they do not meet the
regulatory definition failed to adequately reflect people's real-world experiences of spending in games. Loot boxes that can be bought with real-world money and do not reveal their contents in advance should be considered games of chance played for
money's worth and regulated by the Gambling Act. Evidence from gamers highlighted the loot box mechanics in Electronic Arts's FIFA series with one gamer disclosing spending of up to £1000 a year. The Report
calls for loot boxes that contain the element of chance not to be sold to children playing games and instead be earned through in-game credits. In the absence of research on potential harms caused by exposing children to gambling, it calls for the
precautionary principle to apply. In addition, better labelling should ensure that games containing loot boxes carry parental advisories or descriptors outlining that they feature gambling content.
The Government should bring forward regulations under section 6 of the Gambling Act 2005 in the next parliamentary session to specify that loot boxes are a game of chance. If it determines not to regulate loot boxes under the Act
at this time, the Government should produce a paper clearly stating the reasons why it does not consider loot boxes paid for with real-world currency to be a game of chance played for money's worth. UK Government should
advise PEGI to apply the existing 'gambling' content labelling, and corresponding age limits, to games containing loot boxes that can be purchased for real-world money and do not reveal their contents before purchase.
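To make the 'game of chance' point concrete, the sketch below models a paid loot box as a weighted random draw. The item names, drop rates, price and notional values are invented purely for illustration; real games rarely publish their odds and this is not a description of any particular title.

```python
import random

# Hypothetical drop table: item name -> (probability, notional market value in £).
# All rates and values are invented for illustration.
DROP_TABLE = {
    "common player":     (0.80, 0.10),
    "rare player":       (0.18, 2.00),
    "ultra-rare player": (0.02, 40.00),
}

BOX_PRICE = 1.99  # real money paid per box (illustrative figure)


def open_loot_box() -> tuple[str, float]:
    """Resolve one paid box to a random item, exactly like a weighted lottery draw."""
    items = list(DROP_TABLE)
    weights = [DROP_TABLE[item][0] for item in items]
    item = random.choices(items, weights=weights, k=1)[0]
    return item, DROP_TABLE[item][1]


if __name__ == "__main__":
    spend, value = 0.0, 0.0
    for _ in range(100):  # a player buying 100 boxes
        _item, worth = open_loot_box()
        spend += BOX_PRICE
        value += worth
    print(f"Spent £{spend:.2f}, notional return £{value:.2f}")
```

Even in this toy model the player pays a fixed price for an uncertain reward whose value is only realised through chance, which is the characteristic the Committee wants brought within the Gambling Act.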
Safeguarding younger players: With three-quarters of those aged 5 to 15 playing online games, MPs express serious concern at the lack of an effective system to keep children off age-restricted platforms
and games. Evidence received highlighted challenges with age verification and suggested that some companies are not enforcing age restrictions effectively. Legislation may be needed to protect children from playing games that are
not appropriate for their age. The Report identifies inconsistencies in age-ratings stemming from the games industry's self-regulation around the distribution of games. For example, online games are not subject to a legally enforceable age-rating system
and voluntary ratings are used instead. Games companies should not assume that the responsibility to enforce age-ratings applies exclusively to the main delivery platforms: all companies and platforms that are making games available online should uphold
the highest standards of enforcing age-ratings.
MPs and campaigners call for 'misogyny' to be defined as an 'online harm' requiring censorship by social media. What could go wrong?

7th September 2019

See article from theguardian.com
MPs and activists have urged the government to protect women through censorship. They write in a letter: Women around the world are 27 times more likely to be harassed online than men. In Europe, 9 million girls have experienced
some kind of online violence by the time they are 15 years old. In the UK, 21% of women have received threats of physical or sexual violence online. The basis of this abuse is often, though not exclusively, misogyny. Misogyny
online fuels misogyny offline. Abusive comments online can lead to violent behaviour in real life. Nearly a third of respondents to a Women's Aid survey said that where threats had been made online by a partner or ex-partner, they were carried out. Along with physical abuse, misogyny online has a psychological impact. Half of girls aged 11-21 feel less able to share their views due to fear of online abuse, according to Girlguiding UK. The government wants to make Britain the
safest place in the world to be online, yet in the online harms white paper, abuse towards women online is categorised as harassment, with no clear consequences, whereas similar abuse on the grounds of race, religion or sexuality would trigger legal
protections. If we are to eradicate online harms, far greater emphasis in the government's efforts should be directed to the protection and empowerment of the internet's single largest victim group: women. That is why we back the
campaign group Empower's calls for the forthcoming codes of practice to include and address the issue of misogyny by name, in the same way as they would address the issue of racism by name. Violence against women and girls online is not harassment.
Violence against women and girls online is violence.

- Ali Harris, Chief executive, Equally Ours
- Angela Smith MP, Independent
- Anne Novis, Activist
- Lorely Burt, Liberal Democrat, House of Lords
- Ruth Lister, Labour, House of Lords
- Barry Sheerman MP, Labour
- Caroline Lucas MP, Green
- Daniel Zeichner MP, Labour
- Darren Jones MP, Labour
- Diana Johnson MP, Labour
- Flo Clucas, Chair, Liberal Democrat Women
- Gay Collins, Ambassador, 30% Club
- Hannah Swirsky, Campaigns officer, René Cassin
- Joan Ryan MP, Independent Group for Change
- Joe Levenson, Director of communications and campaigns, Young Women's Trust
- Jonathan Harris, House of Lords, Labour
- Luciana Berger MP, Liberal Democrats
- Mandu Reid, Leader, Women's Equality Party
- Maya Fryer, WebRoots Democracy
- Preet Gill MP, Labour
- Sarah Mann, Director, Friends, Families and Travellers
- Siobhan Freegard, Founder, Channel Mum
- Jacqui Smith, Empower
Offsite Patreon Comment: What will go wrong? See subscription article from patreon.com
Group of parliamentarians rant against DNS over HTTPS in a letter to the press

12th August 2019
Web browser risk to child safety

We are deeply concerned that a new form of encryption being introduced to our web browsers will have terrible consequences for child protection. The new system, known as DNS over HTTPS, would have the effect of undermining the work of the Internet Watch Foundation (IWF); yet Mozilla, provider of the Firefox browser, has decided to introduce it, and others may follow. The amount of
abusive content online is huge and not declining. Last year, the IWF removed more than 105,000 web pages showing the sexual abuse of children. While the UK has an excellent record in eliminating the hosting of such illegal content, there is still a
significant demand from UK internet users: the National Crime Agency estimates there are 144,000 internet users on some of the worst dark-web child sexual abuse sites. To fight this, the IWF provides a URL block list that allows
internet service providers to block internet users from accessing known child sexual abuse content until it is taken down by the host country. The deployment of the new encryption system in its proposed form could render this service obsolete, exposing
millions of people to the worst imagery of children being sexually abused, and the victims of said abuse to countless sets of eyes. Advances in protecting users' data must not come at the expense of children. We urge the secretary
of state for digital, culture, media and sport to address this issue in the government's upcoming legislation on online harms.
- Sarah Champion MP;
- Tom Watson MP;
- Carolyn Harris MP;
- Tom Brake MP;
- Stephen Timms MP;
- Ian Lucas MP;
- Tim Loughton MP;
- Giles Watling MP;
- Madeleine Moon MP;
- Vicky Ford MP;
- Rosie Cooper MP;
- Baroness Howe;
- Lord Knight;
- Baroness Thornton;
- Baroness Walmsley;
- Lord Maginnis;
- Baroness Benjamin;
- Lord Harris of Haringey
The IWF service is continually rolled out as an argument against DoH, but I am starting to wonder if it is still relevant. Given the universal revulsion against child sex abuse, I'd suspect that little of it would now be located on the open internet. Surely it would be hiding away in hard-to-find places like the dark web, which are unlikely to be stumbled upon by normal people. And of course those using the dark web aren't using ISP DNS servers anyway. In reality, the point of using DoH is to evade government attempts to block legal porn sites. If they weren't intending to block legal sites, then surely people would be happy to use the ISP DNS, including the IWF service.
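For readers unfamiliar with the mechanism, a rough sketch of why DoH sidesteps ISP-level filtering is shown below: the DNS lookup travels inside an ordinary HTTPS request to a public resolver (Cloudflare's JSON DoH endpoint is used here purely as an example), so the ISP's own resolvers, and any IWF block list applied there, are never consulted. This is a minimal illustration of the protocol, not of how Firefox or any other browser actually implements it.

```python
import requests

# Resolve a hostname over DNS-over-HTTPS instead of via the ISP's DNS servers.
# Cloudflare's public DoH endpoint is used purely as an example resolver.
DOH_ENDPOINT = "https://cloudflare-dns.com/dns-query"


def resolve_over_https(hostname: str) -> list[str]:
    """Return the A records for hostname, fetched via an HTTPS request.

    To the ISP this looks like ordinary encrypted web traffic to the resolver,
    so DNS-based block lists applied at the ISP's own resolver never see the query.
    """
    resp = requests.get(
        DOH_ENDPOINT,
        params={"name": hostname, "type": "A"},
        headers={"accept": "application/dns-json"},
        timeout=5,
    )
    resp.raise_for_status()
    answers = resp.json().get("Answer", [])
    return [a["data"] for a in answers if a.get("type") == 1]  # type 1 = A record


if __name__ == "__main__":
    print(resolve_over_https("example.com"))
```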
UK Gambling Commission recognises issues with game monetisation but does not consider loot boxes to be gambling, so they fall outside its remit

24th July 2019

See article from bbc.com
The UK Gambling Commission has told MPs that it does not currently oversee the purchase of in-game content like Fifa player packs and video game loot boxes. This is because there is no official way to monetise what is inside them. A prize has to
be either money or have monetary value in order for it to fall under gambling legislation. However, there are unauthorised third party sites which buy and sell in-game content or enable it to be used as virtual currency. Gambling Commission
programme director Brad Enright admitted that games publisher EA, which sells the football team management game Fifa, faced a constant battle against unauthorised secondary markets. Dozens of parents have complained that their children are
spending hundreds of pounds on in-game purchases, and have criticised the process as a form of gambling, as there is an element of chance in the outcome and their children are then tempted to buy again in order to try to get the result they want.

Gambling Commission chief executive Neil McCarthur admitted that there were significant concerns around children playing video games in which there were elements of expenditure and chance. However, he added that under current legislation it did not classify as gambling.
DCMS will consult about online ID cards so that your porn viewing and all your PC misdemeanours on social media can be logged against your social score

14th July 2019

See House of Commons Committee Report [pdf] from publications.parliament.uk
Despite concern among some groups of witnesses, a shift in the UK Government's position seems to be on the horizon. The Minister for Digital and the Creative Industries, for example, implied support for a universal digital ID in a recent interview with The Daily Telegraph in 2019: I think there are advantages of a universally acclaimed digital ID system which nowhere in the world has yet. There is a great prize to be won once the technology and the public's confidence are reconciled.
On 11 June 2019, DCMS and the Cabinet Office announced their intention to launch a consultation on digital identity verification in the coming weeks. The following actions were set
out:
- A consultation to be issued in the coming weeks on how to deliver the effective organisation of the digital identity market.
- The creation of a new Digital Identity Unit, which is a collaboration between DCMS and Cabinet Office. The Unit will help bring the public and private sector together, ensure the adoption of interoperable standards, specification and schemes, and deliver on the outcome of the consultation.
- The start of engagement on the commercial framework for using digital identities from the private sector for the period from April 2020 to ensure the continued delivery of public services.
Single unique identifiers for citizens can transform the efficiency and transparency of Government services. We welcome the Government's announcement in June 2019 that it will consult shortly on digital identity. While we recognise
that in the UK there are concerns about some of the features of a single unique identifier, as demonstrated by the public reaction to the 2006 Identity Card Act, we believe that the Government should recognise the value of consistent identity
verification. The Government should facilitate a national debate on single unique identifiers for citizens to use for accessing public services along with the right of the citizen to know exactly what the Government is doing with their data.
Offsite Comment: Privacy International explains some of the reasons why this is a bad idea. 14th July 2019. See
article from privacyinternational.org
The debate shouldn't be about having insight into how your identifier is used. It should be about making sure that identifiers are never usable. After all, any unique identifier will not be limited to government use. Whether through design or commercial necessity, any such number will also find its way into the private sector. This was another fear highlighted in the mid-2000s, but it has played out elsewhere. For example, the Indian Supreme Court, in its ruling on the Aadhaar system that provided a unique number to more than a billion people, found that there were dangers in its use in the private sector: Allowing private entities to use Aadhaar numbers will lead to commercial exploitation of an individual's personal data without his/her consent and could lead to individual profiling. Given everything that's happened since, the 13 years since the 2006 ID Card Act (which was repealed in 2010) can seem like a lifetime. But it's clear that the concerns expressed then remain prescient now. Now that we know so much more about the risks that the exploitation of people's data poses - and the targeting, profiling and manipulating of individuals and groups - we should be even more fearful today of such a system than we were a decade ago. Furthermore, it's been shown that we do not need such a unique identifier for people to securely access government services online, and it's on such concepts we must build going forward. See the full
article from privacyinternational.org
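Privacy International's closing point, that people can securely access government services without one reusable number, is often illustrated with pairwise pseudonymous identifiers: each service receives a different, stable identifier derived for that citizen, so records cannot be joined across services, or by the private sector, on a single universal key. The sketch below is a hypothetical illustration of that idea using an HMAC derivation; the key handling and names are invented and it describes no actual government scheme.

```python
import hmac
import hashlib

# Secret key held only by the identity provider (hypothetical; in practice this
# would live in a hardware security module, never in source code).
PROVIDER_SECRET = b"replace-with-a-real-secret-key"


def pairwise_identifier(citizen_record_id: str, service_name: str) -> str:
    """Derive a stable, service-specific pseudonym for one citizen.

    The same citizen gets a different identifier at each service, so two
    services (or a company holding leaked data) cannot join their records
    on a single universal number.
    """
    message = f"{citizen_record_id}|{service_name}".encode()
    return hmac.new(PROVIDER_SECRET, message, hashlib.sha256).hexdigest()


if __name__ == "__main__":
    print(pairwise_identifier("citizen-12345", "dvla.gov.uk"))  # one pseudonym
    print(pairwise_identifier("citizen-12345", "hmrc.gov.uk"))  # a different one
```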
But the News Media Association points out that the ICO's plan would force websites to choose between being devoid of audience or stripped of advertising

4th July 2019

See article from newsmediauk.org. See also criticism of ICO plan from newsmediauk.org
For some bizarre reason the ICO seems to have been given powers to make wide-ranging internet censorship law on the fly, without it needing to be considered by parliament. And with spectacular incompetence, they have come up with a child safety plan to require nearly every website in Britain to implement strict age verification. Baldrick would have been proud: it is more or less the internet equivalent of making children safe on the roads by banning all cars. A trade association for news organisations, the News Media Association, summed up the idea in a consultation response saying: ICO's Age Appropriate Code Could Wreak Havoc On News Media
Unless amended, the draft code published for consultation by the ICO would undermine the news media industry, its journalism and business innovation online. The ICO draft code would require commercial news media publishers to choose between their online
news services being devoid of audience or stripped of advertising, with even editorial content subject to ICO judgment and sanction, irrespective of compliance with general law and codes upheld by the courts and relevant regulators.
The NMA strongly objects to the ICO's startling extension of its regulatory remit, the proposed scope of the draft code, including its express application to news websites, its application of the proposed standards to all users in the
absence of robust age verification to distinguish adults from under-18s, and its restrictions on profiling. The NMA considers that news media publishers and their services should be excluded from the scope of the proposed draft Code.
Attracting and retaining audience on news websites, digital editions and online services, and fostering informed reader relationships, are all vital to the ever-evolving development of successful newsbrands and their services, their advertising revenues and their development of subscription or other payment or contribution models, which fund and sustain the independent press and its journalism. There is surely no justification for the ICO to attempt, by way of a statutory age appropriate design code, to impose access restrictions fettering adults' (and children's) ability to receive and impart information, or in effect impose 'pre-watershed' broadcast controls upon the content of all currently publicly
available, free to use, national, regional and local news websites, already compliant with the general law and editorial and advertising codes of practice upheld by IPSO and the ASA. In practice, the draft Code would undermine
commercial news media publishers' business models, as audience and advertising would disappear. Adults will be deterred from visiting newspaper websites if they first have to provide age verification details. Traffic and audience will also be reduced if
social media and other third parties were deterred from distributing or promoting or linking titles' lawful, code compliant, content for fear of being accused of promoting content detrimental to some age group in contravention of the Code. Audience
measurement would be difficult. It would devastate advertising, since effective relevant personalised advertising will be rendered impossible, and so destroy the vital commercial revenues which actually fund the independent media, its trusted journalism
and enable it to innovate and evolve to serve the ever-changing needs of its audience. The draft Code's impact would be hugely damaging to the news industry and wholly counter to the Government's policy on sustaining high quality,
trusted journalism at local, regional, national and international levels. Newspapers' online content, editorial and advertising practices do not present any danger to children. The ICO has not raised with the industry any evidence
of harm, necessitating such drastic restrictions, caused by reading news or service of advertisements where these are compliant with the law and the standards set by specialist media regulators.
The Information Commissioner's Office has a 'cunning plan'
Of course the News Media Association is making a strong case for its own exclusion from the ICO's 'cunning plan', but the idea is equally devastating for websites from any other internet sector. Information Commissioner Elizabeth Denham was
called to give evidence to Parliament's DCMS Select Committee this week on related matters, and she spoke of clearly negative feedback to her age verification idea. Her sidekick backtracked a little, saying that the ICO did not mean age verification via handing over passport details, more like one of those schemes where AI guesses age by scanning the sort of thing the person has been posting on social media. (Which of course requires a massive grab of data that would be best kept private, especially for children.) The outcome seems to be a dictate to the internet industry to 'innovate' and find a solution to age verification that does not require the mass handover of private data (you know, the very thing the data protection laws are supposed to be protecting). The ICO put a time limit on this innovation demand of about 12 months. In the meantime the ICO has told the news industry that the age verification idea won't apply to them, presumably because they can kick up a hell of a stink about the ICO in their mass-market newspapers. Denham said: We want to encourage children to find out about the world, we want children to access news sites. So the concern about the
impact of the code on media and editorial comment and journalism I think is unfounded. We don't think there will be an impact on news media sites. They are already regulated and we are not a media regulator.
She did not offer any similar reassuring words to the other sectors of the internet industry that are likely to be equally devastated by the ICO's 'cunning plan'.