
31st December 2019

People hate it but governments love it, so guess who is prevailing? See article from politico.eu

30th December 2019

Encrypting DNS. By Max Hunter and Seth Schoen. See article from eff.org

29th December 2019

Fancy New Terms, Same Old Backdoors: The Encryption Debate in 2019. By Joe Mullin. See article from eff.org
France initiates a program of mass social media surveillance in the name of preventing tax fraud

28th December 2019

See article from bbc.com

The French government has come up with an innovative way of financing a program of mass social media surveillance: using it to detect tax fraud. The self-financing surveillance scheme has now been given the go-ahead by the constitutional court. Customs and tax officials will be allowed to review users' profiles, posts and pictures for evidence of undisclosed income. In its ruling, the court acknowledged that users' privacy and freedom of expression could be compromised, but it applied caveats to the legislation. It said authorities would have to ensure that password-protected content was off limits and that they would only be able to use public information pertaining to the person divulging it online. However the wording suggests that the non-public data is available and can be used for other more covert purposes. The mass collection of data is part of a three-year online monitoring experiment by the French government and greatly increases the state's online surveillance powers.

28th December 2019

Well, if they will create a stupid law of inane tick-boxing that is impossible to comply with, then of course there are so many transgressions that regulators don't know where to start. See article from politico.eu

21st December 2019

By Simon McDougall, ICO's Executive Director for Technology and Innovation. See article from ico.org.uk
Firefox to add another encrypted DNS over HTTPS option next year

18th December 2019

See article from zdnet.com

Mozilla has announced that NextDNS would be joining Cloudflare as the second DNS-over-HTTPS (DoH) provider inside Firefox. The browser maker says NextDNS passed the conditions imposed by its Trusted Recursive Resolver (TRR) program. These conditions
include:
- limiting the data NextDNS collects from the DoH server used by Firefox users;
- being transparent about the data they collect; and
- promising not to censor, filter, or block DNS traffic unless specifically requested by law enforcement.
The new option will appear some time next year. DNS-over-HTTPS, or DoH, is a feature that was added to Firefox last year. When enabled, it encrypts DNS traffic coming in and out of the browser. DNS traffic is not only encrypted but also moved from port 53 (for DNS traffic) to port 443 (for HTTPS traffic), effectively hiding DNS queries and replies inside the browser's normal stream of HTTPS content.
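The clever part of DoH is that the DNS wire format itself is unchanged; only the transport differs. As a rough illustration (a sketch, not Firefox's actual implementation), this Python snippet builds a minimal RFC 1035 query packet, the exact bytes a DoH client would POST to a resolver over HTTPS instead of sending over UDP port 53:

```python
import struct

def build_dns_query(hostname: str, qtype: int = 1) -> bytes:
    """Build a minimal RFC 1035 DNS query for an A record (qtype=1).
    A DoH client sends exactly these bytes as an HTTPS POST body
    (Content-Type: application/dns-message) rather than a UDP datagram."""
    # Header: transaction id, flags (recursion desired), one question, no answers
    header = struct.pack(">HHHHHH", 0x1234, 0x0100, 1, 0, 0, 0)
    # QNAME: each label is length-prefixed, terminated by a zero byte
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in hostname.split(".")
    ) + b"\x00"
    question = qname + struct.pack(">HH", qtype, 1)  # QTYPE=A, QCLASS=IN
    return header + question

query = build_dns_query("example.com")
# To an on-path observer, a DoH request carrying these bytes is
# indistinguishable from any other HTTPS traffic on port 443.
```

The same packet sent in plain text on port 53 reveals every hostname a user visits, which is why moving it inside HTTPS closes such a significant privacy gap.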
India dreams up a GDPR style data 'protection' law that is more of a data 'grab' for the government

14th December 2019

See article from vpncompare.co.uk


30th November 2019

Why not cut to the end game and ban it? If clear and informed consent is required, then very few will sign up; profiling has nothing positive to offer people, only negatives. See article from marketingtechnews.net
EU plans for extending censorship laws to US messaging services falters

26th November 2019

See article from reuters.com

The European Commission is struggling to agree how to extend internet censorship and control to US messaging apps such as Facebook's WhatsApp and Microsoft's Skype. These services are run from the US and it is not so easy for European police to obtain, say, tracking or user information as it is for more traditional telecoms services. The Commission has been angling towards applying the rules controlling national telecoms companies to these US 'OTT' messaging services. Extended ePrivacy regulation was the chosen vehicle for new censorship laws. But now it is reported that the EU countries have yet to find agreement on such issues as tracking users' online activities, provisions on detecting and deleting child pornography and of course how to further the EU's silly game of trying to see how many times a day EU internet users are willing to click consent boxes without reading reams of terms and conditions. EU ambassadors meeting in Brussels on Friday again reached an impasse, EU officials said. Tech companies and some EU countries have criticized the ePrivacy proposal for being too restrictive, putting them at loggerheads with privacy activists who back the plan. No doubt the censorship plans will be resumed soon.

21st November 2019

The AdTech showdown is coming but will the ICO bite? See article from openrightsgroup.org
Microsoft announces that it is in the process of implementing options to use encrypted DNS servers

19th November 2019

See article from techcommunity.microsoft.com by Tommy Jensen, Ivan Pashov, and Gabriel Montenegro

Windows will improve user privacy with DNS over HTTPS

Here in Windows Core Networking, we're interested in keeping your traffic as private as possible, as well as fast and reliable. While there are many ways we can and do approach
user privacy on the wire, today we'd like to talk about encrypted DNS. Why? Basically, because supporting encrypted DNS queries in Windows will close one of the last remaining plain-text domain name transmissions in common web traffic.
Providing encrypted DNS support without breaking existing Windows device admin configuration won't be easy. However, at Microsoft we believe that
"we have to treat privacy as a human right. We have to have end-to-end cybersecurity built into technology." We also believe Windows adoption of encrypted DNS will help make the overall Internet ecosystem healthier.
There is an assumption by many that DNS encryption requires DNS centralization. This is only true if encrypted DNS adoption isn't universal. To keep the DNS decentralized, it will be important for client operating systems (such as Windows) and Internet
service providers alike to widely adopt encrypted DNS. With the
decision made to build support for encrypted DNS, the next step is to figure out what kind of DNS encryption Windows will support and how it will be configured. Here are our team's guiding principles on making those decisions:
Windows DNS needs to be as private and functional as possible by default without the need for user or admin configuration because Windows DNS traffic represents a snapshot of the user's browsing history. To Windows users,
this means their experience will be made as private as possible by Windows out of the box. For Microsoft, this means we will look for opportunities to encrypt Windows DNS traffic without changing the configured DNS resolvers set by users and system
administrators. Privacy-minded Windows users and administrators need to be guided to DNS settings even if they don't know what DNS is yet. Many users are interested in controlling their privacy and go looking for
privacy-centric settings such as app permissions to camera and location but may not be aware of or know about DNS settings or understand why they matter and may not look for them in the device settings. Windows users and
administrators need to be able to improve their DNS configuration with as few simple actions as possible. We must ensure we don't require specialized knowledge or effort on the part of Windows users to benefit from encrypted DNS. Enterprise policies
and UI actions alike should be something you only have to do once rather than need to maintain. Windows users and administrators need to explicitly allow fallback from encrypted DNS once configured. Once Windows has
been configured to use encrypted DNS, if it gets no other instructions from Windows users or administrators, it should assume falling back to unencrypted DNS is forbidden.
Based on these principles, we are making plans to adopt DNS over HTTPS (or DoH) in the Windows DNS client. As a platform, Windows Core Networking seeks
to enable users to use whatever protocols they need, so we're open to having other options such as DNS over TLS (DoT) in the future. For now, we're prioritizing DoH support as the most likely to provide immediate value to everyone. For example, DoH
allows us to reuse our existing HTTPS infrastructure. ... Why announce our intentions in advance of DoH being available to Windows Insiders? With encrypted DNS gaining more attention, we felt it was
important to make our intentions clear as early as possible. We don't want our customers wondering if their trusted platform will adopt modern privacy standards or not.
Google to withhold details from advertisers about where people are browsing on the internet

17th November 2019

See article from mediapost.com

In what sounds like a profound change to the commercial profiling of people's website browsing history, Google has announced that it will withhold data from advertisers that categorises web pages. In response to the misuse of medical-related browsing data, Google said that from February 2020 it will cease to inform advertisers about the content of webpages where advertising space is up for auction. Presumably this is something along the lines of Google having an available advert slot on worldwidepharmacy.com but not telling the advertiser that John Doe is browsing an STD diagnosis page; the advertiser will still be informed of the URL. Chetna Bindra, senior product manager of trust and privacy at Google, wrote:
While we already prohibit advertisers from using our services to build user profiles around sensitive categories, this change will help avoid the risk that any participant in our auctions is able to associate individual ad
identifiers with Google's contextual content categories.
Google also plans to update its EU User Consent Policy audit program for publishers and advertisers, as well as our audits for the Authorized Buyers program, and continue to
engage with data protection authorities, including the Irish Data Protection Commission as they continue their investigation into data protection practices in the context of Authorized Buyers. Although this sounds like very good news for people wishing to keep their sensitive data private, it may not be so good for advertisers, who will see costs rise, and publishers, who will see incomes fall. And of course Google itself will still know that John Doe has been browsing STD diagnosis pages. There could be other consequences, such as advertisers sending their own bots out to categorise likely advertising slots.
US Federal Court rules that border police must have reasonable suspicion that a mobile device contains illegal contraband before searching it

14th November 2019

See article from aclu.org

In a major victory for privacy rights, a federal court has held that the federal government's suspicionless searches of smartphones, laptops, and other electronic devices at airports or other U.S. ports of entry are unconstitutional. In recent years,
as the number of devices searched at the border has quadrupled, international travelers returning to the United States have increasingly reported cases of invasive searches. Documents and testimony we and the Electronic Frontier Foundation
obtained as part of our lawsuit challenging the searches revealed that the government has been using the border as a digital dragnet. CBP and ICE claim sweeping authority to search our devices for purposes far removed from customs enforcement, such as
finding information about someone other than the device's owner. The court's order makes clear that these fishing expeditions violate the Fourth Amendment. The government must now demonstrate reasonable suspicion that a device contains illegal
contraband. That's a far more rigorous standard than the status quo, under which officials claim they can rummage through the personal information on our devices at whim and with no suspicion at all.
'When we enter a building we expect it to be safe. We are not expected to examine and understand all the paperwork and then tick a box that lets the companies involved off the hook'

4th November 2019

See press release from parliament.uk
See report [pdf] from publications.parliament.uk

The UK Parliament's Joint Committee on Human Rights has reported on serious grounds for concern about the nature of the "consent" people provide when giving over an extraordinary range of information about themselves, to
be used for commercial gain by private companies:
- Privacy policies are too complicated for the vast majority of people to understand: while individuals may understand they are consenting to data collection from a given site in exchange for "free" access to content,
they may not understand that information is being compiled, without their knowledge, across sites to create a profile. The Committee heard alarming evidence about eye tracking software being used to make assumptions about people's sexual orientation,
whether they have a mental illness, are drunk or have taken drugs: all then added to their profile.
- Too often the use of a service or website is conditional on consent being given -- raising questions about whether
it is freely given
- People cannot find out what they have consented to: it is difficult, if not nearly impossible, for people - even tech experts - to find out who their data has been shared with, to stop it being
shared or to delete inaccurate information about themselves.
- The consent model relies on individuals knowing about the risks associated with using web-based services, when the system should provide adequate protection from the risks as a default.
- It is completely inappropriate to use consent when processing children's data: children aged 13 and older are, under the current legal framework, considered old enough to consent to
their data being used, even though many adults struggle to understand what they are consenting to.
Key conclusions and recommendations

The Committee points out that there is a real risk of discrimination against some groups and individuals through the way this data is used: it heard deeply troubling evidence about some companies using personal data to ensure that only people of a certain age or race, for example, see a particular job opportunity or housing advertisement. There are also long-established concerns about the use of such data to discriminate in the provision of insurance or credit products. Unlike traditional print advertising, where such blatant discrimination would be obvious and potentially illegal, personalisation of content means people have no way of knowing how what they see online compares to anyone else. Short of whistleblowers or work by investigative journalists, there currently appears to be no mechanism for protecting against such privacy breaches or discrimination in the online "Wild West". The Committee calls on the Government to ensure there is robust regulation over how our data can be collected
and used and it calls for better enforcement of that regulation. The Committee says:
- The "consent model is broken" and should not be used as a blanket basis for processing. It is impossible for people to know what they are consenting to when making a non-negotiable, take-it-or-leave-it "choice" about joining services like Facebook, Snapchat and YouTube based on lengthy, complex T&Cs, subject to future changes to terms.
- This model puts too much onus on the individual, but the responsibility of knowing about the risks of using web-based services cannot be on the individual. The Government should strengthen regulation to guarantee safe passage on the internet.
- It is completely inappropriate to use consent when it comes to processing children's data. If adults struggle to understand complex consent agreements, how do we expect our children to give informed consent? The Committee says setting the digital age of consent at 13 years old should be revisited.
- The Government should be regulating to keep us safe online in the same way as they do in the real world - not by expecting us to become technical experts who can judge whether our
data is being used appropriately but by having strictly enforced standards that protect our right to privacy and freedom from discrimination.
- It should be made much simpler for individuals to see what data has been
shared about them, and with whom, and to prevent some or all of their data being shared.
- The Government should look at creating a single online registry that would allow people to see, in real time, all the companies
that hold personal data on them, and what data they hold.
The report is worth a read and contains many important points criticising the consent model as dictated by GDPR and enforced by the ICO. Here are a few passages from the report's summary:

The evidence we heard during this inquiry,
however, has convinced us that the consent model is broken. The information providing the details of what we are consenting to is too complicated for the vast majority of people to understand. Far too often, the use of a service or website is conditional
on consent being given: the choice is between full consent or not being able to use the website or service. This raises questions over how meaningful this consent can ever really be. Whilst most of us are probably unaware of who
we have consented to share our information with and what we have agreed that they can do with it, this is undoubtedly doubly true for children. The law allows children aged 13 and over to give their own consent. If adults struggle to understand complex
consent agreements, how do we expect our children to give informed consent? Parents have no say over, or knowledge of, the data their children are sharing and with whom. There is no effective mechanism for a company to determine the age of a person providing
consent. In reality a child of any age can click a consent button. The bogus reliance on consent is in clear conflict with our right to privacy. The consent model relies on us, as individuals, to understand, take decisions, and be
responsible for how our data is used. But we heard that it is difficult, if not nearly impossible, for people to find out whom their data has been shared with, to stop it being shared or to delete inaccurate information about themselves. Even when
consent is given, all too often the limit of that consent is not respected. We believe companies must make it much easier for us to understand how our data is used and shared. They must make it easier for us to opt out of some or all of our data being
used. More fundamentally, however, the onus should not be on us to ensure our data is used appropriately - the system should be designed so that we are protected without requiring us to understand and to police whether our freedoms are being protected.
As one witness to our inquiry said, when we enter a building we expect it to be safe. We are not expected to examine and understand all the paperwork and then tick a box that lets the companies involved off the hook. It is
the job of the law, the regulatory system and of regulators to ensure that the appropriate standards have been met to keep us from harm and ensure our safe passage. We do not believe the internet should be any different. The Government must ensure that
there is robust regulation over how our data can be collected and used, and that regulation must be stringently enforced. Internet companies argue that we benefit from our data being collected and shared. It means the content we
see online - from recommended TV shows to product advertisements - is more likely to be relevant to us. But there is a darker side to personalisation. The ability to target advertisements and other content at specific groups of people makes it possible
to ensure that only people of a certain age or race, for example, see a particular job opportunity or housing advertisement. Unlike traditional print advertising, where such blatant discrimination would be obvious, personalisation of content means people
have no way of knowing how what they see online compares to anyone else. Short of a whistle-blower within the company or work by an investigative journalist, there does not currently seem to be a mechanism for uncovering these cases and protecting people
from discrimination. We also heard how the data being used (often by computer programmes rather than people) to make potentially life-changing decisions about the services and information available to us is not even necessarily
accurate, but based on inferences made from the data they do hold. We were told of one case, for example, where eye-tracking software was being used to make assumptions about people's sexual orientation, whether they have a mental illness, are drunk or
have taken drugs. These inferences may be entirely untrue, but the individual has no way of finding out what judgements have been made about them. We were left with the impression that the internet, at times, is like the Wild
West, when it comes to the lack of effective regulation and enforcement. That is why we are deeply frustrated that the Government's recently published Online Harms White Paper explicitly excludes the protection of people's
personal data. The Government is intending to create a new statutory duty of care to make internet companies take more responsibility for the safety of their users, and an independent regulator to enforce it. This could be an ideal vehicle for
requiring companies to take people's right to privacy, and freedom from discrimination, more seriously and we would strongly urge the Government to reconsider its decision to exclude data protection from the scope of their new regulatory framework. In
particular, we consider that the enforcement of data protection rules - including the risks of discrimination through the use of algorithms - should be within scope of this work.
VTS Media camgirl websites accidentally reveal users' login details.

4th November 2019

See article from rt.com

Several popular camgirl sites have exposed the email addresses and other sensitive information of millions of users and sex workers after a backend was left wide open. VTS Media, a company based in Barcelona, runs the affected sites, out of which
amateur.tv is one of the most popular cam sites in Spain, according to traffic-ranking service Alexa. Others include placercams.com and webcampornoxxx.net. This data exposure does not come at the hands of any sort of hack or exploit, but rather
an oversight by the company, TechCrunch reported. The administrative backends were left open, without a password, for several weeks. This allowed anyone to access the network's database, which included usernames, email addresses, IP addresses, browser
user-agents, private chat logs, login timestamps, and even failed login attempts, which stored attempted passwords in plaintext. The backend also contained data related to the videos that registered users were watching and renting. Users who
broadcast sexual content to viewers on these sites also had some of their personal information revealed. With millions of users affected, this is one of the largest data exposures for adult sites since Ashley Madison's massive breach in 2015, and rather highlights just how dangerous it is to hand over personal details to porn sites. Just imagine how much worse it would have been if UK age verification were in place: the data would include names and addresses, birthdates and passport numbers.
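The detail that failed logins were stored with attempted passwords in plaintext underlines the most basic rule of credential handling: a site should keep only salted hashes, never the passwords themselves. A minimal sketch of the standard approach using Python's standard library (the function names here are illustrative, not VTS Media's code):

```python
import hashlib
import hmac
import os

ITERATIONS = 200_000  # PBKDF2 work factor; tune to available hardware

def hash_password(password: str, salt: bytes = b"") -> tuple[bytes, bytes]:
    """Derive a salted PBKDF2-SHA256 digest. Store only (salt, digest);
    the original password is never written anywhere, including logs."""
    salt = salt or os.urandom(16)  # fresh random salt per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the digest and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
```

Had the leaked database been built this way, the exposed backend would have revealed only salts and digests rather than working credentials.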
Sky News sees Labour and Tory documents showing that they buy data from data brokers revealing people's likely political preferences

26th October 2019

See article from news.sky.com by Rowland Manthorpe

Manuals for Labour campaigners, seen by Sky News, show the party buys data from credit reference agency Experian in order to target both traditional canvassing and advertising on Facebook. Data obtained from the Conservatives shows it categorises people
using Experian data in its VoteSource database. Experian's data is widely used by political parties and private companies, who prize its ability to classify voters on a street-by-street basis into categories such as Bank of Mum and Dad,
Disconnected Youth and Midlife Stopgap. Labour and the Conservatives buy Experian's Mosaic database, which uses more than 850 million records, including crime data, GCSE results, gas and electricity consumption and child benefits, to classify
people into types. Until last year, Labour also used an Experian tool called Origin to target voters based on ethnicity. Labour quietly stopped using the tool in 2018 after deciding it would not be legal under new data protection legislation.
This all seems totally at odds with the new GDPR law enacted last year, which suggests that these parties should be seeking people's consent before they amass and sell data in this manner. Tim Turner, founder of data protection consultancy 2040
Training commented: People have explicit rights to opt-out of processing like this, with no exemptions, but if you don't know it's happening, how can you exercise these rights? Does the average person know about all
this?
Pat Walshe, director of data protection firm Privacy Matters, added: Absent of a specific notice and consent I don't see how such actions would amount to transparent, fair and lawful
processing. I can only hope the Information Commissioner's Office (ICO) is actively scrutinising the activities of UK political parties.

15th October 2019

The US, UK and Australia are taking on Facebook in a bid to undermine the only method that protects our personal information. By Edward Snowden. See article from theguardian.com

11th October 2019

Apple's Safari browser hands over your browsing history to a company controlled by the Chinese government. See article from reclaimthenet.org

6th October 2019

The Lib Dems are using creepy databases to profile every voter in the UK for their political preferences. See article from news.sky.com