Location data, privacy and consent
Trust in technology has taken a hit in the past few years. The word “Techlash” came to define 2018. The Cambridge Analytica data scandal may have been the high-water mark for this growing sense of distrust in technology, but two years later, fears about connected devices listening to our conversations and tracking our movements persist, and may be increasing.
Low consumer trust is a challenge for companies that provide digital services based on location data. It also raises difficult issues for consumers themselves, who rely on apps they may suspect are unreliable custodians of their personal data.
The conversation around location tracking has at times both overestimated and underestimated the threat posed by the misuse of location data. Even anonymised location data can identify individuals when cross-referenced with publicly available information: research has shown that just four points in time and space are enough to uniquely identify most individuals. The market in location data about smartphone users is worth $21 billion per year in the US alone.
And it’s not just services like Google Maps and Uber collecting location data. Research has shown that up to 90 percent of apps contain third-party trackers: software embedded in their code that allows them to collect location data. The privacy, security and legal implications of third-party trackers are far reaching.
App permission requests offer users a binary opt-in or opt-out decision on the use of their location data, while impenetrable terms and conditions can obscure the extent and purpose of data collection. As a result, what constitutes informed consent has become a grey area.
The value of location data-enabled services, both to the economy and to the public, is clear. Getting directions, hailing a cab or ordering food is easier and cheaper than ever before. It is in the interest of the companies that make up the digital ecosystem underpinned by location data to win the trust of their users. Tech companies, regulatory bodies and the public each have a role to play, but there is uncertainty over who bears ultimate responsibility for improving trust and accountability.
Research by consumer rights groups has highlighted a feeling of helplessness on the part of consumers. This is exacerbated by the business models of free-to-use digital services, which make users feel that they themselves are the product being offered to third-party companies. Users find this “creepy”.
Dr Ana Basiri, Lecturer in Spatial Data Science and Visualisation at University College London’s Centre for Advanced Spatial Analysis, says that the companies providing location-based services could address concerns by offering more options to users. Instead of location access being set to on or off, it could be offered on a sliding scale of resolution. “We don’t have a say as a user over the quality of the data we share,” she says. “If we had a process to optimise the type of service according to the quality of the data we choose to share, then consumer behaviour would probably be very different.”
This would put the onus on app developers and digital service providers to empower users to make more informed choices. If access to certain features of apps was dependent on granting access to varying levels of resolution of location data, the trade-off between privacy and functionality would become more transparent to users.
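To illustrate the sliding-scale idea, the sketch below (a hypothetical example, not any existing app's API) coarsens coordinates by rounding. Each decimal place of latitude or longitude corresponds to roughly a tenfold change in precision, from about 11 km at one decimal place down to about 11 m at four, so a user could share only as much precision as a feature actually needs.

```python
# Hypothetical sketch of tiered location sharing: instead of an on/off
# permission, an app requests a named resolution tier and receives
# coordinates rounded to that tier's precision.

RESOLUTION_DECIMALS = {
    "city": 1,           # ~11 km: enough for weather or regional content
    "neighbourhood": 2,  # ~1.1 km: enough for local recommendations
    "street": 3,         # ~110 m: enough for a store finder
    "precise": 4,        # ~11 m: needed for turn-by-turn navigation
}

def coarsen(lat: float, lon: float, tier: str) -> tuple:
    """Round coordinates to the precision of the chosen sharing tier."""
    decimals = RESOLUTION_DECIMALS[tier]
    return round(lat, decimals), round(lon, decimals)

# Example: a point in central London shared at neighbourhood resolution.
print(coarsen(51.5246, -0.1340, "neighbourhood"))  # (51.52, -0.13)
```

In this sketch the trade-off Dr Basiri describes becomes explicit: a weather app asking for “city” resolution learns far less about a user than a navigation app asking for “precise”, and the user can see the difference.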
Privacy by design
While developers themselves might be open to giving users greater control over access to location data, there is currently little incentive to do so. Dr Hannah Fry, Associate Professor in the Mathematics of Cities and a colleague of Dr Basiri at the UCL Centre for Advanced Spatial Analysis, suggests this is a cultural issue within data science. To overcome it, she suggests, data scientists could be made to take a Hippocratic oath, much like medics. This would instil the importance of ethics in data science education from the outset. “We need a Hippocratic oath in the same way it exists for medicine,” Fry says. “In medicine, you learn about ethics from day one. In mathematics, it’s a bolt-on at best. It has to be there from day one and at the forefront of your mind in every step you take.”
The responsibility for educating the public on the decisions they make regarding location tracking doesn’t have to rest solely on developers.
With location information making up part of the metadata of every photo taken on a smartphone, members of the public may be inadvertently sharing their exact location when they post a selfie on social media. Without understanding these implications, young people could be putting themselves at risk. Technology literacy becoming a core part of school curricula at early stages would go some way to address this, and would shift the responsibility for education to policymakers.
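Location in photo metadata is stored in the EXIF GPS tags as degrees, minutes and seconds. Converting those values to decimal degrees shows just how precisely a single posted photo can place its author. A minimal sketch of that conversion (the coordinate values here are illustrative, not taken from a real photo):

```python
def dms_to_decimal(degrees: float, minutes: float, seconds: float, ref: str) -> float:
    """Convert EXIF-style degrees/minutes/seconds to signed decimal degrees."""
    dd = degrees + minutes / 60 + seconds / 3600
    # Southern and western hemispheres are negative in decimal notation.
    return -dd if ref in ("S", "W") else dd

# Illustrative EXIF GPS tags as they might appear in a photo's metadata:
lat = dms_to_decimal(51, 31, 28.56, "N")
lon = dms_to_decimal(0, 8, 2.4, "W")
print(round(lat, 4), round(lon, 4))  # 51.5246 -0.134
```

Seconds in EXIF data are stored to fractions of a second, which pins a photo down to within a few metres, precise enough to identify a home or a school from a single upload.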
Terms and conditions, often lengthy, sometimes concealed, do not offer meaningful transparency, according to Slavka Bielikova, Advocacy Programme Coordinator at Consumers International. “There’s no point in flooding the consumer with information – that’s not informed consent,” she says. “There’s no point in transparency if it’s not meaningful.”
One solution to this issue is to offer users a distilled summary of an app or service’s overall trustworthiness in the form of a privacy score, suggests Alex Wrottesley, Head of Ordnance Survey’s Geovation incubator. “We rely on user reviews for products on Amazon and food delivery apps,” he says. “Could something similar, either provided by users or a regulatory board, be used for location data privacy?”
This idea could offer a promising way forward, but such privacy assurances raise issues of responsibility and reliability. Wholesale manipulation of Amazon customer reviews has eroded their trustworthiness. Amid accusations of censorship and fake news, establishing a system of marking trustworthiness of news articles has been difficult for social media platforms, although Facebook is experimenting with such a system.
“The issue of trust marks is a problem because it builds in inequality,” says Bielikova. “This is especially a problem when so-called trusted services come at a premium. Our research suggests people might opt to use a less trusted service if it is available for free.”
Privacy at a premium
Apple has managed to build the perception that its flagship product, the iPhone, offers greater security and privacy than cheaper competitors, partly as a result of high-profile and carefully chosen battles with the FBI. But consumer advocates maintain that privacy should be a right; that consumers shouldn’t be in a position where they have to buy it. Instead, advocates suggest that consumer rights should be built into the design of location data services from the outset.
In the ongoing conversation around location data privacy, it’s important we don’t lose sight of its usefulness. Location-enabled apps can literally save lives by providing emergency services with precise information. Navigation apps can empower women in environments where asking for directions carries risks.
The idea that the public no longer expects privacy is an “insidious narrative”, according to Wrottesley, who says Facebook’s public-by-default policy played a part in propagating it. “It serves organisations that use that data that such a narrative exists,” he says. While there is some data to support the idea that millennials are less concerned about privacy than previous generations, there are also studies showing a different picture.
Through an ongoing dialogue between consumers, developers and policy makers, the Benchmark Initiative is putting the ethics of location data at the centre of the privacy debate. Improving trust among users of location-enabled services should be taken seriously by businesses that rely on it. Responsible and transparent use of location data can help to rebuild trust among consumers and help to change the perception of the value that location data offers to society.