How to measure representation in mobility data and protect people’s privacy
Project Summary Blog by Georgina Burke - first published 19th August 2020
What gets measured gets changed
The tradeoff of privacy for societal value is a false choice. We can compute useful metrics without resorting to abusive and illicit surveillance. This matters now more than ever: to address systemic racism in mobility services, we need to know how people are using them, and we need to acquire that knowledge in a privacy-preserving way. At the moment, for organisations to see representation, they need to collect sensitive data. But product teams can glean insights from data sets without being able to identify individuals, by using privacy-preserving techniques.
We’ve been working with Benchmark, an initiative exploring the ethical uses of location data, to show how to use privacy-preserving techniques in practice. Over the last three weeks we created the following blog posts and prototypes, looking at how to apply the randomised response technique to mobility data:
- How to measure representation in a data set and protect people’s privacy – an introduction to the project and our approach
- Equality in mobility matters now more than ever – a deeper dive into why inequality in mobility is critical now
- Applying randomised response to mobility data – we explain step by step how to apply the randomised response technique
- What privacy preserving techniques make possible for transport authorities – through prototypes we imagine what this might practically enable for different stakeholders
- What privacy preserving techniques make possible for mobility providers
Equality in mobility matters now more than ever
First we looked at why inequality in mobility is critical at the moment. In the last few months people have been travelling around cities differently: since Covid-19, more people are cycling and walking rather than using public transport. In response, city planners have rushed to accommodate these changes, fast-tracking projects to widen pavements and improve cycling infrastructure that would normally have taken years.
There are known inequalities in how Black communities access micromobility. Yet there aren’t many metrics or datasets to help organisations and communities understand and measure these inequalities.
Typically it has been hard to open up data about mobility because it’s so sensitive. Last year the Los Angeles Department of Transportation demanded that micromobility services share anonymised data about customers’ journeys via the Mobility Data Specification (MDS) or lose their licence to operate. Yet sharing this data impacts riders’ privacy and gives authorities the ability to track where citizens go.
Applying the randomised response technique
Next we showed how to apply randomised response, one of the simplest privacy-preserving techniques, to see how you can gain insights from data without compromising privacy.
We introduced the randomised response technique and explained step by step how it works. We looked at how it can be applied to a real-world problem – tracking the number of hired bikes passing through each neighbourhood in a city. We created a Python notebook to accompany this post implementing the techniques we describe. Creating practical tools helps show how these techniques work and what kinds of problems they can help solve.
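To give a flavour of the idea (this is a minimal sketch of classic randomised response, not the code from our notebook), each record answers a yes/no question – "did this trip pass through the neighbourhood?" – truthfully only half the time, and gives a coin-flip answer otherwise. No individual report can be trusted, yet the true proportion can still be recovered from the aggregate:

```python
import random

def randomised_response(truth: bool) -> bool:
    """Report the true answer with probability 1/2;
    otherwise report a uniformly random answer."""
    if random.random() < 0.5:
        return truth
    return random.random() < 0.5

def estimate_true_proportion(responses: list[bool]) -> float:
    """Invert the noise. E[observed] = 0.5 * p + 0.25,
    so p = 2 * observed - 0.5 (clamped to [0, 1])."""
    observed = sum(responses) / len(responses)
    return max(0.0, min(1.0, 2 * observed - 0.5))

# Simulate 10,000 trips, 30% of which pass through the neighbourhood.
trips = [random.random() < 0.30 for _ in range(10_000)]
reported = [randomised_response(t) for t in trips]
print(estimate_true_proportion(reported))  # typically a value near 0.3
```

Because every individual answer is deniable – a "yes" may just be the coin – the estimate is only meaningful in aggregate, which is exactly the property that protects riders.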
Using privacy-preserving techniques changes how people can use data. Organisations that hold data previously regarded as too sensitive to share could open data sets to wider audiences. This makes it possible for people to use data in new ways.
What privacy preserving techniques make possible: for transport authorities
The Mayor of London listed cycling and walking as key population health indicators in the London Health Inequalities Strategy. The pandemic has only amplified the need for people to use cycling as a safer and healthier mode of transport. Yet as the majority of cyclists are white, Black communities are less likely to get the health benefits that cycling provides. Groups like Transport for London (TfL) should monitor how different communities cycle and who is excluded. Organisations like the London Office of Technology and Innovation (LOTI) could help boroughs procure privacy preserving technology to help their efforts.
Instead of requiring access to all customer trip data, authorities could ask specific questions, such as: where are the least popular places to cycle? If mobility providers apply techniques like randomised response, an individual’s identity is obscured by the noise added to the data.
It’s easy to imagine transport authorities like TfL combining privacy-preserved mobility data from multiple mobility providers to compare insights and measure service provision. They could cross-reference the privacy-preserved bike trip data with demographic data in the local area to learn how different communities cycle. The first step to addressing inequality is being able to measure it.
Prototype showing how privacy-protected data about bike trips could be used to understand the availability of micro-mobility services in areas of the city where Black communities live.
What privacy preserving techniques make possible: for mobility providers
Tamika Butler says systemic racism can’t be tackled without tackling it in cycling. By offering lower cost alternatives to owning a bike, mobility providers and bike share companies are uniquely placed to address racism in cycling and take action.
Following the Black Lives Matter protests in June, companies rushed to publish solidarity statements, but those words are meaningless if companies aren’t true to them in practice. No mobility company has yet stepped forward and publicly evaluated racism in its services. One challenge is that this information – a user’s race – is sensitive and therefore risky to collect, measure, and share, even in aggregate.
But using privacy-preserving techniques makes it possible to compute insights without compromising people’s privacy. This kind of anti-racist leadership is desperately needed from organisations if real change is to be made, rather than just more marketing campaigns. We imagined how a fictional bike share company called Bikez might surface mobility data as a public commitment to anti-racist service provision.
What it could look like if mobility providers acknowledged inequality in their services and actively tried to change it.
We want to make privacy-preserving techniques more accessible for product teams working in mobility companies and transport authorities. We can help you test and apply randomised response, or other privacy-preserving techniques, to see what value you can unlock from mobility data.
Get in touch if you’re interested in finding out more or are already using privacy-preserving techniques.
IF uses data and design as tools of change and defiance to build systems that promote trust, public value and equitable futures. Visit projectsbyif.com to see more of our work.