Uber rolls out a privacy tool to safeguard confidential information
Most people think of Uber as a ride-hailing app, or as a company pushing self-driving technology while weathering a string of lawsuits. Beyond all that, however, Uber is also a massive streamlined database of location coordinates, traffic patterns, transaction details, and other sensitive information. The sheer volume of detail stored there can understandably worry its user base.
To address this, Uber has rolled out a privacy tool for scrutinizing its database. The tool is based on differential privacy, a technique that lets analysts query aggregate data without exposing the individual user records it contains. Google and Apple use the same approach to learn from user data while sidestepping the concern that individual privacy is undermined.
An executive from Uber's privacy engineering team described it as a way to analyze queries and determine how sensitive their results would be, without actually executing the queries.
To illustrate: suppose Uber's analysts want insight into the mean distance of a ride in San Francisco. Answering that question means querying a large amount of trip data, which could expose details about individual drivers and passengers. Uber therefore muddles the result by injecting statistical noise, so the aggregate answer stays useful while attempts to recover individual trip information are rendered futile.
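Uber's actual implementation is not shown in the article, but the standard technique it describes, adding calibrated noise to an aggregate answer, can be sketched with the Laplace mechanism. The trip distances, bounds, and epsilon below are purely illustrative:

```python
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) noise as the difference of two exponentials."""
    return random.expovariate(1 / scale) - random.expovariate(1 / scale)

def private_mean(values, lower, upper, epsilon):
    """Differentially private mean of values assumed to lie in [lower, upper].

    Changing one record can shift the mean by at most (upper - lower) / n,
    so that is the query's sensitivity; the Laplace mechanism adds noise
    with scale = sensitivity / epsilon.
    """
    n = len(values)
    clipped = [min(max(v, lower), upper) for v in values]  # enforce the bounds
    true_mean = sum(clipped) / n
    sensitivity = (upper - lower) / n
    return true_mean + laplace_noise(sensitivity / epsilon)

# Hypothetical trip distances in miles for a batch of San Francisco rides
trips = [2.1, 3.4, 1.8, 5.0, 2.7, 4.2, 3.3, 2.9]
print(private_mean(trips, lower=0.0, upper=30.0, epsilon=1.0))
```

The analyst sees a slightly perturbed average; no single rider's trip can be reliably reconstructed from it.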
Other queries require more noise. The mean ride distance in a small city with relatively few trips needs more noise to secure privacy, because each individual trip has a larger influence on the result. Differential privacy is what estimates the noise required for a given sensitivity, as an Uber software engineer on the team noted.
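The link between dataset size and noise can be made concrete. For a mean of values bounded in [0, max], the sensitivity is max / n, so fewer trips means a larger noise scale. The city sizes and bounds here are illustrative assumptions, not figures from Uber:

```python
def laplace_scale_for_mean(n_trips: int, max_distance: float, epsilon: float) -> float:
    """Laplace noise scale needed for a private mean of n_trips values in [0, max_distance].

    One record can move the mean by at most max_distance / n_trips,
    so the required noise scale is that sensitivity divided by epsilon.
    """
    sensitivity = max_distance / n_trips
    return sensitivity / epsilon

# Hypothetical trip counts: a large city vs a small one
big_city_scale = laplace_scale_for_mean(n_trips=1_000_000, max_distance=30.0, epsilon=1.0)
small_city_scale = laplace_scale_for_mean(n_trips=500, max_distance=30.0, epsilon=1.0)
print(big_city_scale)    # tiny: ~0.00003
print(small_city_scale)  # 2000x larger: 0.06
```

With a million trips the noise is negligible next to the answer; with five hundred trips it must be substantial, exactly the asymmetry the article describes.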
To measure that sensitivity, Uber turned to security researchers at the University of California, Berkeley. The researchers spent more than a year developing the measurement technique, called Elastic Sensitivity, which Uber has now rolled out as an open-source tool.
The tool lets Uber's analysts, and anyone else, quickly apply differential privacy guarantees across large batches of queries. Previously, an analyst would have had to manually expunge confidential and sensitive information from results; now the returned data is clean of it from the start.
Uber has paid considerable attention to letting people do their work with privacy strengthened. Elastic Sensitivity indicates how much noise must be injected to scramble a result and maintain privacy, and in some cases determines whether a query should be allowed to run at all. The tool adds an extra layer of security: analysts can get the data they need while the risk of exposing individual private data as a side effect is reduced.
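The article says the tool can also refuse to answer a query outright. One hypothetical way such a gate could work, with an entirely assumed threshold, is to reject any query whose required noise would drown out the result:

```python
import random

def answer_or_reject(true_value, sensitivity, epsilon, max_noise_scale):
    """Return a noised answer, or None when the required Laplace noise scale
    exceeds a threshold. The threshold logic is a hypothetical sketch; the
    article only states that the tool can decide a query should not run.
    """
    scale = sensitivity / epsilon
    if scale > max_noise_scale:
        return None  # too sensitive: refuse to answer the query
    # Laplace(0, scale) noise as the difference of two exponential samples
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return true_value + noise

# A broad query over few records has high sensitivity and gets rejected:
print(answer_or_reject(5.0, sensitivity=10.0, epsilon=1.0, max_noise_scale=1.0))  # None
```

A low-sensitivity query over a large dataset would pass the same gate and return a lightly noised answer.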