How Google Can Solve its Privacy Problems

Google and the University of Chicago’s Medical Center have made headlines for the wrong reasons. According to a June 26th New York Times report, a lawsuit filed in the US District Court for Northern Illinois alleged that a data-sharing partnership between the University of Chicago’s Medical Center and Google had “shared too much personal information” without appropriate consent. Though the datasets had ostensibly been anonymized, the suit argues that the potential for re-identification remained too high, and that the partnership therefore compromised the privacy rights of the plaintiff.

The project was touted as a way to improve prediction in medicine and realize the utility of electronic health records through data science. Its coverage today instead focuses on risks to patients and invasions of privacy. Across industries like finance, retail, and telecom, the same potential for positive impact through data science exists, as does the same risk of exposing consumers. The value at stake is such that institutions must figure out how to address privacy concerns.

No one wants their medical records and sensitive information to be exposed. Yet they do want research to progress, and to benefit from innovation. That is the dilemma faced by individuals today. People are okay with their data being used in medical research, so long as their data is protected, and cannot be used to re-identify them. So where did the University of Chicago go wrong in sharing data with Google — and was it a case of negligence, ignorance, or a lack of investment?

The lawsuit claims that the data shared between the two parties was still susceptible to re-identification through inference attacks and the mosaic effect. Though the datasets had been stripped of direct identifiers and anonymized, they still contained date stamps of when patients checked in and out of the hospital. Combined with other data that Google holds separately, such as location data from phones and mapping apps, those stamps could be used to re-identify individuals in the data set. Free-text medical notes from doctors, though de-identified in some fashion, were also included, further compounding the exposure of private information.
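To make the mosaic effect concrete, here is a minimal sketch, in Python, of how such a linkage attack could work in principle. The file names, column names, and place identifier are hypothetical, invented purely for illustration; nothing here reproduces the actual datasets in the case.

```python
import pandas as pd

# De-identified hospital records: names stripped, but precise check-in times remain.
visits = pd.read_csv("hospital_visits.csv", parse_dates=["check_in"])

# A separate location history tied to a persistent device/account identifier.
pings = pd.read_csv("location_pings.csv", parse_dates=["timestamp"])
pings = pings[pings["place_id"] == "hospital_main_campus"]

# Pair each check-in with the nearest location ping at the hospital.
# A ping within a few minutes of a check-in is a strong candidate match,
# re-identifying the "anonymous" patient via the device identifier.
visits = visits.sort_values("check_in")
pings = pings.sort_values("timestamp")
matches = pd.merge_asof(
    visits, pings,
    left_on="check_in", right_on="timestamp",
    direction="nearest",
    tolerance=pd.Timedelta("10min"),
)
print(matches[["record_id", "device_id", "check_in", "timestamp"]].dropna())
```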

Inference attacks and the mosaic effect combine information from different datasets to re-identify individuals. They are now well-documented realities that institutions cannot be excused for ignoring. Indirect identifiers must therefore also be assessed for re-identification risk and included in any privacy-protection effort. What most organizations do not realize is that this can be done without destroying the analytical value of the data required for data science, analytics, and ML.
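One common way to quantify that risk is to count how many records share the same combination of indirect (quasi-) identifiers, in the spirit of k-anonymity. The sketch below uses made-up column names and toy data to show the idea; real risk-scoring tools weigh far more signals than this.

```python
import pandas as pd

def k_anonymity_report(df, quasi_identifiers):
    """Size of each quasi-identifier group; groups of size 1 are unique,
    hence trivially re-identifiable by anyone holding matching outside data."""
    return df.groupby(quasi_identifiers).size()

records = pd.DataFrame({
    "zip3":       ["606", "606", "606", "601"],
    "birth_year": [1958, 1958, 1991, 1972],
    "admit_date": ["2018-03-02", "2018-03-02", "2018-03-02", "2018-05-17"],
})

groups = k_anonymity_report(records, ["zip3", "birth_year", "admit_date"])
k = groups.min()               # the dataset is k-anonymous for this k
unique_rows = (groups == 1).sum()  # records unique on the quasi-identifiers
print(f"k = {k}; {unique_rows} of {len(records)} records are unique and high-risk")
```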

Significant advancements in data science have led to improvements in data privacy technologies and controls for data collaboration. Autonomous, systematic metadata classification and re-identification risk assessment and scoring are two that would have made an immediate difference in this case. Differential privacy and secure multiparty computation are two others.
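As a rough illustration of the differential privacy idea, the toy snippet below releases a count with calibrated Laplace noise, so that any single patient’s presence or absence changes the output only slightly. It is a sketch of the mechanism under simple assumptions, not a substitute for a vetted differential privacy library.

```python
import numpy as np

def dp_count(true_count, epsilon, sensitivity=1.0):
    """Laplace mechanism: add noise with scale = sensitivity / epsilon."""
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

# e.g., "how many patients checked in on 2018-03-02?" (hypothetical query)
print(dp_count(true_count=42, epsilon=0.5))
```

Smaller epsilon values mean more noise and stronger privacy; the right setting is a policy decision, not just an engineering one.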

Privacy automation systems encompassing these technologies are a reality today. Privacy management is often seen as additional overhead for data science projects; that is a mistake. Tactical use of data security tools like encryption and hashing to privacy-protect datasets is also not enough, as the case involving Google and the University of Chicago Medical Center attests.
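To see why hashing alone falls short, consider a low-entropy field like a date of birth: an attacker can simply hash every plausible value and look up the match. The snippet below is a hypothetical illustration of that dictionary-style reversal, with a made-up value standing in for a “protected” field.

```python
import hashlib
from datetime import date, timedelta

def sha256_hex(value):
    return hashlib.sha256(value.encode()).hexdigest()

# A "protected" value as it might appear in a shared dataset.
leaked_hash = sha256_hex("1958-03-02")   # pretend we only ever see this hash

# Enumerate the whole domain: roughly 36,500 possible birth dates in a century.
start = date(1920, 1, 1)
lookup = {
    sha256_hex(str(start + timedelta(days=i))): str(start + timedelta(days=i))
    for i in range(366 * 100)
}

print(lookup.get(leaked_hash))           # recovers "1958-03-02"
```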

As we saw with cybersecurity over the last decade, it took years of data theft and hacks making headlines before organizations implemented advanced cybersecurity and intrusion detection systems. Those solutions are now seen as an essential component of enterprise infrastructure, backed by a board-level commitment to keeping company data safe and the brand untarnished. Boards must reflect on the negative outcomes of lawsuits like this one, in which customers’ identities are compromised and their trust damaged.

Today’s data science projects, without advanced, automated privacy protection, should not pass internal privacy governance and data compliance review. Nor should they use customer data, even anonymized data, until automated privacy risk assessment solutions can accurately reveal the level of re-identification risk, inclusive of inference attacks and the mosaic effect.

With the sensitivity around privacy in data science projects in today’s public discourse, any enterprise not investing in and implementing advanced privacy management systems exposes itself as having no regard for the ethical use of customer data. Harm is then not a matter of if, but when.

Productivity at the Cost of Privacy? WhatsApp Has Been Compromised?

Smart homes are not so smart when it comes to protecting privacy. WhatsApp gets hacked by an Israeli spy firm. Intel notifies customers about security flaws in its chips. New regulations push companies toward better data management. An Australian data breach affects 10 million civilians.

Smart Homes: Not so Smart

Smart homes definitely reduce effort and make life easier, but that convenience comes at a cost: your family’s privacy is put at risk in the trade-off between productivity and safety.

One of the most popular forms of the smart home is the digital assistant, with Google Home and Alexa the major players. These devices continuously listen for “activation” words or phrases, and the conversations they capture are saved on the vendor’s servers. As a result, many scary and embarrassing stories have surfaced, and yes, even from Amazon and Google products.

If consumers do their part and take the necessary security steps, they should be able to enjoy the benefits of their smart home without paying a price. Here are some ways you can secure your smart home:

  • Review and delete your voice history from time to time.
  • Secure your network.
  • Change your wake or activation word or phrase.
  • Delete old recordings.
  • Strengthen your passwords.

Do everything you can to keep your smart home from being vulnerable to attack.

WhatsApp Gets Hacked

WhatsApp, an app used by millions of people worldwide, has been compromised. On Tuesday it emerged that an Israeli spy firm had injected malware into targeted phones to steal data, simply by placing a call. Recipients did not even need to answer, and what’s worse, the call could not be traced in the log. WhatsApp states that only a select number of users were affected, though it does not know the exact figure.

Intel Chip Suffers Security Flaws

In other news, Intel, the world-renowned computer chip maker, has just notified the world about a security flaw that could prove harmful to millions of PCs. Attackers are able to get their hands on any data that a victim’s processor touches. Not scary at all.

New Regulations Call for Better Data Management

With privacy laws such as the GDPR and CCPA in place, businesses now need to prepare for firmer data privacy enforcement.

Every company we interact with, from The Weather Network to IBM, uses our data. “The companies used the data to calibrate advertising campaigns to potential customers’ preferences, a type of personalization 90 percent of consumers say they find appealing,” says Eric Archer-Smith of BETA News. Although that data helps with preferences and marketing, in the wrong hands it could prove dangerous. Companies today must therefore find the right balance between personalization and privacy when collecting consumer data for analysis.

Australian Data Breach Affects 10 Million Civilians

The Office of the Australian Information Commissioner (OAIC) recently reported that over 10 million people were hit in a single Australian data breach. The report did not specify the origin of the breach, only that it occurred between January 1, 2019, and March 31, 2019. Private health was yet again the most affected sector.

See How Companies Are Taking Part in Privacy Awareness Week

It’s Privacy Awareness Week! This year’s theme is that protecting privacy is everyone’s responsibility. Google is trying to fix its privacy blunders, even though experts are not impressed, while Amazon is still making the same blunders as before. Beware, Canada: a rise in data breaches has prompted a significant warning, and Canadian wireless carrier Freedom Mobile has been exposed for leaking the data of 15,000 customers.

Google wants us to know they have changed. They are emphasizing privacy like never before, enhancing existing features and adding new ones.

A lot of their new moves are straight out of Apple’s playbook, such as:

  • On-device machine learning
  • Better in-app privacy controls
  • More control over websites tracking them with cookies
  • Incognito mode on Google Search and Maps

Not to mention, by pricing their hardware lower, they gain an upper hand over Apple’s costlier devices.

However, experts are not very impressed by the announcement of how Google plans to give people more privacy control. Ad-blocker company Ghostery says these changes are more about saving face than saving consumer privacy: marginal improvements that may ignore larger problems with consumer data privacy.

Regardless, Google’s new privacy features put the responsibility on users. The company recently announced Android Q, its latest mobile OS, which ships with 50 privacy and security features, including enhanced location-tracking controls. Additionally, Google users can now set time limits on how long Google retains certain types of information.

While Google is trying to make up for its data sins, Amazon is still making the same mistakes. Amazon Echo’s kid version, the Echo Dot Kids, has been accused of tracking kids’ data without consent. Complaints have been filed with the Federal Trade Commission urging it to investigate: “We urge the FTC to investigate Amazon’s violations of the Children’s Online Privacy Protection Act (COPPA) for the safety and privacy of American children”.

Specifically, in Canada, the BC Office for the Information and Privacy Commissioner and the Better Business Bureau are warning individuals and companies to do a better job protecting their personal data. Last year, online shopping scams reached a whopping 3.5 million across Canada. “People get caught in the excitement to capitalize on a sale, grab that risk-free trial or purchase the last item before it goes out of stock and ends up jeopardizing their privacy.”

Meanwhile, a Freedom Mobile data breach has hit 15,000 customers. Researchers warned the company days before the breach became public, but Freedom responded only after the warnings. Luckily, the company has found no evidence leading it to believe the data has been misused, and it is now “conducting a full forensic investigation to determine the full scope of impact”.

With Privacy Awareness Week upon us, now is a great time to stay informed about the best tools to help your business guard against data breaches and protect privacy.
