Your health records are online, and Amazon wants you to wear Alexa on your face

This week’s news was flooded with stories of sensitive medical information landing on the internet, and perhaps in the wrong hands. Sixteen million patient scans were exposed online, the European Court of Justice ruled that Google does not need to remove links to sensitive information outside the EU, and Amazon released new Alexa products for you to wear everywhere you go.

Over five million patients have had their privacy breached and their private health information exposed online. These documents contain highly sensitive data, like names, birthdays, and in some cases, social security numbers. Worse, the list of compromised medical record systems is rapidly growing, and the data can all be accessed with a standard web browser. In fact, Jackie Singh, a cybersecurity researcher and chief executive of the consulting firm Spyglass Security, reports “[i]t’s not even hacking,” because the data is so easily accessible to the average person (Source).

One of these systems belongs to MobilexUSA, whose exposed records showed patients’ names, dates of birth, doctors, and lists of procedures (Source).

Experts report that this could be a direct violation of HIPAA, and many warn that the potential consequences of this leak are devastating: medical data is highly sensitive and, in the wrong hands, could be used maliciously (Source).

According to Oleg Pianykh, the director of medical analytics at Massachusetts General Hospital’s radiology department, “[m]edical-data security has never been soundly built into the clinical data or devices, and is still largely theoretical and does not exist in practice.” (Source)

Such a statement signals a privacy crisis in the healthcare industry, one that desperately needs fixing. According to Pianykh, the problem is not a lack of regulatory standards, but rather that “medical device makers don’t follow them.” (Source) If that is the case, should we expect HIPAA enforcement to crack down the way GDPR has?

With patients’ privacy up in the air in the US, citizens’ “Right to be Forgotten” in the EU is also being questioned.

The “Right to be Forgotten” states that “personal data must be erased immediately where the data are no longer needed for their original processing purpose, or the data subject has withdrawn [their] consent” (Source). This means that upon request, a data “controller” must erase any personal data by whatever means necessary, whether that is physical destruction or permanently overwriting data with “special software.” (Source)

When this law was codified in the General Data Protection Regulation (GDPR), it was meant to govern data within Europe. Yet France’s CNIL fined Google, an American company, $110,000 in 2016 for refusing to remove private data from search results worldwide. Google argued that the changes should not have to apply to the google.com domain or other non-European sites (Source).

On Tuesday, the European Court of Justice agreed, ruling that Google is under no obligation to extend EU rules beyond European borders by removing links to sensitive personal data (Source). However, the court made a point of noting that Google “must impose new measures to discourage internet users from going outside the EU to find that information.” (Source) This decision sets a precedent for how far a nation’s laws can reach beyond its borders when it comes to digital data.

While the EU has a firm stance on the right to be forgotten, Amazon makes clear that you can “automatically delete [your] voice data”… every three to eighteen months (Source). The lack of immediate erasure is potentially troublesome for those concerned with their privacy, especially alongside the new product launch, which will move Alexa out of your home and onto your body.

On Wednesday, Amazon launched Alexa earbuds (Echo Buds), glasses (Echo Frames), and rings (Echo Loop). The earbuds are available on the marketplace, but the latter two are experimental and, for the time being, available by invitation only (Source).

With these products, you will be able to access Alexa support wherever you are, and in the case of the Echo Buds, harness the noise-reduction technology of Bose for only USD $130 (Source). However, while these products promise to make your life more convenient, they will also let Amazon monitor your daily routines, behaviour, quirks, and more.

Amazon has specified that its goal is to make Alexa “ubiquitous” and “ambient” by spreading it everywhere, including our homes, appliances, cars, and now, our bodies. Yet even as it opens up about its strategy for lifestyle dominance, Amazon claims to prioritize privacy, having been the first tech giant to allow users to opt out of their voice data being transcribed and listened to by employees. Despite this, it is clear that “Alexa’s ambition and a truly privacy-centric customer experience do not go hand in hand.” (Source)

With Amazon spreading into wearables, Google winning the “Right to be Forgotten” case, and patient records being exposed online, this week is wrapping up to be a black mark on user privacy. Stay tuned for our next weekly news blog to learn about how things shape up. 



The Business Incentives to Automate Privacy Compliance Under CCPA

On January 1, 2020, the California Consumer Privacy Act (CCPA) comes into effect. The CCPA is a sweeping piece of legislation, aimed at protecting the personal information of California residents. It is going to force businesses to make major changes to how they handle their data. 

Many of the CCPA’s regulations are built upon the bill’s conception of “personal information.” The CCPA defines personal information as “any information that identifies, relates to, describes, is capable of being associated with, or could be reasonably linked, directly or indirectly, with a particular consumer or household.” In the CCPA era, how businesses handle this personal information will determine whether or not they stay compliant and stay successful.

The Wild West days of monetizing secondary data are over

CCPA compliance is multifaceted. But here is one of the biggest challenges it creates for businesses: the need to implement organizational and technical controls to demonstrate that data used for secondary purposes – such as data science, analytics and monetization – is truly de-personalized. 

Why? Because the CCPA says that data classified as personal information can’t be used for secondary purposes unless each user is notified and given the opportunity to opt out of every different instance of use – a risky undertaking for any business.

In effect, the real challenge here is that the cat is already out of the bag, and businesses need a way to catch it, and put it back in.

This is because, until now, many data science, analytics, AI and ML projects have had open access to consumer data. The understandable appetite for greater business and customer insight has led to a Wild West approach, where monetization teams have enjoyed limitless access to unprotected customer data. This is exactly the area the CCPA wants to bring under control: the bill will force organizations to establish controls that protect consumer privacy rights.

These controls will require minimizing or removing personal information (PI) to reduce the risk of re-identification. This is the only way to put the cat back into the bag, and the only way not to violate the CCPA.

But: organizations are desperate for their data to stay useful and monetizable. How can this balance be achieved?

How can you de-identify data while retaining its analytical value?

Governance processes and practices need to show CCPA compliance. But various forms of analytics, data science and data monetization are incredibly valuable to enterprises, powering precious business and customer insights. 

If you apply encryption-based data security techniques like hashing, you will reduce or remove the analytical value of the data for data science. No good; you lose the value.
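To see the difference concretely, consider a quasi-identifier like a ZIP code. Below is a minimal Python sketch; the sample values and the three-digit truncation rule are our own illustrative assumptions, not a method prescribed by the CCPA:

```python
import hashlib

zip_codes = ["02139", "02142", "94103"]

# Hashing is irreversible, but it destroys all analytical structure:
# you can no longer group records by region or bin them by area.
hashed = [hashlib.sha256(z.encode()).hexdigest() for z in zip_codes]

# Generalization keeps coarse structure: individual ZIP codes are no
# longer distinguishable, but regional analysis still works.
generalized = [z[:3] + "**" for z in zip_codes]

print(hashed[0][:16])  # hex gibberish, useless for data science
print(generalized)     # ['021**', '021**', '941**'], coarser but useful
```

The hashed column protects privacy but supports no analysis; the generalized column trades a little precision for retained analytical value.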

You want to de-identify the data, remove the risk of re-identification, and meet the legal specifications of de-identified data under the CCPA. But you want to do this while preserving analytical value.

Both sides of the coin carry a financial imperative. The motivation for organizations to ensure their customer data is properly de-identified is critical, as the overheads and restrictions the CCPA places on the secondary use of personal data are prohibitive. Get it wrong, and what you believe to be de-identified data will in fact be in violation of the CCPA.

This means you will be on the wrong side of the law. And under CCPA, intentional violations can bring civil penalties of up to $7,500 per violation, and consumer lawsuits can result in statutory damages of up to $750 per consumer per incident.

However, keeping your secondary data usable is financially essential, as correctly de-identified data meeting the CCPA specification will be considered outside the scope of the CCPA. This means you can continue to use it for valuable analytics, data science and data monetization.

But you can only do this if the de-identification techniques you have used haven’t rendered the data unusable.

This is the central challenge CCPA creates: How can I de-identify my data and meet the legal specifications outlined under CCPA, but still leverage my data for important organizational initiatives? 

The CCPA creates the need for Privacy Automation

To be clear: the CCPA does not restrict an organization’s ability to collect, use, retain, sell, or disclose a consumer’s information that is de-identified or aggregated. However, the CCPA establishes a very high bar for claiming data is de-identified or aggregated. 

In practice, the only way to meet this bar will be through Privacy Automation: state-of-the-art tools that assess the risk of re-identification; advanced privacy-protection actions that retain the analytical value of datasets; and audit reporting. These and other techniques make up what is being termed ‘Privacy by Design’ and ‘Privacy by Default.’

In the CCPA era, a manual, ‘two eyes’ approach to assessing the risk of re-identification won’t cut it. The scale and the legal significance of proving privacy compliance under the CCPA are too great.

Effective de-identification can be broken into three focus areas. You must:

  • Use a ‘state-of-the-art’ de-identification method. You need a process whereby consumer and personal data (as defined under the CCPA) is transformed so that it becomes de-personalized. This practice is at the heart of meeting, demonstrating and defending CCPA privacy compliance, and it has to rely on cutting-edge privacy-protection tools that retain the analytical value of the data for data science, rather than data encryption tools that break it.
  • Assess the likelihood of re-identification: Research published in 2000 showed that 87% of U.S. citizens can be re-identified from just their gender, ZIP code and date of birth. De-identifying direct identifiers alone still leaves an individual at risk of being identified from other information, whether inside or outside the dataset. Demonstrating the risk of re-identification using automated ‘state-of-the-art’ tools must be prioritized, as organizations can no longer depend on manual processes (a toy illustration of this kind of measurement follows this list).
  • Implement Segregation of Duties: Companies need to ensure that customer data is only shared with departments and individuals who have a legitimate purpose for receiving the consumer’s personal information. They need to implement appropriate controls so that segregation of duties exists, and so that data required for secondary purposes is truly de-personalized and thus CCPA-compliant.
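To make the idea of automated risk assessment concrete, here is a toy Python sketch of the measurement such tools perform at scale; the records are invented, and real state-of-the-art tools use far more sophisticated statistical models than this simple k-anonymity count:

```python
from collections import Counter

# Toy dataset: each record is (gender, ZIP code, birth year), the kind
# of quasi-identifier combination behind the 87% finding above.
records = [
    ("F", "02139", 1985),
    ("F", "02139", 1985),
    ("M", "02139", 1985),
    ("F", "94103", 1990),
]

# A dataset is k-anonymous if every quasi-identifier combination is
# shared by at least k records; combinations seen only once are unique
# and therefore at high risk of re-identification.
groups = Counter(records)
k = min(groups.values())
at_risk = sum(n for n in groups.values() if n == 1)

print(f"k-anonymity of dataset: {k}")                        # 1
print(f"records at high risk: {at_risk} of {len(records)}")  # 2 of 4
```

At enterprise scale, with dozens of quasi-identifiers and external datasets to cross-reference, this is exactly the measurement that manual review cannot keep up with.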

Rather than relying on manual review, organizations must invest in automation: tools that instantly assess the risk of re-identification, transform data in ways that retain its much-needed analytical value for data science, and anchor a privacy governance framework that can both demonstrate and defend privacy compliance.

Post-CCPA, almost all privacy programs will require updating to accommodate the law’s requirements and to take advantage of these new automated, state-of-the-art tools and systems: systems that make compliance watertight while also enabling data science by preserving the insight value of datasets.

CryptoNumerics’ Privacy Automation helps resolve this CCPA dilemma by:

  • Promoting a better understanding of how a dataset can be de-identified while still preserving its analytical value for data science.
  • Leveraging systems-based technology to assess the risks of re-identifying individuals in datasets.
  • Applying modern anonymization techniques, such as generalization hierarchies and differential privacy, to demonstrate that datasets are fully anonymized (a minimal sketch of the differential-privacy idea follows this list).
  • Building Privacy Automation into the heart of your data compliance plans and strategy.
  • Making privacy by design and by default, implemented through data protection by design and by default, the core of your compliance strategy, and defending that compliance with PIAs and audit reporting.
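As one example of the protection actions named above, differential privacy adds calibrated random noise to aggregate results so that no individual’s presence in a dataset can be inferred from them. Here is a minimal sketch of the standard Laplace mechanism; the epsilon values and the counting query are illustrative assumptions, not a description of CryptoNumerics’ product:

```python
import random

def laplace_count(true_count: int, epsilon: float) -> float:
    # A counting query has sensitivity 1: adding or removing one person
    # changes the result by at most 1, so differential privacy calls for
    # noise drawn from Laplace(0, 1/epsilon). The difference of two
    # exponential samples with rate epsilon has exactly that distribution.
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

# Smaller epsilon means more noise: stronger privacy, lower accuracy.
print(laplace_count(1000, epsilon=0.1))   # e.g. 993.7
print(laplace_count(1000, epsilon=10.0))  # e.g. 1000.2
```

The released count stays useful for analytics, while any single individual’s contribution is hidden in the noise.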

For more information, read our blog on the Power of AI and Big Data.



CryptoNumerics Named Strata Data Top 3 Disruptive Finalist

TORONTO, September 24, 2019— Cutting-edge techniques and technology: we’ve been named a Strata Data Top 3 “Disruptive Startup” finalist!

CryptoNumerics is proud to have been named a top 3 “Disruptive Startup” finalist by Strata. The Strata Data Awards recognize the most innovative startups, leaders, and data science projects from around the world, and we are honoured to be amongst them. Being recognized alongside some of the most innovative and advanced new ideas validates the increasing importance of privacy in the world of big data.

Our team will be at the Strata Data Conference in New York from September 24-26 to engage with industry leaders and showcase the revolutionary business impact of our solutions. Stop by our booth (P21) to discuss how we can help automate your big data privacy protection and investigate the intersections between cutting-edge data science and privacy.

Please text “CRYPTO” to 22333 to vote for us and show your support!

About CryptoNumerics:

CryptoNumerics is where data privacy meets data science. The company creates enterprise-class software solutions, including privacy automation and virtual data collaboration, that Fortune 1000 enterprises are deploying to comply with privacy regulations such as the GDPR, CCPA, and PIPEDA, while still driving data science and innovation projects to obtain greater business and customer insights. CryptoNumerics’ privacy automation reduces corporate liability and protects brand value from privacy non-compliance exposures.



CryptoNumerics Partners with TrustArc on Privacy Insight Webinar

We’re excited to partner up with TrustArc on their Privacy Insight Series on Thursday, September 26th at 12pm ET to talk about “Leveraging the Power of Automated Intelligence for Privacy Management”! 

With the increasing prevalence of privacy technology, how can the privacy industry leverage the benefits of artificial intelligence and machine learning to drive efficiencies in privacy program management? Many papers have been written on managing the potential privacy issues of automated decision-making, but far fewer on how the profession can utilize the benefits of technology to automate and simplify privacy program management.

Privacy tools are starting to leverage technology to incorporate powerful algorithms to automate repetitive, time-consuming tasks. Automation can generate significant cost and time savings, increase quality, and free up the privacy office’s limited resources to focus on more substantive and strategic work. This session will bring together expert panelists who can share examples of leveraging intelligence within a wide variety of privacy management functions.


Key takeaways from this webinar:
  • Understand the difference between artificial intelligence, machine learning, intelligent systems and algorithms
  • Hear examples of the benefits of using intelligence to manage privacy compliance
  • Understand how to incorporate intelligence into your internal program and/or client programs to improve efficiencies

Register Now!

Can’t make it? Register anyway – TrustArc will automatically send you an email with both the slides and recording after the webinar.

To read more privacy articles, click here.

This content was originally posted on TrustArc’s website. Click here to view the original post.



What is your data worth?

How much compensation would you require to give a company complete access to your data? New studies suggest that putting a price tag on data may be the wrong way to approach fines for noncompliance. Meanwhile, 51 CEOs have written an open letter to Congress to request a federal consumer data privacy law, and the Internet Association has joined them in their campaign. At the same time, Facebook has been caught using Bluetooth in the background to track users and drive up profits.

Would you want your friends to know every facet of your digital footprint? How about your:

  • Location
  • Visited sites
  • Searched illnesses
  • Devices connected to the internet
  • Content read
  • Religious views
  • Political views
  • Photos
  • Purchasing habits


How about strangers? No? We didn’t think so. Then the question remains: why are we sharing non-anonymized or improperly anonymized copies of our personal information with companies?

Today, many individuals regularly and unknowingly share their data with companies that collect it for profit. This data is used to monitor behaviour and profile you for targeted advertising that makes big data and tech companies, like Facebook, $30 per year in revenue per North American user (Source). Due to the profitability of data mining and the increasing number of nine-figure fines for data breaches, researchers have become fascinated by the economics of privacy.

A 2019 study in the Journal of Consumer Policy questioned how users value their data. In the study, individuals stated they would only be willing to pay $5/month to protect personal data. While the low price tag may sound like privacy is a low priority, it is more likely that individuals believe their privacy should be a given, rather than something they have to pay to receive. This theory is corroborated by the fact that when the question’s ownership was reversed, asking how much users would accept in exchange for full access to their data, the median response was $80/month (Source).

While this study demonstrates that the majority place a clear value on their data, some individuals attributed a much higher cost and others said they would share data for free. Thus, the study concluded that “both willingness to pay and willingness to accept measures are highly unreliable guides to the welfare effects of retaining or giving up data privacy.” (Source)

With traditional measures of economic value thus called into question as a basis for fines for data breaches and illegal data harvesting, other influential players in data privacy research were asked how to hold corporations accountable to privacy standards. Rebecca Kelly Slaughter, Federal Trade Commission (FTC) Commissioner, stated that “injury to the public can be difficult to quantify in monetary terms in the case of privacy violations.” (Source)

Rohit Chopra, a fellow FTC commissioner, also explained that current levels of monetary fines are not a strong deterrent for companies like Facebook, as their business model will remain untouched. As a result, the loss could be recouped through the further monetization of personal data. Consequently, both commissioners suggested that holding Facebook executives personally liable would be a stronger approach (Source).

If no price can equate to the value of personal data, and fines do not deter prolific companies like Facebook, should we continue asking what data is worth? Alessandro Acquisti, of Carnegie Mellon University, suggests that an alternative is to view data privacy as a human right. This model of thinking poses an interesting line of inquiry for both big data players and lawmakers, especially as federal data privacy legislation gains popularity in the US (Source).

On September 10, 51 top CEOs, members of Business Roundtable, an industry lobbying organization, sent an open letter to Congress to request a US federal data privacy law that would supersede state-level privacy laws to simplify product design, compliance, and data management. Amongst the CEOs were the executives from Amazon, IBM, Salesforce, Johnson & Johnson, Walmart, and Visa.  

Throughout the letter, the giants blamed the state-level patchwork of privacy regulations for the disorder of consumer privacy in the United States. Today, companies face a growing number of state and jurisdictional laws that uphold varying standards with which organizations must comply. This, the companies argue, protects citizens inefficiently, whereas a federal consumer data privacy law would provide reliable and consistent protections for Americans.

The letter goes so far as to offer a proposed Framework for Consumer Privacy Legislation that the CEOs believe should be the basis for future legislation. This framework states that data privacy law should…

  1. Champion Consumer Privacy and Promote Accountability
  2. Foster Innovation and Competitiveness
  3. Harmonize Regulations
  4. Achieve Global Interoperability

While a unified and consistent method of holding American companies accountable could benefit users, many leading privacy advocates, and even some tech giants, have questioned the CEOs’ intentions, regarding the proposal as a method “to aggregate any privacy lawmaking under one roof, where lobby groups can water-down any meaningful user protections that may impact bottom lines.” (Source)

This pattern of a disingenuous push for a federal privacy law continued last week as the Internet Association (IA), a trade group funded by the largest tech companies worldwide, launched a campaign requesting the same. Its members are largely companies that profit from the monetization of consumer data, including Google, Microsoft, Facebook, Amazon, and Uber (Source).

In an Electronic Frontier Foundation (EFF) article, this campaign was referred to as a “disingenuous ploy to undermine real progress on privacy being made around the country at the state level.” (Source) Should this occur, the federal law would supersede state laws, like The Illinois Biometric Information Privacy Act (BIPA) that makes it illegal to collect biometric data without opt-in consent, and the California Consumer Privacy Act (CCPA) which will give state residents the right to access and opt-out of the sale of their personal data (Source). 

In the last quarter alone, the IA has spent close to USD $176,000 trying, without success, to weaken the CCPA before it takes effect. Now, in conjunction with Business Roundtable and TechNet, it has called for a “weak national ‘privacy’ law that will preempt stronger state laws.” (Source)

One of the companies campaigning to develop a national standard is Facebook, who is caught up, yet again, in a data privacy scandal.

Apple’s new iOS 13 update looks to rework the smartphone operating system to prioritize privacy for users (Source). Recent “sneak peeks” showed that it will notify users of background activity from third-party apps’ surveillance infrastructure, which is used to generate profit by profiling individuals beyond their in-app activity. The culprit highlighted, unsurprisingly, is Facebook, which has been caught using Bluetooth to track nearby users.

While this may not seem like a big deal, by “[m]atching Bluetooth (and wi-fi) IDs that share physical location, [Facebook could] supplement the social graph it gleans by data-mining user-to-user activity on its platform.” (Source) Through this, Facebook can track not just your location, but the nature of your relationships with others. In pairing Bluetooth-gathered interpersonal interactions with social tracking (likes, followers, posts, messaging), Facebook can escalate its ability to monitor and predict human behaviour.

While you can opt-out of location services on Facebook, this means you cannot use all aspects of the app. For instance, Facebook Dating requires location services to be enabled, a clause that takes away a user’s ability to make a meaningful choice about maintaining their privacy (Source).

In notifying users about apps using their data in the background, iOS 13 looks to bring back a measure of control to the user by making them aware of potential malicious actions or breaches of privacy.

In the wake of this, Facebook’s reaction has tested the bounds of reality. In an attempt to get out of the hot seat, they have rebranded the new iOS notifications as “reminders” (Source) and, according to Forbes, un-ironically informed users “that if they protect their privacy it might have an adverse effect on Facebook’s ability to target ads and monetize user data.” (Source) At the same time, Facebook PR has also written that “We’ll continue to make it easier for you to control how and when you share your location,” as if to take credit for Apple’s new product development (Source).

With such comments, it is clear that in the upcoming months, we will see how much individuals value their privacy and convenience. Between the debate over the value of data, who should govern consumer privacy rights, and another privacy breach by Facebook, the relevance of the data privacy conversation is evident. To stay up to date, sign up for our monthly newsletter and keep an eye out for our weekly blogs on privacy news.
