Rewarded for sharing your data? Sign me up!

Companies are starting to pay users for their data in an effort to be more ethical. A major Bluetooth security flaw has been detected that could prove harmful to millions. Blockchain’s future looks bright as privacy-preserving technology booms. Canada’s upcoming federal election is being ‘watched’ over its history of ‘watching’ the public.

Rewarded for sharing your data? Sign me up!

Drop Technologies has secured USD$44 million in funding to grow its technology-based alternative to traditional customer loyalty programs. With over three million users already signed up and 300 brands such as Expedia and Postmates on its platform, the company is headed in the right direction. 

Given that Facebook and other tech giants are monetizing data without user permission, getting paid for it doesn’t seem like a bad idea after all. “I’m a Facebook user and an Instagram user, and these guys are just monetizing my data left and right, without much transparency,” said Onsi Sawiris, a managing partner at New York’s HOF Capital. “At least if I’m signing up for Drop, I know that if they’re using my data I will get something in return, and it’s very clear” (Source).

This alternative to rewards programs tracks your spending across its 300+ partner brands and lets you earn points that you can redeem at companies such as Starbucks or Uber Eats. As an alternative to credit card rewards, it could offer consumers extra savings on their purchases. So don’t drop it till you try it!

Bluetooth proving to be a potential data breach vulnerability 

Researchers have discovered a flaw that leaves millions of Bluetooth users vulnerable to data breaches. The flaw lets attackers within a certain range interfere, undetected, while two devices are trying to connect. From music to conversations to data entered through a Bluetooth device, anything could be at risk. “Upon checking more than 14 Bluetooth chips from popular manufacturers such as Qualcomm, Apple, and Intel, researchers discovered that all the tested devices are vulnerable to attacks” (Source). 

Fortunately, some companies such as Apple and Intel have already implemented security upgrades on their devices. Users are also advised to keep their security, software, and firmware updated at all times. 

Get ready for blockchain advancements like never before

For the past decade, blockchain has been used to build an ecosystem where cryptocurrencies and peer-to-peer transactions are just a few of the many use cases (Source).

Traditionally, data is shared across centralized networks, leaving systems vulnerable to attack. Because blockchains are decentralized, however, the threat of a single point of failure across the distributed network is eliminated. 

But as more and more companies turn to blockchain for more efficient data sharing and easier data transfers, privacy is often overlooked.

In most public blockchains today, transactions are visible to every node of the network. This transparency comes at a cost: it raises privacy concerns whenever the data involved is sensitive. With digital transformation happening all around us, privacy protection cannot be ignored.

To address privacy, many blockchain companies are adding privacy-preserving mechanisms to their infrastructure, from zero-knowledge proofs to cryptographic protocols such as secure Multi-Party Computation (MPC). These mechanisms keep data hidden as it is shared and reveal only the specific elements needed for a given task (Source).
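To make the idea concrete, here is a minimal sketch of one MPC building block, additive secret sharing. The participants, counts, and modulus are purely illustrative: each party splits its value into random shares, so no party ever sees another party’s raw input, yet the correct total can still be computed.

```python
# Minimal additive secret-sharing sketch (illustrative only, not a hardened
# MPC implementation). Three hypothetical hospitals learn a combined case
# count without revealing their individual counts to one another.
import random

MODULUS = 2**61 - 1  # shares live in a large finite field

def make_shares(secret: int, n_parties: int) -> list[int]:
    """Split a value into n_parties random shares that sum to the secret (mod MODULUS)."""
    shares = [random.randrange(MODULUS) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % MODULUS)
    return shares

private_counts = {"hospital_a": 120, "hospital_b": 87, "hospital_c": 43}

# Each party splits its count and sends one share to every participant.
distributed = [make_shares(v, 3) for v in private_counts.values()]

# Each participant adds up the shares it holds; only the final total is revealed.
partial_sums = [sum(column) % MODULUS for column in zip(*distributed)]
total = sum(partial_sums) % MODULUS
print(total)  # 250 -- the aggregate, with no individual count disclosed
```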

Cost efficiencies and a better understanding of consumer needs are just a few of the advantages these privacy-preserving mechanisms introduce. As data and privacy go hand in hand in the future, equity and trust will be the keys to unlocking new possibilities that enhance life as we know it (Source).

Upcoming Canadian election could turn into a surveillance problem

Once again, a Canadian federal election is raising concerns about interference and disruption through the misuse of personal data. In the past, political parties have been known to use their power to influence populations who are unaware of how their data is being used. 

Data has played a major role in past elections, and this could become a surveillance issue: experts who study surveillance say that harnessing data has been the key to electoral success. “Politicians the world over now believe they can win elections if they just have better, more refined and more accurate data on the electorate” (Source).

A related issue is a lack of transparency between voters and electoral candidates. “There is a divide between how little is publicly known about what actually goes on in platform businesses that create online networks, like Facebook or Twitter, and what supporters of proper democratic practices argue should be known” (Source).

Officials overseeing the upcoming election should pay close attention to the public’s personal data and how it is being used.


Avoid Data Breaches and Save Your Company Money

Tips on how to avoid the privacy risks and breaches that big companies face today. How much data breaches cost in 2019. Why consumers are shying away from sharing their data. An airline phishing scam could prove harmful in the long run.

Stay Ahead of the Privacy Game

The Equifax data breach is another wake-up call for all software companies. There is so much going on today with regard to data exposure, fraud, and threats. Especially with the new laws being proposed, companies should take the necessary steps to avoid penalties and breaches. Here are some ways you can stay ahead of the privacy game. 

  1. Get your own security hackers – Many companies have their own cybersecurity team to test for failures, threats, and other weaknesses. Companies also hire outside hackers to uncover any gaps in the company’s privacy or security practices. “Companies can also host private or public “bug bounty” competitions where hackers are rewarded for detecting vulnerabilities” (Source).
  2. Establish trust with certificates of compliance – Earn your customers’ trust by achieving certificates of compliance. The baseline certification is ISO 27001. If your company offers cloud services, you can also obtain a SOC 2 Type II attestation.
  3. Limit the data you ask for – Some companies ask for too much information in hopes of making easy money, for example when a user is signing up for a free trial. Why ask for a credit card number when you are offering a free trial? If users love the product or service, they will offer to pay for the full version themselves. Have faith in your product or service.
  4. Keep data only as long as needed – Holding on to data long after you need it is simply a risk for your company. Think about it: as a consumer yourself, how would you react if your personal data were compromised because of a trial you signed up for years ago? (Source)

How much does a data breach cost today?

According to a 2019 IBM + Ponemon Institute report, the average data breach costs a company approximately USD$1.25 million to USD$8.19 million, depending on the country and industry.

Each breached record costs companies an average of USD$148, according to the report, which surveyed 507 organizations across 16 regions and 17 industries. The U.S. takes first place with the highest average breach cost, at USD$8.19 million. Healthcare is the most expensive industry for data breaches, coming in at an average of USD$6.45 million. 

However, the report isn’t all negative, as it also provides tips to improve your data privacy. You can reduce the cost of a potential data breach by up to USD$720,000 through simple mitigation steps such as forming an incident response team or having encryption in place (Source).

Consumers more and more hesitant to share their data

Marketers and data scientists everywhere – beware. A survey of 1,000 Americans conducted by the Advertising Research Foundation indicates that consumers’ willingness to share data with companies has dropped drastically since last year. “I think the industry basically really needs to communicate the benefits to the consumer of more relevant advertising,” said ARF Chief Research Officer Paul Donato. It is important to remember that not all consumers will happily give up their data for better-personalized advertisements (Source).

Air New Zealand breach could pose long-term effects

A phishing scam targeting Air New Zealand earlier this week has caused public alarm. The data breach exposed about 112,000 Air New Zealand Airpoints customers to long-term privacy concerns. 

Victims received emails requesting that they disclose personal information, and responded with details such as passport and credit card numbers. 

“The problem is, the moment things are out there, then they can be used as a means to gain further information,” said Dr. Panos Patros, a specialist in cybersecurity at the University of Waikato. “Now they have something of you so then they can use it in another attack or to confuse someone else” (Source).

Good practice in situations like this is to change your passwords regularly and monitor your credit card statements. Refrain from posting common security-question answers, such as your first school or your first pet’s name, on social media. Additionally, delete suspicious emails immediately without opening them (Source). 


Facial Recognition Technology is Shaking Up the States

Facial recognition technology is shaking up the States

Many states in America are employing facial recognition devices at borders to screen travelers. However, some jurisdictions, including San Francisco and cities in Massachusetts, have banned the use of these devices, and the American Civil Liberties Union (ACLU) is pushing for a nationwide ban. 

It is still unclear how the confidential data gathered by the facial recognition devices will be used. Could it be shared with other branches of the government, such as ICE? 

ICE, or Immigration and Customs Enforcement, has been in the public eye for some time now for its arrests of undocumented workers and immigration offenders. 

“Any time in the last three to four years that any data collection has come up, immigrants’ rights … have certainly been part of the argument,” says Brian Hofer, who is part of Oakland’s Privacy Advisory Commission. “Any data collected is going to be at risk when [ICE is] on a warpath, looking for anything they can do to arrest people. We’re definitely trying to minimize that exposure”.

This unregulated data is what is helping ICE locate and monitor undocumented people violating laws (Source).

Now Microsoft is listening to your Skype calls

A new day, a new privacy scandal. This week, Microsoft and Skype employees were revealed to be reviewing real consumers’ video chats to check the quality of their software and its translations. 

The problem is that, like most tech companies, Microsoft is keeping its customers in the dark: it has not told consumers that this happens, though the company claims to have users’ permission. 

“I recommend users refrain from revealing any identifying information while using Skype Translation, and Cortana. Unless you identify yourself in the recording, there’s almost no way for a human analyst to figure out who you are,” says privacy advocate Paul Bischoff (Source).

Essentially, Alexa, Siri, Google Home, and Skype are listening to your conversations. Yet instead of avoiding these products, we keep compromising our privacy for convenience and efficiency. 

Canadians want more healthcare tech, regardless of privacy risks

New studies indicate that Canadians are open to a future where healthcare is further enhanced with technology, despite privacy concerns. 

The advantages of these innovations include reduced medical errors, reduced data loss, better-informed patients, and much more. 84% of respondents wanted to access their health data on an electronic platform, as opposed to hard copy files. 

Dr. Gigi Osler, president of the Canadian Medical Association, states, “We’ve got hospitals that still rely on pagers and fax machines, so the message is clear that Canada’s health system needs an upgrade and it’s time to modernize”. 

Furthermore, most respondents look forward to the possibility of online doctor visits, believing that treatment could be faster and more convenient (Source).

After all, if we bank, shop, read, watch movies and socialize online, why can’t we get treated online? 


How to Decode a Privacy Policy

How to Decode a Privacy Policy

91% of Americans skip reading privacy policies before downloading apps. It is no secret that people and businesses take advantage of that, given that there’s a new app scandal, data breach, or hack every day. Take, for example, the FaceApp fiasco from last month.

In its terms of use, FaceApp clearly states the following:

 “You grant FaceApp a perpetual, irrevocable, nonexclusive, royalty-free, worldwide, fully-paid, transferable sub-licensable license to use, reproduce, modify, adapt, publish, translate, create derivative works from, distribute, publicly perform and display your User Content and any name, username or likeness provided in connection with your User Content in all media formats and channels now known or later developed, without compensation to you. When you post or otherwise share User Content on or through our Services, you understand that your User Content and any associated information (such as your [username], location or profile photo) will be visible to the public” (Source).

These documents deserve more attention than they get, since they disclose legal information about your data: what the company will do with it, how it will be used, and with whom it will be shared. 

So let’s look at the most efficient way to read through these excruciating documents: search for specific terms with a keyword or key-phrase search. The following terms are a great starting point (a short script after the list shows one way to automate the search): 

  • Third parties
  • Except
  • Retain
  • Opt-out
  • Delete
  • With the exception of
  • Store/storage
  • Rights 
  • Public 
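
For readers who would rather not scroll through a policy manually, below is a minimal Python sketch that scans a saved policy for the terms above and prints each match with some surrounding context. The file name privacy_policy.txt is a hypothetical placeholder.

```python
# Scan a privacy policy for red-flag terms and show each match in context.
# Illustrative sketch: the file name and keyword list can be adjusted freely.
import re

KEYWORDS = ["third parties", "except", "retain", "opt-out", "delete",
            "with the exception of", "store", "storage", "rights", "public"]

with open("privacy_policy.txt", encoding="utf-8") as f:
    policy = f.read()

for term in KEYWORDS:
    for match in re.finditer(re.escape(term), policy, flags=re.IGNORECASE):
        start = max(match.start() - 60, 0)
        snippet = policy[start:match.end() + 60].replace("\n", " ")
        print(f"[{term}] ...{snippet}...")
```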

“All consumers must understand the threats, their rights, and what companies are asking you to agree to in return for downloading any app,” says Adam Levin, Founder of CyberScout. “We’re living in an instant-gratification society, where people are more willing to agree to something because they want it right now. But this usually comes at a price” (Source).

New York Passes Data Breach Law

New York recently passed a law known as the SHIELD Act, or the Stop Hacks and Improve Electronic Data Security Act. Any business that collects the personal data of New York residents must comply with it. Among other things, the act: 

  • requires notification to affected consumers when there is a security breach,
  • broadens the scope of covered information, 
  • expands the definition of what a data breach means, 
  • and extends the notification requirement to any entity with the private information of a New York resident (Source)

Why Apple Won’t Let You Delete Siri Recordings

Apple claims it protects its users’ privacy in a way that also makes it impossible to find, and therefore delete, their specific recordings. “Apple’s Siri recordings are given a random identifier each time the voice assistant is activated. That practice means Apple can’t find your specific voice recordings. It also means voice recordings can’t be traced back to a specific account or device” (Source).

After it was reported that contractors were listening to private Siri conversations, including doctor discussions and intimate encounters, Apple needed to change its privacy policies. 

Siri works differently from its rivals because Google Assistant and Alexa data is tied directly to a user’s account for personalization and customer service purposes. Apple, by contrast, does not rely heavily on ad revenue and customer personalization the way its rivals do – it relies on its hardware products and services.

LAPD Data Breach Exposes 2,500 Officers’ Data

The personally identifiable information (PII) of about 17,500 LAPD applicants and 2,500 officers has been stolen in a recent data breach, with information such as names, addresses, dates of birth, and employee IDs compromised.

LAPD and the city are working together to understand the severity and impact of the breach. 

“We are also taking steps to ensure the department’s data is protected from any further intrusions,” the LAPD said. “The employees and individuals who may have been affected by this incident have been notified, and we will continue to update them as we progress through this investigation” (Source).


Capital One: An Expensive Lesson to Learn

As part of their business practices, organizations are uploading private customer information to the cloud. However, focusing only on how secure that data is, and not on how private it is, is a mistake.

Capital One’s recent data breach proves that organizations need to be more conscious and proactive about their data protection efforts to prevent potential privacy exposure. Organizations have an obligation to ensure their customers’ data is fully privacy-protected before it is uploaded to the cloud. This doesn’t just mean removing or encrypting client names, IDs, and the like; it also entails understanding the risks of re-identification and applying as many privacy-protecting techniques as needed.

Capital One’s USD$150 Million Mistake

This month, one of the United States’ largest credit card issuers, Capital One, publicly disclosed a massive data breach affecting over 106 million people. Full names, addresses, postal codes, phone numbers, email addresses, dates of birth, SINs/SSNs, credit scores, bank balances, and income amounts were compromised (Source).

Former AWS systems engineer Paige Thompson was arrested for computer fraud and abuse after obtaining unauthorized access to Capital One customer data and credit card applications (Source). “Thompson accessed the Capital One data through exploiting a ‘misconfiguration’ of a firewall on a web application, allowing her to determine where the information was stored,” F.B.I. officials stated. “These systems are very complex and very granular. People make mistakes” (Source).

To make amends, Capital One is providing affected customers with free credit monitoring and identity theft insurance. It will also notify customers whose data has been compromised (Source). 

Unfortunately, the company expects the breach to cost about USD$150 million, driven by customer notifications, credit monitoring, technology costs, and legal support.

How the breach could have been avoided

Simply encrypting data isn’t enough because Thompson was able to exploit a security system vulnerability and decrypt the data (Source). 

Organizations should apply as many privacy-protecting techniques as possible to their dataset to minimize risks of customer re-identification in case of a data breach.

One way data can be privacy-protected to reduce the risk of re-identification is to anonymize it. A leading technique for anonymization is differential privacy, which provides mathematical guarantees that the results of an analysis do not reveal whether any particular individual is present in the dataset. 
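As an illustration, here is a minimal sketch of the Laplace mechanism, the textbook way to achieve ε-differential privacy for a simple counting query. The count and ε value are illustrative.

```python
# Minimal Laplace-mechanism sketch for an epsilon-differentially-private count.
# Illustrative only: real deployments also need careful privacy-budget accounting.
import numpy as np

def dp_count(true_count: int, epsilon: float) -> float:
    """Release a count with Laplace noise scaled to sensitivity / epsilon."""
    sensitivity = 1.0  # adding or removing one person changes a count by at most 1
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

print(dp_count(true_count=10_642, epsilon=0.5))  # e.g. 10639.7 -- close, but deniable
```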

A second way to reduce the risk of re-identification is to combine pseudonymization of direct identifiers with generalization and suppression of indirect identifiers. Optimal k-anonymity is a privacy technique that generalizes and suppresses data so that any individual’s record cannot be distinguished from at least k-1 other records.
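Below is a minimal sketch of that idea: quasi-identifiers are generalized (coarse ZIP prefixes, ten-year age bands) and any record whose group is still smaller than k is suppressed. The column names and records are illustrative, and real k-anonymization tools search for an optimal generalization rather than applying a fixed one.

```python
# Generalize and suppress quasi-identifiers so every retained record shares its
# quasi-identifier values with at least k-1 others. Illustrative data only.
from collections import Counter

records = [
    {"zip": "10025", "age": 34, "diagnosis": "flu"},
    {"zip": "10027", "age": 37, "diagnosis": "asthma"},
    {"zip": "10301", "age": 52, "diagnosis": "diabetes"},
    {"zip": "10304", "age": 58, "diagnosis": "flu"},
]

def generalize(rec: dict) -> dict:
    """Coarsen quasi-identifiers: 3-digit ZIP prefix, 10-year age bands."""
    return {"zip": rec["zip"][:3] + "**",
            "age": f"{rec['age'] // 10 * 10}-{rec['age'] // 10 * 10 + 9}",
            "diagnosis": rec["diagnosis"]}

def k_anonymize(recs: list[dict], k: int) -> list[dict]:
    coarse = [generalize(r) for r in recs]
    groups = Counter((r["zip"], r["age"]) for r in coarse)
    # Suppress any record whose quasi-identifier group is still smaller than k.
    return [r for r in coarse if groups[(r["zip"], r["age"])] >= k]

print(k_anonymize(records, k=2))
```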

Organizations should elevate their understanding of privacy protection to the same level as their understanding of cybersecurity. There are two essential questions that every organization needs to be able to answer:

  1. What is the re-identification risk of my data?
  2. What privacy-protecting techniques can we implement throughout our data pipeline?

To learn more about how CryptoNumerics can help you privacy-protect your data, click here.


Protect Your Data Throughout the Pipeline

Organizations all over the world have embraced the opportunity that data and data analysis present. Millions of dollars are spent every year designing and implementing data pipelines that allow organizations to extract the value from their data. Unfortunately, data misuse and data breaches have led government bodies to introduce regulations such as GDPR, CCPA, and HIPAA, bestowing privacy rights upon consumers and placing responsibilities upon businesses.

Maximizing data value is essential; however, privacy regulations must be satisfied in the process. This is achievable by implementing privacy-protecting techniques throughout the data pipeline to avoid compliance risks. 

Before introducing the privacy-protecting techniques, it is important to understand the four stages of the data pipeline:

  1. Data Acquisition: first off, the data must be acquired, which can be either generated internally or externally from third parties.
  2. Data Organization: the data is now stored for future use, and needs to be protected along the pipeline to avoid misuse and breaches. This can be achieved using access controls.
  3. Data Analysis: the data must now be opened up and mobilized in order to analyze it, which allows for a better understanding of an organization’s operations and customers, as well as improved forecasting.
  4. Data Publishing: lastly, analysis results are published, and/or internal data is shared with another party. 

Now that we have covered the four stages of the data pipeline, let’s go over the privacy-protecting techniques, grouped into four families, that can be implemented throughout the pipeline to make it privacy-protected.

Within the randomizing group, there are two techniques: additive and multiplicative noise, where random noise is added to or multiplied into an individual’s record to transform the data. These techniques can be used in the Data Acquisition stage of the data pipeline. 
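Here is a minimal sketch of both randomizing techniques, using hypothetical salary records: individual values are distorted at acquisition time, while aggregate statistics stay close to the truth.

```python
# Additive and multiplicative noise applied to individual records at acquisition.
# Illustrative values; the noise scales would be tuned to the real data.
import numpy as np

rng = np.random.default_rng(seed=7)
salaries = np.array([52_000, 61_500, 78_250, 45_900], dtype=float)

noisy_additive = salaries + rng.normal(loc=0.0, scale=2_000, size=salaries.shape)
noisy_multiplicative = salaries * rng.normal(loc=1.0, scale=0.03, size=salaries.shape)

# Individual values are perturbed, but the mean remains close to the original.
print(salaries.mean(), noisy_additive.mean(), noisy_multiplicative.mean())
```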

The sanitizing group contains five privacy techniques. The first is k-anonymity, where the identifiable attributes of any record in a particular database are indistinguishable from those of at least one other record. Next comes l-diversity, an extension of k-anonymity that addresses one of its shortfalls by making sure there is a diversity of sensitive values within each group. Another technique in this group is t-closeness, which ensures that the distribution of sensitive values in each group stays close to the distribution in the dataset as a whole, preventing attribute disclosure by maintaining a threshold ‘t’. Additionally, there is personalized privacy, in which privacy levels are defined and customized by the data owners themselves. The last technique in this group is ε-differential privacy, which ensures that the presence or absence of any single record has only a negligible effect on the outcome of the data’s analysis. These techniques can be used in the Data Acquisition, Data Organization, and Data Publishing stages of the data pipeline. 
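To illustrate how these checks work in practice, here is a minimal sketch that tests whether an already-generalized table is l-diverse, i.e. whether every quasi-identifier group contains at least l distinct sensitive values. The field names and records are illustrative.

```python
# Check l-diversity: every group of records sharing quasi-identifier values
# must contain at least l distinct values of the sensitive attribute.
from collections import defaultdict

records = [
    {"zip": "100**", "age": "30-39", "diagnosis": "flu"},
    {"zip": "100**", "age": "30-39", "diagnosis": "asthma"},
    {"zip": "103**", "age": "50-59", "diagnosis": "diabetes"},
    {"zip": "103**", "age": "50-59", "diagnosis": "diabetes"},
]

def is_l_diverse(recs, quasi=("zip", "age"), sensitive="diagnosis", l=2):
    groups = defaultdict(set)
    for r in recs:
        groups[tuple(r[q] for q in quasi)].add(r[sensitive])
    return all(len(values) >= l for values in groups.values())

print(is_l_diverse(records, l=2))  # False: the second group only ever says 'diabetes'
```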

The output group has three techniques, which are used to reduce the inference of sensitive information from the output of any algorithm. The first is association rule hiding, which sanitizes the dataset so that rules mined from it cannot be used to expose private information. Next is downgrading classifier effectiveness, where data is sanitized to reduce a classifier’s effectiveness and prevent sensitive information from being leaked. Finally, query auditing and inference control restricts or perturbs query results, since otherwise combinations of queries could be used to deduce sensitive information. These techniques can be applied to the Data Publishing stage of the data pipeline. 

Last but not least is the distributed computing group, made up of seven privacy-protecting techniques. In 1-out-of-2 oblivious transfer, a sender holds two messages and the receiver learns exactly one of them, without the sender learning which one was chosen. Another technique in this group is homomorphic encryption, a method of performing a calculation on encrypted information (ciphertext) without decrypting it (to plaintext) first. Secure sum computes the sum of the parties’ inputs without revealing any individual input. Secure set union builds the union of the parties’ sets without revealing which party contributed which element. Secure size of intersection determines how large the intersection of the data sets is without revealing the data itself. The scalar product technique computes the scalar product of two vectors without either party revealing its input vector to the other. Finally, the private set intersection technique computes the intersection of two parties’ sets without revealing anything else; it can also be used in the Data Acquisition stage. All of the techniques in the distributed computing group prevent access to the original, raw data while still allowing analysis to be performed. They can be applied to the Data Analysis and Data Publishing stages of the data pipeline, and homomorphic encryption can also be used in the Data Organization stage.
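As a concrete example of one technique from this group, here is a from-scratch, toy-sized sketch of Paillier encryption, an additively homomorphic scheme: multiplying two ciphertexts yields a ciphertext of the sum of the plaintexts. The primes and values are illustrative and far too small for real security.

```python
# Toy Paillier additively homomorphic encryption (illustrative only: tiny
# primes, no hardening). Requires Python 3.9+ for math.lcm and pow(x, -1, n).
import math
import random

p, q = 101, 103                      # toy primes
n, n2 = p * q, (p * q) ** 2
g = n + 1
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)                 # valid because g = n + 1

def encrypt(m: int) -> int:
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return ((pow(c, lam, n2) - 1) // n) * mu % n

a, b = encrypt(1200), encrypt(345)
enc_sum = (a * b) % n2               # multiplying ciphertexts adds the plaintexts
print(decrypt(enc_sum))              # 1545, computed without ever decrypting a or b
```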

Together, these techniques help protect data privacy throughout the data pipeline. For a visual comparison of a privacy-exposed pipeline versus a privacy-protected one, download the ‘Data Pipeline infographic’. 

For more information, or to find out how to privacy-protect your data, contact us today at [email protected].
