How Third Parties who Act as Brokers of Data will Struggle as the Future of Data Collaboration Changes

Today, everyone understands that, as The Economist put it, “data is the new oil.”

And few understand this better than data aggregators. Data aggregators can loosely be defined as third parties who broker data to other businesses. Verisk Analytics is perhaps the largest and best-known example, but many others operate in the space, including Yodlee, Plaid, and MX.

These data aggregators understand the importance of data, and how the right data can be leveraged to create value through data science for consumers and companies alike. But the future of data collaboration is starting to look very different. Their businesses may well start to struggle.

Why data aggregators face a tricky future

As the power of data has become more widely recognized, so too has the importance of privacy. In 2018, the European Union implemented the General Data Protection Regulation (GDPR), the most comprehensive data privacy regulation of its kind, with broad-sweeping jurisdiction. Scrutiny ramped up quickly: a succession of privacy breaches across multiple industries drew heavily negative media coverage, and Facebook was hit with a $5-billion fine by the US Federal Trade Commission.

Where once many were skeptical, today few people deny the importance of data privacy. Privacy has become a separate dimension, distinct from security. The data science community has come to understand that datasets must not only be secured against hackers but also de-identified, so that no individual can be re-identified when the data is shared.

In the new era of privacy controls, third-party data aggregators will face two problems:

  1. Privacy Protection Requirements
    Using a third party to perform data collaboration is a flawed approach. No matter what regulations or protections you enforce, you are still moving data out of your own data centers and exposing raw information (which contains both PII and IP-sensitive items) to someone else. Ultimately, third-party consortiums do not maintain a “privacy-by-design” framework, the standard required for GDPR compliance.

  2. Consumers Don’t Consent to Have their Data Used
    The GDPR requires that collectors of data also obtain their consumers’ consent for its use. If I have information I’ve collected, I can only use it for the specific purpose the consumer has consented to. I cannot simply share it with anyone, or use it however I like.

These challenges are serious obstacles to data collaboration, and they will affect data aggregators the most because of their value proposition. Many see data aggregators as uniquely flawed in their handling of these issues, and that perception has generated negative attention. A recent Nevada state law, for example, required all who qualified to sign up for a public registry.

Aggregators need to get ahead of these changes in order to protect their business model and avoid negative media attention.

How CryptoNumerics can help

At CryptoNumerics, we recognize the genuine ethical need for privacy. But we also recognize the vast good that data science can provide. In our opinion, no one should have to choose one over the other, so we have developed new technology that enables both.

CN-Insight uses a concept we refer to as Virtual Data Collaboration. Using technologies such as secure multi-party computation and secret-sharing cryptography, CN-Insight enables companies to perform machine learning and data science across distributed datasets. Instead of succumbing to the deficits of the third-party consortium model, companies keep their datasets on-premises, with no co-location or movement of any kind and no exposure of raw information. The datasets are matched using feature engineering, and our technology lets enterprises build models as if the datasets were combined.
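
To make the idea concrete, here is a minimal sketch of additive secret sharing, one building block of secure multi-party computation. It is an illustration only, not CN-Insight’s implementation; the party names and values are invented.

```python
import random

PRIME = 2**61 - 1  # arithmetic is done modulo a large prime

def share(value, n_parties):
    """Split a value into n additive shares that sum to the value mod PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def reconstruct(shares):
    """Recombine shares; only the combined total is ever revealed."""
    return sum(shares) % PRIME

# Two companies each hold a private value (e.g., a feature sum from their dataset).
company_a_value = 1200
company_b_value = 3400

# Each company splits its value and sends one share to the other party.
a_shares = share(company_a_value, 2)
b_shares = share(company_b_value, 2)

# Each party adds the shares it holds; neither ever sees the other's raw value.
partial_0 = (a_shares[0] + b_shares[0]) % PRIME
partial_1 = (a_shares[1] + b_shares[1]) % PRIME

print(reconstruct([partial_0, partial_1]))  # 4600, the joint result only
```

Real deployments layer many more protections on top, but the principle is the same: computation proceeds on shares, and only the agreed-upon result is revealed.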

Data aggregators must give these challenges serious thought and adopt these new technologies in order to stay ahead of an inflection point in their industry. Privacy is here to stay, and as the data brokers that lead the industry, they have an opportunity to play a powerful role in leading the way forward and improving their business future.

What do Trump, Google, and Facebook Have in Common?

This year, the Trump Administration declared the need for a national privacy law to supersede a patchwork of state laws. But, as the year comes to a close, and amidst the impeachment inquiry, time is running out. Meanwhile, Google plans to roll out encrypted web addresses, and Facebook stalls research into social media’s effect on democracy. Do these three seek privacy or power?
The Trump Administration, Google, and Facebook all claim that privacy is a priority, and… well… we’re still waiting for the proof. Over the last year, the news has been awash with privacy scandals and data breaches. Every day we hear promises that privacy is a priority and that a national privacy law is coming, but so far, evidence of action is lacking. This raises the question: are politicians and businesses using the guise of “privacy” to manipulate people? Let’s take a closer look.

Congress and the Trump Administration: National Privacy Law

Earlier this year, Congress and the Trump Administration agreed they wanted a new federal privacy law to protect individuals online. This rare occurrence was even supported and campaigned for by major tech firms (read our blog “What is your data worth” to learn more). However, despite months of talks, “a national privacy law is nowhere in sight [and] [t]he window to pass a law this year is now quickly closing.” (Source)

Disagreement over enforcement and state-level power is said to be holding back progress. Thus, while senators, including Roger Wicker, who chairs the Senate Commerce Committee, insist they are working hard, there are no public results; and with the impeachment inquiry, it is possible we will not see any for some time (Source). This means the White House will likely miss its self-appointed deadline of January 2020, when the CCPA goes into effect.

Originally, this plan was designed to avoid a patchwork of state-level legislation that can make compliance challenging for businesses and weaken privacy protections. It is not a simple process, and since “Congress has never set an overarching national standard for how most companies gather and use data,” much work is needed to develop a framework to govern privacy at the national level (Source). However, GDPR has shown in Europe that a large governing structure can successfully hold organizations accountable to privacy standards. But how much longer will US residents need to wait?

Google Encryption: Privacy or Power

Google has been trying to get an edge over the competition for years by leveraging the vast troves of user data it acquires. Undoubtedly, its work has led to innovation that has redefined the way our world works, but our privacy has paid the price. Like never before, data has become the new global currency, and Google has had a central part to play in the matter.

Google has famously made privacy a priority and is currently working to enhance user privacy and security with encrypted web addresses.

Unencrypted web addresses are a major security risk: they make it simple for malicious actors to intercept web traffic and use fake sites to gather data. However, in denying hackers this ability, power is handed to companies like Google, which will be able to collect more user data than ever before. The risk is “that control of encrypted systems sits with Google and its competitors.” (Source)

This is because DNS over HTTPS (DoH) cuts out the middle layer of ISPs and can change the mechanisms through which we access specific web pages. This could enable Google to become the centralized provider of encrypted DNS (Source).

Thus, while DoH is certainly a privacy and security upgrade over the current DNS system, shifting from local middle layers to major browser enterprises centralizes user data, raising anti-competitive and child-protection concerns. It also diminishes law enforcement’s ability to blacklist dangerous sites and monitor those who visit them, and it reduces defenders’ ability to gather cybersecurity intelligence from malware activity, intelligence that is integral to fulfilling government-mandated regulation, which opens new opportunities for hackers (Source).
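
For readers curious what a DoH lookup actually looks like, the sketch below queries Google’s public DNS-over-HTTPS JSON endpoint (dns.google/resolve). It is an illustration of the mechanism, not a statement about how any particular browser implements it; the domain queried is arbitrary.

```python
import requests

# Query Google's public DNS-over-HTTPS JSON endpoint for example.com.
# The request and response travel over HTTPS, so an ISP on the path sees
# only encrypted traffic to dns.google rather than a plaintext DNS query.
resp = requests.get(
    "https://dns.google/resolve",
    params={"name": "example.com", "type": "A"},
    timeout=5,
)
resp.raise_for_status()

for answer in resp.json().get("Answer", []):
    print(answer["name"], answer["data"])  # resolved A records
```

The privacy trade-off described above is visible here: the ISP learns less, but the DoH resolver (in this case Google) sees every lookup.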

Nonetheless, this feature will roll out in a few weeks as the new default, despite calls from those with DoH concerns to wait until more is known about the potential fallout.

Facebook and the Disinformation Fact Checkers

Over the last few years, Facebook has developed a reputation as one of the least privacy-centric companies in the world. But is that reputation accurate? After the Cambridge Analytica scandal, followed by a string of data privacy debacles, Facebook is now stalling its “disinformation fact-checkers” on the grounds of privacy problems.

In April of 2018, Mark Zuckerberg announced that the company would develop machine learning to detect and manage misinformation on Facebook (Source). It then promised to share this information with non-profit researchers who would flag disinformation campaigns as part of an academic study of how social media influences democracies (Source).

To ensure that the data being shared could not be traced back to individuals, Facebook applied differential privacy techniques.

However, upon receiving this information, researchers complained that the data did not include enough detail about the disinformation campaigns to allow them to derive meaningful results. Some even insisted that Facebook was going against the original agreement (Source). As a result, some of the people funding this initiative are considering backing out.

Initially, Facebook was given a deadline of September 30 to provide the full data sets, or the entire research grants program would be shut down. While they have begun offering more data in response, the full data sets have not been provided.

A spokesperson from Facebook says, “This is one of the largest sets of links ever to be created for academic research on this topic. We are working hard to deliver on additional demographic fields while safeguarding individual people’s privacy.” (Source). 

While Facebook may be limiting academic research on democracies, perhaps it is finally prioritizing privacy. And, at the end of the day, with an ethical framework in place, the impact of social media on democracy can still be measured through technological advancement and academic research without compromising privacy.

In the end, it is clear that privacy promises hold the potential to manipulate people into action. The US government may not have a national privacy law anywhere in sight, the motives behind Google’s encrypted links may be questionable, and Facebook’s sudden prioritization of privacy may cut off democratic research, but at least privacy is becoming a hot topic, and that holds promise for a privacy-centric future for the public.

Google Prioritizes Privacy Amidst the YouTube Scandal and Facebook Dating Launch

In the wake of the University of Zurich study and YouTube’s ongoing children’s-privacy troubles, it is clear that data protection is of the utmost importance, both to protect citizens’ rights and to shield companies from noncompliance fines and scandals. However, today’s standards are minimal, and conventional techniques such as anonymization are not enough. Recurring privacy issues are also evident at Facebook, leaving users questioning what Facebook Dating means for their privacy. At the same time, the launch of Google’s open-source differential privacy library signals that organizations are starting to make privacy protection a priority.

The University of Zurich published a study on September 2, 2019, showing that researchers were able to “identify the participants in confidential legal cases, even though such participants had been anonymized.” (Source) By combining AI with big data, the researchers de-anonymized 84% of more than 120,000 participants in less than one hour. This demonstrates that anonymization performed without considering publicly available information leaves data open to “linkage” attacks and cannot guarantee privacy, and the speed and ease with which these results were obtained set an alarming precedent. More specifically, this case signals that the government’s systems cannot protect the privacy of plaintiffs and defendants, underscoring the need for enhanced privacy and automation to prevent classified information from being released.
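
A toy example makes the mechanics of such a linkage attack clear: joining an “anonymized” table to public auxiliary data on shared quasi-identifiers re-attaches identities. The column names and records below are invented for illustration and are not from the Zurich study.

```python
import pandas as pd

# "Anonymized" court records: names removed, but quasi-identifiers kept.
court_records = pd.DataFrame({
    "birth_year":  [1971, 1985, 1992],
    "postal_code": ["8001", "8004", "8032"],
    "gender":      ["F", "M", "F"],
    "case_type":   ["insurance", "employment", "insurance"],
})

# Public auxiliary data (e.g., registries or scraped profiles) with names.
public_data = pd.DataFrame({
    "name":        ["A. Keller", "B. Meier", "C. Frei"],
    "birth_year":  [1971, 1985, 1992],
    "postal_code": ["8001", "8004", "8032"],
    "gender":      ["F", "M", "F"],
})

# Joining on the shared quasi-identifiers re-attaches names to the
# "anonymized" records -- the essence of a linkage attack.
reidentified = court_records.merge(
    public_data, on=["birth_year", "postal_code", "gender"]
)
print(reidentified[["name", "case_type"]])
```

The lesson is that removing direct identifiers is not enough; the combination of seemingly harmless attributes can be unique.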

However, the government is not the only party whose data protection was called into question last week, as Google agreed to pay a record $170 million for illegally harvesting children’s data (Source). On Wednesday, the Federal Trade Commission and New York’s attorney general reached an agreement that the company must pay the fine and improve children’s privacy protection on YouTube. The penalty is a direct result of YouTube collecting personal data from children under the age of thirteen without parental permission and targeting them with behavioural ads, in violation of the Children’s Online Privacy Protection Act (COPPA). Under the settlement, child-directed content must be designated as such going forward, behavioural ads must not be served against that content, and YouTube will not use previously shared data without parental consent.

The public is largely displeased with these sanctions, calling them a mere “slap on the wrist,” and some industry professionals share that view. Jeffery Chester, executive director of the Center for Digital Democracy, said, “It’s the equivalent of a cop pulling somebody over for speeding at 110 miles an hour – and they get off with a warning.” (Source) Such reactions point to a growing demand for corporate responsibility and real legal repercussions for privacy breaches.

Between the University of Zurich study and the ramifications of YouTube’s illegal data gathering reports last week, it is evident that companies should question how well protected their datasets are. Without adequate systems in place, they risk noncompliance fines and a loss of public trust.

This lack of trust follows Facebook, as users question what Facebook Dating will mean for their data in spite of its new privacy and security features (Source). Announced on Thursday, this service is set to roll out across the US to allow users to create a separate dating-specific profile and be matched with other users based on location, indicated preferences, events attended, and groups, amongst other factors. Some of the features include the ability to hide a profile from friends or to share plans with select people. Beyond this, Facebook will offer “Secret Crush” on Instagram, which allows individuals to compile a list of friends they are interested in, and to be matched if the crush also lists them.

This data is likely not as safe as Facebook suggests. Digital strategist Jason Kelley states that “If you’re trying to avoid dating services that have red flags, you can’t really find one that has more red flags than Facebook.” (Source) After all, only days ago roughly 200 million Facebook users’ phone numbers were exposed online (Source).

The primary concern, given Facebook’s history of mishandling personal data, is Facebook’s ability to develop a more sophisticated ad profile based on their dating information. This includes “what kinds of people users like, whom they match with, and even how dates go” (Source). Mark Weinstein, a privacy advocate and founder of social network MeWe, even goes so far as to say that “Facebook will use Facebook Dating as a new portal into users’ lives; collecting, targeting, and selling dating history, romantic preferences, emotions, sexual interests, fetishes, everything.” (Source)

An additional concern, given Facebook’s track record, is that even though the company says dating profiles will not be connected to users’ Facebook activity, sensitive information, like sexual orientation, could still be at risk of exposure.

The immense cloud of doubt surrounding Facebook is one other organizations hope to avoid, and as a result, there has been an increased focus on privacy protection. Google made this a priority when launching its open-source differential privacy library this week (Source).

With Google’s new library, developers can “take this library and build their own tools that can work with aggregate data without revealing personally identifiable information either inside or outside their companies.” (Source) This is made possible by differential privacy, one of the most advanced privacy-preserving techniques available today.

Differential privacy enables the public sharing of information about a dataset while maintaining the confidentiality of individuals in the dataset. Using this method of security, according to Miguel Guevara, a Privacy and Data Protection Product Manager at Google, is vital because “without strong privacy protections, you risk losing the trust of your citizens, customers, and users.” (Source)
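
As a rough illustration of the idea (not Google’s library API), the sketch below answers a simple counting query with Laplace noise, the classic differential privacy mechanism; the dataset and the epsilon values are made up.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

def private_count(values, predicate, epsilon=1.0):
    """Return a count with Laplace noise calibrated to sensitivity 1."""
    true_count = sum(1 for v in values if predicate(v))
    noise = rng.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

ages = [23, 37, 41, 52, 29, 64, 45, 31]

# Adding or removing any single person changes the count by at most 1,
# so noise drawn from Laplace(1/epsilon) masks each individual's presence.
print(private_count(ages, lambda a: a >= 40, epsilon=0.5))
```

Smaller epsilon values add more noise and give stronger privacy guarantees at the cost of accuracy; production libraries such as Google’s manage these trade-offs and budgets far more carefully than this sketch.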

Google’s prioritization of privacy through its open-source launch signals that improved privacy protection is now expected in the market. It corroborates the needs outlined in the University of Zurich study, the fines Google faces over the YouTube scandal, and the impact Facebook’s history looks set to have on trust in its new service. Thus, it is evident: data security and privacy protection are more consequential than ever.

Rewarded for sharing your data? Sign me up!

Companies now starting to pay users for their data, in efforts to be more ethical. Large Bluetooth security flaw detected, proving potentially harmful to millions. Blockchain’s future looking bright as privacy-preserving technology booms. Canadian federal elections being ‘watched’ for their history of ‘watching’ the public.

Rewarded for sharing your data? Sign me up!

Drop Technologies has secured USD$44 million in investments toward growing a technology-based alternative to traditional customer loyalty programs. With over three million users already signed up, and 300 brands such as Expedia and Postmates on its platform, the company is headed in the right direction.

Given that Facebook and other tech giants are monetizing data without user permission, getting paid for it doesn’t seem like a bad idea after all. “I’m a Facebook user and an Instagram user, and these guys are just monetizing my data left and right, without much transparency,” said Onsi Sawiris, a managing partner at New York’s HOF Capital. “At least if I’m signing up for Drop, I know that if they’re using my data I will get something in return, and it’s very clear” (Source).

This alternative to rewards programs tracks your spending with its 300+ partner brands and lets you earn points to spend at certain companies, such as Starbucks or Uber Eats. As an alternative to credit card rewards, it could benefit consumers looking for extra savings on their purchases. So don’t drop it till you try it!

Bluetooth proving to be a potential data breach vulnerability 

Researchers have discovered a flaw that leaves millions of Bluetooth users vulnerable to data breaches. The flaw lets attackers interfere, undetected, while two devices are trying to pair, as long as the attacker is within range. From music to conversations to data entered through a Bluetooth device, anything could be at risk. “Upon checking more than 14 Bluetooth chips from popular manufacturers such as Qualcomm, Apple, and Intel, researchers discovered that all the tested devices are vulnerable to attacks” (Source).

Fortunately, some companies such as Apple and Intel have already implemented security upgrades on their devices. Users are also advised to keep their security, software, and firmware updated at all times. 

Get ready for blockchain advancements like never before

For the past decade, blockchain has been used to build an ecosystem in which cryptocurrencies and peer-to-peer transactions are just two of many use cases (Source).

Traditionally, data is shared across centralized networks, leaving systems vulnerable to attack. Because a blockchain is decentralized, however, the threat of a single point of failure across the distributed network is eliminated.

Yet as more and more companies turn to blockchain for more efficient data sharing and easier data transfers, privacy is often overlooked.

In most public blockchains today, transactions are visible to all nodes of the network. This transparency comes at a cost: because the data can be sensitive, it naturally raises privacy concerns. With digital transformation happening all around us, privacy protection cannot be ignored.
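
To see why public chains are transparent by design, here is a toy hash-chain sketch: every block commits to its predecessor, and every node that holds the chain can read every transaction. It is a simplification for illustration, not any production blockchain.

```python
import hashlib
import json

def make_block(transactions, prev_hash):
    """Create a block whose hash commits to its contents and its predecessor."""
    body = {"transactions": transactions, "prev_hash": prev_hash}
    block_hash = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": block_hash}

# Every node holds the same chain, so every transaction below is visible
# to every participant -- the transparency that raises privacy concerns.
genesis = make_block([{"from": "alice", "to": "bob", "amount": 5}], prev_hash="0" * 64)
block_1 = make_block([{"from": "bob", "to": "carol", "amount": 2}], genesis["hash"])

print(block_1["prev_hash"] == genesis["hash"])  # True: tampering with genesis breaks the link
```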

To address privacy, many blockchain companies are adding privacy-preserving mechanisms to their infrastructure, from zero-knowledge proofs to cryptographic techniques such as secure multi-party computation (MPC). These mechanisms keep data protected as it is shared and reveal only the specific elements needed for a given task (Source).

Cost efficiencies and a better understanding of consumer needs are just a few of the advantages these privacy-preserving mechanisms introduce. As data and privacy go hand in hand in the future, equitability and trust will be the keys to unlocking new possibilities that enhance life as we know it (Source).

Upcoming Canadian elections could turn into surveillance problem

Once again, the Canadian federal elections are raising concerns about interference and disruption through the misuse of personal data. In the past, political parties have been known to use their power to influence populations who are not aware of how their data is being used. 

Data has played a major role in past elections, and this could become a surveillance issue: experts who study surveillance say that harnessing data has been the key to electoral success. “Politicians the world over now believe they can win elections if they just have better, more refined and more accurate data on the electorate” (Source).

A related issue is a lack of transparency between voters and electoral candidates. “There is a divide between how little is publicly known about what actually goes on in platform businesses that create online networks, like Facebook or Twitter, and what supporters of proper democratic practices argue should be known” (Source).

The officials of this upcoming election should be paying close attention to the public’s personal data and how it is being used.

Avoid Data Breaches and Save Your Company Money

Tips on how to avoid the privacy risks and breaches that big companies face today. How much data breaches cost in 2019. Why consumers are shying away from sharing their data. An airline phishing scam could prove damaging in the long run.

Stay Ahead of the Privacy Game

The Equifax data breach is another wake-up call for all software companies. There is so much going on today with regard to data exposure, fraud, and threats. Especially with new laws being proposed, companies should take the necessary steps to avoid penalties and breaches. Here are some ways you can stay ahead of the privacy game.

  1. Get your own security hackers – Many companies have their own cybersecurity team to test for failures, threats, and other weaknesses. Companies also hire outside hackers to uncover gaps in their privacy or security practices. “Companies can also host private or public “bug bounty” competitions where hackers are rewarded for detecting vulnerabilities” (Source).
  2. Establish trust with certificates of compliance – Earn your customers’ trust by achieving certificates of compliance. The baseline certification is known as the ISO 27001. If your company offers cloud services, you can attain the SOC 2 Type II certificate of compliance.
  3. Limit the data you need – Some companies ask for too much information. For example, why ask for a credit card number when a user is signing up for a free trial? If they love the product or service, they will offer to pay for the full version. Have faith in your product or service.
  4. Keep the data only as long as it is needed – Holding on to data for long periods when you no longer need it is simply a risk for your company. Think about it: as a consumer yourself, how would you react if your personal data were compromised because of a trial you signed up for years ago? (Source) A minimal retention-policy sketch follows this list.
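
As promised above, here is a minimal retention-policy sketch. The record fields and the one-year window are assumptions for illustration, not a recommendation for any specific retention period.

```python
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 365  # assumed policy: keep trial sign-up records for one year

def purge_expired(records, now=None):
    """Drop records older than the retention window."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=RETENTION_DAYS)
    return [r for r in records if r["created_at"] >= cutoff]

records = [
    {"email": "old@example.com", "created_at": datetime(2017, 5, 1, tzinfo=timezone.utc)},
    {"email": "new@example.com", "created_at": datetime(2019, 8, 1, tzinfo=timezone.utc)},
]

# Only the record still inside the retention window survives the purge.
print(purge_expired(records, now=datetime(2019, 9, 1, tzinfo=timezone.utc)))
```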

How much does a data breach cost today?

According to a 2019 report from IBM and the Ponemon Institute, the average data breach costs a company approximately USD$1.25 million to USD$8.19 million, depending on the country and industry.

Each compromised record costs companies an average of USD$148, based on the report’s results, which surveyed 507 organizations across 16 world regions and 17 industries. The U.S. has the highest average breach cost, at USD$8.19 million, and healthcare is the most expensive industry, averaging USD$6.45 million per breach.

However, the report isn’t all negative; it also provides tips to improve your data privacy. You can reduce the cost of a potential breach by up to USD$720,000 through simple mitigating steps, such as forming an incident response team or having encryption in place (Source).
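
As a back-of-the-envelope illustration using the figures quoted above (USD$148 per record and up to USD$720,000 in mitigation savings), a hypothetical breach-cost estimate might look like this; the breach size is invented.

```python
# Back-of-the-envelope estimate using the figures quoted in the article.
COST_PER_RECORD = 148         # average cost per compromised record (USD)
MITIGATION_SAVINGS = 720_000  # potential savings from incident response + encryption

def estimate_breach_cost(records_exposed, mitigations_in_place=False):
    cost = records_exposed * COST_PER_RECORD
    if mitigations_in_place:
        cost = max(cost - MITIGATION_SAVINGS, 0)
    return cost

# A hypothetical breach of 50,000 records, without and with mitigation in place.
print(estimate_breach_cost(50_000))        # 7,400,000
print(estimate_breach_cost(50_000, True))  # 6,680,000
```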

Consumers more and more hesitant to share their data

Marketers and data scientists everywhere: beware. A survey of 1,000 Americans conducted by the Advertising Research Foundation indicates that consumers’ willingness to share data with companies has decreased drastically since last year. “I think the industry basically really needs to communicate the benefits to the consumer of more relevant advertising,” said ARF Chief Research Officer Paul Donato. It is important to remember that not all consumers would happily give up their data for better-personalized advertisements (Source).

Air New Zealand breach could pose long-term effects

A phishing scam targeting Air New Zealand earlier this week has caused alarm. The breach exposed about 112,000 Air New Zealand Airpoints customers to long-term privacy concerns.

Victims received emails asking them to disclose personal information, and some responded with details such as passport and credit card numbers.

“The problem is, the moment things are out there, then they can be used as a means to gain further information,” said  Dr. Panos Patros, a specialist in cybersecurity at the University of Waikato. “Now they have something of you so then they can use it in another attack or to confuse someone else” (Source).

A good practice in situations like this is to change your passwords regularly and monitor your credit card statements. Refrain from posting common security-question information, such as the first school you attended or your first pet’s name, on social media. Additionally, delete all suspicious emails immediately without opening them (Source).

Facial Recognition Technology is Shaking Up the States

Many states in America are deploying facial recognition devices at borders to screen travelers. However, some cities, such as San Francisco and others in Massachusetts, have banned the use of these devices, and the American Civil Liberties Union (ACLU) is pushing for a nationwide ban.

It is still unclear how the confidential data gathered by the facial recognition devices will be used. Could it be shared with other branches of the government, such as ICE? 

ICE, or Immigration and Customs Enforcement, has been in the public eye for some time now for its arrests of undocumented workers and immigration offenders.

“Any time in the last three to four years that any data collection has come up, immigrants’ rights … have certainly been part of the argument,” says Brian Hofer, who is part of Oakland’s Privacy Advisory Commission. “Any data collected is going to be at risk when [ICE is] on a warpath, looking for anything they can do to arrest people. We’re definitely trying to minimize that exposure”.

It is this unregulated data that helps ICE locate and monitor people who violate immigration laws (Source).

Now Microsoft is listening to your Skype calls

A new day, a new privacy scandal. This week, Microsoft and Skype workers were revealed to be reviewing audio from real consumers’ Skype calls, to check the quality of the software and its translations.

The problem is that, like most tech companies, Microsoft has kept its customers in the dark about this. It has not explicitly told consumers that humans review these recordings, though the company claims to have its users’ permission.

“I recommend users refrain from revealing any identifying information while using Skype Translation, and Cortana. Unless you identify yourself in the recording, there’s almost no way for a human analyst to figure out who you are”, says privacy advocate Paul Bischoff (Source).

Essentially, Alexa, Siri, Google Home, and Skype are listening to your conversations. Yet instead of avoiding these products, we are compromising our privacy for convenience and efficiency.

Canadians want more healthcare tech, regardless of privacy risks

New studies indicate that Canadians are open to a future where healthcare is further enhanced with technology, despite privacy concerns. 

The advantages of these innovations include reduced medical errors, reduced data loss, better-informed patients, and much more. 84% of respondents wanted to access their health data on an electronic platform, as opposed to hard copy files. 

Dr. Gigi Osler, president of the Canadian Medical Association, states, “We’ve got hospitals that still rely on pagers and fax machines, so the message is clear that Canada’s health system needs an upgrade and it’s time to modernize”. 

Furthermore, most respondents look forward to the possibility of online doctor visits, believing that treatment could be faster and more convenient (Source).

After all, if we bank, shop, read, watch movies and socialize online, why can’t we get digital treatment too? 
