Location data and your privacy


As technology comes to surround nearly every part of our lives, it is no surprise that our every move is tracked and stored by the very apps we trust with our information. With the current COVID-19 pandemic, the consequences of inviting big tech companies into our every movement are being revealed.

At this point, most technology users understand the information they knowingly give to companies, such as their birthdays, access to pictures, or other sensitive details. However, many may be unaware of how much location data companies collect and how that affects their privacy.

Location data volume expected to grow

We have created over 90% of the world's data since 2017. As wearable technology continues to grow in popularity, the amount of data a person creates each day is steadily climbing.

One study projected that the number of IoT-enabled devices installed worldwide will hit 75 billion by 2025. This astronomical number highlights not only how intertwined technology is with our lives, but also how readily we welcome it, often without understanding how our data is collected.

Marketers, companies and advertisers will increasingly look to location-based information as its volume grows. A recent study found that more than 84% of marketers use location data in their campaigns.

The last few years have seen a boost in big tech companies giving their users more control over how their data is used. One example is in 2019 when Apple introduced pop-ups to remind users when apps are using their location data.

Location data is saved and stored so that companies can easily direct personalized ads and products to you. Understanding what your devices collect, and how to limit data sharing on them, is crucial as we move forward in the technological age.

Click here to read our past article on location data in the form of wearable devices. 

COVID-19 threatens location privacy

Risking the privacy of thousands of people or saving thousands of lives seems to be the question of this pandemic, and one that is running out of time for debate. Major companies, including SAS, Google and Apple, have stepped up to volunteer their anonymized data.

One of the largest concerns is not how this data is being used in this pandemic, but how it could be abused in the future. 

One Forbes article drew a comparison to the regret many faced after sharing DNA with sites like 23andMe, which led to health insurance complications or to the data's use in criminal investigations.

As companies like Google, Apple and Facebook step up to the COVID-19 technology race, many are expressing concern, since these companies have not always proven reliable at anonymizing user data.

In addition to the data-collection concern, governments and big tech companies are looking into contact-tracing applications. Using civilian location data for surveillance, even when framed as serving the greater good of health and safety, raises multiple red flags about how our phones can be used to monitor our every movement. To read more about this involvement in contact-tracing apps, read our latest article.

Each company has stated that it anonymizes the data it collects. However, even anonymized information can still be exploited, especially at the hands of government intervention.

With all this said, big tech companies hold power over our information and are playing a vital role in the COVID-19 response. Paying close attention to how user data is managed post-pandemic will reveal how these companies truly handle user information.

 

Google and Apple to lead data privacy in the global pandemic


What happens to privacy in a global pandemic? This question continues to be debated as countries like Canada, the United States and the United Kingdom move into what is assumed to be the peak of COVID-19's spread within their borders.

The world watched as countries like South Korea and China introduced grave measures to track their citizens, essentially stripping away privacy rights. But as case numbers continue to rise in the western world, governments are looking to implement similar tracking technologies on their own citizens' devices.

At the frontlines of the tracing-app effort is the U.K.'s National Health Service's health technology development unit (NHSX). The U.K.'s contact-tracing app would track COVID-19-positive patients and alert the people they had been in contact with.

However, before the app could launch, big tech companies Google and Apple released their joint contact-tracing system, which limits invasive apps on their devices and therefore derailed the app's development.

Google and Apple have announced that they are not releasing an app themselves, but instead a set of "privacy-focused APIs" to ensure that governments do not push invasive apps onto their citizens' devices.

Countries like Singapore that already run contact-tracing apps face problems that Google and Apple are looking to avoid, including requiring citizens to leave their phones unlocked and severe battery drain.

Google and Apple have stated that their Bluetooth-based system will run in the background and work even when the phone is locked. They have also said that the system will cease to run once the pandemic is over.
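The rotating-identifier idea behind such Bluetooth systems can be illustrated with a short sketch. This is a toy illustration only, not the actual Apple/Google specification; the key sizes, derivation scheme, and 15-minute interval here are assumptions:

```python
import hashlib
import hmac
import secrets

def daily_tracing_key() -> bytes:
    """A random per-day key that never leaves the device."""
    return secrets.token_bytes(16)

def rolling_proximity_id(daily_key: bytes, interval: int) -> bytes:
    """Derive a short-lived identifier for one broadcast interval.

    Nearby phones record these IDs over Bluetooth; because the IDs
    rotate and cannot be linked without the daily key, a passive
    observer cannot follow one person across the day.
    """
    msg = b"rolling-proximity" + interval.to_bytes(4, "big")
    return hmac.new(daily_key, msg, hashlib.sha256).digest()[:16]

# A day split into 96 fifteen-minute intervals yields 96 distinct IDs.
key = daily_tracing_key()
ids = {rolling_proximity_id(key, i) for i in range(96)}
print(len(ids))  # → 96
```

Only when a user tests positive would the daily keys be uploaded, letting other phones check their recorded IDs locally, which is what keeps the matching on-device rather than on a government server.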

The two big tech companies have set a high standard for privacy in the pandemic age. They will have to grant permission not only for the government applications to go live, but also for health authorities to access the technology. They have also said that they are developing policies on whether they will allow tracing apps to gather location data.

One Oxford University researcher said that around two-thirds of a country's population would need to participate for contact tracing to be effective. However, the top U.S. infectious disease expert says that many Americans would be inclined to reject any contact-tracing app that knowingly collects their location data.

The idea behind the Google/Apple partnership is to ensure governments are not forcing highly invasive technologies onto their citizens, and that while the world is engulfed in chaos, personal privacy remains as intact as possible.

The NHSX has continued with its app development; however, it is reportedly in close contact with the Apple/Google partnership. The European Commission told one reporter that "mobile apps should be based on anonymized data and work with other apps in E.U. countries."

As the world struggles to contain the virus's spread, systems such as the Google/Apple partnership could have a great effect on how COVID-19 is managed. Going forward, it is important to pay attention not only to how our data is being managed, but also to how our anonymized data can help save others.

 

Data sharing in a global pandemic


As the world continues to brace for the impact of COVID-19, data privacy remains a central concern both in the shift to working from home and in containing the spread. Last week, Google released an anonymized dataset of location data outlining hotspots of group gatherings. In addition, UK officials are looking to introduce contact-tracing applications. For companies containing the spread within their offices, Zoom has risen as a trusted video-conferencing app; however, in the last few weeks, serious privacy concerns have mounted.

Google releases location data

On Friday, Google released its COVID-19 Community Mobility Reports. These reports are a collection of data from users who have opted in to sharing their location history with the search giant. This location history comes from Google Maps, where the data is aggregated and anonymized.

Google says that by releasing this data, public health officials can determine which businesses are most crowded. This helps inform large-scale decisions about curfews, stay-at-home orders, or which businesses need to remain open.

The reports are open for public viewing, covering data from 131 countries, with some countries displaying regional data such as provinces or states. After a country is selected, Google creates a PDF file with the data for download.

Each PDF report contains six categories of location data. These include: 

  • Retail and recreation (restaurants, shopping centers, libraries, etc.)
  • Grocery and pharmacy (supermarkets, drug stores)
  • Parks (beaches, dog parks)
  • Transit stations (subways, bus and train stations)
  • Offices 
  • Residences 

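As a rough illustration of what "aggregated and anonymized" means here, the sketch below reports visits only as per-category percentage changes against a baseline, never as individual records. The visit logs, category names, and suppression threshold are invented for illustration; Google's actual pipeline is far more sophisticated:

```python
from collections import Counter

# Hypothetical raw visit logs: (user_id, place_category) pairs.
baseline_visits = [("u1", "parks"), ("u2", "parks"), ("u3", "parks"),
                   ("u4", "transit"), ("u5", "transit"), ("u6", "transit"),
                   ("u7", "transit"), ("u8", "grocery"), ("u9", "grocery")]
current_visits = [("u1", "parks"), ("u2", "parks"), ("u3", "grocery"),
                  ("u4", "grocery"), ("u5", "grocery"), ("u6", "grocery"),
                  ("u7", "transit"), ("u8", "transit")]

def aggregate(visits, min_count=2):
    """Count visits per category, suppressing any category with too few
    visits -- a basic disclosure-control step."""
    counts = Counter(category for _, category in visits)
    return {cat: n for cat, n in counts.items() if n >= min_count}

def percent_change(baseline, current):
    """Publish each category only as a change vs. baseline, never as
    per-person records."""
    return {cat: round(100 * (current.get(cat, 0) - n) / n)
            for cat, n in baseline.items()}

changes = percent_change(aggregate(baseline_visits), aggregate(current_visits))
print(changes)  # → {'parks': -33, 'transit': -50, 'grocery': 100}
```

The point of the exercise: a health official can see that transit visits fell and grocery visits rose without ever seeing where any individual went.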

Creating these reports comes after weeks of requests from public health officials asking for applications to test a person's contact with an infected patient. While Google's data cannot determine individual contact, these datasets may help cities or countries decide on preventive measures.

Other countries have used similar but more aggressive location-data technology. At the beginning of March, we released an article about Korea's efforts to stop the spread by using people's location data to track whether they leave their houses.

Another news article revealed that Taiwan has also used location data to track its citizens, even going as far as calling phones twice a day to ensure citizens are not leaving their houses without their phones.

Google has stated that its data will cover the past 48-72 hours, and has yet to determine when the data will be updated.

Contact Tracing Apps

Similar to the data released by Google, there is more pressure on governments to introduce contact tracing apps like the ones seen in Korea or Taiwan.

In the UK, researchers have begun compiling papers to discuss how privacy can be handled and mishandled in these tracing apps.

One researcher, Dr. Yves-Alexandre de Montjoye, created a whitepaper outlining eight questions for understanding how privacy is protected in these types of apps.

These eight questions include: 

  • How do you limit personal data gathered by the app developers?
  • How do you protect the anonymity of every user? 
  • Does the app reveal to its developers the identity of users who are at risk? 
  • Could the app be used by users to learn who is infected or at risk, even in their social circle? 
  • Does the app allow users to learn any personal information about other users? 
  • Could external parties exploit the app to track users or find out who’s infected? 
  • Do you put in place additional measures to protect the personal data of infected and at-risk users?
  • How can users verify that the system does what it says? 


As governments move quickly to contain the spread of COVID-19, measures like contact-tracing apps are being seriously considered. However, patient privacy should not disappear. As the world braces for an even more significant influx of COVID-19 cases, it is in the hands of government officials and big tech to work together to contain the spread while maintaining data privacy.

Zoom faced with data sharing lawsuit

A few weeks ago, we released an article outlining small privacy concerns about introducing Zoom into the work-from-home environment. Since that article, Zoom has grown not only in popularity but in significant privacy concerns as well.

On March 26, Vice news released an article detailing Zoom’s relationship with Facebook. 

Vice reported that Zoom sends analytics data to Facebook, even when a Zoom user doesn't have a Facebook account. While this type of data transfer is not uncommon among apps built with the Facebook SDK, Zoom's own privacy policy makes no mention of the practice.

The article reports that Zoom notifies Facebook when a user opens the app, along with specific identifying details that companies can use to target users with advertisements.

Zoom responded to the report, stating it would remove the Facebook SDK. However, it wasn't long before the video-conferencing company was hit with a lawsuit.

But the privacy concerns don't just come from data sharing. The past few weeks have seen numerous reports of account hacking, allegedly missing end-to-end encryption, password stealing, leaks, and microphone/camera hijacking.

And these claims are only starting to roll in. As Zoom rises to the top of the most-used video-conferencing technologies during this work-from-home burst, the next few weeks could see thousands of data privacy violations.

How can working from home affect your data privacy?


On March 11, the World Health Organization declared the Coronavirus (COVID-19) a global pandemic, sending the world into a mass frenzy. Since that declaration, countries around the world have shut borders, closed schools, requested citizens to stay indoors, and sent workers home. 

While the world may appear to be at a standstill, some jobs still need to get done. Like us at CryptoNumerics, companies have sent their workers home with the tools they need to complete their regularly scheduled tasks from the comfort of their own homes. 

However, with a new influx of people working from home, insecure networks, websites, or AI tools can leave company information vulnerable. In this article, we'll go over where your privacy may be at risk during this work-from-home season.

Zoom's influx of new users raises privacy concerns

Zoom is a video-conferencing company whose platform hosts meetings, online chats and online collaboration. Since people across the world are now required to work or attend school online, Zoom has seen a substantial increase in users. In February, Zoom shares rose 40%, and in three months the company added more monthly active users than in the entire year of 2019.

While this influx and global exposure are significant for any company, such an unprecedented level of usage can expose holes in privacy protections, a concern that many are starting to raise.

Zoom's growing demand makes it a big target for third parties, such as hackers, looking to gain access to sensitive or personal data. Zoom is being used by companies large and small, as well as by students across university campuses. This means a grand scale of important, sensitive data could very well be vulnerable.

Some university professors have decided against Zoom telecommuting, saying that Zoom's privacy policy, which states that the company may collect information about recorded meetings that take place in video conferences, raises too many personal privacy concerns.

On a personal privacy level, Zoom gives the administrator of a conference call the ability to see when a caller has been on another webpage for over 30 seconds. Many are calling this feature a violation of employee privacy.

Internet-rights advocates have begun urging Zoom to publish transparency reports detailing how it manages data privacy and data security.

Is your Alexa listening to your work conversations?

Both Google Home and Amazon’s Alexa have previously made headlines for listening to homes without being called upon and saving conversation logs.  

Last April, Bloomberg released a report highlighting Amazon workers listening to and transcribing conversations heard through Alexa devices in people's homes. Bloomberg reported that most voice-assistant technologies rely on human help to improve the product. They reported that Amazon employees were not only listening to Alexa devices that users had not called on, but also sharing the things they heard with their co-workers.

Amazon claims that the recordings sent to "Alexa reviewers" come with only an account number, not an address or full name that could identify a user. However, the entire notion of strangers hearing full, personal conversations is uncomfortable.

As the world is sent to work from home, and with over 100 million Alexa devices in American homes, there should be some concern over the degree to which these speaker systems are listening in on your work conversations.

Our advice during this work-from-home-long-haul? Review your online application privacy settings, and be cautious of what devices may be listening when you have important meetings or calls. 

Banking and fraud detection; what is the solution?


As the year comes to a close, we must reflect on the most historic events in the world of privacy and data science, so that we can learn from the challenges, and improve moving forward.

In the past year, General Data Protection Regulation (GDPR) has had the most significant impact on data-driven businesses. The privacy law has transformed data analytics capacities and inspired a series of sweeping legislation worldwide: CCPA in the United States, LGPD in Brazil, and PDPB in India. Not only has this regulation moved the needle on privacy management and prioritization, but it has knocked major companies to the ground with harsh fines. 

Since its implementation in 2018, €405,871,210 in fines have been actioned against violators, signalling that DPA supervisory authorities have no mercy in their fervent search for the unethical and illegal actions of businesses. This is only the beginning: the deeper we get into the era of data privacy law, the stricter regulatory authorities will become. With the next wave of laws taking effect on January 1, 2020, businesses can expect to feel pressure from all directions, not just the European Union.

 

The two most breached GDPR requirements are Article 5 and Article 32.

These articles place importance on maintaining data for only as long as is necessary and seek to ensure that businesses implement advanced measures to secure data. They also signal the business value of anonymization and pseudonymization. After all, once data has been anonymized (de-identified), it is no longer considered personal, and GDPR no longer applies.

Article 5 affirms that data shall be “kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the personal data are processed.”

Article 32 references the importance of “the pseudonymization and encryption of personal data.”

The frequency of a failure to comply with these articles signals the need for risk-aware anonymization to ensure compliance. Businesses urgently need to implement a data anonymization solution that optimizes privacy risk reduction and data value preservation. This will allow businesses to measure the risk of their datasets, apply advanced anonymization techniques, and minimize the analytical value lost throughout the process.
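To make "measure the risk of their datasets" concrete, here is a minimal sketch of k-anonymity, one common way to quantify re-identification risk. The records, generalization rules, and field names are invented for illustration, and real risk-aware anonymization tools go well beyond this:

```python
from collections import Counter

# Hypothetical patient records: (age, postcode, diagnosis).
records = [(34, "M5V2T6", "flu"),    (36, "M5V1K4", "flu"),
           (52, "M4C8B1", "asthma"), (57, "M4C3J9", "flu")]

def generalize(age, postcode):
    """Coarsen the quasi-identifiers: 10-year age bands and a
    3-character postcode prefix."""
    band = age // 10 * 10
    return (f"{band}-{band + 9}", postcode[:3])

def k_anonymity(rows):
    """A dataset is k-anonymous if every generalized quasi-identifier
    combination is shared by at least k records; the smallest group
    size is the dataset's k, a simple privacy-risk score."""
    groups = Counter(generalize(age, pc) for age, pc, _ in rows)
    return min(groups.values())

print(k_anonymity(records))  # → 2: each generalized group holds 2 people
```

The trade-off the paragraph above describes is visible even here: coarser generalization raises k (less re-identification risk) but erases analytical detail, which is exactly what an optimizing anonymization tool tries to balance.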

If this is implemented, data collection on EU citizens will remain possible in the GDPR era, and businesses can continue to obtain business insights without risking their reputation and revenue. However, these actions can now be done in a way that respects privacy.

Sadly, not everyone has gotten the message, as nearly 130 fines have been actioned so far.

The top five regulatory fines

GDPR carries a weighty fine: 4% of a business's annual global turnover, or €20M, whichever is greater. A fine of this size could significantly derail a business, and when paired with brand and reputational damage, it is evident that GDPR penalties should encourage businesses to rethink the way they handle data.
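The "whichever is greater" rule is simple arithmetic; a small sketch, with hypothetical turnover figures:

```python
def max_gdpr_fine(annual_global_turnover_eur):
    """Upper-tier GDPR ceiling: 4% of annual global turnover or
    EUR 20 million, whichever is greater."""
    return max(annual_global_turnover_eur * 4 / 100, 20_000_000)

# A firm with EUR 2B turnover faces a ceiling of EUR 80M...
print(max_gdpr_fine(2_000_000_000))  # → 80000000.0
# ...while a smaller firm is still exposed to the EUR 20M floor.
print(max_gdpr_fine(100_000_000))    # → 20000000
```

Note that because the €20M floor applies regardless of size, small businesses are not insulated from severe penalties.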

1. €204.6M: British Airways

Article 32: Insufficient technical and organizational measures to ensure information security

User traffic was directed to a fraudulent site because of improper security measures, compromising 500,000 customers’ personal data. 

 2. €110.3M: Marriott International

Article 32: Insufficient technical and organizational measures to ensure information security

The guest records of 339 million guests were exposed in a data breach due to insufficient due diligence and a lack of adequate security measures.

3. €50M: Google

Article 13, 14, 6, 5: Insufficient legal basis for data processing

Google was found to have breached articles 13, 14, 6, and 5 because it created user accounts during the configuration stage of Android phones without obtaining meaningful consent. They then processed this information without a legal basis while lacking transparency and providing insufficient information.

4. €18M: Austrian Post

Article 5, 6: Insufficient legal basis for data processing

Austrian Post created more than three million profiles on Austrians and resold their personal information to third-parties, like political parties. The data included home addresses, personal preferences, habits, and party-affinity.

5. €14.5M: Deutsche Wohnen SE

Article 5, 25: Non-compliance with general data processing principles

Deutsche Wohnen stored tenant data in an archive system that was not equipped to delete information that was no longer necessary. This made it possible to have unauthorized access to years-old sensitive information, like tax records and health insurance, for purposes beyond those described at the original point of collection.
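The capability missing in that archive, deleting records once their retention period lapses, can be sketched in a few lines. The records and the seven-year retention window below are hypothetical, chosen only to illustrate Article 5's storage-limitation principle:

```python
from datetime import date, timedelta

# Hypothetical archive entries: (tenant_id, record, collected_on).
archive = [("t1", "tax-return-2012", date(2012, 5, 1)),
           ("t2", "insurance-2019",  date(2019, 8, 15)),
           ("t3", "tax-return-2019", date(2019, 11, 2))]

def purge_expired(entries, today, retention=timedelta(days=365 * 7)):
    """Keep only records still inside the retention window -- the
    deletion capability Article 5 expects a storage system to have."""
    return [e for e in entries if today - e[2] <= retention]

kept = purge_expired(archive, today=date(2020, 1, 1))
print([tenant for tenant, _, _ in kept])  # → ['t2', 't3']
```

Building this kind of purge into the archive from the start is Privacy by Design in miniature: the system itself enforces the retention promise rather than relying on manual cleanup.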

Privacy laws like GDPR seek to restrict data controllers from gaining access to personally identifiable information without consent and prevent data from being handled in manners that a subject is unaware of. If these fines teach us anything, it is that investing in technical and organizational measures is a must today. Many of these fines could have been avoided had businesses implemented Privacy by Design. Privacy must be considered throughout the business cycle, from conception to consumer use. 

Businesses cannot risk violations for the sake of it. With risk-aware privacy software, they can continue to analyze data while protecting privacy, with the guarantee of a privacy risk score.

Resolution idea for next year: Avoid ending up on this list in 2020 by adopting risk-aware anonymization.

The data access bottleneck


We create an influx of information each day, minute, and second. In the United States alone, 4,416,720 gigabytes of data were used every minute in 2019, a figure reported to have risen 41% since 2018.

As we continue into the fast-paced era of technology, the world has been bombarded with floods of user information without the resources to manage it. The role of data scientist, a career that didn't exist ten years ago, has topped Glassdoor's list of the best jobs in America for the last five years.

A data scientist's responsibilities include collecting and cleaning data, performing analysis, applying data science techniques, and measuring analytic results. This vital process helps businesses by providing customer insights that drive innovation. However, the analysis itself often takes a back seat, because cleaning and organizing the data takes so much time.

Data scientists must search out the data they need through other departments or data lakes, often waiting hours to receive it. When the information finally arrives, it may contain severe data quality issues. This takes considerable time away from actually analyzing the data.

This scenario has a typical time division known as the 80/20 rule: 80% of a data scientist's work time is spent finding and cleaning data, while only 20% is spent on analysis.

This bottleneck of information leads to an increase in potential error and dries up analytical resources. 

One survey conducted by TMMData and the Digital Analytics Association offers insight into the difficulty a data scientist faces before getting the opportunity to apply analytic techniques. Of the 800 people surveyed, 56.9% said it takes a few days to a few weeks before they are granted access to all the data they need.

The study also found that only one-third are able to access all the data they need immediately or receive it in less than one day.

On top of this, 43 respondents to the survey said that gaining data access is one of their top two analytics challenges.

Beyond the difficulty of gaining access, the influx of information stored in data lakes is often of poor quality: 48% of data scientists questioned the accuracy of the data they received. Incomplete or bad data can lead a data scientist in the wrong direction in their analysis.

In 2017, IBM reported that the two previous years had produced 90% of the world's data. As technology grows, the ability to consume and organize data must expand as well. To reverse the 80/20 time statistic for data science, companies' abilities to harness and manage data as it's collected must improve.

Flipping 80/20 for data science

Based on the statistics presented, the most significant issues for data scientists involve access wait times and cleaning the data once received.

It’s understandable why data is so disorganized right now. No one could predict the pace the internet and technology took just ten years ago. Knowing how and when to prepare and store data is still a relatively new issue. 

To address this, the efficiency of data prep must increase, and the number of people able to work with the data should expand. By spreading data access across the organization and limiting prep time through less manual methods, companies will see a faster turnover of data.

Now is the time to play catch-up, and organize the incoming data so that analytics can be prepped and ready to move your company forward as fast as possible.