Facebook collecting healthcare data


As many of our previous blogs have highlighted, COVID-19 is severely impacting the tech world. Privacy regulation has become a hot topic of debate among governments, Big Tech, and users.

Facebook has joined the top companies taking advantage of user data in COVID-19 research. Meanwhile, enforcement of Brazil's LGPD is facing pushback because of COVID-19. In contrast, US senators are introducing a new privacy bill to ensure Americans' data privacy remains protected.

 

Facebook collecting symptom data

In the current pandemic climate, tech companies of all sizes have stepped up to provide solutions and aid to governments and citizens struggling to cope with COVID-19. As we've highlighted in previous blog posts, Google and Apple have been at the front lines, introducing systems that protect user privacy while changing how communities track the virus.

Following closely behind, Facebook has introduced its own effort to put user data to work for COVID-19 research.

Facebook announced partnerships with several American universities to begin collecting symptom data in different countries. Facebook's CEO and founder told The Verge that the information could help highlight COVID-19 hotspots across the globe, especially in places where governments have neglected to address the virus's severity.

Facebook has been working throughout this pandemic to demonstrate how aggregated and anonymized data can be used for good.

However, not everyone is convinced by Facebook's sudden embrace of responsible data use. One article highlighted that the company is still under FTC investigation over privacy issues.

Facebook's long list of privacy invasions is raising concerns not over how the data is currently being used, but over how it will be handled after the pandemic subsides.

 

Brazil pushes back privacy legislation.

At the beginning of this year, we wrote an article outlining Brazil's first data protection act, the LGPD. This privacy legislation closely follows the EU's GDPR and will unify the 40 privacy laws currently in force in the country.

Even before COVID-19 hit countries like Brazil, many tech companies were pressuring the Brazilian government to push back the LGPD's effective date.

On April 29th, the Brazilian president issued a provisional measure delaying the LGPD's applicability date to May 3rd, 2021. The measure gives the Brazilian Congress 16 days to approve the new implementation date.

If Congress does not approve the new date by May 15th, it must vote on an alternative; failing that, the LGPD will come into effect on August 14th, 2020.

Brazil's Senate has since voted to move the law's introduction to January 2021, with sanctions taking effect in August 2021. That means lawsuits and complaints can be filed as of January 1st, and enforcement actions can begin on August 1st (source).

 

America introduces new privacy law.

Just as COVID-19 has affected Brazil's privacy legislation, it has prompted some US senators to step up to protect the privacy of American citizens' data.

The few senators proposing this bill have said they are working to “hold businesses accountable to consumers if they use personal data to fight the COVID-19 pandemic.”

This bill does not target contact tracing apps like those proposed by Apple and Google. However, it does aim to ensure that these companies use data effectively and protect it.

The bill requires companies to gain consent from users before collecting any health or location data. It also forces companies to ensure that the information they collect is properly anonymized and cannot be re-identified. Finally, the bill requires tech companies to delete all identifiable information once COVID-19 has subsided and tracking apps are no longer necessary.

The bill has wide acceptance across the congressional floor and will be enforced by state attorneys general. It is being considered a big win for Americans' privacy rights, especially given past trust issues between big tech companies and their users.

Location data and your privacy


As technology grows to surround the entirety of our lives, it comes as no surprise that our every move is tracked and stored by the very apps we trust with our information. With the current COVID-19 pandemic, the consequences of inviting Big Tech into our every movement are being revealed.

At this point, most technology users understand what information they knowingly give to companies, such as their birthdays, access to pictures, or other sensitive information. However, many may be unaware of how much location data companies collect and how that affects their privacy.

Location data volume expected to grow

Over 90% of the world's data has been created since 2017. As wearable technology grows in popularity, the amount of data a person creates each day is steadily climbing.

One study projected that the number of IoT-enabled devices installed worldwide will hit 75 billion by 2025. This astronomical number highlights how intertwined technology is with our lives, and how readily we welcome it, often without understanding how our data is collected.

Marketers, companies, and advertisers will increasingly look to location-based information as its volume grows. A recent study found that more than 84% of marketers use location data.

The last few years have seen a boost in big tech companies giving their users more control over how their data is used. One example is in 2019 when Apple introduced pop-ups to remind users when apps are using their location data.

Location data is saved and stored so companies can easily target you with personalized ads and products. Understanding what your devices collect about you, and how to limit data sharing on them, is crucial as we move forward in the technological age.

Click here to read our past article on location data in the form of wearable devices. 

COVID-19 threatens location privacy

Risk the privacy of thousands of people, or save thousands of lives? That has been the question throughout this pandemic, and the time for debating it is running out. Companies among the top 100 have stepped up to volunteer their anonymized data, including SAS, Google, and Apple.

One of the largest concerns is not how this data is being used in this pandemic, but how it could be abused in the future. 

One Forbes article drew a comparison to the regret many faced after sharing DNA with sites like 23andMe, which led to health insurance issues or entanglement in criminal investigations.

As companies like Google, Apple, and Facebook step up to the COVID-19 technology race, many are expressing concern, since these companies have not proven reliable at anonymizing user data.

In addition to the data-collection concern, governments and big tech companies are looking into contact-tracing applications. Using civilian location data for surveillance, even when framed as serving the greater good of health and safety, raises multiple red flags about how our phones can be used to monitor our every movement. To read more about contact tracing apps, read our latest article.

Each company has stated that it anonymizes the data it collects. However, anonymized information can still be exploited, especially in the hands of government.

With all this said, big tech companies hold power over our information and are playing a vital role in the COVID-19 response. Paying close attention to how user data is managed post-pandemic will reveal how these companies truly handle user information.

 

Key terms to know to navigate data privacy


As the data privacy discourse continues to grow, it’s crucial that the terms used to explain data science, data privacy and data protection are accessible to everyone. That’s why we at CryptoNumerics have compiled a continuously growing Privacy Glossary, to help people learn and better understand what’s happening to their data. 

Below are 25 terms covering privacy legislation, personal data, and other privacy and data science terminology, to help you better understand what our company does, what other privacy companies do, and what is being done with your data.

Privacy regulations

    • General Data Protection Regulation (GDPR) is a privacy regulation implemented in May 2018 that has inspired more regulations worldwide. The law requires data controllers to establish a specific legal basis for each and every purpose for which personal data is used. If a business intends to use customer data for an additional purpose, it must first obtain explicit consent from the individual. As a result, all data in data lakes can only be made available for use after processes have been implemented to notify, and request permission from, every subject for every use case.
    • California Consumer Privacy Act (CCPA) is a sweeping piece of legislation aimed at protecting the personal information of California residents. It gives consumers the right to learn about the personal information that businesses collect, sell, or disclose about them, and to prevent the sale or disclosure of their personal information. It includes the Right to Know, Right of Access, Right to Portability, Right to Deletion, Right to be Informed, Right to Opt-Out, and Non-Discrimination Based on Exercise of Rights. This means that if consumers do not like the way businesses are using their data, they can request that it be deleted, a risk for business insights.
    • Health Insurance Portability and Accountability Act (HIPAA) is a health protection regulation signed into law in 1996 by President Clinton. The act gives patients the right to privacy and covers 18 personal identifiers that are required to be de-identified. It applies not only in hospitals but also in places of work, schooling, etc.

Legislative Definitions of Personal Information

  • Personal Data (GDPR): “any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person” (source)
  • Personal Information (PI) (CCPA): “information that identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household.” (source)
  • Personal Health Information (PHI) (HIPAA): considered to be any identifiable health information that is used, maintained, stored, or transmitted by a HIPAA-covered entity – A healthcare provider, health plan or health insurer, or a healthcare clearinghouse – or a business associate of a HIPAA-covered entity, in relation to the provision of healthcare or payment for healthcare services. PHI is made up of 18 identifiers, including names, social security number, and medical record numbers (source)
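To make the PHI definition above concrete, here is a minimal Python sketch of the first step of de-identification: dropping direct-identifier fields from a record before sharing it. The field names are hypothetical, and this is a toy illustration only, not a complete HIPAA Safe Harbor implementation (which requires removing all 18 identifier types).

```python
# Hypothetical direct-identifier fields; real HIPAA Safe Harbor covers 18 types.
DIRECT_IDENTIFIERS = {"name", "ssn", "medical_record_number"}

def strip_direct_identifiers(record: dict) -> dict:
    """Return a copy of the record with direct-identifier fields removed."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

patient = {
    "name": "Jane Doe",
    "ssn": "123-45-6789",
    "medical_record_number": "MRN-001",
    "age": 44,
    "diagnosis": "hypertension",
}

print(strip_direct_identifiers(patient))
# {'age': 44, 'diagnosis': 'hypertension'}
```

Note that removing direct identifiers alone is not anonymization: the remaining fields can still act as quasi-identifiers, as the terms below explain.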

Privacy terms

 

  • Anonymization is a process where personally identifiable information (whether direct or indirect) from data sets is removed or manipulated to prevent re-identification. This process must be made irreversible. 
  • Data controller is a person, an authority or a body that determines the purposes for which and the means by which personal data is collected.
  • Data lake is a collection point for the data a business collects. 
  • Data processor is a person, an authority or a body that processes personal data on behalf of the controller. 
  • De-identified data is the result of removing or manipulating direct and indirect identifiers to break any links so that re-identification is impossible. 
  • Differential privacy is a privacy framework that characterizes a data analysis or transformation algorithm rather than a dataset. It specifies a property that the algorithm must satisfy to protect the privacy of its inputs, whereby the outputs of the algorithm are statistically indistinguishable when any one particular record is removed in the input dataset.
  • Direct identifiers are pieces of data that identify an individual without the need for more data, ex. name, SSN, etc.
  • Homomorphic encryption is a method of performing a calculation on encrypted information (ciphertext) without decrypting it (to plaintext) first.
  • Identifier: Unique information that identifies a specific individual in a dataset. Examples of identifiers are names, social security numbers, and bank account numbers. Also, any field that is unique for each row. 
  • Indirect identifiers are pieces of data that can be used to identify an individual indirectly, or with the combination of other pieces of information, ex. date of birth, gender, etc.
  • Insensitive: Information that is not identifying or quasi-identifying and that you do not want to be transformed.
  • k-anonymity is where identifiable attributes of any record in a particular database are indistinguishable from at least one other record.
  • Perturbation: Data can be perturbed by using additive noise, multiplicative noise, data swapping (changing the order of the data to prevent linkage) or generating synthetic data.
  • Pseudonymization is the processing of personal data in such a way that it can no longer be attributed to a specific data subject without the use of additional information, provided that such additional information is kept separately and is subject to technical and organizational measures that prevent re-identification.
  • Quasi-identifiers (also known as indirect identifiers) are pieces of information that on their own are not sufficient to identify a specific individual, but that combined with other quasi-identifiers make it possible to re-identify an individual. Examples of quasi-identifiers are zip code, age, nationality, and gender.
  • Re-identification, or de-anonymization, is when anonymized (de-identified) data is matched with publicly available information, or auxiliary data, to discover the individual to whom the data belongs.
  • Secure multi-party computation (SMC), or Multi-Party Computation (MPC), is an approach to jointly compute a function over inputs held by multiple parties while keeping those inputs private. MPC is used across a network of computers while ensuring that no data leaks during computation. Each computer in the network only sees bits of secret shares — but never anything meaningful.
  • Sensitive: Information that is more general among the population, making it difficult to identify an individual with it. However, when combined with quasi-identifiers, sensitive information can be used for attribute disclosure. Examples of sensitive information are salary and medical data. Let’s say we have a set of quasi-identifiers that form a group of women aged 40-50, a sensitive attribute could be “diagnosed with breast cancer.” Without the quasi-identifiers, the probability of identifying who has breast cancer is low, but once combined with the quasi-identifiers, the probability is high.
  • Siloed data is data stored away in silos with limited access, to protect it against the risk of exposing private information. While these silos protect the data to a certain extent, they also lock the value of the data.
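To tie together the k-anonymity, quasi-identifier, and perturbation entries above, here is a short Python sketch (with made-up records) that measures k, the size of the smallest group sharing the same quasi-identifier values, and shows how generalizing zip code and age raises it:

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Return k: the size of the smallest group of records that share the
    same quasi-identifier values (smaller k = easier re-identification)."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

rows = [
    {"zip": "90210", "age": 34, "diagnosis": "flu"},
    {"zip": "90210", "age": 36, "diagnosis": "asthma"},
    {"zip": "90211", "age": 38, "diagnosis": "flu"},
]
print(k_anonymity(rows, ["zip", "age"]))  # 1: every row is unique on (zip, age)

# Generalization: truncate the zip code and bucket ages into decades
# so records become indistinguishable on the quasi-identifiers.
generalized = [
    {**r, "zip": r["zip"][:3] + "**", "age": (r["age"] // 10) * 10}
    for r in rows
]
print(k_anonymity(generalized, ["zip", "age"]))  # 3: all rows share ("902**", 30)
```

Note the trade-off the glossary hints at: the generalized table is harder to re-identify but also less precise for analysis.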
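The differential privacy entry above can likewise be illustrated with the classic Laplace mechanism: a query answer is released with noise whose scale is the query's sensitivity divided by epsilon. This is a simplified, standard-library-only sketch for intuition, not a production differential privacy implementation:

```python
import random

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    """Return the query answer plus Laplace(sensitivity / epsilon) noise.

    The difference of two i.i.d. Exponential(1) draws is Laplace-distributed,
    which avoids edge cases of an inverse-CDF sampler.
    """
    scale = sensitivity / epsilon
    noise = scale * (random.expovariate(1.0) - random.expovariate(1.0))
    return true_value + noise

# Example: a counting query (sensitivity 1, since one person changes the
# count by at most 1) answered with a privacy budget of epsilon = 0.5.
print(laplace_mechanism(42, sensitivity=1.0, epsilon=0.5))
```

Smaller epsilon means more noise and stronger privacy; the noisy outputs remain statistically close whether or not any one record is in the input, which is exactly the property the definition above describes.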

How can working from home affect your data privacy?


On March 11, the World Health Organization declared the Coronavirus (COVID-19) a global pandemic, sending the world into a mass frenzy. Since that declaration, countries around the world have shut borders, closed schools, requested citizens to stay indoors, and sent workers home. 

While the world may appear to be at a standstill, some jobs still need to get done. Like us at CryptoNumerics, companies have sent their workers home with the tools they need to complete their regularly scheduled tasks from the comfort of their own homes. 

However, with a new influx of people working from home, insecure networks, websites, or AI tools can leave company information vulnerable. In this article, we'll go over where your privacy may be at risk during this work-from-home season.

Zoom’s influx of new users raises privacy concerns.

Zoom is a video-conferencing company whose platform is used to host meetings, online chats, and online collaboration. With people across the world required to work or attend school online, Zoom has seen a substantial increase in users. In February, Zoom's shares rose 40%, and in three months its monthly active users doubled compared with all of 2019 (Source).

While this influx and global exposure are significant for any company, this unprecedented level of usage can expose holes in Zoom's privacy protection efforts, a concern that many are starting to raise.

Zoom's growing demand makes it a big target for third parties, such as hackers, looking to gain access to sensitive or personal data. Zoom is being used by companies large and small, as well as by students across university campuses, meaning a vast amount of important, sensitive data could very well be vulnerable.

Some university professors have decided against Zoom telecommuting, saying the Zoom privacy policy, which states that they may collect information about recorded meetings that take place in video conferences, raises too many concerns of personal privacy. 

On a personal privacy level, Zoom gives the administrator of the conference call the ability to see when a caller has moved to another webpage for over 30 seconds. Many are calling this option a violation of employee privacy. 

Internet-rights advocates have begun urging Zoom to begin publishing transparent reports detailing how they manage data privacy and data security.  

Is your Alexa listening to your work conversations?

Both Google Home and Amazon’s Alexa have previously made headlines for listening to homes without being called upon and saving conversation logs.  

Last April, Bloomberg released a report highlighting Amazon workers listening to and transcribing conversations heard through Alexa devices in people's homes. Bloomberg reported that most voice assistant technologies rely on human help to improve the product. Not only were Amazon employees listening to Alexa recordings without the devices being called on by users, but they were also sharing the things they heard with their co-workers.

Amazon claims the recordings sent to the “Alexa reviewers” are only provided with an account number, not an address or full name to identify a user with. However, the entire notion of hearing full, personal conversations is uncomfortable.

As the world is sent to work from home, and with over 100 million Alexa devices in American homes, there should be some concern over the degree to which these speaker systems are listening in on your work conversations.

Our advice during this work-from-home long haul? Review your online applications' privacy settings, and be cautious of what devices may be listening when you have important meetings or calls.

What does COVID-19 mean for patient privacy?


The rapid spread of the Coronavirus (COVID-19) has sent the world into mass shock, halting the movement in the economy, companies, schools and regular life. 

In situations of mass panic such as this, maintaining privacy and legislative compliance is the last thing on the public's mind. However, for companies and hospitals, this should not be the case. In this weekly news, we will go through how proper data sharing is beneficial, how governments are reacting to privacy concerns, and how employers should be handling their employees' information.

Data Sharing and COVID-19

According to one Wired article released last week, genomic data and data marketplaces across countries are being used to better understand the virus and how it spreads.

NextStrain, an open-source application for tracking pathogen evolution, is helping researchers release and share strain data as soon as 48 hours after a sample is located.

The article explains that because NextStrain is open source, research facilities can create their own versions or use the application as a starting point for other open-research models.

This cross-platform data sharing “creates new opportunities to bridge the gap between public health and academia, and to enable novice users to explore the data as well.”

While this data sharing is proving helpful in moving quickly to understand and stop the growth of the virus, it also presents problems.

One issue with open-source data sharing, as one researcher told Wired, is that non-professionals can misinterpret the information, as happened when a Twitter user published false conclusions last week. That Twitter thread illustrates not only the danger of incorrect information but also how quickly data can spread across platforms, emphasizing the importance of anonymizing the influx of COVID-19 patient data.

Last month, we released a short article involving genomic data and marketplaces, as well as the process of de-identifying its information. Click here to read more about what that entails. 

Crisis Communication 

Last week, we released an article about the lack of privacy in South Korea, where every detail of patients' lives is disclosed to the public, out of fear that ordinary people may have come into contact with the infected individual.

As the virus moves toward Western countries, this handling of privacy must be prevented. However, in unprecedented situations such as this, the "every man for himself" mindset takes over for much of the public as fear of contact with an infected person spreads.

One senior risk manager told Modern HealthCare, “It’s a slippery slope—if you let people know where the cases are, they may be more cautious and stay away from certain events,” she said. “If you say nothing, they get a false sense of security.” 

When looking to release information to the public or between researchers, hospitals need to ensure their data is de-identified and compliant with legislation like the Health Insurance Portability and Accountability Act (HIPAA). Not doing so leaves organizations liable to penalties ranging from $100 to $50,000 per violation.

In a newly released Advis survey, only 39% of surveyed U.S. hospitals reported that they were prepared for an outbreak like COVID-19. This level of unpreparedness is where cracks in patient privacy can open up, putting sensitive data at risk of exposure to the general public.

COVID-19 and personal privacy 

Last month, the U.S. Department of Health and Human Services released a bulletin outlining HIPAA and privacy factors in response to the outbreak.

The bulletin highlights the minimum required disclosures for employers and workplaces, and weighs the implications of sharing patient data against the need to do so. It serves as a reminder to the general public of the importance of privacy protection, especially in scenarios as drastic as the current one.

The mass fear this virus creates has to be dealt with properly by those in positions of authority. Employers and companies must ensure they approach the handling of this pandemic with patient privacy and legislative compliance in mind.

One U.S. law firm, Sidley, created and released an elaborate list of questions companies should be reflecting on while dealing with COVID-19. In terms of privacy, some items include:

  • What information can companies collect from third parties and open sources about employees’ and others’ health and risk of exposure?
  • Are there statutory, regulatory or contractual restrictions on any data collection, processing or dissemination contemplated to address COVID-19 risks? What are the risks of these activities?
  • Are existing privacy disclosures and international data transfer mechanisms adequate to address any new data collection and analyses?
  • Is a privacy impact assessment, or a security risk assessment, required or advisable for any new data-related activities?

(Source)

The main struggle for companies right now is ensuring that their employee information is dealt with in compliance with privacy legislation, while still keeping in mind the safety of the other workers.



Data sharing is an issue across industries


Privacy, as many of our previous blogs have emphasized, is essential not only to the business-customer relationship but also on a moral level. The recent Fitbit acquisition by Google has created big waves in the privacy sphere, as customers' health data is at risk given Google's past dealings with personal information. Staying on the topic of healthcare data, the recent Coronavirus panic has thrown patient privacy out the window as fear of the spreading virus rises. Finally, data sharing continues to raise eyebrows as popular social media app TikTok scrambles to protect its privacy reputation.

Fitbit acquisition causing major privacy concerns

From its in-house command system to being the world’s most used search engine, Google has infiltrated most aspects of regular life. There are seemingly no corners left untouched by the search engine. 

In 2014, Google released Wear OS, a smartwatch platform for monitoring health that is also compatible with phones. While wearable technology has soared to the top of the technology charts as a popular way to track and manage health and lifestyle, Google's Wear OS has not gained the popularity necessary to remain a strong competitor.

In November of last year, Google announced its acquisition of Fitbit for $2.1 billion. Fitbit has sold over 100 million devices and is worn by over 28 million people, 24 hours a day, 7 days a week. Many are calling this Google’s attempt to recover from its failing project.

But there is more to this acquisition than staying on top of the market: personal data.

Google's terrible privacy reputation is falling onto Fitbit, as fears grow that the personal information Fitbit holds, like sleep patterns or heart rate, will fall into the hands of third parties and advertisers.

Healthcare is a large market, and one Google has been quietly buying into for years. Access to personal health information gives Google an edge in the healthcare partnerships it has been seeking.

Fitbit has come under immense scrutiny after its announced partnership with Google, seeing sales drop 5% in 2019. Many are urging Fitbit consumers to ditch their products amidst the acquisition.

However, Fitbit still maintains that users will be in full control of their data and that the company will not sell personal information to Google.

The partnership will be followed with a close eye going forward, as government authorities such as the Australian Competition and Consumer Commission open inquiries into the companies' intentions.

TikTok scrambling to fix privacy reputation

TikTok is a social media app that has taken over video streaming services. With over 37 million users in the U.S. last year, TikTok has been downloaded over 1 billion times, and that number is expected to rise 22% this year.

While the app reports these drastically high download numbers, it has been continuously reprimanded for its poor privacy policy and its inability to protect user information. After the app was already banned by companies across the U.S., Republican Senator Josh Hawley introduced legislation to prohibit federal workers from using it. This follows several security flaws reported against the app in January, involving user location and access to user information.

The CEO of Reddit recently criticized TikTok, saying he tells people, “don’t install that spyware on your phone.”

These privacy concerns stem from the app's connection with the Chinese government. In 2017, the viral app Musical.ly was acquired and merged with TikTok by Beijing-based company ByteDance for $1 billion. Chinese law requires companies to comply with government intelligence operations if asked, meaning apps like TikTok would have no authority to decline government access to their data.

In response to their privacy backlash, the company made a statement last year saying all their data centers are located entirely outside of China. However, their privacy policy does state that they share a variety of user data with third parties. 

In a new attempt to address these privacy concerns, Roland Cloutier, formerly of ADP, has been hired as Chief Information Security Officer to oversee privacy issues within the popular app.

With Cloutier's long history in cybersecurity, there is hope that the popular app will soon gain a better privacy reputation.

Coronavirus raising concerns over personal information

The Coronavirus is a deadly, fast-spreading respiratory illness that has moved quickly throughout China and has now been reported in 33 countries across the world.

Because of this, China has been thrown into a rightful panic and has gone to great lengths to combat the virus's spread. However, in working to contain it, many say patient privacy is being thrown out the window.

Last month, China put out a ‘close contact’ app that tells people whether they have been around someone who has contracted the virus. The app assigns each user a colour code: green for safe, yellow for a required 7-day quarantine, and red for a 14-day quarantine.

Not only is the app required to enter public places like subways or malls, but the data is also shared with police. 

The New York Times reported that the app sends a person's location, city name, and an identifying code number to the authorities. China's already high-tech surveillance has reached new limits, as the Times reports that surveillance cameras placed around neighborhoods are being strictly monitored to watch residents who have been assigned yellow or red codes.

South Korea has also thrown patient privacy to the wind, as text messages are sent out, highlighting every movement of individuals who contracted the virus. One individual’s extra-marital affair was exposed through the string of messages, revealing his every move before contracting the virus, according to the Guardian.

The question on everyone’s mind now is, what happens to privacy when the greater good is at risk?

For more privacy blogs, click here
