CCPA is here. Are you compliant?

On January 1, 2020, the California Consumer Privacy Act (CCPA) came into effect, and it has already altered the ways companies can make use of consumer data.

Before the CCPA, big data companies could freely harvest user data and use it for data science, analytics, AI, and ML projects, monetizing consumer data with no protection for privacy. With the law officially in force, companies have no choice but to comply or pay the price. This begs the question: is your company compliant?

CCPA Is Proving That Privacy Is Not a Commodity: It’s a Right

The legislation protects consumers from having their data sold for secondary purposes: without explicit permission, companies cannot use that data.

User data is highly valuable for companies’ analytics and monetization initiatives, so widespread opt-outs can be detrimental to a company’s continued success. By de-identifying consumer data, companies can follow CCPA guidelines while maintaining high data quality.

The CCPA sets a demanding, standardized ruleset for what counts as de-identification, complete with specific definitions and detailed explanations of how to meet it. Despite these guidelines, and with the legislation only just in effect, studies have found that only 8% of US businesses are CCPA compliant.

For companies that are not yet CCPA compliant, the time to act is now. By thoroughly understanding the CCPA’s regulations, companies can protect their users while still benefiting from their data.

To do so, companies must understand the significance of maintaining analytical value and the importance of adequately de-identified data. An organization that does not comply with the CCPA is vulnerable to fines of up to $7,500 per violation, as well as individual consumer damages of up to $750 per incident.

For perspective, the GDPR, which came into effect in 2018, allows fines of up to 4% of a company’s annual global revenue.

To ensure a CCPA fine is not coming your way, assess your current data privacy protection efforts to ensure that consumers:

  • are asked for direct consent to use their data
  • can opt out of analytical uses of their data, or have that data removed
  • cannot be re-identified from the data you retain

In essence, the CCPA does not impede a company’s ability to use, analyze, or monetize data. It requires that data be de-identified or aggregated, and that this be done to the standards its legislation sets out.

Our research found that 60% of datasets companies believed to be de-identified carried a high re-identification risk. There are three ways to reduce the possibility of re-identification (a sketch of the second follows the list):

  • Use state-of-the-art de-identification methods
  • Assess the likelihood of re-identification
  • Implement controls so that data required for secondary purposes is CCPA compliant
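
To make the second method concrete, here is a minimal sketch, in Python, of one common way to assess re-identification likelihood: group records by their quasi-identifiers and flag small groups, following the k-anonymity model. The column names and the k threshold are illustrative assumptions, not values prescribed by the CCPA.

```python
# Minimal re-identification risk check based on k-anonymity.
# The quasi-identifier columns and k threshold are illustrative only.
import pandas as pd

QUASI_IDENTIFIERS = ["zip_code", "birth_year", "gender"]  # assumed columns
K_THRESHOLD = 5  # equivalence classes smaller than k are considered risky

def reidentification_risk(df: pd.DataFrame) -> dict:
    """Group records by quasi-identifiers and summarize small classes."""
    class_sizes = df.groupby(QUASI_IDENTIFIERS).size()
    risky = class_sizes[class_sizes < K_THRESHOLD]
    return {
        "min_class_size": int(class_sizes.min()),   # worst-case k
        "risky_records": int(risky.sum()),          # records in small classes
        "risky_share": float(risky.sum() / len(df)),
    }
```

If min_class_size comes back as 1, the dataset contains records that are unique on their quasi-identifiers, which is exactly the high-risk condition our research keeps finding in supposedly de-identified datasets.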

Read more about these effective privacy automation methods in our blog post, The Business Incentives to Automate Privacy Compliance Under CCPA.

Manual Methods of De-Identification Are Tools of the Past

A core compliance task under the CCPA is identifying which methods of de-identification leave consumer data susceptible to re-identification. Manual approaches, which are extremely common, leave room for exactly that, and companies that rely on them make themselves vulnerable to CCPA penalties.

Strong protection is achievable through techniques such as k-anonymity and differential privacy. Applying them manually, however, is impractical, both for meeting the 30-day cure period the CCPA provides and for achieving high-quality data protection.
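
Of the two techniques, differential privacy illustrates well why automation beats manual effort: the mechanism itself is simple, but choosing and tracking privacy parameters across many releases is not. Below is a minimal sketch of the standard Laplace mechanism for a counting query; the epsilon value and the opt-in count are illustrative assumptions.

```python
# Minimal Laplace mechanism: add noise scaled to sensitivity/epsilon
# to an aggregate before release. The epsilon value here is illustrative.
import numpy as np

def noisy_count(true_count: int, epsilon: float = 0.5) -> float:
    """Release a differentially private count (sensitivity of a count is 1)."""
    sensitivity = 1.0
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

# Example: publish how many consumers opted in, without exposing any individual.
print(noisy_count(1284))
```

Smaller epsilon values add more noise and therefore more protection; the difficulty, and the case for automation, lies in tracking the cumulative privacy budget across many such releases.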

Understanding the CCPA helps ensure that data is adequately de-identified and that re-identification risk is removed, all while meeting every legal specification.

Meeting CCPA requirements means ditching first-generation approaches to de-identification; adopting privacy automation reduces the possibility of re-identification. Using privacy automation to protect, and still make use of, consumers’ data is necessary for successfully maneuvering the new CCPA era.

Privacy automation ensures not only that user data is correctly de-identified, but also that it maintains high data quality.

CryptoNumerics as the Privacy Automation Solution

Despite the CCPA’s strict guidelines, the benefits of using data for data science, analytics, and monetization remain incredibly high. Scaling back those efforts would be a disservice to a company’s success.

Complying with CCPA legislation means determining which methods of de-identification leave consumer data susceptible to re-identification. Manual de-identification methods, such as masking or tokenization, leave room for improper anonymization.

This is where privacy automation becomes necessary to an organization’s analytics strategy.

Privacy automation abides by the CCPA while still supporting data science and analytics: if a user’s data is de-identified to the CCPA’s standards, data analysis remains possible.

Privacy automation revolves around the assessment, quantification, and assurance of data. A privacy automation tool simultaneously measures the risk of re-identification, applies data privacy protection techniques, and provides audit reports.
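
As a rough illustration of that loop, the sketch below measures a dataset’s worst-case k, applies a generalization step, re-measures, and emits an audit record. The function names, columns, and bucketing choices are hypothetical; this is not CN-Protect’s actual API.

```python
# Hypothetical privacy automation loop: assess, protect, re-assess, audit.
import json
from datetime import datetime, timezone

import pandas as pd

QUASI_IDENTIFIERS = ["zip_code", "birth_year"]  # assumed columns (zip as string)

def min_class_size(df: pd.DataFrame) -> int:
    """Smallest group of records sharing identical quasi-identifiers."""
    return int(df.groupby(QUASI_IDENTIFIERS).size().min())

def generalize(df: pd.DataFrame) -> pd.DataFrame:
    """Coarsen quasi-identifiers: decade buckets and 3-digit ZIP prefixes."""
    out = df.copy()
    out["birth_year"] = (out["birth_year"] // 10) * 10
    out["zip_code"] = out["zip_code"].str[:3]
    return out

def run_pipeline(df: pd.DataFrame) -> pd.DataFrame:
    protected = generalize(df)
    audit = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "k_before": min_class_size(df),
        "k_after": min_class_size(protected),
        "technique": "generalization (decade buckets, ZIP3)",
    }
    print(json.dumps(audit, indent=2))  # stand-in for a real audit report
    return protected
```

A real tool would iterate the protect and re-assess steps until a target risk threshold is met, logging each pass for the audit trail.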

A study by PossibleNow indicated that 45% of companies were still preparing and did not expect to be compliant by the CCPA’s implementation date. Adopting a privacy automation tool to better process data and prepare for the new legislation is critical to a company’s success under the CCPA. Privacy automation products such as CN-Protect allow companies to succeed in protecting data while still benefiting from its analytical value. (Learn more about CN-Protect)


Privacy: The Most Talked About Gadget of CES 2020

This week, Las Vegas once again hosted the Consumer Electronics Show (CES), accompanied by a range of flashy new gadgets. Most significant among the mix: privacy.

Technology front-runners such as Facebook, Amazon, and Google took the main stage to unveil data privacy changes in their products and to headline discussions about the importance of consumer privacy. Yet with each reveal, attendees noticed gaps and missteps in these companies’ attempts at privacy.

Facebook: A New Leader in Data Privacy? 

This year, Facebook attempted to portray itself as a changed company where privacy is concerned. Complete with comfortable seating and flowers, Facebook’s CES booth presented a company dedicated to customer privacy, pushing the idea that Facebook does not sell customer data.

Facebook relaunched a new-and-improved version of its “Privacy Checkup”, originally created in 2014, complete with easy-to-manage data-sharing settings. Facebook took the opportunity at this year’s CES to display added features such as the ability to turn off facial recognition, manage who can see a user’s account or posts, and remove or add preferences based on personal browsing history.

While these changes to privacy settings are a step in the right direction toward protecting user data, attendees could not help but notice the significant data privacy initiatives Facebook is side-stepping, most notably the lack of user control over how advertisers use personal information.

Ring’s New Control Center: Fix or Flop?

Ring has been a hot commodity in household security since its purchase by Amazon in 2018. However, recently, the company has come under fire for its law enforcement partnerships. 

In light of mounting hacking concerns, the home security company used CES to announce a new dashboard for both Apple and Android users, labeled the “Control Center”. It lets users manage connected Ring devices and third-party devices, and set options for whether law enforcement can request access to Ring videos.

Ring has ignored its customers’ initial requests for additions such as suspicious-activity detection or notifications of new account logins. Instead, it has continued to ship software that places the onus on users to protect themselves. Customers view this so-called privacy update as nothing more than a “cosmetic redesign”: the device still provides no significant hacker protection, and therefore no notable privacy protection, for its customers.

Google Assistant: New Front-Runner in Privacy Adjustments

Each year, Google is celebrated for taking full advantage of CES to immerse visitors in the company’s technology. This year, Google’s efforts focused on Google Assistant.

After last year’s confirmation that third-party workers were listening to Google Assistant recordings, Google’s data privacy efforts were at the forefront of its CES presence this year. On January 7, 2020, Google announced new features for its Assistant, reasserting its dedication to privacy protection. Users are now able to say things such as:

  • “Are you saving my audio data?”
  • “Hey Google, delete everything I said to you this week”
  • “Hey Google, that wasn’t for you”
  • “How are you keeping my information private?”

Of these new user commands, the most significant is “Are you saving my audio data?” It allows users to determine whether their Assistant is opted into giving Google access to their audio.

However, some Google Assistant users are accusing Google of placing the onus on the user instead of creating a product that protects users by default. As with the Ring controversy, there is frustration that Google is missing the mark in understanding its users’ privacy demands. That said, Google is one of the few companies taking steps that meaningfully change how user information is stored.

It is clear that this year’s CES, while still delivering new and exciting ‘gadgets of the future’, experienced a shift toward privacy as the most significant topic in technology. While the leading tech companies made that shift clear, many continue to miss the mark in understanding the privacy their users want.

Facebook, Ring, and Google each brought forward privacy changes of topical interest while continuing to misunderstand what it means to keep their users’ information private. Thus, the question we must ask ourselves as consumers of these products remains: are these minimal changes enough to justify continuing to pour our information into them?


You are the product: People are feeling defeatist in the surveillance age

As the year wraps up, a new study shows that Canadians are feeling defeatist about their lack of privacy. This signals an opportunity for privacy-prioritizing businesses to achieve a competitive advantage, especially amid Amazon’s latest plan to acquire patient data and Toys “R” Us’ return as a surveillance showroom.

Amazon’s latest partnership endorses the collection of patient data

“Dr. Alexa, I have a headache.”

“Don’t worry, that’s just a symptom of Amazon’s latest data harvesting plan.”

In a recent UK government contract, Amazon was given access to healthcare information collected by the National Health Service (NHS). While the material shared does not include patient data, Amazon will now be able to acquire that information straight from patients’ mouths.

Now, if you ask Alexa about a health symptom, it will search through the NHS information on symptoms, causes, and definitions of conditions, and provide you with a diagnosis. It will even offer to call 999.

In principle, this will reduce the workload on GPs by answering questions at home for the 14% of households that own an Amazon device. However, Amazon could use this data to track individuals and improve its ad targeting, create new products, or share data with third parties. An NHS spokesperson has assured the public that “appropriate safeguards” have been put in place, but it is unclear what those safeguards are.

The worry is that “the NHS has not only handed over expertise and allowed the e-commerce giant to directly profit from it – it has also endorsed Amazon to gather patient data straight from our own mouths, in a way that identifies us, with little regard for the consequences.” (Wired)

Toys to Surveillance: How Toys “R” Us is following in Amazon’s footsteps

Taking a cue from the competitor that nearly put it out of business, Toys “R” Us has been turned into a “private equity surveillance project.” (Vice)

Last week, Toys “R” Us opened a second store, and customers quickly took notice. The stores were smaller and lacked stock; some shoppers compared them to glorified showrooms. In reality, the new locations were designed not for shopping but for surveillance: throughout the store, ceiling sensors, cameras, and additional tech were deployed to capture your experience.

This will allow the business to measure shoppers’ paths and behaviour. It has been reported that some of the cameras will blur faces to prevent identification and that some children won’t be recorded (so long as they are under four feet tall). Yet the company has boasted that its traffic data is anonymous. Haphazard anonymization is not anonymization.

People have very harsh words to say about Toys “R” Us’ reincarnation. For example, Karl Bode wrote in a Vice article that, “Toys “R” Us and mascots like Geoffrey the Giraffe could have just died a quiet death, secure in the annals of retail history. Instead the brand has been lobotomized and re-animated into something decidedly different—a private equity-backed playground where everything from your location to your LEGO play is collected, stored, and monetized, whether you like it or not.”

People are feeling defeatist about private data security 

73% of Canadians don’t know who holds their data, but they do know they don’t trust what businesses are doing with it. A recent IBM Canada survey found that Canadians don’t feel they have privacy, even though 83% consider it important, and that they believe businesses should do more to protect them from cybersecurity threats.

It concluded that Canadians are “paying more attention to which companies make the security of their information a priority, and it’s starting to impact their buying decisions. In this digital economy, Canadians need to trust their privacy is protected.”

With 73% saying that they would not purchase again from a company that shared their information, Canadians are growing increasingly adamant. They want privacy.

This is not to suggest that the digital economy is cancelled, but rather that businesses that make a point of prioritizing privacy through disclosure, anonymization, and communication will be rewarded.

Don’t fall into the surveillance trap set by Amazon, and now Toys “R” Us. Consumers are willing to turn away from businesses that do not respect their privacy. As such, we predict the future of the digital economy lies in anonymization, and not the ad-hoc form deployed by Toys “R” Us.


Google’s “Project Nightingale” makes a mockery of patients’ right to privacy

Recently, Google announced a business partnership with Ascension, the second-largest healthcare provider in the US. Through this partnership, the medical records of 50 million Americans will be transmitted to Google without the knowledge or consent of the patients. None of the records involved are de-identified.

Data analytics has become synonymous with business success, and the personal information of real people is often viewed in terms of dollar signs and profit margins. In turn, true privacy is often portrayed as an unattainable ethical ideal. However, patient and consumer privacy should not be disregarded. Consumer consent is important, and if obtaining it is unmanageable, data should at least be de-identified, so that it is no longer personal.

On November 11, Google and Ascension signed an agreement, codenamed Project Nightingale, that constitutes the largest data transfer of its kind. Project Nightingale’s goal is to build a medical-action suggestion tool. However, potential ulterior motives have raised red flags across the globe. By the time the transfer is complete, 50 million patient records will have been shared. As of last Tuesday, 10 million had already been delivered.

In the past, similar efforts to use technology to improve healthcare have first required data to be de-identified. A good example is the collaboration between Google and Mayo Clinic. But in the case of Google and Ascension, the lack of de-identification suggests that a new boundary of data greed has been pushed, in an effort to make data available for purposes beyond those associated with Project Nightingale.

No one should be able to access and manipulate medical records without the knowledge and consent of patients and doctors. Not only is this highly unethical; it is also potentially illegal.

Coupled with its acquisition of Fitbit earlier this month, Google appears to be on a mission to become a major stakeholder in the healthcare industry. It is unlikely that Google wants to do this for the common good. After all, Google’s actions undermine the basic right to privacy afforded to all individuals. The company’s new ability to combine search histories and medical records for business gain is troublesome.

Project Nightingale may have violated HIPAA

Since neither patients nor doctors were made aware of Project Nightingale, Google is at risk of a HIPAA violation. In fact, a federal inquiry has already been launched.

Under the law, even healthcare professionals must get permission to access health records. Why should big tech be any different?

Google has repeatedly insisted that it will follow all relevant privacy laws. However, with the volume and variety of data that the company holds on the average individual, this case likely pushes into uncharted territory that few regulations currently govern.

Even if the secret harvesting of data is not determined to have breached HIPAA, it has undoubtedly crossed the ethical boundaries of healthcare. 

Google employees can access medical records and use them to make money

Through this partnership, Google plans to create a search tool, designed for medical professionals, that suggests prescriptions, diagnoses, and doctors.

While the public aim may be to improve patient outcomes and reduce spending, a whistleblower has expressed concerns that Google “might be able to sell or share the data with third parties, or create patient profiles against which they can advertise healthcare products.”

With the launch of its newest partnership, Google harvested patient names, lab results, hospitalization records, diagnoses, and prescriptions from over 2,600 hospitals. This data can be, and has been, accessed by Google staff (Source).

With this information,

  1. Google employees can access the medical records of real people. 
  2. Advertisements can target people based on their medical history.
  3. Google can pass identifiable health records to third parties.

The potential misuse of medical records places emphasis on the need to de-identify personal information that is being shared, especially without consent. Patients have now unknowingly been put at risk, and their trust has been completely violated. 

Who wants to think that their embarrassing injuries are lunchtime conversations for Google employees? That their cancer is the target of Google ads? That their mental health history is being sold to insurance companies?

As the Google whistleblower puts it, “Patients must have the right to opt-in or out. The uses of the data must be clearly defined for all to see, not just for now but for 10 or 20 years into the future.”

The actions of Google and Ascension cross the boundary of healthcare ethics, signaling a complete disregard for the privacy of patients. When it signed the deal and secretly harvested the medical records of 50 million Americans, Google demonstrated a sense of entitlement and deceitfulness that is entirely unbecoming of a business that already holds an enormous amount of data on the average citizen.

Confidentiality is the foundation of doctor-patient relationships, and if people can no longer trust that their secrets are safe with their healthcare providers, who can they trust?


This isn’t the tech people signed up for

Apple and Google are throwing punches over privacy, technological advancement, and price tags. Their feud highlights the importance of privacy rights and the perceptions surrounding privacy’s role in AI and products.

The tradeoff between progress and privacy

Apple CEO Tim Cook recently dismissed the idea that technological advancement is synonymous with privacy loss. While he did not name them directly, this comment is understood to have been a direct jab at Google and Facebook, which have come under much scrutiny due to the sheer mass of data they collect on customers. It has kicked off a debate over consumer data and big tech’s responsibility to protect it.

Recently, Apple has made moves to position itself as a privacy leader and defender, emphasizing that its revenue stream is not reliant on ads and branding the new iPhone with the tagline “what happens on your iPhone, stays on your iPhone”.

Tim Cook even went so far as to say that the belief that you have to understand everyone’s personal life to create great machine learning is a “false trade-off.” “There’s a lot of these false choices that are out there, embedded in people’s minds,” Cook said. “We try to systematically reject all of those and turn it on its head.” (Business Insider)

However, AI users everywhere were quick to point out that Apple’s lack of data collection is a hindrance to AI, noting the limited capabilities of Siri when compared to Alexa or Google Assistant.

This feud is not new; in the past, Google’s CEO has had criticisms of his own to share about Apple. As the saying goes, those in glass houses shouldn’t throw stones.

Privacy as a luxury feature

Google CEO Sundar Pichai “hinted that Apple could only afford to respect users’ privacy because its products are so expensive.” With a $1379 (CAD) minimum price point for the newest iPhone, the iPhone 11 Pro, we cannot dismiss his point.

While we believe preserving privacy and advancing AI in conjunction is possible through anonymization, this debate raises the larger concern of privacy price tags. Bankrate’s 2018 financial security index survey showed that only 39% of Americans could cover a $1,000 (USD) emergency with savings. That is a troubling sign if privacy is only affordable at a price point of over a grand.

Yet, rather than addressing privacy at a lower price, Pichai wrote in a New York Times op-ed, “We feel privileged that billions of people trust [Google products] to help them every day.” Some feel he was “waxing poetic about how privacy means many things to many people.” If so, his claim downplays the significance of privacy to users and implies that if users trust the company, privacy is unimportant.

Such an idea is 1984-esque, and it is a worry expressed in a recent Amnesty International report that refers to Google’s business model as “surveillance-based.” The report goes on to state that “This isn’t the internet people signed up for.”

We feel Craig Federighi, Apple’s senior vice president of Software Engineering, addresses the trust-versus-privacy notion well: “Fundamentally, we view the centralization of personalized information as a threat, whether it’s in Apple’s hands or anyone else’s hands.” Even so, Apple is not exactly the prime example of privacy.

iPhone 11 Pro is sharing your location data even when you say no

Despite the iPhone 11 Pro being advertised as seemingly the most privacy-focused smartphone on the market, security researcher Brian Krebs has found a significant privacy flaw. He discovered that the phone “pings its GPS module to gather location data, even if the user has set their phone not to do so.”

This could mean that Apple is geo-tagging locations of cell towers and Wi-Fi hotspots periodically, even after users have opted-out of sharing their location data. Krebs said, “Apparently there are some system services on this model (and possibly other iPhone 11 models) which request location data and cannot be disabled by users without completely turning off location services, as the arrow icon still appears periodically even after individually disabling all system services that use location.” (Forbes)

He suspects that this may be a hardware issue connected with supporting Wi-Fi 6, and emphasizes that the only way to avoid this issue is to disable your phone’s location services completely in settings. This will limit the phone’s capabilities tremendously (Say goodbye to Maps).

This revelation comes shortly after the discovery that iOS 13 was designed to offer users control over what data third-party apps can access, but not necessarily over what Apple’s own apps can access.

While Apple may be leading the industry in terms of privacy, its model is not bulletproof. What’s more, with such a steep price tag, there are concerns over privacy discrimination. At the end of the day, privacy is important to everyone, and must be available at every price point, whether or not the business is trustworthy.


Privacy as a commodity deepens inequality

Privacy is fundamental to societal and consumer values. Consequently, people have demanded privacy regulations to bar businesses from secretly monetizing their sensitive information. Yet new policy proposals suggest that treating data as a commodity will rebalance the relationship between Americans and the technology industry. Legislation of this form would perpetuate a future of data colonialism and threatens to disproportionately strip low-income communities of their privacy rights.

Americans value their privacy and expect businesses to respect it

The last few years have seen a transformation in the value of data. It is often referred to as the new oil, reflecting the proliferation of business insights and their impact on revenue. However, since the beginning of the GDPR talks, a wave of concern over the disregard of people’s privacy has emerged. This has led privacy and insights to be portrayed as polarizing priorities, with businesses and consumers at opposite ends of the argument. While the two need not be in conflict, given the advent of advanced privacy-protecting and insight-preserving technology, people’s fight to be protected signifies a clear prioritization of their privacy.

In support of this, Democratic senators on Tuesday proposed a new privacy bill, dubbed the Consumer Online Privacy Rights Act (COPRA), which may be the push needed to implement a federal privacy law in America.

This is intended to afford US citizens similar rights to their EU counterparts under GDPR. COPRA would:

  • Allow subjects to request what data companies hold on them and ask for it to be deleted or corrected
  • Require explicit consent for companies to collect and share sensitive data
  • Forbid companies from collecting more information than is reasonably needed to carry out the service consumers signed up for
  • Require CEOs of data-collecting companies to certify annually that they have “adequate internal controls” and reporting structures for compliance
  • Make private-citizen lawsuits over data collection possible

Sen. Maria Cantwell (D-Wash.) declared that “In the growing online world, consumers deserve two things: privacy rights and a strong law to enforce them.” Steve Durbin, managing director of the Information Security Forum, seems to agree, writing in an email, “What is clear is that privacy is becoming more of an issue in the United States.”

This week, a new Pew Research study questioned how these values impact Americans’ view of smart speakers. It demonstrated that more than half of Americans are concerned about data privacy and that 66% of respondents were not willing to sacrifice more data for more personalization.

As smart speakers continue to grow in popularity, data privacy concerns will continue to rise. At the same time, consumers are deciding where to buy based on brands’ privacy stances. This is evident in Google’s negative growth in smart speaker market share (-40.1%) in light of its GDPR fine, secret harvesting of medical records, and acquisition of Fitbit.

Learn more about how consumer purchasing decisions rely on product privacy in our blog: https://cryptonumerics.com/blog/consumer-purchasing-decisions-rely-on-product-privacy/.

Treating data as a commodity effectively monetizes your privacy rights 

In mid-November, Democratic candidate Andrew Yang proposed a four-prong policy approach to tackle the inadequacy of today’s American privacy legislation. Part of this plan is referred to as “data as a property right”: the idea that people should profit from the money companies make collecting and monetizing their personal data. That is, businesses would pay consumers who chose to give them access to their personal information.

While the proposal seeks to rebalance the American relationship with big tech, this model would normalize the idea of privacy as a commodity and disproportionately strip low-income communities of their data privacy.

Ulises Mejias, an associate professor at the State University of New York, explained that “Paying someone for their work is not some magical recipe for eliminating inequality, as two centuries of capitalism have demonstrated.” This argument signals that treating data privacy as a commodity would not only fail to rebalance the power; it would normalize systemic “data colonialism.”

In the article “Data Colonialism: Rethinking Big Data’s Relation to the Contemporary Subject,” researchers Couldry and Mejias suggest that continuous tracking “normalizes the exploitation of human beings through data, just as historic colonialism appropriated territory and resources and ruled subjects for profit.” This rests on the unprecedented opportunities for discrimination and behavioural influence, which would only scale if data were put up for sale.

The reality is that if data is considered a commodity, people would not be selling their data, but their privacy. After all, there are no reasonable statistics to determine the value of data: “[t]here’s no going rate for Facebook likes, no agreed-upon exchange rate for Instagram popularity.” (Malwarebytes Labs) So the question becomes not how much I am willing to sell my age information for, but how much I value, for example, the safety afforded by location secrecy, or the right not to be discriminated against based on sexual orientation.

When data is a commodity, private information that individuals should choose whether or not to disclose becomes transactional. “It’s much easier for a middle-class earner to say no to a privacy invasion than it is for stressed, hungry families,” Marlow said. (Malwarebytes Labs) In essence, treating data as a commodity is like a pay-for-privacy scheme, designed to take advantage of those who need extra money.

At a time when the world is pushing for data privacy to be considered a fundamental human right, moves to monetize privacy echo the historic appropriation of resources and people. Data colonialism would disadvantage those in low-income communities and reverse the movement toward privacy prioritization.

An alternative way to empower autonomy over consumer data is to regulate Privacy by Design and Default. Businesses should embed privacy into the framework of their products and have the strictest privacy settings as the default. In effect, privacy operations management must be a guiding creed from stage one, across IT systems, business practices, and data systems.

This promotes anonymization as a solution and leads to a future where business insights and consumer privacy are part of a common goal. By rejecting the commodity framing of the Yang proposal, we rescind the deep-seated inequality ingrained in pay-for-privacy schemes while accomplishing its original intent and building a better future. Privacy is not a commodity; it is a fundamental human right.
