This isn’t the tech people signed up for

Apple and Google are throwing punches over privacy, technological advancement, and price tags. Their feud highlights the importance of privacy rights and how privacy is perceived to shape AI and consumer products.

The tradeoff between progress and privacy

Apple CEO Tim Cook recently dismissed the idea that technological advancement is synonymous with privacy loss. While he did not name them directly, the comment is widely understood as a jab at Google and Facebook, both of which have come under scrutiny for the sheer volume of data they collect on customers. It has reignited the debate over consumer data and big tech’s responsibility to protect it.

Recently, Apple has made moves to position themselves as a privacy leader and defender, emphasizing that their revenue stream is not reliant on ads and branding the new iPhone with the tagline “what happens on your iPhone, stays on your iPhone”.

Tim Cook even went so far as to say that the belief that you have to understand everyone’s personal life to create great machine learning is a “false trade-off.” “There’s a lot of these false choices that are out there, embedded in people’s minds,” Cook said. “We try to systematically reject all of those and turn it on its head.” (Business Insider)

However, observers were quick to point out that Apple’s limited data collection is a hindrance to its AI, noting the limited capabilities of Siri when compared to Alexa or Google Assistant.

This feud is not new: in the past, Google’s CEO has had his own criticism of Apple to share. As the saying goes, those in glass houses shouldn’t throw stones.

Privacy as a luxury feature

Google CEO Sundar Pichai “hinted that Apple could only afford to respect users’ privacy because its products are so expensive.” With a $1379 (CAD) minimum price point for the newest iPhone, the iPhone 11 Pro, we cannot dismiss his point.

While we believe preserving privacy and advancing AI in conjunction is possible through anonymization, this debate raises the larger concern of privacy price tags. Bankrate’s 2018 financial security index survey showed that only 39% of Americans could cover a $1,000 (USD) emergency with savings. That bodes poorly if privacy can only be bought at a price point of over a grand.
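One standard anonymization technique that squares this circle is k-anonymity: quasi-identifiers are generalized until every record is indistinguishable from at least k-1 others, so no individual can be singled out while aggregate patterns remain learnable. A minimal sketch (hypothetical code, not any vendor’s implementation):

```python
from collections import Counter

def k_anonymity(records: list) -> int:
    """Return the dataset's k: the smallest group size over the
    quasi-identifier columns. Every record then hides among at
    least k lookalikes, so no individual can be singled out,
    while aggregate patterns stay usable for analytics or ML."""
    return min(Counter(records).values())

# Generalizing exact values into bands raises k without destroying signal.
raw = [("34", "90210"), ("36", "90211"), ("35", "90210")]
generalized = [("30-39", "902**")] * 3

assert k_anonymity(raw) == 1          # every row is unique: re-identifiable
assert k_anonymity(generalized) == 3  # each row hides among 3 lookalikes
```

The trade-off is granularity, not existence, of insight: a model can still learn from age bands and ZIP prefixes.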

Yet, rather than address privacy at a lower price, Pichai writes in a New York Times op-ed, “We feel privileged that billions of people trust [Google products] to help them every day.” Some feel that he was “waxing poetic about how privacy means many things to many people.” If so, his claim downplays the significance of privacy to users and implies that if users trust the company, then privacy is unimportant.

Such an idea is 1984-esque, and it is a worry expressed in a recent Amnesty International report that refers to Google’s business model as “surveillance-based.” The report goes on to state that “This isn’t the internet people signed up for.”

We feel Craig Federighi, Apple’s senior vice president of Software Engineering, addresses the trust-versus-privacy notion well: “Fundamentally, we view the centralization of personalized information as a threat, whether it’s in Apple’s hands or anyone else’s hands.” Even so, Apple is not exactly the prime example of privacy.

 

iPhone 11 Pro is sharing your location data even when you say no

Although the iPhone 11 Pro has been advertised, seemingly, as the most privacy-focused smartphone on the market, security researcher Brian Krebs has found a significant privacy flaw. He discovered that the phone “pings its GPS module to gather location data, even if the user has set their phone not to do so.”

This could mean that Apple is geo-tagging locations of cell towers and Wi-Fi hotspots periodically, even after users have opted-out of sharing their location data. Krebs said, “Apparently there are some system services on this model (and possibly other iPhone 11 models) which request location data and cannot be disabled by users without completely turning off location services, as the arrow icon still appears periodically even after individually disabling all system services that use location.” (Forbes)

He suspects that this may be a hardware issue connected with supporting Wi-Fi 6, and emphasizes that the only way to avoid this issue is to disable your phone’s location services completely in settings. This will limit the phone’s capabilities tremendously (Say goodbye to Maps).

This revelation comes shortly after the discovery that iOS 13 was designed to offer users control over what data third-party companies can access, but not necessarily over Apple’s own apps.

While Apple may be leading the industry in terms of privacy, its model is not bulletproof. What’s more, with such a steep price tag, there are concerns over privacy discrimination. At the end of the day, privacy is important to everyone, and must be available at every price point, whether or not the business is trustworthy.

Join our newsletter


Consumer purchasing decisions rely on product privacy

79% of Americans are concerned about the way companies are using their data. Now they are acting on that concern by avoiding products such as Fitbit after the Google acquisition. *Privacy Not Included, a shopping guide from Mozilla, signals that these privacy concerns will impact what (and from whom) consumers buy over the holidays.

Consumers are concerned about the ways businesses are using their data

A Pew Research Center study investigated the way Americans feel about the state of privacy, and the findings radiate concern.

    • 60% believe it is not possible to go through daily life without companies and the government collecting their personal data.
    • 79% are concerned about the way companies are using their data.
    • 72% say they gain nothing or very little from company data collected about them.
    • 81% say that the risks of data collection by companies outweigh the benefits.

This study determined that most people feel they have no control over the data that is collected on them and how it is used.

Evidently, consumers lack trust in companies and do not believe that most have their best interests at heart. In the past, this has not been such a big deal, but today, businesses will live and die by their privacy reputation. Such is reflected by the wave of privacy regulations emerging across the world, with GDPR, CCPA, and LGPD.

However, the legal minimum outlined in privacy regulations is not enough for many consumers, suggesting that meeting the basic requirements without embedding privacy into your business model is insufficient.

Such is seen with Fitbit, and the many users pledging to toss their devices in light of the Google acquisition. Google’s reputation has been tarnished in recent months by a €50 million GDPR fine and backlash over its secret harvesting of health records in the Ascension partnership.

Google’s acquisition of Fitbit highlights the risks of a failure to prioritize privacy

On November 1, Google acquired Fitbit for $2.1 billion in an effort, we presume, to breach the final frontier of data: health information. Fitbit users are now rising up against the fact that Google will have access not just to their search data, location, and behaviour, but now, their every heartbeat.

In consequence, thousands of people have threatened to discard their Fitbits out of fear and started their search for alternatives, like the Apple Watch. This validates the Pew study and confirms that prioritizing privacy is a competitive advantage.

Despite Fitbit’s claims that it will not sell personal information or health data, users are doubtful. One said, “I’m not only afraid of what they can do with the data currently, but what they can do with it once their AI advances in 10 or 20 years”.

 

This fear hinges on the general concern over how big tech uses consumer data, but it is escalated by Google’s historical failure to prioritize privacy. After all, why would Google invest $2.1 billion if it could not profit from the asset? It can only be assumed that Google intends to use this data to break into the healthcare space. This notion is validated by its partnership with Ascension, through which it has started secretly harvesting the personal information of 50 million Americans, and by the fact that it has started hiring healthcare executives.

Privacy groups are pushing regulators to block the acquisition that was originally planned to close in 2020.

Without Privacy by Design, sales will drop

On November 20, the third annual *Privacy Not Included report was launched by Mozilla, which determines if connected gadgets and toys on the market are trustworthy. This “shopping guide” looks to “arm shoppers with the information they need to choose gifts that protect the privacy of their friends and family. And, spur the tech industry to do more to safeguard customers.” (Source)

This year, 76 products across six categories of gifts (Toys & Games; Smart Home; Entertainment; Wearables; Health & Exercise; and Pets) were evaluated based on their privacy policies, product specifications, and encryption/bug bounty programs.

To receive a badge, products must:

    • Use encryption
    • Have automatic security updates
    • Feature strong password mechanics
    • Manage security vulnerabilities
    • Offer accessible privacy policies

62 of those products met the Minimum Security Standards, but Ashley Boyd, Mozilla’s Vice President of Advocacy, warns that this is not enough, because “Even though devices are secure, we found they are collecting more and more personal information on users, who often don’t have a whole lot of control over that data.”

8 products, on the other hand, failed to meet the Minimum Security Standards:

    • Ring Video Doorbell
    • Ring Indoor Cam
    • Ring Security Cams
    • Wemo Wifi Smart Dimmer
    • Artie 3000 Coding Robot
    • Litter-Robot 3 Connect
    • OurPets SmartScoop Intelligent Litter Box
    • Petsafe Smart Pet Feeder

These products fail to protect consumer privacy and to adequately convey the risks associated with their use. They are consumers’ worst nightmare, and the very reason 79% are concerned about the way companies are using their data.

The study revealed an evident lack of privacy prioritization across businesses, especially small ones, despite positive security measures. And those that did prioritize privacy tended to make customers pay for it. This signals that the market is looking for more privacy-focused products, and there is room to move in.

Businesses should embed privacy into the framework of their products and make the strictest privacy settings the default. In effect, privacy operations management must be a guiding creed from stage one, across IT systems, business practices, and data systems. This is what is known as Privacy by Design and Privacy by Default. These principles address the increasing awareness of data privacy and ensure that businesses will consider consumer values throughout the product lifecycle. To learn more read this: https://cryptonumerics.com/privacy-compliance/.
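To make Privacy by Default concrete, here is a hypothetical settings object (the field names are ours, purely illustrative): every data-sharing switch starts off, and data moves only after an explicit opt-in.

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    """Hypothetical per-user settings object (names are illustrative).

    Privacy by Default means every data-sharing switch starts off;
    data is shared only after the user explicitly opts in."""
    share_location: bool = False
    share_usage_analytics: bool = False
    personalized_ads: bool = False
    third_party_sharing: bool = False

settings = PrivacySettings()    # a brand-new user shares nothing by default
settings.share_location = True  # sharing requires a deliberate opt-in action
```

The design choice is that inaction is always the private outcome: a user who never opens the settings screen has shared nothing.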

Customers vote with their money. Coupling the Pew study results with the Fitbit case, it is clear that customers are privacy-conscious and willing to boycott not only products but also companies that do not represent their values. This week serves as a lesson that businesses must act quickly to bring their products in line with consumer privacy values, to move beyond basic regulatory requirements, and to meet the demands of customers.



Breaching Data Privacy for a Social Cause

Data partnerships are increasingly justified as a social good, but in a climate where companies are losing consumer trust through data breaches, privacy concerns begin to outweigh the social benefits of data sharing. 

 

This week, Apple is gaining consumer trust with its revamped Privacy Page. Facebook follows Apple’s lead as they become more wary about sharing a petabyte of data with Social Science One researchers due to increasing data privacy concerns. Also, law enforcement may be changing the genetic privacy game as they gain unprecedented access to millions of DNA records to solve homicide cases and identify victims.

Apple is setting the standard for taking consumer privacy seriously—Privacy as a Social Good

Apple is setting the stage for consumer privacy with its redesigned privacy page. Apple CEO Tim Cook announced, “At Apple, privacy is built into everything we make. You decide what you share, how you share it, and who you share it with. Here’s how we protect your data.” (Source)

There is no doubt that Apple is leveraging data privacy. When entering Apple’s new privacy landing page, bold letters are used to emphasize how privacy is a fundamental part of the company, essentially one of their core values (Source). 

Apple’s privacy page explains how they’ve designed their devices with their consumers’ privacy in mind. They also showcase how this methodology applies to their eight Apple apps: Safari browser, Apple Maps, Apple Photos, iMessage, Siri Virtual Assistant, Apple News, Wallet and Apple Pay, and Apple Health.

A privacy feature fundamental to many of Apple’s apps is that data on an Apple device is stored locally and is never released to Apple’s servers unless the user consents to share it or personally shares it with others. Personalized features, such as smart suggestions, are based on random identifiers.

  • Safari Browser blocks the data that websites collect about site visitors with an Intelligent Tracking Prevention feature and makes it harder for individuals to be identified by providing a simplified system profile for users. 
  • Apple Maps does not require users to sign in with their Apple ID. This eliminates the risk of user location and search information history linking to their identity. Navigation is based on random identifiers as opposed to individual identifiers.  

  • Apple Photos processes photos locally on the device; they are not shared unless stored in the cloud or shared by the user.

  • iMessages aren’t shared with Apple and are protected with end-to-end encryption.
  • Siri, Apple’s voice-activated virtual assistant can process information without the information being sent to Apple’s servers. Data that is sent back to Apple is not associated with the user and is only used to update Siri.
  • Apple News curates personalized news and reading content based on random identifiers that are not associated with the user’s identity. 
  • Apple Wallet and Pay creates a device account number anytime a new card is added. Transactional data is only shared between the bank and the individual.
  • Apple Health is designed to empower the user to share their personal health information with whom they choose. The data is encrypted and can only be accessed by the user via passcodes. 
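The random-identifier pattern mentioned above (used by Maps, News, and Siri) can be sketched in a few lines. This is an illustration of the general technique, not Apple’s actual implementation: each request carries a fresh random ID, so the server cannot link requests back to a user.

```python
import uuid

def new_request_id() -> str:
    """Mint a fresh random identifier for a single request.

    Because the ID is random rather than derived from an account,
    device, or prior session, the server cannot link two requests
    from the same person together."""
    return str(uuid.uuid4())

# Two requests from the same user look unrelated on the server side.
first, second = new_request_id(), new_request_id()
assert first != second
```

The cost is continuity: without a stable identifier, the server cannot build a long-term profile, which is exactly the point.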

 

Facebook realizes the ethical, legal, and technical concerns in sharing 1,000,000 gigabytes of data with social science researchers

Facebook has been on the wrong side of data privacy ever since the 2018 Cambridge Analytica scandal, in which users’ data was obtained, without their consent, for political advertising. Now that Facebook is approaching privacy with users’ best interests in mind, tension is emerging between the worlds of technology and social science.

Earlier this year, Facebook and Social Science One partnered in a new model of industry-academic partnership initiative to “help people better understand the broader impact of social media on democracy—as well as improve our work to protect the integrity of elections.” said Facebook (Source). 

Facebook agreed to share 1,000,000 gigabytes of data with Social Science One to conduct research and analysis, but it has failed to meet its promises.

According to Facebook, it was almost impossible to apply anonymization techniques such as differential privacy to the necessary data without stripping it completely of its analytical value.   
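Facebook has not published the details, but the trade-off it describes is easy to illustrate with the textbook Laplace mechanism for differential privacy: calibrated noise is added to each query answer, and the stronger the privacy guarantee (smaller epsilon), the noisier and less analytically useful the answer. A minimal sketch, assuming a simple counting query:

```python
import random

def dp_count(true_count: int, epsilon: float) -> float:
    """Release a count under epsilon-differential privacy.

    A counting query changes by at most 1 when any one person's
    record is added or removed (sensitivity 1), so Laplace noise
    with scale 1/epsilon suffices. The difference of two Exp(epsilon)
    draws is exactly Laplace(0, 1/epsilon)-distributed."""
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

# Smaller epsilon means stronger privacy but a noisier answer; at very
# small epsilon the noise can swamp the signal entirely, which is the
# utility loss described above.
noisy = dp_count(1000, epsilon=0.1)
```

Applied row-by-row across a petabyte of social data, with many correlated queries, the required noise grows quickly, which is why the anonymized release fell so far short of the raw data’s value.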

Facebook half-heartedly released some data as deadlines and pressure mounted, but what it released fell far short of what it promised. Facebook’s failure to share the data it agreed to counters the proposed social benefit of using the data to study the impact of disinformation campaigns.

Facebook is torn between a commitment to contributing to a socially good cause without breaching the privacy of its users. 

This exemplifies how Facebook may not have been fully prepared to shift its business model from one that involved data monetization to a CSR-driven (corporate social responsibility) model where data sharing is used for research while keeping privacy in mind. 

Will Facebook eventually fulfill their promises?

 

Socially Beneficial DNA Data: Should Warrants be given to access Genealogy website databases?

At a police convention last week, Florida detective Michael Fields revealed how he obtained a valid law enforcement request to access GEDmatch.com data (Source).

GEDmatch is a genealogy website that contains over a million users’ records. But, does the social benefit accrued outweigh the privacy violation to users whose data was exposed without their consent?

Last year, GEDmatch faced a mix of scrutiny and praise when it helped police identify the Golden State Killer after granting them access to its database (Source). After privacy concerns surfaced, GEDmatch updated its privacy terms: law enforcement could access only the data of users who had opted in to share it. Additionally, police are limited to searches for the purposes of “murder, nonnegligent manslaughter, aggravated rape, robbery or aggravated assault” cases (Source).

The recent warrant granted to detective Fields overrode GEDmatch’s privacy terms by allowing him to access the data of all users, even those who did not consent. This was the first time a judge agreed to a warrant of this kind. It changes the tone in genetic privacy, potentially setting a precedent for who has access to genetic data.

 



What is your data worth?

How much compensation would you require to give a company complete access to your data? New studies suggest that prescribing a price tag to data may be the wrong way to set fines for noncompliance. Meanwhile, 51 CEOs have written an open letter to Congress to request a federal consumer data privacy law, and the Internet Association has joined them in their campaign. At the same time, Facebook has been caught using Bluetooth in the background to track users and drive up profits.

Would you want your friends to know every facet of your digital footprint? How about your:

  • Location
  • Visited sites
  • Searched illnesses
  • Devices connected to the internet
  • Content read
  • Religious views
  • Political views
  • Photos
  • Purchasing habits


How about strangers? No? We didn’t think so. Then, the question remains, why are we sharing non-anonymized or improperly-anonymized copies of our personal information with companies? 

Today, many individuals regularly and unknowingly share their data with companies that collect it for profit. This data is used to monitor behaviour and profile you for targeted advertising that makes big data and tech companies, like Facebook, $30 per year in revenue per North American user (Source). Due to the profitability of data mining and the increasing number of nine-figure fines for data breaches, researchers have become fascinated by the economics of privacy.

A 2019 study in the Journal of Consumer Policy questioned how users value their data. In the study, individuals stated they would only be willing to pay $5/month to protect personal data. While the low price tag may sound like privacy is a low priority, it is more likely that individuals believe their privacy should be a given, rather than something they have to pay to receive. This theory is corroborated by the fact that when the question was reversed, asking how much users would accept in exchange for full access to their data, the median response was $80/month (Source).

While this study demonstrates a clear value placed on data from the majority, some individuals attributed a much higher cost and others said they would share data for free. Thus, the study concluded that “both willingness to pay and willingness to accept measures are highly unreliable guides to the welfare effects of retaining or giving up data privacy.” (Source)

Because this calls into question the ability of traditional measures of economic value to determine fines for data breaches and for illegally harvesting data, other influential players in data privacy research were asked how to go about holding corporations accountable to privacy standards. Rebecca Kelly Slaughter, Federal Trade Commission (FTC) Commissioner, stated that “injury to the public can be difficult to quantify in monetary terms in the case of privacy violations.” (Source)

Rohit Chopra, a fellow FTC commissioner, also explained that current levels of monetary fines are not a strong deterrent for companies like Facebook, as their business model will remain untouched. As a result, the loss could be recouped through the further monetization of personal data. Consequently, both commissioners suggested that holding Facebook executives personally liable would be a stronger approach (Source).

If no price can equate to the value of personal data, and fines do not deter prolific companies like Facebook, should we continue asking what data is worth? Alessandro Acquisti, of Carnegie Mellon University, suggests an alternative: view data privacy as a human right. This model of thinking poses an interesting line of inquiry for both big data players and lawmakers, especially as federal data privacy legislation gains popularity in the US (Source).

On September 10, 51 top CEOs, members of Business Roundtable, an industry lobbying organization, sent an open letter to Congress to request a US federal data privacy law that would supersede state-level privacy laws to simplify product design, compliance, and data management. Amongst the CEOs were the executives from Amazon, IBM, Salesforce, Johnson & Johnson, Walmart, and Visa.  

Throughout the letter, the giants blamed the state-level patchwork of privacy regulations for the disorder of consumer privacy in the United States. Today, companies face a growing number of state and jurisdictional laws that hold organizations to varying standards. This, the companies argue, is an inefficient way to protect citizens, whereas a federal consumer data privacy law would provide reliable and consistent protections for Americans.

The letter also goes so far as to offer a proposed Framework for Consumer Privacy Legislation that the CEOs believe should be the base for future legislation. This framework states that data privacy law should…

  1. Champion Consumer Privacy and Promote Accountability
  2. Foster Innovation and Competitiveness
  3. Harmonize Regulations
  4. Achieve Global Interoperability

While a unified and consistent method of holding American companies accountable could benefit users, many leading privacy advocates, and even some tech giants, have pointed out the self-interested motives of the CEOs, regarding the proposal as a method “to aggregate any privacy lawmaking under one roof, where lobby groups can water-down any meaningful user protections that may impact bottom lines.” (Source)

This pattern of a disingenuous push for a federal privacy law continued last week as the Internet Association (IA), a trade group funded by the largest tech companies worldwide, launched a campaign to request the same. Members are largely made up of companies who make a profit through the monetization of consumer data, including Google, Microsoft, Facebook, Amazon, and Uber (Source).

In an Electronic Frontier Foundation (EFF) article, this campaign was referred to as a “disingenuous ploy to undermine real progress on privacy being made around the country at the state level.” (Source) Should this occur, the federal law would supersede state laws, like The Illinois Biometric Information Privacy Act (BIPA) that makes it illegal to collect biometric data without opt-in consent, and the California Consumer Privacy Act (CCPA) which will give state residents the right to access and opt-out of the sale of their personal data (Source). 

In the last quarter alone, the IA has spent close to USD $176,000 trying, without success, to weaken the CCPA before it takes effect. As a result, now, in conjunction with Business Roundtable and TechNet, it has called for a “weak national ‘privacy’ law that will preempt stronger state laws.” (Source)

One of the companies campaigning to develop a national standard is Facebook, who is caught up, yet again, in a data privacy scandal.

Apple’s new iOS 13 update reworks the smartphone operating system to prioritize privacy for users (Source). Recent “sneak peeks” showed that it will notify users of background activity from third-party apps’ surveillance infrastructure, which is used to generate profit by profiling individuals outside their app usage. The culprit highlighted, unsurprisingly, is Facebook, which has been caught using Bluetooth to track nearby users.

While this may not seem like a big deal, by “[m]atching Bluetooth (and wi-fi) IDs that share physical location [Facebook could] supplement the social graph it gleans by data-mining user-to-user activity on its platform.” (Source) Through this, Facebook can track not just your location, but the nature of your relationships with others. By pairing Bluetooth-gathered interpersonal interactions with social tracking (likes, followers, posts, messaging), Facebook can escalate its ability to monitor and predict human behaviour.

While you can opt-out of location services on Facebook, this means you cannot use all aspects of the app. For instance, Facebook Dating requires location services to be enabled, a clause that takes away a user’s ability to make a meaningful choice about maintaining their privacy (Source).

In notifying users about apps using their data in the background, iOS 13 looks to bring back a measure of control to the user by making them aware of potential malicious actions or breaches of privacy.

In the wake of this, Facebook’s reaction has tested the bounds of reality. In an attempt to get out of the hot seat, they have rebranded the new iOS notifications as “reminders” (Source) and, according to Forbes, un-ironically informed users “that if they protect their privacy it might have an adverse effect on Facebook’s ability to target ads and monetize user data.” (Source) At the same time, Facebook PR has also written that “We’ll continue to make it easier for you to control how and when you share your location,” as if to take credit for Apple’s new product development (Source).

With such comments, it is clear that in the upcoming months, we will see how much individuals value their privacy and convenience. Between the debate over the value of data, who should govern consumer privacy rights, and another privacy breach by Facebook, the relevance of the data privacy conversation is evident. To stay up to date, sign up for our monthly newsletter and keep an eye out for our weekly blogs on privacy news.



Rewarded for sharing your data? Sign me up!

Companies are now starting to pay users for their data in an effort to be more ethical. A major Bluetooth security flaw has been detected that could prove harmful to millions. Blockchain’s future is looking bright as privacy-preserving technology booms. And Canada’s federal elections are being ‘watched’ for their history of ‘watching’ the public.

Rewarded for sharing your data? Sign me up!

Drop Technologies has secured USD $44 million in investments to grow a technology-based alternative to traditional customer loyalty programs. With over three million users already signed up, as well as 300 brands on its platform, such as Expedia and Postmates, the company is headed in the right direction.

Given that Facebook and other tech giants are monetizing data without user permission, getting paid for it doesn’t seem like a bad idea after all. “I’m a Facebook user and an Instagram user, and these guys are just monetizing my data left and right, without much transparency,” said Onsi Sawiris, a managing partner at New York’s HOF Capital. “At least if I’m signing up for Drop, I know that if they’re using my data I will get something in return, and it’s very clear” (Source).

This alternative to rewards programs tracks your spending with its 300+ brands and lets you earn points that you can spend at certain companies, such as Starbucks or Uber Eats. As an alternative to credit card rewards, it can benefit consumers looking for extra savings on their purchases. So don’t drop it till you try it!

Bluetooth proving to be a potential data breach vulnerability 

Researchers have discovered a flaw that leaves millions of Bluetooth users vulnerable to data breaches. This flaw enables attackers to interfere while two users are trying to connect without being detected, as long as they’re within a certain range. From music to conversations, to data entered through a Bluetooth device, anything could be at risk. “Upon checking more than 14 Bluetooth chips from popular manufacturers such as Qualcomm, Apple, and Intel, researchers discovered that all the tested devices are vulnerable to attacks” (Source). 

Fortunately, some companies such as Apple and Intel have already implemented security upgrades on their devices. Users are also advised to keep their security, software, and firmware updated at all times. 

Get ready for blockchain advancements like never before

For the past decade, blockchain has been used to build an ecosystem in which cryptocurrencies and peer-to-peer transactions are just a few of the many use cases (Source).

Traditionally, data is shared across centralized networks, leaving systems vulnerable to attacks. However, with decentralization as an added security measure to blockchain, the threat of a single point of failure across a distributed network is eradicated. 

As more and more companies turn to blockchain to gain the benefits of more efficient data sharing and easier data transfers, privacy is overlooked.

In most public blockchains today, transactions are visible to all nodes of a network. Naturally, the issue of privacy is raised due to the sensitive nature of the data, and this transparency comes at a cost. With digital transformation happening all around us, privacy protection cannot be ignored.

To address privacy, many blockchain companies are employing privacy-preserving mechanisms on their infrastructures, from zero-knowledge proofs to cryptographic techniques such as secure multi-party computation (MPC). These mechanisms keep data hidden as it’s shared and only reveal the specific elements needed for a specific task (Source).
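Real MPC protocols are far more involved, but the core idea can be illustrated with additive secret sharing, one of MPC’s basic building blocks (an illustrative sketch, not any blockchain vendor’s protocol): each party holds a random-looking share of a value, and the parties can compute a sum without anyone revealing an input.

```python
import random

PRIME = 2**61 - 1  # all arithmetic is modulo a large prime

def share(secret: int, n_parties: int) -> list:
    """Split a secret into n additive shares.

    Each share alone is a uniformly random number; only the sum of
    all n shares (mod PRIME) reveals the secret, so any n-1 parties
    learn nothing."""
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

def reconstruct(shares: list) -> int:
    return sum(shares) % PRIME

# Three parties can compute a sum without revealing their inputs: each
# adds its share of a to its share of b, and only the result is opened.
a_shares = share(120, 3)
b_shares = share(80, 3)
sum_shares = [(x + y) % PRIME for x, y in zip(a_shares, b_shares)]
assert reconstruct(sum_shares) == 120 + 80
```

This is how a network can verify or aggregate sensitive values while each node sees only meaningless random numbers.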

Cost efficiencies and a better understanding of consumer needs are just a few of the advantages of the privacy-preserving mechanisms being introduced. As data and privacy go hand in hand in the future, equitability and trust will be the keys to unlocking new possibilities that enhance life as we know it (Source).

Upcoming Canadian elections could turn into surveillance problem

Once again, the Canadian federal elections are raising concerns about interference and disruption through the misuse of personal data. In the past, political parties have been known to use their power to influence populations who are not aware of how their data is being used. 

Data has played a major role in past elections, and it could become a surveillance issue: experts who study surveillance say that harnessing data has been the key to electoral success. “Politicians the world over now believe they can win elections if they just have better, more refined and more accurate data on the electorate” (Source).

A related issue is a lack of transparency between voters and electoral candidates. “There is a divide between how little is publicly known about what actually goes on in platform businesses that create online networks, like Facebook or Twitter, and what supporters of proper democratic practices argue should be known” (Source).

Officials overseeing the upcoming election should pay close attention to the public’s personal data and how it is being used.

How to Decode a Privacy Policy

91% of Americans skip reading privacy policies before downloading apps. It is no secret that people and businesses are taking advantage of that, given that there’s a new app scandal, data breach, or hack every day. For example, take a look at the FaceApp fiasco from last month.

In their terms of use, they clearly state the following:

 “You grant FaceApp a perpetual, irrevocable, nonexclusive, royalty-free, worldwide, fully-paid, transferable sub-licensable license to use, reproduce, modify, adapt, publish, translate, create derivative works from, distribute, publicly perform and display your User Content and any name, username or likeness provided in connection with your User Content in all media formats and channels now known or later developed, without compensation to you. When you post or otherwise share User Content on or through our Services, you understand that your User Content and any associated information (such as your [username], location or profile photo) will be visible to the public” (Source).

These documents deserve real attention: they disclose legal information about your data, including what the company will do with it, how they will use it, and with whom they will share it.

So let’s look at the most efficient way to get through these dense documents: search for specific terms with a keyword or key-phrase search. The following terms are a great starting point:

  • Third parties
  • Except
  • Retain
  • Opt-out
  • Delete
  • With the exception of
  • Store/storage
  • Rights 
  • Public 
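That keyword scan is easy to automate. The helper below is a hypothetical sketch (not a tool mentioned in the article) that pulls out every sentence of a policy containing one of the terms above, so you can read only the clauses that matter:

```python
import re

# Terms worth flagging in a privacy policy (from the checklist above).
KEYWORDS = [
    "third parties", "except", "retain", "opt-out", "delete",
    "with the exception of", "store", "storage", "rights", "public",
]

def flag_clauses(policy_text: str) -> dict[str, list[str]]:
    """Map each keyword to the policy sentences that mention it."""
    # Naive sentence split: break after ., !, or ? followed by whitespace.
    sentences = re.split(r"(?<=[.!?])\s+", policy_text)
    hits: dict[str, list[str]] = {}
    for kw in KEYWORDS:
        matched = [s.strip() for s in sentences if kw in s.lower()]
        if matched:
            hits[kw] = matched
    return hits

policy = ("We retain your data for up to two years. "
          "We may share information with third parties.")
for kw, clauses in flag_clauses(policy).items():
    print(f"{kw}: {clauses}")
```

A plain substring match like this will over-flag (e.g. "store" also matches "storage"), which is fine here: the goal is to surface candidate clauses for a human to read, not to interpret them.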

“All consumers must understand the threats, their rights, and what companies are asking you to agree to in return for downloading any app,” says Adam Levin, founder of CyberScout. “We’re living in an instant-gratification society, where people are more willing to agree to something because they want it right now. But this usually comes at a price” (Source).

New York Passes Data Breach Law

New York recently passed a law known as the SHIELD Act, or the Stop Hacks and Improve Electronic Data Security Act. It applies to any business that collects personal data from New York residents. Among other provisions, the act:

  • requires notification to affected consumers when there is a security breach,
  • broadens the scope of covered information, 
  • expands the definition of what a data breach means, 
  • extends the notification requirement to any entity with the private information of a New York resident (Source)

Why Apple Won’t Let You Delete Siri Recordings

Apple claims to protect its users’ privacy by not letting them delete their specific recordings. “Apple’s Siri recordings are given a random identifier each time the voice assistant is activated. That practice means Apple can’t find your specific voice recordings. It also means voice recordings can’t be traced back to a specific account or device” (Source).
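The mechanism described above, a fresh random identifier per activation instead of an account-linked one, can be sketched in a few lines. This is an illustration of the design idea only, not Apple's actual implementation:

```python
import uuid

def new_recording_id() -> str:
    # Illustrative only: mint a fresh random identifier for each
    # activation, so a stored recording carries no link back to a
    # specific account or device -- and therefore can't be looked
    # up (or selectively deleted) by user.
    return uuid.uuid4().hex

first = new_recording_id()
second = new_recording_id()
print(first != second)  # True: identifiers never tie recordings together
```

The tradeoff falls out of the design: the same unlinkability that protects privacy is what makes "delete *my* recordings" impossible, because nothing in storage says which recordings are yours.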

After it was reported that contractors were listening to private Siri conversations, including doctor discussions and intimate encounters, Apple needed to change its privacy policies. 

Siri works differently than its rivals because Google Assistant and Alexa data are connected directly to a user’s account for personalization and customer-service purposes. Apple, by contrast, doesn’t depend heavily on ad revenue and customer personalization the way its rivals do – it relies on its hardware products and services.

LAPD Data Breach Exposes 2,500 Officers’ Data

The PII of about 17,500 LAPD applicants and 2,500 officers has been stolen in a recent data breach, with information such as names, IDs, addresses, dates of birth and employee IDs compromised.

LAPD and the city are working together to understand the severity and impact of the breach. 

“We are also taking steps to ensure the department’s data is protected from any further intrusions,” the LAPD said. “The employees and individuals who may have been affected by this incident have been notified, and we will continue to update them as we progress through this investigation” (Source).
