With the emerging Artificial Intelligence (AI) market comes the ever-popular privacy discourse. Data regulations are being introduced left and right, but while effective, they do not yet account for growing technologies like facial recognition or data marketplaces.
Companies like Clearview AI are once again making headlines after receiving cease-and-desist letters from Big Tech, despite there being no current facial recognition laws they are violating. Meanwhile, Nature released an article calling for an international code of conduct for genomic research aggregation. At the intersection of AI and healthcare, Microsoft has announced a $40-million AI for Health initiative.
Facial recognition company hit with cease-and-desist
A few weeks ago, we released a blog introducing the facial recognition start-up, Clearview AI, as a threat to privacy.
Since then, Clearview AI has continued to make headlines, and most recently has received cease-and-desist letters from Big Tech companies like Google, Facebook, and Twitter.
To recap, Clearview AI is a facial recognition company that has created a database of over 3 billion searchable faces, scraped from different social media platforms. The company has introduced its software into more than 600 police departments across Canada and the US.
The company’s CEO, Hoan Ton-That, has repeatedly defended his company, telling CBS:
“Google can pull in information from all different websites, so if it’s public, you know, and it’s out there, it could be inside Google search engine it can be inside ours as well.”
Google responded, calling this comparison ‘inaccurate.’ Google is a public search engine that gives sites choices about what content is indexed and the opportunity to have images withdrawn, options Clearview does not provide. Clearview goes as far as holding images in its database after they have been deleted from the source.
While Google and Facebook have both sent Clearview cease-and-desist letters, Clearview has maintained that it is within its First Amendment rights to use the information. One privacy attorney told CNET, “I don’t really buy it. It’s really frightening if we get into a world where someone can say, ‘The first amendment allows me to violate everyone’s privacy.’”
While cities like San Francisco have started banning facial recognition, there are currently no federal laws addressing the technology, leaving leeway for companies like Clearview AI to create potentially dangerous software.
Opening up genomic data for researchers across the world
With the introduction of new healthcare initiatives, privacy becomes more relevant than ever. Healthcare data contains some of an individual’s most sensitive information, so the idea of Big Tech buying and selling such personal data is scary.
Last week, Nature, an international journal of science, reported that over 800 terabytes of genomic data are now available to investigators all over the world. The authors worked explicitly to protect the privacy of the thousands of patients and volunteers who consented to have their data used in this research.
The article reports that the six-year collection of 2,658 cancer genomes across 468 institutions in 34 countries is creating an open market of genome data. The project, called the Pan-Cancer Analysis of Whole Genomes (PCAWG), was the first attempt to aggregate a variety of subprojects and release a dataset globally.
A significant emphasis of this article was on the lack of clarity within the healthcare research community on how to protect data in compliance with the ongoing changes to privacy legislation.
Some issues in these genomic marketplaces lie not only in the strategic attempts to comply with the variety of privacy legislation, but also in ensuring that no individual can be re-identified from this information. Protecting patient data is not just a legislative issue but a moral one.
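To make the re-identification risk concrete: even with names removed, combinations of quasi-identifiers (like age and sex) can single out a patient. One common safeguard is k-anonymity, which generalizes those fields until every record is indistinguishable from at least k-1 others. A minimal sketch, with hypothetical records and field choices:

```python
from collections import Counter


def generalize_age(age: int, bucket: int = 10) -> str:
    """Coarsen an exact age into a 10-year range, e.g. 34 -> '30-39'."""
    lo = (age // bucket) * bucket
    return f"{lo}-{lo + bucket - 1}"


def is_k_anonymous(records, k: int) -> bool:
    """True if every quasi-identifier combination occurs at least k times."""
    counts = Counter(records)
    return all(c >= k for c in counts.values())


# Hypothetical (age, sex) quasi-identifiers from a patient table
raw = [(34, "M"), (37, "M"), (31, "M"), (62, "F"), (68, "F")]
generalized = [(generalize_age(age), sex) for age, sex in raw]
# ('30-39', 'M') occurs 3 times and ('60-69', 'F') twice, so the
# generalized table is 2-anonymous but not 3-anonymous.
```

This is only one piece of the puzzle; research has shown that k-anonymity alone can still leak information, which is part of why techniques like differential privacy are gaining traction.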
The majority of the privacy uncertainty came from questions of what vetting should occur before access to the information is granted, or what checks should be made before data is shared internationally.
As the article says, “Genomic researchers urgently need clear data-sharing rules that are harmonized across jurisdictions.” The report calls for an international code of conduct to overcome the hurdles raised by the different emerging privacy regulations.
The article also notes that the Biobanking and BioMolecular Resources Research Infrastructure (BBMRI-ERIC) announced back in 2017 that it would develop an EU Code of Conduct on Health-Related Data. Once completed and approved, the code should help harmonize how health data is shared across the EU.
Microsoft to add another installment to AI for Good
The ability to collect patient data and share it in an open market for researchers and doctors is helping diagnose and cure patients faster than ever before. In addition, AI is seen as another vital tool for the growing healthcare industry.
Last week, Microsoft announced the fifth installment of its ‘AI for Good’ project: ‘AI for Health.’ Like its predecessors, this project will support healthcare initiatives by providing access to cash grants, AI tools, cloud computing, and Microsoft researchers.
The project will focus on three different AI strategies, including:
- Accelerating medical research
- Increasing the understanding of mortality to guard against global health crises
- Reducing health injustices
The program will emphasize supporting individual non-profits and under-served communities. In an accompanying video, Microsoft highlighted its focus on addressing Sudden Infant Death Syndrome, eliminating leprosy, and reducing diabetic retinopathy-driven blindness in partnership with different not-for-profits.
AI is becoming essential to healthcare, and the industry generates vast amounts of data that companies like Microsoft are utilizing. But with this, privacy has to remain at the forefront.
As with Nature’s genomic data, protecting user information is extremely important, and complicated, when looking to utilize the data’s analytical value while complying with privacy regulations. Microsoft announced that it would be using differential privacy as its privacy solution.
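In rough terms, differential privacy adds calibrated random noise to query results so that adding or removing any one person changes the output distribution by only a provably bounded amount. A minimal sketch of the classic Laplace mechanism for a counting query (the dataset and parameter choices here are hypothetical illustrations, not Microsoft’s implementation):

```python
import math
import random


def laplace_noise(scale: float) -> float:
    """Sample from a Laplace(0, scale) distribution via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))


def private_count(records, predicate, epsilon: float) -> float:
    """Count records matching predicate, with epsilon-differentially-private noise.

    A counting query has sensitivity 1 (one person changes the count by
    at most 1), so Laplace noise with scale 1/epsilon suffices.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)


# Hypothetical example: how many patients are over 60?
ages = [34, 67, 45, 72, 58, 61, 80, 29]
noisy = private_count(ages, lambda a: a > 60, epsilon=0.5)
```

Smaller epsilon means more noise and stronger privacy; the analyst sees a slightly perturbed count, while no single patient’s presence can be confidently inferred from it.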
Like Microsoft, we at CryptoNumerics use differential privacy as a method of anonymizing data while preserving its value. Learn more about differential privacy and CryptoNumeric solutions.