All The World’s A Stage Part 2: A Clear View Into Dystopia

Hoan Ton-That is an interesting guy. He is an Australian entrepreneur by way of Vietnam, an ex-model, and possibly the creator of a technology that has destroyed our personal privacy forever.



Clearview AI is a new research tool used by law enforcement agencies to identify perpetrators and victims of crimes.

Clearview AI’s technology has helped law enforcement track down hundreds of at-large criminals, including pedophiles, terrorists and sex traffickers. It is also used to help exonerate the innocent and identify the victims of crimes including child sex abuse and financial fraud.

Using Clearview AI, law enforcement is able to catch the most dangerous criminals, solve the toughest cold cases and make communities safer, especially the most vulnerable among us.

Sounds pretty good, right? Using technology to take down at-large terrorists, pedophiles and sex traffickers. Everybody can get on board with that. But, fittingly for a company with such shady business practices, the mission statement is very vague. It never tells you HOW the technology helps law enforcement catch all of these terrorists and sex traffickers. And that "how" is the biggest part of this equation.

Hoan Ton-That

In 2017, Hoan Ton-That and Richard Schwartz founded Clearview AI, an American company that provides facial recognition software for commercial use. The company has amassed an impressive database of over THREE BILLION images indexed from all over the internet, including all major social media sites.

The idea is simple: collect every photo of everyone on Earth, from every angle they can, then develop software that can match an identity to a face in a security camera feed or photo. Then… sell it to ANYONE who can afford it.
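Clearview hasn't published how its matching actually works, but a typical face-search pipeline runs a probe photo through a neural network to get an "embedding" vector, then finds the closest stored vector. A minimal sketch in Python (all names and numbers here are hypothetical; a real system uses a trained model and billions of indexed vectors):

```python
import math

def cosine_similarity(a, b):
    """Similarity between two embedding vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical database: identity -> face embedding produced by a neural network.
database = {
    "alice": [0.9, 0.1, 0.2],
    "bob":   [0.1, 0.8, 0.3],
}

def identify(probe, threshold=0.9):
    """Return the best-matching identity, or None if nothing clears the threshold."""
    best_name, best_score = None, -1.0
    for name, embedding in database.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None
```

The scary part isn't the math, which is decades old; it's the scraped three-billion-image database the search runs against.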

To say this young company has already had its fair share of controversy would be a gross understatement.

I guess first things first: the pair of entrepreneurs have allegedly been linked to some interesting and downright controversial characters from the beginning. While courting investors, they loaned the service out to dickhead billionaires left and right. The early Clearview technology was simply used as a "plaything for the rich."

These wealthy potential investors used the technology for pressing matters like identifying their daughter's boyfriend to make sure she wasn't dating a "charlatan." The software company has also been linked with other shitheads like everybody's favorite, Jeffrey Epstein, and Keebler-elf look-alike Jeff Sessions.

But who they hung out with before the company took off is just the tip of the iceberg as far as controversy is concerned.

Clearview operated in a shroud of secrecy for years before a New York Times exposé titled "The Secretive Company That Might End Privacy as We Know It" was published in January 2020 (1).

Following the publication, over forty tech and civil rights organizations sent a letter to the Privacy and Civil Liberties Oversight Board (PCLOB), and the ACLU began legal proceedings on behalf of the people of the state of Illinois.

Software "scraping" photos of users off every social media site in the world, without their knowledge, sounds really illegal, but not surprisingly it isn't. Not surprising only because most cyber law isn't legislated until after the act, mainly because no one expected anyone to develop software that stores every single photo that's ever been posted to the internet. Which in hindsight seems obvious: OF COURSE some James Bond villain type would develop this. But as they say, hindsight is always 20/20.

Ultimately highly unethical but not necessarily illegal.

Twitter, Facebook and the rest of Big Social eventually responded with cease-and-desist letters demanding the return of all user photos scraped from their respective platforms, but I wouldn't hold my breath.

Hoan Ton-That is far from some sort of strongman, and Clearview AI is no legal powerhouse. What makes the company so powerful isn't the company itself; it's who employs it.

Law enforcement agencies immediately jumped at the chance to use the facial recognition software and remain Clearview's largest client base. But in late February 2020 the company suffered a major data breach, with hackers leaking its entire client list. No one was surprised to find out that ICE, the FBI or the DOJ had been using the software; what shocked many was the number of private companies employing the facial recognition service, businesses like Home Depot, Best Buy and even Madison Square Garden. Most businesses and college campuses that were using the software, or considering it as a security measure, were met with ferocious backlash and distanced themselves from the company. But not all schools and private businesses did.


Commercial and other non-government entities

As the coronavirus pandemic raged in April 2020, Clearview even threw its hat in the ring, offering its services to state departments of health to assist with contact tracing.

It seems there's nothing they're not willing to exploit people's privacy for.

So what are we left with? An invasive and creepy company that keeps the wolf from the door by collecting and selling our heads (and faces) to law enforcement agencies searching for criminals, and a room full of perpetually stunned lawmakers who never have the foresight or imagination to look to the future and get ahead of gross oversteps in cyber privacy and security.

Oh, and it's worth noting that although the company claims up to a 99.6% accuracy rate in identifying people, that really only seems to hold for Caucasian males. In a study conducted by the National Institute of Standards and Technology, facial recognition algorithms falsely identified Asian and African American faces 10 to 100 times more often. Sounds like America to me. – DC
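To see why that 10-to-100-times error-rate gap matters at Clearview's scale, here's a back-of-the-envelope sketch. The per-comparison false-match rate below is purely hypothetical (Clearview's real figures aren't public); the point is how a tiny rate multiplies against three billion images:

```python
# Illustrative numbers only -- Clearview's actual per-query error rates are not public.
DATABASE_SIZE = 3_000_000_000  # ~3 billion scraped images

def expected_false_matches(false_match_rate, db_size=DATABASE_SIZE):
    """Expected number of wrong hits for one probe photo searched against the database."""
    return false_match_rate * db_size

base_rate = 1e-6  # hypothetical one-in-a-million false match rate per comparison
print(expected_false_matches(base_rate))        # 3000.0 wrong hits at the base rate
print(expected_false_matches(base_rate * 100))  # 300000.0 wrong hits at a 100x higher rate
```

Even a one-in-a-million error rate produces thousands of wrong hits per search at that scale, and a hundredfold worse rate for some demographics means hundreds of thousands.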




Follow me on social media if that's your thing:

Definitely not the reason I do this, BUT if anything I write brings you joy and you feel like buying me a cup of coffee, I would appreciate it wholeheartedly.

6 thoughts on “All The World’s A Stage Part 2: A Clear View Into Dystopia”

  1. Great post! Do AIs have built in bigotry? Just read an article about google firing the only Black person on the ethical AI team and also the white female co-founder. Big brother watching is more sophisticated. What a place this world is becoming?


    1. Thank you. ☺️ I think the inaccuracies result from there being a less diverse pool of photos of people of color online than of Caucasians. The podcast Malicious Life covered this much better than I did, if you want a more educated look at the racial discrepancies. According to them, the algorithms were frequently wrong about African American females, but due to having the smallest amount of online representation, Native Americans were the most victimized by the tech. It’s fascinating and frightening stuff.


      1. Being a cynic, I wonder if that was not the purpose—to incarcerate more innocent Black and non-white people. Essentially removing them from society to ensure non-whites never get ahead or pose a threat of success.


      2. Thanks for sharing the link. It highlights my concern. Yes AI perform functions based on mathematical algorithms. The problem is there is bias inherent in the algorithms. As discussed in the link, the algorithms created by whites and asians at google identified gorillas as black people.


      3. Yes. We certainly have a long way to go. We can only hope that it’ll take less time to right the wrongs of this technology than the broken, seemingly hopeless system we see in America right now.

