9 DISTURBING AI Tools You Haven’t Seen!

OpenAI has unveiled its new text-to-video model, Sora. Meanwhile, facial recognition software is matching people's faces against billions of images scraped from social media websites. As much as we love to see the rapid advancement in AI tools and their many use cases, some of them push things to the very edge. In this article, we are going through nine of the most disturbing AI tools that you will not believe already exist, covering everything from cloning to tracking to creative AI, and even the possible fraudulent uses of some of these tools. If you think you have seen everything, think again.

Kicking things off at number nine is PimEyes. Explaining what this tool does almost feels like describing something straight out of a sci-fi movie. PimEyes is designed to find any and every photo of you that has ever surfaced on the internet. The company describes it as an online face-search engine: the same way you would search Google to find everything that has been posted about a topic, PimEyes does with faces. When you upload a photo, the tool runs it through facial recognition infrastructure and reverse-searches the image for matches all over the internet, returning not just the matching faces but the exact places where they were posted online. This could be a genuinely useful tool with a wide range of applications, but it also invites some pretty malicious uses. At a time when so much information is voluntarily put on the internet, anyone searching with intent could use such a tool to scour for relevant information and use it to find or stalk a person. And this is merely the tip of the iceberg compared to some of the other tools coming up. My number-one tool will leave you with many questions.
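To give a rough sense of the principle behind a face-search engine, here is a minimal sketch that builds a tiny local index of face embeddings and compares a query face against it by distance, using the open-source face_recognition library. To be clear, this only illustrates the idea: PimEyes' actual system is proprietary and operates at internet scale, and the folder layout and match threshold below are assumptions for illustration.

```python
# Minimal sketch of reverse face search: embed every face in a local photo
# folder, then compare a query face against that index by embedding distance.
# Illustrative only; PimEyes' real system is proprietary and internet-scale.
from pathlib import Path
import face_recognition  # open-source, dlib-based face embeddings

def build_index(photo_dir: str):
    index = []  # list of (file path, 128-d face encoding)
    for path in Path(photo_dir).glob("*.jpg"):
        image = face_recognition.load_image_file(path)
        for encoding in face_recognition.face_encodings(image):
            index.append((str(path), encoding))
    return index

def search(query_photo: str, index, tolerance: float = 0.6):
    query = face_recognition.load_image_file(query_photo)
    encodings = face_recognition.face_encodings(query)
    if not encodings or not index:
        return []  # no face found in the query, or nothing indexed
    distances = face_recognition.face_distance([e for _, e in index], encodings[0])
    # Lower distance means a closer match; 0.6 is the library's usual cutoff.
    return [(index[i][0], d) for i, d in enumerate(distances) if d <= tolerance]

if __name__ == "__main__":
    matches = search("query.jpg", build_index("photos/"))
    for path, distance in sorted(matches, key=lambda m: m[1]):
        print(f"{path}  distance={distance:.3f}")
```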

But staying on the topic of stalking, let's check out number eight, GeoSpy. This AI tool is every internet user's worst nightmare. In the context of this conversation, imagine for a second that someone is trying to track you down. With PimEyes, they can deep-search the internet for photos of you, but that would only be the first step. With GeoSpy, your stalker now has the opportunity to find out exactly where a photo was taken. Even the basic version of GeoSpy is already remarkably accurate: with only the slightest context, it can identify the country, state, and city where a photo was taken, and it even provides estimated coordinates. Unsurprisingly, for even better performance, you can sign up for early access to the Pro version, which aims to provide the exact coordinates of where a photo was taken. Photos with more context, such as buildings, weather, and surroundings, also help the AI pin down the exact location. If that isn't disturbing, I don't know what is.
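For context, here is the much simpler, old-school baseline that GeoSpy goes far beyond: reading the GPS coordinates that many phones quietly embed in a photo's EXIF metadata. GeoSpy itself infers location from what is visible in the frame, so treat this only as a point of comparison; the sketch assumes Pillow and a JPEG with intact EXIF data.

```python
# Baseline geolocation: read GPS coordinates from a photo's EXIF metadata.
# (GeoSpy works even when no such metadata exists, using visual clues alone.)
from PIL import Image
from PIL.ExifTags import GPSTAGS

def exif_gps(path: str):
    exif = Image.open(path)._getexif() or {}
    gps_raw = exif.get(34853)  # 34853 is the EXIF "GPSInfo" tag
    if not gps_raw:
        return None  # the photo carries no embedded GPS data
    gps = {GPSTAGS.get(t, t): v for t, v in gps_raw.items()}

    def to_degrees(dms, ref):
        # Convert (degrees, minutes, seconds) plus N/S/E/W reference to a float.
        deg = float(dms[0]) + float(dms[1]) / 60 + float(dms[2]) / 3600
        return -deg if ref in ("S", "W") else deg

    return (to_degrees(gps["GPSLatitude"], gps["GPSLatitudeRef"]),
            to_degrees(gps["GPSLongitude"], gps["GPSLongitudeRef"]))

print(exif_gps("holiday_photo.jpg"))  # e.g. (48.8584, 2.2945), or None
```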

Number seven is a tool you may have heard of: ElevenLabs. Its text-to-speech is going to change the way you make audio forever, but you may have no idea what it is really capable of. When ElevenLabs first released its text-to-speech AI tool, it nearly broke the internet with just how well it worked, but the company did not stop there. Beyond text-to-speech, ElevenLabs now offers multilingual speech-to-speech functionality. In simple terms, this means you can record or upload your voice to ElevenLabs and get a realistic AI voiceover of what you said, and this can be done in multiple languages, helping close the language gap. Where it starts to get creepy, however, is the voice-cloning tool. Because the system is designed to learn, it can clone voices: with just a 30-second sample, it can clone almost any voice. As with most of the tools in this article, this could be a great tool in the right hands. However, it also means a lot could go wrong once malicious intent is introduced.
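To make the text-to-speech side concrete, here is a rough sketch of what calling a hosted voice endpoint looks like. It follows the shape of ElevenLabs' public v1 REST API as documented at the time of writing, but the voice ID, model name, and exact fields below are placeholders and may have changed, so treat it as an illustration rather than a definitive integration.

```python
# Rough sketch of a hosted text-to-speech call. URL shape, header, and fields
# follow ElevenLabs' public v1 REST API; the voice ID and model name below
# are placeholders you would replace with values from your own account.
import requests

API_KEY = "your-api-key"      # placeholder: issued from your account settings
VOICE_ID = "your-voice-id"    # placeholder: any voice in your voice library

response = requests.post(
    f"https://api.elevenlabs.io/v1/text-to-speech/{VOICE_ID}",
    headers={"xi-api-key": API_KEY, "Content-Type": "application/json"},
    json={
        "text": "This voiceover was generated from plain text.",
        "model_id": "eleven_multilingual_v2",  # multilingual model
    },
    timeout=60,
)
response.raise_for_status()

with open("voiceover.mp3", "wb") as f:
    f.write(response.content)  # the endpoint returns raw audio bytes
```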

Taking things literally out of this world is number six, Waldo II. This tool is so powerful that its designer has refused to make it public, despite the project's open-source roots. When it comes to the privacy risks associated with AI, Waldo is about as disturbing as it gets. The tool has been trained on thousands upon thousands of drone photos and videos, which allows it to identify objects, people, plants, and even things that would not naturally be visible to the human eye. This has great potential, since it can be used for accurate surveillance and crime fighting. However, imagine this tool in the hands of government bodies or groups with access to satellite imagery: Waldo could easily become one of the greatest tracking tools in history. Many powerful bodies are already under scrutiny for violating citizens' privacy under the cover of surveillance, and a tool this swift and powerful could practically run that surveillance on autopilot.

Number five is Sora. On February 15, OpenAI's text-to-video tool Sora shook the world with its capabilities. Sora is the latest tool released by the company behind the groundbreaking chatbot ChatGPT. Naturally, there were many reactions to this new tool, and while the positives are there for all to see, so are the negatives. Even before the tool has been made public, it is already under the microscope of the Italian Data Protection Authority. The authority has stated that if Sora is going to operate in the European Union, and especially in Italy, OpenAI must come clean about two things. The first concern is that OpenAI must be open about the data used to train its model. Considering the many speculations regarding the dataset used to train ChatGPT, this is a caveat the company cannot ignore: if other people's work is being used to train the Sora model without permission, it poses a threat to the safety of intellectual property. The second concern the Italian authority intends to address is whether user data will eventually be used to train Sora. With access to so much user information, chances are that OpenAI could quietly use that data to improve its model. Without these two things ironed out, Sora will be banned in Italy and possibly in the European Union. Imagine getting banned before even seeing the light of day.

Unfortunately, it gets even worse with this next tool. Number four is WormGPT. Think of a large language model just like ChatGPT, except this time there are no boundaries on what you are allowed to do. You would think such a tool should never exist, but that is exactly what WormGPT is. To show just how far AI tools can go, WormGPT removes every chain that keeps a user from exploring the depths of digital power. That covers just about any malicious activity you can come up with, whether the goal is to launch a malware attack, create phishing emails, or simply get advice on digital misconduct.

Number three comes from an industry that has continued to see scary improvements over time: deepfakes. This particular tool, DeepSwap, arguably should not even exist. DeepSwap can take any video and replace the face in it with whoever you choose. The risks such a tool poses are almost endless. We live in a time when video has become a primary form of communication, so being able to manipulate a video with a single tool is a really big deal. I do have to mention that other tools probably do a better job than DeepSwap. However, what earns it a top-three spot on today's list is its low bar of entry: most tools that do a better job are targeted at enterprise use and charge higher fees, while DeepSwap goes for an accessible price of around $10 per month. For $10, a person can swap their face with a celebrity's and cause some trouble if they want to.

Number two is Watermark Remover. Most stock photo and video websites protect their creative property by placing watermarks on it, but thanks to this AI tool, that seemingly unbreakable wall of defense is now breachable. With Watermark Remover, you can upload any photo from the internet, even one covered with the worst of watermarks, and it will erase every trace of it. Interestingly, just to show how confident they are in their product, the designers offer three free credits upon signing up so you can try it yourself. No matter how far technology advances, other people's intellectual property must still be respected, and this one crosses the line in that regard.

Topping things off at number one is an AI tool that will definitely scare a lot of business owners: DoNotPay.

At its core, DoNotPay's mission is pretty simple and, to an extent, quite noble: to help the customer beat the system. With the many subscriptions we have these days, it can take a lot of work to keep track of them all, so DoNotPay positions itself as the AI companion you need to make sure you are not getting robbed by these companies. It can automatically cancel subscriptions that are billing you unduly, and it can also cancel a subscription as soon as a free trial ends to avoid the automatic charge that may follow. However, in a bid to give power to the people, its creators may have taken things a few steps too far. The perfect example is that the tool lets you sign up and complete registration for platforms and services without performing identity verification, and it teaches users how to do this with inexpensive phones while staying anonymous. While this sounds great at face value, identity verification and KYC requirements should not be taken away from businesses; they help track down people who use these tools for illegal activities, and this level of anonymity opens the door to a lot of malpractice. Let me know your thoughts on these tools in the comments, and check out the article I've selected just for you.
