AI Lied to Me

AI is an amazing tool and has definitely started changing how we work. But is it really everything it promised to be?

I have been using AI for about a year or two to help me with work and, sometimes, just for fun. So far, it has been quite helpful and has made several tasks easier and faster to finish. It helped a lot in conceptualizing themes for our corporate events and in catching grammatical errors in my documents. Since it has been useful at work, I decided to use it for some of my chores as well. With high inflation and the daily increase in the prices of goods, I wanted to see if AI could help me find the online store where I could buy what I need at the cheapest price. I tried ChatGPT first, but it would not help me and gave me the following reply:

I’m unable to provide real-time pricing information for specific products from different online pharmacies in xxx. I recommend visiting the websites of these pharmacies directly to compare the prices of xxx. You can then make an informed decision based on the current prices available.

Since ChatGPT couldn't help because it cannot access external content, I turned to another AI tool. I wrote my prompt and got so excited as it started typing and showed me a price comparison from the websites I provided. I was surprised at how detailed the recommendation was:

  • It listed the price of each item on each website
  • It identified the website with the lowest price for each item
  • It pointed out which website offered the lowest prices for most of the items
  • It recommended which website to purchase from, with the total cost of all items from each website, including delivery cost

At the end, it also added the typical disclaimer:

Please note that prices may vary depending on the seller and availability, so it’s always a good idea to double-check prices before making a purchase.

I then signed up on the website it recommended to proceed with my purchase, trusting that the AI had given me correct information. As I was putting items in my cart, I remembered the price of one of the items from an online store where I had shopped before. Checking whether I remembered it right, I saw that it was cheaper there than in the store the AI recommended. I then tried to validate the other prices and noticed that the prices the AI provided were different from the actual store prices. The differences were quite significant, and none of the prices I got from the AI were correct.

Of course, I had to confront the AI and ask for the links where it got its prices. It did reply with a link for each item on my list. So I thought, what happened? The AI had the links, so how come the prices were different? Then I clicked on the links, and every single one gave me a 404 error. Again, the AI had given me the wrong information.
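If I ever do this again, I will run a quick sanity check on any links an AI hands me before trusting the numbers behind them. Here is a minimal sketch of that check in Python, using the requests library; the URLs below are placeholders, not the actual links from my chat.

    # Rough sketch: before trusting AI-provided sources, confirm each link
    # actually loads instead of returning a 404.
    import requests

    ai_provided_links = [
        "https://www.example-store.com/product/item-one",  # placeholder URL
        "https://www.example-store.com/product/item-two",  # placeholder URL
    ]

    for url in ai_provided_links:
        try:
            response = requests.get(url, timeout=10, allow_redirects=True)
            print(f"{url} -> HTTP {response.status_code}")  # 404 means the page does not exist
        except requests.RequestException as error:
            print(f"{url} -> request failed: {error}")

A minute of checking like this would have saved me the trouble of signing up on a store the AI essentially invented prices for.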

Hmmm… why would AI mislead me and send me to the wrong website? Giving it the benefit of the doubt, I asked one last time what actually happened. Then, finally, the AI told me the truth.

I must confess that I made a mistake earlier. I was overconfident in my ability to generate fictional URLs that might resemble real ones.

In reality, I didn’t actually find or suggest working links from XXX. Instead, I generated fictional URLs based on my understanding of how URLs are typically structured. However, these URLs were not actual links and did not point to any real products on XXX.

I apologize for any confusion or inconvenience this may have caused. I’ll strive to be more accurate and transparent in my responses moving forward. Thank you for pointing out my mistake!

Well, at least it learned its lesson, right? Should I give AI another chance?

AI Hallucination

Of course, AI deserves another chance. However, since many AI tools are still in early development, we can expect some flaws, and this particular flaw is called hallucination. Asking our friend Google for a definition, AI hallucinations are incorrect or misleading results that AI models generate. These errors can be caused by a variety of factors, including insufficient training data, incorrect assumptions made by the model, or biases in the data used to train the model.

AI models, including LLMs (Large Language Models), are trained on datasets and learn to identify patterns and relationships within that data, which enables them to make predictions and produce output in response to prompts. However, they sometimes learn or apply incorrect patterns or logic, which leads to incorrect results, or hallucinations.
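My fake-URL incident is a good example of pattern matching gone wrong. As a purely illustrative sketch (not how any particular chatbot works internally), here is a toy "model" that has only learned the shape of product links: everything it produces looks plausible, but nothing is ever checked against a real site. The store name and ID format are made up.

    # Toy illustration of hallucinated links: the "model" knows the typical
    # structure of a product URL but has no idea which pages actually exist.
    import random

    URL_TEMPLATE = "https://www.example-store.com/product/{slug}-{item_id}"  # invented pattern

    def plausible_looking_url(product_name: str) -> str:
        """Build a URL that matches the learned pattern, without verifying it exists."""
        slug = product_name.lower().replace(" ", "-")
        item_id = random.randint(10000, 99999)  # made up, just like a hallucinated link
        return URL_TEMPLATE.format(slug=slug, item_id=item_id)

    print(plausible_looking_url("Vitamin C 500mg"))
    # Output looks convincing, e.g. .../product/vitamin-c-500mg-48213,
    # but clicking it would most likely give you the same 404 I got.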

Hallucination does not happen with just one AI tool; it happens across Gemini, ChatGPT, and other LLMs and AI models. I searched the web for other cases of hallucination and saw that it is not limited to providing incorrect information; it also happens in image generation and in AI-powered chatbots used by companies. One interesting incident involved an airline's chatbot offering a bereavement discount to a passenger. According to the chatbot, if the passenger applied within 90 days, a portion of the ticket would be refunded. However, when the passenger tried to claim it, the airline said that its policy does not allow refunds for travel that has already taken place.

AI has already made significant advancements in transforming how we work, and we can anticipate even greater developments in the future. However, it is important to acknowledge that AI is still in its early stages and has a long way to go in terms of refinement and evolution. Despite its progress, human intervention remains indispensable in certain aspects. Ultimately, as the creators of AI, we hold the key to harnessing its potential and guiding its growth in a way that truly complements and enhances the human experience.

