**Why Trusting AI for Travel Advice Could Ruin Your Vacation**
*By Mudit Dube | Sep 30, 2025, 12:27 PM*
—
As artificial intelligence (AI) tools like ChatGPT and Google Gemini become increasingly popular for trip planning, many travelers are turning to these technologies for advice. However, a growing number of users have experienced frustrating or even dangerous situations due to incorrect or misleading information generated by AI.
### The Risks of Relying Solely on AI for Travel Advice
A recent incident in Peru highlights the potential dangers of depending exclusively on AI travel recommendations. Two tourists followed an AI-generated suggestion to visit the “Sacred Canyon of Humantay,” a destination that does not actually exist. This misinformation led them to a remote rural road on the outskirts of Mollepata without a guide or clear destination—a risky situation in unfamiliar terrain.
Miguel Angel Gongora Meza, founder of Evolution Treks Peru, warns that incorrect travel guidance can be deadly in places like Peru. Factors such as high elevation, sudden climate changes, and hard-to-reach trails require careful planning and local expertise. He explains that travelers who use AI tools without that context can end up stranded at high altitude, where the air is thin and there is no phone signal.
### The Rising Popularity of AI in Trip Planning
Despite the risks, AI-powered travel tools are now an integral part of trip planning for millions of people worldwide. Recent surveys reveal that about 30% of international travelers utilize generative AI platforms or dedicated travel sites like Wonderplan and Layla to organize their trips.
When accurate, these AI travel advisors can offer valuable tips and streamline planning. However, inaccurate recommendations can lead to frustration or even hazardous situations, underscoring the importance of human expertise alongside AI assistance.
### Limitations and Challenges of AI Travel Recommendations
A 2024 survey found that 37% of users relying on AI for travel planning reported receiving insufficient information, and approximately 33% encountered false or misleading AI-generated recommendations. These issues stem from how large language models generate answers: they predict plausible-sounding text rather than retrieve verified facts.
Rayid Ghani, a distinguished professor of machine learning at Carnegie Mellon University, explains that while programs like ChatGPT may provide seemingly rational advice, they are also prone to “hallucinations” — a term for fabricated or incorrect information produced by AI.
### AI’s Lack of Understanding of the Physical World
Ghani emphasizes that AI’s data analysis processes do not equate to a meaningful understanding of the physical world. For example, AI might confuse a casual 4,000-meter walk through a city with a strenuous 4,000-meter elevation climb up a mountain. This fundamental misunderstanding heightens the risk of inaccurate travel suggestions and potential dangers for travelers who rely solely on AI-generated itineraries.
### Increasing Misinformation in AI Travel Content
The problem extends beyond text recommendations. A recent Fast Company article recounted an incident where a couple was misled by an AI-generated video on TikTok promoting a scenic cable car in Malaysia — a structure that does not exist. Such misinformation not only wastes travelers’ time and resources but also raises bigger concerns about how AI-generated content may subtly distort our perception of the world.
—
**Conclusion**
AI tools offer exciting possibilities for enhancing trip planning. However, travelers should exercise caution and complement AI-generated advice with trusted human expertise and verified information sources to avoid frustration and potential safety risks.
Use AI as a helpful guide—not a sole source—for your next adventure. Safe travels!
https://www.newsbytesapp.com/news/science/ai-hallucinations-are-creating-fake-travel-destinations/story