Google Maps Now Answers Questions Like a Human
Ask Maps lets you have a conversation. Immersive Navigation renders your route in 3D. Both powered by Gemini.
I tried something yesterday that surprised me.
Opened Google Maps and typed: “My phone is dying. Where can I charge it without waiting in a long line for coffee?”
Maps understood the question. Gave me three options within walking distance. One was a library two blocks away with free outlets and zero wait time.
This kind of query never worked before. Now it does.
What Google Shipped This Week
Two features went live yesterday, both powered by Gemini.
Ask Maps turns the search bar into a conversation. You type full questions in plain English, and Maps answers with places, directions, and context pulled from 300 million locations and reviews from 500 million contributors.
Immersive Navigation rebuilds the entire driving experience from scratch. 3D buildings that render as you approach them. Real terrain. Lane markers that actually help you merge. Voice directions that sound like a friend riding with you.
Google calls it the biggest Maps update in over a decade. After using it for a day, I think they’re right.
Ask Maps: Real Questions, Real Answers
The old Maps experience meant typing “coffee shop” and scrolling through 47 blue dots on a map. You’d open each listing, skim reviews, and piece together whether it had what you needed.
The new experience is different. You ask “Where can I get coffee with outdoor seating and good wifi near Union Square?” and Maps just answers the question.
I tested a handful of queries to see how well it worked.
“Is there a public tennis court with lights on that I can play at tonight?” Maps found one, showed me the hours, and mentioned it was free to use.
“I’m driving to the Grand Canyon tomorrow. Any stops worth making along the way?” I got a curated list with drive times between each stop, plus tips pulled from reviews like “arrive before 7am to avoid crowds” and “the east entrance has shorter lines.”
“My friends are coming from Brooklyn to meet me in Manhattan after work. Where should we eat?” Maps already knew I save vegan spots from my history. It found places roughly in the middle, and showed which ones had tables available at 7pm.
From there, you can book a reservation, save the place to a list, share it with your group, or get directions. All without leaving the conversation.
The feature is rolling out now in the US and India, with desktop support coming soon.
Immersive Navigation: Driving Finally Makes Sense
Google Maps navigation has worked fine for years. It gets you where you’re going. But the interface felt frozen in 2015, with the same flat blue line and the same robotic voice it’s always had.
This update changes the entire feel of the experience.
The map renders in 3D now. Buildings appear as you approach them, overpasses look like actual overpasses, and terrain has real depth. When you need to change lanes, Maps highlights the exact lane on screen instead of just saying “keep right.”
The voice guidance got a rewrite too. It sounds less like a GPS and more like a person giving directions.
Old style: “In 500 feet, take exit 23.”
New style: “Go past this exit and take the next one for Illinois 43 South.”
Same information, much easier to follow when you’re actually driving.
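Purely as an illustration (this is not Google's actual system, just a toy sketch of the shift in phrasing), the difference is moving from distance-based templates to instructions anchored on something the driver can see:

```python
def robotic(dist_ft: int, exit_no: int) -> str:
    """Old style: absolute distance plus an exit number."""
    return f"In {dist_ft} feet, take exit {exit_no}."

def conversational(visible_landmark: str, target: str) -> str:
    """New style: anchor the instruction on a landmark in view."""
    return f"Go past {visible_landmark} and take the next one for {target}."

print(robotic(500, 23))
# In 500 feet, take exit 23.
print(conversational("this exit", "Illinois 43 South"))
# Go past this exit and take the next one for Illinois 43 South.
```

Both functions encode the same maneuver; the second just phrases it relative to what is in front of the driver, which is what makes the new guidance easier to act on mid-drive.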
Three things stood out after testing it:
The view extends farther ahead. The camera pulls back as you approach a complex section. Buildings go transparent so you can see through them to the road beyond. You know what’s happening three turns from now, which makes lane changes way less stressful.
Alternate routes come with context. Maps now explains why you might choose a different path. “This route adds 8 minutes but avoids the accident on I-90.” Or “This one has a toll but saves 12 minutes.” You pick based on what matters to you in the moment.
Arrival guidance actually helps. Before you leave, you can preview your destination in Street View and see where to park. As you get close, Maps highlights the building entrance and shows which side of the street you need to be on. Those last few hundred feet stop being the frustrating part of the trip.
The update is rolling out now in the US. Support for CarPlay, Android Auto, and cars with Google built-in will expand over the next few months.
The Data Behind All This
Google shared some numbers in the announcement that explain how this works.
The database covers 300 million places. Reviews and tips come from 500 million contributors. Traffic data updates 5 million times per second from drivers on the road. And 10 million user reports flow in every day covering accidents, road closures, and construction.
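Taking the announcement's figures at face value, a quick back-of-envelope calculation shows the scale of the traffic feed alone (assuming the 5-million-per-second rate holds around the clock):

```python
# Back-of-envelope: what "5 million traffic updates per second" means per day.
UPDATES_PER_SECOND = 5_000_000
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

updates_per_day = UPDATES_PER_SECOND * SECONDS_PER_DAY
print(f"{updates_per_day:,} traffic updates per day")
# 432,000,000,000 traffic updates per day
```

That is roughly 432 billion traffic signals a day before you even count the 10 million daily user reports.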
Gemini processes all of that data. Maps presents it in a way that feels conversational instead of overwhelming.
Why This Approach Matters
Google made a smart choice with this release.
They could have launched a separate AI app. They could have added a chatbot that opens in its own window. They could have created a whole new product and asked everyone to learn something new.
Instead, they made Maps better.
You open the same app you’ve used for years. You type in the same search bar. But now it understands full questions and gives you actual answers instead of a list of links to figure out yourself.
Most people will never read this announcement or any article about it. They’ll just notice that Maps feels smarter than it used to. That’s exactly the point.
The best AI updates are the ones that feel invisible. You think the product got better. You keep using it the same way. But something shifted underneath.
This is one of those updates.

