
Finally, My iPhone 15 Pro Is Getting the Visual Intelligence Upgrade It Deserves

Sweet vindication at last.
Visual Intelligence running on iPhone 16E using Action Button
Credit: Apple

When Apple announced the iPhone 16E late last month, it also confirmed that the new budget phone was getting Apple Intelligence’s “Visual Intelligence” feature, marking the first time the AI trick would come to a phone without a “Camera Control” button. While the other iPhone 16 series phones use their Camera Control buttons to access Visual Intelligence, the iPhone 16E can instead map it to its Action Button, a simple change that raises the question: why not the iPhone 15 Pro, too?

Personally, as an iPhone 15 Pro owner, I've been asking that question for months, and I've long suspected my phone's internals were more than capable of it: it runs every other Apple Intelligence feature without issue. It seemed instead that Apple was arbitrarily holding the feature back because it wanted to tie it to a specific button press my phone doesn't have. Well, with the iPhone 16E adopting the Action Button workaround, it seems Apple is finally listening. Apple representatives have now confirmed that Visual Intelligence will be coming to the iPhone 15 Pro as well, using the same approach.

Speaking to Daring Fireball’s John Gruber, an Apple spokesperson said that the iPhone 15 Pro will indeed get Visual Intelligence “in a future software update,” and that users will be able to map it to the Action Button. Sweet vindication.

While Apple hasn't announced a specific release date for Visual Intelligence on the iPhone 15 Pro yet, 9to5Mac reports that it appears slated for iOS 18.4, based on what's being tested in the latest developer beta. To be honest, I'm not sure I'll use Visual Intelligence much, but it's encouraging to see my phone's software no longer held back by an arbitrary push for hardware cohesion. Heck, the beta reportedly even lets the rest of the iPhone 16 series open Visual Intelligence with the Action Button instead of the Camera Control, if desired.


For the uninitiated, Visual Intelligence brings AI to your iPhone’s camera. You can point your camera at a foreign-language menu, for instance, to get a translation, or point it at a book to get a summary of what’s on the page, or point it at a dog to try to find out what breed it is. It can also surface information about businesses simply by looking at their storefront or signage (in the United States only), and it works with Google and ChatGPT for extended search queries. In other words, it's similar to Google Lens, but puts AI first and is built into your operating system. Again, I haven't been able to play around with it much yet, but hey, at least I'll soon have the option.

Updated 3/3 with new reporting from 9to5Mac saying that Visual Intelligence for iPhone 15 Pro is being tested for release in iOS 18.4.

Michelle Ehrhardt
Associate Tech Editor

Michelle Ehrhardt is Lifehacker's Associate Tech Editor. She has been writing about tech and pop culture since 2014 and has edited for outlets including Gizmodo and Tom's Hardware.
