AirPods with cameras for Visual Intelligence could be one of the best personal safety features Apple has ever planned – here’s why



  • Apple is said to still be working on cameras in AirPods
  • iOS 18's Visual Intelligence is at the heart of Apple's plans
  • Features are still "generations away"

We've known the "what" for some time – Apple is experimenting with cameras in its AirPods – and now we perhaps know the "why". A new report sheds light on Apple's plans for future AirPods, and if the tech can do what it promises, it could be a genuinely important personal safety feature.

There is an important caveat, though: the features are "still at least generations away from hitting the market".

The report comes from the well-connected Mark Gurman at Bloomberg, who says that "Apple’s ultimate plan for Visual Intelligence goes far beyond the iPhone." And AirPods are a big part of that plan.

According to Gurman, Visual Intelligence – recognizing the world around you and providing useful information or assistance – is considered a very big deal inside Apple, and it's planning to put cameras in both the Apple Watch and Apple Watch Ultra too. As with the AirPods, "this would help the device see the outside world and use AI to deliver relevant information."

How AirPods will work with Visual Intelligence

Visual Intelligence was introduced in iOS 18 for the iPhone 16, and it enables you to point the camera at something and find out more about it: the type of plant, the breed of dog (as in the image at the top of this article), the opening hours of the café you've just found, and so on.

Visual Intelligence on an iPhone 16

(Image credit: Apple)

Visual Intelligence can also translate text, and maybe one day it'll be able to help people like me who have a shockingly bad memory for names and faces.

The big problem with Visual Intelligence, though, is that you have to take out your phone to use it. And there are circumstances where you're not going to want to do that. I'm reminded of when Apple brought Maps to the Apple Watch: making it possible to check directions without broadcasting "I am not from here and I am hopelessly lost. Also, I have a very expensive phone" to all the neighborhood villains was an important personal safety feature.

This could be too. If Apple makes it possible to invoke Visual Intelligence with a point of the head and a squeeze of the stems, that would enable you to get important information – such as a translation of a direction sign in another country – without waving your phone around.

We're a long way from actually having these features – don't expect them in the AirPods Pro 3, which will probably arrive later in 2025. But I'm excited by the prospect: imagine Apple Intelligence, but good.
