Apple has made the smallest of updates to Visual Intelligence in iOS 26, yet the impact of being able to use it on any image is huge, and it at least doubles the usefulness of the feature.

Previously, Visual Intelligence required pointing your iPhone camera at whatever you were interested in. With iOS 26, Apple has removed that step. Everything else works the same, but you no longer have to use the camera: you can instead use Visual Intelligence on anything shown on your iPhone's screen.

That one change means researchers can find out more about objects they see on websites, and shoppers can freeze-frame a YouTube video and use Visual Intelligence to track down the bag an influencer is wearing.

One wrinkle is that there are now two different ways to use Visual Intelligence, each started in a different way. The new on-screen version is an addition to Visual Intelligence, not a replacement, and the feature already offers several ways to use it, one of which provides a very different service from the rest.

Still, being able to identify just about anything on your screen is a huge boon. Apple has increased the usefulness of Visual Intelligence simply by no longer requiring you to point your iPhone camera at anything.