
Apple Intelligence Can Now Creep on Your iPhone Screen


It wouldn't be a developer keynote in 2025 without a little AI, right? As disappointing as Apple Intelligence has been since its rollout in October of last year, Apple seems determined to improve the experience with pivotal upgrades like … new Genmoji? Okay, so maybe WWDC 2025 was not a revolutionary year for Apple Intelligence, but there are still upgrades worth noting, including a new feature that can look at what you're doing on your phone and then take action depending on the scenario.

Visual Intelligence, as Apple calls it, is a feature that extends multimodal capabilities beyond the camera app and onto your iPhone's screen. "Users can ask ChatGPT questions about what they're looking at on their screen to learn more, as well as search Google, Etsy, or other supported apps to find similar images and products," Apple said. "If there's an item they're especially interested in, like a lamp, they can highlight it to search for that specific item or similar objects online."

It doesn't sound new by any means, but it brings Apple Intelligence closer to competitors like Google, which has a Gemini feature that does roughly the same thing. It also brings Apple Intelligence closer to the holy grail of "agentic" AI, which is how the tech world describes AI that can do things for you. As ho-hum as multimodal features and visual intelligence have become in a very short time, they still have the power to improve the phone experience, in my opinion.

Visual Intelligence at work in iOS 26.
© Apple

I think I speak for most people when I say that using your iPhone isn't as simple as it used to be, and there are several reasons for that. One is that we expect our phones to do much more than before, which means devices need more features to do all of those things. The problem is that keeping track of those features and finding a place for them in a user interface isn't easy, and that makes software more bloated. Agentic AI has the potential to cut through the bloat and get you to what you want to do more quickly. If that means I spend less time entering credit card information or hopping between apps on my phone, I'm absolutely for it.

All of this is theoretical at the moment, since Visual Intelligence has only just been released and we can't say for sure whether it works as promised, but I'm certainly not mad at the idea, even if I'm a little disappointed. Visual Intelligence should also run on-device, which is great, because sending data from my phone's screen anywhere would not exactly be high on my wish list.

It wasn't all Visual Intelligence, either; Apple also unveiled new AI features such as Live Translation in Messages and FaceTime, which translates while you text or call someone. There were also Genmoji and Image Playground updates that add more personalization and new art styles for generated images and emoji. In addition, Apple is opening up its on-device foundation model on Apple Intelligence devices and inviting third-party app developers to design their own AI features.

"App developers will be able to build on Apple Intelligence to bring users new experiences that are intelligent, available when they're offline, and that protect their privacy, using AI inference that is free of cost," Apple said in a press release. "For example, an education app can use the on-device model to generate a personalized quiz from a user's notes, without any cloud API costs, or an outdoors app can add natural language search capabilities that work even when the user is offline."

Again, this is not exactly the flashiest news for Apple Intelligence, but it may be a solid way to accelerate the development of new AI features, especially while Apple lags behind in the field of generative AI and large language models. Speaking of lagging, one notable absence was Apple's AI-powered Siri upgrade, though Apple did address the AI elephant in the room, saying we would hear more "later this year." That's by no means surprising, but it certainly points to Apple's stumbles on the AI front.

This year's WWDC did not do much to ease any concerns you may have about Apple's progress on the AI front, but it moved the needle a little, and that may be enough. Despite the industry's emphasis on AI features, consumers have a decidedly smaller appetite for them, so I doubt this year's update will stop anyone from rushing out to buy the latest iPhone.

Anyone in the Apple Developer Program can use the new Apple Intelligence features today, while the first public beta will be available next month. If you're not interested in betas, or you're not a developer, you'll have to wait until the fall to try these new features in full.
