Apple had its WWDC 2025 event yesterday and led off right away with Apple Intelligence, then later spoke about Apple Visual Intelligence. In short, Apple said it still needs to perfect it, so there will be more to come in the coming year.
Apple Intelligence
Craig Federighi, SVP of Software Engineering at Apple, said regarding Apple Intelligence:
And as we shared, we continue our work to deliver the features that make Siri even more personal. This work needed more time to reach our high quality bar, and we look forward to sharing more about it in the coming year.
Apple did announce new language support. Here is the full list of Apple Intelligence supported languages coming with iOS 26 (the new name for the upcoming iOS release):
- French
- German
- Italian
- Portuguese (Brazil)
- Spanish
- Chinese (Simplified)
- Japanese
- Korean
- English
- Danish
- Dutch
- Norwegian
- Portuguese (Portugal)
- Swedish
- Turkish
- Chinese (Traditional)
- Vietnamese
He also announced that app developers will be able to leverage Apple Intelligence in their apps. He said the new "Foundation Models framework" lets app developers tap into the on-device models, with everything done on the device and not in the cloud, which means no API cost, no privacy concerns, and offline support (see the sketch below).
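To give a sense of what that might look like for developers, here is a minimal Swift sketch of calling the on-device model. The FoundationModels import, LanguageModelSession type, and respond(to:) call reflect the framework as Apple announced it, but treat the exact names and signatures as assumptions until the final SDK documentation is out:

```swift
import FoundationModels

// Minimal sketch (assumed API names): create a session with the
// on-device model and ask it to respond to a prompt. Everything runs
// locally on the device, so there is no server round trip or API cost.
func summarize(_ note: String) async throws -> String {
    let session = LanguageModelSession(
        instructions: "Summarize the user's note in one sentence."
    )
    let response = try await session.respond(to: note)
    return response.content
}
```

Because the model runs on-device, a call like this would also work offline, which is the point Federighi emphasized.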
This part started at the 6:32 mark and ended at 9:50:
Apple Visual Intelligence
Then at the 40-minute mark, Craig spoke about Apple Visual Intelligence. We covered a number of these features here last year, some of which launched and some of which never did. I did not see Gemini integration.
In any event, here are some of the new features he demoed around Apple Visual Intelligence.
Screenshot to search: When you take a screenshot, you will see "Ask" and "Image Search" options at the bottom.
Clicking "Image Search" will find matching images on Google or third-party apps:
You can also highlight a specific part of the screen to search it, like Google's Circle to Search feature:
And then "Ask" will let you ask ChatGPT questions about what you are looking at - i.e. multimodal:
Of course, Google's Live Search is much more impressive right now.
Here is a video starting at the 40-minute mark so you can watch this part too; it is less than 4 minutes long:
So that is what is new with Apple Intelligence and Apple Visual Intelligence for iOS 26.
Forum discussion at X.