Apple announced a series of updates to its artificial intelligence feature suite, Apple Intelligence, at the Worldwide Developers Conference (WWDC) 2025 on Monday. The new features will arrive on supported iPhone models later this year with iOS 26.
The company announced a live translation feature integrated into the Messages, FaceTime, and Phone apps. It translates text messages, phone calls, and FaceTime call audio in real time.
Apple also announced an update to its AI-enabled emoji maker. Users can now mix two emojis to create a new one, in addition to the previously launched Genmoji feature, which lets users create emojis from natural-language prompts. Apple further announced that users can generate images in ‘Image Playground’ with custom art styles; the feature will now leverage ChatGPT’s latest image generation models.
Visual Intelligence Takes the ‘Circle to Search’ Path
Apple also announced an update to the Visual Intelligence features in Apple Intelligence, allowing the AI to analyse on-screen content and answer relevant queries or take appropriate actions.
Users will be able to capture screenshots and highlight objects to search for similar items online. This feature is largely similar to the ‘Circle to Search’ feature available on Android devices.
Apple has really nailed the Apple Intelligence usage this time. Live translations, visual intelligence, call screening all legit useful use cases of AI. #WWDC25
— Tanmay (@tanmays) June 9, 2025
In addition, Apple Intelligence can take necessary actions based on the data in the image. For instance, if Apple Intelligence detects event details in the image, it will suggest an “Add to Calendar” button that pre-populates the date, time, and location. Users can also use the ‘Ask’ button to pose a question directly to ChatGPT about the highlighted object.
Apple Intelligence Comes to Workouts
Apple also announced ‘Workout Buddy’ on the Apple Watch, which works with Apple Intelligence and uses workout data and fitness history to generate personalised and motivational insights during a workout session.
Apple said a text-to-speech model analyses a user’s workout and fitness history and provides voice-based motivational insights with the ‘right energy, style and tone, for a workout’.
Apple also announced a new ‘Foundation Models Framework’ that lets developers build on top of Apple Intelligence to bring new features to third-party apps. “For example, an education app can use the on-device model to generate a personalised quiz from a user’s notes, without any cloud API costs,” said Apple.
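Based on the API surface Apple showed at WWDC 2025, calling the on-device model from the Foundation Models Framework might look like the sketch below. The type and method names (`LanguageModelSession`, `respond(to:)`) reflect the announced framework but should be treated as assumptions until the final SDK documentation is available; the code requires an Apple Intelligence-enabled device running iOS 26 or later.

```swift
// Sketch of Apple's quiz-from-notes example using the announced
// Foundation Models Framework. Names are assumptions from the
// WWDC 2025 session, not verified against a shipping SDK.
import FoundationModels

func generateQuiz(from notes: String) async throws -> String {
    // A session wraps the on-device language model;
    // no cloud API call (and no per-request cost) is involved.
    let session = LanguageModelSession()
    let prompt = "Create a three-question quiz from these notes:\n\(notes)"
    let response = try await session.respond(to: prompt)
    return response.content
}
```

Because inference runs entirely on-device, this mirrors Apple's stated selling point for the framework: developers get generative features in third-party apps without sending user data to a server.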
Moreover, Apple announced the integration of Apple Intelligence into Shortcuts, which lets users automate actions and tasks on their iPhone. Users can now draw on Apple Intelligence capabilities, such as writing or image generation, when creating a Shortcut.
Apple Intelligence is, uh, a lot more than just Siri
— Max Weinbach (@MaxWinebach) June 9, 2025
This expands the list of features beyond those currently available with iOS 18. However, Apple has faced persistent criticism for Apple Intelligence’s underwhelming features so far.
Recently, Eddy Cue, Apple’s senior VP of services, revealed the company’s intention to incorporate AI search into Safari, with support from either Perplexity, Google, or OpenAI.
Additionally, Google CEO Sundar Pichai disclosed that Google intends to partner with Apple to integrate Gemini into Apple Intelligence this year. It was also reported that CEO Tim Cook, in a discussion with Pichai, mentioned that ‘more third-party AI models’ will be delivered to Apple Intelligence later this year.
The post iOS 26 Brings Plenty of New Updates to Apple Intelligence appeared first on Analytics India Magazine.