Last year, Apple’s WWDC keynote highlighted the company’s ambitious strides in AI. This year, the company toned down its emphasis on Apple Intelligence and concentrated on updates to its operating systems, services, and software, introducing a new aesthetic it calls “Liquid Glass” along with a new naming convention.
Nevertheless, Apple still attempted to appease the crowd with a few AI-related announcements, such as an image analysis tool, a workout coach, a live translation feature, and more.
Visual Intelligence
Visual Intelligence is Apple’s AI-powered image analysis technology that allows you to gather information about your surroundings. For example, it can identify a plant in a garden, tell you about a restaurant, or recognize a jacket someone is wearing.

Now, the feature will be able to interact with the information on your iPhone’s screen. For instance, if you come across a post on a social media app, Visual Intelligence can conduct an image search related to what you see while browsing. The tool performs the search using Google Search, ChatGPT, and similar apps.
To access Visual Intelligence, open the Control Center or press the same button combination typically used to take a screenshot. The feature arrives with iOS 26 when it launches later this year.
ChatGPT comes to Image Playground
Apple integrated ChatGPT into Image Playground, its AI-powered image generation tool. With ChatGPT, the app can now generate images in new styles, such as “anime,” “oil painting,” and “watercolor.” There will also be an option to send a prompt to ChatGPT to let it create additional images.
Workout Buddy
Apple’s latest AI-driven workout coach is exactly what it sounds like — it uses a text-to-speech model to deliver encouragement while you exercise, mimicking a personal trainer’s voice. When you begin a run, the AI within the Workout app provides you with a motivational talk, highlighting key moments such as when you ran your fastest mile and your average heart rate. After you’ve completed the workout, the AI summarizes your average pace, heart rate, and whether you achieved any milestones.
Live Translation
Apple Intelligence is powering a new live translation feature for Messages, FaceTime, and phone calls. This technology automatically translates text or spoken words into the user’s preferred language in real time. During FaceTime calls, users will see live captions, whereas for phone calls, Apple will translate the conversation aloud.

AI helps with unknown callers
Apple has introduced two new AI-powered features for phone calls. The first, call screening, automatically answers calls from unknown numbers in the background, letting users hear the caller’s name and the reason for the call before deciding whether to pick up.
The second feature, hold assist, automatically detects hold music when waiting for a call center agent. Users can choose to stay connected while on hold, allowing them to use their iPhone for other tasks. Notifications will alert them when a live agent becomes available.
Poll suggestions in Messages
Apple also introduced a new feature that allows users to create polls within the Messages app. It uses Apple Intelligence to suggest polls based on the context of your conversations. For instance, if people in a group chat are having trouble deciding where to eat, Apple Intelligence will suggest starting a poll to help the group reach a decision.

AI-powered shortcuts
The Shortcuts app is becoming more useful with Apple Intelligence. The company explained that when building a shortcut, users will be able to select an AI model to enable features like AI summarization.

Contextually aware Spotlight
A minor update is coming to Spotlight, the on-device search feature on the Mac. It will now use Apple Intelligence to improve its contextual awareness, suggesting actions that users typically perform, tailored to their current task.
Foundation Models for developers
Apple is now giving developers access to its AI models, even when the device is offline. The company introduced the Foundation Models framework, which lets third-party apps tap directly into the on-device Apple Intelligence models. The move is likely intended to encourage more developers to build new AI features on Apple’s stack as the company competes with other AI players.
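Based on what Apple showed, using the framework comes down to creating a session with the on-device model and sending it a prompt. The sketch below is a minimal illustration assuming the Swift API Apple previewed at WWDC (`FoundationModels`, `LanguageModelSession`, `respond(to:)`); exact names and signatures may differ in the shipping SDK, and the prompt text is purely illustrative.

```swift
// Minimal sketch of Apple's announced Foundation Models framework.
// Assumes the API previewed at WWDC; requires iOS 26 / macOS 26 betas.
import FoundationModels

// A session wraps the on-device Apple Intelligence model,
// so this works without a network connection.
let session = LanguageModelSession(
    instructions: "You are a concise assistant inside a hiking app."
)

// Prompts are sent asynchronously; the response carries the generated text.
let response = try await session.respond(
    to: "Suggest a three-item packing list for a short day hike."
)
print(response.content)
```

Because inference runs locally, there are no per-request API fees, which is part of the pitch to third-party developers.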
Apple’s AI-powered Siri setback
The most disappointing news to emerge from the event was that the much-anticipated developments for Siri aren’t ready yet. Attendees were eager for a glimpse of the promised AI-powered features that were expected to debut. However, Craig Federighi, Apple’s SVP of Software Engineering, said they won’t have more to share until next year. This delay may raise questions about Apple’s strategy for the voice assistant in an increasingly competitive market.