Apple's WWDC25 (June 9–13, 2025) opened with a keynote packed with big changes: a bold new on-device AI platform ("Apple Intelligence") and a sleek Liquid Glass design overhaul across all OSes. Apple also renamed its operating systems to year-based versions (iOS 26, iPadOS 26, macOS Tahoe 26, etc.), making updates easier to track. For developers, WWDC25 means powerful new APIs and tools, from an on-device large language model to UI framework updates. Let's break it all down in clear terms, with the highlights and what they mean for beginners and pros alike.
Apple Intelligence: On-Device AI for Everyone 🤖
Apple calls its new AI suite Apple Intelligence. It brings generative and translation features right on your device, no cloud needed. At WWDC, Apple showed iPhones, iPads, and Macs all running the new AI features. The biggest news for developers is the FoundationModels framework: Apple now lets your app tap the on-device large language model (LLM) that powers Apple Intelligence. This model (about 3B parameters) runs locally for speed and privacy, and Apple says you can use it with "as few as three lines of code" in Swift. Because it runs offline, user data never leaves the device.
Key Apple Intelligence features include:
- Live Translation (on-device): Real-time translation in Messages, FaceTime, and Phone. Your iPhone can translate incoming chats or calls on the fly across the supported languages, using only on-device Apple models, so your conversations stay private.
- Visual Intelligence & Smart Search: You can tap or screenshot anything on your screen and ask questions. For example, by highlighting an object on screen, users can ask the model to search Google or ChatGPT about it. This extends the old "Live Text" idea to search and image tasks; it can even spot dates in a web page and suggest adding an event to Calendar.
- Creative Tools (Genmoji & Image Playground): Apple's emoji generator (Genmoji) and Image Playground get smarter. You can now mix emojis together and add text prompts to create custom stickers. Image Playground lets users apply new ChatGPT-driven styles (like oil paint or vector art) to photos or sketches. Everything stays on-device unless you choose ChatGPT explicitly.
- Smart Shortcuts & Siri: Shortcuts can now directly invoke Apple Intelligence (or even ChatGPT) for tasks like text summarization or image creation. Siri also learns to maintain context better and can leverage ChatGPT in its writing tools (e.g. drafting messages or understanding documents) to give more helpful answers.
Developers get direct access to this AI. The FoundationModels framework (with a simple Swift API) gives apps the LLM for tasks like summarizing notes, generating content, or natural-language search. Apple even added a new Playgrounds mode in Xcode for AI: you can write a few Swift lines and see the model's output live in the editor (great for testing prompts). In short, powerful generative AI is now a built-in Apple platform feature, and it's free to use (no cloud fees).
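To make that concrete, here is a minimal sketch of what calling the on-device model could look like with FoundationModels. Treat the exact names (`LanguageModelSession`, `respond(to:)`) as the shape shown in Apple's WWDC25 sessions, which may shift across betas; `summarizeNotes` is a hypothetical helper.

```swift
import FoundationModels

// Hypothetical helper: summarize user notes with the on-device model.
// Everything runs locally: no network call, no API key, no cloud fee.
func summarizeNotes(_ notes: String) async throws -> String {
    let session = LanguageModelSession()  // wraps the ~3B-parameter model
    let response = try await session.respond(
        to: "Summarize these notes in two sentences: \(notes)"
    )
    return response.content
}
```

That is essentially the "three lines of code" Apple advertised: create a session, send a prompt, read the response.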
Key Apple Intelligence Features
- Foundation Model for Devs: On-device LLM accessible via Swift. Developers can "tap into the on-device foundation model" for private, fast AI in apps.
- Live Translation: Auto-translate text and audio in Messages, FaceTime, or Phone in real time.
- Visual Intelligence: Ask about or search anything on the screen with ChatGPT integration.
- Genmoji & Images: Mix emojis and prompts to generate custom stickers; use new ChatGPT styles in Image Playground.
- Shortcuts & Siri AI: New "intelligent actions" in Shortcuts (e.g. auto-summarize text, generate images) and smarter Siri (context retention, ChatGPT-powered responses).
Liquid Glass: Apple's New Design System 🎨
WWDC25 unveiled a sweeping UI redesign called Liquid Glass. It's a new translucent "material" for Apple's software. Imagine UI panels and controls that look like glass: they refract and reflect the colors behind them and gently blur what's underneath. Apple says Liquid Glass "combines the optical qualities of glass with a fluidity only Apple can achieve". The effect makes app interfaces feel more vibrant and layered. The new design spans Apple TV, Mac, iPad, iPhone, and Watch, with semi-transparent panels and colorful widgets throughout.
Liquid Glass brings a unified look across platforms: for the first time the same design language extends through iOS 26, iPadOS 26, macOS Tahoe 26, watchOS 26, and tvOS 26. Some highlights:
- Glass-like Panels: New tab bars, toolbars, and sidebars are frosted and adaptive. For example, in iOS 26 a tab bar will shrink and fade when you scroll, keeping the focus on content, then expand again when needed. On Mac and iPad, sidebars subtly refract your wallpaper, so you always have a sense of where you are.
- Dynamic Content Focus: UI controls now "float" above content. They morph to show more options or recede to let content shine. Apple even redesigned widgets and icons as multi-layered Liquid Glass objects with customizable tints and a new "clear" look. On macOS Tahoe, you can tint the Dock and menu bar in custom colors or make the menu bar fully transparent, maximizing space.
- Consistent Typography & Effects: Even the clock font on the Lock Screen dynamically adjusts its weight and position to sit "behind" a photo subject. The new design uses Apple's San Francisco font in clever ways under the glass.
Apple has updated its UI frameworks so developers can adopt Liquid Glass easily. Using SwiftUI, UIKit or AppKit, you can now apply built-in Liquid Glass materials and fresh controls to your app with a few new APIs. In fact, Apple released a new Icon Composer tool to help devs create Liquid Glass-style app icons (light, dark, tinted, or clear variants) across all platforms. In short, Apple is giving you the tools to make your app match the new look.
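As a rough illustration, adopting the new material in SwiftUI is meant to be close to a one-modifier change. This sketch assumes the `glassEffect` modifier demonstrated in Apple's WWDC25 sessions (exact signature may differ across betas); `ScoreBadge` is a made-up view.

```swift
import SwiftUI

// Hypothetical badge that sits on a Liquid Glass capsule,
// refracting whatever content scrolls behind it.
struct ScoreBadge: View {
    var score: Int

    var body: some View {
        Label("\(score) pts", systemImage: "star.fill")
            .padding()
            .glassEffect(.regular, in: .capsule)  // new in the 2025 SDKs
    }
}
```

The point is that the material, blur, and refraction all come from the system, so apps stay visually consistent with the OS without hand-rolled effects.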
Liquid Glass Design at a Glance:
- Unified UI Across Devices: The redesign touches iPhone, iPad, Mac (Tahoe), Watch, and TV alike.
- Translucent āGlassā Material: UI panels refract background color and light, adapting dynamically.
- Adaptive Controls: Tab bars, sidebars, toolbars shrink, expand or fade in sync with content.
- Rich Icons & Widgets: App icons and widgets use multiple glass layers and custom tints for a vibrant look.
- Developer APIs: Updated SwiftUI/UIKit/AppKit APIs for Liquid Glass make it easy to refresh your app's design.
New OS Releases: iOS 26, iPadOS 26, macOS Tahoe 26, and More
Every Apple platform got a big upgrade with the new design and intelligence features. Apple even moved to year-based names (e.g. Tahoe 26) so versions are obvious. Here are the headlines:
| OS | Key New Features |
|---|---|
| iOS 26 | Liquid Glass redesign throughout. Live Translation integrated in Phone/FaceTime/Messages (spam and unknown senders now filtered). Revamped CarPlay ("Ultra" mode syncs your iPhone across car screens with widgets and vehicle controls). New Apple Games app as a central hub for gaming. Enhanced Maps, Wallet, and Music. |
| iPadOS 26 | All-new windowing system for multitasking and organizing apps (a huge update). Liquid Glass UI (app shells, Dock, Control Center) with resizable panels. Files app overhauled (folders in the Dock, custom sorting). New Preview app for PDFs (with Apple Pencil markup and AutoFill). Enhanced audio/video tools (background tasks for recording, multiple audio inputs) for creators. AI features like Live Translation, Genmoji/Image Playground, and smarter Shortcuts are deeply integrated. |
| macOS Tahoe 26 | Liquid Glass design on the desktop, Dock, and menus. Continuity: a full Phone app on Mac (see iPhone contacts, recents, voicemails) with new Call Screening and Hold notifications. Live Activities from iPhone can appear on Mac. Spotlight gets its biggest overhaul yet: you can now take actions (send email, create a note, etc.) right from the search results. Expanded Apple Intelligence: Live Translation and Genmoji/Image tools come to Mac, and Shortcuts can tap the on-device model. |
| watchOS 26 | All-new "Workout Buddy": an AI fitness coach that uses your workout data to give real-time tips and encouragement in a trainer's voice. Watch faces and UI get the new design language. (Developers can build on the watchOS updates for workout and health apps.) |
| tvOS 26 | Liquid Glass UI on Apple TV: the Home screen and apps get glassy icons and backgrounds, making the interface more cohesive with other devices. |
(Table: Summary of WWDC25 OS updates)
Apple's own keynote buzzwords sum it up: iOS 26 brings a "stunning new design" and smarter apps, iPadOS 26 is the "biggest release ever" with powerful new features, and macOS Tahoe pairs a "beautiful new design" with Continuity and AI boosts. In practice, expect your iPhone and iPad to look totally refreshed and gain new multitasking and focus tools, your Mac to feel more iPhone-like (with calls and widgets) plus smarter search, and your Watch and TV to visually match.
Developer Tools & APIs: Whatās New for You
WWDC25 wasn't just about end-user features; there are plenty of goodies for developers too. Key takeaways:
- Foundation Models Framework (AI): As noted above, the new Swift FoundationModels framework gives apps a convenient API to the on-device LLM. Apple's sessions describe guided generation (structured outputs as Swift types), streaming responses, and the ability to "call tools" (run your code mid-session). Because the model ships with the OS, all of this runs locally and won't bloat your app size. Any app (education, games, note-taking, etc.) can now integrate powerful AI without backend costs.
- SwiftUI & UIKit Updates: Apple refreshed its frameworks for the new design. According to Apple's docs, SwiftUI now includes new components to "make the most of the new design". For example, SwiftUI views can automatically adopt Liquid Glass materials, and you can design widgets that run on iPad, visionOS, and even CarPlay. (Yes, widgets on CarPlay and in Vision Pro are now supported.) UIKit and AppKit got similar updates. You can also enable rich text editing in text views, build custom editor controls, and even mix 2D SwiftUI with 3D RealityKit views. Swift Charts gained 3D support, and WebKit got new APIs for in-app browsing.
- Xcode & Playgrounds: The new Playgrounds feature in Xcode lets you experiment with the on-device model in real time. In a Playground canvas you can iterate on prompts and see the AI's output instantly, which is great for testing how your app might use generative AI. Apple emphasized that this is a big help for learning prompt design and iterating on AI features.
- Icon Composer: Apple released a new tool called Icon Composer for developers to create app icons with Liquid Glass effects. It generates icon assets (light/dark/tinted/clear variants) that automatically get the glassy look on each platform. This makes it easy to refresh your app's icon in the new style without manual design work.
Overall, Apple's developer session on AI capped it off: the on-device model is "several orders of magnitude bigger" than anything Apple shipped on-device before, and is optimized for tasks like summarization and classification. (It's not meant to replace internet-scale reasoning models, but it's huge for a phone or tablet.) Apple will keep improving it, and tools like guided generation mean you can get results back as typed Swift data. In short, Apple gave us both the engine (the model) and the control knobs (the APIs) to build next-gen AI features right into our apps.
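Guided generation deserves a closer look: instead of parsing free-form text, you annotate a Swift type and ask the session to fill it in. The sketch below assumes the `@Generable`/`@Guide` macros and the `respond(to:generating:)` overload described in the WWDC25 sessions; `TripIdea` and `suggestTrip` are hypothetical names for illustration.

```swift
import FoundationModels

// Hypothetical output type. @Generable asks the model to produce
// output that decodes directly into this struct (no string parsing).
@Generable
struct TripIdea {
    @Guide(description: "A short, catchy trip title")
    var title: String

    @Guide(description: "Three to five suggested activities")
    var activities: [String]
}

func suggestTrip(for city: String) async throws -> TripIdea {
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Suggest a weekend trip idea for \(city).",
        generating: TripIdea.self
    )
    return response.content  // typed Swift data, not raw text
}
```

Getting structured values back instead of prose is what makes the model practical to wire into real UI: you bind `title` and `activities` straight to views rather than scraping them out of a paragraph.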
Wrapping Up: Your Thoughts? 💬
WWDC25 was all about making everyday interactions smarter and more beautiful. Users will see smarter apps (thanks to Apple Intelligence) and a magical new interface (thanks to Liquid Glass). Developers get to play with cutting-edge AI tools and updated design APIs.
What do you think of these announcements? Are you excited to add on-device AI to your apps, or to refresh your UI with liquid translucency? Which feature will you try first? Share your thoughts or questions in the comments; the dev community would love to hear your take on Apple's big 2025 update!
Top comments
This was a super helpful breakdown, especially for someone like me just stepping into iOS development. I'm curious though, how do you think Apple Intelligence compares to what Google or Microsoft are doing with AI on their platforms? Is it just playing catch-up or doing something unique?
Thanks for reading, and for the lovely, thoughtful question, Anna!
You're absolutely right to draw that comparison: Apple is entering a field where Google (with Gemini) and Microsoft (with Copilot) have been making major moves. But what stands out with Apple Intelligence is the focus on privacy-first, on-device AI. While others rely heavily on cloud-based models, Apple is integrating smaller, efficient models that run locally, and only uses server-side processing (via Private Cloud Compute) when absolutely necessary.
It's not just catch-up; it's a very Apple take on AI, prioritizing user trust and ecosystem integration. I think this could create a new standard for how consumer AI gets adopted on personal devices. Definitely one to watch!
Letās see how it plays out across developer tools and APIs in the coming months!