DEV Community

Om Shree

Apple WWDC25 Recap: Apple Intelligence, Liquid Glass, and Next-Gen OS šŸš€

Apple’s WWDC25 (June 9–13, 2025) opened with a keynote packed with big changes: a bold new on-device AI platform (“Apple Intelligence”) and a sleek Liquid Glass design overhaul across all OSes. Apple also renamed its operating systems to year-based versions (iOS 26, iPadOS 26, macOS Tahoe 26, etc.), making updates easier to track. For developers, WWDC25 means powerful new APIs and tools – from an on-device large language model to UI framework updates. Let’s break it all down in clear terms, with the highlights and what they mean for beginners and pros alike.

Apple Intelligence: On-Device AI for Everyone šŸ¤–

Apple calls its new AI suite Apple Intelligence. It brings generative and translation features right on your device, no cloud needed. At WWDC, Apple showed iPhones, iPads, and Macs all running the new AI features. The biggest news for developers is the FoundationModels framework: Apple now lets your app tap the on-device large language model (LLM) that powers Apple Intelligence. This model (about 3B parameters) runs locally for speed and privacy – and Apple says you can use it with “as few as three lines of code” in Swift. Because it runs offline, user data never leaves the device.

Key Apple Intelligence features include:

  • Live Translation (on-device): Real-time translation in Messages, FaceTime, and Phone. Your iPhone can translate incoming chats or calls on the fly across supported languages (using only on-device Apple models, so your conversations stay private).
  • Visual Intelligence & Smart Search: You can tap or screenshot anything on your screen and ask questions. For example, by highlighting an object on screen, users can ask the model to search Google or ChatGPT about it. This extends the old “Live Text” idea to search and image tasks – it can even spot dates in a web page and suggest adding an event to Calendar.
  • Creative Tools (Genmoji & Image Playground): Apple’s emoji generator (Genmoji) and Image Playground get smarter. You can now mix emojis together and add text prompts to create custom stickers. Image Playground lets users apply new ChatGPT-driven styles (like oil paint or vector art) to photos or sketches. Everything stays on-device unless you choose ChatGPT explicitly.
  • Smart Shortcuts & Siri: Shortcuts can now directly invoke Apple Intelligence (or even ChatGPT) for tasks like text summarization or image creation. Siri also learns to maintain context better and can leverage ChatGPT in its writing tools (e.g. drafting messages or understanding documents) to give more helpful answers.

Developers get direct access to this AI. The FoundationModels framework (with a simple Swift API) gives apps the LLM for tasks like summarizing notes, generating content, or natural-language search. Apple even added a new Playgrounds mode in Xcode for AI: you can write a few Swift lines and see the model’s output live in the editor (great for testing prompts). In short, powerful generative AI is now a built-in Apple platform feature, and it’s free to use (no cloud fees).
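To make the “few lines of code” claim concrete, here is a minimal sketch using the FoundationModels Swift API. The `LanguageModelSession` type and `respond(to:)` method follow Apple’s WWDC25 materials, but treat exact signatures as approximate; they may vary by SDK seed and require a device with Apple Intelligence enabled.

```swift
import FoundationModels

// Minimal sketch: ask the on-device model for a summary.
// Runs entirely on-device; no network calls, no cloud fees.
let session = LanguageModelSession()
let response = try await session.respond(
    to: "Summarize these meeting notes in two sentences: ..."
)
print(response.content)
```

Because the model is part of the OS, this adds nothing to your app’s download size.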

Key Apple Intelligence Features

  • Foundation Model for Devs: On-device LLM accessible via Swift. Developers can “tap into the on-device foundation model” for private, fast AI in apps.
  • Live Translation: Auto-translate text and audio in Messages, FaceTime, or Phone in real time.
  • Visual Intelligence: Ask about or search anything on the screen with ChatGPT integration.
  • Genmoji & Images: Mix emojis and prompts to generate custom stickers; use new ChatGPT styles in Image Playground.
  • Shortcuts & Siri AI: New “intelligent actions” in Shortcuts (e.g. auto-summarize text, generate images) and smarter Siri (context retention, ChatGPT-powered responses).

Liquid Glass: Apple’s New Design System šŸŽØ

WWDC25 unveiled a sweeping UI redesign called Liquid Glass. It’s a new translucent “material” for Apple’s software. Imagine UI panels and controls that look like glass: they refract and reflect the colors behind them and gently blur what’s underneath. Apple says Liquid Glass “combines the optical qualities of glass with a fluidity only Apple can achieve”. The effect makes app interfaces feel more vibrant and layered. The new design spans Apple TV, Mac, iPad, iPhone, and Watch, with semi-transparent panels and colorful widgets throughout.

Liquid Glass brings a unified look across platforms: for the first time the same design language extends through iOS 26, iPadOS 26, macOS Tahoe 26, watchOS 26, and tvOS 26. Some highlights:

  • Glass-like Panels: New tab bars, toolbars, and sidebars are frosted and adaptive. For example, in iOS 26 a tab bar will shrink and fade when you scroll to focus on content, then expand again when needed. On Mac and iPad, sidebars now semi-refract your wallpaper, giving constant context of where you are.
  • Dynamic Content Focus: UI controls now “float” above content. They morph to show more options or recede to let content shine. Apple even redesigned widgets and icons as multi-layered Liquid Glass objects with customizable tints and a new “clear” look. On macOS Tahoe, you can tint the Dock and menu bar in custom colors or make the menu bar fully transparent, maximizing space.
  • Consistent Typography & Effects: Even the clock font on the Lock Screen dynamically adjusts weight and position to sit “behind” a photo subject. The new design uses Apple’s San Francisco font in clever ways under the glass.

Apple has updated its UI frameworks so developers can adopt Liquid Glass easily. Using SwiftUI, UIKit or AppKit, you can now apply built-in Liquid Glass materials and fresh controls to your app with a few new APIs. In fact, Apple released a new Icon Composer tool to help devs create Liquid Glass-style app icons (light, dark, tinted, or clear variants) across all platforms. In short, Apple is giving you the tools to make your app match the new look.
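As a sketch of what adoption looks like in SwiftUI: the `glassEffect` modifier and the glass button style below come from Apple’s new-design APIs, but exact names, shapes, and availability may differ across platforms and SDK seeds.

```swift
import SwiftUI

// Illustrative sketch (iOS 26 SDK assumed): applying Liquid Glass
// materials to ordinary SwiftUI views.
struct NowPlayingBadge: View {
    var body: some View {
        VStack(spacing: 12) {
            Text("Now Playing")
                .padding()
                .glassEffect()          // translucent glass behind the label
            Button("Shuffle") {
                // handle tap
            }
            .buttonStyle(.glass)        // system glass-style button
        }
    }
}
```

Views that don’t opt in still get much of the new look for free, since standard bars and controls adopt Liquid Glass when you build against the new SDKs.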

Liquid Glass Design at a Glance:

  • Unified UI Across Devices: The redesign touches iPhone, iPad, Mac (Tahoe), Watch, and TV alike.
  • Translucent “Glass” Material: UI panels refract background color and light, adapting dynamically.
  • Adaptive Controls: Tab bars, sidebars, toolbars shrink, expand or fade in sync with content.
  • Rich Icons & Widgets: App icons and widgets use multiple glass layers and custom tints for a vibrant look.
  • Developer APIs: Updated SwiftUI/UIKit/AppKit APIs for Liquid Glass make it easy to refresh your app’s design.

New OS Releases: iOS 26, iPadOS 26, macOS Tahoe 26, and More

Every Apple platform got a big upgrade with the new design and intelligence features. Apple even moved to year-based names (e.g. Tahoe 26) so versions are obvious. Here are the headlines:

| OS | Key New Features |
| --- | --- |
| iOS 26 | Liquid Glass redesign throughout. Live Translation integrated into Phone, FaceTime, and Messages (with screening for spam and unknown senders). Revamped CarPlay (“Ultra” mode syncs your iPhone across car screens with widgets and vehicle controls). New Apple Games app as a central hub for gaming. Enhanced Maps, Wallet, and Music. |
| iPadOS 26 | All-new multitasking windowing system to control and organize apps (a huge update). Liquid Glass UI (app shells, Dock, Control Center) with resizable panels. Files app overhauled (folders in the Dock, custom sorting). New Preview app for PDFs (with Apple Pencil markup and AutoFill). Enhanced audio/video tools for creators (background recording tasks, multiple audio inputs). AI features like Live Translation, Genmoji/Image Playground, and smarter Shortcuts are deeply integrated. |
| macOS Tahoe 26 | Liquid Glass design on the desktop, Dock, and menus. Continuity: a full Phone app on Mac (see iPhone contacts, recents, voicemails) with new Call Screening and Hold notifications; Live Activities from iPhone can appear on Mac. Spotlight gets its biggest overhaul yet: take actions (send email, create a note, etc.) right from search results. Expanded Apple Intelligence: Live Translation and Genmoji/Image tools come to Mac, and Shortcuts can tap the on-device model. |
| watchOS 26 | All-new “Workout Buddy”: an AI fitness coach that uses your workout data to give real-time tips and encouragement in a trainer’s voice. Watch faces and UI adopt the new design language. (Developers can build on the watchOS updates for workout and health apps.) |
| tvOS 26 | Liquid Glass UI on Apple TV: the Home screen and apps now have glassy icons and backgrounds, making the interface more cohesive with other devices. |

(Table: Summary of WWDC25 OS updates)

Apple’s keynote buzzwords set the tone: iOS 26 brings a “stunning new design” and smarter apps, iPadOS 26 is the “biggest release ever” with powerful new features, and macOS Tahoe gets a “beautiful new design” with Continuity and AI boosts. In practice, you can expect your iPhone and iPad to look totally refreshed and gain new multitasking and focus tools, your Mac to feel more iPhone-like (with calls and widgets) and smarter search, and your Watch and TV to visually match.

Developer Tools & APIs: What’s New for You

WWDC25 wasn’t just about end-user features – there are plenty of goodies for developers too. Key takeaways:

  • Foundation Models Framework (AI): As noted above, the new Swift FoundationModels framework gives apps a convenient API to the on-device LLM. Apple’s sessions describe guided generation (structured outputs in Swift types), streaming responses, and the ability to “call tools” (run code) as part of a session. All of this runs locally and won’t bloat your app size. This means any app (education, games, note-taking, etc.) can now integrate powerful AI without backend costs.
  • SwiftUI & UIKit Updates: Apple refreshed its frameworks for the new design. According to Apple’s docs, SwiftUI now includes new components to “make the most of the new design”. For example, SwiftUI views can automatically adopt Liquid Glass materials, and you can design widgets that run on iPad, visionOS and even CarPlay. (Yes – widgets on CarPlay and in Vision Pro are now supported.) UIKit and AppKit got similar updates. You can also enable rich text editing in text views, build custom editor controls, and even mix 2D SwiftUI with 3D RealityKit views. Swift Charts got 3D support, and WebKit got new APIs for in-app browsing.
  • Xcode & Playgrounds: The new Playground feature in Xcode lets you experiment with the on-device model in real time. In a Playground canvas you can write a loop of prompts and see the AI’s output instantly, which is great for testing how your app might use generative AI. Apple emphasized that this is a big help for learning prompts and iterating on AI features.
  • Icon Composer: Apple released a new tool called Icon Composer for developers to create app icons with Liquid Glass effects. It generates icon assets (light/dark/tinted/clear variants) that automatically get the cool glassy look on each platform. This makes it easy to refresh your app’s icon in the new style without manual design work.
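The Playgrounds workflow mentioned above can be sketched roughly as follows. This assumes the `#Playground` macro from Xcode’s new Playgrounds support and the FoundationModels session API; names are approximate and the code only runs in Xcode on Apple platforms.

```swift
import FoundationModels
import Playgrounds

// Illustrative sketch: iterate on a prompt and see the model's
// output inline in Xcode's canvas as you edit.
#Playground {
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Give me three study tips for learning Swift."
    )
    print(response.content)
}
```

Editing the prompt string re-runs the block, which makes this a quick way to compare prompt phrasings before wiring the model into your app.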

Overall, Apple’s developer session on AI capped it off: it reminded us that the on-device model is “several orders of magnitude bigger” than anything Apple had before, and is optimized for tasks like summarization and classification. (It’s not meant to replace internet-scale reasoning models, but it’s huge for a phone or tablet.) Apple will keep improving it, and tools like guided generation (structured outputs) mean you can get results back as typed Swift data. In short, Apple gave us both the engine (the model) and the control knobs (APIs) to build next-gen AI features right into our apps.
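Guided generation is worth a concrete sketch. The idea is that you annotate a Swift type and ask the session to generate an instance of it, so you get typed data back instead of raw text. The `@Generable`/`@Guide` macros and `respond(to:generating:)` follow Apple’s FoundationModels materials, but treat the details as approximate; the struct and prompt below are invented for illustration.

```swift
import FoundationModels

// Hypothetical typed output: the model fills in these fields,
// guided by the descriptions, instead of returning free-form text.
@Generable
struct TripIdea {
    @Guide(description: "A short, catchy trip title")
    var title: String
    @Guide(description: "Three packing suggestions")
    var packingList: [String]
}

func makeTripIdea() async throws -> TripIdea {
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Suggest a weekend trip near the mountains.",
        generating: TripIdea.self
    )
    return response.content   // already a TripIdea, not a string to parse
}
```

Getting results as typed Swift values means no fragile string parsing between the model and your UI.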

Wrapping Up: Your Thoughts? šŸ’¬

WWDC25 was all about making everyday interactions smarter and more beautiful. Users will see smarter apps (thanks to Apple Intelligence) and a magical new interface (thanks to Liquid Glass). Developers get to play with cutting-edge AI tools and updated design APIs.

What do you think of these announcements? Are you excited to add on-device AI to your apps, or to refresh your UI with liquid translucency? Which feature will you try first? Share your thoughts or questions in the comments – the dev community would love to hear your take on Apple’s big 2025 update!

Top comments

Anna kowoski

This was a super helpful breakdown, especially for someone like me just stepping into iOS development. I'm curious though, how do you think Apple Intelligence compares to what Google or Microsoft are doing with AI on their platforms? Is it just playing catch-up or doing something unique?

Om Shree

Thanks for reading, and for the thoughtful question, Anna! 😊

You're absolutely right to draw that comparison — Apple is entering a field where Google (with Gemini) and Microsoft (with Copilot) have been making major moves. But what stands out with Apple Intelligence is their focus on privacy-first, on-device AI. While others rely heavily on cloud-based models, Apple is integrating smaller, efficient models that run locally, and only use server-side processing (via Private Cloud Compute) when absolutely necessary.

It’s not just catch-up — it’s a very Apple take on AI, prioritizing user trust and ecosystem integration. I think this could create a new standard for how consumer AI gets adopted on personal devices. Definitely one to watch! šŸš€

Let’s see how it plays out across developer tools and APIs in the coming months!
