Since Apple Journal launched, users’ top ask has been, “When is this coming to more devices?” It was my question too, so helping bring Journal from iOS to iPadOS and macOS was a dream project.
As an intern, I owned three core features: Settings, Tips, and Shortcuts. I refreshed Settings across all three platforms, crafted a Tips experience for widescreen formats, and developed the logic for a Mac keyboard shortcut system. I also built a 300-component design system that cut my team’s work time in half. My Journal work is now live on devices running iPadOS 26 and macOS Tahoe 26.
Beyond Journal, I set out to bring Apple Intelligence into one of the company’s Services apps for the first time. After discovering that a key feature was underused, I explored how AI could make it contextual, personalized, and engaging. In five weeks, I built a version that doubled engagement in testing and established frameworks for future AI integrations across Apple’s Services family.
I started by partnering with Research to understand why engagement was low. The feature was meant to be an entry point for new users, but most found it irrelevant, which turned them away.
I reframed the challenge with a series of questions:
How might we transform this feature into a core part of the UX?
How might we leverage data to make interactions feel relevant without sacrificing trust?
How might we surface this feature to reduce friction and make this app feel more intuitive?
I explored and tested dozens of solutions with partners across Design, Engineering, and Research. Ultimately, I created a primary flow in which AI generated personalized versions of the feature from shared data, plus a follow-on flow that kept surfacing them as people interacted with the app. In testing, engagement doubled.
Because there were no visual precedents for AI yet, I worked across teams to pressure-test both feel and feasibility. That allowed me to ship in just five weeks while documenting frameworks for future AI integrations. In my final week, I presented the work to leadership, including Apple’s Head of Product.
Work under NDA.