BYOM (Bring Your Own Memory) - by David Hoang
Apple introduced Focus modes in iOS 15 as an evolution of Do Not Disturb, letting users filter notifications and even customize Home Screens by context (Work, Personal, Sleep). In iOS 16, Focus became smarter with Lock Screen pairings and filters across apps like Mail, Calendar, and Safari. iOS 17 refined this with more granular notification controls. Taken together, Focus has evolved from muting distractions to a full context-aware filtering system, a model that shows how AI memory could also be partitioned and personalized by mode rather than being “on” or “off.”
That same framing will be essential for AI memory. Not “on” or “off,” but a filter: what memory is relevant in this context?
One way to achieve this is through a memory interpreter—a layer that sits between your raw personal history and the work context you’re stepping into. Imagine you’ve been doing deep personal research on a topic—reading, journaling, exploring ideas in your own voice. When you shift into a professional setting, the interpreter could filter that knowledge, stripping away casual notes, personal anecdotes, or tone, while surfacing only the relevant facts and references in a format appropriate for work.
In practice, it would act like a translator, allowing the richness of your personal exploration to inform your professional contributions without oversharing or leaking unintended details. It’s not about fusing personal and work memory, but about controlled permeability.
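To make the idea concrete, here is a minimal sketch of what such an interpreter might look like. Everything here is hypothetical: the `MemoryItem` fields, the context profiles, and the filtering rules are illustrative assumptions, not a real system. The point is only that "controlled permeability" can be modeled as a filter over tagged memories, keyed by the context you are stepping into.

```python
from dataclasses import dataclass, field

@dataclass
class MemoryItem:
    content: str
    tags: set = field(default_factory=set)  # e.g. {"personal", "research"}
    tone: str = "neutral"                   # e.g. "casual" or "neutral"

# Hypothetical context profiles: which tags and tones pass the filter.
# A real interpreter would likely learn or configure these per user.
CONTEXT_PROFILES = {
    "work": {
        "allow_tags": {"research", "reference"},
        "allow_tones": {"neutral"},
    },
    "personal": {
        "allow_tags": {"personal", "journal", "research"},
        "allow_tones": {"casual", "neutral"},
    },
}

def interpret_memory(items, context):
    """Return only the memories appropriate for the given context."""
    profile = CONTEXT_PROFILES[context]
    return [
        m for m in items
        if m.tags & profile["allow_tags"]       # at least one allowed tag
        and m.tone in profile["allow_tones"]    # tone fits the setting
    ]

memories = [
    MemoryItem("Summary of papers on memory architectures", {"research"}, "neutral"),
    MemoryItem("Journal entry on how this topic felt to explore", {"personal", "journal"}, "casual"),
]

# In a work context, the casual journal entry stays private;
# only the neutral research note surfaces.
work_view = interpret_memory(memories, "work")
```

The design choice worth noting: the raw memories are never modified or merged. The interpreter is a read-time filter, so the same personal history can surface differently in each context, which is exactly the "Focus mode" framing applied to memory.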