Semantic Grammar
When interfaces generate themselves, the designer's job is no longer to specify form. It is to specify meaning.
Interfaces are getting smarter, but design practice still starts with screens. A designer draws layouts, connects them into flows, and hands off specifications. This worked when every surface was authored by hand. It breaks when interfaces adapt at runtime, generate elements on the fly, or serve actors whose needs are structurally in tension.
This essay proposes a different starting point. Instead of designing screens and extracting patterns from them, begin with what a system means: the objects that persist, the states that matter, the conflicts between the actors involved. Call this the semantic layer. Above it, a signal layer encodes what shapes that meaning for its audience, drawing on brand, regulation, user context, and editorial judgment. Expression is not authored directly. It is derived from the interaction of meaning and signal.
The framework raises more questions than it settles. Can derivation produce interfaces that feel designed, not just functional? What vocabulary describes register without collapsing into visual properties? Where does the designer's authority sit when form is no longer fully specified? These are open problems. The goal here is not to close them but to make them precise enough to work on.
Most interaction design still begins with screens. A designer opens a canvas, draws rectangles, fills them with content, and connects them into flows. Components get extracted, systematized, documented. The output is a specification: this screen, this layout, these states, this behavior.
This worked when designers controlled the adaptation. Responsive design and multi-platform systems already broke the one-screen assumption years ago, but the response was to multiply specifications: breakpoints, platform-specific layouts, device-targeted components. The designer still authored every variant. What is changing now is that adaptation itself is moving beyond what a designer can specify in advance. Interfaces adapt to user histories, situational contexts, and content states in combinations no designer anticipated. Generative UI produces interface elements at runtime. Adaptive systems reorder, summarize, and prioritize content without a designer touching a layout.¹
The response so far has been to systematize the surface: design tokens, component libraries, multi-platform design systems. These help, but they solve the problem at the wrong layer. They make the visual expression consistent and portable. They don't address what the interface means.
What holds across every adaptation, every surface, every context is not a component or a layout. It's the meaning underneath: what the system is about, who it serves, and where their interests collide. That layer needs its own grammar.
The semantic layer describes what a system is about. Not what it looks like, not what it does, but the meaningful entities, relationships, and tensions that persist regardless of how or where the system is rendered.
This is the layer most design practice skips. A team jumps from a problem statement to wireframes, treating the space between as "discovery" or "synthesis" rather than as a designable structure in its own right. The semantic layer makes that structure explicit.
It consists of five elements:
Actors
Not "users." Agents with distinct motivations, stakes, and structural relationships to each other.
In a hospital discharge system, four actors operate on the same event: the patient wants to go home, the nurse needs documentation completed before releasing the bed, the billing department needs insurance codes submitted, and the attending physician needs to confirm no outstanding clinical risks. These are not user personas with different demographics. They are agents whose goals are structurally in tension. The design problem lives in that tension, not in any individual actor's journey.
Objects
Things with persistence and meaning beyond a single interaction.
A rental lease agreement exists before the tenant moves in (draft), during occupancy (active), during a dispute (contested), and after move-out (archived but legally binding). It is the same object across all of these. Its meaning shifts at each stage, but it never stops mattering. Objects like this are the stable anchors of the semantic layer. If a system loses track of them, no amount of UI polish compensates.²
States
What an object can be, and what those states mean to each actor involved.
A prescription can be written, filled, partially filled, expired, or recalled. "Partially filled" means something different to the patient (I only got half my medication), the pharmacist (insurance covered 30 of 60 pills), and the prescribing doctor (was there an interaction concern?). The state is one fact. Its meaning fans out across actors. Designing for the state without designing for that divergence produces interfaces that are technically correct and semantically useless.
Events
What changes state, why it matters, and who triggers it.
A flight gets cancelled. For the passenger, it is a disruption: missed connection, hotel needed, plans unwinding. For the gate agent, it is a queue forming: 120 people who all need rebooking in the next 40 minutes. For the airline's operations center, it is a resource problem: crew reassignment, aircraft repositioning, cascading delays across the network. One event, three divergent meanings, each demanding a different response within minutes. Events are where the system moves. They are also where misalignment between actors becomes visible.
Conflicts
Where actor interests collide. This is the element most often missing from design work, and the one that matters most.
Airbnb's review system sits on a three-way conflict. The guest wants honesty. The host wants favorable ratings. Airbnb wants trust signals that help future guests decide. The design problem is not "how should reviews look." It is: whose version of the truth gets expressed, and how is retaliation prevented? Every meaningful design decision in that system traces back to this conflict. The visual form is downstream.
Conflicts are what make the semantic layer more than a data model. A database schema can capture objects, states, and events. It cannot capture the fact that two actors need incompatible things from the same object at the same time. That is a design problem, and it requires design judgment.
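To make that distinction concrete, the five elements can be sketched as types. This is a hypothetical encoding, not a prescribed schema: every name and field here is illustrative. The point of the sketch is structural — `meaningByActor` captures the fan-out of one state across actors, and `Conflict` records a tension no database column holds.

```typescript
// Hypothetical encoding of the semantic layer. Names and fields are
// illustrative assumptions, not a prescribed schema.
type ActorId = string;

interface Actor {
  id: ActorId;
  stake: string; // what this actor needs from the system
}

interface SemanticObject {
  id: string;
  kind: string;  // e.g. "prescription", "lease"
  state: string; // current lifecycle state
  // One state, divergent meanings: the fan-out is keyed per actor.
  meaningByActor: Record<ActorId, string>;
}

// The element a schema cannot capture: two actors needing
// incompatible things from the same object at the same time.
interface Conflict {
  object: string;
  between: [ActorId, ActorId];
  tension: string; // e.g. "honesty vs. favorable rating"
}

// The prescription example above, encoded:
const prescription: SemanticObject = {
  id: "rx-102",
  kind: "prescription",
  state: "partially-filled",
  meaningByActor: {
    patient: "I only got half my medication",
    pharmacist: "insurance covered 30 of 60 pills",
    prescriber: "was there an interaction concern?",
  },
};
```

A data model would stop at `state`. The semantic layer is the rest: who reads that state, what it means to each of them, and where those readings collide.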
Meaning is contextual
One more property of this layer: the same object, state, or conflict can carry different meaning for different actors, or in different situations. This is not a problem to resolve. It is a condition to design with. The semantic layer does not flatten these differences. It makes them visible so the signal and expression layers can respond to them.
If the semantic layer describes what a system is about, the signal layer describes what shapes how that meaning reaches its audience. Not signals as in data inputs: sensor readings, system events, user actions. Those are events in the semantic layer. Signals here are closer to the concept in foresight research: observable forces that shape what is emerging. A cultural shift, a regulatory climate, a brand's accumulated voice. A red notification badge in a Western app means "attention needed." In Chinese financial apps, red signals prosperity. Same visual element, opposite semantic load. These forces exist whether or not anyone is designing with them. They cannot be authored from scratch. They can be read, amplified, dampened, or overridden. The designer's job in this layer is distinguishing signal from noise and orchestrating how the signals that matter shape what gets expressed.
Signals come from multiple sources: the semantic layer itself (a bank balance of €12.40 three days before rent generates urgency no designer needs to inject), brand and organizational identity, domain and regulation, cultural context, user behavior and preferences, and the designer's own editorial judgment. These sources are not the structure of the layer. They are what feeds it. The structure is what signals act on — five dimensions along which every signal, regardless of origin, shapes expression.
Gravity
What is at stake. A medication dosage confirmation carries weight that a playlist suggestion does not. High gravity narrows the range of acceptable expression toward the unambiguous, the legible, the predictable. Low gravity opens the range: more visual richness, more surprise, more play. Gravity is often inherent to the semantic layer: a conflict involving health, money, or legal standing generates it. But it can also be imposed. Regulation raises the gravity of cookie consent beyond what the interaction itself would suggest.
Tempo
How much time pressure exists. An emergency notification compresses everything: fewer elements, higher contrast, immediate action. An onboarding flow expands: more explanation, lower commitment per step, room to explore. Tempo is not just clock time. A person checking their bank balance on a Monday morning and the same person checking it after an unexpected charge have different tempos in the same interface, on the same object, seconds apart.
Intimacy
How personal the content is. A medical record, a journal entry, a salary number: these demand a different expression than a weather forecast or a shipping update. Intimacy affects what can be shown on a shared screen, what should be spoken aloud by a voice interface, what belongs behind an extra tap. A banking app that displays the account balance on the home screen treats it as routine. One that hides it behind a tap treats it as intimate. Same data, different judgment about exposure. That judgment is culturally situated: what feels private, what feels routine, what demands discretion varies across contexts and audiences.
Authority
Who is speaking and with what standing. Stripe's error messages are calm, specific, and developer-facing: "Your API key is missing the sk_live_ prefix." This positions the product as a peer, not a gatekeeper. The same error at a consumer bank would need a different register entirely: institutional, reassuring, less technical. Authority is also shaped by the relationship between system and user. A first interaction carries different authority than a five-year relationship. And authority can be imposed from outside: when a legal requirement speaks through the interface, it overrides the brand's own voice.
Reversibility
How easily an action can be undone. Deleting an account is irreversible. Changing a profile photo is trivial. Reversibility calibrates friction: irreversible actions demand confirmation, deliberation, sometimes intentional slowness. Reversible actions invite directness, speed, low ceremony. An email client that offers "undo send" for 30 seconds treats the action as reversible and keeps the interface fast. Without that window, sending would need the same gravity as a bank transfer. The same action, different reversibility, different expression.
Mixed signals
These five dimensions make signal conflicts precise. Take a fitness app where the user's preference says "don't show me my weight" but the medical domain says "BMI is clinically relevant for this recommendation." This is not a vague disagreement. It is a conflict between intimacy (the number is personal, the user has asked for distance) and gravity (clinical relevance raises the stakes of suppressing it). The designer's role is not to pick one signal but to make the trade-off explicit and defensible: surface the clinical relevance without making the number the centerpiece.
When sources produce contradictory pressures on the same dimension, the result is mixed signals. When nobody resolves them, the system oscillates: playful in contexts that demand seriousness, rigid in contexts that invite exploration. This looks like inconsistency, but the root cause is a missing editorial decision. The designer's role in the signal layer is closer to orchestrator than to traditional author. Not because it requires less agency, but because the agency shifts. The designer still makes consequential choices: which signals take priority, what tensions are acceptable, where the system should resist adaptation and hold firm. That is a stance, not a neutral coordination task.
Register
The combined effect of signals along these dimensions produces register: the appropriate expression for this meaning in this moment. Register is not a single axis. It is the composite of gravity, tempo, intimacy, authority, and reversibility as they converge in a specific context. The tone of a banking app shifts between checking a balance (low gravity, low tempo, routine) and approving a large transfer (high gravity, high intimacy, low reversibility). Both are the same product, the same brand, the same user. The difference is register.
Register is what gives a system range without losing coherence. A product that sounds the same in every context is rigid. A product that sounds different in every context is incoherent. The signal layer defines the range of acceptable registers and the conditions that move between them.
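One way to make register operational is to treat each dimension as a value that competing signals propose, and resolve the competition by explicit priority. The dimension names below come from this essay; everything else — the 0..1 scale, the numeric priority, the function name — is an illustrative assumption, a sketch of the editorial decision made explicit rather than a definitive implementation.

```typescript
// Illustrative sketch: register as a composite of the five dimensions.
// The 0..1 scale and numeric priorities are assumptions; the essay
// fixes the dimensions, not their representation.
type Dimension = "gravity" | "tempo" | "intimacy" | "authority" | "reversibility";
type Register = Record<Dimension, number>;

interface Signal {
  source: string;   // e.g. "regulation", "brand", "user-preference"
  dimension: Dimension;
  value: number;    // proposed position on that dimension
  priority: number; // higher wins: the editorial decision, made explicit
}

// For each dimension, the highest-priority signal wins; unaddressed
// dimensions keep their baseline. A real system would surface ties
// and conflicts instead of silently resolving them.
function resolveRegister(signals: Signal[], base: Register): Register {
  const winners = new Map<Dimension, Signal>();
  for (const s of signals) {
    const cur = winners.get(s.dimension);
    if (!cur || s.priority > cur.priority) winners.set(s.dimension, s);
  }
  const out: Register = { ...base };
  winners.forEach((s, dim) => { out[dim] = s.value; });
  return out;
}

// Same product, same user: approving a large transfer shifts the register.
const routine: Register = { gravity: 0.2, tempo: 0.3, intimacy: 0.4, authority: 0.3, reversibility: 0.9 };
const transfer = resolveRegister(
  [
    { source: "semantic", dimension: "gravity", value: 0.9, priority: 2 },
    { source: "semantic", dimension: "reversibility", value: 0.1, priority: 2 },
    { source: "brand", dimension: "gravity", value: 0.5, priority: 1 }, // outranked
  ],
  routine,
);
```

The brand's proposal for gravity loses to the semantic layer's, and tempo stays at its baseline: the range of registers is defined once, and context moves the system within it.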
The expression layer is derived from the semantic layer by orchestrating the signal layer.
This is the claim that separates Semantic Grammar from conventional design practice. It is also where the framework faces its hardest test: showing that derivation can produce coherent, appropriate interfaces, not just technically functional ones.
How derivation works
Take a medical appointment. The semantic layer says: obligation, health stakes, time-bound, involves a specific provider. The signal layer says: high gravity, high authority, low reversibility (missing the appointment has consequences), tempo compressed by the fact that the person is in transit on their phone.
A well-derived form: a high-contrast notification with the clinic name, appointment time, and a one-tap action to open navigation. No decorative elements, no "You've got this!" copy, no illustration of a stethoscope. Every element traces back to a signal.
A badly derived form: a cheerful card with casual copy, buried address, and an animation. The signals were all there. The derivation ignored the register. The result is technically correct (right data, right time) and semantically wrong (wrong tone, wrong priorities, wrong assumptions about the moment).
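The contrast between the two forms can be sketched as a derivation function. The threshold values, constraint names, and the `derive` signature are all illustrative assumptions; what matters is that every output traces back to a dimension of the register, so the cheerful card is not a taste failure but a traceable one.

```typescript
// Sketch: deriving expression constraints from a register. Threshold
// values and constraint names are illustrative assumptions.
interface Register {
  gravity: number; tempo: number; intimacy: number;
  authority: number; reversibility: number;
}

interface Expression {
  contrast: "normal" | "high";
  decoration: boolean; // illustrations, playful copy, animation
  maxActions: number;  // how many actions the surface offers
  tone: "playful" | "neutral" | "institutional";
}

function derive(r: Register): Expression {
  const urgent = r.gravity > 0.7 || r.tempo > 0.7;
  return {
    contrast: urgent ? "high" : "normal",
    decoration: !urgent && r.gravity < 0.4, // decoration only at low stakes
    maxActions: r.tempo > 0.7 ? 1 : 3,      // compressed tempo: one tap
    tone: r.authority > 0.7 ? "institutional"
        : r.gravity < 0.3 ? "playful" : "neutral",
  };
}

// The appointment reminder: high gravity, compressed tempo, clinical authority.
const appointment = derive({ gravity: 0.9, tempo: 0.8, intimacy: 0.6, authority: 0.8, reversibility: 0.2 });
```

The badly derived cheerful card corresponds to ignoring `urgent` entirely: the same data reaches the screen, but no signal constrained its form.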
Derivation heuristics
Not every derivation decision needs to be made from scratch. Certain patterns hold across contexts:
High gravity, less adaptation. When the consequences of misunderstanding are severe, the system should reduce ambiguity, increase legibility, and resist adaptation. A medication dosage confirmation should look the same every time, on every device, in every mood. Predictability is the point.
Exploratory contexts, more surface area. When the user is browsing, discovering, or comparing, the interface can show more, assume less, and lower the commitment required per action. A music discovery feed can afford visual richness and surprise that a payment confirmation screen cannot.
Repeat use, progressive compression. As a user returns to the same object or flow, the interface can compress: show less context, assume more familiarity, surface shortcuts. A first-time booking flow needs full disclosure. The fiftieth needs a confirmation button.
First encounter, full semantic disclosure. Before asking for action, show what this is. A new user encountering an unfamiliar object type needs to understand what they are looking at before they can act on it meaningfully. Name the thing. Show its state. Make the actors visible.
Conflicting signals, make the trade-off visible. When signals from different sources pull in opposite directions, the worst response is to silently pick one. Surface the tension. A clinical recommendation that overrides a user preference should say so, not pretend the preference doesn't exist.
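Two of these heuristics compose naturally and can be sketched together: progressive compression on repeat use, overridden by high gravity, which resists adaptation entirely. The thresholds and names below are illustrative assumptions, not part of the framework.

```typescript
// Sketch of two heuristics composed: progressive compression on repeat
// use, overridden by high gravity. Thresholds are illustrative assumptions.
type Disclosure = "full" | "condensed" | "shortcut";

function disclosureLevel(visits: number, gravity: number): Disclosure {
  if (gravity > 0.7) return "full";    // high gravity: same form every time
  if (visits <= 1) return "full";      // first encounter: full semantic disclosure
  if (visits < 20) return "condensed"; // growing familiarity: compress context
  return "shortcut";                   // the fiftieth booking: a confirm button
}
```

A dosage confirmation stays at `full` no matter how familiar it becomes; a routine booking compresses toward a single confirmation.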
When derivation fails
The framework should be honest about its failure modes.
Under-specified semantic layer. If the team skipped the semantic work, derivation has nothing to derive from. The result is generic UI that looks functional but carries no meaning. Forms that collect data without communicating what happens to it. Dashboards that display metrics nobody acts on. The system works, technically. It means nothing.
Unresolved signal conflicts. When nobody decides which signals take priority, the system oscillates between registers: playful where the context demands seriousness, rigid where it invites exploration. The symptom reads as inconsistency; the cause is an editorial decision that was never made in the signal layer.
Over-derived expression. Adaptation without constraint. The system tries to be so responsive to context that it loses identity. Every surface looks different, every interaction feels unfamiliar. Users cannot build a mental model because the interface never holds still long enough to become recognizable. This is the failure mode of context-sensitivity without a stable semantic core.
Diagnostic questions
These questions test whether a team has done the semantic work. They are useful in product conversations, design critiques, and stakeholder alignment. A surface-level answer reveals a surface-level understanding of the system.
1. Describe your product without naming a single UI element.
Surface answer: "We help people manage their finances." Semantic answer: "We hold a person's financial obligations and commitments, show where they conflict with their resources, and surface the moments where a decision is needed before a consequence arrives." The first answer could describe a spreadsheet. The second describes a system with objects, states, events, and conflicts.
2. What are the objects in this system that have a life beyond one interaction?
This question separates meaningful objects from UI artifacts. A "card" on a dashboard is not an object. The thing the card represents might be. If nothing persists beyond the session, the system is a tool, not a product. That is fine, but it changes what needs to be designed.
3. What states can those objects be in, and what do those states mean to each actor?
If every actor experiences the same state the same way, either the system is simple or the team hasn't looked closely enough. A job application in "under review" means waiting (to the candidate), workload (to the recruiter), and pipeline velocity (to the hiring manager). The state is one word. The meaning is three different concerns.
4. Where do your actors' interests conflict?
Surface answer: "Users want simplicity, stakeholders want data." Semantic answer: "The landlord wants automated rent collection. The tenant wants to delay payment by three days without penalty. The platform wants predictable cash flow to maintain its lending margin." If the answer is "they don't really conflict," the team hasn't found the design problem yet.
5. If the interface were rebuilt from scratch tomorrow, what would have to stay the same for it to still be the same product?
This question isolates the semantic core. Not the brand, not the layout, not the component library. The things that, if removed, would make the product a different product. Those things belong in the semantic layer. Everything else is derived.
Semantic Grammar is not a method you adopt on Monday. It is a lens: a way of looking at product and system work that makes certain problems visible that screen-first practice obscures. The claim is that meaning, not form, is the stable layer. That signals from multiple sources compete for expression, and that managing that competition is the designer's core work. That form follows from both, not from a canvas.
If you work on a system where multiple actors need different things from the same object, the semantic layer gives you a way to map that tension before anyone opens a design tool or writes a line of code. If you work on a system that adapts across contexts, the signal layer gives you a way to define what holds and what flexes without specifying every variant by hand. If you build design systems, the framework suggests that tokens and components solve the problem one layer too late.
None of this is proven. The diagnostic questions are testable. The derivation heuristics are falsifiable. What this text offers is not answers but structured questions: specific enough to guide work, open enough to survive contact with real systems. The useful response is not agreement or rejection but a better version of the argument, tested against a domain this essay did not cover.
To build on:
Semantic layer. Objects, states, and conflicts persist across every surface. They are more stable than any interface built around them. Start there.
Signal layer. Meaning does not shape itself. Signals from brand, regulation, culture, users, and editorial judgment compete for influence along five dimensions: gravity, tempo, intimacy, authority, reversibility. The designer orchestrates that competition.
Expression layer. Expression is what results when meaning meets signal in a specific context. Gravity narrows the range. Exploration widens it. Repetition compresses it.
Diagnostic questions. Can you describe the product without naming a screen? Can you say where your actors' interests collide? If not, the layer underneath is missing.
Designer's role. Less direct authorship, not less authority. The shift is from specifying form to shaping the conditions under which expression is produced.
There is a guided interview skill for AI agents. It runs the five sections as a conversation and produces a semantic grammar context document at the end. Install it, open your agent, and see where the vague answers are.
```shell
npx semantic-grammar
```
1. The "generative UI" pattern, where AI produces interface elements at runtime, is directionally aligned with this framework. What most current implementations lack is a formalized constraint system between meaning and output. The interface is either fully pre-designed (static components) or fully improvised (raw LLM output). The signal layer is the missing middle.
2. Object-Oriented UX (OOUX), developed by Sophia Prater, argues that design should start with objects and their relationships rather than screens and flows. This is the right instinct: objects are more stable and meaningful than the screens built around them. But OOUX treats the object model as an input to a conventional design process. The designer still manually translates objects into UI. Semantic Grammar takes the next step: if objects, states, and conflicts are formally described, form can be derived rather than authored. Similarly, Domain-Driven Design (DDD) in software engineering proved that shared vocabulary between domain and implementation reduces errors. Semantic Grammar extends this principle to the interface layer. See Eric Evans' Domain-Driven Design (2003).