The MBSE Reckoning · Part 1

The MBSE Reckoning: Why the Industry Is at a Breaking Point

April 10, 2026 · 4 min read · Luvian Team
MBSE · Systems Engineering · Digital Engineering · SysML v2

Model-Based Systems Engineering has won the argument. Nobody seriously disputes that models are better than documents for managing the complexity of modern engineered systems. INCOSE has declared the future of systems engineering is model-based. Every major defense program now mandates it. Aerospace, automotive, and industrial automation organizations have all put MBSE on their strategic roadmaps.

And yet.

Organizations invest in MBSE tooling, and after three months, nobody is using it. Engineers describe their tools as “dating back to the 1990s.” One practitioner - a veteran consultant who has spent years in the trenches across multiple MBSE platforms - put it plainly: “MBSE tools suck. If you compare them to tools in other industries, they would be ridiculed.”

Some practitioners are now openly asking whether MBSE is “at best overhyped, or at worst actively dying.”

This isn’t a technology problem. It’s a product problem, a people problem, and an intent problem. And it’s time to talk about it honestly.

The question has changed

The MBSE conversation has shifted. Organizations are no longer asking “should we adopt MBSE?” - that question is settled. They’re asking “why isn’t it working?” and “how do we make it actually deliver value?”

That shift matters. It means the industry is past the evangelism phase and into the accountability phase. The promises have been made. The budgets have been spent. The tools have been deployed. And in a disturbing number of cases, the result is shelfware - expensive models that nobody reads, written in a language most of the organization doesn’t speak.

Five gripes, one root cause

Over the next nine weeks, this series will unpack five structural problems we’ve identified through conversations with systems engineers, developers, program managers, and leadership across defense, aerospace, automotive, and industrial automation:

  1. The tools are hostile to humans. MBSE tools were designed by modeling language experts for modeling language experts. They optimize for ontological completeness, not for the act of engineering. The result is tools that are inaccessible to 90% of the people who need to interact with system data.

  2. MBSE lives in its own universe. Models operate in informational silos, disconnected from the daily tools and workflows engineers actually use. When the model isn’t connected to the real work, it becomes a document nobody reads, written in a language nobody speaks.

  3. Nobody knows where they are on the maturity curve. Both tool builders and adopters are flying blind. Vendors ship features without acceptance criteria. Organizations adopt MBSE without maturity frameworks. Nobody can answer “are we getting better at this?”

  4. Systems engineers and developers speak different languages. SE models don’t compile, run, or connect to CI/CD. The handoff between architecture artifacts and implementation artifacts is manual and lossy. Each discipline has its own concept of what a “model” is and what “done” looks like.

  5. Stakeholders can’t see what’s happening. MBSE was supposed to be the single source of truth. Instead, it created a priesthood - a small group who can read the models - and everyone else gets PowerPoint translations.

These aren’t five separate problems. They’re five symptoms of one structural failure: intent degrades at every handoff in the engineering pipeline. Stakeholder to product, product to SE, SE to developer, developer to QA - at each boundary, the “why” gets stripped away, leaving only the “what.”

Why this matters now

In the age of AI-accelerated implementation - where coding tools can ship features 3-5x faster - the bottleneck is no longer engineering velocity. The bottleneck is coordination capacity. Product teams still run manual processes for requirements refinement, epic management, and stakeholder negotiation. The moment a product decision becomes a ticket, the work enters a system that depends on human interpretation and prioritization rather than direct execution.

The speed paradox is real: we can implement faster than ever, but we still can’t agree on what to implement, or communicate why.

What this series will cover

Each week for the next nine weeks, we’ll go deep on one dimension of this crisis - not to complain, but to diagnose and to point toward what “MBSE 2.0” actually looks like:

  • Week 2: Your MBSE tool was designed for the wrong person
  • Week 3: The shelfware problem - when models don’t connect to work
  • Week 4: The maturity myth - why nobody knows where they are
  • Week 5: Two tribes - why SEs and developers can’t hear each other
  • Week 6: The priesthood problem - MBSE’s stakeholder visibility crisis
  • Week 7: Intent doesn’t survive the pipeline (and that’s the real problem)
  • Week 8: AI won’t save MBSE - but it might make it usable
  • Week 9: Models that run - the case for executable system architecture
  • Week 10: What comes next - a practitioner’s manifesto for MBSE 2.0

We’re writing this because we believe MBSE matters. The systems being built today - autonomous vehicles, defense platforms, medical devices, industrial robots - are too complex for documents and too critical for guesswork. The practice deserves better tools, better frameworks, and more honest conversations about where it stands.

Let’s start that conversation.


This is Part 1 of “The MBSE Reckoning,” a 10-part series from Luvian on the state and future of Model-Based Systems Engineering. Subscribe to our newsletter to get each article as it publishes.

Build better systems, faster.

Luvian is the AI system design platform for modern engineering teams. Join the waitlist for early access.
