
March 1, 2026
TL;DR
UX isn't aesthetics — it's the engineering discipline that determines whether your custom software gets adopted or abandoned.
For mid-market companies investing $200K–$500K+ in a custom ERP or operational platform, skipping UX research and design is the fastest way to blow your budget on rework, change requests, and workarounds.
The companies that build UX into discovery and development from day one see higher adoption, lower support costs, and faster time-to-value.
This article breaks down what that process looks like and why it matters at the budget level.
If you're a director of operations, a COO, or a CTO at a company with 50–250 employees, you've probably lived through at least one software implementation that looked great in the demo and fell apart in practice.
The dashboard was polished. The feature list checked every box. But six months after go-live, your team was back in spreadsheets, building workarounds, or submitting a never-ending stream of change requests.
Not because the code was broken, but because the software didn't match how your people actually work.
That's a UX failure. And it's the single most expensive category of failure in custom software projects.
UX (user experience) is the engineering discipline that ensures software is built around your team's real workflows, not around assumptions made in a conference room. It determines how information is structured, how tasks flow from one step to the next, how errors are prevented, and how quickly a new employee can become productive in the system.
When UX is done right, it's invisible. The software just works the way your team expects it to. When it's done wrong — or skipped entirely — you feel it in adoption rates, support tickets, rework costs, and the quiet erosion of the ROI your project was supposed to deliver.
If you're reading this, you're likely evaluating whether to build custom software — an ERP, an operations platform, a workflow system — instead of buying another off-the-shelf tool that forces your business to conform to someone else's process.
That decision to go custom is often the right one for mid-market companies. But it comes with a responsibility that off-the-shelf buyers don't face: you own the design decisions.
When you buy Salesforce or NetSuite, the UX is what it is. You adapt to it (or you don't, which is why 77% of companies report difficulty reengineering their processes to fit ERP requirements).
But when you build custom, you have the opportunity — and the obligation — to design software that fits your operations like a glove.
That opportunity is wasted if UX is treated as a coat of paint applied after the engineering is done.
UX needs to be embedded in the project from the very first week of discovery. The architecture decisions, the data model, the workflow logic — all of it should be informed by rigorous research into how your team actually works today and how the software needs to support them tomorrow.
For executives evaluating a custom software partner, it's worth understanding what a serious UX process involves. This isn't about wireframe aesthetics. It's about de-risking your investment.
Before a single line of code is written, a competent development team conducts user research. That means interviewing the people who will use the system every day — not just the executives who commissioned it.
This matters because there's almost always a gap between how leadership describes a process and how the frontline team actually executes it. The warehouse manager toggling between three browser tabs, a spreadsheet, and a paper clipboard to complete a receiving workflow? That's a requirements goldmine that never surfaces in a stakeholder meeting.
Discovery research includes direct interviews with end users across roles and levels, contextual observation (watching people work in their actual environment), analysis of existing systems to identify which features are actually used versus which were built and ignored, and documentation of workarounds — the shadow processes your team has invented to compensate for tool gaps.
The output isn't a report that gets filed away. It's the foundation for every design and engineering decision that follows. When this research is rigorous, the resulting software is dramatically more likely to be adopted, efficient, and aligned with the business outcome it was built to deliver.
This is a core part of how software discovery should work. Discovery that skips user research produces specifications based on assumptions. Discovery that includes it produces specifications based on evidence.
Information architecture (IA) determines the navigation structure, menu hierarchy, feature grouping, and screen flow of your software. It's the discipline most responsible for whether users can find what they need quickly — or waste time hunting through menus and clicking through unnecessary screens.
The most common IA mistake in enterprise software is organizing the system by department or module (HR, Finance, Operations) rather than by task (submit a request, approve a request, check status). The first structure makes sense to the team that built it. The second makes sense to the people who use it. That distinction is often the difference between software that gets adopted and software that gets resisted.
For a mid-market company where the same person might wear three hats, task-based architecture is even more critical. Your operations lead who also handles procurement and manages vendor relationships needs a system that supports how they actually move through their day — not one that forces them to log into three different modules.
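The module-versus-task distinction can be made concrete as data. A hypothetical sketch (the roles, tasks, and group names below are invented for illustration, not from any real system):

```typescript
// Hypothetical navigation models for illustration; all names are invented.

// Module-based IA: entries grouped by the department that owns them.
const moduleNav: Record<string, string[]> = {
  Finance: ["Approve invoice", "Run payment batch"],
  Operations: ["Submit purchase request", "Check request status"],
  Procurement: ["Approve purchase request", "Manage vendors"],
};

// Task-based IA: entries grouped by what the user is trying to do,
// regardless of which department "owns" the underlying screen.
const taskNav: Record<string, string[]> = {
  "Submit a request": ["Submit purchase request"],
  "Approve a request": ["Approve purchase request", "Approve invoice"],
  "Check status": ["Check request status"],
  "Manage vendors": ["Manage vendors"],
};

// Which top-level group(s) must a user open to reach a given task?
function findGroups(nav: Record<string, string[]>, task: string): string[] {
  return Object.keys(nav).filter((group) => nav[group].includes(task));
}
```

In the module-based model, `findGroups(moduleNav, "Approve purchase request")` forces the user to know that Procurement owns that screen; in the task-based model the same task is reachable directly by intent ("Approve a request"), which is exactly what a three-hat operations lead needs.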
Interaction design governs what happens when a user clicks a button, submits a form, or navigates between screens. It's the discipline responsible for making the software feel responsive, predictable, and trustworthy.
The principles that matter most for operational software:

Error prevention. Design interactions so mistakes are hard to make: disable submit buttons until required fields are complete, use date pickers instead of free-text fields, validate inputs in real time.

Immediate feedback. Every action produces a visible, understandable response: no silent submissions, no mystery loading states.

Consistency. If a gesture or pattern works one way in one part of the system, it works the same way everywhere.
These aren't aesthetic choices. They're engineering decisions that directly affect data quality, task completion time, and user confidence in the system.
When your team trusts the software, they use it. When they don't, they build workarounds.
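The error-prevention principle above can be sketched as a small validation routine. This is a minimal, hypothetical example (the receiving-form fields and rules are invented for illustration): the same logic that produces per-field feedback on every change also gates the submit control.

```typescript
// Hypothetical receiving-form model; field names and rules are invented.
interface ReceivingForm {
  poNumber: string;     // required, non-empty
  receivedDate: string; // required, ISO date supplied by a date picker
  quantity: number;     // required, positive whole number
}

// Run on every change so feedback is immediate, not deferred to submit.
function validate(form: ReceivingForm): Map<string, string> {
  const errors = new Map<string, string>();
  if (form.poNumber.trim() === "") {
    errors.set("poNumber", "PO number is required.");
  }
  if (Number.isNaN(Date.parse(form.receivedDate))) {
    errors.set("receivedDate", "Pick a valid date.");
  }
  if (!Number.isInteger(form.quantity) || form.quantity <= 0) {
    errors.set("quantity", "Quantity must be a positive whole number.");
  }
  return errors;
}

// Error prevention: the submit button stays disabled until every rule
// passes, so an invalid record can never reach the server at all.
function canSubmit(form: ReceivingForm): boolean {
  return validate(form).size === 0;
}
```

The design choice worth noting: validation and submit-gating share one function, so the rules can never drift apart, and each error message names the field it belongs to, which is what makes the feedback "understandable" rather than a generic failure banner.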
Usability testing means putting prototypes and in-development features in front of real users, observing how they interact with them, and identifying where they struggle.
The economics are straightforward: a wireframe that gets rejected takes a day to rebuild. A fully developed feature that gets rejected takes weeks. Testing throughout development — not just once before launch — catches problems when they cost hours to fix rather than sprints.
Modern usability testing doesn't require flying users to a lab. Remote unmoderated testing platforms allow teams to recruit participants, define tasks, and collect recordings without scheduling synchronous sessions.
The best development teams run lightweight tests every sprint or every other sprint, validating individual features as they're built.
For a mid-market company investing $200K–$500K in a custom ERP or operational platform, the temptation to reduce scope on UX is understandable. It looks like a soft cost — something you can trim without affecting the "real" engineering work.
The data says otherwise.
When UX research is skipped, you pay for it in change requests after launch. Requirements that were assumed rather than validated surface as rework — typically at 5–10x the cost of catching them in discovery.
When information architecture is neglected, you pay in adoption resistance. Users can't find what they need, tasks take too many clicks, and features go unused because nobody knows they exist.
When usability testing is cut, you pay in support costs and workarounds. The system works technically but fails operationally, and your team routes around it with spreadsheets and manual processes — defeating the purpose of the investment.
Consider the math: if a workflow that should take 2 minutes takes 5 because of poor interface design, and each of 100 employees performs it 50 times a day, your organization is losing 250 hours of productive time per day. That's not a design problem. That's an operational cost with a dollar sign on it.
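That back-of-envelope figure checks out directly. The numbers below are the article's illustrative figures, and the hourly rate is an assumed value added for illustration, not a benchmark:

```typescript
// Illustrative figures from the text, not measured data.
const minutesLostPerRun = 5 - 2; // poor design adds 3 minutes per execution
const runsPerEmployeePerDay = 50;
const employees = 100;

// 3 * 50 * 100 = 15,000 minutes = 250 hours of lost time per day.
const hoursLostPerDay =
  (minutesLostPerRun * runsPerEmployeePerDay * employees) / 60;

// At an assumed fully loaded rate of $50/hour, the daily drag in dollars:
const assumedHourlyRate = 50;
const dailyCost = hoursLostPerDay * assumedHourlyRate; // $12,500 per day
```

Plugging in your own headcount, frequency, and loaded labor rate turns a vague "the UI is slow" complaint into a line item you can weigh against the cost of doing UX properly.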
If you're evaluating custom software development partners, the way they talk about UX tells you a lot about how your project will go.
Be cautious if UX is presented as a separate phase that happens after requirements are locked, or if "design" is described primarily in terms of visual aesthetics (colors, fonts, branding), or if user research isn't part of the proposed discovery process. These are signals that UX is being treated as a finishing step rather than an engineering discipline — and that your risk of adoption failure is higher than it needs to be.
Look for partners who embed UX research into the discovery phase from day one, who interview your actual end users (not just your leadership team), who test prototypes with real users before committing engineering resources, and who can articulate how UX decisions affect architecture, data modeling, and workflow logic — not just the visual layer.
At Moonello, UI/UX design is built into our software discovery process and carries through the entire development lifecycle. We've built ERPs, operational platforms, and workflow systems for mid-market companies — and the UX discipline is a major reason those systems get adopted and used rather than resisted and routed around.
One question we hear frequently: "Doesn't AI make UX less important? Can't AI just generate interfaces now?"
The short answer: AI is accelerating UI production. It's making the visual and component layer faster and cheaper to produce. But that makes UX — the strategic layer underneath — more important, not less.
AI can generate a dozen dashboard layouts in minutes. It cannot tell you which metrics the dashboard should display, which user persona it's designed for, or how it fits into your team's daily workflow. That requires research, strategic thinking, and the kind of domain understanding that comes from interviewing your people and observing how they work.
AI-powered features inside your software also create new UX challenges. Users need to understand what an AI feature does, how confident they should be in its outputs, and how to override it when it's wrong. Trust, transparency, and control are now core design concerns for any system that includes AI functionality.
The interface layer is becoming more commoditized. The thinking behind it — what to build, for whom, and why — is becoming more valuable.
If you're evaluating a custom ERP, operations platform, or workflow system — and you want to make sure the investment delivers — we should talk.
Moonello builds custom software for mid-market companies with 50–250 employees. Our process starts with discovery that includes real user research, not just stakeholder assumptions. That's how we build systems that get adopted on day one and deliver compounding value over time.
Schedule a discovery conversation
We'll walk through your current systems, your pain points, and what a custom build scoped around your team's actual workflows would look like. No pitch deck. Just a grounded conversation about whether this is the right move for your business.
Key Takeaways
UX is an engineering discipline, not a design exercise. It determines whether your custom software investment achieves its business objectives or becomes an expensive tool your team works around.
For custom builds, UX is your responsibility. Unlike off-the-shelf software where you inherit someone else's design decisions, a custom build gives you the opportunity to design software that fits your operations. That opportunity is wasted without rigorous UX research and design.
User research during discovery is the highest-ROI activity in your project. Interviewing end users, observing real workflows, and validating assumptions before engineering begins prevents the change requests and rework that blow budgets after launch.
Information architecture determines adoption. Software organized around tasks (not departments or modules) gets used. Software that forces users to hunt through menus gets abandoned in favor of spreadsheets.
Usability testing throughout development catches problems when they're cheap to fix. A rejected wireframe costs a day. A rejected feature costs weeks.
The cost of skipping UX compounds daily. Poor interface design doesn't just frustrate users — it creates measurable operational drag in hours lost, errors made, and workarounds maintained.
AI makes UX more important, not less. As AI commoditizes the visual layer, the strategic UX decisions — what to build, for whom, and why — become the primary differentiator between software that delivers ROI and software that doesn't.