The Software Factory Is Dead, Long Live the Software Factory

From IC Insider Coder
By Amanda Phelps, Head of Global Public Sector Partnerships and Alliances at Coder
I have watched talented government and defense teams pour years of effort into software factories, and watched those same factories quietly become the thing slowing development down. That is not a failure of the people who built them. It is a signal that the model has run its course.
For the past decade, “software factory” has been the defining concept of digital transformation across the Department of Defense and the Intelligence Community. Initiatives like Platform One, Kessel Run, Black Pearl, and Kobayashi Maru proved that modern DevSecOps could work in sensitive environments, that containers could survive an ATO, and that developers did not need to wait months for infrastructure. Those teams deserve every bit of credit they receive.
But the model they pioneered is now collapsing under its own weight. The factories we built were cathedrals: monumental, centralized, and slow to change. What the mission needs now is a framework that individual programs can adapt to fit their needs.
The untenable cost of the monolith
The original software factories had to be centralized and monolithic. Kubernetes was not yet authorized. CI/CD pipelines could not yet satisfy NIST 800-53 controls. GitOps was unproven at the classification boundary. Early adopters bore enormous compliance burdens just to establish that modern development was possible in secure environments at all.
The fundamental problem is that monolithic factories try to be everything to everyone. A single factory is expected to serve programs building web applications and programs training machine learning models, teams operating in the cloud and teams in air-gapped facilities, experienced engineers and teams encountering modern development for the first time. That breadth creates impossible tradeoffs. Make the factory standardized and you alienate programs with specialized requirements. Make it flexible and you drown in configuration complexity until the factory itself becomes the bottleneck.
These factories also became single points of failure. When key personnel leave, institutional knowledge walks out with them. Platform teams get consumed by access requests, exception handling, and organizational overhead. The infrastructure meant to accelerate delivery starts slowing it down.
A maturing commercial ecosystem changes the calculus
What changed is that commercial technology matured in ways government-built factories cannot match.
Purpose-built infrastructure automation tools now treat cluster provisioning, configuration management, and infrastructure-as-code as solved problems. Declarative approaches — defining the desired state and letting automation handle the realization — enable agencies to enforce discrete access controls, reduce insider risk, and remove human error from provisioning. These capabilities are core to the tools, not incidental to them.
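The declarative pattern these tools share can be illustrated with a minimal sketch (all names and resources here are illustrative, not drawn from any particular product): operators declare the desired state as data, and a reconciler compares it against observed state to derive the provisioning actions, so no human ever issues them by hand.

```python
# Minimal sketch of declarative reconciliation: desired state is data,
# and automation (not a person) derives the provisioning actions.
# Resource names and specs are illustrative only.

def reconcile(desired: dict, actual: dict) -> list[tuple[str, str]]:
    """Return (action, resource) pairs that converge actual toward desired."""
    actions = []
    for name, spec in desired.items():
        if name not in actual:
            actions.append(("create", name))
        elif actual[name] != spec:
            actions.append(("update", name))
    for name in actual:
        if name not in desired:  # drift: something exists that was never declared
            actions.append(("delete", name))
    return actions

desired = {
    "dev-cluster": {"nodes": 3},
    "ci-runner": {"nodes": 1},
}
actual = {
    "dev-cluster": {"nodes": 2},  # wrong size -> update
    "orphan-vm": {"nodes": 1},    # undeclared -> delete
}

print(reconcile(desired, actual))
# -> [('update', 'dev-cluster'), ('create', 'ci-runner'), ('delete', 'orphan-vm')]
```

Because the desired state is the single source of truth, undeclared resources surface automatically as drift, which is where the insider-risk and human-error benefits come from.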
For the developer experience specifically, the shift has been equally significant. Secure, cloud-based development environments address one of the most persistent pain points in classified software development: standing up a compliant workstation and becoming productive. Developers in air-gapped facilities typically wait days for properly configured systems, and waits of weeks or months are not uncommon. Modern development environments can be provisioned in minutes from an approved template, with security controls built in rather than bolted on. Coder provides this capability for IC and DoD programs — teams define workspace infrastructure as code, enforce policy at the platform level, and support everything from unclassified development through classified and disconnected environments without changing the developer workflow.
This approach also shifts the maintenance burden from overworked government platform teams to vendors with engineering resources, SLAs, and dedicated security response. When something breaks, you open a ticket rather than lose months of capability development while someone reconstructs a Kubernetes cluster from memory.
Platform engineering: Smaller teams, greater scale
The successor to the software factory is not another factory. It is a platform engineering model where small, focused teams curate technology choices and define integration patterns rather than building and operating infrastructure from scratch.
This distinction matters enormously for the IC because the scalability problem that killed monolithic factories came down to headcount. When every program office queues behind a central platform team, scaling means hiring more government employees — slow, expensive, and constrained by hiring authorities that bear little relationship to the pace of mission need. When programs consume infrastructure through self-service catalogs built on proven commercial technology, scaling becomes a software problem. That is solvable.
In this model, platform teams do three things well:
- Define what compliant deployment looks like
- Curate the approved components that programs draw from
- Provide self-service interfaces that let development teams operate within security guardrails without creating a ticket for every resource request
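Those three functions compose into a simple pattern — a hypothetical sketch, not any vendor's actual API: requests are validated against the curated catalog and the platform's guardrails, and anything inside the lines provisions automatically with no ticket queue.

```python
# Sketch of self-service within guardrails. A request is checked against
# a curated catalog and platform policy before provisioning proceeds.
# Catalog entries and limits are illustrative, not any real product's.

APPROVED_CATALOG = {"python-ml", "web-dev", "data-eng"}  # curated templates
MAX_CPUS = 8                                             # platform guardrail

def request_workspace(template: str, cpus: int) -> str:
    """Provision automatically if the request stays inside the guardrails."""
    if template not in APPROVED_CATALOG:
        raise ValueError(f"{template!r} is not an approved template")
    if cpus > MAX_CPUS:
        raise ValueError(f"cpu request {cpus} exceeds guardrail of {MAX_CPUS}")
    # Within policy: no ticket, no human approval step.
    return f"provisioned {template} with {cpus} cpus"

print(request_workspace("python-ml", 4))  # succeeds without a ticket
```

The point of the pattern is that rejection happens at request time against explicit policy, rather than weeks later in a manual review.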
The result is portable compliance. A workspace template that meets security requirements in an unclassified environment should work the same way on a classified network and function without modification in an air-gapped facility. Policy travels with the infrastructure.
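One way to picture portable compliance — a hypothetical sketch in plain Python, not Coder's actual template format, which in practice would use infrastructure-as-code tooling: the security controls live in the template itself, and only environment-specific endpoints vary when it is rendered for a different network.

```python
# Hypothetical sketch of a workspace template whose security controls
# travel with it across environments. Registries, control names, and
# environment labels are illustrative only.

BASE_TEMPLATE = {
    "image": "hardened-dev:1.4",  # approved, pre-scanned base image
    "controls": ["audit-logging", "no-outbound-internet", "session-timeout"],
}

ENVIRONMENTS = {
    "unclass":    {"registry": "registry.low.example",  "air_gapped": False},
    "classified": {"registry": "registry.high.example", "air_gapped": True},
}

def render(env: str) -> dict:
    """Same controls everywhere; only environment endpoints differ."""
    workspace = dict(BASE_TEMPLATE)
    workspace.update(ENVIRONMENTS[env])
    return workspace

low = render("unclass")
high = render("classified")
assert low["controls"] == high["controls"]  # policy is identical across domains
```

Because the controls are part of the template rather than the environment, moving to an air-gapped facility changes endpoints, not policy — which is what "policy travels with the infrastructure" means in practice.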
The cross-domain problem
For the IC specifically, the platform engineering transition carries an additional imperative that defense-focused discussions tend to overlook: cross-domain development.
A developer supporting a program that spans multiple classification levels does not simply move between environments. They context-switch between different physical machines, credential sets, toolchains, and organizational processes. Work products moving between domains pass through transfer processes measured in hours or days. Managing cross-domain workflows has historically required dedicated personnel whose primary function is shepherding data across boundaries rather than building capability.
Platform engineering changes this. When development environments are defined as code and centrally managed, it becomes possible to maintain consistent toolchains and security baselines across classification levels while preserving the hard separations that protect sources, methods, and program equities. A developer’s workspace on the low side and their workspace on the high side can be structurally identical. Cross-domain transfer processes can integrate into the development workflow rather than function as afterthoughts. The compliance burden shifts from individuals performing manual procedures to the platform enforcing those procedures automatically.
IC programs are already deploying remote development infrastructure that spans classification boundaries with consistent policy enforcement and without asking developers to re-platform every time they change networks.
The democratization of building
Here is where this transition becomes genuinely transformative — and where the IC has a specific advantage to capture.
The IC has always had a large population of domain experts who are not software developers but who possess operational knowledge no development team can fully replicate. All-source analysts who understand threat actor behavior in ways that cannot be captured in a requirements ticket. SIGINT specialists who recognize patterns in data only visible through years of operational exposure. Collection managers who understand source constraints in ways that lose critical nuance when handed off to pure technicians.
Platform engineering, combined with the right development infrastructure, is beginning to make these people builders. Not in the traditional software development sense, but in the sense of constructing tools, notebooks, workflows, and analytical environments that extend individual expertise into repeatable, shareable capability. An analyst who can prototype a data transformation that makes a new collection source exploitable is not writing production software. But they are producing real mission value — in ways that centralized software factories were never designed to support.
The enabling condition is a development environment that removes the infrastructure tax on exploration. When standing up a secure, compliant workspace requires a ticket and a two-week wait, only credentialed engineers do it. When it requires a template selection and a few minutes, the population of people who can meaningfully participate in building expands dramatically. Compliance becomes the baseline from which everyone works, not a gate that filters who can start.
What is actually dying (and what is not)
What is dying is the centralized, monolithic software factory that inserts itself as the critical path for every development team it serves. The model where a single isolated organization controls the infrastructure, tools, processes, and standards for all programs is giving way to something more sustainable.
The mission need those factories served is not dying. It is growing. Agencies still need to compress delivery timelines, elevate security postures without stifling innovation, retain technical talent, and deliver capability at the speed modern threats demand. The difference is that the IC no longer needs to build its own factory to achieve those outcomes. The Air Force’s Platform One has evolved toward a platform-of-platforms model. The Navy has moved to multi-vendor strategies that reduce lock-in and increase operational resilience. Across the IC, organizations are realizing they can adopt proven frameworks and adapt them to their compliance requirements rather than rebuilding from scratch.
The path forward
The question is no longer whether modern DevSecOps practices can work in classified environments. That is proven. The questions now are harder.
Can we build development infrastructure that genuinely serves the cross-domain operating reality of the intelligence enterprise, rather than forcing developers to absorb that complexity as manual overhead? Can we extend the population of people who build to include analysts, specialists, and operators whose domain expertise is the IC’s most irreplaceable asset? Can we shift enough of the compliance burden into the platform itself that security becomes an accelerant rather than a constraint?
Platform engineering makes all of this achievable. Small teams can support large organizations. Compliance becomes portable and workspaces repeatable across classification levels. The maintenance burden that currently consumes government talent shifts to software. The distance between having an idea and building something useful around it can be measured in minutes.
The factory that tried to control everything is giving way to a platform that enables everyone. That is not a loss. It is the next evolution the mission has been waiting for.
About the Author
Amanda Phelps leads Global Public Sector Partnerships and Alliances at Coder, where she works with government and defense organizations to enable secure, compliant development environments across classification levels. She specializes in creating partnership strategies that accelerate software delivery for programs in the public interest.
About Coder
Coder is the AI software development company leading the future of autonomous coding. Coder helps teams build fast, stay secure, and scale with control by combining AI coding agents and human developers in one trusted workspace. Coder’s award-winning self-hosted Cloud Development Environment (CDE) gives teams the power to govern, audit, and accelerate software development without trade-offs. Learn more at coder.com.
About IC Insiders
IC Insiders is a special sponsored feature that provides deep-dive analysis, interviews with IC leaders, perspective from industry experts, and more. Learn how your company can become an IC Insider.
