The Road to Composable: A Brief History of High Cohesion and Low Coupling
Our trip begins in 1979, the year Structured Design, by Edward Yourdon and Larry L. Constantine, was published. The book introduced the concepts of coupling and cohesion to the software engineering world. Yes, it featured now-dated examples of code that resemble COBOL and Fortran, but its central ideas are still road-worthy.
Yourdon and Constantine argue that “the cost of implementation, maintenance, and modification generally will be minimized when each piece of a system corresponds to exactly one small, well-defined piece of the problem, and each relationship between a system’s pieces corresponds only to a relationship between pieces of the problem.” To achieve this, a system should be partitioned—divided into pieces and organized—establishing relationships between the pieces. Each piece should have a clear focus on a single part of the problem (high cohesion), and relationships should be organized in a way where connected pieces know only the minimal set of information about each other (our authors talk about “loosely coupled systems”).
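To make the idea concrete, here is a minimal sketch (all names are hypothetical, invented for illustration): two pieces that each own exactly one part of a billing problem, coupled only through a tiny shared data contract rather than through each other's internals.

```python
from dataclasses import dataclass

# High cohesion: each function maps to exactly one part of the problem.
# Low coupling: the two functions share only the small Order contract;
# neither knows anything about the other's internals.

@dataclass(frozen=True)
class Order:
    """The narrow contract the two pieces agree on."""
    net_amount: float
    country: str

def compute_tax(order: Order) -> float:
    """Knows only tax rules -- nothing about formatting or output."""
    rate = 0.20 if order.country == "GB" else 0.10  # toy rates for illustration
    return round(order.net_amount * rate, 2)

def format_invoice_line(order: Order, tax: float) -> str:
    """Knows only presentation -- nothing about how the tax was computed."""
    total = order.net_amount + tax
    return f"net={order.net_amount:.2f} tax={tax:.2f} total={total:.2f}"

order = Order(net_amount=100.0, country="GB")
print(format_invoice_line(order, compute_tax(order)))
```

Because the only relationship between the pieces is the `Order` value, either one can change internally (new tax rules, new invoice layout) without touching the other, which is exactly the maintenance benefit the authors describe.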
In time, high cohesion and low coupling became a mantra in the software development world, and they remain guiding principles for today's software engineers.
What's interesting is that the reasons why this approach works and the benefits it provides have evolved and expanded substantially over the years. So have the means to achieve it.
At the core is the simple fact that people are notoriously bad at navigating complexity. Once we pass a certain threshold of things-one-needs-to-juggle-at-a-time, error rates increase disproportionately. The simple solution: divide a problem into parts and attack each part individually. If you do this correctly, so that each piece is focused on a single problem, interconnections become weak, and you can introduce isolated changes more easily, faster, and more safely.
But this is easier said than done. And so various techniques and paradigms were developed to aid the process and scale it, as software was eating our world and systems grew ever more complex. Programming paradigms such as object-oriented programming, together with higher-level languages, helped align components with the business domain and improve cohesion. Later, as complexity grew further, domain-driven design tackled the issue at a larger scale.
As programs transcended single processes and input-output capabilities matured, the array of available integration approaches between system components expanded significantly. Standardization of such approaches made the whole system extensible and composable. Now you could add new components that would work together with those already in use. These components could be developed using any stack, or even by a third party that is exceptionally good at tackling a particular problem. You mix and match best solutions from the available ecosystem. Look at where it got UNIX.
Because scalability, availability and other requirements have pushed us towards distributed systems and internet protocols have become the de facto integration standard, cohesion and coupling principles are now applied on a macro level, where modules are individually deployable units talking to each other over the network. These components can be developed using different technology stacks, the ones most suitable for a particular job. If properly designed, they can be also scaled independently, and failures can be isolated, increasing overall system resiliency. Enter microservices architecture.
Such systems come with a lot of implementation complexity and operational overhead, so the rise of the DevOps movement, automation, containerization, and the cloud revolution have all pulled up to absorb or shift that implementation and operational burden.
Components are now also complex enough to be owned by independent, cross-functional teams and mapped to entire business capabilities. This model, when implemented correctly, reduces organizational communication overhead, increases agility, reinforces ownership, and reroutes organizations towards a product mindset.
Coupling and cohesion principles have always been central to system design. As business problems grew more complex and software environments matured, applying these principles at ever broader levels unlocked new benefits and expanded existing ones.
Today, these principles can be applied to the entire enterprise technology ecosystem. Large off-the-shelf, generally highly coupled or monolithic application suites can be decomposed and replaced with a combination of smaller, highly cohesive custom services and offerings from modern SaaS vendors. These are specifically designed to be integrated into such an ecosystem—single purposed, API-first, open and extensible.
This approach brings many of the benefits already mentioned to the enterprise level, such as:
- Increased business agility and accelerated time-to-market
- Ability to use best-of-breed tools and avoid vendor lock-in and forced upgrade cycles
- Potential to buy commodity capabilities and build differentiators (experience in many cases) on a very granular level
- Opportunity to shift operational overhead to SaaS vendors and cloud providers
It’s been quite a journey. The 2022 model leaves us at a destination we call composable or, for those of you riding in a digital lane, MACH. Hope you’ve enjoyed the ride.