A complex problem has a lot of variables. Removing unimportant variables makes a problem easier to solve. What ways are there to simplify a problem in such a way that you're left with only the most important variables?
This comes up, for example, when you're juggling multiple options with multiple decision criteria, like in a decision matrix. You'd like the matrix to contain only the most important criteria — let's say, the top 3-5 criteria. This way the unimportant criteria would not needlessly complicate decision making.
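For example, here's a rough sketch of a weighted decision matrix that keeps only the highest-weight criteria (the criteria, weights, and scores are made up purely for illustration):

```python
# A minimal decision-matrix sketch. All criteria, weights, and scores here
# are made-up examples, not recommendations.

# Importance weight of each criterion (0-10).
criteria_weights = {
    "price": 9,
    "maintainability": 8,
    "performance": 6,
    "vendor reputation": 3,
    "documentation": 2,
}

# Keep only the top 3 criteria so the matrix stays small.
top_criteria = sorted(criteria_weights, key=criteria_weights.get, reverse=True)[:3]

# Score of each option per criterion (0-10).
options = {
    "Option A": {"price": 7, "maintainability": 5, "performance": 9,
                 "vendor reputation": 4, "documentation": 8},
    "Option B": {"price": 4, "maintainability": 9, "performance": 6,
                 "vendor reputation": 8, "documentation": 5},
}

# Weighted total over the top criteria only; the low-weight criteria
# never enter the comparison.
for name, scores in options.items():
    total = sum(criteria_weights[c] * scores[c] for c in top_criteria)
    print(name, total)
```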
Modularization (encapsulation)
Broadly speaking, modularity is the degree to which a system's components may be separated and recombined, often with the benefit of flexibility and variety in use.
The concept of modularity is used primarily to reduce complexity by breaking a system into varying degrees of interdependence and independence across and "hide the complexity of each part behind an abstraction and interface". [Emphasis mine]
You're replacing a part of a system with only the interface of that part. That interface is the contract that both the inside and outside agree to. It's interfaces all the way down.
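As a small illustration (my own example, not from the sources quoted here), hiding a module behind an interface can look like this; the rest of the system depends only on the contract:

```python
# Sketch of modularization: callers see only the interface (the contract);
# the implementation behind it can change freely.
from abc import ABC, abstractmethod


class Cache(ABC):
    """The interface: the contract both the inside and outside agree to."""

    @abstractmethod
    def get(self, key: str) -> str | None: ...

    @abstractmethod
    def put(self, key: str, value: str) -> None: ...


class InMemoryCache(Cache):
    """One possible implementation, hidden behind the interface."""

    def __init__(self) -> None:
        self._data: dict[str, str] = {}

    def get(self, key: str) -> str | None:
        return self._data.get(key)

    def put(self, key: str, value: str) -> None:
        self._data[key] = value


def handle_request(cache: Cache, key: str) -> str:
    # This code only knows about the Cache interface, not InMemoryCache.
    return cache.get(key) or "miss"
```

The InMemoryCache could later be swapped for, say, a disk-backed implementation without touching handle_request, because only the interface is visible from the outside.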
Jim Keller about modularity in the AMD Zen CPU architecture:
We made several components very modular.
I wanted all the interfaces defined before we wrote the RTL for the pieces.
One of the verification leads said: If we do this right, I can test the pieces so well independently that when we put it together, we won't find all these interaction bugs, because the floating point [unit] knows how the cache works.
The modularity of the design greatly improved the quality.
Humans are only so smart, and we're not getting any smarter. But the complexity of things is going up.
Sidenote: Here Muratori argues that modularization is made necessary by humans' inability to cope with complexity, but that this doesn't mean modularization is in itself a good thing that leads to better software. Modularization introduces constraints to the design space and therefore leads to theoretically sub-optimal software. It prevents some optimizations, since the modules are black boxes with only their interfaces visible. It's a necessary evil: it trades away machine-level optimization to gain business-level optimization.
Musk's algorithm
1. Question every requirement, no matter how smart the person it came from
2. Delete any part or process you can
    - Aim to delete "too much", and then add it back in if needed. If you don't delete 10% too much and add it back in later, you won't know if you deleted enough.
3. Simplify and optimize
4. Accelerate cycle time
5. Automate
Steps 3-5 sound quite similar to one another. To me, steps 1-2 seem the most important.
Via negativa and inversion
To make good decisions, avoid bad decisions.
To find good solutions, discard bad solutions.
See:
Heuristics, best practices and satisficing
Stuff that works well enough most of the time, but not always.
[...] rationality is limited when individuals make decisions, and under these limitations, rational individuals will select a decision that is satisfactory rather than optimal.
Limitations include the difficulty of the problem requiring a decision, the cognitive capability of the mind, and the time available to make the decision.
Decision-makers, in this view, act as satisficers, seeking a satisfactory solution, with everything that they have at the moment rather than an optimal solution.
A best practice is a method or technique that has been generally accepted as superior to alternatives because it tends to produce superior results.
There are some criticisms of the term "best practice". Eugene Bardach claims that the work necessary to deem a practice the "best" is rarely done. [Instead,] most of the time, one will find "good" practices or "smart" practices that offer insight into solutions that may or may not work for a given situation. [Emphasis mine]
A heuristic or heuristic technique (problem solving, mental shortcut, rule of thumb) is any approach to problem solving that employs a pragmatic method that is not fully optimized, perfected, or rationalized, but is nevertheless "good enough" as an approximation or attribute substitution.
Heuristics can be mental shortcuts that ease the cognitive load of making a decision.
Use heuristics — but if you have the time, take a moment and think more first.
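As a rough sketch of satisficing in code (the scores and the threshold are made up): accept the first option that clears the aspiration level instead of searching for the optimum.

```python
# Satisficing sketch: accept the first candidate that is "good enough"
# instead of evaluating every candidate to find the optimum.
# The scoring values and threshold below are made-up examples.

def satisfice(candidates, score, threshold):
    for candidate in candidates:
        if score(candidate) >= threshold:
            return candidate  # good enough; stop searching
    return None  # nothing met the aspiration level

# Example: pick the first library whose (hypothetical) fitness score is >= 7.
libraries = {"lib-a": 5, "lib-b": 8, "lib-c": 9}
choice = satisfice(libraries, score=libraries.get, threshold=7)
print(choice)  # "lib-b": good enough, even though "lib-c" scores higher
```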
Reversible vs irreversible decisions
Amazon calls these two-way doors and one-way doors:
Recognize two-way doors. While some decisions are one-way doors, others are two-way doors, meaning they are reversible, and you can correct mistakes quickly.
So does Annie Duke:
Because of the way the human mind works, we tend to view decisions as permanent and final, particularly if they are high impact.
[...] When you figure out that you’ve got a two-way-door decision, you can make choices you’re less certain about, giving yourself more low-risk opportunities to expose yourself to the universe of stuff you don’t know. The information you gather in the process will help you implement the menu strategy, improving your accuracy in sorting options into ones you like and ones you don’t.
For example, having a rollback process in a CI/CD pipeline allows you to go fast, using less time on deciding whether or not to deploy.
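A minimal sketch of what that might look like; every function here is a hypothetical placeholder, not a real CI/CD API:

```python
# Sketch of a two-way-door deploy: because rollback exists, the decision to
# deploy is reversible and can be made quickly. All functions are
# hypothetical placeholders.

def deploy(version: str) -> None:
    print(f"deploying {version}")

def health_check() -> bool:
    # Placeholder: imagine this probes the service after deployment.
    return False

def rollback(version: str) -> None:
    print(f"rolling back to {version}")

def release(new_version: str, old_version: str) -> None:
    deploy(new_version)
    if not health_check():
        # A mistake is cheap to correct, so the decision to deploy stays cheap.
        rollback(old_version)

release("v2.0.0", "v1.9.3")
```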
Cognitive load
Simple is subjective.
There are so many buzzwords and best practices out there, but let's focus on something more fundamental. What matters is the amount of confusion developers feel when going through the code.
Confusion is caused by high cognitive load. It's not some fancy abstract concept, but rather a fundamental human constraint.
We should reduce the cognitive load in our projects as much as possible.
The tricky part is that the previous author may not have experienced a high cognitive load due to familiarity with the project.
The problem is that familiarity is not the same as simplicity. They feel the same — that same ease of moving through a space without much mental effort — but for very different reasons. Every “clever” (read: “self-indulgent”) and non-idiomatic trick you use incurs a learning penalty for everyone else. Once they have done that learning, then they will find working with the code less difficult. So it is hard to recognise how to simplify code that you are already familiar with.
The more you work in a "team sport", the more you need to take into account what "simple" is for everyone else.
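As a made-up illustration of the "clever trick" penalty, both snippets below do the same thing, but the first forces the reader to decode it before they can trust it:

```python
# Two ways to collect the adult users' names. Both work; the first trades
# readability for cleverness and raises the reader's cognitive load.

users = [{"name": "Ada", "age": 36}, {"name": "Linus", "age": 12}]

# "Clever": a one-liner built from map, filter, and lambdas.
names = list(map(lambda u: u["name"],
                 filter(lambda u: u.get("age", 0) >= 18, users)))

# Straightforward: says what it does, one step at a time.
adult_names = []
for user in users:
    if user.get("age", 0) >= 18:
        adult_names.append(user["name"])
```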
Analogy and related problems
Try to find an analogous (i.e., similar) problem and its solution.
We have to look around for closely related problems; we look at the unknown [here "the unknown" means the solution, usually in a mathematical problem], or we look for a formerly solved problem which is linked to our present one by generalization, specialization, or analogy.
[...]
Generalization is passing from the consideration of one object to the consideration of a set containing that object; or passing from the consideration of a restricted set to that of a more comprehensive set containing the restricted one.
The more general problem may be easier to solve.
[...]
Inventor's paradox. The more ambitious plan may have more chances of success.
This sounds paradoxical. Yet, when passing from one problem to another, we may often observe that the new, more ambitious problem is easier to handle than the original problem. More questions may be easier to answer than just one question. The more comprehensive theorem may be easier to prove, the more general problem may be easier to solve.
The more ambitious plan may have more chances of success provided it is not based on mere pretension but on some vision of the things beyond those immediately present. [Emphasis mine]
[...]
— How to solve it (Polya)
What I take from that is this: Replacing a problem with an analogous, similar or more general problem lets you look at the original problem with a slightly different perspective. The new problem is similar enough that it resembles the original problem, but the new problem may also help you see different aspects of the original problem that you didn't pay attention to earlier.
A more general problem also helps you widen the scope of your thinking. For example, think of best practices, not the details of your current context. This relates to overfitting.
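Coming back to "the more general problem may be easier to solve": a classic worked example (my own illustration, not Polya's) is adding the numbers 1 through 100. Doing it directly is tedious, but the general problem has a short closed form.

```latex
% The general sum has a closed form (pair the first and last terms):
\sum_{k=1}^{n} k = \frac{n(n+1)}{2}
% The specific instance then falls out immediately:
\sum_{k=1}^{100} k = \frac{100 \cdot 101}{2} = 5050
```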
Go back and understand the problem as a whole
Really try to understand what problem you're trying to solve before trying to come up with solutions.
Of course, we do not wish to waste our time with unnecessary detail and we should reserve our effort for the essential. The difficulty is that we cannot say beforehand which details will turn out ultimately as necessary and which will not.
Therefore, let us, first of all, understand the problem as a whole. Having understood the problem, we shall be in a better position to judge which particular points may be the most essential. Having examined one or two essential points we shall be in a better position to judge which further details might deserve closer examination. Let us go into detail and decompose the problem gradually, but not further than we need to. [Emphasis mine]
— How to solve it (Polya)
Gap analysis
Question: What should people pay attention to when solving a problem?
Answer: It depends both on the people and on the problem.
When choosing what to focus on when solving a problem, the answer doesn't only depend on the problem at hand. It also depends on the company or person solving that problem.
There's no objective best answer that applies to all companies or people for solving that specific problem. If a problem has an established best practice, it gives you a generic, but non-optimal starting point.
The best answer is always relative to the company or person: do they need to do something special in addition to what they would normally do when solving that problem?
1. The problem defines the required baseline levels for various attributes
    - E.g., a specific software system might have required minimum threshold levels for system quality attributes:
        - 4/10 for testability
        - 8/10 for maintainability
        - 6/10 for performance, etc.
2. The company or person has varying levels of capability/focus related to those quality attributes
    - E.g., NASA might have:
        - 10/10 for testing
        - 2/10 for development velocity, etc.
For each attribute, the delta between the level the problem requires (1) and the level the company or person currently has (2) tells you how much to focus on that attribute.
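A small sketch of that delta computation (the attributes and numbers are the made-up examples from this section, plus an assumed required level for development velocity):

```python
# Gap analysis sketch: focus where (required level - current capability)
# is largest. Attributes and scores are made-up examples.

required = {   # what the problem demands (0-10)
    "testability": 4,
    "maintainability": 8,
    "performance": 6,
    "development velocity": 7,
}
current = {    # what the team currently delivers (0-10)
    "testability": 10,
    "maintainability": 6,
    "performance": 6,
    "development velocity": 2,
}

# Positive gap = the problem needs more than we currently deliver.
gaps = {attr: required[attr] - current[attr] for attr in required}
top_focus = sorted(gaps, key=gaps.get, reverse=True)[:3]
print(top_focus)  # e.g. ['development velocity', 'maintainability', ...]
```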
If NASA's space shuttle engineers created a small website, they would probably focus religiously on testing, but what they would actually need to prioritize is development velocity and agility.
Look at this list of system quality attributes. All of those attributes are important, but some are more important than others. Picking the most important ones depends both on the problem and on the company or person solving that problem.
This is basically gap analysis: you're mindful of the delta (gap) between what's desired and the current state, and use that to decide the top 3 things (for example) to focus on in development roadmaps, architecture decisions, company vision, etc.
Related: SWOT analysis.
Sidenote: All of this means that roadmaps, architecture decisions, company vision, etc. should not be static. As your capabilities change, so should the decisions. Higher-level things, such as vision, should of course change with a longer cadence, but even they should change eventually. For example, think of the first mottos of Google and Meta, which they changed after a few years.