A Good Enough Post:

In today’s post, I am exploring the notion that viability depends on our capacity for action, and that this capacity may not entirely rely on having a perfect grasp of “Truth.” This possibility, drawn from evolutionary theory, invites us to reconsider a deeply rooted assumption in human thought: that knowledge aims to reflect the world as it is. Perhaps organisms do not carry mirrors of an objective environment. Perhaps they generate workable patterns that allow action. If so, truth in the sense of full correspondence might be not only unnecessary for survival but impossible to achieve.

This shift from truth to adequacy might be more than a semantic difference. It suggests we could reconsider how perception, cognition, and action evolve under the pressure of complexity. Our nervous systems may not have emerged to catalog every detail of reality. They might have emerged to enable viable engagement. They filter, reduce, and transform. They make the unmanageable manageable. This economy of attention could be what allowed life to persist in an environment whose complexity always exceeds the capacity of any single organism.

The Evolutionary Logic of Selective Attention

The earliest organisms had comparatively simple structures. Their survival depended on detecting a few vital differences: light and dark, motion and stillness, hunger and satiation. These differences were not representations of reality in its full richness. They were pragmatic distinctions, selected by evolution because they mattered for survival.

As ecosystems diversified, so did the organisms within them. Greater complexity in the environment favored organisms with richer internal structures. These structures allowed them to absorb more variety and generate more flexible responses. But this expansion had limits. No organism could ever match the full complexity of its environment. Every adaptation remained selective.

Yet evolution’s relationship with cognitive economy appears more nuanced than simple efficiency maximization. Many organisms maintain seemingly “wasteful” capacities (elaborate plumage, complex social behaviors, or redundant sensory systems) that prove crucial during rare but catastrophic events. This apparent contradiction might reveal something deeper. Evolution does not eliminate selectivity; it shapes what gets selected and how. The peacock’s tail represents a different kind of cognitive economy, one that trades metabolic efficiency for reproductive advantage. Even redundancy involves choices about what to duplicate and what to ignore.

Here we see why the word “better” always seems contextual. An organism appears better only in relation to its ecological niche and temporal horizon. There may be no universal scale of improvement. Adequacy appears to be always local, contingent on the demands of the situation, and provisional across time scales.

The Law of Requisite Variety and Regulatory Challenges

This principle finds a formal expression in W. Ross Ashby’s Law of Requisite Variety: only variety can absorb variety. A regulator must have as much variety in its responses as exists in the disturbances it faces. If the environment can vary in ten ways and the organism can respond in only five, some disturbances will remain unchecked, threatening viability.
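The ten-versus-five mismatch can be made concrete with a minimal sketch in Python. The numbers come from the example above; the assumption that each response perfectly cancels exactly one disturbance type is mine, chosen to make the best case for the regulator, not part of Ashby's law itself:

```python
# A toy illustration of the Law of Requisite Variety.
disturbances = list(range(10))   # the environment can vary in ten ways
responses = set(range(5))        # the regulator can respond in only five

# Best case: response i perfectly cancels disturbance i. Even then,
# every disturbance type without a matching response passes through
# unregulated, threatening viability.
unchecked = [d for d in disturbances if d not in responses]

print(len(unchecked))  # → 5: half the disturbances remain unregulated
```

Even under the most generous pairing, the shortfall in response variety shows up directly as unabsorbed disturbance variety.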

Ashby’s law applies specifically to regulatory systems maintaining homeostasis, but its insights extend to cognitive systems facing similar challenges. Both must manage variety mismatches between their internal organization and environmental complexity. Yet matching variety does not mean copying the environment. No finite system can track every detail. Instead, regulation depends on attenuation and amplification. Organisms attenuate the vast variety of the environment into a reduced set of distinctions. They amplify the significance of certain cues to prioritize action.

This does not seem to be a flaw in design. It might be a condition of survival. The key point is this: attenuation may not be about discovering truth but about achieving functional adequacy within specific contexts and time frames. And here is a critical implication – what works today may fail tomorrow. Adequacy is dynamic because the variety we face today may not be the variety we face tomorrow. If we are not able to adapt to new disturbances, viability collapses. Our current struggle to integrate artificial intelligence into the workplace illustrates this point. Many organizational models were built on assumptions of human exclusivity in cognitive labor. Those assumptions worked for decades. Today, they are brittle because the environment has changed. Ashby’s law prevails.

The Shortcut Analogy: Logarithms and Cognitive Compression

To appreciate the elegance and risk of attenuation, consider a good enough historical analogy. Before the age of electronic calculators, navigation and astronomy depended on logarithmic tables. Multiplying large numbers was time-consuming and error-prone. Logarithms offered a remarkable shortcut: turn multiplication into addition. By converting numbers into their logarithmic values, sailors could compute distances and bearings quickly, reducing the cognitive load of calculation.

Crucially, these tables were extremely accurate within their domain of application. Lives depended on precise calculations, and navigators understood both the power and limitations of their tools. They built in multiple redundancies and cross-checks. This compression did not deliver the full detail of multiplication, but it delivered enough precision for safe passage across oceans when used with appropriate awareness of its boundaries.
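To see both the compression and the precision at work, here is a small sketch in Python. The four-decimal rounding mirrors the precision of many historical printed tables; the specific numbers multiplied are illustrative:

```python
import math

def table_log(x):
    """A 'table entry': base-10 logarithm rounded to 4 decimal
    places, as in many historical printed tables."""
    return round(math.log10(x), 4)

def table_multiply(a, b):
    # The shortcut: turn multiplication into addition by adding
    # the log entries, then taking the antilog of the sum.
    return 10 ** (table_log(a) + table_log(b))

exact = 3456 * 789                      # 2,726,784
approx = table_multiply(3456, 789)
rel_error = abs(approx - exact) / exact
print(rel_error < 1e-3)                 # → True: lossy, but good enough
```

The compressed calculation is not exact, yet within the table's domain the error stays far below what safe navigation required, which is precisely the kind of bounded adequacy the analogy is after.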

Our minds seem to prefer operating in a linear way. Sequential thinking appears natural most likely because it proves cognitively economical. It reduces overwhelming complexity to manageable sequences we can follow. Like logarithmic tables, our conceptual frameworks trade completeness for efficiency. They allow us to act without drowning in detail. But there is an important difference: logarithmic tables are mathematically precise within their defined limits, whereas human cognitive shortcuts are bias-prone, culturally shaped, and rarely come with warning labels. When we mistake our tools for the territory itself, the cost becomes invisible. Information is lost. Subtleties disappear. And when the environment changes, what once worked can become dangerous. This is the paradox: what enables us to cope also constrains what we can see. Our abstractions could be both our superpower and our vulnerability.

Pragmatism and Cybernetic Constructivism

This brings us to the philosophical dimension of the topic. Pragmatism, particularly as articulated by William James and John Dewey, treats knowledge as a tool for action rather than a mirror of reality. A belief is “true” not because it corresponds to some ultimate fact but because it proves useful in guiding behavior within a specific context. Truth is redefined as what works, but this “working” must be understood across multiple time scales and contexts. Adequacy is not fixed. It requires constant revision as the environment shifts.

This is not a license for arbitrary belief or wishful thinking. Pragmatic truth remains constrained by consequences. A bridge designed on faulty engineering principles will collapse regardless of the designer’s confidence. A medical treatment based on wishful thinking will fail regardless of the practitioner’s intentions. The pragmatic test is whether our frameworks enable effective action in the world as it actually responds to our interventions. Reality provides feedback, even if we cannot access it directly.

Cybernetic constructivism shares this orientation. Heinz von Foerster reminds us that “the environment contains no information”. What we call information arises in the interaction between an organism and its surroundings. The world does not impose meaning; meaning is enacted. Maturana and Varela describe this as structural coupling. Organisms and environments co-determine each other through ongoing interactions.

Seen in this light, our nervous system does not passively record inputs but brings forth distinctions through its own organization, maintaining coherence in continuous interaction with its surroundings. Knowing becomes an adaptive dance rather than a passive recording. The goal is not to represent an independent world but to maintain viability within a world that is partially brought forth by the act of knowing. This does not mean stability is irrelevant. Reliable patterns of interaction matter. Some regularities can be engaged in ways that allow prediction and engineering. Scientific methodology succeeds not because it removes simplification but because it manages it systematically, using feedback processes such as replication and peer review to adjust and refine adequacy over time and in a social realm.

The Double-Edged Sword: Superpower and Kryptonite

The ability to compress complexity seems to have made life possible. Yet this same ability becomes dangerous when compression becomes rigidity. When abstractions are treated as final truths, systems lose their capacity for adaptation. Stafford Beer captured this danger when he observed that ignorance becomes “the lethal attenuator”. When we lose track of what our simplifications exclude, adequacy transforms into vulnerability.

Let’s look at some examples. Hiring algorithms often reduce the complexity of human potential to a few simplified metrics, which can perpetuate bias. Climate models, although highly advanced, still miss certain feedback loops and critical tipping points. Social media recommendation engines compress human interests into engagement-focused categories, which can push users toward more extreme views by filtering out moderating influences.

Heinz von Foerster reminded us that although the map may not be the territory, the map is all we have. Our ways of making sense are always partial and limited, yet they are the only tools we can use to navigate complexity. Recognizing this helps us remain aware of our cognitive blind spots.

In each case, the problem is not the use of shortcuts but forgetting their limits combined with insufficient feedback. The map is never the territory. When we mistake our ways of making sense for reality itself, fragility follows. What helps us stay viable can also make us blind.

Ethical Implications: What Do We Choose to Ignore?

If we accept that knowledge is constructed for adequacy, not truth, then the question of responsibility becomes unavoidable. Every act of attenuation involves a choice about what to include and what to ignore. These choices shape not only individual survival but collective futures.

In social systems, ignoring complexity can marginalize voices that do not fit dominant abstractions. In technological systems, it can produce biases that perpetuate injustice. The ethic of constructivism is not to abandon simplification (without it, we could not act) but to cultivate awareness of its costs and remain open to revision.

At the individual level, deliberate exposure to dissenting views, reflective journaling on hidden assumptions, and iterative sensemaking can help maintain cognitive flexibility.

We can restate Ashby’s law by saying that viability requires variety. A society that suppresses diversity of thought and perspective reduces its internal variety and becomes brittle in the face of unforeseen challenges. To design for resilience, we must design for plurality.

Final Words:

Survival does not seem to require perfect knowledge. It has required workable distinctions, compressed into forms that enable timely action. This logic of adequacy explains why our minds favor shortcuts, why linear thinking feels natural, and why abstraction is indispensable. Yet it also warns us that what we simplify to live by can, in time, limit what we live for.

The challenge, or more precisely the necessity, might be to balance economy with humility. To remember that our conceptual logarithms, like the tables once used by navigators, are tools for a journey, not the journey itself. They serve us best when we keep them provisional, open to correction, and sensitive to the richness they cannot capture.

Managing attenuation wisely is itself a complex adaptive challenge without simple solutions. It requires not just awareness of our limitations but active practices that surface hidden costs and maintain cognitive flexibility. It demands that we ask not whether our ways of making sense mirror reality, but whether they continue to support effective action in the conditions we now face, and whether we have ways to notice when they no longer do.

Engaging with complexity means getting better at being good enough, continuously. Our task is not to eliminate attenuation but to manage it wisely. And that begins with a question we often neglect. What do we choose to ignore, and how do we ensure that choice remains conscious, provisional, and responsive to feedback?

Always keep learning…

If you enjoyed this post and find my work valuable, I would appreciate your support. You can explore more of my ideas in my latest book, Second Order Cybernetics, Essays for Silicon Valley, hard copy available at the Lulu Store.



2 thoughts on “A Good Enough Post:”

  1. Short version: “If I say false, it is true.” “False.”

    I’m reminded of the scene from “A Few Good Men” where Jack Nicholson shouts: “The truth? You cannot handle the truth!”. You cannot literally “handle” the truth, “… even if you tried with both hands”.

    One acts as-if there’s some thing called “true” and because it works, one keeps on using it. It’s good enough, like a “true” map.

    Not coincidentally – I won’t go into that – in Dutch we use the word “waar” for both “true” (as in ‘waarheid’, “truth”) and “where”. Saying “Waar zijn we?”, “Where are we?”, can refer to reality (a true, real location) as well as to a map (a metaphorical place, “mapping” an actual place onto a fictitious model). Is a place on a map true or false? It depends on where you are.

    In truth, there’s no such thing as truth. When it is made into an absolute – the truth, the whole truth and nothing but the truth – one invokes a paradox. The analogue opposite of ‘true’ is ‘false’; the digital opposite is ‘not true’. The digital opposite of ‘false’ is ‘not false’. The tension of the paradox comes from not-false being neither true nor false.

    In Lewis Carroll’s “The Hunting of the Snark”, the hunters use the phrase “what I tell you three times is true”. When they have to find out whether a creature they see, called a Boojum, is a Snark, they lose count. Repeating a lie makes you lose count.

    In my high-school years, we used logarithm tables and slide rules (I’ve still got mine), analogue tools, so I was trained to know that good enough is good enough. That has always been one of my mottos: “Better is the enemy of good”.

    By the way, the most famous Dutch physicist Lorentz, a good friend of Einstein, used slide rules and tables to calculate the effects of the so-called “Afsluitdijk” https://en.wikipedia.org/wiki/Afsluitdijk to optimise its length against the sea currents. It took him 7 years, and his predictions of the water height in storms were off by only 20 to 25 centimetres. (Einstein was unhappy about this, because it took Lorentz away from his more important projects in science and the League of Nations. He was also a well-respected facilitator.)

    Using digital computers may give the illusion of something being true, as they’re based on bivalent logic – something is either true or false – AND on the fact that they don’t make a distinction between zero and nothing – probably the most important conceptual improvement in calculating (by the Arabs, the cipher 0).

    A bivalent coding system leads to someone or something belonging to a category or not. One is either F or M, and not-F means M and vice versa. In my work as an information analyst, I used to propose using categories such as “belonging to”, “not belonging to” or “indifferent” in relation to someone, AND not using classes as categories.

    “Computers says ‘no'” seems to imply something is false, while in reality it isn’t. For one thing, some one (1) has programmed or added data to the computer to calculate something based

    The structure (or architecture) of (human) brains differs fundamentally from that of computers, as biological beings use a three-valued logic: slower – normal (continuous) – faster pulse trains. Or -1, 0, +1. That also makes more sense in error-correcting – learning – systems. It should have been the architecture of our calculating machines too.


  2. Addition:

    ““Computers says ‘no’” seems to imply something is false, while in reality it isn’t. For one thing, some one (1) has programmed or added data to the computer to calculate something based…” on someone’s assumption about the state of affairs.

