Diversity is Entropy is Information (D=E=I)

In 1993, Professor Carver Mead of the California Institute of Technology, one of the century’s greatest technologists, told me the wisest words I can remember. Carver, as many called him, had been my professor of sub-threshold analog chip design. His classes in analog circuits and continuous computation, crucial concepts powering today’s AIs, were part of neuromorphic engineering, one of many disciplines comprising the university’s nascent Computation and Neural Systems program. I was about to defend my Ph.D. dissertation in this program (via Physics). My thesis, in effect, said neuroscience was at least 99% wrong. Upon hearing my thesis pre-talk, Carver distilled it to a sage quote: “One man’s noise is another man’s information.”

What neuroscientists call “noise” is the unpredictable crackling sound that neurons make as they fire. I mean that literally. Firing neurons sound like firecrackers, popcorn or peeling Velcro. Neuroscientists, defending against the obvious question of why a well-running brain would contain noise in the first place, point out that recipient neurons, with all their synaptic inputs, seem perfectly shaped to reduce noise in two ways. First, their long, thin input tubes, called dendrites, ought to act like mufflers for electricity, damping and smoothing current flow. Second, because dendrites gather thousands of input pulses for every output pulse, the well-established law of averages applies, mathematically guaranteeing that the noise from thousands of independent inputs would cancel out, even without the extra smoothing by dendrites. Averaging is a powerful smoothing process already.

In fact, the Law of Averages is such a powerful and universal law of nature that in my talk, I chose to put it first, well ahead of the evidence and data experimental scientists cherish, because I did and do theory. My Ph.D. thesis said, in effect, that because averaging and smoothing always reduce irregularity, the well-known fact that neural firings are irregular proves on its own that neurons do not smooth out synaptic inputs in any respect at all. That is irrefutable math.
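
For readers who want to see that math in action, here is a minimal numerical sketch (my illustration, not a calculation from the thesis): averaging many independent noisy inputs shrinks their fluctuations by roughly the square root of the number of inputs, so a neuron that truly averaged thousands of inputs should produce a very smooth output.

```python
# A minimal sketch of the law-of-averages argument: averaging N independent noisy
# inputs shrinks their fluctuations by roughly sqrt(N). Toy numbers only.
import numpy as np

rng = np.random.default_rng(0)

def relative_fluctuation(n_inputs: int, n_samples: int = 10_000) -> float:
    """Standard deviation of the mean of n_inputs independent noisy inputs."""
    inputs = rng.normal(loc=1.0, scale=1.0, size=(n_samples, n_inputs))
    return inputs.mean(axis=1).std()

for n in (1, 100, 10_000):
    print(f"{n:>6} inputs -> fluctuation ~ {relative_fluctuation(n):.3f}")
# 1 input -> ~1.0; 100 inputs -> ~0.1; 10,000 inputs -> ~0.01
```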

Furthermore, the only way to produce so much crackling is if those dendrite tubes behave more like high-voltage lines than mufflers, sparking at any opportunity. In this mathematically sensible version of the neural code, each pulse carries information separately, and the flow as a whole carries hundreds of times more information than anyone imagined.

I proved that this supposed neural crackling noise could not possibly be random static. It must contain crucial information as subtle cues to bind together concepts and perceptions stored in different places in the brain.

Seven people listened to my practice pre-talk in the small conference room overlooking the sunny Beckman courtyard: six of them fellow grad students I had goaded into attending, plus Carver, whose presence surprised and flattered me. He had no questions or challenges, and he offered me afterwards his wonderful comment in his gravelly, elfin voice: “Your thesis just goes to show, one man’s noise is another man’s information.”

That was my point exactly. In technology terms, information is measured by the mathematical metric called entropy, the same metric we use to describe noise. In other words, the only difference between noise and information is about you, not about it. I claimed then, and even more strongly now, that neuroscientists have no mathematical justification for thinking the irregular crackling of neurons is noise.

Additionally, I had a dozen physical reasons to cherish their so-called noise instead, as the primary carrier of information in brains, a channel with a thousand to a million times more bandwidth than slow averages could ever carry.

In other words, as Carver said, one man’s noise is another man’s information.
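
To make that concrete, here is a minimal sketch (mine, not a calculation from the thesis) using Shannon’s yardstick: a perfectly regular spike train is fully predictable and so delivers no new information per symbol, while an irregular, crackling one delivers the maximum.

```python
# Entropy is the same yardstick for "noise" and "information": a perfectly regular
# symbol stream is fully predictable (zero bits of news per symbol), while an
# irregular one carries the maximum. My illustration, with toy binary streams.
import numpy as np

def conditional_entropy_bits(stream: np.ndarray) -> float:
    """H(next symbol | previous symbol): how unpredictable each new symbol is, in bits."""
    pairs = np.stack([stream[:-1], stream[1:]], axis=1)
    _, pair_counts = np.unique(pairs, axis=0, return_counts=True)
    p_pair = pair_counts / pair_counts.sum()
    _, prev_counts = np.unique(stream[:-1], return_counts=True)
    p_prev = prev_counts / prev_counts.sum()
    # H(prev, next) - H(prev)
    return float(-(p_pair * np.log2(p_pair)).sum() + (p_prev * np.log2(p_prev)).sum())

rng = np.random.default_rng(1)
clockwork = np.tile([0, 1], 5_000)           # perfectly regular, predictable firing pattern
crackle = rng.integers(0, 2, size=10_000)    # irregular, unpredictable "crackling" pattern

print(conditional_entropy_bits(clockwork))   # ~0.0 bits per symbol: no surprise, no information
print(conditional_entropy_bits(crackle))     # ~1.0 bits per symbol: every pulse is news
```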

Relative entropy density in geometry. Author’s graph.

Entropy and the negative-second law of thermodynamics

Entropy measures possibility or diversity, which is the inverse concept from probability. Probability and diversity behave differently from mass or energy, since probabilities must always add up to 100%. That means when one probability goes up, all the other probabilities have to go down, and vice versa.

The usual path in nature is for things to smooth themselves out on their own, mix and blur over time. Objects tend toward room temperature and rooms tend toward messiness. Probabilities equalize and total entropy goes up.

But if you have an extra intervention like an energy source, an amplifier or a selection process, things can go the other way as well. A single selected, amplified probability can increase, which drives the diverse range of competing probabilities down — like a refrigerator keeping things below room temperature, or a maid tidying up the room. One outcome up, overall diversity down. Total entropy goes down.
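
A small numerical sketch (my illustration, with toy numbers) shows the bookkeeping: spread the probability evenly and entropy sits at its maximum; amplify one selected outcome and the others must shrink, dragging total entropy down.

```python
# Selection/amplification lowers Shannon entropy: boosting one outcome's probability
# forces the others down, and the distribution's diversity shrinks. Toy numbers.
import numpy as np

def entropy_bits(p: np.ndarray) -> float:
    """Shannon entropy in bits; zero-probability outcomes contribute nothing."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

uniform = np.full(8, 1 / 8)                 # every outcome equally possible
amplified = np.array([0.93] + [0.01] * 7)   # one selected outcome amplified

print(entropy_bits(uniform))    # 3.0 bits: maximum diversity for 8 outcomes
print(entropy_bits(amplified))  # ~0.56 bits: one winner, little diversity left
```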

It’s not just neuroscientists who misunderstand the physics of entropy. Even physicists do. One of the most sacred laws of physics, the second law of thermodynamics, is crucially misunderstood by most physicists. They believe it means entropy always goes up on its own. This is not quite true.

The second law applies to isolated systems where no energy or mass goes in or out, but Earth (and biospheres in general) is not like that. We have a sun blasting us with heat and light, and cold, dark space to soak up what we throw away. Because of those, we have life, whose very definition — self-regulation and self-replication in tandem — is also a definition of entropy reduction. (Copying and regulating both make entropy go down).

So the “negative-second law of thermodynamics,” the one taking over our lives right now on Earth, is that entropy decreases in biospheres.

Relative entropy in human existence. Author’s graph.

Trust is bandwidth is entropy

So life drives entropy down. But life still needs entropy to do its business. If you view life’s main processes, regulation and reproduction, as algorithmic processes, it becomes clear that even as they create low-entropy waste, they require a huge but invisible reservoir of possibilities. For example, the myriad micro-volleys and vibrations involved in trusting one’s vision or balance, and the molecular jitter cells use to repair DNA.

Another example: the noise in a phone call. Conversations once were easy on old-school landlines, where each person’s microphone was live full-time. We could hear each other’s words and silences, and employ acoustic cues like “uh-huh” or sharp exhalations just like in real life. Unfortunately, mobile carriers refuse to transmit the noise of our breath between words, having optimized their algorithms for “content” like phonemes to save themselves money by not carrying the noise we need. (Europe suffers less from this problem, having adopted high-definition voice standards earlier and more consistently. That is, more and better government regulation.)

Bandwidth links trust to entropy. The term originated a hundred years ago with radio spectra, but thanks to information-theory genius Claude Shannon, it now means information flow, in bits or megabytes per second. “Doing the numbers” on the information flow involved in human trust (as my partner and I did ten years ago) shows that only one part in a million of our live sensory bandwidth is content, and the rest is micro-vibrations, micro-expressions and micro-sensations. The tiny sliver of our superficial conscious minds is only made possible by a seething subsurface of oceanic volume.
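
The arithmetic behind that ratio is simple, though the figures below are placeholders of my own choosing, not the author’s actual estimates; the point is only how a one-in-a-million content fraction arises.

```python
# A toy back-of-envelope in the spirit of "doing the numbers". The figures here are
# illustrative assumptions of mine, not the author's estimates.
raw_sensory_bits_per_s = 10_000_000   # assumed order of magnitude for raw sensory inflow
conscious_bits_per_s = 10             # assumed order of magnitude for attended "content"

content_fraction = conscious_bits_per_s / raw_sensory_bits_per_s
print(f"content fraction ~ {content_fraction:.0e}")   # ~1e-06, i.e. one part in a million
```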

Having many very different outcomes all equally possible is the definition of diversity and entropy. And because entropy and information are the same, diversity is all by itself information. Diversity is the architectural foundation, the substrate of trust, the carrier wave of bandwidth and the lubricant of all successful animal communication.

The same logic that makes randomization necessary in DNA mutation and clinical trials also applies, at hyper-speed, to muscle and eyeball tremors, real-time balance, interoception, gaze control, the sense of center, interpersonal connection and the process of learning itself. Real brains need diverse training data just like artificial intelligence does, and for the same reasons. (For proof, ask your favorite large language model chatbot if it needs a random number generator.)
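
For the chatbot version of that proof, here is a hedged sketch of the standard sampling step (toy numbers, not any particular model’s code): without a random draw over the next-token probabilities, the model would repeat the single likeliest continuation forever, and its output would lose all diversity.

```python
# Why a language model needs a random number generator: sampling the next token
# from its probability distribution, rather than always taking the single most
# likely token, is what keeps the output diverse. Toy logits only.
import numpy as np

rng = np.random.default_rng()

def sample_next_token(logits: np.ndarray, temperature: float = 1.0) -> int:
    """Softmax over logits, then a random draw: no randomness, no diversity."""
    scaled = logits / temperature
    probs = np.exp(scaled - scaled.max())
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))

logits = np.array([2.0, 1.5, 0.5, 0.1])                # toy scores for four candidate tokens
print([sample_next_token(logits) for _ in range(10)])  # varied picks, not always token 0
```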

Our brains and bodies need diversity in order to function properly. We need all kinds of different sensory and social experiences in addition to the convenient, comfortable ones we’re trained to want. The same goes for our education and news sources. Only with a diversity of parallel, cross-checkable channels can anyone trust anything.

Examples from society

The need for entropy is everywhere in society, but named differently. “Diversity builds resilience” is axiomatic in ecology, and is equivalent to the technological idea that a robust system requires a variety of mechanisms to adapt and function. The opposite of biological diversity is monoculture, cultivating a single species (often a single crop) in one area.

  • Diversified genomes survive better (vs. inbreeding).
  • Diversified investments perform better (vs. overfocused and over-leveraged).
  • Diverse language experience improves communication. Villagers across the globe find it natural to speak three or four languages, broadening both acoustic and cultural experience. Seeing many kinds of faces growing up — old and young, frozen and mobile, dark and light, cheerful and grumpy — provides crucial training data for learning social interactions.
  • Consolidation, aggregation, takeover and defeat are reducing the variability of practically everything, by quenching all kinds of outliers: rare languages, small businesses, ethnic groups, cute buildings, weird cars, anything quaint and local.
  • The present entropic singularity: The simplest mathematical gloss of life on Earth is that one tiny subculture of the world’s most powerful species is about to cover the surface of Earth with masses of inorganic crystals in place of life, that is, with reinforced concrete and solar panels.

Now, more people have less entropy. Author’s graph.

Corporations need entropy. Even inside strait-jacketed organizations and corporations, the best decisions arise when the widest variety of outlier voices is included. So corporations need diversity inside, even as they take it from others. Ironically, “diversity, equity and inclusion” might be misconstrued as a political position, but it is, in fact, the only possible recipe for sanity.

To reverse entropy reduction, one has to rediversify, making sure no one big guy takes over. The easiest way is boosting lots of little guys, which smears all the probabilities around as thinly as possible, with no dominant message or outcome.

Essentially, this strategy maintains a healthy variety of weak voices so that no single viewpoint gains enough power to eliminate all others and dictate the entire story. This is the exact opposite of the algorithmic amplification imposed by social media, and of the echo-chamber dynamics of most politics, especially online discussions, since online interaction has the least bandwidth of any medium.
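
In entropy terms (a sketch of my own, with toy numbers), rediversification is just the reverse bookkeeping: take probability mass away from the one dominant voice, hand it back to the little ones, and the distribution’s entropy climbs back toward its maximum.

```python
# "Boosting lots of little guys": redistributing probability mass from one dominant
# voice back toward many small ones raises entropy toward its maximum. The flattest
# spread is the most diverse. My illustration, toy numbers only.
import numpy as np

def entropy_bits(p: np.ndarray) -> float:
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

dominant = np.array([0.90] + [0.01] * 10)   # one big guy, ten little guys
flattened = np.full(11, 1 / 11)             # every voice boosted to equal weight

print(entropy_bits(dominant))    # ~0.80 bits
print(entropy_bits(flattened))   # ~3.46 bits: the maximum for 11 outcomes
```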

Information asymmetry in technology. Author’s graph.

Money trumps diversity

Unfortunately, the equations of short-term economics decree that not only capital but also information flows be endlessly aggregated, compressed and consolidated, distilling all diversity away. This inexorable physical phenomenon of entropy reduction thus appears as media consolidation and echo-chamber cartoonification. Our global information flows become ever more compressed and simplified, more and more deprived of the oxygen of diversity they need to survive. We literally can’t know anything at all beyond our physical horizon without the multiple, orthogonal viewpoints (which very few outlets, like Fair Observer, provide).

As far as I and the laws of technology are concerned, the most important charities and missions on Earth are those which preserve the sanctity of information flow, without which other problems can’t be addressed. Children need to grow up with diversity of touch and people, without dazzlement, distraction and deception. The public needs news and history that is authentic, verifiable and immune to retraction. Everyone needs scientific truth in plain sight, uncontaminated. Truth needs to be true and unmoving. Paradoxically, it can only stay put through active, high-bandwidth balancing and rewriting.

Diversity is the lubricant of communication; if you ignore it long enough, you might forget it exists.

[Lee Thompson-Kolar edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.
