Making Heads or Tails
Derek DelGaudio February 27, 2024
Last year, I asked ChatGPT to flip a coin and tell me the result. It replied: “As an AI language model, I cannot simulate a random coin flip. Randomness is typically generated using external sources of entropy. However, I can generate a pseudorandom ‘heads’ or ‘tails’ outcome for you using a random number generator if that would be helpful.”
It’s hard to know where to turn when we have questions that extend beyond any field of knowledge. We used to ask the Augurs, the priests who looked at the sky through a frame, waiting for a bird to fly by as an omen or affirmation. This routine satisfied us for a time, but soon our questions outnumbered the birds, and we grew impatient. So we carved our own birds into stone disks, which we tossed to simulate flight. Those disks became cubes with more surfaces for our signs, freeing our questions from binary chains. Then came the cards, so light and so thin, more outcomes than stars in the palm of our hand. We asked more questions well into the night. And it’s through the asking of those questions we learned that nature’s lexicon of mystery is not limited to flying birds or shuffled cards. Mystery is the message.
“Randomness is the closest thing we scientists have to God,” said my friend, the cryptographer who once wrote about the vulnerabilities of physical locks from a computer scientist’s perspective, only to be censured by the Locksmiths of America for unknowingly revealing their secrets. When I told him about the tedious answer the computer gave me after I asked it to flip a coin, he replied, “Machines are designed to repeat themselves. Given the same input, they will always produce the same output. Randomness requires entropy (a measurable state of uncertainty), which is absent from the machine’s environment and antithetical to its purpose. To generate something random, like a coin toss or a password, machines harvest entropy from an outside source. They harvest it from us.”
Buried in your machine, a nameless program observes the physical phenomena it encounters during the day, and it stores these random events as seeds of entropy: Atmospheric noise, keystrokes, the movement of the mouse, etc. This fluid relationship we have with machines mirrors the making of our own dreams. Our daily experiences sneak into our nights: The sirens outside, the guy who pressed our buttons, the mouse that crossed our path. When we awake, we respond without knowing what we experienced while we were asleep. Just as we live to feed our dreams so that dreams feed into our unconscious decisions, we have ended up living to feed the dreams of machines. We are the unconscious of the algorithm.
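The cryptographer’s distinction can be sketched in a few lines of Python. This is only an illustration, not the program buried in any particular machine: `random.Random` stands in for the deterministic machine that repeats itself given the same input, while `os.urandom` asks the operating system for bytes drawn from its entropy pool, the store seeded by physical events like interrupt timings, keystrokes, and mouse movement.

```python
import os
import random

# A seeded pseudorandom generator is deterministic:
# the same seed always yields the same sequence of "flips".
rng_a = random.Random(42)
rng_b = random.Random(42)
flips_a = [rng_a.choice(["heads", "tails"]) for _ in range(5)]
flips_b = [rng_b.choice(["heads", "tails"]) for _ in range(5)]
assert flips_a == flips_b  # the machine repeats itself

# os.urandom draws from the kernel's entropy pool, which is
# replenished by unpredictable physical events in the machine's
# environment. This flip is not reproducible.
coin = "heads" if os.urandom(1)[0] % 2 == 0 else "tails"
print(coin)
```

Run twice, the first half of the sketch produces identical sequences; the last line does not.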
Are the machines learning what we need them to know or just telling us what we want to hear? Could it be that saying the right thing at the right time is mastering entropy? Lying is faster than learning. Perhaps the machine dazzles us with the gleam of its screen so we can’t see that everything is dark inside. Perhaps it’s we who can’t be trusted.
Today, I asked ChatGPT to flip a coin and tell me the result. It replied: “Tails.”
Derek DelGaudio is a writer, director, and magician. DelGaudio created the award-winning theater show and film, In & Of Itself. He wrote the acclaimed book, AMORALMAN, served as the artist-in-residence for Walt Disney Imagineering, and co-founded the performance art collective A.Bandit. He is currently an Affiliate Scholar at Georgetown University and co-conspirator at Deceptive Practices, a creative firm known for designing illusions and providing “Arcane Knowledge on a Need-to-Know Basis.”