Exit the supersensorium
The neuroscientific case for art in the age of Netflix
How will we spend the remaining 700,000 hours of the 21st century? In the metered time of our own discretion, there have never been more options for our personal entertainment, nor have they ever been more freely available. We find ourselves strolling the aisles of a vast sensorium. On the shelves is a trove of experiences: video games, movies, TV shows, virtual reality, books, podcasts, articles, social media posts, all prepackaged for our consumption. What had previously been accomplished for food through the centralized distribution of supermarkets has now been done with experience itself. The recent grand opening of this supersensorium has been mediated through the screen, a panoply of icons, images, links, downloads, and videos auto-playing, which we browse through entirely at our leisure.
Such abundance of choice would have been heralded as miraculous in any other age. What a rousing cry for progress that our lowly living rooms would have stupefied with their luxuries even the God-like pharaohs, even the court of Versailles! Or maybe not—for it all comes with a price. Who hasn’t lost days from binge-watching Netflix, or deep in the dungeons of some video game? Here’s a scary, or maybe heart-wrenching, thing to consider: of our waking leisure hours, what exactly is the amount of time devoted to the consumption of experiences from the supersensorium? In 2018, Nielsen reported that the average American spent eleven hours a day engaged with media. Does anyone believe that this number is going to decrease? For the technology that undergirds the supersensorium will only improve. The algorithms will grow more personalized, the experiences will become more salient, and the platforms will get faster in their delivery of content. And we should all admit that the vast majority of what lines the shelves of the supersensorium is merely entertainment, for otherwise we wouldn’t feel a gnawing guilt so great most of us avoid consciously calculating how our time is actually spent.
The infinite entertainment of the supersensorium is especially problematic if you happen to be someone who likes and maybe even produces art or fictions. Take a writer such as myself, who views the tidal wave of middling fictions with a feeling akin to terror. Not that these problems are entirely new. In a letter to a friend, a 31-year-old Tolstoy wrote:
I shall write no more fiction. It is shameful, when you come to think of it. People are weeping, dying, marrying, and I should sit down and write books telling “how she loved him”? It’s shameful!
If that was Tolstoy’s judgment of himself, what might his fiery judgment be of our now endless ways of telling “how she loved him”? The mere scale of the supersensorium pushes to the fore old questions about the purpose of art and fictions. Why do humans desire these petite narratives we gobble up like treats? What’s the origin of this pull toward artifice, a thing so powerful we might even call it an instinct? Is it virtue or vice? And if it can be a vice and technology is making it easier and easier to while away our lives this way, a reasonable person has to ask: why add to the supersensorium? Why take away from the real when the real is already back on its heels, and behind it, a cliff?
To answer these questions, we have to go back to where fictions started: 200 million years ago.
The first fictions appeared as thin liquid streams of experience weaved by the Mesozoic minds of mammals and birds. Small creatures, newly differentiated, they stole whatever sleep they could under the rule of the dinosaurs, and there in burrows or high in nests they fitfully hallucinated experiences that didn’t happen. Non-events and never-wheres. They dreamed. Dinosaurs, if they were anything like modern reptiles, were probably dreamless. While there’s scientific controversy around which animals dream, the standard line in textbooks is that the sandman only visits mammals and birds. Perhaps a few non-chordates as well, like the spineless but neurally impressive cephalopods. This means that for most of the animal world, like for the reptiles and amphibians and fish, there is nothing but reality.
To understand why we as upright apes are so drawn to facts that aren’t facts, to events that never happened, to useless objects like paintings, to fictions, we have to go back to our hirsute ancestors and ask: why did something as “useless” as dreaming start in the first place?
While receiving my PhD in neuroscience I happened to work in the same lab as some of the top sleep researchers in the world, so I’ll personally attest that no one knows for sure why dreaming evolved. But a long evolutionary history of dreaming is rendered scientifically plausible by the homology of its biological nature. Sleep consists of approximately 90-minute cycles, with further stages within those cycles, and dreaming usually occurs during the stage marked by rapid eye movements (REM sleep). In each cycle, before reaching REM, the brain must descend through non-REM sleep (NREM), in which slow waves of activity traverse the cortex and leave periods of deep silence in their wake. If shaken from this stage, sleepers will often report only a deathlike nothingness, although recent research has found that dreams can occasionally occur in this stage as well. Sensory information, most of which enters the brain through the way-station of the thalamus, is blocked or gated by selective thalamic inhibition during sleep. As the brain descends deeper and deeper into this nothingness, an internal biological clock secretly ticks away. On completion of its countdown the clock triggers a change in the neuromodulatory milieu of the brain via the firing of acetylcholine neurons, promoting wake-like neural dynamics. Also triggered by the clock, gamma-aminobutyric acid (GABA) neurons inhibit the voluntary muscle output pathways of the cortex. The result to the body is atonia. Paralysis. It’s the locking of an awake brain in a sensory deprivation tank of flesh and bone. Without this paralysis we would act out our dreams and nightmares. And those with rare sleep disorders, in which output to muscles isn’t inhibited during dreams, do exactly that, often injuring themselves as their fictions become reality.
It may surprise you that how dreaming occurs, given the neurobiological set-up of REM, is not what’s difficult to explain about dreams. After all, hallucinations are common in real sensory deprivation tanks. Deprived of bottom-up input from the senses, dreaming seems to be the natural state of the brain; by natural, I mean that there isn’t much of a difference between everyday perception and dreams. To an electroencephalogram picking up brain waves, the two states aren’t readily discernible. Waking consciousness is a dream, but one that happens to correspond to reality, mainly because its sources are our sensory organs. Our eyes, ears, skin, noses, all save us from solipsism merely because they have been tuned by evolution so finely that the dream of our life correlates with the state of the world. Our waking life is merely an appropriately selected (in all senses of the word) hallucination.
The connection between dream and wake is so close, in fact, that the transition to wake, if allowed to occur naturally and spontaneously in the absence of alarm clocks, is almost always from REM. It is like an already online consciousness gets off to a running start by swapping out random internal sources with real input from sensory organs. What a lucky dream that last one is, the one that gets to be extended across the whole day, that gets to include the quotidian, the agony and ecstasy, the small pleasures and little horrors of a normal human’s waking hours, before each dream of a day ends with our heads hitting the pillow once more.
The more difficult question: why? What’s the point of dreaming? The mystery is so deep that some scientists still take seriously the null hypothesis that it doesn’t do anything at all. Nevertheless, there’s a lot of suggestive evidence that sleep in general is healthy for you. It’s probable that housekeeping tasks of the brain are performed during sleep, which is necessary because neurons are messy, squelching, erupting entities, and they generate lots of intercellular trash. In 2012, Dr. Maiken Nedergaard and her colleagues showed that cerebrospinal fluid may flush through the brain during NREM, clearing out the bio-detritus leftovers of thought. It’s as if the brain has its own wash cycle every night. Another hypothesis on the purpose of sleep was advanced by my PhD advisors, Giulio Tononi and Chiara Cirelli. They think that the slow waves that sweep across the brain, particularly during NREM, cause the synapses of neurons to universally shrink in size, but in a way that preserves the relative weights of synaptic connectivity, so that each morning the stage is set for synapses to strengthen and grow throughout the day, only to be reduced once more at night (although there have been criticisms of this idea).
REM and dreaming are even more elusive for science; yet, there’s evidence dreaming is a biological necessity. If you’re dream-deprived you will experience “REM rebound” over the next few days as your brain crams in as much REM as possible. A scientist clever (and perhaps cruel) enough can take this to extremes. Hook rats up to an electroencephalogram with an automated program that can differentiate between NREM and REM. Put the rats on a treadmill above some water. When the rats sleep, if it’s NREM, allow the platform to stay motionless. If they show the signs of REM, start the treadmill and force them to wake up to stay out of the water. Dream deprivation. A team at the University of Chicago did something like this to rats back in the 80s. After a few days, the rats began to lose weight. After a few weeks, they began to jaundice. Their fur turned yellow, then their eyes. Their yellowing paws developed lesions. And eventually every rat dropped dead from not dreaming.
If death arrives for the non-dreamer, it would seem that some hypothesis about the evolved purpose of dreams is obviously necessary. Historically, oneirology (the study of dreams) is most strongly associated with Freud, but few if any of Freud’s theories have stood the test of time. Instead, the current hypotheses are centered on the role sleep and dreaming might play in memory consolidation and integration. The problem is that none of these leading hypotheses about the purpose of dreaming are convincing. For example, some scientists think the brain replays the day’s events during dreams to consolidate the day’s new memories with the existing structure. Yet such theories face the seemingly insurmountable problem that only in the rarest cases do dreams involve specific memories. So if they were true, it would mean that the actual dreams themselves are merely phantasmagoric effluvia, a byproduct of some hazily-defined neural process that “integrates” and “consolidates” memories (whatever that really means). In fact, none of the leading theories of dreaming fit well with the phenomenology of dreams—what the experience of dreaming is actually like.
First, dreams are sparse in that they are less vivid and detailed than waking life. As an example, you rarely if ever read a book or look at your phone screen in dreams, because the dreamworld lacks the resolution for tiny scribblings or icons. Second, dreams are hallucinatory in that they are often unusual, either by being about unlikely events or by involving nonsensical objects and borderline categories. People who are two people, places that are both your home and a spaceship. Many dreams could be short stories by Kafka, Borges, Márquez, or some other fabulist. A theory of dreams must explain why every human, even the most unimaginative accountant, has within them a surrealist author scribbling away at night.
To explain the phenomenology of dreams I recently outlined a scientific theory called the Overfitted Brain Hypothesis (OBH). The OBH posits that dreams are an evolved mechanism to avoid a phenomenon called overfitting. Overfitting, a statistical concept, is when a neural network learns overly specifically, and therefore stops being generalizable. It learns too well. For instance, artificial neural networks have a training data set: the data that they learn from. All training sets are finite, and often the data comes from the same source and is highly correlated in some non-obvious way. Because of this, artificial neural networks are in constant danger of becoming overfitted. When a network becomes overfitted, it will be good at dealing with the training data set but will fail at data sets it hasn’t seen before. All learning is basically a tradeoff between specificity and generality in this manner. Real brains, in turn, rely on the training set of lived life. However, that set is limited in many ways, highly correlated in many ways. Life alone is not a sufficient training set for the brain, and relying solely on it likely leads to overfitting.
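The tradeoff is easy to see in a toy model. Below is a minimal sketch using NumPy; the noisy sine wave and polynomial "learners" are my own illustrative stand-ins, not anything from the sleep literature. A high-capacity polynomial memorizes its small, noisy training set, while a modest one generalizes better to unseen data:

```python
import numpy as np

rng = np.random.default_rng(0)

# A small, correlated "lived life": ten noisy samples of a simple signal.
x_train = np.linspace(0.0, 1.0, 10)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0.0, 0.2, x_train.size)

# The wider world: unseen data from the same underlying source.
x_test = np.linspace(0.0, 1.0, 101)
y_test = np.sin(2 * np.pi * x_test)

def mse(coeffs, x, y):
    """Mean squared error of a fitted polynomial on (x, y)."""
    return float(np.mean((np.polyval(coeffs, x) - y) ** 2))

# A modest model vs. one with enough capacity to memorize every point.
simple = np.polyfit(x_train, y_train, 3)
overfit = np.polyfit(x_train, y_train, 9)

print("train error:", mse(simple, x_train, y_train), mse(overfit, x_train, y_train))
print("test error: ", mse(simple, x_test, y_test), mse(overfit, x_test, y_test))
```

The degree-9 fit hugs its training points almost perfectly, yet typically does far worse on the fresh test points: it has learned too well.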
Common practices in deep learning, where overfitting is a constant concern, lend support to the OBH. One such practice is “dropout,” in which a portion of the network (or of its input data) is randomly zeroed out during training, forcing the rest of the network to generalize rather than memorize. This is exactly like the sparseness of dreams. Another example is the practice of “domain randomization,” where during training the data is warped and corrupted along particular dimensions, often leading to hallucinatory or fabulist inputs. Other practices include things like feeding the network its own outputs when it’s undergoing random or biased activity.
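For concreteness, here are toy NumPy versions of the first two of these regularizers; the function names `dropout` and `randomize_domain` are my own illustrative choices, not any particular framework's API.

```python
import numpy as np

def dropout(activations, p_drop, rng):
    """Inverted dropout: randomly silence a fraction of units, rescaling
    the survivors so the expected total signal is unchanged."""
    keep = rng.random(activations.shape) >= p_drop
    return activations * keep / (1.0 - p_drop)

def randomize_domain(inputs, rng, noise_scale=0.3):
    """Toy domain randomization: warp the input along a random direction,
    producing slightly 'hallucinatory' variants of the same data."""
    direction = rng.normal(size=inputs.shape[-1])
    direction /= np.linalg.norm(direction)
    return inputs + noise_scale * rng.normal() * direction

rng = np.random.default_rng(42)
acts = np.ones(10_000)  # a layer of uniformly active units

dropped = dropout(acts, p_drop=0.5, rng=rng)
frac_silenced = float(np.mean(dropped == 0.0))
mean_signal = float(dropped.mean())
print(f"silenced: {frac_silenced:.2f}, mean signal: {mean_signal:.2f}")
```

Real frameworks ship these as built-in layers; the sketch just exposes the mechanics: silence part of the signal, or warp the input, and the network can no longer lean on any single memorized detail.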
What the OBH suggests is that dreams represent the biological version of a combination of such techniques, a form of augmentation or regularization that occurs after the day’s learning—but the point is not to reinforce the day’s memories, but rather to combat the detrimental effects of their memorization. Dreams warp and play with always-ossifying cognitive and perceptual categories, stress-testing and refining them. The inner fabulist shakes up the categories of the plastic brain. The fight against overfitting every night creates a cyclical process of annealing: during wake the brain fits to its environment via learning; then, during sleep, the brain “heats up” through dreams that prevent it from clinging to suboptimal solutions, models, and incorrect associations.
The OBH fits with the evidence from human sleep research: sleep seems to be associated not so much with assisting pure memorization, as other hypotheses about dreams would posit, but with an increase in abstraction and generalization. There’s also the famous connection between dreams and creativity, which also fits with the OBH. Additionally, if you stay awake too long you will begin to hallucinate (perhaps because your perceptual processes are becoming overfitted). Most importantly, the OBH explains why dreams are so, well, dreamlike.
An analogy: dreams are like the exercise of consciousness. Our cognitive and perceptual modules are use it or lose it, just like muscle mass. The dimensions are always shrinking, worn down by our overtraining on our boring and repetitive days. The imperative of life to minimize metabolic costs almost guarantees this. The opposite of the expanding material universe, our phenomenological universes are always contracting. Dreams are like a frenetic gas that counteracts this with pressure from the inside out (it’s worth briefly noting the obvious analogy to hallucinogens here).
Dreaming, then, isn’t about integrating the day’s events, or replaying old memories; in fact, the less like the repetitive day’s events, the better. At minimum, a good dream is some interesting variation from an organism’s normal experience. And so we have our answer: the banality and self-sameness of an animal’s days led to the evolution of an inner fabulist. Here originates our need for novelty, and, in some, our need for novels.
From an evolutionary perspective, it’s rather amazing humans are willing to spend so much time on fictions. In Denis Dutton’s The Art Instinct, he imagines a version of humans that evolved to love only true facts, not imaginary stories:
If humans loved only true stories, there would be no philosophical “problem of fiction,” because there would be no intentionally constructed fiction in human life. . . . We could be expected to react to known-to-be-untrue stories and made-up fantasy much as we react to uselessly dull knives or, worse, the smell of rotting meat.
Why did we evolve to be so different from these hypothetical truth-loving humans? Why are we so fascinated by things that never happened?
If the OBH is true, then it is very possible writers and artists, not to mention the entirety of the entertainment industry, are in the business of producing what are essentially consumable, portable, durable dreams. Literally. Novels, movies, TV shows—it is easy for us to suspend our disbelief because we are biologically programmed to surrender it when we sleep. I don’t think it’s a coincidence that a TV episode traditionally runs about the same ~30 minutes as an average REM period, and movies last ~90 minutes, an entire sleep cycle (and remember, we sometimes dream in NREM too). They are dream substitutions.
This hypothesized connection explains why humans find the directed dreams we call “fictions” and “art” so attractive and also reveals their purpose: they are artificial means of accomplishing the same thing naturally occurring dreams do. Just like dreams, fictions and art keep us from overfitting our perception, models, and understanding of the world.
Since society specializes for efficiency and competency, we began to outsource the labor of the internal fabulist to an external one. Shamans, and then storytellers with their myths, and then poets, writers, directors, and even painters or sculptors—all in a way external dream makers, producing superior artificial dreams. The result is that a modern human can gain the benefits of dreams even during the day, from TV shows or books or visiting an art gallery.
This has all happened before. For what is a chef? Our mastery of fire allows us to do most of our digestion outside of our bodies (or have others do it for us), all to meet the otherwise impossibly-steep caloric needs of our large brains. The same for artists, but they allow you to dream without sleep.
What’s more, artificial fictions by artists are more structured than natural ones. In some ways, superior to them. E.g., Joseph Campbell thought all narratives were a form of “the monomyth,” or the “hero myth.”
The myth is so broad it can describe Star Wars, Harry Potter, or even, with a bit of interpretation, Pride and Prejudice. Inundated with crafted stories, our desires and goals take just as much from the fictional as from the real. And that can be a good thing. There is a sense in which something like the hero myth is actually more true than reality, since it offers a generalizability impossible for any true narrative to possess.
What other advantages might artificial dreams possess? As Yuval Noah Harari points out in Sapiens: A Brief History of Humankind, many of the things we normally consider real are themselves fictions. These fictions include not just religions but also companies, money, and nations. Such things exist only because everyone agrees they do. In fact, it is the capacity for mass cooperation under the influence of fictions like myths, like religions, that explains the rise of humans and their planetary dominance. In a TED Talk Harari says:
We can cooperate, flexibly, with countless numbers of strangers, because we alone, of all the animals on the planet, can create and believe fictions—fictional stories. And as long as everybody believes in the same fictions, everybody obeys and follows the same rules, the same norms, the same values.
Shared narratives solve coordination problems because everyone has the same framework. The evolutionary biologist David Sloan Wilson, backing up Harari, called this capacity for cooperation humanity’s “signature adaption.” Yet the binding power of stories applies as much within individuals as it does across them—they bind together our very selves. Shakespeare captured the dissembled nature of humans:
All the world’s a stage,
And all the men and women merely players.
They have their exits and their entrances,
And one man in his time plays many parts.
These different parts must coherently act together; the temporal slices of a person’s life must be coordinated as if each slice were a different individual because, from the perspective of physics, they are. To organize the temporally disparate versions of us, we use a myth called a self. It creates a natural agreement among the different versions of us, enabling contiguous behavior and solving coordination problems. You are a protagonist in a story told by a spatiotemporally disparate set of individuals.
The better we understand narratives the better our ability to coordinate the fragments of ourselves that have been scattered across time. Artificial fictions serve as a set of examples, and they also allow us to randomly walk about different selves, exercising the experiential space that pertains to the governance and understanding of selves, in much the same manner that dreams do for perceptions, actions, and categories in general. In the end our artificial dreams are similar enough to natural ones, but the emphasis on selfhood and personal journeys indicates their constructed nature, their purposiveness. They avoid overfitting while also instructing, however subtly. The world is like this. A person is like this. A family is like this. Over and over again until we slowly get perceptual and cognitive processes generalized enough to deal with the dynamic world.
All of which might explain this weird obsession of ours, our sensitivity, even hunger, for stories. And why we’re so drawn to them, especially now. After all, the risk of overfitting is greater for neural networks when what they are learning increases in complexity—perhaps then it’s unsurprising that as our world has complexified we turn ever more to fiction to “relax,” a phenomenon which might not really be relaxation at all.
There is a property called neoteny, from Greek roots meaning “extended youth”: the keeping of childlike traits into adulthood. Neotenous adult animals look, and also behave, like juveniles of their species. It’s common in domesticated animals. In fact, merely selecting for certain behaviors, such as friendliness with humans, can lead to physical neoteny. In a famous experiment conducted during the Cold War, foxes were domesticated by the Russian scientist Dmitry Belyaev. The foxes, selected just for tameability, took on the characteristic neotenous looks of puppies. Our own faces are childlike compared to other animals because we are self-domesticated in this manner; to the rest of the animal world we must look like giant toddling babies.
Our current consumption of artificial dreams is really another form of neoteny. Not physical, but cognitive. For the development period of our brains is likely extended by fictions, which we can only describe as a kind of technology. Children love stories most of all, and now we, neotenous adults in the 21st century, love stories almost as much. A love that has been only growing for the last few centuries. Of all the predictions about the future, none admit the truth: that we will act ever more like children. This isn’t necessarily a bad thing. Maybe it’s not happenstance that the majority of human progress occurred after the invention of the novel. Precisely during the time that adult humans began to act more like children and mass-produce imaginary worlds, humanity rocketed forward. Perhaps we were, in our obsession with the unreal, teaching ourselves something more powerful than any collection of facts: how to be a protagonist.
A young woman I know had to stop watching television late at night. If she didn’t, then later, lying on her back in the dark, she’d begin to hear the characters talking again. Not to her. Just carrying on their own conversations, repeating themselves, even constructing new plots and scenes. Seinfeld to Elaine, Kirk to Spock, Ross to Rachel. The shows began to run themselves.
Our reaction to the screen is fundamental, physiological, and so commonplace we don’t credit its strangeness anymore. According to Tim Wu’s The Attention Merchants, when television was first introduced, one woman in the 1950s described it as:
We ate our suppers in silence, spilling food, gaping in awe. We thought nothing of sitting in the darkness for hours at a stretch without exchanging a word except ‘who’s going to answer that confounded telephone?’
This new technology paralyzed us into atonia—we became an awake brain locked in a motionless body, unable to break from dreaming in the daytime.
The latest evolution of this phenomenon can be roughly dated to 2013, with the introduction of House of Cards, the 13 episodes of which Netflix released all at once for continuous viewing by audiences, many of whom watched it in just a day or two. By 2017, “binge-watching” was officially in the Merriam-Webster dictionary, and The Attention Merchants says that “a Netflix poll of TV streamers found that 61 percent defined their viewing style as binge watching, which meant two to six episodes at a sitting,” something I am certainly guilty of myself.
David Foster Wallace’s Infinite Jest concerns a video so entertaining that people who begin watching it literally cannot stop, soiling themselves. There’s a scene in which a crowd is captured by it, one by one as they enter the room and catch a glimpse of the enchanting screen, until
all were watching the recursive loop the medical attaché had rigged on the TP’s viewer the night before, sitting and standing there very still and attentive, looking not one bit distressed or in any way displeased, even though the room smelled very bad indeed.
In biology this is called a superstimulus. It’s like a hack for behavioral reward. Baby gulls cry and peck at their mother’s mouth, which is striped in red. Lower a painted stick with stripes of the reddest red and they’ll climb out of the nest in excitement. Australian beetles are so attracted to the brown backs of discarded beer bottles that they bake to death in the hot desert sun mating with them.
Humans aren’t some miraculous biological exception. Already there are unnoticed superstimuli among us. Porn is a superstimulus, giving access to mates the majority would never see. McDonald’s is a superstimulus of umami, fat, and salt. The march of technology makes it inevitable that more and more things clear the jump to being biologically unrealistic. And so with each passing year Wallace’s prophetic description of the video it is impossible to look away from, called in Infinite Jest only “The Entertainment,” slouches toward birth.
Regular TV’s addictiveness is hypothesized to come from the orienting response: an innate knee-jerk reaction that focuses attention on new audio and visual stimuli. The formal techniques of television—the cut, the pan, the zoom—are thought to trigger this response over and over. TV, and many other cultural products, amplify their addictiveness via their narrative or mythological properties (consider the omnipresent expression of the hero myth in everything from Disney movies to role-playing games). And as the supersensorium gets more and more super in its capabilities and extent, the biological urge to dream the monomyth grows to eat the world.
If I go long enough without fiction I begin to jaundice. I develop lesions on my hands and my eyes grow bloodshot and yellow. I stumble about like a starved vampire, violent for pages, movies, anything. I grew up in my mother’s bookstore, working there as a teenager, hawking fictions. And because of this I need fiction to have some sort of real-world relevance. I need it to be a solution to a problem, not the problem itself.
The human desire for superstimuli can never be vanquished; it can merely be redirected. At best, we upright apes develop an immunity to the worst and most addictive of technologically-enabled superstimuli, and an attraction to edifying, or at least neutral, substitutes. Consider eating habits. Modern food might be the most obvious superstimulus, with the result that over one-third of Americans are obese. From an evolutionary perspective, it’s miraculous this number is not higher. An analogous situation has been developing in media, first slowly but now so quickly it is blurring by us: from the biological imperative to dream to avoid overfitting, to the development of artificial fictions, to their distillation in the invention of the novel and the poem and art, to the proliferation of these genres into movies and TV, to the screen-mediated supersensorium that allows for endless consumption, all the way up to its newest addition, VR, which has been known to leave users and developers with “post-VR sadness.” Just as we have become saturated with entertainment, is it any wonder we have reached record levels of depression and mental health issues?
At least with the superstimuli of food there is the belief that some foods are objectively better than others, which helps curb our worst impulses of consumption. In comparison, as the supersensorium expands over more and more of our waking hours, the idea of an aesthetic spectrum, with art on one end and entertainment on the other, is defunct. In fact, explicitly promoting any difference between entertainment and art is considered a product of a bygone age, even a tool of oppression and elitism. At best, the distinction is an embarrassing form of noblesse oblige. One could give a long historical answer about how exactly we got into this cultural headspace: maybe starting with postmodernism and deconstructionism, then moving on to the problematization of the canon, or the saturation of pop culture in academia to feed ever more degrees. We could trace the ideas, catalog the opinions of the cultural powerbrokers, focus on new media and technologies muscling for attention, or on changing demographics and workforces and leisure time, or so many other things—but none of it matters. What matters is that now, as it stands, talking about art as being fundamentally different from entertainment brings charges of classism, snobbishness, elitism—of being proscriptive, boring, and stuffy.
And without a belief in some sort of lowbrow-highbrow spectrum of aesthetics, there is no corresponding justification of a spectrum of media consumption habits. Imagine two alien civilizations, both at roughly our own stage of civilization, both with humanity’s innate drive to consume artificial experiences and narratives. One is a culture that scoffs at the notion of art. The other is aesthetically sensitive and even judgmental. Which weathers the storm of the encroaching supersensorium, with its hyper-addictive superstimuli? When the eleven hours a day becomes thirteen, becomes fifteen? A belief in an aesthetic spectrum may be all that keeps a civilization from disappearing up its own brainstem.
In a world of infinite experience, it is the aesthete who is safest, not the ascetic. Abstinence will not work. The only cure for too much fiction is good fiction. Artful fictions are, by their very nature, rare and difficult to produce. In turn, their rarity justifies their existence and promotion. It’s difficult to overeat on caviar alone. Now, it’s important to note here that I don’t mean that art can’t be entertaining, nor that it’s restricted to a certain medium. But art always refuses to be easily assimilated into the supersensorium.
And the OBH explains why, providing a scientific justification for an objective aesthetic spectrum. For entertainment is Lamarckian in its representation of the world—it produces copies of copies of copies, until the image blurs. The artificial dreams we crave to prevent overfitting become themselves overfitted, self-similar, too stereotyped and wooden to accomplish their purpose. Schlock. While unable to fulfill their function, they still satisfy the underlying drive, just like the empty calories of candy. On the opposite end of the spectrum, the works that we consider artful, if successful, contain a shocking realness; they return to the well of the world. Perhaps this is why, in an interview in The New Yorker, the writer Karl Ove Knausgaard declared that “The duty of literature is to fight fiction.”
Art has both freshness and innate ambiguity; it avoids contributing to overfitting via stereotype. A nudge in one direction and it can veer to kitsch, a nudge in another and it can become too experimental and unduly alienating. Art exists in an uncanny valley of familiarity—art is like a dream that some higher being, more aesthetically sensitive, more empathetic, more intelligent, is having. And by extension, we are having. Existing at such points of criticality, it is these kinds of artificial dreams that are the most advanced, efficient, and rewarding, the most assuaging to our day-to-day learning.
Entertainment, etymologically speaking, means “to maintain, to keep someone in a certain frame of mind.” Art, however, changes us. Who hasn’t felt what the French call frisson at the reading of a book, or the watching of a movie? It is kin to the “oceanic feeling” Romain Rolland identified at the root of religion. Which is why art is so often accompanied by the feeling of transcendence, of the sublime. We all know the feeling—it is the warping of the foundations of our experience as we are internally rearranged by the hand of the artist, as if they have reached inside our heads, elbow deep, and, on finding that knot at the center of all brains, yanked us into some new unexplored part of our consciousness.
This sort of explicit argument for the necessity of an aesthetic spectrum is anathema to many in our culture. It’s easy to attack as moralizing, quixotic, and elitist. And proposing a scientific theory of art, which is what the OBH provides, easily can bring forth accusations of reduction, or even scientism.
But none of that changes the fact that only by upholding art can we champion the consumption of art. Which is so desperately needed, because only art can act as a counterforce, a kind of judo against entertainment’s stranglehold on our stone-age brains. And as the latter force gets stronger, we need the former more and more.
So in your own habits of consumption, hold on to art. It will deliver you through this century.