Why I Like Writing Fiction (Reason #1)

“I don’t know, Ruth. You seem like such a happy person; I mean, I’ve never seen you less than glowing with anybody. But I have a question for you: your prettiness, all the attention, do you ever find it oppressive? Like, because men just like the idea of themselves being ‘with’ someone as pretty as you, but not really caring about you?”

I am damned with two coincident conditions: I am inquisitive, and I am analytical. This comorbidity could easily result in my asking wildly inappropriate questions. These questions are inappropriate not because they are vulgar, or in poor taste, or even, well, inappropriate, but because they are so rarely asked, especially in casual conversation, that they are interpreted as strange, their asker as something of an outlier, or a Nietzschean. (Not that there’s anything wrong with that, Zarathustra.)

I believe the unspeakable nature of these questions is repressive and tragic; it leads to malaise, to hysteria–or, if the questions are particularly disturbing, to the asker’s being admitted to a mental health facility. But because fiction is, technically, a separate and ontologically porous realm where almost anything goes, characters are able to ask those questions. If an author lets her characters ask these difficult questions, her book or story is lauded by critics–in the language particular to book critics–as “poignant” or “heartbreaking” or as “a daring exhortation compelling the reader to explore the inner world of [fill in the blank]. [Writer’s Name] shoves readers toward the door to the silent conscious, opens it, and dares them to step into the black.” Or whatever.

But the thing about these questions is that they open up entire universes of personhood. Asking them takes what is otherwise a laundry list of traits and descriptions, set pieces, and the meta-structure of introduction → rising tension → complicating factors → climax → denouement → closing, and renders it real and relatable.

They are the barbs in the hook of plot; they attach the reader emotionally. Based on a character’s answers, that character grows into something real; readers either empathize with the character’s assertions and grow fond of him or her, or they disagree and come to hate the character. It’s that connection which prevents the reader from leaving the story; once hooked, they come along for the ride.

That Kind of Writing to Which I Aspire

For a class, I am tasked with writing a 20-25 page paper about an intellectual. My choice is one of my all-time favorite authors, one whose prose has influenced my own cognitive processing more than I’d ever like to admit. If I could be a tenth the writer David Foster Wallace was, I could die happily. The tragic irony of that phrase.

An excerpt from “Good Old Neon,” a story in the book Oblivion:

The truth is you already know what it’s like. You already know the difference between the size and speed of everything that flashes through you and the tiny inadequate bit of it all you can ever let anyone know. As though inside you is this enormous room full of what seems like everything in the whole universe at one time or another and yet the only parts that get out have to somehow squeeze out through one of those tiny keyholes you see under the knob in older doors. As if we are all trying to see each other through these tiny keyholes.

But it does have a knob, the door can open. But not in the way you think…The truth is you’ve already heard this. That this is what it’s like. That it’s what makes room for the universes inside you, all the endless inbent fractals of connection and symphonies of different voices, the infinities you can never show another soul. And you think it makes you a fraud, the tiny fraction anyone else ever sees? Of course you’re a fraud, of course what people see is never you. And of course you know this, and of course you try to manage what part they see if you know it’s only a part. Who wouldn’t? It’s called free will, Sherlock. But at the same time it’s why it feels so good to break down and cry in front of others, or to laugh, or speak in tongues, or chant in Bengali–it’s not English anymore, it’s not getting squeezed through any hole.

So cry all you want, I won’t tell anybody.


An Extremely Random Thought

This is the product of a day of reading… which is to say that I was extremely prone to believing my thoughts to be particularly insightful. I wrote it in my Moleskine notebook, excited as all get-out. You be the judge.

– – –

What if all mass is fluid, its solidity an illusion–a function of a given material’s viscosity?

Glass windows are said to warp over time, their bottoms growing fatter than their tops.

Stacks of gold bars found in pharaohs’ tombs are said to have melded together over thousands of years, discovered now as one solid mass.

Lessons – 1

A note to be affixed to the bulletin board. A finding from today's reading.

[255] Facts now seem important.

[150] If Tina Fey’s impression of Sarah Palin hadn’t been based closely on verbatim transcripts of Palin’s performances, it wouldn’t have been remotely funny, and it wouldn’t have affected the election; its comedy derived precisely from its scrupulous reframing of the real.

[256] Facts have gravitas.

[381] In order to make it easier to handle, Darwin would cut a large book in half; he’d also tear out any chapters he didn’t find of interest.

[257] The illusion of facts will suffice.

[272] Reality-based art hijacks its material and doesn’t apologize.

[496] This is the wager, isn’t it? It’s by remaining faithful to the contingencies and peculiarities of your own existence and the vagaries of your own nature that you stand the greatest chance of conveying something universal.

[497] Self-study of any seriousness aspires to myth. Thus do we endlessly inscribe and magnify ourselves.

[498] A man’s life of any worth is a continual allegory.

[499] What is true in your private heart is true for all men.

[500] All our stories are the same.

[501] Every man has within himself the entire human condition.

[502] Deep down, you know you’re him.

_____________________________

Some recognition goes to David Shields’s book, Reality Hunger, for providing these chewy didactic bits. This in spite of [259]. If “genius borrows nobly,” I hope a little of it–his or otherwise–rubs off on me.

5 Einstein Quotes To Which I Owe My Current Sanity & Perspective

Albert Einstein, for reasons too numerous to go into here, is a personal hero of mine. “Avuncular” is a good word to describe him; he is the deadly-smart uncle with crazy hair we all want… that is, until one learns about his personal life. Nonetheless, his public persona is one I respect immensely. His words, in their simplicity, their sagacity interpolated or genuine, are powerful. Below, I will post each quote and, under it, give a one- or two-sentence explanation of my interpretation.

– – –

“In order to form an immaculate member of a flock of sheep one must, above all, be a sheep.”

Einstein was never what one might call a “conformist”. Especially in the context of academia, his iconoclasm has provided inspiration for my own dissidence since early high school. This quote, if I had to guess, might have been in reference to 4.0 GPAs… but that’s just my best estimate.

– – –

“The only thing that interferes with my learning is my education.”

Although I believe Einstein (and F. Scott Fitzgerald) cribbed this line from Mark Twain, I nonetheless recite this quote–some agglomeration of Einstein’s, Twain’s, and Amory Blaine’s versions–to myself when the going gets tough at school. I have it written on an index card thumb-tacked next to my doorknob. I see it every day.

– – –

“Common sense is the collection of prejudices acquired by age eighteen.”

I’m in the middle of breaking away from the majority of these prejudices… of which there are many.

– – –

“…one of the strongest motives that lead men to art and science is escape from everyday life with its painful crudity and hopeless dreariness, from the fetters of one’s own ever-shifting desires. A finely tempered nature longs to escape from the personal life into the world of objective perception and thought.”

This, of all Einstein’s quotes I will mention here, is the one that affects me most right now. I am torn between business and writing fiction and nonfiction; between international relations, systems theory, neurology, psychology, and cognitive science. Although I believe I am suited for the world of “innovation” (God, what an awfully hackneyed word) in entrepreneurship, I know, at some deep, visceral level, that the only way I would ever be truly happy would be to seclude myself from the harshness, the brutality, and (more often than not) the soul-crushing banality of day-to-day life to craft and curate worlds of my own: perfect recreations of the “real world,” where crushing denouement and its resultant ache play synecdoche for realization.

– – –

“If A is a success in life, then A equals x plus y plus z. Work is x; y is play; and z is keeping your mouth shut.”

I needn’t say more.

On College, Critical Thought, Cattle, and Baking

I’m not afraid of expressing my misgivings about the “education” I’m receiving at the #4-ranked institution in America. UChicago possesses a certain self-righteous rhetoric pertaining to its general, or Core, curriculum. Rooted in the constructivist school of learning theory, the Core’s teaching methodology consists of reading primary-source texts and leading students, like cattle through the slaughterhouse chute, to the captive bolt of stunning revelation. Unfortunately, unlike a slaughterhouse, where livestock are funneled one by one to their end through hard-walled chutes, the texts are deployed thematically and without guidance. This fosters “critical thinking” skills, enabling students to draw connections between sources toward a prevailing image—not theory—of the time.

In UChicago’s Civilization courses, students are given texts and encouraged to make assertions about overarching historical themes. Without the benefit of a critical, theoretical framework, any assertions made skitter across the trite surface of the vast intellectual sea.

I approached my professor today and asked why there isn’t more structure in the class, why mention of larger theoretical frameworks is verboten, why, whenever I try to probe deeper in class discussion, she stares at me as she calls on another person. Why, when I “zoom in” argumentatively, the natural inclination is to deflect, to remove conversation to the rarefied atmosphere of platitude, and to placate me with a vaguely patronizing, “A poignant observation about the corporate nature of the Catholic church, but let’s shift the focus to how women are presented… B—, why don’t you go?” B— answers. “They are portrayed, as you say, ‘to be bad.'” Astute.

Please, professor, if you are reading this, I don’t blame you. I am sure you too are frustrated with the somewhat constrained nature of the course, with its “learning objectives” and whatnot. You and I could carp on and on about our shared frustrations with the lowest-common-denominator level of intellectual rigor required of a Core class, with the fact that this one, like all of ’em, is rendered passable even for, say, um… the more desultory among us, to be nice about it. I imagine we’d cackle together, laughing at our self-conceptions of our inflated noetic badassery… In some alternate reality, professor, we might be afforded this opportunity, but instead you smiled wistfully, squinting, and said:

“We are trying to teach you how to think. Imagine it this way: we could give you all the instructions for baking a cake, or we could give you the required ingredients and you do it for yourself. We want to empower you, so you can bake that cake.”

There are a couple of sticky issues to address. This statement presupposes that cakes are the goal, and it necessarily means that someone measured out the ingredients for a cake, laid them out, and assumed that some unsuspecting person would come along, see the spread, and ineluctably conclude that a cake is in order. Punishment is meted out to those who bake biscuits or cookies, or who transcend the whole category of baked goods altogether and instead mix water, sugar, yeast, and some flour to distill alcohol, which might then be flavored with vanilla or whatever flavor was intended for the cake.
Those who bake cakes, no matter how lumpy, soupy, squishy, or dense, are commended for baking a cake; because we’re all good postmodern cognitive relativists here, we can’t criticize the craftsmanship of the cake. Cake soup is but an interpretation of cake, and all interpretations, due to their subjective nature, are inherently valid… provided, of course, that they are interpretations of cake. Biscuits and grain alcohol, no matter how well executed, are not cake: you, hapless baker or distiller, fail the test.

It isn’t the biscuit-maker’s fault he didn’t bake a cake when he was given ingredients and told to make the most of them. Without instruction, he can be held accountable for neither his product nor the quality thereof. I understand that, given an infinite number of tries, some random novice baker will execute one hell of a cake; but given finite ingredients and some hinting, winking burlesque show of the pinnacle of the cake form, a novice is still a novice.

The best bakers trained with the best bakers. At the kernel level, it disturbs me that undergraduates are doomed to hapless experimentation, left to autogenously construct the properties of a given set of flour, eggs, sugar, water, etc., and condemned for looking in a cookbook to ascertain some method by which they might come together, some technique: an artistry. That two years of my four are spent in classes teaching me to cherry-pick quotes to support baseless, absurd theses, how to render and construe and augment the absurdity of said theses, how “context” “frames” “the lens” of the “text,” how to disbelieve everything, how to laugh at claims of absolute truth—at base, how to intimate, interpolate, and extrapolate—bothers me. We are taught to stir and pour when we are smart enough to bake. The details can be figured out along the way, with the help of a skilled, involved instructor.

En masse, we students are funneled toward one moment, a bolt to order the brain, but our handlers have failed in one capacity. Temple Grandin, a world-renowned animal-welfare and autism advocate, intuited that animals being led to slaughter know “what’s up”: they sense viscerally what lies around the next turn. At some basic level, they are aware of the machine’s cogs turning.

I demand the same sort of recognition from our professors for all UChicago students. We know what the curriculum is trying to do, and this self-consciousness hinders its ultimate transformative goals. Because of Grandin’s work, meat-processing facilities now implement long, undulating passageways through which cows blithely wander to their doom. They don’t need cattle prods. If I were unaware that just around the bend lay frustration, emptiness, and disappointment at the waste of my academic journey, I’d be less recalcitrant. I too would walk blithely. I’d be bovine. I am. But for now you’ll take me kicking and screaming, rhetorically of course.

On Social Networking and Personal Branding

I stopped tweeting last week. I did not, however, delete my Twitter account. What prompted me to do this was an interview on the radio program Fresh Air. The interviewee was a journalist who had just won a Pulitzer Prize for a series on distracted driving: how cell phones–talking, texting, and emailing–change the way we drive. The conversation drifted from the specific venue of technology consumption in cars to technology consumption writ large: how technology changes the way we think, how we live, and, in many instances, how it adversely affects our quality of life.

I forget which early psychologist twisted Aristotle’s assertion that we humans are rational beings by saying that we are rarely rational but possess an astonishing ability to rationalize our decisions ex post facto. Computers–holding onto the aura of seemingly magical productivity instilled by good IBM branding in the 1960s and 1970s–confer an “I’m getting things done here” message upon their users, and still do, even though what we use computers for of late can rarely be called productive.

My issue with Twitter in particular is that, when engaged in heavy use, I felt my brain quicken. It became volatile, and thoughts–instead of coming out nicely punctuated with a sort of droll, discursive spirit–came out in fits and starts, as if my noetic works were gummed up. I felt the refulgent glow of wit between my ears, but found upon further examination that this readiness to rhetorically lacerate a subject rendered itself, in practice, in spluttering 140-character syllogisms and overly precious plays on words–which I then proceeded to hashtag, so that when the meme went viral I’d be pegged as its progenitor.

#Hegelgasm: n. The palpable feeling of transcendence characterized by mental quietude and a glimmering warmth about the head and neck; likely a psychosomatic response to self-congratulatory notions of intellectual achievement resulting from successful implementation of dialectic thought processes.

I propose the following methodology for dealing with social networking utilities, especially if considering their use as a “personal-branding” tool: use Twitter and Facebook as native, parallel, back-end distribution channels for autogenous content. What this means, basically, is this: I know that the vast majority of my blog hits are linked from my Facebook page, and that Facebook qua Facebook and Twitter qua Twitter make for extremely disappointing personal-branding experiences. One must create more substantial content than 140-character witticisms, wacky status updates, or anemic attempts at “keeping in touch” to be considered valuable.

Whom do people respect more: the creators of content, or those who retweet or post shortlinks to it? Who is more valuable: the creators of content, or the disseminators of otherwise unread material? It’s a tough question, and a little too philosophical to get into here.

So, here’s my new Social Networking Paradigm in a nutshell: it is my belief that the greatest currently available tool for personal branding is the weblog. Little to no direct engagement with the various social networking utilities is required, as posts linking Facebook and Twitter users back to the blog are created endogenously the moment a blog post goes up. This directs more users to the blog, and it frees up otherwise wasted hours spent surfing Facebook to write better blog posts, read, engage face-to-face with friends, and re-learn deep concentration and other mental processes lost through interface with technology.
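For concreteness, here is a minimal sketch of what that endogenous cross-posting might look like, assuming the blog exposes an RSS feed. The feed URL and the announce() stub are hypothetical placeholders; actually pushing a link to Twitter or Facebook would require their authenticated API clients.

```python
# A minimal sketch of the blog-as-hub idea: poll the blog's RSS feed and
# announce each new post's title and link to downstream social channels.
import time
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "https://example-blog.com/feed.xml"  # hypothetical placeholder
seen_links = set()

def latest_entries(feed_xml):
    """Yield (title, link) pairs for every <item> in an RSS document."""
    root = ET.fromstring(feed_xml)
    for item in root.iter("item"):
        yield item.findtext("title"), item.findtext("link")

def announce(title, link):
    """Stub standing in for authenticated Twitter/Facebook API calls."""
    print(f"New post: {title} -> {link}")

if __name__ == "__main__":
    while True:
        with urllib.request.urlopen(FEED_URL) as response:
            for title, link in latest_entries(response.read()):
                if link not in seen_links:
                    seen_links.add(link)
                    announce(title, link)
        time.sleep(600)  # poll every ten minutes
```

The point of the sketch is the direction of flow: the blog is the canonical source, and the social networks are mere downstream subscribers.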

––––

Ultimately, I don’t want it to seem that I’m writing this blog only to build the brand of Jason D. Rowley. In fact, personal branding is the least of my reasons for writing The Halcyon Days; I view it as a good mental workout, as a peer-reviewed version of what I keep tucked away in a growing pile of Moleskine notebooks, and as a means of keeping my love of arcane verbiage at a manageable level.

If you’ll excuse me, I’ve got to check my Facebook and deny a random Foursquare request and retweet some nonsense and address my iPhone, which has been jittery with text messages for the past ten minutes. 

Now, how ridiculous did I sound?

In Defense of Fiction: On Nuance

It is in that moment, when muscles twitch slightly, a pang of synaptic excitement fires, and that initial ineffable fraction of a fraction of a second passes before one articulates the “huh, I never noticed that before,” that one experiences nuance.

I roll with a crowd that is somewhat slavishly devoted to reading. This is a good thing. With the exception of one of my friends, a man with whom I share my first name, the written matter they consume often takes the form of blog posts, magazine articles, and a preponderance of politics-, economics-, or entrepreneurship-focused books. All of the above are categorized broadly as nonfiction.

A close friend of mine was reading some of my old blog posts and came upon a very brief treatise on the subject of “nuance” nested within a recent post, On the Hating of Haters.

He shot me an email asking how one might gain a further appreciation of nuance, a question to which I responded with the following:

Short answer: read more fiction. Nonfiction writing is based on the premise of simplicity. The goal of a nonfiction book is to convey its thesis and supporting examples succinctly and efficiently. Nuance is often confused with extraneous detail–pertinent but largely insignificant to the larger argument–and is perceived as a distraction from the larger “point” of the book or article.

Fiction, on the other hand, usually lacks a hard thesis and thus doesn’t need to convey information efficiently. A fiction writer may have a didactic message to convey, but the modus operandi of fiction for the past 200-plus years has been to “show, not tell.” Fiction writers show readers what they want to convey: adjectives are not the enemy.

As for some greater utility to fiction, I can only cite my personal growth in emotional intelligence and writing ability, both of which I’ve utilized in too many venues to mention here.

While it might be true that a certain critical mass of factual material must be mastered to be a functional adult, the real marker of intellect is not an increased volume of grey matter, the computational, repository bundles of nerve cells in the brain, but the volume of white matter, the neuronal connections that allow the brain to connect disparate ideas and concepts, to render juxtapositions and combinations of ideas, things, and messages in “contrapuntal” (to use a musical term) harmony.

That so many “young people” eschew the creative, integrative mental processes of reading fiction bothers me. They believe vehemently that consuming a media diet consisting of nonfiction, whether it is news or biography or the pop-psych-sociology characteristic of Malcolm Gladwell et al., makes them, in some material way, more intelligent. I counter this assertion by positing the following: functional intelligence has very little to do with the amount of factual information one crams into one’s brain; rather, intelligence can be measured by one’s competence in dealing with complexity, and by the felicity with which one can deploy knowledge from disparate fields to address a given question or issue. From this integrative cerebral interconnectivity, not the mere aggregation and distillation of facts, “insight” is conjured.

Intelligence is gestalt: it is greater than the sum of its constituent parts; our knowledge is more than the words written and read, spoken and heard, created and consumed throughout the course of our lives. The margin of error in our accounting here is the resultant cognitive gain of interconnectedness. Moreover, one cannot hard-wire methodologies for interconnectedness, only the potential for it. Why is it so hard to develop artificial intelligence? Because its output is the product of false connections; the aforementioned insights turn ersatz when uniqueness is attempted algorithmically.

The reading of nonfiction is an exercise in consumption, while the reading of fiction is an exercise in creation. In that act of creation, described below, the brain comes out to play.

————————

A brief interpolation: have you ever noticed that on the bookshelves of the accomplished, one is likely to find a well-thumbed volume of Tolstoy or Austen or Huxley or Fitzgerald? These as opposed to Malcolm Gladwell or any other member of the quasi-intellectual class who pass off cant and cleverness as meaning during their TED talks. Why is it that, browsing the web histories of these same individuals, one might find a link to one of the New Yorker’s blogs or to Slate, as opposed to a link to Lifehacker?

————————

Writing fiction is difficult. The writer is tasked with creating a world with a degree of comforting verisimilitude, so that readers might suspend disbelief for a moment and forget that what’s going on in the plot is, ultimately, invented. This is the objective world of the novel or short story. It is confined in its scope to the meanings of chains of well-chosen words. It is flat, its descriptions arid; it is a world nonetheless.

There is a world nested within that created by the author. When one reads fiction, one not only consumes its storyline; one is forced to imagine the characters, to form mental pictures of them, perhaps to ascribe quirks and tics that the author failed to mention. Everything one pictures in this created world is highly personal. A beach umbrella mentioned in a story might evoke the one used on a seaside vacation as a child; in the mind’s eye, the brown bob of an old girlfriend’s hair might transpose itself onto the head of a particularly endearing female character.

The world created by a reader is personal; the density of analysis one could do within the confines of a book is virtually infinite. A book read well by a dozen people will offer a dozen interpretations; all are valid. All are subjective, and those who make claims to knowledge of an objective Truth divined from the text should be met with unflinching consternation. Their capital-T Truth is just as divined, as conjured, as pulled from the ether of their immemorial unconscious as the next person’s truth. Each person’s interpretation of a book is different, and one can read one’s interpretation as a reflection of oneself. The value of fiction as a tool for personal growth lies not in how the words on a particular page are articulated, nor in the wild proclamations made by those in literary analysis courses, but in what occurs “between the lines” and over the infinitesimal space between synapses.

————————

A didactic question: if nuance–intricacy–in fiction didn’t really matter, would we ever again, after viewing a film adaptation of a book, claim, “You know, man, I liked the movie, but the book… the book, like, had so much in it that the movie, you know, like, kinda left out. I dunno, but the book was just so, I dunno, man, what’s the word…” phantasmagorical in its breadth, depth, and detail? Yeah, man: phantasmagorical. Try typing that out in a text message.

————————

Nicholas Carr, in The Shallows: What the Internet Is Doing to Our Brains, his book-length rejoinder to his Atlantic article “Is Google Making Us Stupid?”, discusses the nature of the plasticity of the brain: mental and physical activities, done repetitively, literally change the wiring of the brain. New neuron channels are formed, making behaviors that were once difficult easier over time. Recollect, for a moment, the monumental task before young children learning to dress themselves: the relative conceptual ease of snaps and elastic bands versus the enormous difficulty of buttoning a shirt. Tying shoes presents a problem that initially appears Gordian in its intricacy. We learn. We don’t walk around naked. Not yet anyways.

The point being this: we live in a world driven by efficiency; we consume media that panders to the lowest common denominator of comprehension; and in a world so loaded with new information, new happenings, the new New New Thing (referring to the Michael Lewis book), Americans are using an ever-shrinking vocabulary to articulate what’s going on. (I don’t really want to say “we live in a world where…” anymore; it makes me sound cantankerous.) If one were to pull back from the goings-on in the world, one would realize that ours is an age of monodimensional blanket statements; things are articulated as black and white, righteous and “evil”, good and bad, immutable truth and rancorous bullshit. If it isn’t expressed plainly, elegantly, and matter-of-factly, it is taken as suspect. There are no freedom fighters, only terrorists; no moderate conservatives, only Randian, Bible-belting, free-market, ignorant, moose-hunting, racist, xenophobic, quasi-populist Sarah Palin devotees. The monodimensional sells itself on its simplicity, and to mitigate the stultifying nature of the monodimensional, in lieu of another dimension, carefully chosen language articulates itself in the spicy argot of over-dramatic extremism.

Ultimately, the point of Carr’s book is that the Internet has affected our brains neuronally: the neural pathways programmed for deep contemplation, concentration, creativity, and imagination have decayed in favor of those that allow us to consume and process huge quantities of information quickly (i.e., superficially). This is learned behavior. People who think simplistically implicitly allow themselves to do so.

Authors’ worlds are recognizable and engaging not because of the nouns they choose, but because of the adjectives and verbs implemented. These paint a mental picture, the resolution of which is wholly dependent on the specificity of the chosen words. An author mentions tiny details that spark within the reader, if only for a brief second, a “huh, I never noticed that before” moment. It is in that moment, when muscles twitch slightly, a pang of synaptic excitement fires, and that initial ineffable fraction of a fraction of a second passes before one articulates the “huh,” that one experiences nuance. It is for these brief moments of cerebral vivacity, in which I identify and solve simultaneously the trick the author is playing, catching, ex post facto, him or her rendering connections from disparity, that I read fiction. Many of my friends see the world as one that offers incredible opportunity, a view which I share; but unlike some of them, through years of reading too many works of fiction to ever count, I believe I’ve gained the ability to live fully in the moment, to notice what is ignored, to engage the world actively, to find the needle of interest in the great musty haystacks of the mundane, and, ultimately, to articulate to you, the reader, what I wish to articulate with a degree of exactitude I defy another 20-year-old kid with a blog to surpass. An appreciation for variegation, for tiny variations, for minuscule vacillations comes with one for vocabulary—said appreciation is my avocation. It is learned and adopted and eventually taken for granted. We are creatures of habit, after all.