A Few Things I Learned in 2014

Note: This is a cross-post from my Medium blog, originally published in January 2015. I’ve included it here on my website to kick off the new blog.

I’d originally intended to share this list of bullet points with a friend. Some are general, some are personal. A few are quite revealing, but I believe that personal faults and foibles are rendered less significant if they’re openly shared.

Nothing to hide, nothing to fear, right?

I’m sharing this list mostly because I think some of the things I learned are potentially valuable and applicable to other people.* (See note at end of post.)

Here are a few things I learned in 2014:

  • Some people have a tendency to be trusting and generous to a fault, and they let other people take advantage of that because they aren’t sufficiently self-confident to stand up for what’s best for them.
  • Corollary: Some people (including me) have a hard time saying no to things.
  • People (especially in startup culture) tend to overestimate and overvalue people with well-developed hard skills, whereas it’s the people with well-developed soft skills (communication, intuition, aligning ideals with action, etc.) who are often more effective and productive. (And it’s those with both hard and soft skills who rule the world.) [Edit: this is to say that, all things being equal, a brilliant designer with limited communication skills will not fare as well as a merely good designer with good communication skills.]
  • Although some people are fairly smart, they can also be intellectually insecure. This insecurity compels them to keep learning about new and emerging fields and developing hard skills, but the same drive sometimes hinders their ability to seize opportunities as they’re presented. (I’ve dealt with this too.)
  • Finding partners who mesh well with you, personally and culturally, is more important than finding the most talented person to build something with you.
  • The emails I write are often too long.
  • Unsexy but easily-executable projects with a high likelihood of success are often a better time investment than ambitious, difficult ones with a greater risk of failure.
  • It really is all about people. This is a tired cliché, especially in entrepreneurship circles, but it’s true. Ideas are cheap. Money can come from (almost) anywhere. Execution reigns supreme, and the only thing that can execute is a good group of people.
  • Despite a lot of irksome elements in contemporary startup culture, it does you no favors to shun it, thinking you’re better off on your own. (This hasn’t been a real problem for me, but I’ve observed it in others.) That being said, it’s okay to laugh about it sometimes with people who also think it’s kind of ridiculous.
  • Friends = Family.
  • Volunteering for political campaigns is fun and educational. You learn a lot about your country by knocking on its doors.
  • Connecting and helping people make deals is incredibly fun and is probably what I’d like to spend my life doing.
  • That, and writing. I used to write so much for public consumption, back in my college days. I don’t know why I stopped publishing stuff, because I really liked it. I’d developed a (very) small cult following that liked what I wrote, and to be honest I really miss having that community of readers. I’m making it a point to write and publish a lot more in 2015.
  • It’s easier to change a system from within it.

So, that’s about it. I hope you found parts of this post valuable. If so, I’d love to hear about it in the comments. If not, but you scrolled down to the bottom to see the little note at the end, I hope that doesn’t leave you disappointed either.

Happy New Year! May your 2015 be better than your 2014.

Yet Another Preview of “Why You Should Date a Man Who Reads”

Not only is this excerpt longer than the previous one, it is a scanned image from my notebook, which, if I’m not terribly mistaken, makes this a more authentic reading experience. This might also tip my hand a little bit, revealing that I am, in certain key capacities, very much like the reading man I advised readers, female and male, not to date. Except, as I put it so eloquently to a friend over decaf coffee, “I am kind of like the man who reads from that piece, but I am not a dick.”

[Scanned image of a notebook page]

Ever the hipster, I write only in Moleskines

I’ll leave it at that. And, when reading, please forgive the poor punctuation, the illegible ‘f’ in the sixth line, and the somewhat purple haze hanging over the whole thing.


On Realism and Graduate School Applications

I applied to the University of Chicago’s Master’s Program in the Social Sciences and was denied admission summarily, without review of my application. This was, to a certain extent, expected: I am a third-year undergraduate. I spoke with one of my professors, a well-known political scientist at the U of C, about my denial from the master’s program. His advice:

“Here is the reason why you were denied: you posed an existential threat to the established system. Now, I know how tempting it is to make the argument that rules were made to be broken, that there are exceptions to expectations, but I implore you to evaluate the implications of your actions had you been successful. You would have turned over an entire institution, one predicated on a sequential acquisition of credentials. You don’t have to sell me on the fact that some undergraduates are more intelligent than graduate students; I’m trying, here, to sell you on the structural realist argument that how smart you think you are, or whatever intelligence you might exhibit–none of that matters. Okay? Do you see what I’m getting at here? Your actions, their potential outcomes, are defined through systemic constraints–you could’ve been omniscient, for Christ’s sake, but because you don’t meet their parameters for admission–acquisition of a bachelor’s being one of them–they won’t accept you. My best recommendation to you: game the system. Expose it, too. The College needs some shaking up.”

On Social Networking and Personal Branding

I stopped tweeting last week. I did not, however, delete my Twitter account. What prompted me to do this was an interview on a radio program, Fresh Air. Interviewed was a journalist who had just won a Pulitzer Prize for a series he did on distracted driving: how cell phones (talking, texting, and emailing) change the way we drive. The conversation drifted from the specific venue of technology consumption in cars to technology consumption writ large, and to how technology changes the way we think, how we live, and, in many instances, adversely affects our quality of life.

I forget which early psychologist twisted Aristotle’s assertion that we humans are rational beings by saying that we are rarely rational but possess an astonishing ability to rationalize our decisions ex post facto. Computers, holding onto the aura of seemingly magical productivity instilled by good IBM branding in the 1960s and 1970s, confer an “I’m getting things done here” message upon their users, and still do, even though what we use computers for as of late can rarely be called productive.

My issue with Twitter in particular is that, when engaged in heavy use, I feel my brain quicken. It becomes volatile, and thoughts, instead of coming out nicely punctuated with a sort of droll, discursive spirit, come out in fits and starts, as if my noetic works are gummed up. I would feel the refulgent glow of wit between my ears, but find upon further examination that this readiness to rhetorically lacerate a subject rendered itself, in practice, in spluttering 140-character syllogisms and overly precious plays on words, which I then proceeded to hashtag so that, when the meme went viral, I’d be pegged as its progenitor.

#Hegelgasm: n. The palpable feeling of transcendence characterized by mental quietude and a glimmering warmth about the head and neck; likely a psychosomatic response to self-congratulatory notions of intellectual achievement resulting from successful implementation of dialectic thought processes.

I propose the following methodology for dealing with social networking utilities, especially if considering their use as a “personal-branding” tool: use Twitter and Facebook as native, parallel, back-end distribution channels for autogenous content. In practice, this means recognizing that the vast majority of my blog hits are linked from my Facebook page, and that Facebook qua Facebook and Twitter qua Twitter make for extremely disappointing personal-branding experiences. One must create more substantial content than 140-character witticisms, or wacky status updates, or anemic attempts at “keeping in touch” to be considered valuable.

Whom do people respect more: the creators of content, or those who retweet and post shortlinks to it? Who is more valuable: the creators of content, or the disseminators of otherwise unread material? It’s a tough question, and a little too philosophical to get into here.

So, here’s my new Social Networking Paradigm in a nutshell: I believe the greatest currently available tool for personal branding is the weblog. Little to no direct engagement with the various social networking utilities is required, as the posts that reach Facebook and Twitter users are created endogenously, at the same time a blog post goes up. This directs more users to the blog, and frees up otherwise wasted hours spent surfing Facebook to write better blog posts, read, engage face-to-face with friends, and re-learn deep concentration and other mental processes lost through interface with technology.
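
For the technically inclined, here is a minimal sketch of what that endogenous cross-posting might look like, at least on the Twitter side. To be clear: this is a hypothetical illustration rather than my actual setup. It assumes the blog exposes an RSS feed at a placeholder URL and leans on the third-party feedparser and tweepy libraries, with placeholder credentials throughout; the Facebook half would work analogously.

```python
# Hypothetical sketch of blog-first syndication: the blog remains the
# canonical home for content, and Twitter only receives a pointer to it.
# Assumes an RSS feed at a placeholder URL and placeholder Twitter
# credentials; requires `pip install feedparser tweepy`.
import feedparser
import tweepy

FEED_URL = "https://example.com/feed"  # placeholder feed URL


def latest_post(feed_url: str) -> tuple[str, str]:
    """Return the (title, link) of the newest entry in the blog's feed."""
    feed = feedparser.parse(feed_url)
    entry = feed.entries[0]  # most feeds list the newest entry first
    return entry.title, entry.link


def announce(title: str, link: str, client: tweepy.Client) -> None:
    """Tweet a link to the post, rather than tweeting as an end in itself."""
    client.create_tweet(text=f"New post: {title} {link}")


if __name__ == "__main__":
    title, link = latest_post(FEED_URL)
    client = tweepy.Client(
        consumer_key="...",  # placeholder credentials
        consumer_secret="...",
        access_token="...",
        access_token_secret="...",
    )
    announce(title, link, client)
```

A real version would remember which links it had already announced (a small local file would do) and run on a schedule, but the shape is the point: the blog is the source, and the networks are mere plumbing.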

––––

Ultimately, I don’t want it to seem that I’m writing this blog only to build the brand of Jason D. Rowley. In fact, personal branding is the least of my reasons for writing The Halcyon Days; I view it as a good mental workout, as a peer-reviewed version of what I keep tucked away in a growing pile of Moleskine notebooks, and as a way of keeping my love of arcane verbiage at a manageable level.

If you’ll excuse me, I’ve got to check my Facebook and deny a random Foursquare request and retweet some nonsense and address my iPhone, which has been jittery with text messages for the past ten minutes. 

Now, how ridiculous did I sound?

In Defense of Fiction: On Nuance

It is in that moment, when muscles twitch slightly with a pang of synaptic excitement, in that initial ineffable fraction of a fraction of a second before one articulates the “huh, I never noticed that before,” that one experiences nuance.

I roll with a crowd that is somewhat slavishly devoted to reading. This is a good thing. With the exception of one of my friends, a man with whom I share my first name, the written matter they consume often takes the form of blog posts, magazine articles, and a preponderance of politics-, economics-, or entrepreneurship-focused books. All of the above are categorized broadly as nonfiction.

A close friend of mine was reading some of my old blog posts and came upon a very brief treatise on the subject of “nuance” nested within a recent post, On the Hating of Haters.

He shot me an email asking me how one might gain a further appreciation of nuance, a question to which I responded with the following:

Short answer: read more fiction. Nonfiction writing is based on the premise of simplicity. The goal of a nonfiction book is to convey its thesis and supporting examples succinctly and efficiently. Nuance is often confused with extraneous detail, pertinent but largely insignificant to the larger argument; it is perceived to be a distraction from the larger “point” of the book or article.

Fiction, on the other hand, usually lacks a hard thesis and thus doesn’t need to convey information efficiently. A fiction writer may have a didactic message to convey, but the modus operandi of fiction for the past 200-plus years has been to “show, not tell.” Fiction writers show readers what they want to convey: adjectives are not the enemy.

As for some greater utility to fiction, I can only cite my personal growth in emotional intelligence and writing ability, both of which I’ve utilized in too many venues to mention here.

While it might be true that a certain critical mass of factual material must be mastered to be a functional adult, the real marker of intellect is not an increased volume of grey matter, the computational, repository bundles of nerve cells in the brain, but the volume of white matter, the neuronal connections that allow the brain to connect disparate ideas and concepts, to render juxtapositions and combinations of ideas, things, and messages in “contrapuntal” (to use a musical term) harmony.

That so many “young people” eschew the creative, integrative mental processes of reading fiction bothers me. They believe vehemently that consuming a media diet consisting of nonfiction, whether it is news or biography or the pop-psych-sociology characteristic of Malcolm Gladwell et al., makes them, in some material way, more intelligent. I counter this assertion by positing the following: functional intelligence has very little to do with the amount of factual information one crams into one’s brain; rather, intelligence can be measured by one’s competence in dealing with complexity, and by the felicity with which one can deploy knowledge from disparate fields to address a given question or issue. It is from this integrative cerebral interconnectivity, not the mere aggregation and distillation of facts, that “insight” is conjured.

Intelligence is gestalt: it is greater than the sum of its constituent parts; our knowledge is more than the words written and read, spoken and heard, created and consumed over the course of our lives. The margin of error in our accounting here is the resultant cognitive gain of interconnectedness. Moreover, one cannot hard-wire methodologies for interconnectedness, only the potential for it. Why is it so hard to develop artificial intelligence? Because the resulting output is the product of false connections; the aforementioned insights are ersatz when uniqueness is attempted algorithmically.

The reading of nonfiction is an exercise in consumption, while the reading of fiction is an exercise in creation. In the act of creation, as described below, the brain comes into play.

————————

A brief interpolation: have you ever noticed that on the bookshelves of the accomplished, one is likely to find a well-thumbed volume of Tolstoy or Austen or Huxley or Fitzgerald? These as opposed to Malcolm Gladwell or any other member of the quasi-intellectual class who pass off cant and cleverness as meaning during their TED talks? Why is it that one might find, browsing the web histories of these same individuals, a link to one of the New Yorker’s blogs or to Slate, as opposed to a link to Lifehacker?

————————

Writing fiction is difficult. The writer is tasked with creating a world with a degree of comforting verisimilitude, so that readers might suspend disbelief for a moment and forget that what’s going on in the plot is, ultimately, invented. This is the objective world of the novel or short story. It is confined in its scope to the meanings of chains of well-chosen words. It is flat, its descriptions arid; it is a world nonetheless.

There is a second world nested within the one created by the author. When one reads fiction, one not only consumes its storyline; one is forced to imagine the characters, to form mental pictures of them, perhaps to ascribe quirks and tics that the author never mentioned. Everything one pictures in this created world is highly personal. A beach umbrella mentioned in a story might evoke the one used on a seaside vacation as a child; in the mind’s eye, the brown bob of an old girlfriend’s hair might transpose itself onto the head of a particularly endearing female character. The world created by a reader is personal; the density of analysis one could do within the confines of a book is virtually infinite. A book read well by a dozen people will offer a dozen interpretations; all are valid. All are subjective, and those who claim knowledge of an objective Truth divined from the text should be met with unflinching consternation. Their capital-T Truth is just as divined, as conjured, as pulled from the ether of their immemorial unconscious as the next person’s truth. Each person’s interpretation of a book is different, and one can read one’s interpretation as a reflection of oneself. The value of fiction as a tool for personal growth lies not in how the words on a particular page are articulated, or in the wild proclamations made by those in literary analysis courses, but in what occurs “between the lines” and over the infinitesimal space between synapses.

————————

A didactic question: if nuance, intricacy, in fiction didn’t really matter, would we never again, after viewing a film adaptation of a book, claim, “You know, man, I liked the movie, but the book… the book, like, had so much in it that the movie, you know, like, kinda left out. I dunno, but the book was just so, I dunno, man, what’s the word…” phantasmagorical in its breadth, depth, and detail? Yeah, man: phantasmagorical. Try typing that out in a text message.

————————

Nicholas Carr, in The Shallows: What the Internet Is Doing to Our Brains, his book-length rejoinder to his Atlantic article “Is Google Making Us Stupid?”, discusses the plasticity of the brain: mental and physical activities, done repetitively, literally change the wiring of the brain. New neuronal channels are formed, making behaviors that were once difficult easier over time. Recollect, for a moment, the monumental task before young children learning to dress themselves: the relative conceptual ease of snaps and elastic bands versus the enormous difficulty of buttoning a shirt. Tying shoes presents a problem that initially appears Gordian in its intricacy. We learn. We don’t walk around naked. Not yet, anyway.

The point being this: we live in a world driven by efficiency, we consume media that panders to the lowest common denominator of comprehension, and in a world so loaded with new information, new happenings, the new New New Thing (referring to the Michael Lewis book), Americans are using an ever-shrinking vocabulary to articulate what’s going on. (I don’t really want to say “we live in a world where…” anymore; it makes me sound cantankerous.) If one were to pull back from the goings-on in the world, one would realize that ours is an age of monodimensional blanket statements; things are articulated as black and white, righteous and “evil,” good and bad, immutable truth and rancorous bullshit. If something isn’t expressed plainly, elegantly, and matter-of-factly, it is taken as suspect. There are no freedom fighters, only terrorists; no moderate conservatives, only Randian, Bible-belting, free-market, ignorant, moose-hunting, racist, xenophobic, quasi-populist Sarah Palin devotees. The monodimensional sells itself on its simplicity, and to mitigate its stultifying nature, in lieu of another dimension, carefully chosen language articulates itself in the spicy argot of overdramatic extremism.

Ultimately, Carr’s point is that the Internet has affected our brains neuronally: the neural pathways programmed for deep contemplation, concentration, creativity, and imagination have decayed in favor of those that allow us to consume and process huge quantities of information quickly (i.e., superficially). But this is learned behavior: people who think simplistically implicitly allow themselves to do so.

Authors’ worlds are recognizable and engaging not because of the nouns they choose, but because of the adjectives and verbs; these paint a mental picture, the resolution of which is wholly dependent on the specificity of the chosen words. An author mentions tiny details that spark within the reader, if only for a brief second, a “huh, I never noticed that before” moment. It is in that moment, when muscles twitch slightly with a pang of synaptic excitement, in that initial ineffable fraction of a fraction of a second before one articulates the “huh,” that one experiences nuance. It is for these brief moments of cerebral vivacity, in which I identify and solve simultaneously the trick the author is playing, catching, ex post facto, his or her rendering of connections from disparity, that I read fiction.

Many of my friends see the world as one that offers incredible opportunity, a view I share; but unlike some of them, through years of reading too many works of fiction to ever count, I believe I’ve gained the ability to live fully in the moment, to notice what is ignored, to engage the world actively, to find the needle of interest in the great musty haystacks of the mundane, and, ultimately, to articulate to you, the reader, what I wish to articulate with a degree of exactitude I defy another 20-year-old kid with a blog to surpass. An appreciation for variegation, for tiny variations, for minuscule vacillations, comes with one for vocabulary; said appreciation is my avocation. It is learned and adopted and eventually taken for granted. We are creatures of habit, after all.

My Favorite Books of the Summer Thus Far, pt. 1

For some, summer is a time to take on an internship, to travel, to spend time with friends and (if so inclined) family. For me, summer is about reading the books that the University of Chicago’s rather absurdly demanding curriculum precludes me from reading during the year. This is not to say that I don’t read during the academic year; it’s just that I’m not afforded the opportunity to read as voluminously as I’m otherwise inclined to.

Below is a list (complete with critical blurbs) of two of my favorites from the summer thus far:

Gain, by Richard Powers. 355 pp. Picador USA, 1998

Gain is the story of a woman and a company; it is one of growth and alienation. It is cold, corporate, and clinical. The novel traces the development of a fictitious company, Clare, from its humble beginnings as a soap and candle manufacturer and importer to its pervasive, expansive, Unilever-esque logical conclusion. Intertwined with this is the narrative of a woman, Laura Bodey, and her daily life in the bucolic exurban town of Lacewood, IL, which just so happens to be the home of a Clare manufacturing facility. As the twin plot lines unfold and spin themselves together, the reader notices that with the growth of one comes the decline of the other. Gain gives hints of genuine emotion, and there are moments of palpable nostalgia and sentimentality (especially in the development of Clare), but the reader is left with a feeling of corporate detachment. A poignant commentary on corporate personhood, the ethics of business development, the alienating power of marketing, rural existence, and physical and psychological decline, Gain is a must-read for those interested in ethics, entrepreneurship, or the lexical workings of its “Genius Grant”-recipient author, Powers.

This Side of Paradise, by F. Scott Fitzgerald. 248 pp. Library of America, 2000

An ebullient, skittering mess of a book, This Side of Paradise is the perfect novel for those who don’t read books (i.e., young people nowadays). Although most young people are first exposed to Fitzgerald in high school with the reading of The Great Gatsby, this novel, published when its author was just twenty-three years old, provides a more applicable and personal meditation on the innocent pleasures of blasé youth, and the deep, adult pleasures of shedding that innocence piece by piece. The extent to which This Side of Paradise is thinly veiled memoir is unclear, though it must be said that the resemblance between Fitzgerald and TSOP’s protagonist, Amory Blaine, is striking. Tracing Amory’s personal and intellectual growth from impetuous and pompous youth, into literary snobbery during his first two years at Princeton, into a real hurricane of a dalliance with a young debutante, into slow decay, into post-college alcoholic turpitude, and finally into what now might be referred to as indignant “adulthood,” TSOP sloshes about from one scene to another, mixing poetry, epistolary narrative, a peculiar quasi-drama in the form of a script, and careful reference to the dead white males that formed the backbone of the Jazz Age’s literary canon. It left me with a sense of giddy self-recognition. I, like most young men, am, in whole or in part, Amory Blaine: restless, reluctantly realistic, wistful, and itching for success, for the actualization of my closely held, overinflated notions of greatness. With this recognition comes the following fear: that I, like Amory, might eventually come upon realism and, with it, resignation to the path more traveled.

Are you too smart for college?

What I believe the value of a “college education” to be is the following: the formalistic academic environment provided by our nation’s colleges and universities gives young people the framework (the papers, the reading assignments, the problem sets, etc.) to undertake the rather formidable task of consuming and digesting giant quantities of information and, hopefully, articulating it when the time comes for an exam or term paper.

Over the course of the past week, as I begin to say goodbyes and good lucks to my friends graduating from the University of Chicago, I’ve been doing some thinking. Why am I going through the process of “getting a college education”? That question, I suppose, can be rephrased as: “What is a ‘college education,’ what’s so important about it, and why do I have to pay so much for one when all I’m doing is writing papers and reading books?” This quickly degenerates into an eggheaded discussion about what, ontologically, “education” is. I am not here to have that discussion, nor do I want to have it. Ever again.

What I believe the value of a “college education” to be is the following: the formalistic academic environment provided by our nation’s colleges and universities gives young people the framework (the papers, the reading assignments, the problem sets, etc.) to undertake the rather formidable task of consuming and digesting giant quantities of information and, hopefully, articulating it when the time comes for an exam or term paper. It is assumed by unwitting and idealistic faculty that “critical thinking” skills, among others, are picked up along the way. However, and I am not the first to say it, the internet and its attendant social networks and carefully hidden pockets of clandestine information have fundamentally changed the way my classmates and I undertake the learning process; to wit, it is easier to get academic work “out of the way” without much intellectual effort in order to develop other projects. In short, we’ve hacked college. Most of us “get it done” not for its own sake (“to learn and to grow”) but to GTFO, so to speak, and, as one of my fellow economics majors so eloquently put it, “make shit-tons of money.”

If there is one thing that this year has taught me, it is that motivation comes in two flavors: one pursues a goal either as a means to some other end or as an end in itself. I, personally, have been straddling both sides of this duality, but have lately firmly decided that I only get to “do” college once, and thus I will devote myself as fully to the academic portion of it as possible. However, it seems that the practical focus of the “modern college experience,” building a social network and padding a résumé, has become, effectively and convincingly, the cynosure for business-minded students, even at the ferociously eggheaded UChicago.

Consider the following conjecture: if you are the entrepreneurial type, the type who wants to get out there and get something started, or if you believe that academia is holding you back from what you want to do, take it from someone who’s read way too much in his life: Aristotle, Nietzsche, Adam Smith, and Karl Marx won’t make you successful. They will, however, help you find significance in your accomplishments. If you’re the retrospective type, one who cerebrates post hoc, you’ll be able to educate yourself later, not because society is telling you to, but because you can approach that process with the same zeal with which you approach your current projects. If you have all of these great world-changing ideas, the wherewithal to see them through to execution, and a willingness to forgo the short-term social cachet of a college degree, then find an experienced mentor or two, build your network, and get cracking.

Parents often counter their college-bound teens’ assertions that many of today’s most prosperous companies were started by college dropouts with the fact that those founders were smart enough to get into college in the first place. Bill Gates was also smart enough to realize that if he didn’t start Microsoft, somebody else would.

I’m not making any claims to being too smart for college. I enjoy academic life, and I want the social validation of a degree. I am, in fact, too cowardly to take the plunge into starting my projects in earnest now; few are willing to make that leap, and that’s the point. That said, I am currently researching options for a gap year.