When I was younger (around junior high age), my dad asked me what I wanted to be when I grew up. I told him that I wanted to be a writer. In characteristically honest fashion, he replied that he didn’t think I had the discipline to be a writer.
Needless to say, I took his opinion rather personally. Constant positive reinforcement and a national educational obsession with “self-esteem” had wrapped my fragile self-image around a belief that I could accomplish anything, and my placement into an advanced program in elementary school had not helped matters. My father’s honest statement did not square with my internalized self-image, one of limitless vistas of opportunity given to me by the grace of god. It was not until years later that I would truly grasp the concept of “entitlement,” and only relatively recently have I come to realize that nothing in this life is guaranteed (even if it theoretically should be). Thanks, Dad.
Beyond the cliché paternal obsession with structure and discipline, though, he had a valid point to make. At the time of our conversation, my grades were in the process of a slow, steady decline which would ultimately culminate in my failure to graduate from high school. Having recently discovered such hedonistic pursuits as videogames and cable television, my attention was drifting even further from my assignments, and the deck wasn’t exactly stacked in my favor to begin with: I’ve never had much focus, time management is a foreign concept to me, and my drive to complete assigned tasks has always been less than spectacular.
Given these immense academic handicaps, it’s a wonder that I managed to get into college at all. Granted, I possess some character attributes which are often confused with intelligence, but a very talented person can still be a bad student; likewise, an “untalented” (for lack of a better term) person can be a very good student. For many years, I watched with disdain as those I judged to be “untalented” succeeded again and again, despite natural deficiencies. And moreover, they seemed to be happier than me! “Ignorance is bliss,” I thought, dismissing their happy success as complacence. Somewhere inside, however, I knew that my disdain for my fellow human beings embodied something rotten at the core of my being.
Ironically, it is probably my disdain for my cohorts which had the greatest effect on my writing in my formative years. Very early in my education (roughly third or fourth grade), I grasped that my peers had specialized skill sets and attempted to exploit this difference by overemphasizing my natural aptitude for reading and writing; I read more than ever upon discovering my advantage, and took every opportunity to criticize my cohorts’ patently inexpert use of language.
Children are naturally ruthless, and our advanced program was no different. On the playground, we were constantly embroiled in skirmishes with the “normal kids,” no doubt because of the subtle insult to their intelligence suggested by our mere presence. In the classroom, we were all too aware that our IQ test scores were supposedly higher than everyone else’s in the school system, but among us, whose was the highest? Even our interpersonal relations were predicated upon an “us vs. them” mentality which necessitated association by virtue of our mutual exclusion from “normal society.”
This perceptible internecine conflict was an excellent motivator (as it is in the professional world), and of those I still know from that group, I think almost all have demonstrated a superior will to outperform others regardless of individual circumstances—Nietzsche’s Übermensch incarnate. In my experience, this “will to power” (if I may borrow another Nietzschean term) is a much more substantial advantage over others than any natural talent.
A strong will is an admirable quality in many ways, especially in a world where nobody cares about you, but it is not a trait which is inherent to my character. To be blunt, I just don’t give a shit. I have no stake in proving my superiority to others (perhaps because it was assumed at first), and I have no interest in succeeding by the same means by which everyone else measures success. This is the quality my dad identified when he said that I didn’t have the discipline to be a writer.
In many ways, he was correct. Maybe I will never amount to much because I don’t have the drive to succeed, the imperative for excellence that “successful” people seem to possess. Perhaps my quixotic political and philosophical ideals are simply ex post facto justifications for my inherent laziness.
If this is the case, so be it. Regardless, I shall attempt to justify my apparent lack of success by critiquing the means by which success is traditionally judged. Unless our social system is the best possible, it should be considered the worst presently possible. For we can (and do) imagine a personally preferable alternative universe, yet are constantly denied the means by which to achieve it. And if a society is not attendant to the needs of its individual members, but instead makes demands upon them, then it is not a society at all, but a hierarchy: an artificial order imposed by convenience rather than reason. My belief in the indefensible inadequacy of our current social and institutional structures is what motivates me to write today, and frankly, I can’t think of a motivation which could be more honest or compelling. Ars gratia artis is a vapid excuse for self-expression, and this sentiment ignores the kernel of ideology at the core of every expressive exercise—the substance is already present in the form, whether we want it to be or not.
My public work has been described to me variously as “angry,” “condescending,” and “difficult to read.” Not altogether inaccurate, but not altogether accurate, either. I am angry about the way things are; I am not angry with my readers. I am all too familiar with the philosophy of the status quo, and quite impatient with the eternal recurrence of its proponents’ beliefs; I am not condescending to my readers, but instead to hackneyed arguments forever repeated in the face of novel ideas. I am a philosopher, and believe that well-composed sentences form well-considered thoughts; I am not a poet, and though I might sometimes use rhymes, assonance, and other literary tools to communicate my point, my first consideration, in writing, is always the logical and empirical accuracy of my statements and thoughts—though they may occasionally be longer and more complicated, grammatically and lexically, than is nowadays customary (like this sentence).
Writing is more than an exercise in language, it is an exercise in thought, and the Sapir-Whorf hypothesis should inform us that language matters, regardless of its complex relationship to our underlying thought processes—indeed, Noam Chomsky suggested in 1975 that the deep structure of our linguistic processes is probably intimately tied to the functional operations of other mental processes. In fact, barring a revolution in the practical application of logic, language forms the only comprehensive and intuitive rational framework by which we are able to make sense of the world. Importantly, while we are designed to use it as a tool for thought, we cannot escape the fetters that language naturally imposes, as is demonstrable by the dead-end of the philosophy of metaphysics, which, as Russell indicated repeatedly in his History of Western Philosophy, results from confusing the working of the world with the working of language.
The overwhelming abundance of details in our present-day existence would seem to require a more complex and precise use of language than was previously necessary, while also demanding a streamlined form of language appropriate to new communicative mediums, like the “txt” speech which has accompanied the rise of internet messaging and cell phones. “Txt” is a peculiarly prominent example of the increasingly technologically deterministic background of our lives. Society acts like a normalizing buffer between us and the outside world, with social conventions serving to stabilize the often-rapid variations in our environment and circumstances. But, as technology’s cumulative effects on the world have increased, it has become more and more indistinguishable, adaptationally, from natural phenomena. This should not be cause for alarm, in and of itself, but it should present an interesting avenue for analysis regarding pervasive maladaptive social characteristics, like depression or violent tendencies—which today are often treated on an individual basis with drugs and therapy, but may indeed simply be symptoms of larger underlying social conflicts and failures, rather than, as seems to be tacitly accepted, individual disorders within a mostly well-adapted society.
In fact, society’s unquestioned assumptions and effects can often hinder individual correction of perceived problems. There are some who would suggest that our culturally endemic “common sense” is often the cause of many of our personal woes: in a similar vein to Alfred Korzybski’s “General Semantics,” Albert Ellis’ Rational Emotive Behavior Therapy assumes at its base that our reactions to events are filtered through our preexisting set of beliefs. This implies that our undesirable reactive emotional states are, at least to some extent, controllable by means of rational examination and correction. The application of this framework on an individual psychotherapeutic basis, however, fails to correct the root of the problem: a nearly universal reliance upon primitive belief systems which have outlived their usefulness and persist by mere virtue of historical inertia. If irrationality breeds discomfort and discord—a nearly undeniable premise—then it behooves us to remove this infernal monkey from our backs and continue without its incessant unproductive (or counterproductive) distractions.
If we, as individuals or a society, are to live in a world of our choosing, rather than one of inevitability, it is imperative that we first practice clear and rational communication, the basis of clear and rational thought. And never has the need for (and lack of) clear and rational communication been more apparent than in our times. Texting, tweeting, and Instagram will suffice as abbreviated direct communication tools, with obvious limitations upon their potential for semantic comprehensiveness or philosophical depth. Advertisements, movies, and television episodes will fit the formulas we have come to expect, as the very well-established catalog of formula-writing standbys on TVTropes.org will attest. Music will serve to satisfy some deeper longing that is beyond compare, but will continue to suffer from the same limitations as texting and tweeting, as far as complex rational communication is concerned. Internet services such as YouTube, which provide easy access to the tools of multimedia content production, have probably the most potential for refinement and revision of the modern mass media and educational processes, but one’s ability to retain information from audio-visual media is naturally limited in scope, detail, and duration. Therefore, writing, and in particular free-form and lengthy writing, is still in an unassailable position to provide precisely the sort of technical, scientific, and philosophically complex information which is most desperately needed in a world with more information about more theoretical and practical disciplines than ever before.
This is the burden that today’s writers (among others) should recognize: the burden of meaning has surpassed the conventional, the traditional, and moved into the realm of the philosophical and the scientific. If one is to be more than a “hack” or “sell-out,” one must be more than just literate, one must be a master of multitudinous branches of knowledge. And writing (among other things) must be considered a means, not an end in itself.
Unfortunately, the vocational, specialized emphasis that our current economy places on profession can easily mislead one into believing that one can be a “writer,” as I once did, but this is false. One can no more be a “professional writer” than a “professional reader.” Indeed, many write and read extensively for their jobs, such as professors and lawyers, but these people would not call themselves “writers.” Beware those who would do so, for “writing” as an end in itself is like driving as an end in itself: it perverts the original purpose of the technology for selfish, gratuitous reasons. And, even if one ends up somewhere else, there can be only one final destination: the starting point. To stay at any other destination would make no sense without an actual preexisting reason, desire, or imperative to go somewhere. Most are not content to endlessly wander, lost and directionless, and for good reason. Thus we make maps and pick our destinations with some discretion. But then again, the prevalence of alcohol and drug abuse among famous writers (and other artists/creators) might help to explain their apparent contentedness with what is, in many instances, a fruitless and forgetful meandering toward an exclusively personal oblivion.
In any case, to be a professional “writer” these days, whether it is ad copy, TV screenplays, or horror novels, has become a Sisyphean mockery of a task, as many professions have. We read too much in an average day (even those who aren’t “readers” must read signs, ads, forms, etc. every day) to believe anymore that writers have any special capabilities beyond the rest of us—just more free time. As such, if writers do not provide some degree of substance beyond the form(at) of the medium, e.g. ideological interests or technical tidbits, they will do nothing to redeem themselves or their ostensible profession.
Russell, History of Western Philosophy, p. 202: “Substance, in a word, is a metaphysical mistake, due to transference to the world-structure of the structure of sentences composed of a subject and a predicate.”