Saturday, November 30, 2013


I am happy to have managed to make a post to this blog every day this month.  It does not equal the accomplishment of those who have successfully completed NaNoWriMo, certainly, but it is a far cry from where I have been in my own writing.  It gives me the hope that I might someday be able to do another extended writing project--which is good since, as a scholar in the humanities, I am expected to do quite a few such projects throughout the next forty or fifty years (if I should live so long).

I have been aiming at prose pieces of approximately 500 words in this blog (along with a sprinkling of amateur poetry of varying length).  I know that I have not always succeeded at providing so much; a couple of posts in the middle of the month fell far short of my expectations.  Some of my posts, however, have been a fair bit longer; I tend to run off at the mouth with things about which I am passionate, and the tendency follows me into writing.  Still, I think I came out somewhere near 500 words on average (I have not done the word counts to be sure), so I am content with the performance.

Writing is a skill set, and, like any such thing, it requires practice to develop proficiency and, perhaps, excellence.  Part of what I have been trying to do in this blog is habituate myself to the practice of writing.  My current pace is a good one, I think; it is worth maintaining for a time, until it becomes easy to do (it is not terribly difficult now).  Once it does, I will see about increasing my rate of production; 750 words seems a good next benchmark.

Advancement by 250 words will seem familiar to some; writing assignments typically operate in such numbers.  It has to do with typewriters, I think, whose text did tend to yield 250 words to the page.  Another 250 words meant another page or so, and offered an easy means of assessment; insufficient pages, insufficient content, no need to bother.  Things have changed, of course; "regular" formatting for work usually yields something like 325 words to the page.  But the older assessment patterns still persist (sometimes to the detriment of the students, who follow word count rather than page length and so suffer for not following directions).

That I appear ready to follow them as I advance in my writing marks me as embedded in the older practices, another way in which it can be argued that I participate in structures of oppression (because, of course, as one of those pointy-headed folks, I am concerned with the indoctrination of the youth); the Zawacki bit comes to mind once again for some reason.  I try to transmit better and more consistent information to my students, certainly, but I still follow the basic patterns.  And I discuss them fairly frequently here, as I will probably continue to do as I continue to work on how well and how diligently I write.

Friday, November 29, 2013


Neither knowledge nor understanding
Is a one-time thing.
Both must be courted

Neither knowledge nor understanding
Is a particularly shy beloved.
Both respond willingly
To diligent pursuit.

Neither knowledge nor understanding
Is a thing free with favors.
Both require demonstrations
Of passionate ardor.

I have pursued both for long,
Given much of my life
Built my identity
Focused my being
On consummation with them.
I know that I am not their first.
I know that I am not their only.
I know that even now, I share them with others,
And they with me.

I do not think that any of us use protection.
The sensation is too good to suffer such interference.
(Although it occurs to me that
Decorum insists
That such activities be done in private;
The ivory tower isolation
Makes more sense now.)

Is it any wonder
That the love of knowledge and of understanding
Leaves so many children behind,
Perhaps more fair than Bradstreet's ill-formed offspring,
Perhaps not,
But still the fruits
Of labor like that of loins
Though situated higher in the body?

We are all of us
Pumping away
Grinding away
Milking for all the worth that can be found
In an orgiastic frenzy
All in our heads.

Thursday, November 28, 2013


For folks in the United States, it is once again Thanksgiving, something I have been discussing off and on over the last few days.  (Those outside the US, have a good day.)  I have already seen posts to social media feeds articulating thanks for things, and I have seen others that mark the day as National Genocide Day, a day solemnizing the near-total destruction of First Nations populations across centuries.  And I cannot say that either set is wrong.

(Of course, there is always the issue of my exercise of privilege in entering into such discussions.  After all, I am a member of almost all of the groups that enjoy particular privilege in the United States; I am almost the embodiment of the person for whom "the system" is designed to work.  Only my socioeconomic status interferes, and that only mildly; I am not wealthy, but I am not poor, so I am among the unmarked and therefore the secondary beneficiary of "the system."  Only for the wealthy do prevailing cultural assumptions work better than they do for me.  My ethos for entering into discussions of the generational experience of oppression is therefore weak, as I have been told more than once.)

It would be irresponsible to ignore the less pleasant parts of human history; we are a violent and bloodthirsty species, overall.  And it is true that many of the holidays any current group celebrates are borrowed from earlier celebrations, decontextualized from their "original" or original intents.  The same is true for many of our other cultural practices--whatever "our" is under discussion (or perhaps "your," since I cannot include myself among many of the groups that my cultural and biological forebears have trodden down).  It seems that if we are to eschew things because they have a bloody history, if we are to set aside practices because they began in anger and hate, then there is nothing that can be done.  No act is without violence; no history is without conflict; no ancestry is without wrongdoing, and while questions of scale are valid and usefully asked, at some point, they cease to matter.  Quantification fails.

I had not intended to go even so far into the issue as I have (and I know that there is much more to say about the matter).  I had intended only to offer up my own statement of thanks, rather than broaching the topic of how fraught this holiday--any holiday, really--is (and they all are, which I can discuss on other occasions, if I remember to do so).  But the world is complex, and so engagement with the world, to be honest and open, must be likewise complex.  And I am thankful that things are thusly complex; were they simple, all would have been figured out by this point, and the challenge of living would be over.  (Again, though, my privilege speaks; my challenges may be minor and entertaining, but those others face are surely not.  Whether I ought to be thankful that it is so...perhaps for the former, but not the latter.  Complexity, again.)

Wednesday, November 27, 2013


Yesterday's post has lingered with me.

When I posted a link to it to a social media feed, I did so by contextualizing it as "the kind of public intellectualism I hope to continue to do."  The post is one that assails the words of a reporter for their inaccuracies and shortcomings, which is problematic.  While flaws in the transmission of knowledge and disagreements about the presentations thereof ought to be noted, linking "public intellectualism" to that activity serves to reinforce unhelpful stereotypes that contribute to the idea of the scholar as undesirable company.  It supports the kind of thing that makes one of Zawacki's jokes work, inadvertently contributing to the anti-authoritarian, anti-intellectual component of the US zeitgeist John McWhorter identifies in Doing Our Own Thing (and which I discuss a bit more here).  As such, the comment was ill-advised.

Certainly it is the role of the intellectual to identify errors that are passed on as "fact"--and publication in major media associated indelibly with "factual" writing* comes off as presentation as "fact."  It is the role of the intellectual to apply the cultivated powers of mind, the result of a society organized such that people can take the time and expend the effort to cultivate them instead of necessarily and solely the crops in the field (not that I am arguing against those who do, particularly given how much I like to eat), in the pursuit of perfecting human activity--and the discourse of people, their dissemination of information, is a fundamental human activity.  So it is the role of the intellectual to pick apart what is presented, to find the holes in it and, by extension, the holes in humanity they indicate; only by doing so can those holes be filled, and only by doing so can what is needed to fill those holes best be known.

Yet that...corrective impetus is not the only role of the intellectual.  I was reminded of this while I was looking over materials in support of yesterday's post: the words of the Good Doctor.  In "Galley Slave," Asimov writes that the work of a scholar is that of an artist; scholars "design and build articles and books.  There is more to it than the mere thinking of words and putting them in the right order" (although that is no small task).  Scholarship is a creative act as much as painting or dance or sculpture or poetry.  Those of us who work The Work try to capture some small slice of our perception of The Truth in a way that others can see and follow to their own perception of The Truth; the end goal is to have enough separate views of The Truth that it can be shown in all its magnificent splendor to the eyes of any who care to look upon it.  And if I seem something of a mystic in my phrasing, that is not to be wondered at; I have linked my work as a scholar to the exercise of my faith before (here, here, and here, at least), and I have experienced something of the Divine in my work on various projects.  Oxymoronic as it might seem, my scholarship is in large measure my worship, and because I am called to share my experience of the Transcendent, I do what I do.

It simply sometimes does not go as well as I would hope.

*I am aware of the contested nature of the claim as it regards any publication and presentation.  I am also aware that long-standing cultural assumptions at work in the United States hold journalistic writing up as an exemplar of "unbiased" and "accurate" presentations of "fact," and that the New York Times is cited as more or less the national newspaper of record.  It is "supposed" to be "fact."

Tuesday, November 26, 2013


Yesterday, I was working on some of my outside teaching activities (a bit of extra money is welcome, and that means a bit of extra work needs to be, as well), and I came across John Markoff's 24 November 2013 New York Times article "Already Anticipating Terminator Ethics."  In the article, Markoff reports on the Humanoids 2013 conference, focusing on one presentation at the event: Ronald C. Arkin's "How to NOT Build a Terminator."  He provides a summary of Arkin's talk, using it to note that "we are a long way from perfecting a robot intelligent enough to disobey an order because it would violate the laws of war or humanity" and thus that humanity still must accept responsibility for the field actions of the automatons they create.  Markoff notes that there was some challenge to Arkin's ideas about the potential peril of robotics research, if only passingly and at the end of the article, indicating his fundamental agreement that continued development of autonomous military machines is ethically fraught.

I find much of interest in the article.  As a student of literature who has at many points made reference to the Good Doctor, I was pleased to see the deployment of Isaac Asimov in the article.  Any time the kind of work with which I am familiar and for my interest in which I was ridiculed or abused appears in broad reference, I am glad to see it; something in me is satisfied by the impression that I was right to familiarize myself with the material, since it allows me to be part of the conversation going on.  More formally, as a literary scholar, I appreciate the irony of using Asimov in discussion of robots being developed in the service of DARPA; Asimovian robots are predicated on not being able to cause harm to human beings, and "defense" projects (as Markoff reports that Arkin notes) are all too easily turned to the destruction of life and property.  (The seeming dodge that Markoff quotes Gill Pratt as offering suggests that such a turn is intended--although I would have to have more context to be sure.)  I appreciate that literary and figurative devices appear in "sober" reporting, and I would like to see my students learn the lesson that such things are good to know and to understand when they are seen, as in Markoff's employment of Asimov.

Despite my pleasure at seeing the Good Doctor cited, I have some quibbles with Markoff's specific use.  By referring to Asimov as a "science-fiction writer" in a science article, he creates the impression that Asimov is only a fiction writer, which is demonstrably untrue.  While the Foundation and Robot novels are perhaps his best-known work, Asimov was also a prolific writer of non-fiction, including Biblical and literary commentaries and an astonishing number of essays.  Too, he was among the professoriate at Boston University, and he earned his PhD from Columbia University at a remarkably young age (younger than I did, and I had mine before I was thirty, which is early).  To imply a dismissive "only," then, does the man a disservice.

Similar are the errors of fact in the article with reference to Asimov.  For instance, Markoff notes that Arkin's talk begins "where Asimov left off with his fourth law of robotics--'A robot may not harm humanity, or, by inaction, allow humanity to come to harm.'"  The law referenced appears initially in the 1985 novel Robots and Empire, and it marks a significant shift for a character who eventually (both in terms of composition and in terms of the Asimovian milieu) assumes a godlike aspect (in a bit of irony for so dedicated a humanist as was Asimov).  And it is worded slightly differently than Markoff quotes it--but I suppose that a missed preposition may be forgiven (or that a later edition of the text than mine might have changed it).  It is not the fourth law, however, but the Zeroth (since zero precedes one and the law regarding humanity takes precedence over the First Law, which protects the individual human being), although calling it fourth (with the lowercase, not-a-proper-noun f) can be justified on the grounds that it was the fourth to be developed.  Still, that the wording is questionable does not argue in Markoff's favor any more than misstating a date does; early in the article, Markoff notes that Asimov anticipated the need for robotic ethics fifty years ago, but he does not go back far enough.  The anticipation goes as far back as the late 1930s, as the Good Doctor notes in his introductory remarks to the 1990 collection Robot Visions, and Asimov's codification of those robotic ethics appears as early as 1941 (as does the word "robotics," as the OED notes)--more than sixty years back from Markoff's piece.  Again, more accuracy ought to be given to that particular son of the Best of the Boroughs.

If I come across as something of an Asimov fanboy...I am something of an Asimov fanboy, so it makes sense.  But I am also a scholar of writing, and Markoff's writing could have been better in some ways.  The topic he treats and the conclusion he reaches about it deserve to be handled with the utmost care and diligence, so while he does well to engage with them and bring them to the attention of the general public (which the New York Times serves to do, as I have noted), if only for a short while (because most will not long remember that he has written, let alone what he has written), he does less well than he ought to have done.

Monday, November 25, 2013


It seems that I am not the only one to comment on the subsuming of Thanksgiving into crass commercialism; Claire Zillman's 25 November 2013 Fortune article "How Black Friday Ate Thanksgiving" addresses the same issue.  In the piece, Zillman notes that "Thanksgiving is gone for good," citing the profitability of stores opening on the holiday and the fears of many retailers that they will lose sales to those stores open earlier than they.  She does this in addition to giving context for the "shopping holiday," tracing its use as an advertising slogan back through the 1990s and its use in reference to sales, generally, to the 1960s or 1970s.  Although depressing, the piece does effectively put across its points, revealing an unfortunate truth.

That Zillman makes her closing statement, quoting one source to assert that so long as sales are good on Thanksgiving, retailers will open for it, is unsurprising.  Matters are such that the acquisition of money is the prevailing cultural goal; money-making is firmly in the American zeitgeist as The Goal, and when a person's worth is discussed, it is in reference to net worth rather than moral or spiritual worth.  And I can understand why; monetary worth is calculable.  It is verifiable.  It is demonstrable.  It is something that can be quantified, and it is a truism that concrete ideas are easier for the mind to grasp than abstract.  People, myself included, are fundamentally lazy; we will do the least work possible to achieve a desired effect, whether that effect is to feel good about the amount of work we have done that day or to get food on the table and a roof over our heads.  Grasping easy ideas is concomitant with that laziness; it accounts in large part for the reduction of public debate to soundbites and the echo-chamber effect that social media have tended to provoke.  (How many deliberately keep Facebook friends whose opinions are divergent from their own?  How many do it because the opinions differ?  Milton's untested virtue comes to mind.)  And so we focus on the measures we can quantify--test scores, bank accounts--with the admittedly understandable effect that those measures become the only ones that really matter.

The problem--of which the crassness of the Thanksgiving sale is but a symptom--is that reducing a person to measurable, quantifiable data is just that: a reduction.  It echoes too much the kind of system that allows slavery (whether so-called, labeled "indentured servitude," or under other names with which I am not familiar) to flourish; what else is it but the objectification of a person taken to its conclusion, the ultimate dehumanization of person-as-product, valuable only insofar as an embodiment of currency and the labor and goods which support it?  It denies the inner essence of personhood that has long been recognized by populations among themselves but only recently and haltingly by them in others.  It undercuts what is still recognized as a great boon: service to others (something upon which all of us rely, whether we want to admit to it or not; we are fortunate that many are called to serve despite being poorly remunerated and told thereby that their service is not really as important as stump speeches say it is).  And it will work, in the end, to our ruin.

Sunday, November 24, 2013


During my morning readings, I came across Ben Christopher's 21 November 2013 submission to the online California Magazine, "Cal Lecturer's Email to Students Goes Viral: 'Why I am not canceling class tomorrow.'"  In it, Christopher offers comments about the unusualness of an email from mathematics lecturer Alexander Coward spreading across the breadth of the internet amidst the other things that populate webspace before presenting some of the professor's remarks about the email and the text of the email itself.

I find myself somewhat...conflicted about the text of the email.  Coward makes some good points regarding higher education and the unique opportunities offered by top-tier schools such as UC Berkeley (although he does not write with as much grace as could be hoped, and his tone comes off as sometimes condescendingly sing-song).  He is correct to point out that "class hours are valuable," for they are; it is during them that direct interaction with the instructor is most possible (although how much can be had from among 800 students is an open question; having TAs--or GSIs, in the email--makes a difference).  He is correct also in his comments about his teaching decisions, that each pedagogical choice he makes will help some and fail to help others.  Many people fail to recognize this, thinking that one method can be universally effective, but there is no teaching practice that will reach all students.  If nothing else, this is because the teaching environment is one of reciprocity; it relies upon the interchange between teacher and pupil to be effective, and it is the case that students are sometimes simply unwilling to engage.  Coward is correct in noting the complexity of the world, something my own experiences have taught me students are often unwilling to recognize; things are not simple.  And I, who have devoted most of my years to the pursuit of knowledge and understanding, can hardly argue with his assertion that education is among the noblest of pursuits.  Self-serving as it may seem for an educator to say so, it is only through learning (in many modes, admittedly, not all of which can happen in a classroom) that matters are improved.

At the same time, I am a believer in the power of collective bargaining, having benefited from it once or twice, myself.  As a matter of principle, I do not cross picket lines, and I do not encourage others to do so; rather the opposite is true.  While I can sympathize with Coward's position (something in his comment that "the choice not to strike is quite easy" coupled with his title of "Lecturer" tells me that he knows his participating in the strike will lead to his dismissal, and the loss of the paycheck is a frightening thing, as I well know), I know also that his employment is in some senses complicit in structures of exploitation (the striking GSIs he mentions are underpaid, under-rewarded, and undertaking significant debt that they will try to pay off for decades so that their institution will be able to inflate its performance numbers without hiring/paying people who are willing and happy to work--for a decent wage).  Worse, while his position at UC Berkeley may be contingent, he has a concurrent appointment, evidently on the tenure track, at another institution entirely; he is in a position of privilege therefore, and for him to avoid what is for him a relatively minor risk in such a way...I have to wonder how appropriate the man's name is.

Ad hominem aside, there is another point to consider, and it is one which Coward addresses in his email.  The university as an institution has a number of missions.  I have repeatedly written about the dual scholarly mission: to develop knowledge and to disseminate it.  Both are done in the pursuit of the ultimate understanding of what is.  Both speak to the improvement of the human condition, so that another mission of the university (maugre Fish and those who parrot him far less intelligently) is the betterment of people.  Coward acknowledges this, noting that "Society is investing in you [the students], so that you can help solve the many challenges we are going to face in the coming decades, from profound technological challenges to helping people with the age old search for human happiness and meaning"; he tells his students that they, and the rest of us, are and ought to be working to make things better for people.  The strike, while inconvenient for students (as was one in which I participated), is aimed at making things better for those who are set to help others prepare to make things better.  Participating in it would have been in keeping with one of the purposes of the university as a whole, even as it would not have been with another.

It is...complicated, indeed.

Saturday, November 23, 2013


I know that I am later than usual in getting this piece written and posted so that it can be seen.  I do have a tendency to sleep in on the weekends--a tendency I intend to enjoy while I can, since the birth of my child will significantly inhibit my ability to do so.  There are times, however, when even the extended sleep that such days as today afford me does not leave me feeling rested; today is one of them (giving the lie to some earlier comments).  I was asleep for some nine hours, perhaps with a few momentary interruptions (I had set my alarm clock for the workday time), but I feel no better than I do when I get half that.  And I do not know why this is.

That my complaint comes off as weak mewling is known to me.  My life is easy, nearly indolent (as I have noted), and so for me to complain of fatigue suggests that my stamina is...not what it ought to be.  I do not labor with my body, and I do not exercise that body nearly so much as I once did, so I have precious little reason to actually be tired beyond what a simple night's sleep will restore.  Yet I still find myself in such a position, that even a good night of sleep does not leave me refreshed, not fully.  I am forced to wonder how much worse it will be once my child is born and my sleep is dependent upon the child's--and the newly born are not noted for sleeping nights through in peace.

It is but one of the many worries that I have as I approach fatherhood.  And, as with the general complaint, I know that it is...odd for me to have such concerns.  My beloved and wonderful wife is doing the work of bearing our child, and I know that she endures significant discomfort to do so.  (The specifics are hers to share or not, but we have been told by the physicians that they are far from unusual.)  I am impacted only insofar as I am sympathetic to her (which is a fair bit, I admit, but it is not my own body that is being subjected to the strains of gestation and that will be made to give birth).  How, then, have I a right to have such...annoyances, let alone to voice them?

Perhaps I am support for claims that "male" is the lesser gender, some of which are medical, and some cultural.  Perhaps I am simply weak, myself, and am having to face that weakness now that I am entering into a position where it can impact others whom it ought not to make suffer (or am I overestimating my importance that I think I might be able to have such negative impact?).  Or, perhaps, I am simply having a less-than-optimal morning because I am, despite the hopes and to the consternations of many, simply human.

Friday, November 22, 2013


My morning reading of my social media feed presented me with a webcomic, Stephen McCranie's Doodle Alley.  I have had some experience with comics, both in print and on the web, so I did not hesitate to page through a few of the offerings on the site.  The simple color scheme and style work well in the series, the lettering reads easily and well, and the contrast and dynamism of the frames are compelling.  Indeed, I find myself jealous of the artist--of many comic artists, really, and many artists in other media--who can effectively integrate image and text; doing so is not something in which I am skilled, and I am not convinced that I have the wherewithal to practice enough to be able to become skilled.  But that is another matter.

More to the point is McCranie's 1 May 2013 offering, "Diversify Your Study."  In the webcomic, McCranie points out the need to study multiple disciplines, citing the ability of expanded study to open perception and allow multiple avenues of inquiry about a single topic.  He sketches out a method for branching out into other disciplines, as well: work from the core study to proximal studies and outward.  The affirmation is useful, and it is the case that a single issue or page of a webcomic cannot account in detail for the myriad paths the fashioning of an interdisciplinary identity can take, so McCranie does an at least adequate job of provoking thought and conversation about the matter.

At least, he does for me; his comic brings to mind a number of things with which I have grappled in my mind.  One prominent among them is a classroom commonplace, the whining of students about having to learn things outside of their chosen major.  "Why," they ask, "do I need to learn about literature if I am going to be an accountant?"  "What is the point," others complain, "of this?  I'm going to fix cars," or "be a farmer," or "go into politics; when am I going to need to write?"  Even some of my fellow scholars in the humanities offer such comments as "I have never needed anything above seventh-grade math.  Why I had to take college algebra is beyond me."  The answer I give to my students is that they may well need the overt skills my classes have them practice; being able to examine evidence closely and draw reasonable conclusions from it applies to every field, and being able to express the evidence and reasoning convincingly can permit such things as getting investment capital and not getting a ticket (sometimes).  Most of them are content with it, and those who are not tend to not submit their work, making themselves unproblematic for me.

The answer I would like to give them, though, but which I do not because I do not think they would listen, is much like that McCranie advances.  Training in outside disciplines is not simply a matter of acquiring knowledge; it is a matter of developing patterns and methods of thought and inquiry that can be deployed later.  I am very much a student of humanities, but I find the inductive reasoning that typifies scientific inquiry a useful tool; identifying patterns and extrapolating from them is useful in literary critique.  I find that the systematic reasoning from postulates and theorems taught me in my math classes is also useful; there are principles of inquiry that can be applied and followed.  And study of the humanities seeks both to remind people of traditions and to explicate human nature through examination of how those traditions form and are maintained and appropriated; my students, whatever their field, live in a human world, and so an understanding of human nature and human culture is doubtlessly of benefit to them.  The perspective is helpful.

Something else McCranie raises in my mind is my own privileged position with respect to interdisciplinarity.  I am by training a literary generalist with a focus (not a specialization, really) in medieval English literature.  My dissertation (which I seem to be referencing an awful lot recently) spans centuries, literary periods, and the Atlantic Ocean.  My research frequently looks at how ideas are transmitted forward through time.  Doing the work I do necessitates that I have an understanding not just of lots of books, but of the contexts from which those books derive and in which they are and have been read.  I have to have a handle on literature, history, various technologies, theology, art, architecture, archaeology, and the like--and many of my more specialized colleagues do not.  I am necessarily interdisciplinary in my work, and (as I have been telling potential employers in the many application letters I continue to send out into the world) that interdisciplinarity allows me to respond to many students whom I otherwise could not and to further human knowledge of the human in ways the traditional disciplinary boundaries--useful for focusing attention and developing deep knowledge, yes, but always in danger of myopia, as McCranie, following Asimov*, points out--cannot.  Being able to identify, explicate, and anticipate connections among seemingly diverse sets of information is useful--and increasingly so, given the data-saturation of the world.  Those of us who study the older things are therefore particularly well placed to work among the newer--something McCranie suggests in his comic.

There is another thing to keep in mind, though: interdisciplinarity does not mean surface-level study.  Being able to be effective in multiple disciplines requires taking the time and investing the effort to develop that effectiveness, and it is not an easy task.  The most useful eyes either see very deeply or very far; in neither case is it helpful to take only a glance.

*Read "Sucker Bait."  I have it in an old and battered copy of The Martian Way and Other Stories.

Thursday, November 21, 2013


I have commented once or twice before about my coffee-drinking habits.  They began early in my life and have continued since, including while I was working in a campus coffee shop at my undergraduate school and through my graduate work.  Indeed, I would not have been able to complete my dissertation as I did without the black brew; most of the third chapter was composed in a caffeine-fueled haze of frenetic productivity.  Similar phenomena have marked much of my academic career; I have often ramped up my caffeine intake in the hopes of speeding myself along to success in the classroom and in the outside work that supports it.  While I have not perhaps been as successful as I should like, I have not failed as badly as I have feared to do.

But there is a cost to the method.  Caffeine is a stimulant, an artificial means to accelerate body and mind, and its effects only last a short while.  When they wear off, many people are left with less vitality and energy than they had before taking it in; they crash at the end of the ride.  For many years, I was able to stave off the crash by taking in more caffeine, having another cup of coffee (and another, and another...), each one boosting me less until I came to a cup that did not matter; I drank it and went to sleep.  But with the resilience of youth, I woke up the next morning in fine form, ready to start the day again...and to resume my coffee drinking in earnest.  I knew myself to be addicted, and I still know myself to be, but I know also that I cannot maintain quite the pace of consumption that I used to even if I still feel the need for it to make my way in the academic world.

I still drink coffee on working days, about a pot of the stuff from home and one or two cups at the office during the day.  On my days off, however, I have been switching more to tea (I drink Darjeeling by preference).  I am still able to feel the influx of caffeine into my body when I drink it, staving off the pounding headaches that would come from its lack (I noted my addiction), but I do not suffer the...drawbacks of the caffeine rush quite so badly, if at all.  Perhaps it is because hot tea must be drunk more slowly than I quaff my coffee.  Perhaps it is because the production cycle is a bit slower, as well.  And perhaps it is the case that the other components of the tea that are much touted--soothing chemicals that emerge from the leaves when brewed--work upon me, keeping my heart from racing although it quickens, preventing my body from burning through its ready resources in short order while still sharpening the mind.

It seems to be helping matters somewhat.  I am feeling better on my (few, and growing fewer) days off than had been the case before, so I will keep doing what I am doing for a while.

Wednesday, November 20, 2013


I understand that the last couple of posts made to this blog have been a bit briefer than usual.  Today's should be back to regular length, or thereabouts, although it may be a bit random in its structure; this one is a news post (with commentary, of course, because it is my writing).

My lovely wife and I went to our perinatologist in the City of Thunder (kennings make things better!) to have a checkup done on our forthcoming child.  The child is developing nicely, for which I am thankful; all ten fingers and all ten toes are present, and the limbs to which they are attached are moving about fairly freely.  The child's face is developing well, as are the many vital organs in the human body.  Too, we have positive confirmation of which pronoun we will need to use, having gotten ultrasound images (something about the term strikes the eye oddly) of the relevant equipment.  The grandparents have been notified, and others will be advised when the time is right.

There seems to be a baby boom going on at my workplace.  The last few weeks have seen several of my colleagues bring children into the world; I have not heard that any of them are doing poorly, for which I and others are grateful.  Unlike other such booms I have seen, there is not a commonality of names.  Many of the babies born to members of one of my former departments in the past few years have been named after the same line of English monarchs, which makes sense given that I have worked in English departments and the cultural cachet of the Bard, but it does set up a potentially confusing situation.  Overlapping names can be problematic.

One of my colleagues, Dr. Helen Young, at the time of this writing a postdoctoral fellow at the University of Sydney (information here), has released a call for papers that seems to have originated in the Tales after Tolkien Society of which she is the head and of which I am part.  The project seems interesting, and I will most certainly be offering an abstract in support of it.  For one, I need the publication.  For another, I am committed to the kind of scholarship that the call suggests (as well as to other projects, including Humanities Directory and my own proposed SCMLA special session, both of which could use submissions).

The semester at my current institution is rapidly approaching its end.  Accordingly, there is panic in the classes as students realize that they have not done so well as they might have hoped, and there are few assignments and little time in which to correct matters.  For many, it is at this point too late, and a semester of slacking off is about to have the just and appropriate consequences.  Few contacts from parents are expected, although such things have been known to happen from time to time.  Comments about the phenomenon, which I am certain is not as recent as is typically assumed but has been less widespread in the past, are welcome.

Tomorrow may well have a more "normal" post, something more like a regular essay than today's few bits of unrelated comment.  But I make no promises.

Tuesday, November 19, 2013


Today is the sesquicentennial of the Gettysburg Address, one of the most important pieces of oratory in the history of the United States.  The piece is a remarkable example of concision, some two pages of open script totaling less than 300 words, the work of a rare genius with language and, doubtlessly, extensive revision and consideration.

Much popular discourse in the early twenty-first century relies on brevity.  Text messages sent from phone to phone can total no more than 160 characters, Tweets no more than 140.  Musicians release singles rather than full albums.  Politicians speak in sound bites.  "Too long; didn't read" is seen as a valid excuse for ignorance--and it is itself too long to be read, abbreviated most often to "tl;dr."  That Lincoln compressed so poignant a message (admittedly reflective of the problematic assumptions of the time) into so short a piece as he did appears to mark him as ahead of his time, more in line with our standards for text and speech than those of his contemporaries.  But it is only in appearance, for Polonius's soul of wit is not the same as concision, and it is the latter which the Address displays most prominently.

Concision is saying most in the fewest words possible, not simply saying the fewest words possible.  It offers all needed information, while brevity may not.  It attends to style, while brevity does not.  It engages with diverse vocabulary, while brevity generally does not.  It allows such devices as anaphora, while brevity does not.  It is thus a useful guide for writing, as brevity is not.  And Lincoln did it best at Gettysburg 150 years ago.

Monday, November 18, 2013


Briefly only:

My parents visited Stillwater and Sherwood Cottage over the weekend.  It was really good to have them at my home for a while as they visited for no reason at all other than that they had time and felt like coming on up from the Texas Hill Country.  It will be a while before I will get to see them again, and I shall be counting the days until then.

I have the good fortune of actually liking my parents (a rarity, I know).  I need to work to be sure it stays that way; my child needs to know the grandparents...

Sunday, November 17, 2013


I would normally scorn to read something that comes from the pages of Cosmopolitan.  I would consider it beneath me to read such a publication, to sully my eyes with the pap of beauty tips and sex tips that are entirely too elementary to consider useful.  I would normally wholly avoid the kind of trite banality and reinforcement of artificial body image encapsulated in the publication, scorning it as transmissive of patriarchal double-standards that trend to the continued self-oppression of women.  I do not think that I would be wrong to do so.

In my morning readings, however, David Ingber's 11 November 2013 piece "10 Reasons Why You Should Date a Nerd" caught my attention.  In the piece, Ingber differentiates the nerd he discusses from the older stereotypes before offering the promised ten reasons the "modern" nerd is worth dating.*  On the surface, the piece is simply another iteration of mindless pap, presuming to offer relationship advice to a presumed or intended readership that, by even buying into the publication or reading the piece, demonstrates something of an inability to display the personal depth that allows for an actually fulfilling relationship.**  But there are some things in the article that actually appear to support comment.

One of them is that Ingber's assertions are not entirely correct.  For example, his tenth point, that "Nerds are comfortable in their own skin," is far too broad a generalization.  Having been a nerd for much of my life, and having spoken mostly with nerds and observed their behavior through being among their company, I can attest that many nerds are far from content.  Many of us look at the many others in the world as they scurry about from place to place and are envious.  The way that the contributions of the non-nerdy are celebrated far above and far more consistently than those of nerds--even in colleges and universities, which ought to be the nerdiest of places--grates on me and, from what others have told me, breeds a sense of discontent.  The oft-cited lamentations of graduate students, those who are nerds and are in training to become yet greater nerds, bear this out; Graduate School Barbie (TM) is a case of "funny because it's true," except that it is not really funny...because it is all too true.  Bespoken is a fairly constant self-questioning and second-guessing of life choices that do not indicate people's comfort in their own skin, but the exact opposite of that thing.

Another, broader problem is Ingber's misidentification of actual nerddom.  I am aware that the polished, witty characters on such shows as The Big Bang Theory and others are taken as representative of the corps of nerds.  They are not.  Instead, they are rarefied, idealized nerds, pretty people with a thin veneer of nerditude lightly brushed on.  True nerds are far less polished, far less refined, and far denser in making arcane references than their television and (recent) movie counterparts; they are not likely to be presented accurately or well on screen, probably because they would not test well with audiences.

But there is one thing that Ingber has exactly right.  The essence of nerdhood is, as he suggests, enthusiasm.  Being cool is as much a matter of being detached as anything else.  Nerditude is typically the opposite of coolness, and that opposite is enthusiastic engagement with something.  The traditional nerd engages thusly with things outside the mainstream, admittedly, but even those people who are greatly enthusiastic about such things as baseball and football--solidly mainstream athletic constructions--are called nerds, not for the knowledge itself, but for the intensely enthusiastic display of it.  And, writing once again from the perspective of having long been a nerd, that enthusiasm applies itself to matters of love as much as matters of...anything, really.

*In another venue, I have discussed comments about the uselessness of the "nerd" label--although it is still very much applied, and I still very much identify as one.

**I get to set myself aside from this, since I am a scholar and cultural critic.  One of the benefits of the position...

Saturday, November 16, 2013


One of my excellent co-workers brought to my attention the blog Academic Men Explain Things to Me.  As I at least try to be fair and even-handed in my dealings with people, to accord them the respect their actions merit rather than imposing my own...nuanced...worldview upon them, I found myself aghast at what other people in academia still perpetrate upon one another for no better reason than the perceptions of genital equipment.  (I note, though, that it would seem to give the lie to the notion that universities are full of nothing but "liberal pc bullshit.")

As I thought on the matter, though, I found myself sticking at the term "mansplaining."  As I talked with that most excellent co-worker about the issue, I voiced the opinion that the gendered descriptor is itself a sexist term, citing both the de-gendering of many other terms (typically professional) and the fact that I have been presented with similar situations.  I have, for example, been told by a lesbian (I believe a gold star, but I do not know enough about the person to be certain--and I am not sure I want to) how my penis really works.  (Had the person in question been a medical practitioner, I might have been willing to accept the dictum, but this was not the case.  And I do not think that medical recommendations are what the term "mansplaining" typically refers to in any event--although I am sure there are examples.)

The co-worker and I did not agree on the point; she asserted that the nature of the phenomenon is gender-specific (although I would point out to her that it assumes a uniformity among men that may not be the case; gay men may or may not commit acts of mansplanation, but I have not seen any attested on Academic Men Explain Things to Me), and, as in other cases where gender is a bona fide issue, the gendered descriptor is warranted.  She did, however, point out that similar phenomena deserve to be given their own terms of opprobrium; it is a workable resolution to the disagreement (and I sincerely love to have such discussions; no sarcasm is present in this), albeit one I still do not find optimal.

I did not stop thinking about the issue, though, and in my morning readings yesterday, I came across a piece that seems to me to be relevant to that issue: Winston Rowntree's "5 Responses to Sexism That Just Make Everything Worse."  Rowntree points out, I think usefully, that while there is pressure placed on men to be and act certain ways (and I have commented to that effect, such as here), there is more, and frequently more destructive, pressure on women to conform to narrowly prescribed standards of appearance and conduct.  Working from the reminder of that idea (I am sure that I had encountered it previously; I am a pointy-headed liberal elitist, after all, since I am part of the liberal arts professoriate), I arrived at the notion that even though it is true that the term "mansplaining" is sexist, and it needs to be...amended...therefore, its sexism is very little against that of the phenomenon it purports to describe, and eliminating the big problem needs to take priority.

Friday, November 15, 2013


During this morning's readings, I stumbled across Jonathan Stempel's 14 November 2013 Reuters article "Google Defeats Authors in U.S. Book-Scanning Lawsuit."  In it, Stempel reports upon the decision by US Circuit Judge Denny Chin to dismiss a lawsuit filed by the Authors Guild against Google which had claimed that much of the Google Books project violates copyright law.  The court ruled that Google's current practice falls under the fair use doctrine, under which short excerpts of a work may be used by someone other than the copyright holder of the work.  Stempel reports also that continued legal action is forthcoming, as well as noting the implications of the case for works in other media than print.

How to integrate source material, and how much of it to use, is a concern for me in both major components of my work: teaching and research.  I have addressed the former once or twice before (albeit a while back); one of the things that I have to do as a teacher of composition courses is discuss with students how to incorporate the ideas of others into their own work--correctly and responsibly.  It is not enough to simply cut and paste the words of others into another paper, and it is really not enough to put that text in quotation marks.  More must be done to massage the data into the writer's own expression of understanding.  As to the latter: I have to do much to ground my understandings of literature in the literature itself, which means I have to do a fair bit of quotation and the like.  I also have to account for the understandings of those scholars who have gone before me, which ends up meaning the same thing.  Appropriately enough, then, when I saw a reference to a copyright case involving a major media provider which I use extensively (this webspace is a Google offering, after all), my interest was piqued.

As I note above, much of my work as a professor involves the effects of the fair use doctrine.  I am therefore invested in seeing it upheld and substantiated; its diminishment impairs my ability to do the work I have trained to do (if it can be called work).  At the same time, the frequency with which I am cited impacts how I am regarded as a scholar.  Too, as someone who does have creative work out in the world, and actually in print, I entertain the hope that I will be able to someday make a bit of money from my writing, and it occurs to me that the decision may well make it more difficult for authors to benefit from the dissemination of their works.  It takes thought to write, energy to think, food to get energy, and money to get food, so that if readers want writers to write, they need to buy what the writers write (the same is true for other media; piracy is generally bad).  And it seems to me that the snippets Google offers are ripe for copy-paste jobs by students.  How I ought to regard the court decision and what result I should hope to see from the promised appeals Stempel mentions elude me--which strikes me as a good thing.

It promotes thought and self-reflection, and both are good to have of a morning.

Thursday, November 14, 2013


I seem to have fallen behind in my journal-reading again.   That I have is in part a result of the move to Sherwood Cottage, which disrupted my activities substantially.  The move was months ago, though, and had it been the only thing interfering, I would have caught up with myself by now.  Similarly the trouble journals have taken getting to me, which somehow seems to have abated; I get my journals in a more timely fashion in Stillwater, Oklahoma, than in New York City, which strikes me as an oddity, given how resistant many of the people here are to governmental programs and their funding.  (Note that I discuss people and not politicians.  And I am aware that the construction of the previous sentence tends to argue against the humanity of politicians.)

Another reason for the slowness in my journal-reading has already been discussed, and not too long ago; I will not go over it again.  But perhaps chief of them is that I am lazy.  The assertion will come as a stunning revelation, I know, to those who believe that the life of an intellectual is one of indolence and ease.  Having summers off and holidays throughout the fall and spring terms attracts to the work of the mind those who are not equipped or are unwilling to do real work and who therefore deserve to find themselves impoverished and disdained; did we deserve respect, we would have already earned it.  And the professoriate, which inflates its own ego through onanism and the profligate deployment of words that are too damned big and of which I, at least in name, am part, is worst among them, trying only to perpetuate itself and lead promising youths into deconstructionist bullshit and lifetimes of poverty so that its own members can pass off what little teaching they would otherwise do onto unequipped graduate students in favor of sitting in offices, intoxicated, with their thumbs up their asses--if anything so useful.

I really want a sarcasm font.

The truth is, though, that I do often feel as if I am being lazy, despite spending hours not only in the classroom and more hours in reading over and offering comments on student work, but more hours yet in developing and refining the knowledge base from which I teach and grade so that I can do both more effectively and with a greater understanding of how things actually are and what they can be.  (The hope I clutch to my chest is that the students will realize what they can do to help the world get from how it is to how it can be better.)  I come from people who have given themselves over to the work of the hands, and they excel at that work, but I see the price they have paid and are paying to have such mastery.  Complain as I might of what I am obliged to tolerate in what I do--and those who read what I write here have seen many such complaints--I know that I suffer far less from doing what I do than do the hard-working members of my family for what they do.  I spend time away from my work because I must; I have to sleep, though I begrudge the need, and I have to tend to family and home, though I do not begrudge those needs at all.  But something in me nags at me in every moment that I am away from grading and lesson planning and teaching in the classroom and meetings with students.  It nags at me as I exchange ideas with other scholars in passing in the halls or in email or in conference presentations or in printed work.  It nags at me as I am away from my desk, doing something that is not the two-fold scholarly mission of developing knowledge and disseminating it.

It tells me that I am lazy however hard I might work, and I do not know how to get it to shut up.

Wednesday, November 13, 2013


Where has the time gone?
It nears the middle of the month,
But I had thought it just started.

A holiday has passed,
Another draws near,
And I am not ready for it
Despite my love
Of feasting
Of napping
Of time with family and friends.

People and places around me
Look ahead
To another holiday entirely
And it pisses me off
Both because it skips a good day
On purpose
And because I am not ready for the later holiday, either.

There is still work to do.
There is less time to do it than I should like.

The cliché holds:
Tempus fugit.
Time flies.
It is supposed to happen while a person is having fun.
It flies for me,
But I am not sure that I am having fun,
Not with this.

But maybe I ought to
Take the cliché,
The Latinism,
Mess with it,
And come up with something like
Tempus, fuck it.

It is, admittedly, a stretch.

Tuesday, November 12, 2013


I was going to write something else today, something talking about how nice it is to have the kind of job that allows me to do such things as have running fights with my snooze button and cook a nice breakfast for my lovely wife (it is, and I did, by the way).  But I looked at my media feeds this morning, and as I did, I recalled things I have seen in stores and around the campus as I have walked between work and Sherwood Cottage, and I realized that I had something different to say than I had originally intended.  I suppose that it is hardly unique that I make the comment, but I find myself annoyed that Thanksgiving is being more or less ignored, and for damned stupid reasons.

I am angered by what I read in such stories as that by Amrita Jayakumar and Abha Bhattarai in the 11 November 2013 Washington Post, "Kohl’s, Target, Toys R Us Add to Early Holiday Shopping Frenzy," that Thanksgiving is being shoved aside in favor of lucre.  I have written before of my high regard for Thanksgiving, and even if its history is problematic in being bound up with the faulty mythology the United States uses to try to justify its colonialist oppression of First Nations peoples to its children (how many of us were told that "the first Thanksgiving" was an amicable affair?), I still find it a just and worthy thing to dedicate a day to gratitude for the things that are offered to us (and I know that my privileged position as a white, Anglo-Saxon, Protestant man of the US middle class influences that belief, thank you).  Too, I am certainly still appreciative of having a full belly and taking naps (the former is attested by my breakfast comment and the picture offered here).  That something I regard as being just and worthy is being interdicted vexes me greatly, and I avow that I will not partake of that which interferes; however good the sales may be that day, I will not be shopping them.

This is not the first time I have complained about the misappropriation of holidays, the first time I have expressed dissatisfaction about the way in which the United States warps itself on "special days"--even that which ought to be paramount in the national celebrations.  It may well not be the last, even if I am screaming vainly into the wind, my voice lost against the cacophony of commercials hawking wares at discounted prices purchased only through the oppression of workers across the world.  I suppose, though, that in the increasing numbers of sales on holidays, the increasing subsuming of celebration by crass materialism (again, I know that my own biases manifest here, thank you), the people of the United States as a whole indicate what it is that is truly important to them.

I have said before that I am a hypocrite--but I have also said before that I am in abundant company.  How Thanksgiving is being treated shows the truth of the latter.

Monday, November 11, 2013


Today in the United States, it is Veterans Day,* upon which I have commented once or twice before.  In many other places, it is Armistice Day, a commemoration of the end of the First World War.  It is lamentable that those many who called World War I "the war to end all wars" were wrong; it is sad that the optimism in 1919 was in vain.  But it is still fitting that they who fought in that conflict are remembered; we may need remediation to learn the lesson they sought to teach, but that does not mean that we cannot learn it.

If we ignore it, though, we will not be able to learn it, and the Great War is too often ignored in favor of other conflicts.  In some senses, this is understandable.  WWII was far more horrific than its predecessor, killing far more and changing the world perhaps more drastically; that it is widely studied is appropriate.  Too, it remains in living memory, if by a decreasing margin, and WWI barely does anymore, if at all.  The conflict in Korea is in some senses still going on; the US still has forces arrayed along the border between North and South Korea, after all.  Vietnam was...what it was, and I honestly do not know enough to comment upon it.  Other, more recent actions have been more thoroughly and publicly documented, and I need not remark too much upon them; they have received more attention than I can provide.

There was much of note in WWI, however.  General of the Armies of the United States Pershing was hardly without a storied history.  Similarly Colonel Alvin York (whose commission was a later matter, but still entitles him to the style).  Yet even such people, who were lauded so highly and who exerted such influence on later generations, are overlooked; even I, who am aware that there is more to see, can recall few other than them, to my shame.  It is something which I will be working to remedy (among the many other things I seek to fix in myself and the world around me).

It is something towards which we all ought to be working.  If it is the case that such days as today exist to remember those who have fought and died on our behalf, then it is the case that they all ought to be remembered--or as many of them as can be.  This is true even for those who fought a fight whose purposes were not wholly met--and, again, sadly, the First World War neither ended all war nor even ended it among the nations and peoples of Europe.  They deserve to be honored no less than their successors--and perhaps we honor them best by working to enact their hope.

And to those who have so struggled, particularly those who have done so in uniform, I offer my thanks.

*English major bit (because I cannot help but): there is confusion about the genitive.  I have seen "Veterans Day," as I give above, from the US Department of Veterans Affairs--and that agency has substantial authority in the matter.  I have also seen "Veteran's Day," and have used it.  As I think on it, "Veterans' Day" seems most "correct," as the day is one in honor of more than one veteran.  The agency should then be Veterans' Affairs, and, as it was set up in a time when people were more...punctilious about "correct" usage, I am confused at what strikes me as an error.

Sunday, November 10, 2013


Today is my father's birthday.  Being a dutiful and loving son, I have already called him to wish him the joy of it, and I sent a gift in what ought to have been enough time to get it to him on or before the day.  So there is that.

Those who read what I write will note that November is a month that has given my family much; many of us are born in November.  In some way, that is appropriate; if October used to be Death Month, it follows that the cycle of years would move on to new life, thus birth, thus many babies coming to my family in November.  But as October no longer really counts as Death Month, November will not be Birth Month much longer, at least not for the family as a whole.  My child, for example, is due at the end of March--and that is not the only way in which the new generation of my family is messing around with family traditions.  (Others are known by those who need to know.)  But the combo-breaker represented is most welcome.

There is, perhaps, some oddity in celebrating the beginnings of life when, in the Northern Hemisphere in which I and my family live (I know of no exceptions), the life of the world is sinking into quiescence, still mourning alongside Demeter after millennia and with the almost-certain knowledge that Persephone will come again (the trauma of the imagery only now occurs to me).  And I am certain that in generations past, expecting through the winter--with its environmentally imposed limitations on food and travel and the challenges in keeping healthily warm--was...less than desirable.  I am also certain that having the first months of life outside the womb being those darkest and coldest does...something* to those of us who have the sense of timing to be born into what many consider the worst part of the year.

Then again, we are also born into the festive times, with harvest and winter holidays greeting us soon after we greet the world.  Our earliest lives are surrounded by joy and the preparations for it, and insofar as beginnings go, ours are good ones.  My father shows it, certainly; his continuous optimism (I have never known him to show a lack of hope to his children--or to anyone else, come to think of it) bespeaks being solidly rooted in happiness and cheer.  It is one of the many things that people love about him, that he is a happy--not annoyingly bubbly or forcedly ebullient, but happy--person; it makes him good to be around.

I am not able to be around him today; the demands of work prevent it, and he well understands working life, I know.  But I would have liked to have been.  And I have done what I can to help him have a good day.  He is most certainly worth it.

*What that "something" is varies.  I have much of the winter about me, while today's birthday boy has much of the summer.  I am sure there are metaphors in that.  Then again, I study English; I would be sure of it.

Saturday, November 9, 2013


On 14 October 2013, Todd VanDerWerff's "How Homestar Runner Changed Web Series for the Better" appeared on the AV Club website.  The article examines Homestar Runner and its significance in Internet and broader popular culture, arguing that the gentle humor and mild absurdity of the early Internet series did much to position the medium for its current and rapidly mutating boom.  VanDerWerff also remarks on the earnestness and lack of cynicism that typify the site's offerings (bringing to mind an article on Humanities Directory, Christopher Bell's "The Ballad of Derpy Hooves: Transgressive Fandom in My Little Pony: Friendship is Magic").  Overall, his reading is good, offering a compelling case for the site's continued relevance and the way in which it has established patterns of development that remain useful even now.

My brother, who is a young man of substantial talent and no mean insight, passed the article along to me, knowing that it would strike something of a chord with me both for being an example of the kind of popular culture study that I try to do in some of my work and for my own engagement with the material VanDerWerff references.  As a long-time nerd, and one who performs much of the nerdhood online, I am more than passingly familiar with Homestar, Strong Bad, and the others inhabiting the strangely entertaining world of the Chapmans' design.  I was gratified therefore to see both that the site has received serious consideration--and, however "serious" a site can be that is owned by the same people as The Onion, the treatment VanDerWerff offers is relatively serious--and that it is regarded as having particular importance.  It suggests to me that my tastes are not too far off from where they ought to be, and that is hardly to be devalued.

I stress the last because of my professional alignment.  As a scholar in the humanities, particularly one whose primary field is one that examines the old, I have been obliged more than once to justify my existence.  How I do so varies based upon who it is that demands such justifications of me.  For some audiences, I speak about my work examining the old as a way to understand the new; by looking at what we still use, we can figure out other things that accord with it through analogy to what has accorded with those things in the past.  For others, I speak to my work to facilitate the development of free and open inquiry among my students; by working to uncover the depths of things, I show that there are depths to even the most innocuous things, and that plumbing those depths affords more understanding of and therefore power in the world.  Sometimes, for some audiences, I take what I admit is a more egotistical and more...fraught...approach: the study I do positions me as a keeper of traditions, a bastion of older cultures in the evidently increasingly unstable world.*  While I might rather phrase it otherwise, it is in some sense true; by doing the work I do in the way I do it, I necessarily pass forward ideas that are held over from earlier times.  As such, I am a maintainer of traditions, even if I do not follow all of them.  For my tastes to be in line with what is to be perceived as an emergent tradition, then, helps me to have authority in particular ways that may well be useful to me as I pursue my agendas, open and otherwise.

What those agendas are shows in what I write.  I have already revealed them.

*There is something of the Sophist in me (in several senses), as those who know me know.  I work to achieve goals; sometimes I do so through means I would not otherwise prefer.  Some ears do not hear certain sounds; to reach those who have such ears, I have to make sounds they can hear.

Friday, November 8, 2013


A few days ago, my lovely wife emailed me a link to Rebecca Tuhus-Dubrow's New York Times article, published on 1 November 2013 and revised on 3 November, "The Repurposed Ph.D."  In it, Tuhus-Dubrow reports that an increasing number of people who trained as academics are leaving the traditional academic workforce (which is itself substantially undermined, with most of the work of teaching carried out by contingent academic labor).  Some are transitioning to other careers within academic institutions, such as library and curatorial work.  Others are transitioning to work outside the academy that is similar to academia, working in programs that call for many of the skills developed during graduate study.  Still others are abandoning academic work and work like it altogether, even going so far as to strive to prevent other people from entering into it.  Even so, Tuhus-Dubrow offers some indication that those with terminal degrees can find satisfying, worthwhile work to do, even if it is outside the tenure track.

How much indication is offered, though, is questionable.  As is understandable for a New York Times piece, Tuhus-Dubrow's article focuses on New York City.  In such a place, there is potentially the kind of work that she outlines, work that takes the kind of expertise and focus that I have yet to see develop outside of the academy or the type of monastic community from which Western academia proceeds.  But that does not mean that people are willing to hire the terminally degreed for such jobs, as I found out in my own recent job hunt (discussion begins here, I think).  Indeed, as a result of that experience, I have little sympathy for the Capitanio quoted by Tuhus-Dubrow; if he thinks sending out 60 applications in three years is "kind of desperate," what, then, must I call my having sent out some 70, to the academy and to "real" jobs, in a few months?  And I got interviews at perhaps three institutions (and a job, yes, but still a contingent one).

As my wife and I have both looked for jobs in the past few months, we have encountered a number of employers who have rejected us because we have advanced degrees in humanistic study.  Outside the academy in a number of places, a doctorate in English is not usually the kind of thing employers want to see; rather than bespeaking attention to detail, focused discipline, a careful attention to the beauty and accuracy of language, and an abiding understanding of the people who create that language, the degree is seen as a waste of time, a frivolity, an avoidance of things that matter, a disconnection from "real life," a predilection away from "wholesome" values, and, perhaps, a hyper-exaggeration of primary and secondary school English teaching (I never thought I'd get so much use out of Zawacki).  Advanced degrees in some other humanities fields--linguistics, for example--are met with confusion.  And even when those of us who have such things work (using the argumentative skills our degrees demand of us and, in the case of those whose degrees are about writing, those in specific language) to present ourselves and our credentials in ways that suit the working world, we are rejected.  Never mind being able to dash out several hundred words of researched, well-written, and relatively polished prose in under an hour; never mind being able to type close to 80 wpm while generating the content, or ten-keying at several thousand strokes an hour; never mind being able to glance at a document or webpage and understand in that glance what audience is intended and what the composer thinks about that audience; never mind that these are the very things asked for in the job description.  None of that matters; the degree is the wrong one (even though no degree is mentioned in the job listing).

It is something I have remarked upon relatively recently and earlier, that those of us who have sought the kind of education scholars in the humanities pursue find ourselves made Other.  And it is a choice that we make, going into it.  I have to wonder, then, if the post-ac people Tuhus-Dubrow mentions do not have the right idea...

Thursday, November 7, 2013


A thought occurs to me (and shuddering in either revulsion or terror may well occur to you as you read such a statement).  I have noted in at least some of the writing that I do and in several of the conversations I have had with people that I do not want to be that guy with the coming baby.*  We have all known that guy, the one who cannot talk about anything but the baby; it is difficult to fault him for being happy about becoming a father, and it is good to see him so taken with the child, but it becomes annoying to have only one thing to discuss.

I do not want to be inadvertently annoying.  I want to have more things to discuss.  And when I am annoying, I try to do it on purpose so that I can do a better job of it.  I have standards to uphold.

This webspace, however, is conducive to my ramblings, to discussions of things that might not be fit for polite company, and to oddities (such as science fiction franchise insults and racist video games) that I might be accused of overthinking for seeing and commenting upon.  If there is a place for me to wax rhapsodic about the baby, it is this place.  And therein lies the dilemma that is the thought that occurs to me: how much of this space ought I to spend in discussing the child, both now, while there is little to report (I cannot really see what's going on), and in the future, when there will no doubt be a lot going on?

Here, as in meatspace, I work to demonstrate a diversity of interests and to consider a number of ideas.  I am a generalist literary scholar, trained to treat the words generated throughout the English languages as well as the contexts in which they are written and read.  It is to my benefit therefore to take a broader view and range fairly far afield in my writings, both professionally and more personally (as in this webspace).  Narrowing my field of discussion to the experience of nascent (heh) parenthood would seem to work against that benefit.

Too, there is the issue of the child's future.  In the years to come, will my child look back favorably on my commenting extensively on such things as development in utero and the early fun and adventures that I am warned come with a new infant?  I know that I am not always happy to have parts of my life detailed to others, whether by my parents or by other people entirely.  (I have discussed this at some length with my parents.)  Despite what I reveal of myself online, deliberately and inadvertently, and despite my not feeling shame about a number of things that perhaps I ought, I don't relish being bruited about.  I imagine that my child, who will be in some senses shaped by my views and actions, may well feel similarly...

At the same time, I am REALLY excited about becoming a father.  My child is always on my mind, and I am still giddy--yes, actually giddy--about meeting the developing person I have helped to make.  My child is going to be awesome, and the world needs to know of such awesomeness...I am just not sure how much.

*While in other capacities, the term might be sexist, I am referencing my own situation and those similar to mine, and I am male (there are relevant anecdotes).  I do not presume to comment about the female experience of pregnancy or to claim that my experience as an expecting father is nearly so...involved as that of my wife as she carries our child.  But I have no qualms about commenting on mine, as should be obvious. 

Wednesday, November 6, 2013


I admit that I have not been diligent in my civic duties of attending to elections in this cycle, to my shame.  It has not always been the case that I have not paid much mind to who is running what and where; I have remarked about such things in the past.  But this time around, I have not had the energy to spend on watching politics, I have not had much to add to discussions of them, I have not had the time to take away from a number of other pursuits, and I have found it hard to care.  (I also remain unsure about my eligibility to vote in state and local elections at this point...I really ought to ask after and among all of the other things that need doing...if I remember...and if it matters...)

That I have not done so removes me from the ability to complain about things that might have been influenced by me had I voted.  Senators and Representatives against whom I have not yet had the chance to vote (and admittedly more the former than the latter) are still targets, as are those people chosen in the elections in which I voted (and there were a few).  Executives can get the same treatment.  But I did fail in some of my duties, and so I must accept the consequences thereof; I didn't do what I could to stop things, so I do not get to complain about them not being to my liking.  (Again, unless I was not eligible to vote because of my move...I really need to look...)

My "reasons" for not voting are the same as those I have railed against in others, which, I suppose, makes me once again a hypocrite.  I suppose it also serves to position me among the great masses of people in the United States, consumed with my own immediate advancement and agenda rather than the greater good.  And while I know that some would say that my concern for such things, myself rather than the body politic, is actually the best thing for that body, I do not know that I can agree.  Certainly, I want to see a strong social safety net (i.e., welfare, food stamps, social security, basic universal health care, effective and ethical law enforcement and military forces, public schooling) in place.  I might need it (again), so that in one respect, it is like insurance--and I do not mind others using it while I am not, so long as it is there when I do need it.  But, more to the point, I have seen the stupidity and, frankly, evil that result from people focusing on themselves and their own immediate needs to the exclusion of others'.  It is that focus that leads children to gorge themselves to the point of vomiting.  It is that focus that prompts people defending themselves to flail about wildly, possibly hitting their intended target, and likely inflicting harm on others not involved in the attack (and leaving aside the question of whether or not an attack is actually being made).  It is at some level the same focus that has led to atrocities time and again, and--

--and, like I noted, I did not vote, so I ought not to complain.  I have helped it be this way, and I am (and should be) ashamed.

Tuesday, November 5, 2013


First, my cousin's birthday is today.  I have already wished him the joy of it, because I am a good older cousin and remember to do such things.

Second, and more normally (in several senses, some of which may actually be fairly...perverse), I realized something after my wife and I watched an episode of Star Trek: Deep Space Nine, "Statistical Probabilities."  The plot is readily available, and I am certain that video of the episode can be easily had, so I will not re-hash the narrative here.  What I will suggest instead is that the episode speaks to a trope present in the main line of the Star Trek franchise (TOS, TNG, DS9, and Voyager; I refuse to include Enterprise in the discussion for a number of reasons, upon which I may elaborate later).  The trope is one that associates superior intellect with dehumanization.

The Star Trek universe is one that features a number of hyper-intelligent characters among its primary roles.  Most of those who appear regularly on the shows are graduates of a military academy modeled after an amalgamation of the United States' service academies--and those schools are perennially noted as among the best in the US.  As of this writing, in fact, the US News and World Report college ranking has
  • The US Coast Guard Academy #2 among regional colleges in the North
  • The US Air Force Academy #25 among national liberal arts colleges
  • The US Merchant Marine Academy #3 among regional colleges in the North
  • The US Military Academy #17 among national liberal arts colleges
  • The US Naval Academy #12 among national liberal arts colleges
Presumably, those graduating from the fictional successor of those schools will be amply qualified, and, given the technologies hypothetically involved (and the physics upon which they purportedly rely), the qualifications will necessarily bespeak a fair degree of intelligence.  But some stand out even from academy graduates in terms of brainpower: Spock from TOS, Data from TNG, Bashir from DS9, and the Doctor from Voyager (who is the product of and modeled after academy graduates).  Each is the most intelligent member of the regular cast.  Each is also somehow proximal to but other than human; Spock is half-human, Data a human-mimetic machine who strives to become more human (necessarily implying a lack of humanity), Bashir an illegally genetically altered human (whose very existence makes him a second-class citizen who must get special dispensation to be in a position to help people), and the Doctor a holographic version of Data.  In each case, a superior intellect is coupled with a lack of humanity; being really smart is inhuman, problematic, Other.

What strikes me is the disjunction between the message being sent--that being smart is somehow bad because it makes the smart one less than human--and both the in-milieu need for intelligence (how else to operate and maintain computers that can handle the equations necessary to navigate at speeds faster than light, or to disassemble and reassemble living beings without killing them for more than a fraction of a second?) and the primary expected fan-base of science fiction generally and Star Trek more specifically: nerds.  There is a contradiction in the storytelling: the smartest characters, who ought to be the best equipped to live in the milieu, are relegated to secondary status.  There is something of sadism in the writers, and of masochism in the fans, that the writers (themselves likely nerds) heap abuse on the intelligent, and that we nerds so eagerly flock to a property that at some level insults us.

Monday, November 4, 2013


Somehow, I have made it to see my thirty-first birthday.  Somehow, today is the first time I have blogged on my birthday (at least in this space; I have had other things that have counted as blogs, although they are gone, and I might have made a note in one of them on an earlier birthday).  I can report that my beard is growing back, if a bit more slowly than I had expected; my cheeks and chin no longer feel like sandpaper, but more like the hook-side of hook-and-loop closure materials (e.g., Velcro).  And that provokes a thought in me, something to follow for another few hundred words in today's note: the making-generic of what are brand-name words.

The phenomenon is far from new to me.  I grew up (if I can be said to have done so even now) in the Texas Hill Country, where all sodas are Coke (except for Dr Pepper, which, being Texan, is in a different category), even if they are for some strange and foolish reason Pepsi.  Similarly, corn chips are Fritos (again, a Texan product in Texas can be expected to get special treatment).  But all copy machines are Xerox (except for the few holdouts for mimeography), even if they are not.  Facial tissues are Kleenex.  And, yes, hook-and-loop fabric is Velcro.

In such formations, we betray the extent to which we have been infiltrated by advertising over generations, as I am sure has been argued by people more aware of it than I am.  The case could be made that something of the emphatic force of the terms has been stripped away by their having been made the generic terms; as the go-to words, they would seem at some level to be less specific, less important, than their more distinct synonyms.  Yet this is not actually the case, because by becoming the unmarked term, the default standard by which others are judged, the terms (and, perhaps, the things with which the terms are linked) become privileged as the "normal," making all else abnormal--and what is abnormal is too often regarded as wrong.*

This is true even for things which are nominally considered beneficial.  Returning to myself for a moment (and, since it is my birthday, I think I may safely do so), the issue of being "too smart for [my] own good" comes to mind.  Many times, I have been told that to be smart is a blessing, a gift, one that should be cultivated and which ought to be celebrated as a means for doing good in the world.  Yet far more frequently and systematically have I been derided for having and exercising intellect, physically in my earlier life, verbally and in too many cases culturally in my more recent (as I have noted, I think--here, here, and here, among others).  The derision has, at times, prompted me to state the fervent wish (one I no longer have, mind) that I were not so smart as I happen to be.  (One or two other things have done so, as well, but they are less pertinent.)  My own exclusion for being outside the "norm" has been minor; I fall into the unmarked in a number of ways, so that the effects of my lone "abnormality" are mitigated by all of the ways I am "normal."  I can only dimly imagine what others--perhaps many of my former fellow congregants at the United Methodist Church of the Village--have had to endure for being outside the "norm" as they are, and many of them are damned fine people.

Normally (heh), some call to action would need to go here, some cry into the ether for a change to come.  I would be far from the first to make it, and the call has yet to be heeded in earnest; I will add my voice to those that prompt for careful reconsideration of such normalization, as I have already done, but I am not sure how many are willing to hear.  The message is a bit...unusual, after all.

*A similar chain of reasoning underlies part of the ongoing objection to the use of the masculine pronoun as a catch-all, something I have had to discuss with my students.  By accepting "mankind" as the default for "humanity," or by saying "men" for a mixed-gender group, we tacitly position the male as the norm and what is not male as somehow deficient or wrong--and that is not right.  Anyway, biologically, female is the default setting, so if language is going to take a gender as a default, it ought to be that one.