Sixteen Credits

Co-written with Mark Hall

Some people say a man is made out of gore
Well a student is just a credit score
A credit score and a mind that’s spry
A future that’s bleak, and a bank that’s dry

You take sixteen creds and what do you get?
Nearer your degree and deeper in debt
St. Peter don’t you call me cuz I must stay
I owe my soul to Sallie Mae

Enrolled one morning, it was drizzlin’ rain
“Get a degree” was the school’s refrain
Should I study English, or should it be Math?
Decades of debt was the only path.

You take sixteen creds….

I enrolled one morning I was at an impasse
Picked up my laptop and I walked into class
I took sixteen creds based on aptitude
And the T.A. said, “Well son, you’re screwed.”

You take sixteen creds…

This job pays just $8.95
It ain’t enough to keep a man alive
I can’t rent a roof to stop the wet
‘Cause $5.50 of that goes to service my debt

You take sixteen creds….

If you see me coming just say hello
I’m working a job that don’t pay what I owe
I earned a liberal arts degree
And all it got me was bankruptcy

You take sixteen creds….



(Yeah, there are scansion problems, I know. Suggestions welcome.)

Orlando and the Need to Talk About It Anyway

Let’s see if I can manage to say this right.
Orlando was a horror. We’re shocked, disgusted, angry. Moreover, many of us believe we understand an important piece of why it happened and what should be done (I’m not pointing fingers, me too), and we don’t agree, and the disagreement gets angry and frustration grows and we just want to scream at our computer WHY DON’T YOU IDIOTS GET IT?
Of course that’s what happens. Because it matters.  We’re horrified, and we want it not to happen any more. And it matters. Our usual, general feeling of, “I want to convince you I’m right” is suddenly three octaves higher, because people are dead, and it is so very ugly and wrong and the need to find a solution is suddenly acute. It matters.
Yes, now is when it is hard to try to be patient, to try to explain your position, because (if you’re like me) you’re furious and upset, and because the person you’re trying to explain it to (if he or she is like me) is also furious and upset. And you want to say, “I don’t want to talk about this any more.” And, hey, maybe that’s the right call, I’m not saying to force yourself into anything.
I will continue trying to convince you that I’m right (and you’re wrong) about how to look at this, and I will try my best to be patient and to explain as clearly as I can, and I understand if you get angry and want to shut down the conversation, and maybe I will get angry and try to shut down the conversation. But I’ll remind myself that the reason this is hard to talk about is exactly why it is important we try to do so. Because it matters.

Commenting Bug?

I’ve been running into a bug, and I’m posting this to see if anyone else has.

If I make a comment that is larger than the comment box, the “post comment” button vanishes, so the only way I can make the comment is to cut it, post the beginning only, then edit it to add in the rest.  Has anyone else run into that when posting long comments?  It doesn’t seem to happen all the time, and I haven’t figured out what conditions cause it.  Most likely, there is a really obvious thing I’m missing, but if anyone has any ideas, let me know.

The Language Police: That’d Be Me

Last night on Twitter I objected to a NASA announcer saying “…any data that is…” rather than “…any data that are.”  Why did I object? Well, to be honest, because I was really upset about Antares blowing up, and I was looking for someone I could be mad at. Yeah, I know, not very rational; but sometimes I’m just not.

A friend then replied with the following: ‘But, to summarize, do you also insist on “The agenda are…”? If so, good for you, but the language, per the OED, has moved on.’

The OED certainly is a good source, and I agree with it about agenda.  I still prefer “data” as the plural and “datum” as the singular.  But, more important, I am heartily sick of, “the language has moved on.” According to whom?  Who gets to decide?

The answer is: I do, because I’m arguing about it, and stating my preference.  If you argue about it, and state your preference, then you get to decide too.  “The language has moved on” is meaningless rubbish.  If it has moved on to the point where no one is arguing about it, then it need never come up.  If there are people arguing about it, then it may be in the process of moving on, but it hasn’t gotten there yet.  How do I know? Because people are still arguing.

The arguing, you see, is the whole point.

There was a time when “awful” meant “filled one with a sense of awe.” It doesn’t mean that any more.  How do we know that? Because no one is using it that way, and no one is arguing for it.  In this case, the language has moved on; the proof is that in this case we never hear anyone insisting “the language has moved on.”

Now, perhaps, what you’re saying is, “usage is determined by majority rule, and the majority now does it this way.” If that’s what you’re saying, well, let’s say I disagree.  But if so, say so.

In the particular case in question, “data” vs “datum” as the singular, I don’t know that I can find a strong reason for my preference other than being used to it; so if you can find a good reason for your preference, you’re liable to win that argument, and then I’ll stop making irritated tweets correcting anonymous commentators.  But make it!  Tell me why that usage is better.  I’m here.  I’m listening.  What, it isn’t better?  It has no advantages, and you only claim the language has moved on because lots and lots of people say it? That doesn’t convince me this change makes the language more flexible, more powerful, more elegant, more nuanced, better able to express fine distinctions.

Examples: I dislike the current use of “hopefully”  because I think the distinction between “I hope,” “you should hope,” and “all right-thinking people ought to hope,” is useful and I don’t like to see it concealed.  I dislike the word “proactive” because it sounds as if it is conveying information when in fact it says nothing*.  Those two battles are mostly over, but I haven’t given up yet.  If you want to argue with me, you are free to do so.  If your argument is, “the language has moved on” do not expect to convince me.

Obviously, you have as much right to your preference as I have to mine.  Moreover, you have as much right to make a case for or against a given change as I do.  But if I insist a usage is wrong, and you don’t agree, then make the case.  “The language has moved on” is never a valid argument, because it contains its own contradiction: as I said above, if it had moved on, we wouldn’t be talking about it.  Hell, I’ll even tell you how to make the case.  Instead of a strident, smug, empty, “the language has moved on,” try saying this: “Most people have accepted that ‘data’ is a singular noun.  The language seems to be changing.  Can you make a case for keeping it the old way?”  There, see, now you’ve put the burden of proof on me.  That’s fair.

Who gets to decide what is correct usage? Anyone and everyone who bothers to have an opinion about it.

Now, it is perfectly reasonable to shrug and say to yourself, “Let the silly dinosaur keep raging; in twenty years everyone who insists that ‘data’ is a plural will be dead, and the language will have moved on.”  If you say that, you’ll almost certainly be right.  But if that’s your attitude, why are you telling me?  Do you expect to convince me that, just because a lot of people use “infer” and “imply” interchangeably, I should adapt myself to it?  If you want to convince me, convince me.  If you want to roll your eyes and let me fight my doomed battle, do that.  But “the language has moved on” is useless as an argument, and empty as an observation.  Argue, or shut up.


*For those of you who believe “proactive” does convey something, I challenge you to find a real-world situation in which it suggests an action that isn’t better said by simply dropping it and moving on to the next sentence or clause.


History and Objectivity

Sometimes I feel the need to mount my white charger, pick up my sturdy lance, and ride off in defense of some poor, abused word.  Often, it is a word that has been mugged and robbed of its precision, like hopefully.  Sometimes, it is a word that has been enslaved and required to labor under a burden of meaning it was never meant to carry, like relationship.  The fact that these one-man campaigns are hopeless does nothing to discourage me; on the contrary, it just makes me feel more heroic, noble, and self-sacrificing.  Please do not disabuse me of this illusion; my self-love might not be able to stand the truth.

Today, we fight for the defense of a word that has been framed for a crime it didn’t commit.  I refer, as you are already aware from the title, to the word objectivity.  Somewhere along the line, objectivity, particularly in discussions of history, came to be used by some to mean something like, not having an agenda, or, not being a part of what one is examining, or, pretending to have a perspective that is uninfluenced by one’s knowledge or experience.  Naturally, with definitions such as this, poor objectivity finds itself convicted of uselessness without due process, and ends up in solitary confinement in some ideological prison where it must endure hours of people taunting it with comments like, “there is no such thing as objectivity in history.”  Cruel and unusual, I say.  We will call this the casual definition, because calling it sloppy is a bit more confrontational than I’m ready for just yet. Now, where did I put that lance?

Let us begin with the dictionary, because I like to know dictionary definitions before I ignore them.  The American Heritage Dictionary, 1981, has this for definition 1 of objective: “Of or having to do with a material object as distinguished from a mental concept, idea, or belief.  Compare subjective.”  Definition 2 goes on, “Having actual existence or reality.”  It is not until we get to definition 3a that we find, “Uninfluenced by emotion, surmise, or personal prejudice,” which at least waves at the definition to which I refer in the previous paragraph.  And then 3b merrily goes on, “Based on observable phenomena; presented factually: an objective appraisal.”

We often hear, “No one can be objective regarding history.”  I beg to submit the following: 1. Generally, when someone says that, it is the casual definition that is being used.  2. By the casual definition, not only can no one be objective, but those who claim to be are usually being disingenuous, and working very hard to conceal their agenda.  3. Using the casual definition, objectivity is not only impossible, but also unnecessary, and not even a goal worth striving for; on the contrary, a good historian makes not the least effort to be objective in that sense, knowing that such an effort can only lead to distortion.

But when we go with the dictionary definition, we have an entirely different approach and result.  When I say a work of history is objective, I mean that it bases itself on real, material events and relationships.  Right now, I’m studying the history of Kansas, 1856-60, and the formation of the Republican Party.  I neither expect nor desire the historian to pretend to display events as if devoid of prejudice, belief, or agenda.  What I do demand is that conclusions be based on facts that are clearly laid out, that the historian’s beliefs and programs be either clearly stated or easily deduced, that “inconvenient facts” not be omitted, and that the internal consistency of the narrative, built on verifiable facts, be laid out.  In other words, “show your work.”

My two favorite historical works are James M. McPherson’s Battle Cry of Freedom and Trotsky’s History of the Russian Revolution.  McPherson makes no secret of his antipathy for the slave power, and Trotsky, of course, is quite clear and open about his support for the insurrection of which he was one of the principal architects and the primary organizer.  What makes these works so profoundly convincing is that the revelations of the general historical laws at work effectively explain the events; the logic holds together.  In both cases, it becomes very difficult to dispute the conclusions without taking the position that the author is out-and-out lying about facts (which is problematic in both cases, given how easily verifiable the facts are).

When I refer to a work or a method as subjective, I mean that it bases itself on the particular, individual, personal.  A work is subjective insofar as “I feel” is the starting point, as opposed to, “this happened.”  Even more so if, “this is how you should feel about it,” as opposed to, “this is why it happened,” comes slithering through the subtext.  Individual, personal experience can be vital in helping us empathize with another human being, but it is not how we come to a scientific understanding of the processes of history which, though inevitably happening to individuals, are nevertheless fundamentally impersonal: they are what happened whether I like it or not.

To be sure, no one can have a complete, perfect, total understanding of an historical event any more than one can have a complete, perfect, total understanding of, for example, the formation of Earth’s crust.  But we do not find hordes of pseudo-intellectuals telling us how geology cannot be studied objectively.  To achieve a scientific understanding of the formation of the Earth’s crust, a geologist does not base his work on how he feels about it, but rather endeavors, as well as possible, to determine what really happened and why.  And then we test that understanding by making predictions, and so modify our theories as needed.  To apply this same method to the study of history is, without doubt, more difficult: the effects of prejudice and social pressures generated by class society are much more immediate.  But that is no excuse for applying different standards. So, yes, I reject the notion that there can be no objectivity in historical studies.  Those who support this notion are, in my opinion, abusing the poor word, and ought to stop.