Posts

The hatred of theology

In the latest issue of The Point, Jon Baskin writes on behalf of the magazine's editors about what he calls "the hatred of literature." By this term he means the attitude—apparently dominant in English departments a couple of decades ago and imbibed by graduate students across the land—that the study of literature exists not to appreciate its multifarious goodnesses and beauties, rooted in love for the object of study, but instead to uncover, unmask, and indict the social, moral, and political problems belonging to its conditions of production. The novel or poem is therefore not an object at all, that is to say, an end, but a means to a larger, political end; criticism thus becomes an instrument of political advocacy. The work of literary art plays no role in calling me or my convictions into question. Rather, the critic measures the work by the correctness of its views or its capacity to activate social change (for the better, that is, more or less in line with my priors),…

"Unique": absolute or relative?

Apropos of nothing, it's always bugged me that, grammatically speaking, the adjective "unique" is not supposed to be modified adverbially (as in "relatively" or "somewhat" or "nearly" unique). Instead, "unique" is absolute: either a thing is unique or, by definition, it is simply not. I recall reading something by David Foster Wallace about this years ago.

Isn't it the case, though, that nothing is absolutely unique? Rather, anything is unique relative to some qualifier, property, activity, or question. Otherwise, it would follow that everything is unique—because nothing is anything but itself—or that nothing is, with the exception of God, who alone (existing a se and in se, that is, from himself and in himself, and thus non est in genere, belonging to no genus) is actually unique in an absolute sense.

I understand the desire to mitigate the popular usage of "unique" as a less powerful adjective than it ought to be; used colloquially, and always modified by synonyms of "partially," it…

An amendment to the amendment

If you couldn't tell, I've spent a good part of 2019 trying to figure out what to do with Twitter. I limited my time on it, I nixed tweeting, I cut out all but Saturdays, I basically exited for two months. Then a few weeks ago, after seeing friends at AAR in San Diego whom I had "met" via Twitter, I decided to amend my tech-wise policy and dip my toe back into the service. And once the semester ended, I allowed myself to get back on a bit more while home for the Christmas break.

Following all that experimentation, I think I'm back to where I was last May. That is, at the macro level, the world would unquestionably be better off without Twitter in it, because Twitter as a system or structure is broken and unfixable. But at the micro level, the truth is that my experience on that otherwise diabolical website is almost uniformly positive. Aside from the "itch" that results from any social media participation—an itch that is not conducive to the life of t…

On Episode IX

Well, it happened. Abrams didn't even rise to his own best level. He capped off a 42-year cinematic saga with a stinker so bad that it sullies not only the new trilogy he helped to launch but his own reputation as a filmmaker.

I thought I'd avoid writing about the film, but instead of spending time on Twitter or Slack, let me just share my thoughts here.

What makes The Rise of Skywalker so bad? Well, there are multiple levels of badness involved.

[SPOILERS HEREON Y'ALL.]

First is the filmmaking itself. This was the most shocking thing about IX. I knew Abrams would go for nostalgia and servicing fandom. I figured he'd undermine VIII. I didn't know he would make such a straightforwardly bad movie, one alternately boring (the guy next to me on opening night fell asleep) and poorly told (my wife can't be the only one who found it difficult to follow).

The opening 30 minutes in particular move so fast, across so many worlds and plot points and characters old and new…

The 11 Best Hour-Long TV Dramas of the Decade (2010–2019)

A few months back I posted this list to Twitter, but I thought I'd re-post it here, with a bit more commentary, as well as a reshuffling due to Mr. Robot's outstanding fourth season.

First, to the rules. This is a list of hour-long dramas: so no half-hour genre-exploders (Atlanta, Louie) or comedies (Parks and Rec, Brooklyn 99). I'm also only considering TV series with discrete seasons that tell something of a unified narrative: thus excluding miniseries (e.g., The Honourable Woman) and specialty shows (à la Sherlock or Black Mirror). Further, in order to qualify, a series must have at least three seasons to its name (so The Knick falls short, and both Succession and Yellowstone ran out of time before decade's end). Seasons prior to 2010, however—such as Mad Men's first three or Breaking Bad's first two—don't count for the purposes of this list. I am solely considering television seasons comprising hour-long dramatic episodes shown or streamed between 2010 …

With Mr. Robot till the end

The TV show Mr. Robot ends its four-season run in six days.* It began four and a half years ago, in the summer of 2015. It didn't exactly begin with a bang, but the whimper of its premiere (if I may mix metaphors) snowballed into one by its first season finale. Its seemingly omnipresent, omniscient mastermind of a creator, Sam Esmail, appeared to be the Next Big Thing in TV: a child of 90s cinema, he was and is Fincher and Spielberg and Soderbergh and PTA and Tarantino all—at least aspirationally—rolled into one. Eventually the writer-director of every single episode—40 in total by series' end—Esmail has made the show the complete vision of a self-styled auteur if ever there was one.

The second season lost much of the good will and momentum generated by the first. Sprawling, dense, literally and figuratively dark, and trapped for much of its time in Elliot's mind, it saw critics and viewers alike drop the show in notable numbers, or so it appears from online commentary and anecdotal conversation…

A Twitter amendment

Last weekend I was in San Diego for the annual meeting of AAR/SBL, and (as has become my custom) I mostly saw old friends and new acquaintances. Most of the latter I have "met" online; most of those "meetings" were on Twitter.

Tomorrow marks 8 weeks since I began my experiment with decreasing my Twitter usage: zero time on that infernal website Sunday through Friday, and 30 minutes or less on Saturday; moreover, no active tweeting (original tweets, RTs, replies, etc.) on any day of the week: only occasional links to something I've written.

After San Diego, I'm reconsidering my experiment, or rather, considering an amendment to it. I think I'm going to try a modest "return" to being an active rather than passive user on Twitter, albeit within the same time and use constraints I've already set for myself. That is: limit both reading and tweeting to Saturdays, for 30 minutes or so, but become a sort of power-user for that half-hour of time: sharing t…

Luddites and climate activists, unite!

I encourage you to read Ben Tarnoff's piece in The Guardian from a couple of months back: "To decarbonize we must decomputerize: why we need a Luddite revolution." The very worst approach to technology is fatalism: it's inevitable; it's the future; we just have to accept it. The second-worst approach is denialism: it's not so bad, since (obviously and necessarily) nothing so central to our lives could be as bad as the naysayers suggest. The third-worst approach is a failure to make connections, one that is oddly ubiquitous among the liberal folks I talk to about this issue. If either free-market liberalism or the digitization of our lives is so good, then why are its effects so bad for the environment? And what brakes stand in the way of further ecological harm? Denial underwriting technological fatalism certainly won't do the trick.

Perhaps climate activists are allies in waiting for Luddites, and vice versa. As Tarnoff observes, both perceive …

MCU Phases 4 & 5: dream or nightmare?

I have a mixed relationship to the Marvel movies that have so dominated the last decade of Hollywood. On the one hand, I readily enjoy them. I think, for the most part, that they are well-made blockbusters, occasionally quite good, directed competently, written with care, and acted superbly. Their achievement as TV-like serialization across 23 films (and three "phases") is, as Matt Zoller Seitz has written, without precedent and accordingly impressive.

On the other hand, I'm neither a comic-book "fan" nor an apologist for the MCU. I've read all of two graphic novels in my life and have nothing invested in "geek culture." I furthermore share the general sentiment that the Marvel-fication of cinema as such is an unhealthy trend. It isn't good that there's a new superhero movie out every three weeks, and that Hollywood wants any and all blockbuster filmmaking to be (a) built on preexisting IP and (b) part of a larger "cinematic universe…

Experiments in Luddite pedagogy: dropping the LMS

This semester I wanted to experiment with teaching my courses without the use of an LMS. For those unfamiliar with the term, LMS stands for "learning management system," i.e., an online program for turning in assignments, communicating with students, updating the syllabus, inputting grades, etc. Some of us used Blackboard back in the day. My campus uses Canvas.

Now, Canvas is without question the best LMS I have ever encountered: intuitive, adaptable, not prone to random glitches and failures, and useful for any number of pedagogical and technological ideas and goals. So far as I can see, after 15 years or so, the technology has finally caught up to the vision of using the internet well for teaching purposes, a vision that was ahead of its time one to two decades ago and that probably, as a result, led to a lot of wasted time and self-defeating habits.

But, you might be wondering, if Canvas is a good LMS, why did I want to experiment with not using one? Here's why.

1. I want to be i…