Posts

An amendment to the amendment

If you couldn't tell, I've spent a good part of 2019 trying to figure out what to do with Twitter. I limited my time on it, I nixed tweeting, I cut out all but Saturdays, I basically exited for two months. Then a few weeks ago, after seeing friends at AAR in San Diego whom I had "met" via Twitter, I decided to amend my tech-wise policy and dip my toe back into the service. And once the semester ended, I allowed myself to get back on a bit more while home for the Christmas break.

Following all that experimentation, I think I'm back to where I was last May. That is, at the macro level, the world would unquestionably be better off without Twitter in it, because Twitter as a system or structure is broken and unfixable. But at the micro level, the truth is that my experience on that otherwise diabolical website is almost uniformly positive. Aside from the "itch" that results from any social media participation—an itch that is not conducive to the life of t…

On Episode IX

Well, it happened. Abrams didn't even rise to his own best level. He capped off a 42-year cinematic saga with a stinker so bad that it sullies not only the new trilogy he helped to launch but his own reputation as a filmmaker.

I thought I'd avoid writing about the film, but instead of spending time on Twitter or Slack, let me just share my thoughts here.

What makes The Rise of Skywalker so bad? Well, there are multiple levels of badness involved.

[SPOILERS HEREON Y'ALL.]

First is the filmmaking itself. This was the most shocking thing about IX. I knew Abrams would go for nostalgia and servicing fandom. I figured he'd undermine VIII. I didn't know he would make such a straightforwardly bad movie, one alternately boring (the guy next to me on opening night fell asleep) and poorly told (my wife can't be the only one who found it difficult to follow).

The opening 30 minutes in particular move so fast, across so many worlds and plot points and characters old and new…

The 11 Best Hour-Long TV Dramas of the Decade (2010–2019)

A few months back I posted this list to Twitter, but I thought I'd re-post it here, with a bit more commentary, as well as a reshuffling due to Mr. Robot's outstanding fourth season.

First, to the rules. This is a list of hour-long dramas: so no half-hour genre-exploders (Atlanta, Louie) or comedies (Parks and Rec, Brooklyn 99). I'm also only thinking of TV series, with discrete seasons that tell something of a unified narrative: thus excluding miniseries (e.g. The Honourable Woman) and specialty shows (a la Sherlock or Black Mirror). Further, in order to qualify, the series must have at least three seasons to its name (so The Knick falls short and both Succession and Yellowstone ran out of time before decade's end). Seasons prior to 2010, however—such as Mad Men's first three or Breaking Bad's first two—don't count for the purposes of this list. I am solely considering television seasons comprising hour-long dramatic episodes shown or streamed between 2010 …

With Mr. Robot till the end

The TV show Mr. Robot ends its four-season run in six days.* It began four and a half years ago, in the summer of 2015. It didn't exactly begin with a bang, but the whimper of its premiere (if I may mix metaphors) snowballed into one by its first season finale. Its seemingly omnipresent, omniscient mastermind of a creator, Sam Esmail, appeared to be the Next Big Thing in TV: a child of 90s cinema, he was and is Fincher and Spielberg and Soderbergh and PTA and Tarantino all—at least aspirationally—rolled into one. Esmail eventually wrote and directed every single episode—40 in total by series' end—and the show is the complete vision of a self-styled auteur if ever there was one.

The second season lost much of the goodwill and momentum generated by the first. Sprawling, dense, literally and figuratively dark, trapped for much of its time in Elliot's mind, it drove away both critics and viewers in notable numbers, or so it appears from online commentary and anecdotal conversation…

A Twitter amendment

Last weekend I was in San Diego for the annual meeting of AAR/SBL, and (as has become my custom) I mostly saw old friends and new acquaintances. Most of the latter I have "met" online; most of those "meetings" were on Twitter.

Tomorrow marks 8 weeks since I began my experiment with decreasing my Twitter usage: zero time on that infernal website Sunday through Friday, and 30 minutes or fewer on Saturday; moreover, no active tweeting (original, RT, replies, etc.) on any day of the week: only occasional links to something I've written.

After San Diego, I'm reconsidering my experiment, or rather, considering an amendment to it. I think I'm going to try a modest "return" to being an active rather than passive user on Twitter, albeit within the same time and use constraints I've already set for myself. That is: limit both reading and tweeting to Saturdays, for 30 minutes or so, but become a sort of power-user for that half-hour of time: sharing t…

Luddites and climate activists, unite!

I encourage you to read Ben Tarnoff's piece in The Guardian from a couple months back: "To decarbonize we must decomputerize: why we need a Luddite revolution." The very worst approach to technology is fatalism: it's inevitable; it's the future; we just have to accept it. The second worst approach is denialism: it's not so bad, since (obviously and necessarily) nothing so central to our lives could be as bad as the naysayers suggest. The third worst approach is a failure to make connections. This last approach is oddly ubiquitous among the liberal folks I talk to about this issue. If either free-market liberalism or the digitization of our lives is so good, then why are the effects so bad for the environment? And what brakes stand in the way of further ecological harm? Denial underwriting technological fatalism certainly won't do the trick.

Perhaps climate activists are allies-in-waiting for Luddites, and vice versa. As Tarnoff observes, both perceive …

MCU Phases 4 & 5: dream or nightmare?

I have a mixed relationship to the Marvel movies that have so dominated the last decade of Hollywood. On the one hand, I readily enjoy them. I think, for the most part, that they are well-made blockbusters, occasionally quite good, directed competently, written with care, and acted superbly. Their achievement as TV-like serialization across 23 films (and three "phases") is, as Matt Zoller Seitz has written, without precedent and accordingly impressive.

On the other hand, I'm neither a comic-book "fan" nor an apologist for the MCU. I've read all of two graphic novels in my life, and have nothing invested in "geek culture." I furthermore share the general sentiment that the Marvel-fication of cinema as such is an unhealthy trend. It isn't good that there's a new superhero movie out every three weeks, and that Hollywood wants any and all blockbuster filmmaking to be (a) built on preexisting IP and (b) part of a larger "cinematic universe…

Experiments in Luddite pedagogy: dropping the LMS

This semester I wanted to experiment with teaching my courses without the use of an LMS. For those unfamiliar with the term, LMS stands for "learning management system," i.e., an online program for turning in assignments, communicating with students, updating the syllabus, inputting grades, etc. Some of us used Blackboard back in the day. My campus uses Canvas.

Now, Canvas is without question the best LMS I have ever encountered: intuitive, adaptable, not prone to random glitches and failures, useful for any number of pedagogical and technological ideas and goals. So far as I can see, after 15 years or so, the technology has finally caught up to the vision of using the internet well for teaching purposes, a vision that was ahead of its time one to two decades ago and that, probably as a result, led to a lot of wasted time and self-defeating habits.

But, you might be wondering, if Canvas is a good LMS, why did I want to experiment with not using one? Here's why.

1. I want to be i…

The question for Silicon Valley

A single question has lingered over Mike Judge's Silicon Valley from the beginning. That question is whether he and his writing team—call them "the show"—believe that Richard and his unlikely crew of can-do programming losers not only can but ought to "win," and that such a win could be genuinely transformative and good for the world, or whether the system and culture of Silicon Valley are so fundamentally corrupted that even to win is to lose.

This dynamic has made the show worth watching till the end, but frustrating at times as well. It's not just whether Richard or his friends might "break bad," which the show entertained for a while. It's whether we, the audience, ought to cheer on Richard when he triumphs over the big bads of Google and Twitter and Facebook (or their stand-in "Hooli"), with his dream of a "free internet," or whether we ought to see through the self-serving rhetoric that attends every such dream.

Judge…

On blissful ignorance of Twitter trends, controversies, beefs, and general goings-on

Being off Twitter continues to be good for my soul as well as my mind, and one of the benefits I'm realizing is the ignorance that comes as a byproduct. By which I mean ignorance not in general, nor of good things, but of that which it is not beneficial to know.

When you're on Twitter, you notice what is "trending." This micro-targeted algorithmic function shapes your experience of the website, the news, the culture, and the world. Even if it were simply a reflection of what people were tweeting about the most, it would still be random, passing, and mass-generated. Who cares what is trending at any one moment?

More important, based on the accounts one follows, there is always some tempest in a teacup brewing somewhere or other. A controversy, an argument, a flame war, a personal beef: whatever its nature, the brouhaha exerts a kind of gravitational pull, sucking us poor online plebs into its orbit. And because Twitter is the id unvarnished, the kerfuffle in question is…