Does Football Make America Better?
If we're searching for meaning and belonging, one option is to enlist (American) football in the struggle. I'm skeptical, but maybe I shouldn't be.
Some of you might have caught wind of a controversy I was part of two weekends ago. I tweeted a criticism of Philly sports fans and of football more generally, and then deleted it because the attacks were worse than anything I've experienced online. I thought it was playful, and it wasn't meant to be taken that seriously. But for some people these are foundational commitments and allegiances, and I had mocked them.
I hesitate to share the tweet in question, but it’s important for context:
My mom, who doesn’t know how football works and until that moment had probably never uttered the word “Eagles,” said something important to me after the dust-up. As she well knows, I can be a curmudgeon sometimes. So she told me, Shadi, just because something annoys you doesn’t mean you have to share it on Twitter—“just let people be happy.”