It’s Good to be the King

Mel Brooks is back in the news. At age 95 he recently released a memoir entitled “All About Me!”, which chronicles his long and illustrious career. The book has gotten good reviews, and is on my list of books to eventually read (along with hundreds of others). You can count on Brooks to be irreverent, funny and shamelessly self-promoting. While he has had hits and misses, Mel has never done anything that was dull. 

The publication of this book has also brought a regurgitation of Brooks’ comments regarding political correctness. Mel is not a fan, to put it mildly. He has gone so far as to say that political correctness is the death of comedy. As Brooks put it, “Comedy has to walk a thin line, take risks. Comedy is the lecherous little elf whispering in the king’s ear, always telling the truth about human behavior”. Other comedic legends, such as Jerry Seinfeld and Dave Chappelle, have echoed those sentiments. 

It is hard to argue with Brooks’ generalizations about comedy. Much of the best comedy is subversive of pretensions and biases. Where would we be without Monty Python’s Upper Class Twit of the Year, or Richard Pryor’s Black and White Lifestyles standup routine? Or, for that matter, Brooks’ Blazing Saddles? We need comedians to burst the balloons we inflate justifying our often absurd behavior.

But are there limits? Brooks himself has said “I personally would never touch gas chambers or the death of children or Jews at the hands of the Nazis”. Is he suggesting that he would be OK if others did, or that this topic should be verboten for all? Does it matter who is doing the bit, or their intent? Are we, the audience, prohibited from crying foul if we think a comedian has gone too far towards promoting stereotypes, rather than deflating them? 

Political correctness has become a catchall defense for anyone who wants to deflect criticism from nasty, derogatory generalizations. Rather than justify controversial pronouncements, it is easier to blanket naysayers with this meaningless cliché and act as if the critics are the ones being offensive. It is a shallow, but all too often effective, defense.   

That being said, I don’t think that is what Brooks was talking about. Mel was not trying to justify anyone’s misogynistic or ethnic slurs. From everything we know about him, he is probably just as appalled as anyone else by statements from those in power, or seeking power, that denigrate people based on race, gender or sexual preferences. He just wants to carve out a niche for comedy to use those stereotypes to deflate those that embrace them.  

It is not as easy as Brooks suggests to put comedy into its own category, as evidenced by the fact that most of those who lauded his comments were political pundits who couldn’t care less about comedy. Whether he likes it or not, some people are going to use comedic bon mots to justify their own prejudices. And those justifications have real-life consequences.

For instance, hiring decisions have all too often been made not on personal qualifications, but on broad ethnic and gender generalizations. The jokes that are told around the proverbial watercooler turn into the bases for decisions regarding suitability for employment. Biases are reinforced and become part of the covert decision-making process. 

Political correctness, for all of the baggage that it has acquired, is nothing more than an effort to make us stop to realize those consequences. It is not an endeavor to protect people’s feelings, but to bury the societal assumptions that have kept certain groups from gaining equal access to opportunities. Isn’t that what this country is supposed to be about? 

By the way, comedy is doing just fine. For example, the movie Don’t Look Up was a biting satire that managed to “tell the truth about human behavior” without stereotyping any specific group. In fact, the movie was probably stronger because it played off our universal inclination to take almost any issue, generalize it without full understanding, and then proclaim our beliefs.

Stand-up comedians like Amy Schumer and Tim Heidecker have no problem mining our daily foibles without crossing political correctness lines, whatever they may be. They, and many comedians like them, have found a wide array of outlets, whether it’s through traditional mediums like movies and TV, or through alternative platforms like podcasts, Instagram or YouTube, to practice their craft. We probably have more opportunity to laugh now than we ever have.  

That doesn’t mean that these comedians do not have to walk the thin line that Brooks has drawn. Yes, today’s comedians have to be more adept at avoiding casual insult than Brooks’ generation had to be. But they do not seem to be unduly hampered by having to take that stroll.

At the end of the day, it is up to us, the audience, to determine what we will tolerate and what we will not. There will be those we turn away from, either because their “humor” does nothing more than denigrate those who are not like them (the Andrew Dice Clays of the world), or because their personal behavior makes laughter impossible (hello, Louis C.K.). But hasn’t that always been the way?

Maybe Brooks is right and some of his movies could not be made today, though I am not so sure. We still long to laugh at ourselves and the silly things we do. But if people want to use the guise of comedy to disparage, diminish and deprecate, we have every right to call them on it. That’s not political correctness. It’s the power of the peanut gallery, and long may it reign.     

Reading the Tea Leaves

Our emergence into 2022 struck me more than most flippings of the calendar. Maybe it’s all of those 2’s. Maybe it’s that we are almost a quarter of the way through the 21st Century. It probably has a lot to do with the sci-fi novels and movies that I’ve consumed over the years, many of which prognosticated a much different world by the time we got this far. Whatever the reason, it is taking me some time to absorb where we are on the cosmic timeline.

Many of the predictions, both fictional and otherwise, posited that technology would significantly change the way we live. The Lords of Technology promised that their innovations would do more than enhance our lives; they would transform them. The sense was that this would have happened by 2022. Much of the futurist fiction agreed with that premise, but assumed that the transformation would be catastrophic, not beneficial.

The innovations have been significant. The personal computer, the internet, e-mail, the cell phone, social media. All have altered how we operate on a day-to-day basis (and all were 20th Century inventions). But I am not sure that they have changed how we view the world, or how we relate to each other. They may have brought certain traits to the fore, like social media’s platform for tribalism, but those traits have always been there. 

A good argument can be made that the real shift was with the industrial revolution, which started in the mid-1700s, and that the societal changes since then have flowed from there. It was with the industrial revolution, and the Enlightenment which promoted it, that science, for better or worse, became central to our world view, replacing a predominantly theistic outlook. That trend may have accelerated through the 20th and 21st Centuries, but there has been no significant break in perspective.

Humans are basically conservative. Not politically, but personally. Most of us take comfort in continuity. We may adapt to technical change, but we still want it to fit within the framework of the familiar. We generally bend the technology to the lifestyle we know, using it as an augmentation rather than a disrupter.  

It will take something more radical than electronic gadgets, no matter how sophisticated, to break us out of the habits and mindsets that currently predominate. Climate change has that potential. If trends play out as some scientists predict, parts of the globe could become uninhabitable, and scarcity more prevalent. That could well result in deep-seated alterations in how we connect to the earth and each other.

The current chip shortages prefigure another possible radical shift. Perhaps at this juncture our world is not as threatened by new technology as it is by a loss of the technology we have come to depend on. So much of what we use in daily life requires those chips to operate. A more permanent disruption of that supply chain would entail a step backwards, which would be much more difficult than steps forward.  

Of course, none of that will happen in 2022, if it ever does (the climate change scenario is obviously much more likely). Even if any of these catastrophes do occur, they will play out in ways that we cannot imagine. They will demand flexibility and innovations that are beyond my ability to conjure. 

It is that personal inability to envision the future that will keep me going back to dystopias. The ability of books like Blindness by Jose Saramago or The Children of Men by PD James to evoke a future in which humans are forced to rethink life’s basic assumptions is fascinating. The power of movies like Mad Max: Fury Road, Blade Runner 2049, or even Wall-E in creating a broken-down world is striking.

That is the fun of dystopias. They let you explore worst case scenarios without actually experiencing them. At the same time, they reinforce that the future is not set in stone, and that we need to carefully consider the impact of what we are doing today on future generations. Not a bad message to receive. 

I would be remiss if I did not give a shoutout to depictions of the future that are not so grim. The various Star Trek series are at the top of that list, providing a possible societal evolution that is positive. I think that goes a long way towards accounting for their continuing popularity. As gripping as the dystopias can be, it is heartening to imagine a world where we are not careening through deserts in search of remnants of water and food, while fighting off drug-addled automatons.

2022 will undoubtedly be a year with surprises. They all are. But it is unlikely to be a watershed that will change the basic structure of our society. We will continue to muddle along as we have with incremental steps to who knows where. In the meantime, the futurists will continue to give us plenty to think about, and look forward to, or not.           

Ding Dong, Ding Dong

It is inevitable that the end of a year will bring reflection. The turning of the calendar all but requires that we look back on the year that’s past to remember the key events, the best music, TV shows and movies, and the famous who have passed away. The internet is awash with such lists, so it is somewhat presumptuous to make my own. Yet, I feel called to do so, with a focus on positive stories, putting my own inimitable spin on events we know well, and highlighting a few that might have fallen through the cracks.

  1. Workers got a taste of working at home and there is no going back. This started as an emergency measure in 2020, but it became clear this year that at-home work is here to stay, whether companies like it or not. Employees are going to demand this flexibility. It will be a factor in recruiting that companies cannot ignore if they want to attract top talent.
  1. Europe generated more energy from renewable resources than from fossil fuels in 2021. Even if you ignore questions of climate change, this translates to a better use of global resources, less reliance on imported energy and better air to breathe and water to drink, making this a healthier planet to live on. Hopefully the rest of the world follows suit. 
  1. Medical science came through with a COVID vaccine. We have to remember that initial estimates for a vaccine were a minimum of 1 to 2 years, yet this was basically done in 8 months. Put aside the political debates. This confirms that when we focus our resources, whether for financial gain or otherwise, we can achieve great things. Dwarfed by this accomplishment, yet maybe in the long term even more significant, is the fact that researchers also developed a vaccine for malaria, a devastating disease for much of the world. Both are reminders of human potential properly directed. 
  1. On-line social networking has come in for harsh criticism, much of it deserved, so it is worth noting that in less than a week the WallStreetBets subreddit raised $350,000 to “adopt” 3,500 endangered mountain gorillas through the Dian Fossey Gorilla Fund. It started when a user posted that he had adopted a gorilla, and exploded from there. Obviously, this is not earth-shaking news, but it is nice to get a reminder that media can be used positively. 
  1. Our democracy held. When protestors stormed the Capitol on January 6 there was a real question about what would follow. Would there be similar uprisings throughout the country? Would the military intervene, one way or the other? Would there be a unilateral rejection of certified election results? These things did not happen. Instead, the rule of law prevailed. Questions on election validity played themselves out in courtrooms, where they belonged. We had a peaceful, if tense, transfer of power three weeks later.
  1. Derek Chauvin is convicted in the death of George Floyd. There is no doubt that police have a difficult job, or that they can make honest mistakes in times of stress. But there must be a limit. Killing a prone suspect by kneeling on their neck for nine minutes has to be unacceptable. If it is not, the police have no accountability, no matter what they do. The conviction of Chauvin recognized that limits exist, and they can be enforced. 
  1. Juneteenth is declared a federal holiday. Like many government actions, this is purely symbolic. It doesn’t erase the history of slavery, or its vestiges. However, the end of slavery is a milestone. A turning point in our quest to live up to a cornerstone of this country’s founding premises, which is that all people are created equal. We know that we have not always lived up to that standard, but recognizing the removal of its major antithesis is a step in the right direction. 
  1. The African Continental Free Trade Agreement went into effect. It is incredibly easy to forget Africa. News from this huge continent tends to be the last reported on, unless it is something awful. We too often come away with a sense that Africa is lost, with a bleak past and a bleaker future. This Free Trade Agreement won’t change that impression overnight, but should enhance the competitiveness of member states within Africa and in the global market. That should lead to a more balanced world in which the exploitation of poorer countries is more difficult. I think that’s good for all of us.
  1. The withdrawal of US forces from Afghanistan. It seems odd to list this as a positive development, since our withdrawal resulted in the return of the Taliban. However, as I have argued previously, the withdrawal was inevitable, as was the aftermath. We were never able to instill democratic ideals into the Afghan people, and probably could have stayed another 50 years without doing so. At some point we had to pull the plug.
  1. 2021 was not 2020. I know, this may be faint praise, but it is worth noting. For much of 2021 it seemed that we were turning the corner on the COVID pandemic. This meant a return to travel, at least domestically, the reopening of arts venues, and a sense that the world could be explored beyond the four walls of our homes. It also helped that we were not going through a Presidential election. Even with the current surge, it is much easier to look forward with some sense of optimism than it was a year ago. Hopefully, next year’s list will end with an entry saying that 2022 was not 2021!!!

The Spirits of Christmas

I have been on an odd movie kick over the last few months. It started with a random decision to watch the three Sergio Leone/Clint Eastwood Spaghetti Westerns: A Fistful of Dollars, For a Few Dollars More and The Good, the Bad and the Ugly. I enjoyed the development of Clint Eastwood’s Man With No Name across these films, and the increasing assurance of the director as the plots and characterizations got more complex. It made me want to watch other movie series to see if they could match the consistency and creativity.

Some of the series I chose were predictable, like The Lord of the Rings – Extended Version (except for one glaring plot hole filled in the last movie, probably not necessary) and the Daniel Craig Bond movies (stick with Casino Royale and Skyfall). Other series are less well known, such as the Koker Trilogy, by Iranian director Abbas Kiarostami. (Where Is the Friend’s House? is one of the most unexpectedly tense movies I have ever experienced. Who would have thought that watching a 9-year-old boy search for his schoolmate’s house to return his homework book could be so suspenseful?).

As we approached year’s end, I realized that most Christmas series were of little interest (no, I do not want to watch The Santa Clause 3: The Escape Clause), so I decided to create my own. Like many people, I have certain Christmas movies I watch every year. One of those is the 1951 A Christmas Carol, starring Alastair Sim. I decided to make it a Dickens Christmas and pair it with The Man Who Invented Christmas, recently recommended to me (thanks P.S.), and The Muppet Christmas Carol, which I remembered fondly, but had not seen in years.

Even before the Hallmark channel decided to churn Christmas movies out like bad sausage, such films were a mixed bag. Unless part of the sub-sub-genre of Christmas horror films (Krampus, Treevenge, Silent Night, Deadly Night), Christmas movies have to ultimately be uplifting and cheery. Even those that buck the usual tropes, like Bad Santa and Scrooged, must give in at the end to reflect the “true meaning of Christmas”. 

In fact, I would go so far as to say that if a movie involving Christmas does not end on a positive, edifying note, it is not a Christmas movie. Fanny and Alexander has one of the greatest Christmas segments ever, but those scenes are there to contrast the children’s subsequent life with their stepfather, not evoke Christmas. Yes, Die Hard takes place on Christmas Eve, but that is superfluous to the plot. Not a Christmas movie!!!

Knowing what you are going to get with a Christmas movie is not necessarily a bad thing. There is something nice in being assured that you can pluck any Christmas film off the shelf (a dated Blockbuster reference) and come away smiling. Still, it is a fine line between a film that is comfortably elevating and one that is sickly sweet. No other genre crosses that line again and again like Christmas movies.  

A Christmas Carol provides the ultimate Christmas arc. A character who rejects everything that Christmas stands for – hearty fellowship, charity, goodwill to all. Then, through a string of unlikely events, he comes to realize the enchantment of Xmas and fully embraces its magic (bring out the hankies).

According to Wikipedia, there have been 30 live action filmed versions of the Dickens story, 17 animated Scrooge films and a dozen TV renderings, though I have no doubt the list is incomplete. There are no surprises in any of them. You know the characters. You know what is going to happen to them. And yet the story stands up, whether Scrooge is Captain Jean-Luc Picard (alias Patrick Stewart), an American like George C. Scott (amazing similarity between his Patton and his Scrooge), or Mr. Magoo.   

And yet, for me the Alastair Sim version is the one I come back to time and time again. I am not sure why. Part of it is undoubtedly that I know it so well, but I think there is more to it. Not only is this Scrooge spot on, but so are Fezziwig, Marley, Mrs. Dilber and the other secondary characters. The film perfectly captures the spirit of the story, its otherworldliness, its humor, its intensity.

The Man Who Invented Christmas, about the writing of A Christmas Carol, attempts the difficult task of mixing the tortured artist and Christmas genres. Surprisingly, the film pulls it off. It does not hurt that Jonathan Pryce plays Dickens’ father, and Christopher Plummer the avatar of Scrooge. It is a pleasure to watch this classic story emerge, while at the same time Dickens is learning the lessons imparted by his own characters. I am not sure if it is historically accurate, but I really do not care. 

The Muppet Christmas Carol deftly manages to blend the wonderful zaniness of the Muppets with this, at times, dark tale. The songs by the under-appreciated Paul Williams, who also wrote The Rainbow Connection and Rainy Days and Mondays, are terrific. (We’re Marley and Marley is my favorite). But what really surprised me was that it was mainly filmed with natural lighting, maintaining the gloom of old London so central to the story. A more than worthy addition to the Christmas Carol inventory.

Another season passes. Time to park these movies on the to-watch list for another 11 months. However, it’s nice to know that come next year, and the year thereafter, there will be films to revisit that are guaranteed to raise a smile and put a catch in the throat. Until then, as Scrooge would say, “A merry Christmas to everybody! A happy New Year to all the world!”.

Flip for Side 2

It is that time of year again. Time for the WXPN greatest countdown. Some of you may remember that last year they did the 2,020 best songs of all time, with Thunder Road by Bruce Springsteen the ultimate winner. This year it’s the 2,021 greatest albums. Let the debates begin.

I came of age in the heyday of the album. In the 1970’s you couldn’t make a playlist across artists, or shuffle through multiple discs. Skipping a song meant getting up, lifting the needle, and putting it back down again on the next track, usually with an accompanying screech that told you another scratch was coming. Needless to say, it didn’t happen very often. You just accepted that there were going to be tracks on most albums you had to put up with (e.g., Maxwell’s Silver Hammer).

It is why finding an album that was outstanding from first song to last was a revelation. There is something satisfying about a side 2 deep cut that you like better than the “hits” (like “Chest Fever” on The Band’s “Music from Big Pink”). It is even better if the album had no hits and you feel like you are in on something that the radio-listening world was missing. (Radio was just awful in the 70’s, unless you liked to listen to the same song over and over again).

An album is also a better measure of an artist’s worth. Many musicians can come up with a good song now and then. “Come On Eileen” is a great rocker, but do you really want to listen to a full Dexys Midnight Runners album? It takes real talent to put together 10 to 14 songs worth listening to, let alone sustain that across multiple albums. Pink Floyd, with the four albums from “Dark Side of the Moon” through “The Wall”, pulled it off, but few have been able to do so.

The trouble is that appreciating an album takes commitment. You have to listen to it over and over. Many tracks sound good the first time through, but wear quickly. I am sad to say that I haven’t taken the plunge all that often for many years, so my list is dated. I know that there are outstanding albums coming out annually (St. Vincent’s “Masseduction” (No. 711) and Father John Misty’s “I Love You, Honeybear” (No. 824) are some of the more recent that come to mind), but I hear few of them all the way through.

It is almost surprising that artists today bother with albums, given the way music is consumed now. Streaming that allows playlists and shuffling begs for most of an artist’s output to be ignored. And yet, musicians keep putting out quality material (my son would point you to Phoebe Bridgers’ last two albums, Nos. 536 and 551). My guess is that much of it does not get the listens it deserves.

Looking at the WXPN responses, I am not the only one to be stuck in the 70’s. The list, especially at its higher reaches, is scattered with albums from the 2000’s, but earlier years predominate, with the 1970’s having 150 more albums than the next nearest decade (the 1990’s). Just as telling, almost half of the top 100 come from the 1970’s, with only 3 issued in the 2000’s, as do 8 of the top 10 (the only two holdouts being Beatles albums).

I know that a lot of that has to do with demographics. There are way too many old people like me voting and not enough younger listeners. Plus, the WXPN audience is mainly white, which definitely skews the list (nice to see To Pimp a Butterfly by Kendrick Lamar in there at 137). Yet, I do think the list reflects the change in how music is being consumed, for better or worse.  

I submitted my top ten list some time ago, and am not fully sure I remember what I sent in. However, to the best of my recollection it was as follows (in no particular order).

  1. George Harrison – All Things Must Pass (No. 62) 
  1. The Beatles – The White Album (No. 35, down from No. 6 when they last did this in 2005)  
  1. John Coltrane – Giant Steps (No. 274. A Love Supreme came in at No. 68) 
  1. Miles Davis – Kind of Blue (No. 22. Top Jazz album) 
  1. Steely Dan – Aja (No. 19) 
  1. Pat Metheny Group – Still Life (Talking) (No. 804) 
  1. Bob Dylan – Blood on the Tracks (No.7) 
  1. Pink Floyd – Wish You Were Here (No. 40) 
  1. Bob Marley and the Wailers – Exodus (No. 107) (Thanks Dan W.) 
  1.  David Bowie – Blackstar (No. 1562) 

The WXPN Top 10 were: 

  1. The Beatles – Abbey Road (also No. 1 in 2005) 
  1. Pink Floyd – Dark Side of the Moon 
  1. Bruce Springsteen – Born to Run 
  1. The Beatles – Sgt. Pepper’s Lonely Hearts Club Band 
  1. Stevie Wonder – Songs in the Key of Life (up from No. 54) 
  1. The Clash – London Calling 
  1. Bob Dylan – Blood on the Tracks 
  1. Fleetwood Mac – Rumours (up from No. 36) 
  1. Joni Mitchell – Blue 
  1. Carole King – Tapestry 

Again, all of this has no meaning, but it is still fun. I wonder what XPN will do next year?

A Pertinent Question

A Mr. Richard Feder of Fort Lee, New Jersey asked, “Hey Tomser! Why do you think we focus on the significance of World War I when the Spanish Flu, starting in 1918, is estimated to have killed twice as many people?”. In response, I gave a typically shallow and glib answer, saying, “I think as humans we are attracted to stories, and the WWI stories are just more compelling than stories of the flu”. As I thought about this more, I realized that this is a material question. Why do certain events resonate in our collective memories and others not, even if they should, and how has this changed over time?

I read a book a number of years ago (don’t ask me the title) where the author posited the theory that humans had evolved with respect to memory. He claimed that humans had traditionally relied on what they could personally recall to remember what they needed to, and had developed the skill to retain it. Now, he argued, humans rely on sources outside the brain for memory storage, changing not only how we access memories, but our actual anatomy.

I am not sure that I bought the biological aspects of this argument, maybe because they went over my head (which they did). But the idea that we had, to a large extent, outsourced our memories has stuck with me. Whether we like it or not, our understanding of events, and even the importance of events, relies primarily on sources external to us. That is true for things that happen to us directly, where we often rely on such items as photographs, but even more so for things we were not directly involved in.  

Outsourced memory has been an expanding process that went into hyperdrive in the 20th Century. As travel and communication became easier, we all of a sudden had a wealth of information at our fingertips about what was going on not only in the next town, but on the other side of the world. With the advent of the internet, that knowledge could be instantaneous. (For example, a quick search disclosed multiple days of ongoing violent protest in the Solomon Islands. Who knew?).

The problem is that this is more information than any individual can digest. We have access to everything, but must filter the news somehow. We cannot do that ourselves, and so have to rely on sources which we hope are relatively honest and accurate. Those sources must also pick and choose what they cover and emphasize, even within individual stories. 

This conundrum is multiplied when it comes to understanding historic events. For example, there are thousands of books, websites, movies, etc. about World War I, more than any individual, even an obsessive, can absorb. And that is just if we isolate it as a topic. What about everything that led to WWI, like Austria-Hungary’s 1908 annexation of Bosnia and Herzegovina, or the Moroccan crisis of 1911? And what about everything that followed, like the partitioning of the Middle East and German hyperinflation, not to mention the Spanish Flu?

 So, we are stuck with what comes across our path. What books are recommended to us, what movies are streaming, what internet sites Google chooses to highlight. Those decisions are going to be largely driven by what is accessible and popular, not by what is the most insightful or thorough. 

I recently encountered this with the book The Bomber Mafia by Malcolm Gladwell, covering debates within the American Air Corps about bombing strategies, leading to the incendiary bombing of Japanese cities in the months before the atomic bombs were dropped. Gladwell paints this as a morality tale between two generals with competing views on the ethics of bombing non-combatants.

It is an interesting, readable book, but, as pointed out in a review by a history professor, it is incredibly simplistic. The debate was more one of tactics than morality, and encompassed many more players than the generals Gladwell highlights. However, as the professor also points out, Gladwell’s best-selling take is likely to become the defining narrative of these bombings because of his popularity and the book’s accessibility. A more nuanced history will have to wait, and even then is unlikely to be read by many people.

This historical culling is also impacted by the fact that the moving image is generally more memorable than the written word. Movies and television play an outsized role in determining which historic events are burned into the public consciousness and which are not. This can spark great public debate, as did the airing of Roots with respect to slavery, but can also leave less dramatic incidents out in the cold.

The other problem with a reliance on moving images is that those making these films and shows are driven as much to entertain as they are to enlighten. This means cutting historic corners. Dunkirk was a movie that sought to depict a key landmark in WWII as realistically as possible. Yet, in watching the movie you would think that there were only about a dozen planes in the sky during the retreat, rather than the hundreds that were there. It was just dramatically more satisfying to focus on a handful of pilots. History be damned.

More insidious are movies that get the factual record horribly wrong, but manage to instill those errors in the public consciousness. The most notorious of these are, of course, The Birth of a Nation and Gone with the Wind. Both are great movies, and both distort the realities of slavery and Reconstruction. Both also were hugely influential on how Americans viewed the Civil War for decades (No, Woody Wilson, it was not all “so terribly true”).

So, back to Mr. Feder’s original question, from which I wandered so aimlessly. Much of what we remember as a society, and how we remember it, is out of our hands. For the most part we have to rely on others to present materials for our consumption and absorption. We can fight this on a personal level by taking in multiple narratives of an event, and by reading historic accounts that may challenge accepted wisdom, but there is only so much time in the day. And even then, that wouldn’t change societal focus. So there we are. Feel better now, Mr. Feder?

Just Let it Be Already*

I am somewhat embarrassed by my obsession with The Beatles. How banal and mundane. When asked about music, I am hesitant to say that The Beatles are my favorite band, and that I still constantly listen to them. You can see the yawn being stifled, and the usually futile attempt to hold back the inevitable response of “Can you be any more boring?”.

I have often thought that I need to come up with a more eclectic response. Maybe assert my love of Folk punk (let’s put on some Violent Femmes, or The Pogues), or perhaps Instrumental Rock (tough to beat Jeff Beck, or Soft Machine). Better yet, disavow pop altogether and proclaim my love of Free Form Jazz (wasn’t the Free Form Jazz Odyssey the best part of The Spinal Tap movie?), or classical Futurist music (all hail The Art of Noise manifesto). 

Alas, I am stuck with who I am. I am doomed to listen through the entire deluxe box sets of Sgt. Pepper’s, The White Album, Abbey Road and Let it Be (The White Album Esher demos are especially good). I was inexorably drawn to an 18-month “Masterclass” in Beatles lore, dissecting every album and controversy. (Did you know that the first British performance of the Beatles as a group was at the Casbah Coffee Club?) I get mad when I perceive that my favorite Beatle (George) is being dissed (He was right to walk out!!!!!).

While I grew up on Beatles music, my obsession really started in college with The White Album. I listened to it over and over again, mesmerized by what I heard (unfortunately, so did Charlie Manson). My friends and I used a pencil to play it backwards, listening for Paul-is-dead clues (John definitely says “Paul is Dead. Miss him. Miss him. Miss him.” at the end of I’m So Tired). I bought most of my Beatles albums used, and still anticipate skips in certain songs 40 years later.

I keep asking myself what keeps drawing me back to these songs. Some of it is no doubt nostalgia (oh no, there’s that word again). Beatles songs certainly evoke memories of a time and place. But then again, so do many other songs and I don’t listen to them repeatedly.  

There is also the complexity of the songs, which reward multiple listenings. The Deconstructing the Beatles series by Scott Freiman (one of my Masterclass instructors) highlights the myriad nuances and influences embedded in these tracks. Those influences have led me in other directions, like to an appreciation of Indian music (maybe I can use that as my go-to response to questions about the music I like).

The incredible progression over the eight years of recording is definitely a factor. There are light years between I Want to Hold Your Hand and the Abbey Road medley and yet you can see the steps leading from one to the other. Witnessing that growth is fascinating. 

Finally, there are the Beatles themselves. Their personalities were established in A Hard Day’s Night (the best rock and roll movie ever), and built from there. Few have faced the glare of fame with as much humor, honesty and aplomb. Knowing those personas, even if it is through the lens of media, enhances the performances.

All of this is coming to the fore now because of the long anticipated (at least by me) Peter Jackson retelling of the 1969 Get Back sessions. I have watched the official trailer and all of the various promotional videos many times. I have read every interview I could find about the making of the documentary. I cheered the expansion from a 2-hour film to a 6-hour extravaganza. I am ready!!    

I saw the original Let it Be movie at midnight showings when it was still available. I never bought into the narrative that it was a film of a band breaking up. After all, Abbey Road followed. I am looking to Peter Jackson to set the record straight.  

Peter Jackson could not have been a better choice. While still best known for his Lord of the Rings, he vaulted to my list of favorite directors with his WWI documentary, They Shall Not Grow Old. I have mentioned this film before when discussing my preoccupation with WWI. It is the epitome of bringing the past alive, and I trust him to bring the same magic to this film. 

More than anything else, this will be a chance to wallow in my obsession. I can spend multiple nights devoted to my favorite band, and justify it as witnessing a cultural event. Even if, for public consumption, my real love is Psychedelic Soul or Acid Jazz, there is no need now to hide my latent Beatlemania. I can put on my mop top wig, John Lennon glasses and Beatle boots and scream to my heart’s content. I can’t wait. 

*This was actually written before Get Back premiered on November 25, but I got caught up in holiday planning (I hope everyone had a great Thanksgiving), and have been in a bit of a tryptophan haze over the last couple of days.

Get Back to Where You Once Belonged

I seriously dislike the word nostalgia. It is a musty word. A word that connotes clothes that have been too long in a cedar closet. Or a gumball covered with lint emerging from your pocket. And yet, if you keep it in the right context, personal nostalgia can be both incredibly enjoyable and illuminating.

I spent ten days over the last two weeks engaging in some personal nostalgia. I visited friends from my college days at the University of South Carolina, and then went to Charlottesville, Virginia to spend time with a close friend from my working life. The trip was very gratifying, not least because it was good to be on the move again after the COVID isolation, but also because it forced me to think back on who I was at specific times of my life and how that long ago self still inhabits who I am today. 

Life generally forces us to live in the present. There are so many things that must be dealt with on a daily basis that it is not possible to give much thought to our past iterations. Even when we do so, it tends to be very cursory, calling to mind a memory here or there that makes us smile, or cringe. We (or at least I) rarely think hard about the odyssey that got us here. 

Immersing myself in that past, even for a few days, forced that reflection, especially as my trip entailed many hours alone in a car with a cell phone that would not recharge (just try to find a decent radio station outside Fayetteville, North Carolina). The memories flooded back. I conjured up people I haven’t thought about for years, even though I often could not recall their names. I thought of times that were great fun, as well as times of great guilelessness and stupidity. The person I was seemed both a distant relative and a boon companion. 

Spending time with lifelong friends takes you down that rabbit hole even more. People often comment how very quickly you fall back into comfortable patterns of communication and interaction with old friends. How a part of you that you haven’t seen for some time reemerges. I find that very true.  

I think, however, it is more than just a passing dive into nostalgic revery. The person I was 40 years ago has never left me. The essence of who I am today is tied very closely to who I was then. The so-called formative years were not only childhood, but each swerve along the path, through college and law school, into the early years of working up to my last days before retirement, all leading up to where I am now. 

That doesn’t mean that things haven’t changed. It’s impossible to go through life, with its many twists and turns, and remain exactly the same. I know that fewer things strike me as funny than once did, and I miss that. I also know that I was ridiculously naïve and innocent, and while innocence may seem like a blissful state, it is unsustainable, and not even preferable, unless you’re willing to put your head in the proverbial sand.  

Regardless of those changes, falling back into old rhythms for a while strikes me as very healthy. It reminded me that, even now, personality is not static. Time never stops, and neither does our development. I continue to build on the edifice (shaky as it is) of what has gone before. Who I am is an on-going question that is never fully answered.

Just as important, it is great fun. Being able to kick back and relax with people who have seen you at your best and worst is cathartic. You’re able to pull out references that make sense to no one else (e.g., the trestle, home run derby, Fencourt), and riff on them. And there is nothing to do but laugh at yourself and the silly things you did.

Testing memory is, of course, a mixed bag. Many incidents come rushing back, but how many of them happened as I recall is very much up for grabs. Did I really do the things I think I did, as I remember doing them? Maybe yes, maybe no. To what extent am I editing my history? Who’s to say. My friends’ memories are as suspect as mine, and luckily there were no cell phone cameras in those days to resolve any discrepancies.

All that being said, I would not want to live in that nostalgic haze. The temptation to do so is why nostalgia has such a negative connotation. Memory has a tendency to whitewash the past. I remember much more of the good than the bad. It is dangerous to get too caught up in that and see the bygone days of youth as some idyll. The reality of the present can sour in the glare of such a fantasy, and that is a living death. 

The truth of the matter is that I would not want to go back. There would be too much to give up. For all of the ups and downs of the last 63 years, there are still things to look forward to. And while the past inhabits who I am today, it is no longer me, with all my flaws, anxieties and regrets, but also with all my hopefulness (still somewhat an innocent) and excitement about each new day. 

I know that I am going to keep connecting with old friends. They are just too important to discard, and too much fun to be around. Plus, in a way I can’t really define, looking back at the past helps me appreciate what I have now. It’s a very odd process, this thing called life.    

            

We Need the Weird

The induction ceremony for the Rock and Roll Hall of Fame went forward on October 30. While I generally think that the whole concept of a Rock Hall of Fame is anti-Rock and Roll, and suggests little, if anything, about rock greatness, I still like to see who is being inducted and who is left out. It’s especially interesting now that most of the obvious inductees have been in the Hall for quite a while.

This year’s inductees are a typically mixed bag. I am glad to see Tina Turner make it, especially when the Hall inducted a much less talented and influential Stevie Nicks a couple of years ago. You have to admire Todd Rundgren, though his overall output is spotty. I enjoyed the Go-Go’s, but did they do anything after Beauty and the Beat? Not that I know of. 

With the inductions comes the inevitable complaining about Hall snubs. Kiss member Gene Simmons called it disgusting that Rage Against the Machine and Iron Maiden didn’t make it this year. Personally, I think that it’s disgusting that a band like Kiss, better known for their faux-goth make-up and Simmons’ ginormous tongue than their music, is in there.

To my mind, there is one snub that outweighs them all. One snub that pushes the Hall to the edge of irrelevance. One snub that should make the current inductees blush with shame. I am, of course, talking about Weird Al Yankovic.

Starting with the immortal My Bologna in 1979, through 1988’s Fat, 1993’s Bedrock Anthem and 2006’s White & Nerdy, Weird Al has provided us with some of the most unforgettable rock anthems of the last 40 years. Can any of us listen to Nirvana’s Smells Like Teen Spirit without picturing Al trying to sing with marbles in his mouth on Smells Like Nirvana? Isn’t Dire Straits’ Money for Nothing that much better with Al converting it into a tribute to the Beverly Hillbillies?

Some may complain that Al doesn’t write his own songs, but that ignores his incredible polka output. Who else can find the common polka heart in such songs as LA Woman, Smoke on the Water and Hey Jude as Al did with the Polkas on 45 masterpiece? And let’s face it, Bohemian Polka more than rivals Queen’s Bohemian Rhapsody for audacity and musicianship.

I am the first to admit that Al’s career has not been without controversy, but isn’t that part of Rock stardom? I do wonder whether his continuing snub is tied directly to his squabble with Coolio over the exquisite Amish Paradise. Apparently, Coolio did not appreciate the brilliance of this piece (who else can write lyrics like “I’m a man of the land, I’m into discipline. Got a Bible in my hand and a beard on my chin. But if I finish all of my chores and you finish thine, Then tonight we’re gonna party like it’s 1699”), and took affront. But similar controversies (alright, maybe not similar) have not kept Ringo Starr (33-year-old men shouldn’t be singing You’re Sixteen, Richard) or Genesis (Nice Latino accent on Illegal Alien, Phil) out.

Many of you may think that I am kidding about this nomination, but I am not (or at least not entirely). Rock and Roll is at its worst when it gets pretentious (the same can probably be said about this Blog). And that is from someone who is a big prog rock fan (Yes to Yes). Weird Al is the antidote to that pretension.

Let’s face it, for all of the hullabaloo about rock stars being artists, 90% of rock lyrics are downright inane. The Hall already has plenty of examples, such as “You say ‘black’ I say ‘white’. You say ‘bark’ I say ‘bite’. You say ‘shark’ I say ‘hey man ‘Jaws’ was never my scene’” (Queen, Bicycle Race) or “Bonafide ride, step aside my johnson. Yes I could in the woods of Wisconsin” (Red Hot Chili Peppers, Around the World). Weird Al’s “Have some more yogurt. Have some more Spam. It doesn’t matter if it’s fresh or canned. Just eat it. Eat it! Eat it!” is no less frivolous than “Showin how funky and strong is your fight. It doesn’t matter who’s wrong or right. Just beat it (beat it, beat it, beat it)” from the King of Pop. I could go on, but you get the point.

Faced with this kind of junk from feted artists, we need someone to confront the silliness. We need someone to step up and cleverly point out again and again that rock and roll is something to enjoy. Something to bop your head to. Something to bring a smile to your face and a bounce to your step. And Weird Al is that man. 

There is no place on earth that needs this lesson more than the Rock and Roll Hall of Fame. They want to be the gatekeeper to the realms of rock royalty. But the whole concept of rock royalty is heretical. At its heart rock is a bunch of kids in a garage banging away on their instruments trying to come up with something that their parents will hate and people will dance to. Weird Al embodies that spirit like no one else. 

There’s my argument. Next year we must all unite to get Al into the Hall. Only then will it fulfill its mission to truly reflect the essence of rock and roll. Its drive. Its joy. Its power. And, yes, its wackiness. Its daftness. Its zaniness. All hail Rock-and-Roll. All hail Weird Al Yankovic.

Our History, Right or Wrong

I originally wrote this post about Critical Race Theory over the summer, but then I thought it was just too trendy. Another nothing issue for people to vent over before it disappears into the night. I should have known better. We are now a country where almost anything, no matter how flimsy, can be whipped into a political issue, and then flogged to death by fanatics. After reading an article about a Wisconsin school board recall effort, and the centrality of this issue in the Virginia Governor’s race, I decided to revisit the topic.

My guess is that I am not the only one who had never heard of Critical Race Theory until this past spring. In fact, my guess is that most of us still could not say what it is, where it came from, who is propounding this theory, or what it teaches. And yet, it has become a rallying cry. A line in the sand that supposedly separates wholesome historical thought from anti-American propaganda designed to destroy love for this country. Whatever it is, we cannot let it infiltrate our schools to pollute the minds of our youngest citizens.

You would think that CRT is a newly developed idea that was concocted by bitter, out-of-touch academics over the last couple of years. In fact, according to Wikipedia (the font of all knowledge), CRT originated in the mid-1970s in the writings of several American legal scholars. The core insight of CRT is that disparate racial outcomes are the result of complex, changing and often subtle social and institutional dynamics rather than explicit and intentional prejudices on the part of individuals (how’s that for an academic mouthful). In other words, merely making laws colorblind may not be enough to ensure that the application of those laws is colorblind as well. Of course, CRT is much more nuanced than that, but that’s the gist.

Considering the prevalence of race as a driving force in the history of this country, CRT would seem essential to an understanding of the United States. At a minimum, we need to think critically to counteract two of the great historical lies of the 20th Century. First, that the Civil War was not about slavery, but instead about states’ rights, and its corresponding fantasy that Reconstruction was an utter failure which proved that Black Americans were unfit for full participation in American political life. 

The second great lie is that race was a Southern problem, and not one in the North. Discrimination in areas such as employment and education obliterates this false dichotomy. In fact, a strong argument can be made that CRT is more important in analyzing what was done in the North than in the South. Southern politicians were clear and unapologetic about the Jim Crow laws and their purpose. Northern politicians were more subtle, but the racial impact of laws in Northern states was just as profound.

Enforcement of drug laws over the last 50 years could be exhibit number 1 for the need for CRT. The laws themselves are race-neutral, yet enforcement has impacted the Black community to a much larger extent, despite the fact that study after study shows there is little disparity in illegal drug use between Black Americans and other races. The disparity is in enforcement. Who is targeted. Who is prosecuted.

The failure of changes in law to remedy the effects of long-term discrimination is most evident in housing. The Fair Housing Act of 1968 prohibits discrimination in housing, yet it is battling against federal, state and local policies that specifically sought to segregate on the basis of race. The Fair Housing Act could not simply wipe that history clean, nor could it fully change ingrained practices with a stroke of the pen. If we don’t understand this history and its continuing impact, such as with respect to the 2008 sub-prime mortgage debacle, we don’t understand this country.

It is no surprise that such critical analysis has been a staple of academia for years. Isn’t that what should be happening at universities? Shouldn’t professors be looking deeply into their fields of study to understand the underlying realities of their specialty? Shouldn’t that be as true in history and law as it is in physics and biology?

More importantly for this debate, there is no evidence that this theory has permeated elementary and secondary schools. While slavery and its impact are, and should be, taught, school curriculums are highly unlikely to delve into issues of systemic racial impact. One wonders whether the real goal is to eliminate any discussion of this uncomfortable topic.

The scope of proposed laws banning the teaching of CRT would seem to confirm that this is the real agenda. For example, Tennessee’s proposed anti-CRT bill would ban any teaching that could lead an individual to “feel discomfort, guilt, anguish or another form of psychological distress solely because of the individual’s race or sex.” In addition to this vague proscription, it restricts teaching that leads to “division between, or resentment of, a race, sex, religion, creed, nonviolent political affiliation, social class or class of people.” Those who decried PC culture as raising a generation of hyper-sensitive snowflakes are now worried about their children’s “discomfort” and “psychological distress”. Give me a break.

Of course, this debate is not about fields of research, but about controlling narrative, and continuing a never-ending manufactured culture war. The irony is that efforts to pass laws that ban the teaching of CRT reinforce the need for critical thinking in all areas of study. We need an electorate that can analyze what is being proposed, why it is being proposed and judge the ramifications of its passage into law. That can’t be done without critical thinking, whether it’s to analyze idiotic proposals like anti-CRT legislation, or crucial ones like the rebuilding of our infrastructure. Heaven knows we can’t rely on our politicians, or our TV pundits, to provide real analysis. 

Even though the European sojourn in the Americas is a small part of world history overall (500 years within 6000 years of recorded history), it is one of the richest and most unique aspects of the human story. It has incredible highs (the Declaration of Independence; the Lewis and Clark expedition; the civil rights movement) and incredible lows (the slave trade; the Trail of Tears; Japanese-American internment). It is an incredible story of mankind’s quest for human, religious and economic rights. It teaches endless lessons about the nobility of that quest, and its pitfalls. To the extent that we try to pick and choose within that history what makes us feel good about ourselves, and whitewash anything that makes us uncomfortable, we denigrate those who, with all their flaws, espoused the ideals that make this country what it is. And that is the real crime.