It is my policy to respond to any e-mails I receive through ClistonBrown.com, so long as the e-mails are generally respectful and not threatening. However, I have received such a large volume of e-mail in response to my Observer column of May 31 (“It’s Okay If You’re A Republican”) that I have been unable to respond to each individual note, for which I apologize.
While a fair number of the e-mails were abusive, and most disagreed with my column, I was pleased that so many were constructive and clearly written in an honest attempt to engage. If you took the time to write a thoughtful critique and I did not respond, I apologize, and I encourage you to stay in touch in the future.
In the next day or two, I plan to post on this site a general response to the comments I have received via e-mail and Twitter.
Whether you agreed with my latest column or not, thank you for reading and responding.
Over the last quarter of a century, there have been countless allegations and criticisms hurled at Hillary Clinton. Some of them have had merit. Others have been silly. Not a small number have been outlandish. (“She murdered Vince Foster,” for example.)
But by far the most ridiculous criticism ever leveled at Clinton is that she may, on occasion, have said some nasty things about the women who went to bed with her husband. This bit of foolishness is rearing its head again today, as Donald Trump apologists attempt to draw a parallel between his outrageously sexist and porcine statements, revealed this weekend, and how Bill Clinton has behaved with women over the years. Inevitably, Clinton supporters counter that Bill Clinton, unlike Trump, is not running for office, and that Hillary Clinton is not to blame for his behavior. And Trump’s remaining apologists counter, “Yes, but she degraded the women her husband preyed on.”
First, Bill Clinton didn’t prey on anybody, despite Republicans’ repeated insistence over the years that he somehow tricked or pressured impressionable young women into bed. He committed adultery with consenting adults who, like him, should have known better.
Second, and this is an important point, wouldn’t anybody whose spouse cheated have some negative things to say about the person or people he/she cheated with?
Seriously, if you find yourself criticizing Hillary Clinton for voicing a poor opinion of her husband’s mistresses, do yourself and everyone else a favor and just stop talking. You’re being ridiculous.
Don’t be expecting too much from tomorrow’s presidential debate, or any of the debates. We live in a time in which most people already have their minds made up and can’t be swayed by anything. If Donald Trump climbs up on the moderator table, drops his pants and defecates right there, his supporters will cheer.
The country is locked into two ideological camps. People are going to tune in tomorrow night largely to cheer for their side, much like a sports contest. They’ll boo if their candidate gets a tough question, in the same way sports fans boo every call against their own team. Most of the few who don’t tune in to cheer or boo will just be watching to see if a train wreck occurs.
Rah-rahs and gawkers. That’s the American electorate. We have met the enemy, and it is us.
It would be no exaggeration to say that I owe my position as a columnist with the New York Observer to Twitter. I happened to get into a Tweet-out with an Observer columnist about a year ago, and the editor noticed, took a look at my blog, and offered me the opportunity to write for him—an offer it took me roughly four-tenths of a second to accept.
By that time, I had been an avid Tweeter for some time and was beginning to grow a small following of people who liked my political analysis. I had a strict policy: nothing but political analysis. I soon found other serious political analysts liking and re-Tweeting my tweets, not to mention hundreds of regular Joes and Josephines with an interest in the topic.
Somewhere along the way, I went from learning how to master the medium to being mastered by it. I’m not sure exactly when that happened, but I soon found myself typing anything that I thought was going to get eyeballs, the more outrageous the better. And it worked. I kept getting new followers. It was still a modest following, but one that went up by about 30% in a year’s time.
It became addictive. It was like a sugar rush to throw out the snappy comeback and to get the cascade of likes and retweets. It meant I was finally being noticed; that I had something to say that people wanted to read. It was a giant ego boost to have all these people cheering me on whenever I landed a punch.
I found that I, introvert of introverts, had all kinds of new friends I’d never met. And suddenly, I was sucked in. I found myself compulsively checking my phone to see the latest Tweets. It began to be where I got most of my news and, increasingly, a disturbingly large share of my social interactions—a substitute for the real social interactions I increasingly failed to seek out in a new city.
Somewhere along the line, the line got crossed. It became personal. And I found myself getting into increasingly nasty arguments with trolls. Or was I the troll? I guess that would depend on one’s point of view, wouldn’t it?
The more invested I got in it, the less fun it became and the more it seemed like an unhealthy escape from real life. I began to treat occurrences on Twitter as if they really mattered to me personally. I actually stressed over whether somebody I’d never met, and never would meet, managed to get the better of me in an argument. I guess you could say the people on Twitter did matter and they do matter, because the people I interact with are indeed real people with real lives, but it’s dangerous to forget that you’ve got to live your life where you are. You can’t live it in the Twitterverse, even if it seems like a more interesting world than the one you actually inhabit physically. I began to forget that. Hell, I forgot it entirely.
So here’s the dilemma. How do I pull myself back from this overindulgence in Twitter without cutting off the opportunity to interact, to influence, and most importantly, to market myself as a political analyst and columnist with valuable things to say? How do I keep attracting an audience without overdoing the personal touch?
I guess that’s the trick, isn’t it? Maybe one day I’ll discover the answer.
Much has been made of House Speaker Paul Ryan’s pledge that he will not accept the Republican presidential nomination if an open convention chooses him this July. We are expected to believe that this mere statement has definitively settled the issue and that there are no circumstances under which the Wisconsin Republican will be the party’s nominee.
Hogwash. Ryan’s statement settled nothing. In fact, his recent behavior — making a highly publicized speech and cutting a web video in which he went out of his way to be statesmanlike — indicates the opposite. These moves give every appearance of Ryan making himself available as an alternative. Even his protestations that he does not want the job are part of the silly dance expected of candidates.
History shows us that declarative statements are not binding. In 1944, President Franklin D. Roosevelt released an open letter in which he said that if he were a delegate to that year’s Democratic convention, he would vote to renominate his vice president, the ultra-liberal Henry Wallace. Meanwhile, behind the scenes, Roosevelt was actively working with Wallace’s detractors to ensure the nod would go to Senator Harry Truman, who succeeded Roosevelt as president when FDR died months later.
Politicians lie about their intentions all the time, so why should we automatically believe Ryan?
Oh, but we are told that this pledge is so ironclad that if he broke it, he’d be finished in politics.
Nonsense. There are all kinds of ways to wiggle out of a pledge. Imagine we’re going on the third or fourth ballot at a chaotic GOP convention, and Ryan says this:
“As I have repeatedly said, I did not want the nomination. But many leaders in our party who I respect greatly have told me that I am the only person who can unite our party and lead us to victory in the fall. I cannot in good conscience refuse this call, and so it is with great personal reluctance that I have decided to accept my party’s nomination for president of the United States.”
There. It almost sounds noble, doesn’t it?
Never take any politician at face value if he or she disclaims any interest whatsoever in being president. If the nomination is gift-wrapped and handed to Ryan on a platter, he’ll take it, just the same way he took the speakership he said he had absolutely no interest whatsoever in taking. Don’t be naive. Ryan’s past pledge meant nothing, so why is this one guaranteed to be for real?
I live in an apartment complex in a town of about 75,000 people, right across the street from San Francisco Bay. It’s a nice place to live, and the scenery is astonishingly beautiful.
We have two shared laundry units in the complex, and this morning, when I went down to move some laundry from the washers to the dryers, I saw that somebody had left a mess of powdered laundry detergent all over the floor.
My initial reaction was to get upset with whoever had been so irresponsible as to leave such a mess for someone else to clean up. Aren’t we all taught, at some point in our lives, that if you make a mess, you should clean it up yourself?
Then I turned my thoughts to more practical considerations. It was only 9:30, and the complex office doesn’t open until 11, so it wasn’t going to get cleaned up anytime soon. And sometimes, when I am moving the laundry from the washers to the dryers, I inadvertently drop an occasional item on the floor. I realized that the only way this situation was going to get any better, for me or for anyone using the laundry room for the next couple hours, was if I went back to my apartment, grabbed a broom and dustpan, and cleaned the mess up myself.
Was it fair, or right, that I had to clean up somebody else’s mess just to keep from having to deal with it myself? No. It wasn’t fair, and it wasn’t right. But I realized I could either complain about somebody else shirking his or her responsibility, and still have the powder all over the floor, or I could clean it up myself. Those were the only options.
And then I thought about this situation as a metaphor for community and country, and I thought about all the homeless people I see on the streets of San Francisco five days a week when I commute to and from the city. No doubt many of these people are just unlucky, and no doubt many of them have issues they can’t cope with. And certainly, there must be some among them who are just too lazy to take care of themselves. No doubt, there are those among them who were shipped here from Nevada, where state budget decisions have led to a phenomenon called “Greyhound Therapy.” No, this doesn’t mean giving the mentally ill kindly service animals for their benefit. It refers to putting mental patients on a bus and shipping them off to San Francisco, where some might find help, but others inevitably end up on the streets. In the latter two cases, we have examples of people refusing to clean up their own messes.
And many of us see these people and see “lazy, irresponsible drunks/drug addicts,” and gripe about how they need to take responsibility for themselves. Maybe there are some who could or should. But in the meantime, while we complain, they continue to be in the streets, and this is bad for everybody—both for them and for the rest of us. While we bitch and moan about the “takers,” we also abdicate responsibility for our communities.
It doesn’t have to be that way. We can make a better society, if we are willing to get past what’s “fair” or “right” and just see a problem and take steps to solve it.
And it doesn’t have to be partisan either. The state of Utah, dominated by the Republican Party for generations, has all but ended chronic homelessness by essentially giving housing to the chronically homeless, no questions asked. By so doing, the state has saved itself many of the myriad costs associated with homelessness.
Sometimes, the only way to improve your own life, your relationships, your community, your society, your country, is to recognize that being part of a community—part of being alive and connected to other human beings—means that sometimes you’re going to have to clean up other people’s messes. To do otherwise is to cut off your nose to spite your face.
So let’s all pick up that broom and get to work.
We know, of course, that the Republican majority in the United States Senate is not going to approve any candidate President Obama nominates to the Supreme Court. With the death of Antonin Scalia, the conservatives have lost their 5-4 majority on the court, and whoever is chosen to replace him will tip the balance. The Republicans would far rather take their chances on the coming election and wait it out in the hopes that they’ll be able to appoint another conservative in January 2017.
Of course, Twitter is abuzz today with all of the potential “blue state” Republicans and halfway reasonable GOP Senators who might be persuaded to join Democrats in approving a nominee, but this is a fantasy. These theories all leave out the fact that there will never be enough aisle-crossers to break a filibuster (which would require any nominee to get 14 Republican votes, not four), and the fact that Senate Majority Leader Mitch McConnell (R-Kentucky) does not even have to call a vote.
So clearly this isn’t going to happen. The next president and the next Senate will select Scalia’s replacement, period.
With this understanding, President Obama and the Democrats should be thinking about how to gain the maximum political benefit from Republican intransigence. And the way to do that is to nominate Senator Elizabeth Warren (D-Massachusetts) to fill the vacancy.
There is no question of Warren’s qualifications. The former Harvard Law professor has impeccable credentials, so Republicans could not claim she is unqualified. It would therefore become clear, if it wasn’t already, that they were blocking her for strictly political reasons, and this would diminish their standing with the few true swing voters.
But there are greater political benefits to be had. First, a Warren nomination would provide a jolt of energy to progressives who adore her, which could be crucial for base turnout in the upcoming election. Second, nominating a fourth woman to the court would reiterate that Democrats are the party of equality.
Both Hillary Clinton and Bernie Sanders could take this ball and run with it, hammering the Republicans for blocking an eminently qualified (progressive, female) nominee. Meanwhile, the president can also exploit this situation to hammer the Republicans every day.
There is no need to worry about who would replace Warren in the Senate because, as noted above, there is no chance in hell the Republicans will approve her (or anybody) between now and the next presidential inauguration. So if the Republicans want to play hardball, the Democrats have a great way to win the war by losing the battle.
The death of Justice Antonin Scalia has added a major new dimension to the 2016 elections, as what was previously theoretical is now an undisputed fact: the next president of the United States, and the Senate sworn in the first week of January 2017, will determine whether the Supreme Court will have a liberal or conservative majority. Scalia’s death leaves the court with four liberals and four conservatives, so the next justice will become the swing vote.
Of course, it must be immediately understood that the current Republican-controlled Senate will not approve any appointee that President Barack Obama nominates. With Republicans holding a 54-46 majority, the president would have to get four Republican Senators to support his nominee, with Vice President Joe Biden breaking the tie. While there may be a slight possibility of getting four Republicans, there is no chance whatsoever that the president would get the 14 Republican Senators he would need to break a filibuster. It probably won’t even come to that. It is doubtful that Senate Majority Leader Mitch McConnell (R-Kentucky) would even allow a nomination to come to the floor.
It is not difficult to predict how this issue will play out over the course of the election. Senators Ted Cruz (R-Texas) and Marco Rubio (R-Florida) will angle for votes by promising to filibuster any candidate the president nominates for the remainder of his term. They will also use this opening to undermine Republican presidential frontrunner Donald Trump by telling conservatives that they can’t trust Trump to appoint a “true conservative” to fill Scalia’s vacated seat. All the other Republican candidates will also promise to appoint a “strict, constitutional conservative,” but Cruz and Rubio, the only Senators in the field, will have the advantage here, and they’ll milk it for all it’s worth.
The Democratic presidential contenders will both stress to their bases the opportunity inherent in this situation to change the composition of the court away from its longtime conservative majority. Hillary Clinton will hammer home to the Democratic base the idea that she is more electable than Bernie Sanders and that it is crucial to nominate the candidate with the best chance to win the election, in order to ensure a liberal majority on the court. Sanders will cast this as an opportunity to bring about revolutionary change and may well float the idea of appointing Senator Elizabeth Warren (D-Massachusetts) to the court.
President Obama will likely hammer the Republican Senate at every opportunity between now and the election for refusing to act on his nominee or nominees and leaving a Supreme Court seat vacant for a year or more for political reasons. All candidates of both parties will stress the need for their party to control the Senate in 2017. With Senate control up for grabs this year, this will be a key point of emphasis.
This election just got ratcheted up to Defcon 1.
I stayed the night last night in an unfamiliar city in southern California, and needed to get some breakfast before embarking on an eight-hour drive home. As I knew nothing about the town, I didn’t know where I could find a good breakfast place, but there was a Denny’s right next to my hotel. It’s not the most spectacular food, but it is predictable and cheap, and when it comes to road food in an unfamiliar location, predictable-and-cheap is often preferable to the alternatives.
As I ordered—a shaved ham-and-egg sandwich with Swiss and American cheeses on sourdough and hashbrowns—the server asked if I’d like anything else, maybe a short stack of pancakes.
For a second, I was sorely tempted, and then it hit me that the question—which could be paraphrased as “Would you like 600 calories’ worth of pancakes, butter and syrup to go with the 1,100 calories of ham, egg, cheese, potatoes, bread and grease?”—was really a metaphor for the last 70 years of American politics: “You don’t have to choose one tasty meal or the other, hungry patron of mediocre breakfast food! You can have both!”
This is the same idea we Americans have been sold from the political menu since the end of World War II, and by and large, we have happily consumed it. “Guns OR butter? Who says you have to choose? You can have guns AND butter—and you can put that butter on that extra side of pancakes!”
In the heady postwar era, it really did seem like we Americans could have all we wanted and never have to concern ourselves with the consequences. While most of Europe and large swathes of Asia and Africa were devastated, America was the only major industrial power that was largely unscathed. America became the material colossus of the world, the leading supplier of goods. Factories worked in three shifts around the clock, and well-paid jobs were there to be had by anybody who grabbed a high school diploma one day and walked into the local factory the next.
Hindsight is always 20/20, and it is easy to look back and realize that it couldn’t last—that sooner or later the rest of the world would rebuild itself and compete with us, and that we would no longer be the only major seller of industrial and consumer goods, with inevitable consequences for the postwar U.S. economic boom. The economic crisis of the 1970s, with skyrocketing inflation and a growing dependence on cheap imports, made it plain to people such as President Jimmy Carter that the nation’s voracious consumerism needed to go on a diet. The late 1970s was an era of downsizing and increased efficiency—smaller, fuel-efficient cars, turning down the thermostat, installing solar panels. The gravy train was over in the political realm, too: we either had to raise taxes or cut services, or settle on some combination of the two.
However, Ronald Reagan—who ironically rose to political prominence by delivering a speech, on behalf of the doomed 1964 Goldwater presidential campaign, titled “A Time For Choosing”—came along in 1980 and essentially told us all that no, we did not have to make any hard choices. We could continue to have all the essential government services, and we could also institute massive tax cuts (largely benefiting the wealthiest Americans). He gave us a third option by which we could avoid both raising taxes and cutting services, and we eagerly took it: Put it all on the credit card. We’ll pay it off later. Reagan told us we could have both the tasty sandwich-and-hashbrowns meal, AND the pancakes, and we listened because it was what we wanted to hear.
The truth is that we do face hard choices as a society, and the longer we put those choices off, the higher the bill is going to be. In the same way that a person who makes no responsible choices at the table will eventually suffer health consequences, the longer we go on failing to choose as a society, the more it will hurt us in the end.
So our choice is this: do we continue to allow a small number of extremely wealthy people to hoard vast sums of money, more than many of them could spend in dozens of lifetimes, while everybody else lives either in poverty or under the realistic threat of it? And let’s be honest: that’s exactly the situation we face. How many Americans today live a life of quiet desperation, knowing that if they lose a job or suffer a major illness, they could go from financially comfortable to bankrupt, or even homeless, in a few short steps?
Do we continue to pay our bills as a country on a giant credit card, knowing that at some point, future generations will have to pay the bill? Hell, they’re already paying the bill. As colleges have raised tuition to meet growing costs, students have had to take on massive amounts of debt that were largely unheard of prior to the 1990s. They are then entering a job market in which there are very few jobs for them, in part because older workers, whose ability to draw secure pensions has been severely eroded, lack the financial security to retire. Those same older workers, faced with working into their 70s, are also faced with partially or fully supporting their adult children—as upwards of a third of all Americans between 18 and 31 now live with their parents. And their adult offspring, more and more, are faced with delaying their careers, thereby delaying their ability to save money, pay off their loans, buy homes, have children if they wish, and eventually retire. We are just now starting to see the consequences of our failure to choose.
Sooner or later, we Americans are going to have to understand that we have to choose the sandwich or the pancakes, and maybe eat only half of what we ultimately do choose. We have to unlearn the lie, which we have eagerly accepted, that we can have everything we want and that we never have to choose—that we can continue to spend trillions of dollars on a bloated military budget, and idiotic wars of choice like the war in Iraq, and that we can simultaneously continue to give tax breaks to bloated billionaires and corporations. Meanwhile our cities fill with the homeless and hungry; our bridges and roads crumble; we are losing an entire generation of Americans whose careers and paths to financial security appear hopelessly blocked; and our educational system continues to collapse, in part because of continued political meddling, and in part because teachers are so poorly paid and underappreciated that the most promising college students choose fields such as finance. We need to decide whether it is more important to fund the things that will benefit our society as a whole, or if we are going to continue to buy more of what we don’t need and can’t afford.
Even if you disagree, as I do, with the solutions Reagan advised us to choose in 1964, he was right about one thing: we do, in fact, face a time of choosing. But when he told us in 1980 that he had just been kidding, he was wrong.
My wife and I saw Divergent today, and while I thought it was an extremely entertaining, high-quality movie, I found one thing in particular to be deeply troubling when we discussed the movie at an early dinner afterward:
Why is it that the “Erudite” faction—the intellectuals—were the evil people who were trying to take over the society and violently wipe out the “rightful” rulers, the “Abnegation” faction?
My wife was also interested in this question, so she looked up the author of the book, and discovered that the writer, Veronica Roth, was reported to be a “moralistic, devout Christian.”
Admittedly, this aroused my suspicion, and I began to look at the movie the way that I know many evangelical Christians look at society, from my own personal experiences and the experiences of friends. When I began really thinking about the movie, I started noticing several notable parallels:
1) The main character, Tris, is deeply uncomfortable about sex and firmly rebuffs the sexual advances of her love interest, Four. The depth of her discomfort about sex becomes clear when we learn that it is one of her greatest fears. Evangelical Christianity, of course, has a near obsession with premarital chastity, and Tris, the movie’s heroine, behaves exactly the way the evangelical religious culture would expect her to behave.
2) The plot is set in a post-apocalyptic world, and as anyone who has observed the “end times” element of evangelical Christianity knows, a post-apocalyptic society is a major focus of evangelical Christian thought.
3) Even though we learn that Four’s father abused him as a child, when Four’s father asks his son to save his life, Four does it—exactly what would be expected in a culture in which “honor(ing) thy father and mother” is paramount, regardless of parental transgressions.
4) The Abnegation faction consists of “simple, modest folk” who dress plainly, eat simple foods and behave, at all times, in an unfailingly virtuous way. This mirrors some of the practices—and the conceits—of many evangelical sects that discourage dressing or behaving “immodestly” and decry being “worldly.” They also harbor and shelter outcasts who have run afoul of the intelligentsia; the parallel here is that evangelicals like to view themselves as outcasts from the “immoral” society around them. It seems obvious that the “Abnegation” faction is a stand-in for evangelical Christians.
5) “Erudite,” the intellectual faction, is presented as scheming, evil, and jealous of the prerogatives of the ruling faction, which the “Abnegation” faction has somehow managed to become. How these simple, unassuming, virtuous people became the rulers of the post-apocalyptic Chicagoland is hard to imagine—unless one takes the view, as evangelical Christians in America do, that “God’s people” are the rightful rulers of our “Christian nation.” It is also worth noting that, in Judeo-Christian texts, Satan is portrayed as scheming, evil and jealous of God, and that he uses his intellect to turn the godly toward sin—so it hardly seems a stretch to see a parallel in the movie between intellectuals and Satan himself. This linkage of intelligence and evil should be deeply troubling for those of us who value evidence and intellectual inquiry over blind faith and unquestioning obedience.
6) The Erudites, consumed by their elitist arrogance, decide that they, not the righteous Abnegation faction, should rule. They use their intellect to control and corrupt the “Dauntless” faction—the military—in order to round up and execute the Abnegation faction. This is crucially important, because it lines up, chapter-and-verse, with the tenets of the persecution complex that is rampant in American evangelical Christianity. It is taken as an article of faith among evangelical Christians that they inevitably will be persecuted for their faith. In the evangelical community, mistrust of intellectual “elitists”—who, in the evangelical view, cleverly twist and spin the truth, idolize science and other “worldly” things, deny the Bible, and generally seek to corrupt our “Christian nation”—runs very deep.
I have some personal experience that informs my knowledge of the evangelical persecution complex. When I was about 10 years old, two friends of mine from school recruited me to join an evangelical imitation of the Boy Scouts. In between hikes and campfires, we were constantly being told by our adult leaders that we, as Christians, might one day be tortured, even killed, for professing our faith. (Finally, one night we were taken into the sanctuary where the Wednesday night services were taking place, and we were told to raise our hands, close our eyes, and softly, rhythmically chant “Jesus, Jesus.” Even as a 12-year-old, I found this experience decidedly weird, and I never went back.)
In short, I think the parallels between the movie and some of the stranger obsessions of American evangelicals are frightfully clear. I can’t help but wonder whether the flurry of books and movies such as this one, as well as Noah, Son of God, the impending Exodus, the Twilight series, and some of the obviously Christian morality parables being filmed by artists such as Tyler Perry, isn’t indicative of a serious push by the evangelical community to reverse their declining hold on American culture and politics via the aggressive use of the arts.
But most of all, I’m deeply troubled by the equating of intelligent people with the forces of evil, and I think we have to be wary of letting this train of thought roll down the tracks unchecked. There has long been a tendency toward mistrust of intellectuals in American society—as noted by the Frenchman Alexis de Tocqueville in his seminal 19th-century work Democracy in America—and in the decades since Charles Darwin published his theory of evolution, this anti-intellectualism has been taken up in a full-throated way by fundamentalist Christians as well. I think it is extremely dangerous when popular culture begins to reinforce this longstanding loathing and mistrust of intellectuals.
I can almost smell the book barbecue now.