After suffering through the new Black-Eyed Peas song (which may in fact be the worst song ever written, although we will never know for certain because the council of learned men responsible for awarding such titles all shot themselves upon hearing it) and engaging in an it's-OK-I-want-to-die-anyway game of chicken with my friends over this Brokencyde video (it shows terrific musical growth over their earlier work), I have an important message for the world's shitty, talentless musicians. Gather all ye F-list rappers, Top 40 knock-offs, and aspiring TV talent show rejectees, for I have news. RE: AutoTune. Enough. Just fucking stop it. If the final stage of the decline of a civilization is grotesque self-parody, we are the Roman Empire circa 400 A.D. The use of AutoTune has long since leapt the chasm from novel to moderately cute to is-this-a-joke? Today's musicians are engaged in a battle, one which has long since spiraled out of control, to make the album or the song which will sound the most dated in five years and serve as the stock punchline for future generations looking to make fun of 2008.

No one in this world listens to the kind of crap that gets airplay on the radio and thinks, "Gee, I'd like this song better if Stephen Hawking sang it." Unless and until that thought crosses someone's mind, AutoTune is the answer to an unasked question. It was invented to fix tracks from talentless models who can't sing. Now its sole purpose is to take ass-rapingly awful music and somehow, in violation of the laws of thermodynamics, make it worse. It is bad enough that the Black-Eyed Peas are so creatively bankrupt that they've taken to writing event-specific "songs" (like the geniuses who realized that they could write a song called "Closing Time" and every bar on Earth would play it to the great delight of sodden, Abercrombie-shirted assholes eating street vendor burritos at 3 AM and power-barfing Jagermeister on a 24 year-old mother of four in an ill-fitting halter top). What purpose other than shepherding mankind closer to complete intellectual collapse is served by AutoTuning the inane vocals?

We know you can't sing. We know you have no talent. We know that highly-paid sycophants are hired to try heroically to make you sound decent, stretching to the limit the deceptive abilities of AutoTune, ProTools, and self-aware Cray supercomputers hidden safely away in the craters of extinct volcanoes. We know you have the integrity of a long-expired Chinese vending machine condom. We know that you would bang your own mother for a plug on TRL. We know you are not terribly bright. Despite these many handicaps you are still Making It in the music industry. Isn't that enough? Aren't you satisfied with being plain, ordinary, run-of-the-mill terrible? When I was your age, pop music horrorshows and one-hit wonders had some self-respect. Billy Ray Cyrus. 4 Non Blondes. Snow. Joan Osborne. Skee-Lo. Lou Bega. They sucked, and the whole world knew it. They didn't resort to cheap signal processors during post-production to try to make themselves sound better. You fool no one, and the effort insults our collective intelligence, or whatever remains thereof.

The next time you reach for the AutoTune controls, ask yourself "Why am I doing this?" The answer, most likely, is that you are a pretty but tone-deaf pile of crap who but for the grace of God and that A&R guy from Columbia you blew in the bathroom at Lit would be getting fired for poor performance from your night job cleaning the grease traps at a Long John Silver's. That is not sufficient cause to start doing robot vocals. We've established that it isn't clever anymore and it doesn't fool anyone into thinking you can sing. You have nothing to gain. You are already successful irrespective of your utter lack of ability. Is that not enough for you? Must you forever be looking for ways to twist the knife in the backs of people with a shred of decency and taste? Must you high-five one another between gulps of Cristal while double-teaming our souls, rejoicing in how lavishly your lack of taste is rewarded?

In closing, I urge you in the strongest possible terms to kill yourself.



Americans believe anything that feels like it should be true and, as a nation, know absolutely dick about economics. Thus bits of folk wisdom based on an Econ 101-level understanding of macroeconomics quickly rise to the status of gospel truth – half urban legend, half unadulterated bullshit. The Federal minimum wage is set to rise to $7.25 on Friday and, as we all "know," raising the minimum wage makes jobs disappear.

Whenever Americans appear to be at risk of forgetting this kind of wisdom, a corporate-backed front group emerges with a marketing campaign to ram home the point. The "Employment Policies Institute," which pitches itself as a think tank (while the media happily play along) but is actually a lobbyist- and business-funded anti-living wage group, fills that role today. Their point is simple. The money in our economy represents a zero-sum game; if wages must go up, the number of jobs sharing the finite pool of cash available for wages must decrease. Write that on the final exam and revel in your C-. Then get a job writing syndicated financial columns with your sophomoric logic.

Disregard the fact that half of the states already have the $7.25 minimum and there is not the slightest bit of evidence that any state or Federal minimum wage hike in the past led to job loss. Why, in the zero-sum game, must the variables be limited to the number of jobs and the minimum wage? If the number of employees and hours worked at a business are held constant, then the cost of employing a minimum wage workforce will increase. Why must that result in job loss? Why can it not result in a 10 cent price increase on your Whoppers Junior? Why can it not result in a seven-figure bonus for the higher-ups instead of an eight-figure one? Why can the cost not be recovered from other non-wage costs of operating the business?
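To put rough numbers on the Whopper scenario (the orders-per-hour figure is invented for illustration; the wage figures are the actual Federal minimums before and after the July 2009 hike):

```python
# Federal minimum wage: $6.55 before the July 2009 hike, $7.25 after
old_wage, new_wage = 6.55, 7.25
extra_per_hour = new_wage - old_wage        # added labor cost per worker-hour

# Invented illustration: one minimum-wage worker supports ~10 orders per hour
orders_per_hour = 10
extra_per_order = extra_per_hour / orders_per_hour
print(f"${extra_per_order:.2f} per order")  # prints $0.07 per order
```

Spread over output, the hike is pocket change per transaction, which is the whole point: the cost can land on prices, margins, or overhead rather than on headcount.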

None of those scenarios can be proposed because the purpose here is not, and never has been, to discuss the role of labor costs in the economics of operating a business. The purpose is to scare ignorant or desperate people into thinking that they will lose their jobs if they don't speak up to protect the boss's right to pay them like illegal immigrants. As the current marketing campaign to this effect is speaking out against a wage hike which has already been signed into law, I'm guessing that the relationship between wages and job supply is not the only aspect of economics that the Employment Policies Institute and its followers fail to grasp.


It has been a while. As long as I'm being boring this week, let's really stretch our legs and take the dog for a walk.

Attributing the characteristics of a group to its individual members lies at the root of a vast portion of the bad logic in this world (and certainly a majority of the bad social science). In the most basic sense, aggregation destroys data. And whether in life, politics, or academia, we often want but cannot have that data. We don't know what each member of an audience thinks of a performance; all we know is that the crowd applauded at the end. We want to know what kind of people voted for Obama, but we know only that 54% of voters did. We want to know if liberals or conservatives are experiencing more home mortgage foreclosures, but all we have are general foreclosure statistics.

A basic fallacy of aggregation infers the actions or decisions of an individual from those of a group. Mike was at the concert and the crowd cheered loudly, so Mike must have enjoyed it. White people like Arrested Development, so Ed likes it. These may be good guesses, but playing the odds is not the same as being logical. Knowing that Mitch McConnell is a Senator and that the Senate is about to confirm Sotomayor, would we conclude that McConnell voted to confirm her? Yeah, not really.

A second kind of fallacy is particularly prominent in sociological, political, and economic research because of the preponderance of pooled data (election results, jury verdicts, unemployment rates, the Gross National Product, etc.). The sociologist William S. Robinson termed it the "Ecological Fallacy" in 1950. An ecological fallacy infers a relationship among individuals from a correlation in aggregate data, without any evidence that the individual-level relationship actually exists. Here is a basic example.

In the 2008 Election, residents in a particular state were asked to vote on Proposition 1. In City A, Prop 1 got 5% Yes votes. In City B, the Yes votes were 40%. Fine. But we also note that 5% of the population in City A is Latino – as are 40% of the residents of City B. Ergo we conclude, seemingly quite logically, that Latinos voted for Prop 1 and non-Latinos didn't. The percentage of Yes votes and Latinos is equal in both cities. Lacking detailed data about who voted for what and why, commentators often make leaps of faith along these lines. Here's the problem. What if Prop 1 is, "Should public services throughout the state, including schooling, be conducted solely in English?" City A has few Latinos, the ethnic group most likely to want or need services offered in a different language. Because there are few people who are likely to be native Spanish speakers in City A, voters there neither think nor care much about English-only laws. But in City B, the large Latino population makes the issue highly contentious and polarizes non-Latino voters. So City B is 40% Latino, but the 40% voting for Prop 1 are the white people who feel threatened by the multilingual environment in which they live. The fact that the Yes votes and percentage of Latinos in each city are equal does not imply a direct correlation. The true relationship among the data, in this hypothetical, is entirely different than the numbers would suggest.
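The hypothetical above is easy to run as a sanity check. A minimal sketch (all vote counts are invented to match the story: in each city the Yes share equals the Latino share, yet not a single Latino votes Yes):

```python
# Invented vote counts matching the Prop 1 hypothetical in the text.
cities = {
    "A": {"latino": 500,  "other": 9500, "yes_latino": 0, "yes_other": 500},
    "B": {"latino": 4000, "other": 6000, "yes_latino": 0, "yes_other": 4000},
}

for name, c in cities.items():
    total = c["latino"] + c["other"]
    yes = c["yes_latino"] + c["yes_other"]
    print(f"City {name}: {c['latino'] / total:.0%} Latino, "
          f"{yes / total:.0%} Yes, "
          f"Latino Yes rate {c['yes_latino'] / c['latino']:.0%}")
```

The city-level numbers move in lockstep, which is exactly what tempts the commentator; the individual-level Yes rate among Latinos (zero in both cities) is what exposes the fallacy.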

The foreclosure maps which are popping up in newspapers and around the interwebs are just too tempting for many people. Note the county-level foreclosure rate, throw in some election data, and start drawing conclusions about partisan balance and imprudent lending. Were it that simple, I and most of my colleagues would be out of a job and contributing mightily to the foreclosure landscape. The data lost in aggregation are often of great interest, but no amount of rationalization can recreate them from the pale substitute of numerous data points smashed together in one big, indistinct pile.


Any skeptics about lingering racism and gender biases in our society would do well to pay some attention to the Sotomayor nomination and confirmation process (and lord knows there's little other reason to pay attention). Now, when those terms are mentioned people immediately think "But I don't hate Mexicans!" and "I don't oppose her because she has a vagina!" These statements are probably true. Her political opponents (discounting Tom Tancredo) may not hate Hispanics or women, but they are less than shy about concluding that she is not otherwise qualified for the office.

The argument starts from a basic premise: the President chose Sotomayor because she is a woman and she is Hispanic, not because she was the best candidate in a pool of potential nominees which, in theory and according to the letter of Art. III of the Constitution, includes every American citizen. It's the basic Identity Politics argument. This conclusion can stand alone, i.e. supported by the belief that white men are, by definition, the most qualified people and others can only be elevated as a form of tokenism or "affirmative action." It can also be based upon a form of logic, which is an accurate description inasmuch as "shit that is wrong and does not actually make sense" is a kind of logic. And it is. It is bad logic.

While the Constitution leaves the door wide open for Supreme Court nominations, in the context of modern politics I think we can agree that the pool of potential choices is limited to people with law degrees. We might go even further and state that the pool is limited to people with judicial experience. Either way, this creates a pool of applicants in which white men are the most numerous demographic. This leads people who are bad at thinking to conclude that the most qualified individual in the applicant pool is more likely to be a white male than a member of any other demographic group. In other words, if most judges are white men then the most qualified judge is most likely a white man. Or, to put it another way, if you have 20 socks in your drawer and 15 are white, a blind grab in the drawer is most likely to produce a white sock.

But at this point the logic is already irrevocably shot to shit. The preponderance of white males in the pool of potential Supreme Court nominees supports only the conclusion that a randomly selected member of the group is likely to be a white guy. Race and gender are two variables in the process and qualifications are the third. So to draw an accurate inference about tokenism or Identity Politics we would need to define the pool of qualified applicants and then observe its racial and gender composition. So let's treat Qualification as a dichotomous variable (i.e., either Yes or No).
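A toy simulation makes the flaw explicit (the pool size, demographic share, and qualification threshold are all invented). If qualification is handed out independently of race and gender, the majority group dominates the qualified pool too, but only in proportion to its numbers; the pool's composition licenses no inference about why any individual qualified nominee was chosen:

```python
import random

random.seed(1)

POOL_SIZE = 10_000     # hypothetical candidate pool
P_WHITE_MALE = 0.75    # invented share, like 15 white socks out of 20
THRESHOLD = 0.9        # invented cutoff for the dichotomous "Qualified"

# Each candidate: (is white male?, qualification score),
# with the score drawn independently of demographics.
pool = [(random.random() < P_WHITE_MALE, random.random())
        for _ in range(POOL_SIZE)]

qualified = [c for c in pool if c[1] >= THRESHOLD]

wm_share_pool = sum(is_wm for is_wm, _ in pool) / len(pool)
wm_share_qualified = sum(is_wm for is_wm, _ in qualified) / len(qualified)

print(f"white-male share of the whole pool: {wm_share_pool:.2f}")
print(f"white-male share of the qualified:  {wm_share_qualified:.2f}")
```

The two shares come out essentially identical, which is the sock-drawer observation restated: the demographics of the pool predict who a random grab produces, not whether any particular above-threshold pick was tokenism.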

What would make a judge unqualified? Blatant disregard for the law or inability to interpret it reasonably, either of which could be revealed by studying the percentage of said judge's decisions which are overturned by a higher court. If an appellate court is overturning half of someone's decisions, said judge probably has no idea what he or she is doing. Or he or she has an ideological ax to grind. Experience would also be a measure of quality (hence Clarence Thomas's 12 months on the bench led many to conclude that he was unqualified). But once we have defined this group of people who meet the minimum threshold to be called "qualified," then what? Any Federal judge or state Supreme Court justice who performs the duties of the job without problems is probably "qualified." Who, then, is "most qualified?"

The answer is that no one is, obviously. The top 30 or 40 candidates for the Supreme Court are, on paper, largely indistinguishable. They've all sat on the bench. They all went to big-name law schools. They could all be thrown on the Supreme Court and perform the required duties adequately. So the President's choice is – sit down for a moment, Mr. Beck – personal. That's why the Constitution stipulates that a person should make it. From a group of equally qualified candidates, the President is tasked to choose the person whose philosophy, temperament, and personality mesh with his. After all, this is the choice of a person who will be a major part of defining a President's legacy.

This process is not the damn BCS or the eHarmony personal compatibility test. The purpose is not to develop a computer program to weight characteristics and rank-order the candidates from most to least qualified. It's a process in which the Senate's role is to ensure that the President's choice reaches the threshold to be called Qualified. Beyond that the process is and, more importantly, is intended to be a subjective and personal one. The Most Qualified Candidate is a straw man and the quest to find him (and it's inevitably Him) is a search for last night's thunderstorm.


So the Most Trusted Man in America, legendary anchorman Walter Cronkite, died on Saturday. That moniker is not hyperbole. There actually was a survey done in which Cronkite was found to be the most trusted public figure. Anyone else get the sinking, nauseating feeling that a similar survey done today would pronounce Glenn Beck the winner? The idea raises two interesting questions. First, to what extent was the enormous level of trust and influence attributed to Cronkite a fact rather than an artifact of nostalgia? Second, is it even possible that any individual in the media could be so esteemed in the eyes of the public today?

Nostalgia is simply the interpretation of history through rose-colored glasses. Cronkite is a symbol of the pervasive American yearning for a simpler time, as we almost universally buy into the notion as a society that everything has been going downhill since 1960. We remember Cronkite and the idea that we could believe what our newsmen told us Back in the Day, which we subsequently contrast with today's media who are liars one and all. I suspect that we remember both Cronkite and his Simpler Time a little too sentimentally. Conservatives always mistrusted Cronkite, at least after 1968. In the Nixon years, years so influential to the modern right-wing movement, conservatives began insisting ever more urgently that if the mainstream media's reporting of reality failed to confirm their worldview then liberal bias was to blame. The cranky conservative who didn't trust the librul media was so widely recognized a type that legendary fictional alpha-reactionary Archie Bunker groused about "Pinko Cronkite" on All in the Family. So the idea that "everyone" or even any reasonable approximation of "everyone" trusted Cronkite is probably bunk. The only difference is that the conservatives who distrusted the mainstream media didn't have a media outlet of their own to rally around and make their numbers visible. The percentage of people who would follow braying jackasses like today's Talk Radio stars was, if anything, higher in Cronkite's day than today. They simply didn't have the option to do so.

On the second point, I find it inconceivable that any news outlet or news person could capture such widespread public trust. There is little doubt that among people who did not reject the mainstream media out of hand as an insidious liberal mouthpiece Cronkite was widely watched and respected. But the right has invested so much time and energy into getting its adherents to reflexively reject the media that I doubt they could declare trust in any newsperson who wasn't agreeing with the GOP at least 90% of the time. And on the left, the media's pathetic performance through Iran-Contra, two Iraq Wars, the 2000 Election, and the Clinton impeachment has led most to conclude that the mainstream media are either timid stenographers or motivated shills for corporate interests. So today's star anchors (Can people even name the major network anchors these days? And does anyone even watch the ABC/NBC/CBS nightly news anymore?) garner little to no respect among the vast majority of the population.

Media bias and social attitudes toward the media are rarely discussed in context. Compare a modern newspaper to one from 1910 and you will find that biases were far more egregious in the past than they are today. Furthermore, individuals have always rejected media which attempted to disconfirm their worldviews. The source of the Cronkite legend is the simple fact that the three major networks, which were once the alpha and omega of televised news programming, now compete against cable networks with the ability to coordinate, unify, and amplify the voices of the people who never trusted Pinko Cronkite.


The pharmaceutical industry, like the larger healthcare industry, goes out of its way to address problems which are profitable but of dubious urgency. Coincidentally, I'm sure, they seem a little more concerned about one gender than the other. We have dozens of male pattern baldness treatments and an unceasing parade of magic dick pills, yet the side effects of contraception aren't getting any less severe and some insurers still won't cover it. I've always hoped that we might even the playing field by having drugmakers devote less R&D to Rogaines and Viagras and more to life-threatening diseases and conditions regardless of who is affected by them. Instead, it looks like we're going to begin leveling the playing field by developing equally stupid drugs for women.

This brings me to the Latisse commercial you've probably seen as part of a summer marketing blitz. You know, the one starring Brooke Shields touting a drug which promises to bring an end to the global menace of insufficiently long eyelashes (hypotrichosis).

Yes, this is a real thing.

Now, I suppose that having no eyelashes at all would be a serious medical problem, but no amount of bullshitting can make the pitch used in this commercial sound legitimate. Yes, Latisse's active ingredient, bimatoprost, was developed as a glaucoma treatment many years ago, but someone still spent a lot of R&D money figuring out a cosmetic use which would let the manufacturer extend its patent.

It's good that big pharma has given all of the serious diseases such a vigorous thrashing that it can turn its attention to gender-based body hair insufficiencies on equal terms.


I am sitting in the lobby of a hotel in Nashville (having just had my face melted off front-and-center at the first The Jesus Lizard show in the U.S. in over a decade) "enjoying" one of those complimentary, inedible chain hotel breakfasts with many of my fellow hotel guests. On the television is today's installment of the Sotomayor confirmation hearing. Dozens of Nashvilleans and Nashville visitors are positively glued to the set, letting out muffled sounds of displeasure when the judge says something displeasing (which, as best I can tell, is often) and slightly less muffled cheers when a rhetorical superstar like John Cornyn or Jeff Sessions performs a soaring, backboard-shattering tomahawk dunk (which, as best I can tell, describes every word they say).

If CNN or the other networks declined to cover this, I would probably be critical. I'd let loose with some torrent of indignation about the awful media and the collective dumbing down of America. But honestly, if I've seen anything less interesting or less newsworthy being covered live on CNN I can't remember it right now. This makes the Michael Jackson memorial coverage look relevant in comparison.

These proceedings:

1. Provide less-than-no insight into the nominee's judicial philosophy, personal beliefs, or favorite New Kid. As our nation has been through many of these hearings in the last 10-15 years it is readily apparent that the answers given are rehearsed exercises in obfuscation and monuments to meandering vagueness. And to the extent that the nominees provide any direct answers, they bear absolutely no relation to future judicial behavior.

2. Have a predetermined outcome, hence this is little more than a cheap opportunity for the majority Senators to lob softball questions ("Judge Sotomayor, I've heard you described as fair-minded. Would you agree with that?") and the minority to show off for the combined audience of the 700 Club and the UFC.

The design of the Constitution in no way implies that the Supreme Court is the slightest bit accountable to the public. Neither the President who appoints the justices nor the Senators who confirm them were popularly elected under the original text of the Constitution. We don't need these hearings. If the Senators are interested in asking real questions and getting real answers, turn the goddamn cameras off and hold these hearings in closed session. The Senators would be much freer to say "Look, this is what we really want to know" while the nominees could provide answers that aren't the product of excessive coaching and stage fright.

Neither I nor the knuckleheads mouthbreathing around the waffle iron in this room have a relevant opinion about this or any other judicial nominee. If we want the Court subjected to public scrutiny and approval, Congress should grow some nuts and amend Article III. Barring that, what is the point of any of this? We're watching a woman give non-answers to grandstanding questions in a process of which the outcome was decided the morning after the 2008 election. The idea that the nominees owe this to the public or that the Senate is making the process more "democratic" with this spectacle reflects the cheapest, least informed kind of We the People demagoguery.


Sadly, one of my favorite things on the internet – the Paper Cuts blog, a database of newspaper closures and layoffs maintained by Erica Smith – has disappeared. Hopefully it is on hiatus, to return better than before. A little bit of poorly-formatted archival material is shown here.

Law schools long ago mastered the scheme of promising x students who can afford the tuition that they would get great jobs despite knowing full well that only x/2 could actually find decent employment upon graduation. It's a serious ethical dilemma on the administrative side of academia. We need the money so we take everyone who can pay. Then we let the students discover for themselves, several years and $100,000 later, that, well, those six-figure jobs we dangled in front of them are pretty rare.

Journalism schools are on the bandwagon now, and not because they're taking more students. The industry is simply disappearing. I saw an estimate (and lord help me, I can't find a link) that only 40% of current journalism students can feasibly be absorbed by the print media industry. Are they doing the ethical thing and reducing enrollments? No. That's hard to do when no one has a job and the number of applications skyrockets. Just keep taking them, take their money, let reality steamroll them in a few years, and then rely on hacky right-wing moralization to absolve academia of blame ("It's the students' own fault if they're not good enough to find a job. We tried.").

The really sad part is that many of the jobs that are available only loosely resemble what we'd call "journalism." Rather than becoming news reporters, many of these students are going to end up in lifestyle publishing (magazines of the Modern Bride and Stuff variety), re-writing press releases and wire stories for small papers, or freelancing. An acquaintance of mine graduated from Northwestern's journalism school, one of the top five, and was the envy of many colleagues for landing a real, well-paid job…as the "Gadgets" Editor for a well-known national magazine, a genre of journalism which amounts to badly disguised advertising. Many a quality journalist from Columbia or Northwestern is doomed to sit in an office writing about great software for printing one's own wedding invitations, while many more will be unemployed and forced to compete with other desperate people in a race to see who has less integrity. "I have one job here and there are four of you. Whoever writes the most enthusiastic feature about the new Scion tC gets the job. You have an hour."

It might feel a little less dirty if journalism schools took a fully informed, buyer-beware attitude about the state of the industry. But to be honest, would that stop any of the current wave of applicants who are unemployed and lack better options for the future? As Daniel Clowes noted in his hilarious Art School Confidential comic strip (please disregard the horrendous film based on it), telling a room full of college kids "Out of you 50, only one will land a job in this field" simply makes all of them think "I'll be the one!"


I love watching old clips of news and talk shows from the early days of television. They lay out in high contrast the evidence of just how much we've changed as a nation. In my opinion, the most consistently entertaining of the early TV pioneers is the eponymous star of The Mike Wallace Interview. He interviewed people like Maria Callas, Frank Lloyd Wright, Salvador Dali (an amazing clip – see note below), Aldous Huxley, Erich Fromm, and Ayn Rand. Today we have 60 Minutes episodes about Tom Brady. If the fact that they don't talk to anyone interesting anymore isn't sad enough, the change in the level of discourse is flat-out depressing. Watch this clip of Wallace's Rand interview.

(Side note: Straight from the horse's mouth, Rand's philosophy sounds every bit as dumb as it sounds coming from her followers. Amazing. You'd think it would sound slightly less retarded.)

Note the depth of the discussion they're having. Neither is dumbing it down because they think that home viewers are too stupid to follow it. And Wallace's shows were popular. People watched this.

This recalls an anecdote I like to use when talking about the public's capacity to follow politics. We've all heard about the great Lincoln-Douglas debates, right? During the 1858 Illinois Senate race (not, as is commonly assumed, the 1860 Presidential race) the two men staged seven debates around the state of Illinois. The format was three hours long – 90 minutes per candidate, plus opening remarks from other speakers. They attracted crowds in excess of 20,000. Now think about that for a minute. People travelled long distances to sit outside in August heat listening to candidates engage in a debate that lasted well over three hours. Today's debates are built from 90-second sound bites, and even that is unable to capture the attention of many Americans.

Why did people turn out in droves for the Lincoln-Douglas debates? Why did Mike Wallace's slow, methodical interviews with people like Erich Fromm attract big audiences? Education can't be the answer. Many of the people at the Lincoln-Douglas debates were barely literate if at all. High school graduation rates and college attendance are higher today than ever. We're smarter, on paper, than all of our American forefathers. No, they weren't smarter than us. They paid attention because they were forced to.

When there were three TV networks, people who wanted to relax in front of the tube after work had to watch what was on. If that was the evening news or a news talk show, then that's what you watched. In 1858, people were starved for both information and entertainment, hence the allure of a big spectacle like the Lincoln-Douglas debates. Let's not fool ourselves – if Mike Wallace's or Stephen Douglas's audiences had the opportunity to watch Survivor or the Food Network, many of them would have done so. But they didn't. So they watched something that was good for them, and people like Wallace didn't need to sex up their formats to compete for viewers with entertainment programming.

This is, in my opinion, the single greatest example of market failure in American history. The 'democratization' of the airwaves and the proliferation of media outlets have made it so that no one needs to watch Mike Wallace talk to Frank Lloyd Wright anymore. Even though we are much smarter we sound dumber because we are never forced to listen to two intelligent adults talk about something interesting for an hour uninterrupted. No one makes us take our castor oil. We have been given limitless choice and we use it to avoid thinking, which is hard, at all costs. Nine hundred channels of satellite TV are the ultimate enabler. We know what we should do (eat carrots, read books, and watch Jim Lehrer) but we're bombarded with the opportunity to do what we want to do (eat Doritos, read nothing, and watch VH1 I Love the 80s!). Television didn't ruin us, but the changes in its content and format may have.

(Dali note: In one of his late-career retrospectives, Wallace called the Dali interview his favorite, noting that at the end of the interview he concluded that Dali "walked among humanity but was not one of us.")


Naomi Klein wrote a well-received book recently, The Shock Doctrine, about how neocons use large-scale disasters to ram through economic policies which wouldn't have a chance in hell of making it through the democratic process. It's simple, whether in 2003 Iraq or post-Katrina New Orleans: wait until the public is "shocked and awed" to the point of complete social collapse and then while people are scrambling around for food, clean water, or the right to avoid being executed by Sunni death squads, install an AEI wet dream of a government and begin auctioning the entire area off to the highest bidder. That last part is optional, by the way. You'll notice that this process is little different from how cattle are slaughtered. A blow to the head. Unconsciousness. Strung up and dissected. Sell every usable part.

On a less grand scale, Christian Parenti has written a number of essays (and a pretty good book, Lockdown America) about Prison Economics. There are some similarities to Klein's theory, but instead of a single disaster the vultures simply take advantage of communities while they're weak and desperate. Take Crescent City, California, for example. The state waited until the logging industry collapsed and the town was nearly vacant before offering the dying town salvation in the form of 4,000 ultra-violent (not to mention predominantly black and Hispanic) Supermax prisoners at the shiny new Pelican Bay State Prison. Of course no community would want a Supermax prison any more than it would want a toxic waste dump or a massive landfill. But prisons, landfills, and waste dumps all have one thing in common: they end up in communities that are too broke to say No.

I daresay the remaining players in the auto industry who still have a pulse are taking advantage of a similar dynamic these days.

Perhaps you've seen the tale of West Point, Georgia, a typical rural craphole situated on the GA-AL border, which has been sent its salvation (quite literally) in the form of a new factory from KIA Motors, a subsidiary of the Korean Hyundai Corporation. With 20% of its rapidly-shrinking population under the Federal poverty line and more than 1/3rd receiving some form of government assistance, I can imagine why the folks of West Point are so thrilled to have KIA set up shop. And we must admit that an auto plant is a (relatively) pleasant economic savior compared to nuclear waste dumps and landfills. All things considered, the factory will be a good thing for the town. So why is it that this story feels so depressing?

West Point, the county, and the state laid out the usual smorgasbord of government cash to lure KIA or any other major employer to town. Has it really come to that in the United States? That in order to retain a manufacturing sector or provide any employment whatsoever for people outside of cities we have to bend over to the tune of $400,000,000 in free land, free amenities, tax breaks, and up-front cash payments? The factory and other employment sources which are expected to accompany it will provide something on the order of 7,500 jobs. That's probably an optimistic estimate, but let's go with it. Various governments with jurisdiction over West Point, Georgia just paid a foreign company $53,300 per job. That, apparently, is where we're at as a society.

If you don't find anything sad about that, try this video. The mayor of nearby Connersville, Indiana speaks for a promotional video the town is using to woo an automaker – and it's not even a real one. It's some startup operation called Carbon Motors which will probably fold before making a single vehicle. The town has everything Carbon could want: desperation, terrible location, and an abandoned Visteon (Ford) auto parts factory. I managed to watch the video up to the point at which he began stridently reassuring his potential feudal lords that the town DOES in fact have a Red Lobster, albeit 3 miles away.

Local governments, Chambers of Commerce, and the like have always laid out a red carpet to attract potential large employers, but this level of groveling and begging is a recent development. Rural towns have become the equivalent of a homeless person waving a sign reading "Will do anything for food" at passing cars. We could even leave aside the fact that it's disgusting because it sets communities against one another in an unwinnable race to the bottom (who can offer more tax breaks, who can most convincingly promise to keep the UAW out, who can grab their ankles the fastest and most enthusiastically). Even in a vacuum these stories are simply pathetic. Watching people in what claims to be the world's economic powerhouse fight to prostrate themselves before an employer and surrender to it completely – before it even gives them a job, no less – is enough to make Eugene Debs rise from his grave, which is not too far from Connersville, so he can die again of shame.