Posted in No Politics Friday on May 5th, 2016 by Ed

My commute to and from work is very long, but so far I've had a great deal of luck avoiding the kind of unforeseen events that make it even longer. If I can do the drive in three hours or a little less, that counts as Normal. On Thursday, in a series of events that would be considered comical had I not known for a fact that people died, the drive took ten minutes short of six hours. Six hours sitting in a car, mostly riding the brake, is enough to ruin anyone's day. Five hours into that you will find yourself quietly envying the dead.

First a major interstate was shut down and everyone forced off it (to continue an agonizing northbound crawl along the tiny rural roads of central Illinois). This easily set me back 90 minutes. Another major accident that necessitated landing a helicopter on the highway to remove victims (presumably) cost another hour. When yet another accident promised to add time to my stint on I-90, I exited to navigate my way home on Chicago side streets…only to find the major non-highway east-west road closed for maintenance. We were re-routed through, among much else, a cemetery.

At this point I began to wonder if it might not be best to stop, give the car keys to the first pedestrian in sight, and start a brand new life wherever I found myself.

Accordingly, the vigor to write a proper NPF is missing. If you're in the mood for some environmental realism, check out these sad-funny pieces on Norilsk, Russia and Baotou, China, two cities dependent for their economic existence on the smelting of extremely toxic heavy metals. If anyone lives to 50 in those places, he or she should be whisked away and studied to learn the secret of such immortality. Baotou is the source of 90% of the rare earth elements upon which modern electronics rely, although, interestingly, rare earth elements are not so named because they are scarce. Most aren't.

It's not your typical NPF, but do you notice in the pics from those two cities there isn't a single living member of the plant kingdom? Not a tree, shrub, or blade of grass. Yeah. That's kind of jarring.


Posted in No Politics Friday on April 15th, 2016 by Ed

I really like cars. Sorry if this makes me Dumb and destroys your perception of me. As a regular consumer of content about the car types of which I am especially fond, and of more general Car Guy Stuff like Regular Car Reviews, Autoblog, and Jalopnik, I am well aware that no one is allowed to be a Car Bro without having a borderline obsession with manual transmissions. Shift gate tattooed on your forearm or GTFO, brah! Three pedals or it doesn't even count as a car, brah! Automatics are so gay, brah! Despite the cogency and persuasiveness of such arguments, stick shifts are fast disappearing in the United States. They now account for almost no new truck sales and something like 1% of new car sales. More tellingly, they are no longer available on many performance models aimed explicitly at Car Enthusiasts.

There are practical reasons for the decline. Most drivers see cars simply as appliances and they want whatever is most convenient and whatever makes driving easiest. Americans also sit in a lot of stop-and-go traffic, which is the environment in which driving stick is most annoying. But I think that the biggest problem – Unpopular Message Board Car Bro Opinion warning – is that modern no-clutch-pedal transmissions are just so goddamn good.

Until the last 10-15 years, automatic transmissions suffered from two drawbacks. One was poorer fuel economy; prior to 2000, most cars gave up two or three mpg in their automatic versions compared to the manual ones. Most automatics were 4-speeds, which made it difficult to gear for fuel economy without sacrificing performance. And on that note, the second drawback was performance. They were slower, the rudimentary transmissions basically had three gears plus an overdrive, and most cars aimed at the mass car-buying public didn't have the horsepower to pull them effectively. They didn't shift particularly crisply either. GM's ubiquitous 4L60-E, which I experienced in numerous vehicles, shifted as though it were filled with pudding. I remain unconvinced that it wasn't.

So for years manual transmissions had two big bragging points: better mpg and better acceleration. Combine those with a lower price – an automatic tends to be a $1,000 option even today – and you had an airtight argument. The problem is that now 6-plus-speed automatics and dual-clutch (DCT) gearboxes get better fuel economy, provide quicker acceleration, and shift faster. The only remaining argument is one of style.

I had a BMW that I truly loved driving, and it had a DCT/automatic. Like most DCTs, it had "paddle shifters" on the steering wheel for manual shifting. When I sold the car, the young man who bought it asked about the paddle functions. I told him that they worked just fine but that I had determined fairly quickly that I could not shift better than the software controlling them. And that's the thing: nobody can. It might make you feel better to shift your own gears, but the days of manual shifting outperforming sluggish 80s-style automatics (the true "slushbox" automatics that are no longer used) are gone. Long gone. Performance cars like Corvettes, Porsches, BMWs, and Italian exotics now have dual-clutch (Porsche's PDK) or automated manual (BMW's SMG) gearboxes that can execute shifts in milliseconds. Literally milliseconds. They are designed and programmed to outperform the human clutch foot and right arm (left in the UK and Japan, I guess), and they do exactly that.

Getting a manual transmission on a new 2016 vehicle strikes me more as a badge one wears to establish Car Guy cred than something that makes sense. Manuals no longer outperform their self-shifting counterparts in any area. The historical advantage they had in fuel economy is gone along with any performance advantages. If you think manual transmissions are more fun, by all means go for it. Do what you enjoy. But they are in no way empirically "better," and in fact by any performance or economy measure they are now worse than modern self-shifting units. Manual gearboxes are now to cars what Amtrak is to long distance travel. You can take Amtrak from Chicago to LA if you like being on trains, but in practical terms it makes no sense at all. Flights are cheaper and infinitely faster. Your choice, then, is one based solely on personal preference at the expense of logic, which is your prerogative. The attitude of superiority is pretty tiresome, though, especially when attached to a technology that is demonstrably inferior now.

NPF: ZAHA HADID, 1950-2016

Posted in No Politics Friday on April 2nd, 2016 by Ed

Architects are not household names, especially not living ones. The average reader of Sunday newspapers can probably name Frank Gehry or recognize his derivative blobitecture by sight, but otherwise it's difficult to think of a living architect who might be recognizable to a non-enthusiast or professional in the field. The most important, decorated, and accomplished living one died on Thursday, and her death was no more than a Page 3 level headline.

Zaha Hadid was born in Iraq in 1950 to a wealthy family, which allowed her to move to London in 1972 to study architecture under, among others, the Dutch giant Rem Koolhaas (who, like her, would win the Pritzker Prize, the Nobel of architecture). Today both are recognized as founders of the first identifiable successor to modernist and postmodern architecture: a heavily geometric yet smooth style that defies its mathematical origins by blending into its surroundings. It is a shame that neither figure is better known, but it is not uncommon in architecture for time to be a crucial ingredient in the growth of a reputation.

Heydar Aliyev Cultural Center – Baku, Azerbaijan

In a professional world in which few women become prominent, Hadid won the Pritzker in 2004 (the only solo female recipient to date), two Stirling Prizes for individual works (Rome's MAXXI art museum – get it? XXI? – and London's Evelyn Grace Academy), and the Royal Institute of British Architects Gold Medal, of which she is also the only solo female recipient.

London Aquatic Centre for 2012 Olympics

The word "visionary" should not be tossed around lightly. Hadid was one. Her architecture of multiple perspectives – buildings that present dramatically different impressions depending on the point from which one views them – is now a commonly imitated aspect of contemporary architecture and even interior design. The BMW Central Building in Leipzig and the aforementioned Evelyn Grace Academy are probably the most representative examples of this, as well as excellent examples of how geometric designs can be made to blend naturally with the landscape. Any architect can make a geometric design that stands out like a jagged shard from a flat landscape. Making it look natural takes restraint and an eye for aesthetics that few have or ever will.

Broad Art Museum – East Lansing, MI

It's sad to think someone so important could depart without attracting more attention. Maybe it is the lack of major projects in the United States. Maybe it is the absence of a loud, garish "Hey look at me" style to her work. While the name might not be familiar to you, she did as much as or more than anyone to shape the way the world around you looks today and the aesthetics of urbanism in the foreseeable future. Her influence will outlive her.


Posted in No Politics Friday on March 24th, 2016 by Ed

Here is an anecdote from Vincent Cannato's American Passage of such great interest that I don't think my words can do it justice:

Frank Woodhull’s experience at Ellis Island began in 1908 when he returned from a vacation to England. The Canadian-born Woodhull, who was not a naturalized American citizen, was heading back to New Orleans where he lived. As he walked single file with his fellow passengers past Ellis Island doctors, he was pulled aside for further inspection. The fifty-year-old was of slight build with a sallow complexion. He wore a black suit and vest, with a black hat pulled down low over his eyes and covering his short-cropped hair. His appearance convinced the doctors to test Woodhull for tuberculosis.

Woodhull was taken to a detention ward for further examination. When a doctor asked him to take his clothes off, Woodhull begged off and asked not to be examined. “I might as well tell you all,” he said. “I am a woman and have traveled in male attire for fifteen years.” Her real name was Mary Johnson. She told her life story to officials, about how a young woman alone in the world tried to make a living, but her manly appearance, deep voice, and slight mustache over her thinly pursed lips made life difficult for her. It had been a hard life, so at age thirty-five Johnson bought men’s clothing and started a new life as Frank Woodhull, working various jobs throughout the country, earning a decent living, and living an independent life. Mary Johnson’s true sexual identity was a secret for fifteen years until Frank Woodhull arrived at Ellis Island.

Johnson requested to be examined by a female matron, who soon found nothing physically wrong with the patient. She had enough money to avoid being classified as likely to become a public charge, was intelligent and in good health, and was considered by officials, in the words of one newspaper, “a thoroughly moral person.” Ellis Island seemed impressed with Johnson, despite her unusual life story. Nevertheless, the case was odd enough to warrant keeping Johnson overnight while officials decided what to do. Not knowing whether to put Johnson with male detainees or female detainees, officials eventually placed her in a private room in one of the island’s hospital buildings.

“Mustached, She Plays Man,” said the headline in the New York Sun. Despite her situation, officials deemed Johnson a desirable immigrant and allowed her to enter the country and, in the words of the Times, “go out in the world and earn her living in trousers.” There was nothing in the immigration law that excluded a female immigrant for wearing men’s clothing, although one can imagine that if the situation had been reversed and a man entered wearing women’s clothing, the outcome might have been different.

Before she left for New Orleans, Johnson spoke to reporters. “Women have a hard time in this world,” she said, complaining that women cared too much about clothes and were merely “walking advertisements for the milliner, the dry goods shops, the jewelers, and other shops.” Women, Johnson said, were “slaves to whim and fashion.” Rather than being hemmed in by these constraints, she preferred “to live a life of independence and freedom.” And with that Frank Woodhull left Ellis Island to resume life as a man.

That's a pretty powerful statement of how limited the prospects in life were for women in the 19th Century. Not much has been written about Frank Woodhull, but you can find the archived original news stories with a simple Google search.


Posted in No Politics Friday on February 26th, 2016 by Ed

The popularity of and reverence for Christopher Columbus in the United States is more than a little puzzling. That an Italian mariner of bewilderingly little talent, sailing for the Spanish crown, who landed in the Bahamas, Cuba, and Hispaniola (today's Haiti / D.R.) became an American icon makes sense only in the context of American culture and society immediately after independence from Great Britain.

As new nations always do, the US experienced a fit of nationalism and a backlash against all things reminiscent of the now-hated British. For the first two centuries on the North American continent, we had played the role of loyal subjects to the Crown and accordingly most pre-Revolutionary places and institutions bear distinctly British (or less commonly other European nations') overtones. We have states, for example, named after William Penn, Queen Elizabeth (The Virgin Queen, hence Virginia), the British Channel Island of Jersey, King George II, the Duke of York, and the 3rd Baron De La Warr, not to mention two Carolinas named after Charles I and countless cities named after the likes of Lord Baltimore. Very suddenly in 1776 it became highly unfashionable, bordering on socially unacceptable, to show pride in any name that served as a reminder of the humiliating subjugation to a faraway island in Europe, and henceforth only good American names would suffice.

The problem was that America didn't really have any American history distinct from that of Britain upon which it could draw. There were the Native Americans – and indeed many Native American place names dot the map east of Ohio – and some admiration for the French who had aided us during the Revolution (Lafayette and King Louis both became popular naming inspirations). There were also living American icons, primarily Washington, but one could only name so many things after a man who was still living and around whom many feared the development of a cult of personality. So that is how Americans at the time seized upon an Italian mariner of bewilderingly little talent, sailing for the Spanish crown, as a national icon. Columbus, who had very rightly been completely ignored and forgotten in this country up to the American Revolution, suddenly became America's founding saint for the sole reason that he was untainted by any association with England.

First there was the transmutation and anglicizing of his Spanish name, Cristobal Colon, to something he was never actually called during his lifetime (the British also did this to Giovanni Caboto, "John Cabot"). Then the fact that he did not land in any part of what was actually the United States was swept under the rug, deeming it sufficient that he had landed in "The Americas." Finally, that Columbus was a poor mariner who insisted to his death that he had landed in Asia was deemed historically irrelevant. And then we started naming things after him with a vengeance.

From 1776 until about 1800, nearly everything of consequence requiring a name in the new United States was dedicated to Columbus. The classically British-sounding King's College in New York became Columbia University (1784). The region of the American Northwest explored beginning in 1793 was called Columbia (which lives on today in the part that Britain retained by treaty: British Columbia). The Ohio Territory had its future capital, Columbus, established in 1797. South Carolina made Columbia its new capital in 1786. The District of Columbia was chosen as the site of the future national capital with the Residence Act (1790). A march written for George Washington's 1789 inauguration became "Hail Columbia," America's de facto national anthem until "The Star-Spangled Banner" was officially adopted in 1931. Counties, towns, bodies of water, and other geographic features almost too numerous to count were similarly given some version of Señor Colón's non-name, as were scores of Columbian societies and institutions, one of which eventually became the Smithsonian Institution (that story will have to wait for another day).

Half of success in life is showing up. And that's how Columbus became indelibly stamped into the fabric of the United States: by not being British at a time in which lots of new things needed names.


Posted in No Politics Friday on February 12th, 2016 by Ed

The electric light bulb was invented much earlier than most people realize – that is, if you don't mind a light bulb that burns out in two or three hours. There's a reason most sources qualify Thomas Edison's achievement, describing him as the inventor of the first practical, long-lasting light bulb. The idea was more than a half-century old by the time Edison and Joseph Swan (the Briton who invented the carbon filament bulb nearly simultaneously with Edison) made commercially viable designs. As is often the case, the invention everyone remembers only made the leap from theoretical possibility to practical reality because of a much less glamorous invention (and inventor) nobody remembers. You can stop reading at this point if you've heard of Hermann Sprengel.

As early as 1800, scientists working with electricity demonstrated all of the principles necessary to create electric light. Humphry Davy used a platinum filament and a huge amount of current in 1802 to generate a feeble light – not much, but considering that Edison and Swan didn't patent their bulbs until 1879, it demonstrates just how old the idea was. In fact, by the 1840s there were any number of patents for incandescent bulbs of varying designs and materials. Of particular interest is the mysterious American John Starr, who patented a bulb in 1845, died immediately of cholera, and disappeared from the historical record. Almost nothing is known about him, and only the diligent archiving of the US Patent Office gives us any evidence that he existed. His design was never exploited commercially.

Part of the problem in developing electric light was obvious – the "electric" part was lacking. Any home that wished to make use of electric appliances prior to 1880 had to build its own electric generator on-site. This is where Edison succeeded, and really truly succeeded, in a way neither Swan nor anyone else did. He didn't just invent a bulb; he convinced Gilded Age New York industrialists to build a power grid across the city. Many people had designs for bulbs but only Edison had a design for how to get light bulbs from patent drawings and laboratories into houses and places of business.

The two major obstacles to the design of the bulb itself, independent of electricity, were the material of the filament and the ability to remove air from the bulb (to extend the life of said filament). Every manner of material on Earth was tried and rejected as a suitable filament once the basic principles and components of incandescent light were understood. Eventually a thin piece of carbon – charred wood shavings, bamboo, or even paper – through which current could flow in a vacuum was identified as the answer (metal filaments of tungsten and tantalum were invented just after 1900 and put the German giant Siemens on the industrial map). But that was all well and good except that nobody could achieve a suitable vacuum during bulb manufacture. Enter Hermann Sprengel.

The German engineer developed a device, universally called the Sprengel Pump, that enabled air pressure to be reduced to less than one-millionth of its atmospheric level. While not a perfect vacuum, it was more than close enough to enable the light bulb to make the leap from idea to mass-produced reality. Incidentally, the Sprengel Pump relied upon a process involving considerable amounts of mercury, and many early Edison researchers and workers lost teeth, sanity, or central nervous system function in the service of achieving that elusive vacuum. It is not unfounded to wonder if Edison himself had a touch of the Mad Hatter syndrome, as he worked with the device intimately and was known to be, you know, a dick. But that is merely speculation.

So without Hermann Sprengel there is no light bulb, and without the light bulb there is no Thomas Edison as he is known and revered today. And don't mention Edison around the British. They're still a bit sensitive about Joseph Swan getting the historical shaft.


Posted in No Politics Friday on February 5th, 2016 by Ed

The Internet has been a part of our culture long enough, and is sufficiently well understood, that I no longer feel any sympathy for people who violate its most basic rules or social dynamics and end up being embarrassed. It is 2016. Everyone knows that if you let people vote for something online, especially in an open-ended fashion, people on the internet will quickly identify the most ridiculous possible outcome and swarm to it. The NHL just learned this the hard way when its online voting for the All-Star Game resulted in a 6'9" oaf-pugilist-Ent named John Scott – a man who has played on every team in organized hockey for about five minutes while possessing no skill other than throwing punches at small Canadian men – being voted captain of an All-Star squad. This embarrassed the league inasmuch as Scott 1) is terrible, 2) has been cut three times already this season alone, and 3) was not even in the league at the time of the game, having been demoted to his natural environment in the minors.

The same scenario unfolds every time this happens. The entity in question refuses to honor the online vote, which inevitably makes them look even worse. Waves of opprobrium and indignation crash over them until finally they relent, embarrassing themselves yet again by backtracking and attempting to insist that, no, really, we get the joke ha ha ha. (Hockey enthusiasts with a sense of humor shamed the league into letting Scott into the game, in which he played to the great glee of everyone involved and, to rub it in, fans voted him All-Star MVP as a write-in.)

My favorite example of this phenomenon happened a few years ago when the City of Austin allowed internet voting to choose the name of its new solid waste facility. Either they were shockingly naive or they were in the mood for comedy, because when you ask the internet to name a building where human feces fills giant tanks and is chemically processed, you are just begging for trouble. The winning entry – the Fred Durst Center for the Performing Arts, if I recall – was ultimately vetoed by the city management. Boooooooo.

What are some other good examples?


Posted in No Politics Friday on January 29th, 2016 by Ed

Anyone who has taken an English class at the high school level can probably respond with "Moby Dick!" upon hearing Herman Melville's name. Fans or English Lit major types can go further and tell you about Benito Cereno, Billy Budd, and Bartleby. Melville fans will also tell you that Moby Dick received no attention during the author's lifetime save for a few viciously negative reviews, and that it was not until 1920 that the literary world rediscovered it and decided it was great. Rare, though, is the person who can name the books Melville wrote that actually were successful and popular.

Today nobody in their right mind would read Omoo or Typee, and in fact you'd have a hard time finding someone who has heard of either. They were Melville's Hits, both in the once terribly popular "high seas adventure" genre. As the titles imply, both tales were set in the South Pacific (and were based on Melville's own experiences traveling there). These books are not good. "Dated" doesn't begin to describe how irrelevant this kind of writing feels today. In their time, though, stories about adventures in foreign and exotic lands were popular, given that most readers in the 1840s were unlikely to see much if any of the world during their lives. Today there's nothing mysterious or exotic about the South Pacific, because at a moment's notice you can watch videos, see pictures, or get on a surprisingly affordable (though certainly not cheap) flight to see it for yourself. Traveling around the world doesn't impress us anymore, and it takes a lot more to titillate the imaginations of modern Americans than some "natives" speaking pidgin English in an island setting. We have movies about robots punching monsters, for christ's sake.

As a kid I was (OK, I still am) fascinated by maps and globes. I'd stare at them for hours sometimes, looking at different places with strange names and wondering if I would ever see them at some point in my life. And I'm not going to lie: well into adulthood I maintained the illusion of the Pacific islands as idyllic paradises. On more than a handful of bad days I imagined myself running away to a tiny island and living on the tropical beaches. The reality is not hard to uncover, and it isn't pleasant. Most of the Pacific islands are floating slums. They're tiny, packed with people, and largely devoid of economic activity. Oh, and the planet is going to swallow most of them soon due to rising sea levels. The Times ran a piece in December – one I've read probably a dozen times – about the Marshall Islands, a former US possession that is now a not-really-but-kinda-still US possession. Look at the videos and photos with that story. That place sucks. I don't want to dwell right now on the myriad reasons the Pacific is full of slums (hint: it's basically our fault), but it's difficult to think of a better term to describe what has become of these places. Suffice it to say that the fantasy is better than the reality.

The more of the world you see, the less magical any of it seems. We can't expect that other parts of the world will be frozen in time for our enjoyment and appreciation as rich Westerners, but it strikes me as particularly sad that we've exported only the absolute worst parts of America to places that were doing fine on their own before Europeans arrived. Staggering obesity, even more staggering environmental degradation (remember the guano post?), Spam, Coke, McDonald's, shitty beer, and about 75 nuclear detonations by the US and France that leave several areas uninhabitable even 50-plus years later.

It's sad that reality and the shrinking of the world in general have burst our fantasy bubble of island paradises. It's even sadder to think of what it must be like to live there now, and the changes that a 70 year old person living there today must have seen during his lifetime.


Posted in No Politics Friday on January 22nd, 2016 by Ed

Did you know that stereotypes didn't exist before 1922?

Alright, more accurately: the use of the term stereotype in the sense of an oversimplified generalization about a particular type of person or thing is less than a century old. Walter Lippmann's classic text Public Opinion – read it and you'll feel like it was written last year rather than in the wake of World War I – introduced the term to the mass public. Oddly enough, though, he drops it into the text cold, without introducing or defining it, leaving open the possibility that the term was already circulating in literate circles at the time. Alternatively, he may simply have assumed that readers could understand it from context. So the follow-up question is: where did Lippmann get it? He certainly was a great writer. Perhaps he just made it up?

Not exactly. Stereotype was the name of a printing process invented in 1795 by a Frenchman named Firmin Didot, the son of the man who invented the Didot typeface, for those of you who are into such things. Firmin coined it, as was the style of the time, by combining two Greek terms: stereos (solid) and typos (impression). In the literal sense, therefore, a stereotype is a solid impression of a group of people. "Solid" in this case carries no positive connotation (as in "a solid victory" or "do me a solid, brah") but refers to firmness and lack of variation. The stereotype printing process was a means of reproducing printing plates efficiently and then printing from the reproduced (stereotyped) plates rather than the original. This allowed extremely consistent printing at a good rate of production, and Didot was very successful. In short, he pioneered a process for churning out unvarying, nearly identical copies of a single source.

I have no idea how or why Walter Lippmann was acquainted with the technical details of antiquated French printing processes, but it is inarguable that the term is uniquely suited to conveying the spirit of its modern meaning.


Posted in No Politics Friday on December 24th, 2015 by Ed

(No, it's not Friday, but presumably everyone is done with or in the process of being done with this week)

So I've really kept you on the edge of your collective seat with that cliffhanger about guano, right? You're thinking nothing could possibly be more fascinating than the Guano Islands Act of 1856 and you keep waiting to be proven wrong. Get ready.

Last time we noted that guano experienced a meteoric rise from useless animal waste to valuable industrial commodity to totally depleted resource in a relatively short period of time. Many of the small islands on which it was found were essentially scraped clean by the latter part of the 20th Century and, just as a reminder, the landscape that gets left behind by guano mining is…bleak:

Looks promising

If you live on a very small island with no valuable resources and suddenly it is revealed that your island is literally made of something rich countries want, you can imagine how little restraint would be practiced in harvesting and selling it. Once it is all gone, though, you're back where you started, only worse. You have no valuable resources and now you live in an unremediated, totally cashed former strip mine. Hopefully your nation invested what it earned from resource extraction wisely, right?

That brings us to Nauru.

Leaving aside European microstates, Nauru is the smallest country on Earth. It is barely 8 square miles. Total. For comparison, Philadelphia is 134 square miles. Oh, and it's incredibly remote and lacking in infrastructure, so the usual "At least we have tourism!" corollary of the South Pacific does not apply. You're thousands of miles from anything, barren, impoverished, and about one-seventeenth the size of Philadelphia's city limits; what do you do to ensure the economic future of your 10,000 citizens when a fairly large but one-time-only windfall comes your way?

If you're the government of Nauru, you establish the Nauru Phosphate Royalties Trust to make that money last. You also make sure that no one who has any concept of what "investing" entails is in any way involved, and that those involved conceive of investing the way a tycoon in a 1920s comic book might. During the 1980s the Trust managed to lose nearly every penny the island earned by selling itself to the fertilizer industry, through a string of almost comically bad investments. It financed Leonardo the Musical, a 1993 West End flop about the painting of the Mona Lisa that became one of the biggest financial disasters in the history of UK theater. It spent nearly a quarter of a billion dollars (AUD) to build Nauru House, a skyscraper in Melbourne, Australia, which was eventually sold to real estate developers for about half of what was outstanding on the loans the Trust took out to run it. Partially to encourage tourists to come to the island and partially to allow Nauru's elite to get off the godforsaken rock as often as possible, Air Nauru, the government-run airline, purchased a spankin' new Boeing 737. The tourists never came, as tourists do not generally enjoy going to the middle of nowhere to see nothing and stay in unelectrified concrete-block houses. The plane was repossessed by debt collectors in 2004 in what I can only imagine was an awesome job for some Australian repo man. Media accounts pitifully underscored the situation by describing the 737 as Air Nauru's only plane and the island's "only link with the outside world."

At this point Nauru, a poor and indebted nation, did what poor and indebted individuals often do. It turned to the gray economy. Or, you know, crime. Basically crime. It started an "economic citizenship" program of the type that was common prior to 9/11, in which a passport and citizenship were sold to any individual willing to pay $25,000 to the failing government. When that was shut down by the U.S. and Australia, Nauru became a money-laundering hub for Russian and East Asian organized crime. That didn't last long either, but this time Australia offered Nauru an appealing alternative: it could take millions of dirty Australian dollars to build a prison for asylum seekers. Aussies were alarmed in the 2000s by the number of (NON-WHITE ASIAN) asylum seekers attempting to enter the country, and housing them on Australian soil was considered an unacceptable option. A third nation desperate for cash and reasonably proximate to Australia presented the ideal solution. Amnesty describes the detention camp as "a human rights catastrophe, a toxic mix of uncertainty, unlawful detention and inhumane conditions." About a month ago, Nauru decided to simply open the camp and let the detainees wander the barren island as they please, which has improved conditions.

Just kidding, the detainees are getting raped a lot.

The end of the Guano Economy didn't treat anyone especially well, but nobody can claim that it treated them worse than it did Nauru. It might be the actual worst place on Earth right now, excepting Fresno.