IS IT? IS IT REALLY?

Many years ago a student submitted a research paper that I continue to use today (without personal information, obviously) as an example in classes. Unfortunately it's not a positive example. The paper is a ten-page treatise on how the American presidents who died in office were all secretly murdered by the Freemasons.

The student was quite unhappy with the (failing) grade this mess of internet conspiracy theories received. "I did research and cited all of the sources," the student protested. It was true; the paper was exhaustively cited and clearly represented a good deal of research. The problem – and I did cover this in class repeatedly throughout the semester – is that not all "sources" are created equal. Some are legitimate, some are questionable, and some are flat-out nonsense. And back when I was still learning how to teach, it surprised the hell out of me to find that a non-trivial minority of students cannot tell the difference.

In the students' defense, it's difficult to explain how to identify a garbage source. To some extent it's like the old Potter Stewart "I know it when I see it" obscenity test. My best advice, I think, is to err on the side of caution when the objective is to cite supporting research. Sure, government or major media outlet sources will not always be correct and may be flawed. But if the choice is between the Benghazi story on CNN or a post on something called WheresTheBirthCertificate.com, it is in students' interest to play it safe. Even if the CNN story is not entirely accurate or comprehensive, no one's going to think you're nuts for relying on a mainstream media account.

This is not a problem limited to students, of course. A sizable number of our fellow citizens have problems smelling bullshit even when their noses are buried in it. To put it charitably, the proliferation of news and information on the internet has exposed an uncomfortably high level of credulity among the public.


It's problematic enough that a lot of us intentionally seek out information that confirms what we already believe (and discount or reject contradictory facts), but even those of us who make a real effort to inform ourselves about issues often can't tell if what we're reading is total garbage.

Most of you know better than to argue with people in internet comment sections or on Facebook, but tell me if the following sounds familiar. A friend who spends a lot of time on websites with words like "healing" and "wellness" in their name shares this must-read link on Facebook. The article explains how vaccines upset the body's natural rhythms and chakras and enzymes. Vaccines also cause autism and leprosy and gingivitis, according to some really fascinating new findings from Jenny McCarthy. It concludes, based on an argument along the lines of "As a mother, I know what is best for my child", that children should not be vaccinated.

You try to point out politely – perhaps offering your own link that gently debunks this monumental pile of baloney – that the information your friend has provided is not entirely accurate.

As you know from experience, this rarely has the desired effect of actually informing the recipient. Instead, your friend says something along the lines of "It's so hard to know what information to trust anymore" or "It looks like there are a lot of good arguments on both sides."

This is one of those classic red flag statements – not unlike "I'm entitled to my opinion" or "I guess we'll have to agree to disagree" – that means "I am clearly wrong but I have no intention of changing my mind." It must be, because taking the statement literally is difficult. "It's so hard to know what to trust." Is it? Is it really? No. It's actually pretty easy.

No, doctors and experts and scientists are not infallible. They and we believe things to be true that later turn out to be wrong. But is the American Medical Association a safer bet than the Spiritual Holistic Wellness Center or the Crunchy Moms tumblr? Yes. Ten times out of ten. People argue that it's healthy to be skeptical of consensus and the establishment in any field, and that's true. However, their skepticism appears to end where the blogs and message boards and pseudoscience collections begin.

I've learned my lesson now. We cover the common characteristics of denialism, pseudoscience, opinion, frauds, and plain old bullshit. Hopefully students leave knowing how to identify those things. Unfortunately this prepares them for a lifetime of frustration from dealing with the millions of Americans who can't.

THE FOUNDATION IS OPTIONAL

With the possible exception of psychology, no field of academic study is more faddish than education. Every few years another round of assessments and surveys shows us that the vast majority of students – just like the vast majority of Americans – have knowledge and skills that cannot even be described as minimal. "Minimal" implies some understanding of a given subject, and often that is not the case.

Armed with the latest Look How Dumb Everyone Is survey, new educational techniques and tools are developed to join the long line of failed techniques and tools that were supposed to solve this problem in the past. One of the most dramatic paradigm shifts occurred when it was collectively decided that fact retention and rote learning (which remains the foundation of the educational systems in places like China and Japan) were ineffective. Instead, we were to focus on building students' higher order intellectual skills – critical thinking, reasoning, problem solving, and so on.

In my view, this is a shining example of one of the worst tendencies of academics – using jargon-heavy theories to explain away why students are so bad at something. What is this shift toward the almighty "critical thinking" talisman but an effort to excuse students' woeful lack of facts, information, and basic skills? If the students demonstrably cannot write well, do math, or remember facts, we have to say they're good at something. What better than an abstract concept that proves remarkably difficult to measure? Sure, we can't prove that students have Critical Thinking skills, but…you can't prove that they don't. Voila.

The dismissive attitude toward facts and information has gone off the deep end in the last decade. Now that everyone has a smartphone, there's no need for students to know anything at all. Any facts they will ever need can be looked up in thirty seconds. What's important, we're told, is that they know how to interpret and Think Critically about things. This has always struck me as dubious.


We are to believe that students who know next to nothing about entire fields of knowledge somehow have good analytical skills in those same areas. There's no foundation, but somehow there is a mighty edifice built on top of it.


Don't get me wrong, I understand the futility of many kinds of rote learning that were popular a half-century ago. A student does not really gain anything from being able to name all nine Supreme Court justices or knowing the capital of every nation on Earth.

However, surveys continually show that students don't know things that are relevant either. Are we to believe that people who can't explain who Napoleon was or don't know which side the Russians were on in World War II can somehow think usefully and critically about history? That someone who has no idea which branch of the government holds which powers can understand and analyze our system? That someone who can't define "baroque" and doesn't know when or why Impressionism became popular has a useful grasp of art in the context of culture? Most amazingly, we are asked to believe that the ability to look any of these facts up on an iPhone will enable students to skip the knowledge step entirely and launch right into critical analysis. OK.

The underlying problem – and I'm sure some fad will pick up on this eventually, perhaps in another decade or two – is that the act of learning facts and information forces students to engage the material. Learning which powers belong to Congress requires one to read some stuff about American government, as memorizing who painted various works of art requires one to look at works of art. The hidden cost of the "Who cares, they can google it" mindset has been the "Why bother?" attitude with which, studies show, students now approach all of their academic tasks. It's not like they're spending less time on rote learning in favor of more time on other academic tasks; they're just spending less time learning anything.

No one in chemistry would argue that students can skip the Periodic Table, nor would anyone in math say, "Sure, skip algebra and trig, just go directly into calculus." Yet in the soft sciences and the humanities we continue to shoot ourselves in the foot by requiring students to learn – actually learn – less and less. We wrap up their ignorance in academic babble and explain how it's acceptable, or even good for them, to know so little. We come up with new "curricular enhancements" and pedagogical theories that, lo and behold, do not eventually bear fruit in the form of a generation of students with great thinking and reasoning skills despite being almost completely devoid of knowledge.

Shocking, really.

BUT IT'S CHEAPER, RIGHT?

Children born after roughly 1995 will need to have the concept of a pension explained to them. It will be about as relevant to their lives as the carburetor, the telephone switchboard operator, and the Victrola. And we will have to explain that after St. Ronnie descended from heaven and a lot of people in expensive suits spent ten years doing blow, corporate America came up with this great idea: rather than having a defined benefit, why not have defined contribution retirement plans? It would cost employers far less, sure, but it would benefit the working stiff, too! Why be saddled with a defined benefit when you can invest your money in Mutual Funds (remember when those were all the rage? Gosh, I Love the 90s!™) and watch it grow, like, 10-15% per year! Hell, maybe 20%.

The internet came along at just the right time to make this seem plausible. Look! Websites! E-Trade! You can be your own stockbroker! Sure, nobody really understands any of this shit, but…Mutual Funds! A trained monkey could pick those, and our Investment Professionals take care of the rest. You just get a drink with an umbrella in it, sit back, and watch your money grow.

Now that this new era of capitalism is mature – if any aspect of such a scheme can be so labeled – it turns out that the estimates of future gains may have been slightly exaggerated. Maybe we all were a tad optimistic. So maybe it has been more like 5% annualized, if you're lucky. And then there are the Investment Professionals. Boy, we should maybe have screened them a little more carefully, or supervised them a little, or maybe not incentivized gambling with your money for short-term gains. And then there was that whole real estate thing, which no one could have foreseen. Everyone knows that real estate is a good investment! OK, OK. Lessons learned.

Here in academia, we're one of many professions currently bemoaning the sluggish job market and pointing to Aging Boomers Who Won't Retire as part of the problem. Having almost completely abandoned the defined benefit in favor of defined contribution plans 20 years ago, our social betters apparently never considered that people won't necessarily retire when we (collectively) might like them to if we give them a retirement plan whose value changes, quite literally, by the second. Oddly enough, they seem to be hanging on "just another year or two" hoping that The Market will increase the value of their savings – savings that are, even among responsible savers, often pretty meager.

"I guess they should have saved more!" we say with wagging chins and scornful glances. Well, thanks to another invention of the 1980s – constant downward pressure on wages, temp-ification of the profession, and so on – even people who saved rather aggressively might have amassed comparatively little over their working lives, given the cost of living as they cross age 65. It turns out that if a worker's retirement savings is a percentage of their earnings and you pay them jack shit, they reach their late 60s and can't necessarily afford to retire. Contrary to the propaganda, most people in higher education are not making a ton of money. And then people in the profession wonder, gee, why won't all these old people just retire? Well, saving 20% per year on a salary that topped out at $68,000 after forty years isn't going to be very reassuring to a 65 year old. What if I live another 20 years, they think. Better work just a few more…

Everyone in this country tells you that things like communism only work In Theory, and they are right. What they neglect to mention is that most of the things about the system we have in place – and actively endorse at all levels of society – only work In Theory too. Sure, 401(k)s sounded great, if the market gave 10% annualized returns and if one's income increased steadily over the life course. That's two Ifs too many, and the end result has been a predictable trainwreck: too many elderly people end up having to retire on Social Security and little else, or they simply work until they drop waiting for their 401(k) of magic beans to grow another 10 or 20 percent.
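
For illustration's sake – these are invented, back-of-the-envelope numbers, not anyone's actual plan or returns – here is a quick sketch of what the gap between the promised 10% and the realized 5% means over a forty-year working life:

# Hypothetical example: same worker, same contributions, only the assumed
# annual return changes. All numbers are illustrative, not real plan data.

def nest_egg(annual_contribution, annual_return, years):
    """Future value of a fixed annual contribution, compounded once per year."""
    balance = 0.0
    for _ in range(years):
        balance = (balance + annual_contribution) * (1 + annual_return)
    return balance

contribution = 50_000 * 0.10                  # hypothetically, 10% of a flat $50,000 salary

promised = nest_egg(contribution, 0.10, 40)   # the "10% forever" sales pitch
reality = nest_egg(contribution, 0.05, 40)    # the 5%-if-you're-lucky outcome

print(f"At 10% per year: ${promised:,.0f}")   # roughly $2.4 million
print(f"At 5% per year:  ${reality:,.0f}")    # roughly $630,000

Same paychecks, same forty years of dutiful saving; the only thing that changed was the return assumption, and the nest egg shrank by roughly three-quarters. That's the trainwreck in a few lines of arithmetic.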

With no disrespect to the older readers, watching a 77-year-old perform most jobs is not an inspiring sight. Someone pushing 80 has no business being in a classroom, for example. Yet here they are, still teaching, still working in any other profession in which a mandatory retirement age can't be (or isn't) enforced. For years (not coincidentally, the years I was desperately searching for a job) I wondered what was wrong with these people. Why won't they retire? Are they selfish? Senile? Delusional? Over time I learned, though, what three decades of stagnant salaries, increasingly expensive health benefits, and the "slight" under-performance of the ol' Employee Retirement Plan can do to one's definition of the right time to retire. Meanwhile, the real under- and unemployment rate for people in their twenties is, what, 50%?

But hey, it saved our employers money. So it's a win, according to the gospel of American economic wisdom (St. Friedman version).

CROWDSOURCE MY DOCUMENTARY, "THE SLAPPING OF AMANDA PALMER"

Sometimes the stars align perfectly.

Actor/smirking chimp Zach Braff has brought renewed attention to the practice of wealthy celebrity assholes using Kickstarter to fund their for-profit endeavors by soliciting donations to film a sequel to the unbearable Garden State. Braff, who makes over $350,000 per episode (!!!) on his sitcom and netted over $36 million in box office revenue alone for Garden State, admits that he has traditional funding offers for the sequel. But why spend your own money when you can spend someone else's? Braff claims to want "artistic control" over casting and editing, hence avoiding studio funding. And then he offers speaking roles to anyone who donates $10,000.

Then there's Amanda Palmer, whose own Kickstarter crimes are extensive: she claimed that she needed $100,000 to record an album, raised $1.1 million, and then solicited unpaid volunteer musicians (to play shows people would pay to get into) because, she claimed, she could not afford to pay them. Wow, you'd think a successful recording artist with a multimillionaire author spouse could afford to pay a couple of touring musicians! But you just don't understand the music biz.

Now, Palmer is back in the headlines with her latest desperate grab for attention, a "poem" called "A Poem for Dzhokhar," about the Boston Marathon suspect. She shared this masterpiece on her website, closing with a donation button "for her time and effort" despite admitting that it took "about 9 minutes" to write.

Since Ms. Palmer is such a fan of crowdsourcing and brazen internet attention grabs, I'm sure she'll be the first to donate to my experimental documentary film project, The Slapping of Amanda Palmer.

What it is: A short film documenting the planning and execution of a journey across the country wherein I locate Amanda Palmer and slap her across the face with a glove. As a contingency plan only, I may slap her with an open hand if the glove fails.

Fundraising goal: $100,000

+$15,000 for first class travel and luxury hotel accommodations
+$35,000 for high end cameras (operated by volunteers)
+$10,000 for handmade artisanal calfskin slapping glove
+$20,000 for six months of slap training with certified masters in slapkido and Advanced Combat Slapping (ACS)
+$20,000 for my time and creative efforts

Reward levels (the Kickstarter page is still being verified)
$100: A signed photo of Ed plotting to slap Amanda Palmer.
$500: A non-slapping role in the film. Donor must pay for own travel costs.
$1,000: Donor will receive the opportunity to pose for pictures behind the slapped poetess/artist/musician after I subject her to slapping.
$5,000: A private dinner and one-on-one slapping session with the director/star/producer of this documentary. Donors are guaranteed at least one clean, unobstructed slap.
$10,000: Six weeks of intensive training in Slap-kido and Advanced Combat Slapping (ACS) with master practitioners at their desert retreat near Kingman, AZ. The music of Enya plays over loudspeakers at all times throughout the experience.

Thanks in advance for your support! I'll send you a link as soon as the Kickstarter is processed and approved. In the meantime, buy some artwork or a bumper sticker! No matter what, just keep sending money. People like Amanda and me are important and we deserve it.

YOU HAVEN'T SEEN ANYTHING YET

In the cacophony of Boston-related news coverage last week, the death of USA Today founder Al Neuharth on April 21 barely registered. The way perceptions of Neuharth's paper changed since its founding in 1982 is a fascinating look at how American media have changed as a whole. To put it another way, the relative consistency of USA Today over the past three decades highlights how much the rest of our media have changed around it.

Despite being the most widely circulating newspaper in the country (although the Wall Street Journal also claims this honor, depending on how circulation is measured), USA Today has always been something of a joke. Journalists and readers both derided it when it debuted in the Eighties, and it has been the butt of countless jokes ever since. It is not difficult to see why.


Its visual style – particularly its parodied-to-death front page "Snapshots" graphics – and willingness to place advertisements everywhere (including the banner headline) made it difficult to take seriously. That it was (and is) commonly given away for free in hotels and institutional settings reinforces the perception of the paper as disposable, shallow, and generally Less Serious than Real Newspapers like the New York Times, WSJ, and other big city dailies.


As is so often the case in a nation that lets the free market determine which media outlets succeed or fail, USA Today established some measure of legitimacy with its popularity. It's hard to ignore a paper with circulation that spills into seven figures. But the hue and cry throughout USA Today's rise in the 1980s interpreted its sales figures as a harbinger of the apocalypse. "America is doomed if this is the kind of garbage we are going to read", said many a snobbish, albeit not entirely incorrect, commentator.


It looks like a comic strip! It's more advertising than news! It's just so un-serious!

How funny it is to fast forward to 2013 and see USA Today in its current position as part of the "old guard" of the American media; a remnant of a bygone era. Its emphasis on graphics, ads, and short blurbs in place of feature stories all became common in the intervening years. Its graphics, in fact, now look quite tame – almost quaint – in comparison to what media outlets routinely plaster all over the internet, cable TV news, and newspapers today.


In thirty years USA Today went from the bottom of the journalistic barrel in the U.S. to an example of how things were done in better days – without fundamentally changing. Everything else got much worse.

Direct comparisons are difficult because newspapers as a medium have largely faded into the background of American media empires. Nonetheless, the weeping and rending of garments that accompanied USA Today's emergence shows how little we knew in the 1980s about how much worse the media could get. We hadn't foreseen the Glenn Becks, the 20-words-or-less Headline News network, the bombastic graphics and music, the entertainment-as-news ratings bait, and all the other rotten aspects of the system we have today. Hell, CNN has been doing 24-7 Boston Marathon bombing coverage for the past week; did it actually deliver more or better news than USA Today during that time? Probably not, unless Grief Porn counts as news now.

The pessimist's mantra – "Don't worry, it will get worse" – would have been sage advice to anyone who saw USA Today during its infancy and declared it the worst of the worst. When I watch TV news these days, I am disgusted by how bad it is. What really depresses me, though, is not how bad it is now, but that it is inevitably going to get worse.

URGENT MEASURES

In the wake of any disaster in the United States, someone will take it upon himself to point out that what we consider tragedies are part of daily life in other places. Three people die in a terrorist attack in Boston (a fourth later during the manhunt) and the entire country loses its shit. Meanwhile, random bombings in places like Syria, Iraq, Afghanistan, and others kill a few dozen people on a daily basis.

What I infer from this is not that Americans shouldn't complain about terrorist attacks, despite the fact that they're remarkably uncommon here and, statistically speaking, we should probably worry more about every asshole in the country having access to a wide array of firearms. Instead, this underscores the fact that the United States is remarkably well prepared for a terrorist attack or (since Katrina) city-scale disaster.

Despite the appearance of chaos all week, Boston was prepared as hell to handle what happened. The marathon planners diverted the race (as they had contingency plans developed for exactly such an event). The wounded were in hospitals within minutes, keeping the fatalities surprisingly low under the circumstances. Police secured the scene quickly and, working with federal agencies, identified the perpetrators within a day or two. Then, after the suspects ambush-killed a police officer in a patrol car, there was a moving shootout and standoff in which no one was killed despite hundreds of rounds being fired. That's because the local governments had the city on lockdown, and people obeyed the recommendations made by law enforcement. I'm sure criticism will develop as the events recede further into the past, but dang, Boston. All in all, excellent job. I'd challenge any city or nation to do better, even though I'm sure many could do equally well.

This is the point at which people start asking what we can do to prevent attacks like this in the future. The answer is clear: nothing. Sure, the errant, racist media coverage was a disaster, but that's not a matter of public policy. Short of banning public events or repealing the 4th Amendment, we're about as safe as we're ever going to be. All the metal detectors, closed-circuit cameras, armed cops, and knee-jerk proposals for new legislation won't make us one bit safer – we already have enough layers of security in place to catch the Idiot Terrorists, the only group that would be deterred by those kinds of things. When people are making backpack-sized bombs out of common household items and black powder, there really isn't much anyone can do to stop them. Yes, that's scary. That's why it's called "terrorism."

Events like this are a big part of our culture of fear, and we're encouraged to incorporate this fear into a kill-em-all worldview. But here's the thing: complete security is an illusion. If it could exist, it would horrify you to see what it looks like. What are you going to do? Refuse to leave the house? Stop attending events in cities? Stop traveling? Live in a bunker in rural Montana? We can't spend the rest of our lives scared of our shadows, either individually or as a society. There's no point in basing public policy on our inability to accept the fact that we can't be 100% safe at all times. I guess 99.99% safe will have to do. The sooner we accept that, the better off we will be.

OUT OF TIME

In higher education we spend ample time discussing the idea of a core curriculum. Every university comes up with a buzzwordish name for it, but the concept is the same: to define the basic, bare minimum knowledge that we feel a student must have, in addition to whatever specialized knowledge they get in their field(s) of interest, to leave college with a useful understanding of the world and the skills required to function in it. Unsurprisingly these core curricula focus on writing/composition, basic math and science, and history. While it is fair to note that some students get college degrees without mastering some or all of these core skills, polling data shows that Americans are ignorant about history and world affairs to a troubling extent.

If I may briefly mount my pedagogical high horse, I consider two historical events – if you could only pick two – absolutely essential to understanding modern American society and government. The first is the American Civil War. The other is World War II. No, I don't believe students benefit from memorizing the names of battles and generals. I do think that if one is really to understand the fundamental political conflicts in the United States, an understanding of the causes and aftermath of the Civil War is indispensable. Likewise, modern global politics (and a good deal of American exceptionalism in policy both foreign and domestic) is rooted in WWII.

K-12 classes still overwhelmingly choose to teach history chronologically. This, in my experience and judging from what I commonly hear from students, results in a seriously detrimental lack of emphasis on modern history. The academic year begins with ancient Greeks and Romans and ends sometime in May, usually having gotten no further than the Industrial Revolution or perhaps World War I. Accordingly, I get a ton of young people who, through no fault of their own, have been taught more about Plato and Tacitus than about the Cold War, decolonization, the Vietnam War, globalization, the collapse of the Iron Curtain, and 9/11 combined. Recently I found a class of 25 honors students – excellent students – totally ignorant of the basic aspects of the 9/11 attacks and the subsequent propaganda surge leading to the Iraq War. And why should they know? They were 8 when it happened, and it has never been taught to them.

American students do get the Civil War. They might get it in bizarre or ideologically motivated ways (some southern schools, I discovered, continue to teach that slavery was not the root cause of the War) but they get it. They have a basic understanding of what happened. But World War II? The Holocaust? The Treaty of Versailles and the rise of Nazism? The complete devastation of the industrial powers of Europe and Asia that led to 20 years of unparalleled economic growth in the U.S.? Western nations' abandonment of Poland, exploitation of empires, and refusal to take Jewish refugees? They have nothing, really. What they know about WWII is what they get from movies and from Call of Duty video games. They often believe (thanks to films like Saving Private Ryan) that Americans fought the Nazis essentially alone. They rarely know that the Soviet Union was America's ally, and primarily responsible for the military defeat of the Third Reich. They rarely understand why or how the Holocaust happened, or the economic scapegoating of Jews and other "others" during the post-WWI economic collapse in Germany. They fail to recognize how the War accelerated decades of technological development (radar, nuclear power, aircraft, electronics, medicine, etc.) into a few short years. They think – if they think anything at all about it – that America beat the Nazis and someone else (either the Chinese or Japanese) because we invented the nuclear bomb.

Everything – from international terrorism to the Israeli-Palestinian conflict to 20th Century American economic growth to the housing crisis to the Vietnam War to the woes of underdeveloped countries – about the modern world can be understood completely only by tracing the roots of these events at least as far back as WWII. And increasingly I – I can't speak for anyone else here, but I doubt I am alone – find that students have the least knowledge about these more recent events. Try saying "Arab Oil Embargo" or "Mikhail Gorbachev" to a room of college freshmen and see what kind of looks you get. Hell, try it with a group of adults; it probably won't be much better. We know very little of recent history and what we do know is often wrong. Is it any wonder that opinions about current events rarely make sense?

Perhaps the recent past is deemphasized because it is assumed, incorrectly, that students somehow know this information because "it didn't happen that long ago." Or maybe grade- and high school curricula are simply designed to dwell on ancient times at the expense or exclusion of the 20th Century. In either case the consequences are the same: parochial attitudes about the world and a skewed understanding of any issue that takes place outside of the bubble around our immediate lives.

RED FLAGS

Mike Konczal deserves a huge back-pat for blowing up the internet with this post about a new academic paper identifying several flaws in the main piece of pro-austerity research at the heart of Paul Ryan's argument since 2010.

In 2010, economists Carmen Reinhart and Kenneth Rogoff released a paper, "Growth in a Time of Debt." Their "main result is that…median growth rates for countries with public debt over 90 percent of GDP are roughly one percent lower than otherwise; average (mean) growth rates are several percent lower." Countries with debt-to-GDP ratios above 90 percent have a slightly negative average growth rate, in fact.

This has been one of the most cited stats in the public debate during the Great Recession. Paul Ryan's Path to Prosperity budget states their study "found conclusive empirical evidence that [debt] exceeding 90 percent of the economy has a significant negative effect on economic growth." The Washington Post editorial board takes it as an economic consensus view, stating that "debt-to-GDP could keep rising — and stick dangerously near the 90 percent mark that economists regard as a threat to sustainable economic growth."

In short, three researchers from UMass show that the original paper has three major flaws. First, it selectively excludes data on high-growth, high-debt countries. Second, it uses a bizarre (and statistically ridiculous) method of weighting the data. Third, and perhaps most awesomely, the authors made a formula error in the Excel spreadsheet (!!!) they used to analyze the data. As Mike says, "All I can hope is that future historians note that one of the core empirical points providing the intellectual foundation for the global move to austerity in the early 2010s was based on someone accidentally not updating a row formula in Excel." Since he explains the three major errors well, I won't belabor them here.
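
One of them is worth making concrete, though, because it's easy to miss how much the weighting decision matters. With invented numbers – this is a toy illustration, not R&R's actual dataset – here's the difference between giving every country one vote and giving every country-year one vote:

# Toy illustration with made-up numbers, not the actual Reinhart-Rogoff data.
# (country, years spent above the 90% debt threshold, average growth in those years)
high_debt_episodes = [
    ("Country A", 1, -7.0),    # one awful year above the threshold
    ("Country B", 20, 2.5),    # twenty decent years above the threshold
]

# One country, one vote (the approach the UMass critique objects to):
per_country = sum(g for _, _, g in high_debt_episodes) / len(high_debt_episodes)

# One country-year, one vote:
total_years = sum(y for _, y, _ in high_debt_episodes)
per_year = sum(y * g for _, y, g in high_debt_episodes) / total_years

print(f"Weighting each country equally:      {per_country:+.2f}%")   # -2.25%
print(f"Weighting each country-year equally: {per_year:+.2f}%")      # +2.05%

Twenty-one hypothetical country-years of data, and whether "high debt" looks like an economic death sentence or a shrug depends entirely on a weighting choice the original paper never explained.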

Explaining the petty intricacies of academic research for mass consumption is not easy, and that's what Mike really nailed here. However, I want to point out yet another issue with the original research.

The problem began when no one in academia could replicate the R&R paper. Replication is at the heart of every field of scientific inquiry. If I do a test proving that water boils at 212 F, then everyone else should be able to get the same result. In order to make that possible, I have to share my data with the rest of the scientific community – what kind of vessel I used to boil the water, the altitude and atmospheric pressure, the mineral content of the water, and so on. I have to show everyone else exactly how I did it.

What non-academics might underestimate reading Mike's account is just how egregious a red flag it is when A) no one can replicate a major finding despite considerable effort and B) the authors of a controversial paper refuse to share their data or explain their methods. To a non-academic, the data might seem like "property" owned by the authors to which no one else is entitled. In academia that simply is not how it works. Every reputable journal on the planet has a policy of sharing replication data, and any publicly funded (NSF, etc.) research must, by law, make all data publicly available.

So when R&R not only refused to share data for years but also refused even to tell anyone how they weighted the observations, Red Flag doesn't begin to convey how sketchy that is. Fireworks should have been going off at this point suggesting that this is not research to be taken seriously, and in fact it is possible that the "findings" were simply made up.

The science/academic people out there are probably wondering how in the hell one gets a paper published without even explaining the methodology used in the analysis. Good question! The answer is our friend the "special issue" in which papers are invited by the editor rather than being peer-reviewed. In other words, the R&R paper didn't even go through the standard review process (which is plenty flawed, but at least it's something) before publication. No one at any point in the process checked these results for accuracy or even looked to see if the authors actually did an analysis. Cool.

So that's how a paper based on cherry-picked data, a scheme for equally weighting every country in the analysis (which wouldn't pass muster in an undergraduate statistics course), and a computational error became the primary evidence in favor of a pro-austerity agenda here and around the world. Mike is charitable in calling these issues innocent mistakes on the part of the authors. They might be, but I have a hard time believing that Harvard economists make the kinds of errors that somnolent undergrads make on homework assignments. When authors refuse requests for data, 99.9% of the time it's because they know goddamn well that they did something shady and they don't want you finding out.

Are these results fabricated? No. They did an analysis. A really bad one. My guess is that they ran dozens of different models – adding and removing variables, excluding and including different bits of data – until they got one that produced the result they wanted. Then they reverse-engineered a research design to justify their curious approach to the data. Every academic who handles quantitative data has been there at some point. That point is called grad school.

USE YOUR INSIDE VOICE

I'm giving one of my favorite lectures today, on the history of political campaign advertising.

It's fun to watch students realize that very little of what we see in modern politics is new. The basic messages (and to a lesser extent, techniques) have been around for a couple of centuries. Americans generally know very little about history, so they tend to assume that their problems are new. Oh, the media is so partisan! (Yeah, check out a 19th Century newspaper) Campaigns are full of mudslinging and dirty attacks! (Find one that wasn't) People are so stupid! (As they have always been).

Every time there is a disaster or tragedy these days, we look at the contents of internet comments sections and immediately lose whatever hope for humanity might remain within our cynical hearts. What has happened to this country?, people ask. The correct answer is, of course, nothing. The only difference between Americans today and those from a century ago is that now we have a global platform for making our knee-jerk, reactionary, and ignorant opinions public. It's doubtful that people in the 1890s would have been filling Twitter and newspaper websites with pearls of wisdom and kindness.

It is completely natural to speculate, to have thoughts driven more by fear or emotion than reason, and to express anger. I'm sure people felt the same way when they heard about Pearl Harbor as they did on 9/11. Those thoughts and feelings, however, were not broadcast around the world and recorded for posterity by the millions.

They didn't run to Facebook to share tacky pictures exhorting one another to pray, engage in wild, half-assed speculation, or fuel their pet conspiracy theories. If they had the opportunity, they would have.

As it was, the only way to express those thoughts was out loud. Common sense and a tiny bit of decency were probably enough to prevent many people from doing that.

The internet, however, offers no barriers to speaking what we overly-generously call our minds. We have anonymity, an audience, and no repercussions for what we say. Hell, why not air our ridiculous ideas about The Muslims and tell a bunch of other anonymous strangers to fuck off. There are no costs. We can say whatever we want, immediately. And this is the problem, because Americans have a notoriously difficult time distinguishing between can and should. Maybe it wouldn't hurt to express more of our uninformed musings using our Inside Voice rather than Twitter.

The internet and 24-hour cable news environment overwhelms us with "grief porn" in response to events like Monday's bombing. It encourages us not only to express great sadness but to do so publicly. It's not enough to spectate; we have to be part of the chorus of prayers and tears. We personalize things to make it about ourselves – OMG, I once knew a guy who lives in Boston! – and we use the collective anguish as fuel for irrational ideas. Why wait for facts when I'm angry and upset now?

We've always been less than enlightened thinkers as a nation. Today, though, we have a window into one another's half-baked thought processes, and speculative "journalism" encourages us to join in the leap to conclusions. We have every opportunity to say things and very little encouragement to think before doing it. The exhibition of bile and stupidity that we see online is evidence that we could all benefit from a quieter and more reflective response to the horrible things the world throws at us.

BRIGHT ENOUGH

You've heard the silly cliche about states as "laboratories of democracy" in the American federal system, and there is some truth to it. When a policy implemented in one state produces a positive result, other states imitate it. This assumes that state legislatures are innovative and more willing to try new things than Congress, which sounds pretty neat. Unfortunately, it turns out it's much easier to get elected to most state legislatures than to Congress, which means a lot of state legislators are crazy people.

Legislative professionalism is a key concept in the study of state politics (see Squire 2007, "Measuring State Legislative Professionalism," SPPQ 7(2): 211-227 – warning: political science content) referring to the resources available to legislators. In some states, being in the legislature barely qualifies as a part-time job and pays almost nothing (e.g., Kansas), while others like California resemble the U.S. Congress in terms of salary, days in session, staff, and financial resources available. In low-professionalism states, the people who serve are not necessarily the sharpest knives in the drawer.

Consider New Hampshire. The small, lightly populated state elects 400 members to its lower house and pays them $100 per year. If you move there, the odds are half-decent that you can serve in the legislature at some point. You might not know squat about government or politics, but don't worry. You'll find yourself in good company.

This is a partial explanation of why we see so many stories passed around the internet of some Republican state legislator saying something borderline insane and why we see so many truly idiotic things passed through state houses (recent favorites include the Alaska nullification bill and North Carolina's proposal to establish Christianity as the state religion). It is possible that these legislatures propose such bills to attract attention or to make an ideological statement. It is equally possible that they do it because they are composed of people who are dumber than a sack of doorknobs and/or mentally ill.

Not being an optimist by nature, it's hard for me to argue that this scenario produces a net positive in terms of "innovation". In South Dakota, the state house recently killed in committee a bill to criminalize texting while driving. Such bills have been passed with little opposition in other states, and experiments have shown that texting while driving is at least as dangerous as intoxicated driving, if not more so. Furthermore, the bans are a rare example of legislation that enjoys near-unanimous support among voters. You'd have better luck finding people who support legalizing drunk driving.

If such a bill died in committee in Congress, we would follow the trail of money to discover the cause. It would turn out that, in this example, big phone and internet companies hired armies of lobbyists and spent millions to turn members of Congress against it. But that doesn't explain what's happening in South Dakota. The bill died because the lower chamber (it passed in the state senate) is full of people who are too stupid to live, yet somehow judged bright enough to craft legislation. Why bribe or lobby people to support a repugnant issue position when you can simply sit back and let some yahoos convince themselves to support it for no reason other than their own bizarre worldviews?

States certainly are laboratories, but they're ones available to untrained, amateur, and potentially unstable scientists who usually produce the legislative equivalent of an exploding beaker.