October 30, 2015

Urinal partitions a sign of cocooning?

The post below on the disappearance of public showering in cocooning times got me thinking about where else we've seen a major change in awkwardness about uncovering our shame in public spaces. Other than bathing, the only other bathroom activity that we do in front of others is taking a whiz.

Unlike showering, where guys now wait until they get back home, we can't always hold it in when we need to go. So unlike the unused showers, public urinals are still in use. But it has become more common for them to have partitions between them so that no one else will see your shame.

I don't remember those at all from when I was a kid, and they do seem familiar at least from the last 15 years. I tried to find out more precisely when they started becoming common, but couldn't tell from Googling. Maybe a visit to Lexis-Nexis would turn up some industry reports about these newfangled urinal dividers. Someone else can look that up, though.

I also couldn't tell how common they may or may not have been during the previous cocooning period of the Midcentury. Some pictures of bathrooms built back then show partitions, and others don't. Even when they're there, some look original and others look like recent remodels.

That being said, I'm still going to call urinal dividers a cocooning-era sign of anxiety about showing your shame among in-group members who would've been trusted in more outgoing times. In socially withdrawn times, more and more guys act like Painfully Awkward Rob Lowe in a public bathroom.

These dividers are not only installed in highly diverse places where everyone is a stranger, but in those where folks know each other and come from similar backgrounds.

Believe it or not, there is an entire blog dedicated to the current state of bathrooms at Brigham Young University, in the heart of homogeneous white Mormon land. (You never know what data sources are only a few exits down the information super-highway...) And judging from these pictures, their typical bathrooms have partitions between the urinals. Somehow I doubt they were in place back in the '70s and '80s.

In outgoing times, which required greater interpersonal trust, nobody thought twice about draining the snake around others. In a cocooning climate, when people are more suspicious of one another, they're much less likely to allow themselves to be vulnerable enough to uncover their shame around strangers.

October 28, 2015

In debate, Trump all but promises to neuter the media if elected

Trump has been running on an implicit anti-media platform since the beginning. Within the mainstream media, and especially during the nationally televised debates, he has been met almost entirely with half-truths, snark, and naysaying. At the debates, he gets asked far nastier questions, and in a more bald-faced tone, than any of the other candidates.

When he draws thousands and tens of thousands at local rallies, he makes a point every time to inform the TV viewers that the media cameras will never pan to show the full scope of his massive crowds, even when he directs them to. It's their passive-aggressive way of trying to play down his popularity and the unwavering enthusiasm of his supporters compared to everyone else's supporters.

In his speeches, he emphasizes time and again how dishonest the press is, to raucous applause.

Other candidates may complain about the liberal bias in the media, but Trump made it clear tonight that he takes it personally and will not forget. In his open-ended initial answer to a question about what his weakness is, he said that he is too trusting, but if that trust is betrayed, he will never forget and never forgive. If it struck the viewers as an out-of-place answer, perhaps he was giving the CNBC talking-head moderators a warning not to jerk him around in front of a live national audience, and to treat the front-runner with a little respect and seriousness.

Enter question #1: "Aaaaannnnndd Mr. Trump, like, seriously? What's up with your comic-book version of a presidential campaign?"

Later question to a rival: "Aaaaannnnd Mr. Huckabee, you emphasize the importance of morality in politics. Do you think Mr. Trump passes the moral threshold cleared by Bill Clinton, George W. Bush, and Barack Obama? Or is he literally the devil incarnate?"

What, was CNBC doing some cross-promotion with tranny SJWs submitting questions from Tumblr?

If his opening statement sounded out-of-place, so did his closing statement, which he mostly devoted to explaining to the audience that he out-negotiated the host network in setting the terms and length of the debate, so if he can bring the haughty media to heel, just imagine what he can do for the country. One of the talking heads told a bald-faced lie that the debate was always supposed to be two hours, but Trump called him out as a liar. Trump also got in the message that the only reason CNBC wanted the debate to be three hours or longer was to greedily squeeze out as many advertising dollars as possible.

It's one thing to disparage the press on the campaign trail, but to humiliate a network during their own broadcast sends a strong signal about his plans for the media. You think he doesn't have plans for them? Rewind to his opening statement about how he never forgets when someone betrays his misplaced trust.

I don't know whether he plans to break up the media monopoly, to pay for his income tax cuts by levying YUGE taxes specifically on the mass media, or what. ("Hey, you gotta stay a little unpredictable, am I right?") Whatever it is, it will deal a strong blow to the propaganda machine in our country, and it will be a smash hit among both conservatives and liberals, so no one but lobbyists will be protesting. And Trump has already made clear he's not taking anyone's money and won't have to listen to lobbyists -- let alone from the media that's been smearing him the whole time! -- once he's in office.

The media are fully aware of how little trust the public has in them, since they're in the business of reading and sometimes even reporting on opinion surveys. According to the gold standard, the General Social Survey, in 2014 Americans had almost no confidence in the press. Among liberals, 10% have "a great deal" of confidence, 49% have "only some," and 41% have "hardly any". Moderates show about the same pattern. For conservatives, 5% have a great deal, 44% only some, and 51% have hardly any. Disgust and loathing for the media is truly one of the most bipartisan unifying issues out there.
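If you want to check those numbers yourself, here is a minimal sketch of how the breakdown could be re-derived from a GSS data extract. It is not anything from the original post or the GSS's official tools: the file name "gss_extract.csv" is hypothetical, the standard GSS codings are assumed (conpress: 1 = a great deal, 2 = only some, 3 = hardly any; polviews: 1-3 liberal, 4 moderate, 5-7 conservative), and the tabulation is unweighted, so a weighted run may differ by a point or two.

# Minimal sketch: confidence in the press by ideology, GSS 2014 (unweighted).
import pandas as pd

df = pd.read_csv("gss_extract.csv")  # hypothetical extract with lowercase GSS variable names
d = df[(df["year"] == 2014) & df["conpress"].between(1, 3)].copy()

# Collapse the 7-point polviews scale into three ideological groups.
d["ideology"] = pd.cut(d["polviews"], bins=[0, 3, 4, 7],
                       labels=["liberal", "moderate", "conservative"])
d["confidence"] = d["conpress"].map({1: "a great deal", 2: "only some", 3: "hardly any"})

# Percent within each ideological group (each row sums to about 100).
table = pd.crosstab(d["ideology"], d["confidence"], normalize="index").mul(100).round(1)
print(table)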

The only source of comfort that the propaganda machine enjoys is the unwillingness of any major politician to want to go after them. As leaders of the Establishment, the politicians benefit greatly from the Establishment's monopolistic media arm. Only now, the front-runner for the Republican nomination and indeed in the general election has made it one of his many personal missions to cut the media down to size for having treated him so unfairly.

Most politicians not surprisingly come from a culture of legalism, but if Trump wins, they're going to get a taste of a President who follows the culture of honor.

"Payback time!"

Go get those bunch of slack-jawed faggots in the media, President Trump. All of America is behind you!

GSS variables: conpress, polviews, year

No showering at school in cocooning times

Looking through old horror movies, it's striking how many shower scenes there are, from Psycho through the Friday the 13th series. Some movies dialed up the vulnerability factor by making the bather's exposure public, setting them in the showers of a locker room at school:


Then it hit me why you don't see these kinds of scenes anymore -- students haven't showered in the locker room for decades, so the intended audience of horror movies (teenagers and young adults) wouldn't connect with the setting. Carrie (1976) has a classic shower scene that anyone who went to high school in the '70s could have identified with. When they re-made Carrie in 2013, they kept the shower scene, but it must have felt forced and unfamiliar to adolescents of the 2010s.

Of course, there were public shower scenes in genres other than horror -- Porky's, Footloose, Heathers, and others -- but those are also from a while ago.

To establish that these changes in pop culture are reflecting changes in the real world, I went Googling and found three articles from major newspapers, all in 1996: the first one from the Chicago Tribune, a follow-up from the NY Times, and yet another from the LA Times. The reporters interviewed students and gym teachers from various schools in their metro areas and were unanimous that high schoolers had stopped showering after gym class and extracurricular sports.

We can tell the change was abrupt because the gym teachers were in a state of disbelief, rather than saying that the shift had been gradually under way for a while:

The antipathy to taking showers after gym class puzzles some teachers and coaches. "These guys don't want to undress in front of each other," said John Wrenn, a teacher at Homewood-Flossmoor High School in suburban Chicago, who can scarcely conceal his contempt for the new sensibilities. "I just don't get it. When I started in '74, nobody even thought about things like this. The whole thing is just hard for me to accept."

There was a court case in the Pittsburgh area in 1994, where the ACLU sued a high school to end public showering, arguing on privacy grounds (a fat chick didn't want to be embarrassed). The lawyer said he'd never received so much spontaneous positive response from around the country, meaning that the case did not change the climate by itself but merely reflected larger shifts in attitudes that had already taken place nationwide.

That's the earliest example that I could find, and it sounds about right from personal experience. I don't remember anyone using the showers in high school (fall of '95 to spring of '99). In fact, although I can recall what the locker room looked like and where the main doors and the staff office were located, I can't even remember where the showers were placed or what they looked like. My friends and I would just spend the extra 5-10 minutes of "shower time" shooting the bull.

It was like that in middle school, too (fall of '92 to spring of '95). Some of the metal kids went into the showers with their normal clothes on, and started headbanging once their hair had gotten soaked, but it was just a goof. Rarely, somebody would go in for real with their shirt off, but actually taking a shower in public -- never happened.

Here is a thread on Snopes.com from 2004 asking whether showering after gym class was a real thing or only something they showed in the movies. People who went to secondary school in the '60s, '70s, '80s, and early '90s all swear that it was real, and often enforced by the gym teachers. Nobody chimed in to say it was practiced in the mid-'90s or afterward. A more recent article from the Sun-Sentinel in southern Florida confirms that the no-shower trend has continued through the present, and was not just a phase during the era of grunge music, greasy hair, and heroin chic.

The timing of the rise and fall suggests a link to the outgoing vs. cocooning cycle, and sure enough I could not find reports of public showering at school being common in the first half of the 1950s or earlier -- i.e., the pre-Elvis cocooning era.

People in outgoing times are simply less self-conscious about their bodies because sociability requires higher trust levels than cocooning does, and trust means being willing to let yourself be vulnerable in public around other members of the in-group. In a trusting person's mindset, none of the other kids in the showers are going to try anything harmful, so why bother worrying about them? In the suspicious person's mindset, you can never know who's going to try to do you wrong when you're vulnerable, so it's best to just keep your guard up all the time. Certainly that means never uncovering your shame around others.

Only when people start withdrawing into their own private little worlds do they start to obsess over privacy and act compulsively about it. It's not a mere personal preference when 100% of students would suffer an anxiety attack just from taking off their clothes in the locker room, let alone from showering that way in front of everyone. To students of the past 20-some years, it would feel more like child abuse to make them vulnerable to the attention of their peers, all of whom they distrust, so that uncovering their shame in front of them would be the ultimate humiliation.

Not showering in public is not the only case of increasing anxiety about showing one's body in cocooning times. See this earlier post on the absence of flashing, streaking, skinny-dipping, and topless sunbathing (even in France) within Millennial-era culture. See also this related post about how young people stopped joining the nudist movement decades ago, making their membership increasingly saggy and gray-haired.

This wide variety of examples shows the importance of distinguishing between displaying your body because you trust other people not to do you wrong, and displaying it to whore for sex-appeal attention. The streakers of the '70s were not trying to increase their number of Instagram followers as an ego boost, but to put out an extreme form of the basic message: "Hey man, what is there to be worried about or ashamed of? We all trust each other, don't we?"

Likewise, today's skin-baring young people would still drop dead from anxiety if they went out in public with no bra on, or if they went skinny-dipping with a group of friends. The code of "If you've got it, flaunt it" somehow doesn't translate into public showers, where the good-looking girls could easily one-up the uglier girls. But that would require them to be vulnerable in front of a crowd of people they don't trust (their fellow students), so that's a non-starter.

Calling these changes "the new modesty," or whatever, would be foolish, given how narcissistic young people have become. "The new awkwardness" is more like it.

October 27, 2015

Convincing portrayals of ghost encounters on naturalistic TV series

Before the serial drama became a popular genre in television, it was not possible to do a serious "Halloween episode" because the dominant genre was the sit-com. Ghosts, spirits, and the like would have clashed with the light and comedic tone of the series overall. So, on hit sit-coms like Roseanne and The Simpsons, the Halloween episode took the form of the characters playing morbid pranks on one another and telling scary stories.

Then came series that follow fantastic plotlines and themes on an ongoing basis, such as Buffy the Vampire Slayer and Supernatural. The presence of ghosts, spirits, etc., on Halloween would actually feel ordinary in the worlds that these shows are set in. Bummer.

To get the Halloween atmosphere right, the series ought to be naturalistic, so that the Halloween episode stands out as up-ending the ordinary order of things, just as it is supposed to be in real life. We've already seen that it needs to be a drama rather than a sit-com, so that the tone of the Halloween episode won't clash with the usual tone. And the narrative should be serial, so that we can tell there's a disturbance to the ordinary goings-on of the characters whose lives we already know.

The Halloween episode I remember most is the one from My So-Called Life, which features a plotline about a ghost of a former student at the high school who died on Halloween. It is played seriously, as though something supernatural has entered the mundane world of the series, which usually focuses on typical teenage drama.

In fact, for their Christmas episode they again introduced a ghost plotline that was played seriously (the ghost of a homeless teen runaway, not the Christmas Carol kind of ghost). Both ghosts appear in ordinary human form rather than transparent and wispy, which helps the viewer of a naturalistic series to suspend disbelief.

Judging from reviews of these episodes at the Onion AV Club, hardcore nerds can't stand the serious introduction of supernatural elements in a realistic drama. I don't remember these episodes having a credibility problem back then, nor when I re-watched the series a few years ago. There's nothing unrealistic about the occasional encounter with something that can't be explained naturalistically. Haven't we all had some kind of experience like that in real life? And what better time to set it in than during holidays where there's a carnivalesque atmosphere of "up is down and down is up"?

If ghosts, spirits, etc. only made appearances as recurring characters in a fantasy world, they wouldn't stand out as beyond the ordinary. The nerds who write that kind of stuff (e.g. Joss Whedon, Buffy the Vampire Slayer) seem to think of ghosts, vampires, etc. as like invisible friends who interact with the normal characters on a regular basis.

In reality, ghosts are something that we only encounter very rarely, perhaps only once or twice in a lifetime. And those encounters do seem to take place within our mundane lives before and after, not as though we feel transported to some fantasy world where ghosts dwell, or as though some portal has opened up into our world like a horror movie. They have an unsettling surreal quality, where we can't tell if this is a natural or supernatural experience.

Have there been any similar episodes in the 20 years since the ones from My So-Called Life? Wikipedia has a list of Halloween TV specials with sections for "American drama" and "teen drama". It's hard to judge whether ghosts appear at all, or whether it's just about partying on Halloween, let alone whether the ghost encounter is portrayed seriously. It sure doesn't look like it, though.

Earlier, Twin Peaks made surreal episodes -- every single episode was surreal. Expecting such encounters every day or week is not how we have ghost-like experiences in real life.

In any case, you can watch the entire sole season of My So-Called Life for free on Hulu. The Halloween and Christmas episodes are 9 and 15, although it's worth watching the other ones in order, too, to appreciate the contrast in subject matter and theme. It's one of the few pop culture phenomena of my teenage years that I'm not embarrassed to have enjoyed, a kind of Breakfast Club serial drama for the grunge/alternative era. There's certainly a lot worse you could be streaming to get in the mood for Halloween.

October 24, 2015

For moron skeptics, the Bible allows everything because "Many versions, QED"

It has become clear that the attempt to popularize scholarship on the Bible has not led to a more nuanced understanding of Western civilization's sacred texts, but has instead created confusion, dishonesty, error, and downright retardation, where there used to be mere ignorance.

Bart Ehrman, the best known of the New Testament scholarship popularizers, found this out the hard way. An evangelical-turned-agnostic professor, he wanted to enlighten believers and non-believers about what dispassionate research has to say about the Bible, and how those findings might inform contemporary debates about religion in general and Christianity in particular.

His books are easy reads, but you can get the gist of the message from the titles:

Lost Christianities: The Battles for Scripture and the Faiths We Never Knew

Misquoting Jesus: The Story Behind Who Changed the Bible and Why

Jesus, Interrupted: Revealing the Hidden Contradictions in the Bible (And Why We Don't Know About Them)

Forged: Writing in the Name of God—Why the Bible's Authors Are Not Who We Think They Are

These were published between 2003 and 2011, after agnosticism and atheism had taken a commanding lead in American society during the 1990s. Publishing them in such a godless climate was not brave, rogue, or maverick, but simply goading on an already cynical and skeptical population by telling them that PEER REVIEWED STUDIES backed up their gut feeling about contemporary organized religion being mostly full of it.

Like most brainless mobs, though, the religious skeptics got so far out of hand that they didn't just question this or that supernatural tenet of mainstream Christianity -- was Jesus the son of God, was he resurrected, did he die for our sins, and so on -- but straightforward mundane matters like whether Jesus of Nazareth even existed, whether he led a religious movement, or whether he was crucified. Those are all basic historical facts, so imagine the embarrassment to a professor of Christian history that a decent chunk of his followers were so historically illiterate, and so smug in their convictions.

This led him to publish a popular book in 2012 whose title reveals how learned and sophisticated his audience had become:

Did Jesus Exist? The Historical Argument for Jesus of Nazareth

Imagine a historian of ancient Rome having to write a book called, Did Julius Caesar Exist? Ignorance that profound would be too depressing for most Roman historians to even bother addressing at length.

Perhaps the most cancerous outcome of the popularization of Biblical studies has been the widespread belief that because the Bible has been copied and re-copied so many times, not to mention revised and edited by scribes who were not 100% neutral in their changes, we have no solid basis for beginning a statement with, "The Bible says ____". WHICH VERSION, WHICH VERSION, WHICH VERSION??!!

In the mind of the skeptic, in other words, the Bible allows us to believe anything about the topics it addresses, and those puritanical Bible-thumpers (booo, booo) are only quoting the particular version, among so many different versions, that happens to support their claim. For skeptics, there are alternative, competing versions either here-and-now or once-upon-a-time that undercut or even contradict the claims of the Bible-thumpers.

One of the most amazingly stupid statements to this effect came from professional ignoramus Bill Nye the Science Guy. In a 2014 debate against Ken Ham, he likened the history of reproducing the Bible to the "telephone game," where someone begins with a story and whispers it to the next person, who whispers it to the next, on and on down the line, until the final person tells a story very different from the original.

Let's ignore for the moment how copying a sacred text differs from playing the telephone game (serious vs. careless attitude, sacred vs. profane mindset, etc.). What is the point of this analogy except to suggest that the original versions of the Bible were substantially different from those of today, on a wide range of crucial topics?

Does Bill Nye really think the original version of the Ten Commandments went a little something like this? --

Have as many other gods before me as thou wilt. I mean, hey, they're all just different forms of the same Higher Spiritual Power, right?

Thou shalt not commit adultery, unless the side-chick is pretty hot, in which case, hey bro, I totally understand, I'd hit that too.

Honor thy mother and thy father, as long as they give you everything you ask for. If not, you can call them a fag on Twitter since those tightwads deserve to be shamed.

Does Bill Nye really think that, somewhere along the line before they mutated, the original teachings of Jesus included such nuggets of wisdom as these? --

Blessed are the impure of heart...

Blessed are the vindictive...

Blessed are the warmongers...

Judge others by a standard that you would not accept to be judged by.

Go and sin no more, j/k we all know sinning feels good.

Does Bill Nye really think that in the original unaltered form of Romans and 1 Corinthians, the apostle Paul was actually trying to help the churches organize their local Gay Pride Parade?

It's as though the Science Guy thinks the process of copying the Bible leads to a multiverse of teachings, where at every major decision a scribe had to make, some went one way, others went another way, and still others went another way still. With so many branches extending from so many ideological points of departure, somewhere out there is an Anti-Bible that contradicts every major teaching of our received Bible, but which nevertheless traces back in an unbroken chain of transmission to the original autographs.

For skeptics, followers of the Anti-Christ just might be the literal original Christians. You can't get any dumber than this, folks. It really proves G.K. Chesterton's claim that, "A man who won’t believe in God will believe in anything."

At the outset of the crusade, the idealistic popularizers thought their audiences would develop an appreciation for the similarities and differences of the three major textual families that the New Testament comes from, or how the texts reveal the evolving beliefs and practices of the nascent Christian communities. Instead they ended up feeding the smug dismissiveness of a bunch of morons.

Maybe now Bart Ehrman understands why the ministers and priests he talks to have made a conscious decision not to open up this can of worms for their congregations.

Bonus video: see how often a typical crowd of internet scholars brings up the objection about "many versions and revisions!" when getting into a real-life argument with street-preaching troll extraordinaire Brother Dean. I found examples around 18m, 29m, 33m, 45m, and 1h 8m.

October 23, 2015

Reminder: Iowa caucus not very predictive of national nomination

A recent poll shows Ben Carson ahead of Donald Trump in Iowa, which may be a temporary fluke or may signal that Carson will win the state's caucuses with Trump in a close second.

Even in the worst-case scenario where Carson wins Iowa, would that spell doom for Trump's shot at the nomination? No: the results of the Iowa caucuses are a poor predictor of what happens nationally. The primaries in New Hampshire and South Carolina are far more accurate predictors, and there Trump enjoys a comfortable lead.

From Wikipedia, here are the past results of the Iowa Republican caucuses where an incumbent was not running. Winners in Iowa are listed first; the eventual nominee is marked with an asterisk.

2012 – Rick Santorum (25%), Mitt Romney* (25%), Ron Paul (21%), Newt Gingrich (13%), Rick Perry (10%), Michele Bachmann (5%), and Jon Huntsman (0.6%)

2008 – Mike Huckabee (34%), Mitt Romney (25%), Fred Thompson (13%), John McCain* (13%), Ron Paul (10%), Rudy Giuliani (4%), and Duncan Hunter (1%)

2000 – George W. Bush* (41%), Steve Forbes (31%), Alan Keyes (14%), Gary Bauer (9%), John McCain (5%), and Orrin Hatch (1%)

1996 – Bob Dole* (26%), Pat Buchanan (23%), Lamar Alexander (18%), Steve Forbes (10%), Phil Gramm (9%), Alan Keyes (7%), Richard Lugar (4%), and Morry Taylor (1%)

1988 – Bob Dole (37%), Pat Robertson (25%), George H. W. Bush* (19%), Jack Kemp (11%), and Pete DuPont (7%)

1980 – George H. W. Bush (32%), Ronald Reagan* (30%), Howard Baker (15%), John Connally (9%), Phil Crane (7%), John B. Anderson (4%), and Bob Dole (2%)

1976 – Gerald Ford* (45%) and Ronald Reagan (43%)

So, Iowa has only predicted the nominee in 3 out of 7 cases. Not exactly a bellwether. They routinely pick soft-spoken candidates who are nevertheless destined not to secure the party's nomination. In recent primaries, they have favored candidates with strong evangelical appeal, who do not go over well at the national level.

They picked Bush I over Reagan in 1980. They picked both Bob Dole and Pat Robertson way ahead of Bush I in '88 -- when he was the sitting VP of the best-liked President since Kennedy, who had won a landslide re-election in the previous cycle. They chose three candidates over the eventual nominee in '08, with hopeless Huckabee as their first choice. In '12, at least they almost called it, but still favored Santorum by a slim margin.

They did correctly predict Bush II in '00, Dole in '96, and Ford way back in '76. None of those winners was running on a strong evangelical platform, however, the way that Carson is.
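If you want to re-run that count, here is a tiny sketch of my own (not anything from the post) that pairs each Iowa winner from the list above with that cycle's eventual nominee and tallies the hits; the same structure extends to the New Hampshire and South Carolina results discussed below.

# Iowa winner vs. eventual nominee for the seven open races listed above.
contests = [
    (2012, "Rick Santorum",     "Mitt Romney"),
    (2008, "Mike Huckabee",     "John McCain"),
    (2000, "George W. Bush",    "George W. Bush"),
    (1996, "Bob Dole",          "Bob Dole"),
    (1988, "Bob Dole",          "George H. W. Bush"),
    (1980, "George H. W. Bush", "Ronald Reagan"),
    (1976, "Gerald Ford",       "Gerald Ford"),
]

hits = sum(iowa == nominee for _, iowa, nominee in contests)
print(f"Iowa picked the nominee in {hits} of {len(contests)} open races")  # -> 3 of 7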

Turning to New Hampshire, its primary winner has gone on to secure the nomination in 5 of these 7 cases. The two misses were narrow ones: Buchanan over Dole in '96, and McCain over Bush II in '00. In neither case did New Hampshire put two or three candidates ahead of the eventual nominee, nor was either incorrect pick an evangelical.

The most predictive of the early primaries, however, is South Carolina, where since 1980 every winner of the state's primary has won the nomination, except for Romney losing out to Newt Gingrich in '12, perhaps due to home-turf advantage in the Deep South.

Trump has enjoyed a healthy lead over Carson in New Hampshire, and an even wider double-digit lead in South Carolina. So even if Iowans do choose Carson over Trump, that won't matter in the big scheme of things.

By this point, Iowans are almost deliberately trying to choose someone who won't win, just to keep any one candidate from enjoying too much success. It's part of the extreme egalitarianism that the Scandinavians brought into the Upper Midwest (see the Law of Jante).

October 22, 2015

Electronic rather than acoustic for low-key covers of over-the-top songs

After the peak of high-energy music in the 1980s, the hip new thing was to play low-key acoustic covers of songs that originally had electric instruments and layer upon layer of slick studio production effects. During the '90s, MTV put together a popular series of such concerts that exploited the trend, MTV Unplugged, and their sister channel did likewise with VH1 Storytellers.

The reasoning was that the bombast of the original hit required not just a toning down of the intensity of the performance, but a change in the instrumentation -- perhaps because electric guitars suggested electricity, high voltage, etc.? It seemed straightforward at the time, but it's not as though there weren't electronic songs that still created a minimalist atmosphere -- "Cars," "Pop Muzik," and so on.

Those minimal synthpop songs are danceable, though, and the whole point of a low-key performance is to keep your body still and get you to appreciate the music on a (relatively) more cerebral level. But then there were minimal and non-danceable electronic songs like "Song to the Siren" by the Cocteau Twins for This Mortal Coil. That one was a cover, too, showing that there was nothing incompatible between the atmosphere that the Unplugged trend was aiming for, and plugged-in instruments.

I stumbled upon a recent example of a cover of the Backstreet Boys' mega-hit "I Want It That Way". The original is about as catchy as pop music could have been in the doldrums of the late '90s, though it does sound overly-produced. Most people will probably remember this as a techno-pop song, but the main riff is from an acoustic guitar, and there's a piano as well. This shows that it's not incompatible for an over-the-top dance-pop hit to use mostly acoustic instruments.

The cover version is by Charli XCX, an electropop singer whose recent hits you've probably heard without knowing her name. First impression of what they're channeling -- "In the Air Tonight" by Phil Collins mixed with "True Colors" by Cyndi Lauper, as sung by Gwen Stefani.

The only instrumentation is a pair of synthesizers and an electronic drum. It sounds more "unplugged" than the original, despite substituting electronic for acoustic instruments, because they're sparse and atmospheric rather than heavily layered and in-your-face. That matters more for creating a low-key version than the choice of instruments; the synthetic timbre doesn't make us feel like we're listening to a bombastic chart-topper. The slowed tempo also helps to change the mood.

In fact, the only exaggerated thing in the cover is the vocal embellishment, admittedly making it somewhat harsh to listen to. I think if she studied Phil Collins' voice better on "In the Air Tonight," it would come off much better. Simpler, cleaner, gradually escalating and receding in intensity. But then she's a Millennial, so I don't know if she can speak in anything other than mumbling, vocal fry, and tantrum-growling.

This is hardly the greatest re-interpretation you've ever heard, but it was recorded for a cover-song project by the Onion AV Club, rather than as a track she put the most thought and effort into for one of her own albums. It is a pretty clever rendition of such a bubblegummy pop hit, though, and it's refreshing to hear an electronic approach to the minimalist cover song.

We live in another period of over-produced bombastic dance-y poppy mega-hits, and it would be nice to hear understated cover versions that still used electronic instruments, rather than the usual formula of acoustic = low-key.

Bonus: Au Revoir Simone performs an understated electronic cover of "Fade Into You" by Mazzy Star. The original was not a super-slick studio production effort, but it did have a heavier emotional intensity than the cute little cover version. What sound are they going for here? It sounds like Joy Division playing a children's tea party. Prim post-punk.

October 20, 2015

Kiddie roots kill McDonald's and Burger King in age of foodie-ism; Wendy's left standing

For the better part of a decade, the former fast food giants McDonald's and Burger King have been losing business in the US, and have been subject to one desperate makeover campaign after another. Of the three burger giants, only Wendy's has been rising or holding steady. Why the exception?

Each generation that makes up the bulk of the customer base will reshape the fast food sector to its liking. Baby Boomers were buying fast food mostly for convenience -- for themselves during lunch break, and to avoid having to cook dinner for their children. It was part of their greater emphasis on career striving: less time cooking = more time billing.

But ever since Gen X-ers and Millennials came to spend most of the fast food dollars, the prevailing values have come from foodie one-upsmanship. It's part of the lifestyle striving of the generations for whom the path of career striving would lead to an already saturated niche of competition. They compete for status in the lifestyle arena instead.

The problem for McDonald's and Burger King in this new foodie-oriented climate is quite simply that they are saddled with a kiddie image in the personal memories of the target audience, which prevents them from being taken seriously as foodie-friendly restaurants.

McDonald's has had a children's clown as their mascot since the 1960s, not to mention all the other cartoony characters that have been added to their brand image over the years. Then there are the distinctly kid-focused Happy Meals, on-site playgrounds, kids' birthday parties (at least back in the '80s), and so on and so forth.

Burger King began marketing to children a little later, with the BK Kids Club that began around 1990, playgrounds, kids meals, cardboard crowns for kids to wear, etc.

Wendy's is the only one of the original corporate chains not to have lowered themselves to pandering to children. They offer kids meals, but there's no kiddie branding burned into memory the way that every Gen X-er and Millennial can remember the shape of the Happy Meal box with the arches on the handle, or the look and assembly of the BK cardboard crown. Wendy's never offered playgrounds, cartoon character mascots, or primary-color decor.

Not that Wendy's was branded as a sophisticated adult restaurant, but it was always meant to look and feel like a place for basically mature people, whether or not they brought children along with them. It didn't feel hostile toward children the way that decadent foodie places catering to childless strivers do; it's just that children weren't the focus of attention.

If you think back to childhood memories of the big three, you probably don't remember too much that was exciting about Wendy's, while McDonald's and Burger King light up all kinds of kiddie connotations.

Newly founded foodie places like Chipotle and Starbucks began with a blank slate for their branding, and were shaped by lifestyle striver values from the get-go. Wendy's may not have been a blank slate, but it was pretty close. It never really did conduct intense branding campaigns. All they had to overcome was the image of the avuncular founder Dave Thomas. But some soft-spoken guy from the Midwest already looked like his generation was leaving the stage, and his daughter would take over the family business. Enter the redhead chick playing Wendy in commercials, all grown up now to convince X-ers and Millennials that Wendy's has entered its hipper and younger foodie-friendly stage of life.

Somehow, going from an older and more sober image to a younger and more irreverent image is easier than transforming a kiddie image into a grown-up one. Perhaps it's the boundary between childhood and adolescence that makes the attempted re-launch of McDonald's and Burger King seem like such an unconvincing quantum leap.

We keep hearing all kinds of managerial hocus-pocus about being "nimble" and "agile" in the marketplace. It means shape-shifting from one clearly delineated image to another, as soon as the ground shifts just an inch.

Back on Planet Earth, that strategy takes the form of one desperate and unconvincing makeover after another -- McDonald's as purveyors of super-sized fries, then as multiculti global awareness ambassadors, then where the homos and homo-enablers sip their McCafe, and who knows what next. With Burger King's changing image -- from the BK Kids Club, to the CollegeHumor.com mascot of The King, to the caterers of the schlub army who are going to order junk food LIKE A MAN, to the neo-Midcentury Modern decor -- their agility has flushed the company right down the toilet.

Wendy's had a stable, nondescript brand for its first four decades, and subtly and seamlessly slipped into a fast casual image. It is now the only success story of the original burger chains in the age of the fickle foodie crowd, and it hardly did poor business during the earlier years when its competitors were nimbly shifting from one image into another. Wendy's was guided more by an attitude of stewardship and is winning the long game, while its spastic and agile rivals cashed in on the child-pandering craze of the '80s and '90s but have been doomed to struggling just to do the same level of American business as last year. Their only cushion is growth in the third world, where McDonald's carries prestige over tainted street food.

And should the market shift away from foodie-ism and back toward unpretentiousness, Wendy's could easily reverse the changes they've made, since they are not drastic or pervasive. Its menu is still limited and ordinary -- and timeless -- with trendy foodie-oriented items only coming through on a rotating basis, unlike the steady bloating of the menu at McDonald's with premium and niche items.

McDonald's and Burger King are allowing short-term opportunists to run the company into the ground with their protean agility, while steward-guided Wendy's will still be thriving for decades to come.

October 16, 2015

High tax era required tightening immigration to provide civic cohesion

The only part of Donald Trump's platform so far that is not populist is his tax plan (read it here), but we shouldn't let that get in the way of voting for the far-and-away most populist-friendly candidate to come along in a very long time, especially considering how dominant his lead has been.

Since so much else of his platform is meant to return the nation to the norms of the Great Compression, and the pinnacle of the 1950s in particular, it's worth looking into whether it is actually the anomaly it appears to be. It turns out that the trend toward high income taxes of the Midcentury lagged behind the trend toward greater civic cohesion at the grassroots level. Only once that foundation was laid did people feel comfortable sacrificing so much of their income for the common good.

First and foremost was the trend toward a more ethnically homogeneous population as a result of plummeting rates of new immigrants, as well as the percent of the whole American population that was foreign-born. Both of these rates peaked around 1910 (the green line in the first graph shows the percent foreign-born):



Nobody in any race or ethnic group would want a huge chunk of their income going to taxes that showered goodies on some alien, perhaps hostile, ethnic group. So first the American population needed to feel that they were a single, homogeneous group. That was impossible with so many foreigners streaming in during the Gilded Age. But once that reversed after 1910, it made more and more sense to think of ourselves as all Americans.

The other major trend was the labor union movement, which began in the late 19th and early 20th century, also well before the introduction of high income tax rates, or indeed before the income tax at all. Reducing immigration made people feel more ethnically similar, but some kind of labor movement was necessary to build up civic spirit and participation specifically in the economy.

Here's the history of the highest and lowest income tax rates since they were allowed in 1913 (see a table in this section):


The first income tax was barely there -- 1% on the lowest bracket and 7% on the highest. It shot up for the highest bracket only to fund US involvement in WWI, after which it was reduced to 1.5% for the lowest and 25% for the highest brackets during the 1920s -- still higher than in the 1910s, though. Thus the trend toward rising income tax rates began after about a 10-year delay from the falling immigration trend.

It would only go up from there, and remain high during the Midcentury, before tumbling off a cliff during the current long period of status-striving and widening inequality, where immigration has ramped up in true Gilded Age revival fashion. Notice that the percent of the whole population that is foreign-born hit a minimum in 1970, and began steadily rising by 1980 and after. The labor movement had already begun to take a beating by the '70s, and their influence has only weakened since then.

So we see in our current era a mirror-image of the earlier change. In the early 20th C., the first shift affected labor unions and native-born citizens, who rose in influence, and the second delayed shift raised income taxes. During the later part of the '60s and '70s, the first change was again with unions and native-born citizens -- only now they began declining in influence, and the second change lagged behind by about 10 years, where tax rates were slashed across the spectrum during the '80s (especially for the top brackets).

It would be facile to point only to the labor movement, as though it alone constrained the greed of the wealthiest, and as though the only major change during the Great Compression was taxing the hell out of rich people in crude class warfare style. In fact, even the lowest tax bracket in the '50s was paying 20 percent! There was a widespread feeling of "we're all in this together" and "we all have to make sacrifices" for the greater good of the nation.

That sentiment was only possible when the population was highly homogeneous, akin to the high taxes and generous welfare programs of the Scandinavian nations -- who, in case your social studies teacher didn't tell you, are among the most homogeneous in the world. It's not that they're highly white (which they are), but that they're highly regional-white rather than a mix of all European peoples.

See also Robert Putnam's trailblazing study showing that ethnic diversity erodes trust within communities -- not only among people of different ethnic groups, but even among those of the same group. (For example, in Los Angeles, white people don't even trust one another or form civic groups with one another.)

So, if Trump's tax plan looks typical of the current era, in contrast to his populist policies everywhere else, it's not so mysterious and is nothing to worry about. First we have to make the nation more homogeneous, and have more working people involved in labor unions, and then we can talk about raising taxes back to Midcentury levels, when we can rest assured that they'll be put toward the common good.

The focus of Bernie Sanders & Co. is putting the cart before the horse, wanting to raise taxes and trying to ameliorate inequality without first solving the problem of immigration and homogeneity.

October 15, 2015

'80s flashback: Unraked lawns setting the mood for autumn (pictures)

I was going over this earlier post to see what more I could discuss about the changes in Halloween during the shift toward cocooning and helicopter parenting.

Digging through old Halloween pictures on Google Images, I was struck by how common it was to see outdoor scenes where the ground is blanketed by leaves. As in, nobody has taken a rake, broom, or leaf-blower outside in weeks or perhaps months. There were leaves all over the yard, the driveway, the sidewalk, and even the street.

Really, when was the last time you saw kids wading knee-deep through a big ol' leaf pile? Or diving out of a tree and into a mound of leaves? There used to be piles of leaves so high on the lawn that, at least for small children, your friend could hide buried underneath and try to scare you by popping up from them all of a sudden. Back in the '80s, kids didn't need the ball pit at McDonald's, they had huge piles of leaves right in their own yard.

As the cocooning mood has taken over, people have become more OCD. (See this earlier post for a discussion of the interrelated web of psychological dysfunction that stems from a cocooning zeitgeist.) When you're socially isolated, you can't rely on others to help you solve problems or cope with them if they aren't solvable. This leads to a focus on individual rituals that the isolated person feels will help them get through their problems and deal with anxiety.

One clear sign of our age's OCD is how immaculate the landscaping looks compared to 30 years ago. It looks like what we think of as the 1950s suburb. In between, everyone remembers the '70s as looking gritty, but that lasted through the '80s as well. Outside of urban areas, it wasn't even that gritty -- more like natural and a little unkempt, rather than compulsively manicured. (Related: Kant on English gardens vs. French gardens, or Steve Sailer on golf course design of the outgoing 1920s vs. the cocooning 1950s.)

Here's what the setting for Halloween looks like in recent years. Pretty leaf-free environment that these poor deprived children are growing up in.





(One thing you'll appreciate about the '80s Halloween pictures below is the lack of tweens dressed up as repulsive no-talent skags like Kesha.)

If you poke around Google Images for "trick or treat 201X," you can find occasional pictures with leaves, but they're usually staged for a professional photographer's shoot, or they're at a university campus. Most of the typical pictures of trick-or-treaters show lawns, sidewalks, and streets that still look like summer. It prevents us from getting in touch with the rhythm of the seasons.

Not so long ago, falling leaves were not blown out of sight five seconds after they hit the ground, so by the end of October there would be a romantic natural blanket of leaves everywhere outside. And they covered all areas of the ground -- lawn, driveway, sidewalk, and street -- just like a uniform blanket of fallen snow in winter. Streetscapes don't look nearly as romantically seasonal after the snow plows have driven through, and folks have shoveled their sidewalks and driveways. At least they leave the snow on their lawns undisturbed, unlike leaves.

There aren't tons of old Halloween pictures shot outdoors, but a good fraction of them show copious leaves all around. It creates a distinctly autumnal atmosphere that we don't allow ourselves to enjoy anymore. Aside from violating the OCD aesthetics, allowing leaves to pile up in high-traffic areas isn't very practical. But -- practical, schmactical. We need our fall-feeling just as much as we like waking up to a white Christmas.

These Halloween pictures from the '80s were taken throughout the decade, and don't show much change during the period. I could probably find similar pictures from the '70s and early '90s, but where's the nostalgia value in that?














October 14, 2015

College as part of lifestyle competition, not for later career / wealth / consumption

To rein in the problem of the higher education bubble and student loan debt, we first need to understand why so many adolescents are going to college in the first place.

The widespread but incorrect view is that it is for careerist reasons such as learning useful knowledge, acquiring useful skills, getting to know the ins and outs of some sector of the economy, or networking with potential employers. Or, if the observer is more cynical, a college diploma is for signaling to potential employers that you are smart and conscientious enough to be worth employing, regardless of what you may or may not have studied.

This is why the Baby Boomers first poured out of their small home towns and into college, but that was way back in the '70s. Unfortunately, given their control over the media, the Boomer view persists to this day.

In their day, the only point of going to college was to get a diploma, and since the generation before them wasn't very credentialed, a bachelor's degree gave them a substantial leg up when they were searching for their first real jobs. They had to convince employers that a piece of paper from a college was important, but they won that propaganda war, perhaps owing also to a shift in the mindset of the employer class.

In any case, they didn't care too much about "campus life" beyond the basics of there being a willing student body to get drunk and make out, and an utterly no-frills house or apartment to host the party. They could have gotten high back in their home towns -- and probably did already during high school -- so the point of going off to college was primarily to be able to secure a decent middle-class job and decent pay as a young adult.

As more and more high school grads decided to go to college, a larger and larger fraction of 20-somethings had a bachelor's, diluting its relative value in the job market.

However, we shouldn't let that fact distract us from the changing purpose of college as the higher education bubble has inflated to such extremes. For if students were truly concerned with the career-and-income value of a degree, once they wised up to its diluted value with so many of them in circulation, they would take measures to try to get the most possible out of it.

Whereas before students might have majored in arts and humanities, they would now only major in engineering, accounting, and other employable majors. They would also bust their ass more in their coursework, to make sure their skill-set was maxed out come job-hunting time after graduation. And they would ruthlessly scrutinize the colleges they were thinking of applying to -- with a keen eye to which ones added the most value to the incomes of their graduates.

Instead we observe the exact opposite. "Value added to income" is not just hovering somewhere out in their peripheral vision -- they pay no attention to it at all when narrowing down a list of colleges to apply to. They pride themselves on doing as little and as poor-quality work as possible in their classes: "All right, BS-ed another essay an hour before it was due, and still got a B! Unstoppable!" Unstoppable grade inflation, moron -- that's why you got good grades. They make a point of shying away from employable majors, with a steady proliferation of junk majors fed by a ballooning demand for pointless studies -- communications, business, gender studies, African-American studies, etc. Majors that are established and respectable, yet still unemployable, continue to be popular -- philosophy, psychology, history, etc.

If it seems like students aren't going to college to prepare for a better career and higher income and consumption, it's because they're not. Instead, they are preparing for the competition in the arena of lifestyle striving rather than wealth / career striving.

In this earlier post, I discussed these two separate avenues of competitiveness in the context of generational differences. Silents and Boomers -- the Me Generation -- are career strivers, with Boomers pouring into colleges to get a leg up on the Silents, who were career strivers but did not go to college in large numbers. Competitive people will never leave the battle arena, so the careerist avenue has been closed off to new competitors. Gen X and Millennials chose to compete in an arena that was less saturated with contestants, and found it in lifestyle-based competition.

A big part of lifestyle competition is knowledge, but not necessarily of the scholarly or intellectual kind. It's whatever you need to know about, have an opinion on, and be prepared to discuss and passive-aggressively debate with the others in the contest. "Do vaccines cause autism?" is of minor importance on a scholarly or scientific level. But it just happened to become one of those topics that lifestyle strivers are expected to have very informed and strong opinions on, whether pro or con.

The other part of lifestyle competition is an emphasis on leisure -- not so much with having loads of free time, but what you do with it. Do your leisure activities make you a superior person, or reveal you to be a sub-human loser? There is only so much leisure time in the day, so these contests will revolve around the most basic and frequent non-work activities -- food and drink, lounging around the home environment, sports or athletics, etc.

Combine the contest over knowledge with the contest over leisure activities, and you get decadence. Lifestyle strivers become obsessed with increasingly arcane points about seemingly mundane leisure activities, and have to flit from one fad to the next so as not to appear to be taking a break, but to remain vigorously invested in the competition. Who is cooking the most original and titillating variation on the mac-and-cheese dinner? Who has the latest style jogging shorts? Who has the most on-point living room decor? Whose playlist contains bands that no one else has ever heard of? Ad nauseam.

If all that is the long road ahead of adolescents, then they had better get a solid training in young adulthood. In fact their parents already model the adult lifestyle striver behavior while the children are still school-aged -- bringing home bacon-and-avocado mac-and-cheese for dinner from Whole Foods, so their kids will know what to order when they're on their own. The parents drag the kids along to IKEA so that they'll learn what kinds of trendy furniture to pick out once they're living away at a college dorm room, or their first apartment.

But the parents can only accomplish so much by modeling the behavior. The kids actually have to leave home and begin lifestyle striving in earnest on their own. Hence the current form that the college experience takes.

I've already detailed how the lifestyle-striving orientation guides their choices of college, major, and other aspects that relate most directly to employment and income prospects. Let's take a look at some other revealing ways that college life is more about preparing kids for lifestyle striving rather than career striving.

- Students would rather not work. If the purpose were to take their first baby steps toward a grown-up career, they would all want to work. If they do work today, it's only to provide a little spending money for their lifestyle pursuits, not to learn the ins and outs, nor to establish trust with an employer and get a good recommendation for future employers.

- Colleges spend big bucks, but not on anything that will help their students earn more money or be more employable. The overwhelming trend during the higher ed bubble has been toward providing more leisure and lifestyle services, both mundane amenities (a cafe in the library) and spectacles (a pro-level sports stadium). Libraries are hang-out spots where no books are read, instead of places for browsing the stacks and reading books that would help you earn more after graduation.

- Dining halls must cater to the nascent foodie snobs, offering charcuterie rather than meat loaf, located in separate "stations" rather than in a single assembly-line, with lighting and decoration appropriate for a sit-down restaurant rather than a public school cafeteria.

- Exercise equipment that a person would ordinarily need a gym membership to have access to.

- Always having something to do for boosting your arts-and-culture quotient. Even mid-tier colleges spend big bucks to acquire more fine art to display in professionally designed galleries. The film club screens the classics every weekend. And there are regular performances from music and dance groups, from both students and professionals.

- What the particular college tells other rival strivers about your lifestyle. Two colleges are equally good at academics and all that other unimportant stuff, but you choose the one that signals you're an urban boho-chic type of striver rather than the sports-buff type of striver. A college's brand and brand value revolve around these qualitative lifestyle matters.

And so on and so forth.

None of these sweeping, ubiquitous changes to college life make any sense under the view that college is for getting a credential, leading to a good job, leading to good income, leading to higher consumption levels. They make perfect sense under the view that young people today expect to have no shot at the career competition and are opting instead for lifestyle competition, and that the college years are training them for that kind of striving.

It's missing the point somewhat to portray college life as merely a four-year playground experience, as though their decadence will be useless in the real world afterward. A lot of effort still goes into mastering the ins and outs of lifestyle striving -- what topics to be knowledgeable about, what pastimes to pursue, which foods are cool, which interior design schemes are cool, etc.

This isn't just fitting into youth culture or the broader culture -- there's a sense that they're going to be tested on this stuff for the rest of their lives, and they have to be able to keep up with the lifestyle contests no matter how the wheel of fashion spins. So it really is a kind of training or apprenticeship that segues seamlessly into adult status competition (rather than being a pointless vacation), only it's for lifestyle striving rather than career striving.

To wrap things up, how does this correct view let us see what's going on with the loud demands among Millennials to have their student loan debt canceled or to receive a tuition-free college life from the government?

Well, it won't do us any good to lecture them about how they can pay off their debt once they use their degree to get a decent job. They know their degree is worthless -- they went to college for lifestyle striving, not to earn more money.

What they're really asking for is state-subsidized training and apprenticeship in the domain that they'll be competing for status in as adults -- lifestyle contests. In their minds, it's akin to state-subsidized high school classes in math, science, and technology for those who are planning to strive in the career domain. Fairness would seem to argue for subsidized training for the lifestyle strivers too.

Of course, one of those domains is productive for society, and the other only enriches the individual's reputation. But the productive niche is already beyond saturated with incumbents and foreigners to whom the work could be outsourced. We can't expect most young adults to focus on career-building when there are hardly any decent careers waiting to be filled. It's only natural that they will mostly turn to lifestyle striving as their form of "bettering themselves," while accepting a crummy job and crummy living circumstances.

Thus, the decadent and fruitless competition in the lifestyle domain among Gen X-ers and especially Millennials is ultimately the fault of all the competitiveness in the career domain, where Silents and Boomers still run the show and get most of the wealth and status. As if hyper-competitiveness in the career world weren't bad enough in itself (white collar crime taking off like a rocket, selling out the country to make an extra buck, and the like), their tenacious incumbency has created a ripple effect whereby the later generations are not bothering to enter that saturated niche and are focusing their energy and effort on decadence contests instead of something productive.

Reining in the competitiveness in the career world would not only clean things up in the productive part of the economy, it would also free up more decent jobs for younger adults, blunting the appeal of lifestyle striving. Lower demand for lifestyle striving would deflate the higher ed bubble and restore sanity to tuition costs, as well as restore the mission of colleges to being productive somehow (economically, intellectually, or whatever, but somehow).

It's beyond the scope of this post to talk about how to start reining in the anarchic war of all against all in the career world. The important lesson for now, though, is that many of the things that are going wrong in the world are interconnected, often with one causing another, so that reforming one area will set off a chain reaction and reform some other area as well.

October 11, 2015

Are hyper-liberals secretly ultra-conservative?

In the comments to the post about the moral orientation of the UCC school shooter, the topic of SJW morality came up. A commenter disagrees with the view that liberals and libertarians operate on a moral basis of "avoid harm / provide care" and "fairness / justice," and are numb or color-blind to the other moral bases of loyalty, authority, and purity (in Haidt's terms).

He tries to re-interpret SJW morality as not so distant from conservative morality:

"Purity? See how SJWs try to go out of their way to stay away from bad influences/shut down hints of dissent with the reasoning being that they feel gross/icky/violated/disgusted by having them there. Authority? Notice how well disciplined they are when doing offensives against organizations for not being SJW enough."

SJWs are hyper-liberals, not conservatives in disguise. There's a persistent trend among libertarians and alt-right people to try to re-interpret liberals so that they are the REAL racists, the REAL sexists, the REAL puritans, etc. They aren't -- they're the hyper-liberals that they straightforwardly appear to be.

Trying to re-paint a bunch of practicing degenerates, or vocal enablers of degeneracy, as a bunch of puritans is not just stupid and confusing, but a massive derailing of where conservatives ought to be pushing, theoretically and practically. The only point of such a derail is for the re-interpreter to max their stats for COUNTER-INTUITIVENESS.

The contrarian view talks about SJW intolerance of dissenting views as though SJWs found those views disgusting -- yet they do not make disgusted facial expressions when they hear dissent. As hyper-liberals, they are not very capable of feeling disgust, let alone expressing it. Instead, they feel anger, and the faces they make are variations on the angry face.

We don't have to take them at their word when they denounce something as "disgusting" -- the actual emotion being expressed is anger, and the face they make is an angry one.

Notice the difference between indignation -- "How dare somebody have a different view from mine? How disgusting!" -- and feeling contaminated -- "You can't go anywhere in public these days without having to smell some BO-radiating foreigner, or risk catching AIDS when some queer is coughing his lungs out in the checkout line".

Indignation conveys no sense of looming threat of contagion, no vulnerability to danger. Indeed it's based on a sense of invulnerability, of superiority. Calling attention to pollution or desecration, however, relies on the listener feeling vulnerable to such a threat, and is more like sounding a warning to prevent the contagion.

The same goes for "authority" among hyper-liberals. They are obviously not authoritarians, or they would submit themselves to whatever they considered a higher authority. Obedience implies a chain of command, and each level doing their part for the greater well-oiled functioning of the superorganic whole -- both giving orders to those below and carrying out those from above.

SJWs and other hyper-liberals do not form hierarchies with differing levels of authority. Rather, their group consists of an undifferentiated mass of followers of a set of principles or code of conduct, whose motivational power does not derive from the authority of a leader, an elite cadre of leaders, or a nested hierarchy of leaders.

They don't even reach the minimal form of authority where there's a single guru and a group of followers who are all equal in authority among themselves, yet who all do the guru's bidding.

When the SJW harangues other people to behave a certain way, it isn't because they believe they have a position of authority over the others. It's based on the assumption that everyone -- the SJW, the SJW's fellow travelers, plus the others being harangued -- has to adhere to a universal set of principles and codes of conduct. Something abstract, not "what our leader / hierarchy of leaders tells us to do".

They have completely leveled the hierarchy, so that everyone has equal authority -- or in other words, no authority. That's why they don't speak and act as though they had authority that others have to obey, in a leader / follower fashion. Rather, they nag, harass, and shame other people into adhering to the common code. Nagging, harassing, and shaming are things you do to your equals in the authority pyramid -- you don't get to nag your superior, and a superior doesn't nag his inferiors but gives them orders.

It's like one great big clique of seventh-grade girls, which has no hierarchy and in which constant nagging and shaming prevail in a climate of anarchy.

The problem for SJWs and other hyper-liberals is that that mode of following and enforcing a group's norms only works in small-sized groups. Trying to nag and shame random strangers on the internet will only be met with, "Yawn, go kill yourself, fag".

Related post: Liberal vs. conservative forms of purity as a moral intuition.

TL;DR -- liberal "purity" centers on the welfare of the self, conservative "purity" on that of the entire in-group. Liberals are OCD about personal hygiene and nutrition, but not about degeneracy that spreads disease, desecration of what is sacred, or maintaining the purity of the population (e.g., viewing foreigners as filthy, likely to spread disease, and so on).

Because morality is about the regulation of behavior for the benefit of people other than the self, personalized liberal "purity" is not really a moral foundation at all, while conservative "purity" concerned with others is.

Why can't they make good horror-comedies anymore?

Hollywood is going to take another stab at the Christmas-themed horror movie -- a genre on hiatus since the 1980s -- with Krampus. Unusually, it is also a comedy, an attempt to mix contrasting tones. See the trailer here, and a long list of horror-comedy movies here for comparison.

In Krampus, they're going for a mash-up of Christmas Vacation and The Evil Dead, but the tone that comes off in the trailer is clashing and discordant rather than balanced or blended. Perhaps it's better executed in the full movie, though tone usually comes across fairly reliably even in a trailer.

At least it's not the standard approach to horror-comedy of the past 20 or so years, where the horror is meant to be taken somewhat seriously, and the comedy comes from self-aware positive responses to the horror -- "Isn't it hilarious how gory the killing is, and how over-the-top the plot premise is?!" Meta-commentary winking at the viewer, who is in on these in-jokes, is not very funny to begin with, let alone when the audience is beaten over the head with them throughout the whole movie. When they're the sole source of humor, the attempt at comedy fails.

Zombie Strippers is a perfect example of this failed approach to horror-comedy, though I only single that one out since I caught it on TV last Halloween season. There are dozens of others like it that I've caught bits and pieces of on late-night TV since the '90s.

Attempts from the '70s and '80s were not quite so bad, when the self-aware humor was open and campy rather than unstated and smug. The Rocky Horror Picture Show and Little Shop of Horrors are way too campy to feel like real horror movies -- they're more like comedy movies set within a horror-inspired narrative.

I'll admit that there may be a genuine exception in Re-Animator from 1985, though. The characters are played as genuine eccentrics, not campy caricatures hamming it up. The overall atmosphere is likewise not deliberately exaggerated, but feels genuinely surreal and absurd. It doesn't feel like the whole movie is one great big in-joke and winking at the audience. In this way it's like Twin Peaks, another classic that's sui generis in terms of tone, blending and alternating all manner of dark and light emotions.

The sparseness of examples of the surreal approach to horror-comedy stems from the difficulty of obscuring the deliberate nature of cultural creation when portraying such an absurd world. Something so absurd makes the audience suspect that the creator is just yanking their chain, and they can't slip into suspension of disbelief. Most writers, directors, and actors just don't have the level of poker-face discipline to present such an absurd world sincerely and play it straight.

More typical is the approach that Krampus follows, where horrific and comedic tones alternate and contrast with each other. Horror movies are about the supernatural or paranormal destabilizing the usual order of things, so much so that it disturbs or even frightens the audience. Comedy could be worked into this framework if it took the form of a sense of humor that gets the victims through such a dangerously disordered world -- a case of gallows humor. That tends to skirt too close to the self-aware approach, though, since the characters are voicing what the audience is already thinking: "Hey fellow character, isn't it sickly hilarious how screwed-up our situation is?!"

This style, where humor alternates with horror as comic relief to terror, hasn't been tried in a long while, and the earlier attempts were not very successful as either horror or comedy movies -- Fright Night, The Lost Boys, The Witches of Eastwick, Arachnophobia, Buffy the Vampire Slayer, etc., all from the later '80s and early '90s. That doesn't bode well for the revival of the style in 2015.

The only sure-fire way to incorporate comedy into horror is to blend the two tones rather than alternate them. Somehow the evil beings themselves have to be funny while wreaking havoc, instead of comedy contrasting with horror. The natural choice, then, is to make the evil beings examples of the trickster archetype, preternaturally mischievous beings whose violence takes the form of pranks. Who can suppress their laughter when someone pulls off a great prank, no matter how much their victim may be hurt by it?

Since the trickster is the source of both the horror and the comedy, the tones blend better and cohere better in the audience's mind. And they don't feel so bad laughing at violence if it is not cold-blooded, calculated, and purposeful. The trickster is not a serial killer -- he's an anarchic life-of-the-party type of guy.

Laughing at purposeful violence feels like taking sides in a dispute, and agreeing or identifying with the monster. Laughing at off-the-cuff and indiscriminate violence, however, pardons you from choosing sides. The source of danger is more like a natural disaster roaming around unpredictably, rather than a purposeful actor, and those who get in the way are more victims of bad luck than targets of malevolence. Laughing at the unfortunate victims of a trickster's pranks is therefore a type of schadenfreude. No trouble making that blend of horror and comedy work, in principle.

At the same time, times change in how willing the public is to encourage the trickster to let loose and give us viewers something both funny and a little terrifying to behold. It's hard to think of a purer example of a "bad peer influence" that parents would not want their children to hang out with, even if only in pop culture form. And horror movies are directed primarily at those who still scare easily, namely children and adolescents.

Ever since helicopter parenting took off during the 1990s, this trickster approach to horror-comedy has bitten the dust. But it was very popular during the nadir of parental supervision, back in the '80s.

The most financially successful example is Gremlins, which was the fourth-highest grossing movie of 1984, and like Krampus was set during the Christmas season. Just think about how unlikely the odds were for its success -- a horror movie set during Christmas, and blended with comedy throughout. That's a fine line to walk in writing the script and acting out the characters, as well as designing the monsters and bringing them to life.

Casting the monsters as tricksters made it easy to incorporate humor into their very look and feel -- they can look a little cartoony, and it doesn't detract from their menace, since they aren't portrayed as a serious and sublime evil. In a movie like Krampus, where the monster is designed to look frightening in itself, every appearance of the monster only adds to the problem of comedic and horrific tones interrupting each other, and of the audience's brain shutting off from too many emotional switches back and forth.

Gremlins spawned a host of imitations -- Critters, Ghoulies, Killer Klowns from Outer Space, Leprechaun, etc. These don't work as well as the original since the monsters are not as adorable and commit much more gruesome violence. But they work well enough to watch if there's nothing else on late-night TV.

But don't expect a successful revival of the horror-comedy genre until helicopter parenting goes into retreat, and parents won't mind their children laughing at the violent and terrifying pranks of supernatural or paranormal tricksters.

October 2, 2015

School shooter was libertarian, not conservative

Apparently the UCC spree shooter left all sorts of profiles around the internet, including a dating website where he described himself as "conservative" and "Republican". Naturally the libs are having a field day, crowing about how the evil man du jour was from The Other Side.

Back on planet Earth, this guy was a libertarian, not a conservative. Gun nut (not a hunter, but gun fetishist), fetish porno addict, into punk / goth, horror movie buff, dislikes "organized religion" AKA religion, into magic / occult / conspiracy theories / other gay shit, definitely does not want children (how uber-traditional), introverted loner, geek, and most importantly from the West Coast.

There are few to no conservatives out West, which is instead populated by libertarians and liberals. Especially in L.A., where he was from before recently moving to Oregon.

Degenerate fetishist, kneejerk "fuck society" attitude regarding authority, looking out only for Number One instead of some greater group -- this loser went against all of the distinctly conservative moral foundations (Haidt labels them purity, authority, and loyalty).

However, "libertarian" usually doesn't appear on dating profile options (and few would understand what you meant anyway), and since he's not liberal, he chose the "conservative" option by default.

But why let his obvious libertarianism get in the way when you can whack off to fantasies about "conservative tears"?