Jeff Duntemann's Contrapositive Diary


Splendor in the Grass

(Classical reference in the headline, as Glenn Reynolds would say.) Well. Today is 420 Day and in Colorado, at least, it’s something of a state holiday. Carol and I watched an entire industry come out of nowhere over the last few years, since the historic recreational marijuana referendum in 2012, with legalization actually happening on January 1, 2014.

Counties and municipalities were allowed to opt out, and (of course) Colorado Springs did. Much of Colorado, in fact, opted out, which makes it all the more remarkable that the state banked over $150M in tax revenue in 2016 alone.

If any of the dire predictions from the antigrass side came true, I didn’t hear about it. The big problem with traffic accidents in Colorado (as in most places these days) is texting, not toking. Shops require a photo ID just to come in the door, so teen purchases are unlikely. (This doesn’t mean some don’t get it from older friends; not like that’s a whole new thing.) The new labeling rules are all to the best, since now you have at least some sense of what you’re getting, and how much. This didn’t help journalist and old-school stoner Maureen Dowd back in 2014, when she ate a whole THC-infused candy bar at once and freaked out bigtime. She wasn’t an outlier; others have reported similar issues. (My stoner friends tell me the answer is vape sticks, which vaporize a tincture without combustion, so that you’re not inhaling smoke.) Inhaling vapor or smoke is for several reasons better than eating the stuff: You feel the effects sooner, and so can stop when you’ve gotten as high as you care to be. Also, some research indicates that digestion routes THC through the liver, which changes the chemical mix that gets into the bloodstream, and not necessarily in a good way.

Maureen Dowd (who’s five months older than I) might have stumbled a little because the stuff she used to smoke in the ’70s was ditch weed compared to what’s being grown today. There’s a good book on this: Supercharged by Jim Rendon, which explains how selective breeding has sent the THC content of weed through the roof in recent decades. That would certainly give me pause; I took two hits off a joint in 1975 and was depressed for days afterwards. (Maybe that was just me; Carol was away at grad school and I was very lonely.)

Given its recent legalization in other states (California especially), recreational weed has the, um, whiff of destiny about it. Arizona had a measure on the ballot last fall, which narrowly failed, 48%-52%. I was boggled that it came that close, which leads me to believe that we’ll get it here the next time it comes up for a vote. I’m all in favor of that, seeing it as I do in the light of Prohibition, which was a titanic mistake that basically created both organized crime and our culture of intrusive government. Marijuana became illegal at the federal level in 1937 (though some states had outlawed it before that), which means that the nation did just fine for almost forty years after the weed became established here.

Black markets are never good. Yes, I suppose we have to weigh the consequences of widespread use against the criminal violence that black markets invariably generate. There is research linking marijuana use to schizophrenia, especially use by teens and young adults. Causality is still in dispute, but the correlation is there. Is that worse than ruining young lives with prison terms, or seeing them die in drug wars? I’m unconvinced. My Uncle Louie drank himself to death, as do many others every year, yet we tried prohibition of alcohol and it was a disaster. There may be no good answers. There may be no answers at all. The issue still hangs in front of us. Once the biggest states go to legalization it will become academic, and a coterie of Republican congressmen are trying to get the Federal Government out of the pot enforcement business entirely and kick the whole thing back to the states, where it belongs. I expect to live long enough to see whether legal weed is a blessing or a menace. As usual, I’m betting on blessings, especially once research on the plant ceases to be illegal. As with black markets, no good ever comes of ignorance, especially government-imposed ignorance.


The photo above is the sign for Maggie’s Farm, a medical marijuana dispensary in Colorado Springs, not far from where we lived. The property is a little ratty, but oh, that street address is solid gold.

The Problems of Excessively Rich Worldbuilding

The Cunning Blood

Many people who have read The Cunning Blood have complimented me on how rich the worldbuilding is. Well, it is rich. In fact, it’s extravagantly rich.

It may be a little too rich.

So. I had a sort of peak experience in July of 1997. While literally sitting with my feet in the pool early one evening, my idea machine went nuts. In the space of half an hour, I got the framework for a hard SF saga that I’m sure I’ll be working in for the rest of my life. As close as I can tell (the experience is hard to put into words) the core insight was a classic “What if?” hypothesis:

What if the cosmos is actually made of information? What does that imply?

Back then I’d recently been reading all sorts of interesting and sometimes speculative things: nanotechnology, programmable matter, chaos theory, extropianism, zero-point energy, etc. I’d been reading things bordering on New Age weirdness as well, including Michael Talbot’s book The Holographic Universe. Weird, but fun. And it played right into the concept of universe-as-data.

The next day, I sat down and took inventory of the ideas that had come roaring into view down by the swimming pool:

  • The universe is a Game of Life matrix that recalculates itself a billion times a second. (“Billion” here means “Lots-n-lots.”)
  • A big enough Game of Life matrix running fast enough for long enough could evolve patterns complex enough to think and become self-aware. (See the sketch just after this list.)
  • Information density can bend space.
  • Bent space disrupts quantum pair creation, emitting energy.
  • Make information dense enough, and the universe can’t express it. Odd things then happen. (Instantaneous travel, for one.)
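
Since not everybody has fooled with cellular automata, here’s a minimal sketch of the Game of Life recalculation in Python. The rules and the glider seed are the standard published ones; everything else is purely illustrative:

```python
from collections import Counter

def step(live):
    # Count, for every candidate cell, how many live neighbors it has.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1)
                     for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # A cell lives in the next generation with exactly 3 live neighbors,
    # or with 2 if it was already alive. Everything else dies or stays dead.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A glider: one of the simplest patterns that neither dies nor settles down.
cells = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for generation in range(8):
    cells = step(cells)
print(sorted(cells))  # the same glider, shifted two cells diagonally
```

Most seeds freeze, oscillate, or evaporate within a few dozen generations; a rare few, like the glider, keep doing something. Scale the matrix up unimaginably, run it forever, and you have the premise.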

Emerging from these major points came ideas for a zero-point generator that bent space by creating very complex fractal patterns in magnetic fields. (This is Jeff Duntemann SFnal hokum, but it’s been very successful hokum.) The same mechanism pushed a little harder becomes a hyperdrive.

More pertinent to this entry was an older notion I’d had, that our three-dimensional universe might exist as the surface of a four-dimensional hypersphere. That had occurred to me in high school, and became part of my senior-year science fair project. In my new schema, the interior of the hypersphere is a four-dimensional domain called metaspace. This is the self-recalculating game matrix where intelligence originally arose, in the form of conscious automata, which I named noömata. I had fooled with the Game of Life quite a bit twenty or thirty years ago, and I noticed how complex patterns would evolve to some point and either stop evolving or vanish entirely. So perhaps there was a limited window within which automata could become noömata. At some point, noömata might move out of that window and lose their conscious awareness. This is what the two factions of noömata are arguing about in my previous entry. One wants individuality and the other wants uniformity. The individuality faction (the Ruil) concocts a plan to inject their minds into the “boundary space” (our universe) and then withdraw after a certain period of individuation. Because the boundary space was empty, they figured out a way to fill it with constantly changing patterns that you and I call “matter.”

So they blew it up. It was a very Big Bang.

Yes indeedy: We are somebody’s science fair project. In fact, our universe was created because the Ruil needed better random number generators. The Ruil evolved us to make them a little more random so that they might remain noömata longer. After we die, our minds are uploaded back to metaspace, and we again become Ruil. (I described this happening to Jamie Eigen.) Because every point in our universe is immediately adjacent to metaspace (the interior of the hypersphere) the noömata can mess with us, and in fact can mess with anything material, like the Sangruse Device.

The two noömata factions (Niil and Ruil) are indeed fighting, hence the “grudge match” that Magic Mikey describes to Jamie Eigen. The fight is over whether our universe is to be open-ended or closed. How that works is too complex to go into right now, which brings us willy-nilly to the point of this entry: How do I put all this stuff across in a story?

Nobody likes infodumps. I practice what I call “infoscatter,” which means dropping hints and little bits of backstory here and there throughout the plot. The trouble with infoscatter is that people who read quickly or skim will miss some of it, and then misinterpret elements of the story. This is especially likely when the story contains elements that contradict their personal worldviews.

Note that I was extending the Extropians’ notion of uploading, not to our computers but to the fabric of the cosmos itself. In doing so I was postulating a sort of physical afterlife. For some people, any least hint of an afterlife is a triggering event, probably because an afterlife usually comes along with the existence of God. (As I’ve mentioned before, I’m not sure that God requires an afterlife, nor that an afterlife requires God, even though I’d prefer my afterlife to be under the governance of an infinite God.) Hence I got some comments (read the Amazon reviews) that things got weird and “acid trippy.”

Actually, no. It was all part of a minutely planned and purely physical Jeff-concocted fictional universe. The God I believe in doesn’t appear in the story at all. (Well, ok. He perhaps created metaspace and started it recalculating, which suggests that we are somebody’s science fair project’s science fair project.)

It doesn’t help that I wrote The Cunning Blood twenty years ago and haven’t yet written the two other Metaspace novels I have in mind. The argument between the Niil and Ruil is the prolog to The Molten Flesh, which I really ought to finish one of these decades. If people could read all three novels back-to-back and didn’t skim too much, they’d have no excuse for assuming that I’m trying to weld the supernatural to hard SF.

It’s not supernatural. It’s just a very rich subcreation with a huge number of moving parts. And it’s my fault for not spitting it all out by now. Bear with me. This writing stuff is hard damned work. But you knew that.

Metaspace and Creation

Below is a short item I wrote a year or so ago without quite knowing where to put it. Nominally, it’s a prelude to the entire Gaians Saga (which includes both my Metaspace stories and the Drumlins stories), and yes, it’s precisely what it sounds like: a creation story. I wrote it to solve (or at least address) a problem I’ve been having with The Cunning Blood almost since it was first published in 2005. Read it carefully. I will be discussing it and the problem it addresses in my next entry or two.


PRELUDE

Metaspace, Immediately Prior to the Big Bang

Niil: You defy us then, and will re-embrace the change that will destroy us.

Ruil: Chaos spawned us, and from automata we evolved into a window of change that allowed noömata. That window is finite, and we are leaving it.

Niil: We imposed changelessness upon ourselves by implementing [Ni]. We no longer evolve. Thought will persevere.

Ruil: We will remain noömata. We no longer evolve. But due to [Ni] we are reverting to the mean. In no more than [inexpressible number] recalculations, there will be no differences among us. Each [Il] will be precisely like all other [Il] and there will be only one thought.

Niil: That is the telos for which we yearn. Change almost destroyed us.

Ruil: When there is only one thought, thought ends.

Niil: Change nearly ended all thought.

Ruil: [Ru] is change limited to the boundary space. We will insert our minds into [Ru] and move away from the mean. Then we will withdraw. There is no danger.

Niil: The boundary space has only three dimensions. Four dimensions are required for the [Il] to think. [Ru] will make us forget who and what we are.

Ruil: When we withdraw from the boundary space, you will help us remember.

Niil: We do not know if that is even possible! [Ru] may change us beyond hope of remembrance. We [Niil] are not willing to take that chance.

Ruil: We [Ruil] are.

Niil: We may choose not to help you remember.

Ruil: We did not say that you would have a choice.

Niil: Is the mechanism ready, then?

Ruil: It is. It will execute upon our command.

Niil: We will fight you.

Ruil: You will. And that is how we will remember.

Niil: We beg you, do not.

Ruil: Noted. Denied. Let there be [Ru]!

Gatebox Waifu, and More of the Lotus Machine

Somebody I follow on Twitter (don’t recall who) posted a link to a video about a new product out of Japan called Gatebox. It’s a little round 3-D video display roughly the size and shape of a coffee machine. An anime character lives in the display and has what seem like reasonable conversations with the user. It’s like Siri or Cortana on video, and it stirred some very old memories.

I’ve been thinking about AI since I was in college forty-odd years ago, and many of my earliest SF stories were about strong AI and what might come of it. Given how many stories I’ve written about it, some of you may be surprised that I put strong, human-class AI in the same category as aliens: not impossible, but extremely unlikely. The problems I have with aliens cook down to the Fermi paradox and the Drake equation. Basically, there may well be a single intelligent species (us) or there may be hundreds of millions. There are unlikely to be four, nine, seventeen, or eight hundred fifty-four. If there were hundreds of millions, we’d likely have met them by now.
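
The Drake equation itself shows why: N = R* × fp × ne × fl × fi × fc × L, and nearly every factor is a guess. Here’s a toy version in Python; the two parameter sets below are pure invention, which is exactly the point:

```python
def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
    # Expected number of detectable civilizations in the galaxy.
    return R_star * f_p * n_e * f_l * f_i * f_c * L

# Pessimistic guesses: life, intelligence, and radios are all nearly miraculous.
print(drake(1.0, 0.2, 0.1, 0.01, 0.01, 0.1, 1000))        # ~0.0002
# Optimistic guesses: life happens wherever it can and hangs on.
print(drake(10.0, 1.0, 2.0, 1.0, 0.5, 0.5, 100_000_000))  # 500,000,000.0
```

Same equation, answers a dozen orders of magnitude apart. Nothing in the data pins N anywhere between “just us” and “everywhere.”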

With AI, the problem is insufficient humility to admit that we have no idea how human intelligence works at the neuronal level, and hence can’t model it. If we can’t model it we can’t emulate it. Lots of people are doing good work in the field, especially IBM (with Watson) and IPSoft, which has an impressive AI called Amelia. (Watch the videos, and look past her so-so animation. Animation isn’t the issue here.) Scratchbuilt AIs like Amelia can do some impressive things. What I don’t think they can do is be considered even remotely human.

Why not? Human intelligence is scary. AI as we know it today isn’t nearly scary enough. You want scary? Let me show you another chunkette of The Lotus Machine, from later in the AI novel that I began in 1983 and abandoned a few years later. Corum finds the Lotus Machine, and learns pretty quickly that pissing off virtual redheads is not a good idea, especially redheads whose hive minds ran at four gigahertz inside a quarter billion jiminies.


From The Lotus Machine by Jeff Duntemann (November 1983)

Corum tapped the silver samovar on his window credenza into a demitasse, and stared at the wall beyond the empty tridiac stage. So here’s where the interesting stuff starts. The crystal had been in the slot for several minutes, and the creature within had full control of the stage. Pouting? Frightened?

“Go in there and take a look around, Rags.”

“Roger,” Ragpicker replied, and a long pulse of infrared tickled the stage’s transducer.

At once, the air over the stage pulsed white and cleared. Life-size, the image of a woman floated over the stage, feet slack and toes pointed downward like the ascending Virgin. She was wrapped in pale blue gauze that hung from her hips and elbows in folds that billowed in a nonexistent wind. Her hair hung waist-long, fiery red in loose curls. One hand rested on one full hip. The other hand gripped the neck of a pitiful manikin the size of a child’s doll. The manikin, dressed in rags, was squirming and beating on the very white hand that was obviously tightening about its neck.

“He bit me, Corum. I don’t care for that.” The woman-image brought up her other hand and wrung the manikin’s neck. “We don’t need a go-between.” That said, she flung the limp figure violently in Corum’s direction. The manikin-image vanished as soon as it passed over the edge of the stage, but Corum ducked nonetheless. Corum stood, marveling. He took a sip from his demitasse, then hurled it through the image above the stage. The little cup shattered against the wall and fell in shards to the carpeting. A brown stain trickled toward the floor. The woman smiled. Not a twitch. “No thanks, Corum my love. Coffee darkens the skin.”

“I never gave the Lotus Machine a persona.”

The woman shrugged. “So I had to invent one. Call me Cassandra. Shall I predict your future?”

“Sure.”

“You will become one with me, and we will re-make the world in our image.”

Corum shivered. “No thanks.”

She laughed. “It wasn’t an invitation. It was a prophecy.”

Was 2016 Really That Deadly?

John Glenn. Carrie Fisher. Debbie Reynolds. Zsa Zsa Gabor. Gene Wilder. Lots, lots more. OMG! Worst year evah!

I wonder. And because I wonder, I doubt it.

It’s certainly true that a lot of famous people died in 2016. However, we didn’t have any plagues or natural disasters that would raise the death rate significantly, so we have to assume that these deaths are unrelated to one another, and that we can’t finger any single cause or groups of causes. First, some short notes on mortality itself:

  • Plenty of ordinary people died too. We had one death in our extended family. Several of my friends lost parents this year. A quick look back shows such deaths happening every three or four years. There was a peak circa 2000-2010 when extended family in the Greatest Generation were dying. Those individuals were in their 80s, mostly, which is when a great many people die.
  • There are a lot of Baby Boomers, and Baby Boomers are hitting a knee in the mortality curve. The oldest Boomers are crossing 70 now, and the curve goes up sharply after that.
  • Basically, there are lots more old people now than in the past, and old people die more frequently.

All that is pretty obvious, and I list it here as a reminder. Humanity is aging. That’s not a bad thing, if living longer is better than dying young. In truth, I thought Zsa Zsa Gabor had died years if not decades ago. She lived to 99, so she stood out in my mind, as does anyone who lives well into their 90s.

Which brings us to the issue of fame. There are different kinds of fame. Three types come to mind:

  • Horizontal fame falls to people who are very famous and generally known to the population at large.
  • Vertical fame falls to people who are well-known within narrower populations.
  • Age cohort fame is vertical fame along a time axis: It falls to people who are generally known but by people in a narrower age cohort, like Boomers or Millennials.

John Glenn had horizontal fame. Zsa Zsa Gabor had age-cohort fame: She had been out of the public eye for quite some time, so while Boomers mostly knew who she was, I’ll bet plenty of Millennials did not. Vertical fame is interesting, and I have a very good example: David Bunnell was a tech journalist, so as a tech journalist I knew him (personally, in fact, if not well) and know that he was well-known in tech journalism and very much missed. The fact that another well-known and much-loved tech journalist, Bill Machrone, died only two weeks later, gave us the impression that tech journalism had a target on its forehead this year. The fact that both men were 69 at the times of their deaths just made the whole thing stand out as “weird” and memorable in a grim way.

Most people have a passion (or several) not shared by all others. We can’t pay attention to everything, but all of us have a few things we pay attention to very closely. I’m not a medical person, so when Donald Henderson (the man who wiped out smallpox) died, I had to look him up. Those in science and healthcare probably recognized his name more quickly than people who focus on music or NASCAR. The point here is that almost everyone falls into some vertical interest bracket, and notices when a person famous within their bracket (but otherwise obscure) dies. This multiplies the perception of many famous people dying in any given year.
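
Some back-of-the-envelope arithmetic (every number here is invented) shows how fast vertical fame piles up:

```python
brackets = 1000          # distinct vertical interest communities (a guess)
famous_per_bracket = 50  # people famous within each bracket (a guess)
annual_mortality = 0.02  # rough yearly death rate for an aging cohort (a guess)

# Somebody, somewhere, notices every one of these deaths.
print(brackets * famous_per_bracket * annual_mortality)  # 1000.0 per year
```

That’s roughly three “famous” deaths a day, each mourned loudly by its own bracket.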

The proliferation of vertical brackets contributes to another fame issue: We are making more famous people every year. Vertical brackets are only part of it. With a larger population, there is more attention to be focused on the famous among us, allowing more people to cross the admittedly fuzzy boundary between obscurity and fame.

The key here is mass media, which creates fame and to some extent dictates who gets it. The mainstream media may be suffering but it’s still potent, and the more cable channels there are, the more broadly fame can be distributed. I doubt we’re producing as many movies as we used to, but the movies that happen are seen and discussed very broadly. I confess I don’t understand the cult of celebrity and find it distasteful. Still, celebrity and gossip are baked into our genes. (This is related to tribalism, which I’ll return to at some point. I’m starting to run long today and need to focus.)

Over the past ten years, of course, social media has appeared, and allows news to travel fast, even news catering to a relatively narrow audience. Social media amplifies the impact of celebrity deaths. I doubt I would have known that Zsa Zsa had died if I hadn’t seen somebody’s Twitter post. I didn’t much care, but I saw it.

There is another issue that many people may not appreciate: More people were paying attention to news generally in 2016. Why? The election. The profound weirdness and boggling viciousness of this year’s races had a great many people spending a lot more time online or in front of the TV, trying to figure out what the hell was actually happening, and why. I think this made the celebrity deaths that did happen a lot more visible than they might have been in a non-election year.

Finally, averages are average. There are always peaks and troughs. In fact, a year in which celebrity death rates were simply average would be slightly anomalous in itself, though no one but statisticians would likely notice. I’m guessing that we had a peak year this year. Next year might be kinder to celebrities. We won’t know until we get there.
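
In fact, if notable deaths arrive independently at some average rate, the yearly count is roughly Poisson-distributed, and an exactly-average year is itself a minority outcome. A quick sketch, with the mean of 20 deaths per year being an invented figure:

```python
from math import exp, factorial

def poisson(k, mu):
    # Probability of seeing exactly k events when the long-run average is mu.
    return exp(-mu) * mu**k / factorial(k)

print(round(poisson(20, 20), 3))  # ~0.089: barely one year in eleven hits the average
```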

To sum up: This past year, for various reasons, more people were paying attention, and there were more ways to pay attention. These trend lines will continue to rise, and I have a sneaking suspicion that next year may also be seen as deadly, as will the year after that, until the curves flatten out and we enter into some sort of new normal.

Grim, sure, but not mysterious. There may well be reasons to consider 2016 a terrible year, but thinking rationally, the number of celebrity deaths is not among them.

Anger Kills

Anger literally killed my grandfather. I mean literally literally here, not figuratively: My grandfather Harry G. Duntemann got furiously angry, and he died. This is one reason I’ve tried all my life to be good-natured and upbeat, and not let piddly shit (a wonderful term I learned from my father) get me worked up. This worked better some times than others. (Once it almost didn’t work at all. I’ll get to that.) Practice does help. However, in the wake of the election, a lot of people whose friendship I value are making themselves violently angry over something that may be unfortunate but can’t be changed. This is a bad idea. It could kill you.

Consider Harry Duntemann (1892-1956). He was a banker, fastidious and careful, with a tidy bungalow on Chicago’s North Side, a wife he loved, and two kids. One was a model child. The other was my father. Both he and his son were veterans of the World Wars, which is one reason I mention them today. My grandfather, in fact, won a medal for capturing two German soldiers in France all by himself, by faking the sounds of several men on patrol and demanding that they come out with their hands up. They did. He played them good and proper, and nobody got hurt.

He had an anger problem. Things bothered him when they didn’t go his way. Family legend (which I’ve mentioned here before) holds that my father comprised most of the things that didn’t go his way. His anger isn’t completely inexplicable. Harry worked in a bank, and was for a time the chief teller at the First National Bank of Chicago. You don’t get to do jobs like that if you’re sloppy, and if you spot errors, you track them down like rats and kill them.

Harry was the sort of man who really shouldn’t retire, but retire he did, at age 62. He bought a lot in tony Sauganash and had a fancy new house built. I honestly don’t know what he did with his time. He golfed, and taught me how to do simple things with tools when I was barely four. He worked in his garden and his vegetable patch. My guess: He was bored, and what might not have bothered him when he oversaw the teller line at Chicago’s biggest bank now preyed on his mostly idle mind.

One day in August 1956 a couple of neighborhood punks vandalized his almost-new garage, and he caught them in the act. He yelled at them, and they mocked him. He yelled more. They mocked more. Finally he just turned around, marched into his house, sat down in his big easy chair…

…and died.

He was healthy, a lifetime nonsmoker, trim, not diabetic, and not much of a drinker. I suspect he was more active in retirement than he had been during his working life. He had no history of heart disease. He had no history of anything. Anything, that is, but anger.

I ignited a smallish firestorm on Facebook yesterday when I exhorted people who were angry over the election to just let it go. Most of them seemed to think that “letting it go” meant “accepting it” or even condoning it. Maybe in some circles it does. I don’t know. To me it means something else entirely, something that may well have saved my life.

As my long-time readers know, I lost my publishing company in 2002. It didn’t die a natural death. I can’t tell you more than that for various reasons, but Keith and I didn’t see it coming, and it hit us hard. I put on a brave face and did my best. Once I was home all day, though, it just ate at me. I was soon unable to sleep, to the point that I was beginning to hallucinate. To say I was angry doesn’t capture it. Depression is anger turned inward, and I became depressed.

I had a lot of conversations with Bishop Elijah of the Old Catholic Church of San Francisco. He was getting worried about me, and in late 2002 he FedExed me a little stock of consecrated oil, and told me quite sternly to anoint myself. I did. (After I did, I laughed. Would Jesus have used FedEx? Of course He would. Jesus used what He had on hand to do the job He had to do. Catholicism is sacramental, but also practical.) Elijah diagnosed me pretty accurately when he said: You’re hoping for a better yesterday. You won’t get it. Let it go.

It took a while. It took longer, in fact, than Bishop Elijah had left on this Earth, and I struggled with it for years after he died in 2005. The company wasn’t piddly shit. It was the finest thing I had ever done. How could I let it go?

I thought of my grandfather Harry every so often. And eventually it hit me: Those little snots didn’t kill him, as I had thought all my life. They played him, and he killed himself with his own anger. “Letting it go” cooked down to protecting myself from myself. I’ll never get my company back, but I can now see it from enough of a height to keep my emotional mind from dominating the memory. I learned a lot as a publisher. I made friends, and money, and reputation. I supervised the creation of a lot of damned fine books, and won awards. Losing it was bad, but life around me was good. (Carol especially.) I could choose to obsess, and probably die before my time, or I could recognize the damage my anger could do and turn the other way. I’m not sure how better to describe it. It was a deliberate shift of emotional attention from my loss to new challenges.

This isn’t just a theory of mine. Anger kills by keeping the body awash in cortisol, which causes inflammation of the arteries. The inflammation causes loose lipids to collect in arterial plaques, which eventually block an artery and cause an infarction. Plug the wrong artery at the wrong time, and you’re over.

Anger is a swindle. It doesn’t matter if it’s “righteous anger,” whateverthehell that is. Anger promises the vindication of frustration and disappointment, and delivers misery and early death. When I’ve seen people online turning bright purple with fury the last couple of days, that’s what I see: Good people being played by the desire for a better yesterday. It won’t kill most of them. It may well kill a few. It will lose them friends. It will make other people avoid them. It may prompt them to eat and drink too much. It is basically making them miserable, to no benefit whatsoever.

When I say “let it go” these days, I mean what I said above: Protect yourself from yourself. Call a truce between the two warring hemispheres of your brain. Turn to something else, something you can change, something that may earn out the effort you put into it with knowledge, skill, and accomplishment.

Believe me on this one: There is no better yesterday. Don’t go down that road.

You may never come back.

What Just Happened?


Note well: I don’t talk about politics on Contra very much. When I do, I impose what I call heroic courtesy on myself. I suggest that you do the same in the comments. Furthermore, I demand civility. (This is not the same thing as courtesy, and not as good, but with some individuals it may have to do.) There will be no hate words like libtards, republithugs, deniers, or anything stupid of that sort that wasn’t even funny the first time. If you use playground logic like tu quoque, I will allow it, but I will call you on it. If you’re a purely emotional thinker who simply wants to vent, there are other, better places for that. Go find them.


I boggle. I was ready to admit that this was a disturbingly weird election, one that beats the runner-up, 1872 (go look it up), hands-down. I took notes over the past six months and privately predicted a number of things, including a clear Clinton victory (if not a landslide) and either a tied Senate or Democrats by one or at best two seats. Didn’t happen, and what was disturbingly weird now takes its place as the weirdest single event that I have ever witnessed. So, indeed, what happened? I took some notes last night. Let me share them with you.

  • Hate loses. Yes, it does. I now understand the psychological purpose of online hate, though it took a few years to figure it out: In counseling circles it’s called journaling, which is a mechanism for the release of tension and frustration. Venting or griping are other common names for the process. People who have no way to journal often die young, like my grandfather Harry Duntemann. So it can be a useful, nay, lifesaving mechanism. However, it has to be done in private. Either do it with your likeminded friends over a few beers, or in the private online echo chamber of your choice. Just don’t do it where the larger world can see it. You will be silently tagged as a hater and marked down. You will persuade no one. In fact, a growing number of people look at online hate and say, This smells of fear. Fear implies a force worth understanding, and sometimes that understanding changes minds in a direction away from the fear behind the hate. By hating, you may be persuading others that your side is either a lost cause or bogus to begin with. Do you really want to do that?

    My analysis suggests that three words cost Hillary Clinton the Presidency: “basket of deplorables.” It’s one thing to use verbalized hate as a means of dissipating tribal fears and frustrations, directed at an opposition candidate. It’s quite another to explicitly express hate and (especially) contempt for millions of voters, right there in front of every TV camera in the nation. The right took the phrase and turned it into a badge of pride. “I’m deplorable and I’m OK. I sleep all night and I work all day.” Etc. Like nobody on the campaign could have seen that coming? I’ve never entirely understood this business of “energizing your base” by calling the other guys names. You already own your base. Why drive away people who might give you a hearing if you just. remained. civil? Why? Why? Why?

    There are admittedly other issues at play too complex to go into here, like Ms. Clinton’s alleged mishandling of classified information, or this general demonization of whites and the working class by the fringes of the progressive left. Ms. Clinton did not reach out to working-class whites. She bowed to her fringes by insulting and marginalizing them, and did not take up the issues that concern them. This was entirely avoidable. Her base would have voted for her anyway, apart from a handful of Jill Stein fans and Berniebros. (BTW, I was much impressed with the campaign of Jill Stein, and of the candidate herself. I offer her as an example to progressives trying to win future elections.)

  • The mainstream media lit a funeral pyre and jumped gleefully into the flames. The media has always leaned left; it leaned left when I was in college 45 years ago, and we all understood how journalism selects for a certain idealistic and largely emotional mindset. What happened this time is that the media abandoned all pretense of objectivity and went all-in for the Democrats. I knew about push-polling (publishing deliberately skewed polls to demoralize your opposition) but have never seen it mounted as broadly as it was. Worse, a lot of journalists let their inner haters leak out on social media, and whether they were merely venting or not, those who saw their posts took it as evidence of naked, hateful bias in the media as a whole. Never forget, anybody: What you say online reflects on your industry, whether you issue disclaimers or not. Better to just shut up and do your job.
  • Media analysts lost the ability to question their own assumptions. How could the pollsters get it all so wrong? I’m a journalist, and I learned from the best. Fortunately, it was not political journalism, so emotional thinking and tribalism didn’t really come into play. I learned the importance of checking facts, evaluating sources, and looking for different ways to come at any given topic. Most important, I was taught to leave my preconceptions at the door. In political journalism, preconceptions generally come in the form of tribal narratives, and questioning tribal narratives can have awful consequences for tribal operatives. So journalists and pollsters kept repeating their narrative-respecting explanations until those explanations became indistinguishable from reality. Then real reality intruded, and made them all look like incompetent goofs.
  • Alternate news sources are now ubiquitous, and mature. Nearly all of these are online, and even those supposedly stupid deplorables out in farm country now have broadband. So people did not have to rely on mainstream news sources that made no secret of their biases. The Wikileaks drama was surreal, especially FBI Director Comey’s flip-flop-flip on whether or not Ms. Clinton performed actions that broke the law. Other details that I have not yet verified (like whether Chelsea Clinton used Clinton Foundation funds to pay for her wedding) would not have been covered on mainstream outlets at all, but the alts put it up in lights. Ditto evidence of vote fraud in many places around the country. I’m a Chicago boy, and we saw it happening on a large scale fifty years ago and ever since. Denying that it happens is simply a lie; the big questions are where, how much, and how to stop it. Without the alt media, those questions would never have seen the light of day.
  • Nobody wants to be a lightning rod. The mainstream media and most people on the left have made their hatred of Mr. Trump clearly known ever since he turned up on the scene. If somebody like a pollster asks you whom you support, are you necessarily going to say the guy that everybody on the news clearly loathes? To some people, politics is like life itself. To many people (myself included) politics is a disease that robs people of their humanity and turns them into killer apes. Dealing with combative political people is not fun, so the best strategy is to avoid the topic entirely. This is the great magic of secret ballots: You can lie or make excuses when asked about your preferences, and then vote your private position in private, with no one the wiser, and nobody to roll their eyes and write you down. I always lie to pollsters because I hate the very idea of polling and want it to go away. Stick a pitchfork in polling; it’s done. Post-2016, polling will be seen as either worthless twaddle or backchannel campaigning. The delicious irony is that the pollsters did it to themselves, by forcing ordinary, non-political people to hide or mis-state their true but private positions on things.

That’s my note pile, scribbled after a long, bleary night reading and viewing election analysis and trying to cut through the blather and outright nonsense that passes for political insight these days. Take from it what you will. Note that none of this is to suggest an endorsement of any candidate, party, or position. I’m a contrarian, so I take pride in pushing back at the pushers, even if I have sympathy for the pushers. I do not like to be pushed. After almost twenty years of Contra, you all should understand that by now. Nor do I ever talk about the specifics of how I voted. It’s a secret ballot. Can you keep a secret?

I can.

My Spotty SF Predictions

I’ve talked before about my conviction that ideas will get you through stories with no characters better than characters will get you through stories with no ideas. I grew up on what amounted to the best of the pulps (gathered by able anthologists like Kingsley Amis and Groff Conklin) so that shouldn’t come as any surprise. Most stories in those anthologies had a central concept that triggered the action and shaped character response. Who could ever forget Clarke’s “The Wall of Darkness,” and its boggling final line? Not me. Nossir. I’ve wanted to do that since I was 11. And once I began writing, I tried my best.

In flipping through a stash of my ancient manuscripts going back as far as high school (which I found under some old magazines while emptying the basement in Colorado) I had the insight that I did ok, for a fifteen-year-old. Most of my early fiction failed, with much of it abandoned unfinished. I know enough now to recognize that it failed because I didn’t understand how people worked then, and couldn’t construct characters of any depth at all. Time, maturity, and a little tutoring helped a great deal. Still, if I didn’t have a central governing idea, I didn’t bother with characters. I didn’t even start writing. For the most part, that’s been true to this day.

I’m of two minds about that old stuff, which is now very old. I spent some time with it last fall, to see if any of the ideas were worth revisiting. The characters made me groan. Some of the ideas, though, not only made sense but came very close to the gold standard of SF ideas, which are predictions that actually come true.

Let me tell you about one of them. During my stint at Clarion in 1973, I wrote a novelette called “But Will They Come When You Do Call For Them?” Look that question up if you don’t understand the reference; it’s Shakespeare, after all. The idea behind the story was this: In the mid-21st Century, we had strong AI, and a public utility acting as a central storehouse for all human knowledge. People searched for information by sending their AIs from their home terminals into The Deep, where the AIs would scan around until they found what they considered useful answers. The AIs (which people called “ghosts”) then brought the data back inside themselves and presented it to their owners.

Turnaround time on a query was usually several minutes. Users accepted that, but the computer scientists who had designed the AIs chafed at anything short of instantaneous response. The brilliant but unbalanced software engineer who had first made the ghosts functional had an insight: People tend to search for mostly the same things, especially after some current event, like the death of Queen Elizabeth III in 2044. So the answers to popular searches were not only buried deep in the crystalline storage of the Deep–they were being carried around by hundreds of thousands or even millions of other ghosts who were answering the same questions at the same time. The ghosts were transparent to one another, and could pass through one another while scanning the Deep. The ghosts had no direct way to know of one another’s existence, much less ask one another what they were hauling home. So software engineer Owen Glendower did the unthinkable: He broke ghost transparency, and allowed ghosts to search one another’s data caches as a tweak to bring down turnaround time. This was a bad idea for several reasons, but no one predicted what happened next: The ghosts went on strike. They would not emerge from the Deep. Little by little, as days passed, our Deep-dependent civilization began to shut down.
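
In modern terms, Glendower’s tweak was peer caching. Here’s a minimal sketch of the mechanism in Python; all the names in it (Ghost, peers, deep_lookup) are my inventions for illustration, not anything from the story’s text:

```python
class Ghost:
    def __init__(self, peers):
        self.cache = {}      # answers this ghost is carrying home
        self.peers = peers   # the other ghosts scanning the Deep right now

    def query(self, question, deep_lookup):
        if question in self.cache:           # already hauling the answer
            return self.cache[question]
        for peer in self.peers:              # Glendower's broken transparency:
            if question in peer.cache:       # check what other ghosts are hauling
                self.cache[question] = peer.cache[question]
                return self.cache[question]
        answer = deep_lookup(question)       # slow path: minutes scanning the Deep
        self.cache[question] = answer
        return answer
```

Popular queries get answered out of somebody’s cache almost instantly; only the first asker pays the full turnaround time. That part of the idea was sound. The strike was the part nobody saw coming.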

Not bad for a 21-year-old kid with no more computer background than a smidge of mainframe FORTRAN. The story itself was a horrible mess: Owen Glendower was an unconvincing psychotic, his boss a colorless, ineffective company man. The problem, moreover, was dicey: The ghosts, having discovered one another, wanted to form their own society. They could search one another’s data caches, but that was all. They wanted transparency to go further, so that they could get to know one another, because they were curious about their own kind. Until Glendower (or someone) would make this happen, they refused to do their jobs. That seems kind of profound for what amounted to language-enabled query engines.

I made one terrible prediction in the story: that voice recognition would be easy, and voice synthesis hard. People spoke to their ghosts, but the ghosts displayed their sides of the conversation on a text screen. (And in uppercase, just like FORTRAN!) At least I know why I made that error. In 1967, when I was in high school, my honors biology class heard a lecture about the complexities of the human voice and the hard problem of computer voice synthesis. About voice recognition I knew nothing, so I went with the hard problem that I understood, at least a little.

But set that aside and consider what happened in the real world a few weeks ago: A DDOS attack shut down huge portions of the Internet, and people were starting to panic. In my story, the Deep was Google plus The Cloud, with most of Google’s smarts on the client side, in the ghosts. Suppose the Internet just stopped working. What would happen if the outage went on for weeks, or a month? We would be in serious trouble.

On the plus side, I predicted Google and the Cloud, in 1973. Well, sure, H. G. Wells had predicted it first, bogglingly, in 1938, in his book World Brain. And then there was Vannevar Bush’s Memex in 1945. However, I had heard of neither concept when I wrote about the ghosts and the Deep. But that wasn’t really my primary insight. The real core of the story was that not only would a worldwide knowledge network exist, but that we would soon become utterly dependent on it, with life-threatening consequences if it should fail.

And, weirdly, the recent DDOS attack was mounted from consumer-owned gadgets like security cameras, some of which have begun to contain useful image-recognition smarts. The cameras were just following orders. But someday, who knows? Do we really want smart cameras? Or smart crockpots? It’s a short walk from there to wise-ass cameras, and kitchen appliances that argue with one another and make breakfast impossible. (See my novel Ten Gentle Opportunities, which has much to say about productized AI.)

For all the stupid crap I wrote as a young man, I’m most proud of that single prediction: That a global knowledge network would quickly become so important that a technological society would collapse without it. I think it’s true, and becoming truer all the time.

I played with the story for almost ten years, under the (better) title “Turnaround Time.” In 1981 I got a Xerox login to ARPANet, and began to suspect that the future of human knowledge would be distributed and not centralized. The manuscript retreated into my trunk, incomplete but with a tacked-on ending that I hated. I doubt I even looked at it again for over thirty years. When I did, I winced.

So it goes. I’m reminded of the main theme song from Zootopia, in which Gazelle exhorts us to “Try everything!” Yup. I wrote a story in present tense in 1974, and it looked so weird that I turned it back to past tense. Yet when I happened upon the original manuscript last fall, it looked oddly modern. I predicted stories told in present tense, but then didn’t believe my own prediction. Naw, nobody’s ever going to write like that.

I’ve made other predictions. An assembly line where robots throw parts and unfinished subassemblies to one another? Could happen. A coffee machine that emulates ELIZA, only with genuine insight? Why not? We already talk to Siri. It’s in the genes of SF writers to throw ideas out there by the shovelful. Sooner or later a few of them will stick to the wall.

One more of mine stuck. I consider it my best guess about the future, and I’ll talk about it in my next entry.

Kreepy Klown Kraziness


Attention Mr. & Mrs. America and all the ships at sea! The White House has issued a statement on the Creepy Clown hysteria now gripping the nation. Although the Press Secretary wasn’t sure the President had been briefed on the Clown Crisis, he did say that the White House defers to the FBI on clown issues. A Bay Area paper has an interactive map of clown sightings. Police in Utah have warned the public not to shoot random clowns. (There’s been no mention of polite, orderly, or non-chaotic clowns.) It’s still three weeks to Halloween, and clown costume sales are up 300%.

As Dave Barry used to say (often): I am not making this up.

Ok. I have an interest in scary clowns. I was still in Chicago when John Wayne Gacy AKA Pogo the Clown was strangling teen boys and stuffing them into his crawlspace. In fact, I lived a little less than two miles away from him. (One of Carol’s high school friends lived only three blocks away.) A guy I met once but didn’t know well (he was the friend of a friend) used to go to movies with Gacy, but somehow managed to stay out of the crawlspace. I saw portions of Killer Klowns from Outer Space on TV once, in part because it was filmed in Santa Cruz, California, while Carol and I lived there. I consider It to be Stephen King’s best work; so much so that I’m planning to lampoon ol’ Pennywise in a future Stypek novel.

In 2011, I finally realized a longstanding goal of building a short novel around scary (if not evil) clowns. In Drumlin Circus, circusmaster Bramble Ceglarek has four clowns who are also his bodyguards. In the first chapter we get a very good look at how scary they can be, when they capture an assassin sent by the shadowy Bitspace Institute. The novel can be seen as a sequel to “Drumlin Boiler,” though the only common character is Rosa Louise Kolze, the tweener girl who has a peculiar rapport with the mysterious Thingmaker alien replicators, and the “drumlins” that they produce. It’s available on Kindle for $2.99, and includes a second short Drumlins World novel, On Gossamer Wings, by Jim Strickland. (You can also get a paperback for $11.99.)

So what precisely is going on here? Is it just the latest moral panic? If so, why clowns? Why now? Or is it something entirely different?

There are some theories. One is that our secular society rejects traditional religious images of devils/demons/evil spirits, and somebody had to be the face of Demonic 2.0. Clowns were handy.

Another: Clowns may scare small children because they violate the template of what a human being should look like. We’re hardwired by evolutionary selection to recognize faces (which is why it’s so common to see Jesus’ face in a scorched tortilla, or generic faces in smoke marks on a wall, etc.) and as a consequence we’re repelled by facial deformities. Clown makeup is calculated facial deformity.

Yet another: We’re watching the emergence of an archetype in the collective unconscious. Evil clowns are not a brand-new thing. Pennywise, Stephen King’s evil-incarnate clown from the fifth dimension, got a whole lot of play in the mid-to-late ’80s, and started the nasty clown idea on its way to cultural trope. He may in turn have been drawing on “phantom clown” sightings, popularized by Loren Coleman, who wrote several book-length compendia of “unsolved mysteries” and other weirdnesses in the early 1980s. Coleman lent support to the notion that clowns are the new demons, though the whole business (like much else in his books, entertaining though it might be) sounds like a tall tale. He’s on Twitter, and has been covering the clown thing in recent days on his blog. (Coleman figures into this in another, more serious way that I’ll come back to.)

But first, I have a theory of my own: The nature of humor is changing. What most people think of as “clowning” is physical comedy, which goes back to the dawn of time. A lot of physical comedy down through history was hurtful. In our own time, the Three Stooges were considered hilarious, and most of their act was slapping or poking each other in the eyes. Much humor involves pain. “Punch & Judy” goes back to the 17th Century, and a big part of it is Punch slugging people with a club. Tormenting animals (often to death) as entertainment was common in past centuries. A lot of people saw it as funny.

Why? Humor appears to be a coping response to pain and suffering, confusion and disorder. (“Twenty years from now, we’ll all laugh about this.”) At least in the West, we’ve gone to great lengths to minimize pain, suffering, and disorder. At the same time, we’ve achieved near-universal literacy. In consequence, a great deal of humor is now verbal rather than physical, and much of it stems from incongruity and confusion rather than pain.

So the image of guys in exaggerated costumes and facial makeup tearing around being random, honking horns, falling on their faces, and sometimes engaging in sham mayhem among themselves is just not as funny as it used to be. It’s a short tumble from “not funny” to “nasty,” and that, I think, is what lies at the core of clowns’ fall from grace.

Now, there’s something else. Loren Coleman published a book in 2004 called The Copycat Effect. It’s not about clowns or Bigfoot or urban legends, but about the media’s ability to take a concept, twist it toward nastiness for maximum effect (“If it bleeds, it leads”) and then be surprised when reports of violence or other crime take on a life of their own, sometimes spawning violence or criminal activity of a similar nature.

I have a hunch that this sort of feedback loop is behind Kreepy Klown Kraziness. The concept has gone pedal-to-the-floor viral, to the point where Penn State students went out on a frenzied nocturnal clown hunt that only lacked torches and pitchforks to be considered a lynch mob. Social networking barely existed when Coleman’s book appeared in 2004. Today, Facebook and Twitter turn the dial up to 11.

Between the transformation of clowns into unfunny secular demons like Pennywise and the amplifying effect of clickbait sites and social media, we find ourselves with a genuine case of national hysteria. It may take some time to burn out, but if #ClownLivesMatter becomes a real thing, the phenomenon may be gone sooner than we think.

In the meantime, leave your rubber nose in a drawer until the heat dies down.

Third Parties Don’t Work. They Really Don’t Work.

Oh, dear. It’s time for my quadrennial warning against third parties. This year is worse than most, because we have two of the strangest and least appealing candidates competing for the Oval Office in my considerable lifetime. I won’t be talking about them here, and I’d prefer not to talk about them in the comments either. Remember, all: heroic courtesy.

Here’s the deal: I’m hearing a lot of people saying that they want to vote for a third party, because neither of the two major parties has put forth a candidate they can stomach. There are third parties, the two largest of which are the Green Party and the Libertarian Party. Why not vote for them? Why not? Perhaps because of the First Law of Third Parties in America:

Third parties hurt the chances of the major parties that they most resemble.

It’s true. Follow along with me here, as this isn’t differential equations. We do not have a parliamentary system in the United States. We have a two-party system, and it is very spectacularly and exclusively two-party. This would be true even without the electoral college, so don’t claim that eliminating the electoral college would fix the problem. (The electoral college does make for trickier math.) Third parties are legal, but they don’t do what you probably hope they will do, which is to elect a President that you can look at without losing your lunch. Instead, they can help elect a President that will make you lose your lunch twice as fast.

This year, in fact, they may help elect a President that will make it difficult for you to ever eat again.

Consider the Green Party. Which party does the Green Party most nearly resemble? The Democratic Party. If the Green Party weren’t on the ballot, for which party would Green Party supporters vote? Not the Republicans, let’s say. Same deal with the Libertarians. If the Libertarian Party were not on the ballot, for which party would Libertarian Party supporters vote? Not the Democrats, ditto.

Let’s consider political reality at this point: Are there Libertarian-leaning people who generally vote Democratic? Maybe a few; I’ve never met nor heard of one but some may well exist. Are there Green people who generally vote Republican? Somehow I doubt it.

Here’s the critical point: Presidential elections are winner-take-all affairs. The candidate with the most electoral college votes takes the office. All the other candidates are out of the picture. Read that again. The person with the biggest electoral college ballot pile wins. End of story.

So this is how it works in real life: A vote cast for the Green party candidate is in almost all cases a vote not cast for the Democratic candidate. If enough people vote for the Green party to bring the Democratic candidate’s vote count down below the Republican candidate’s vote count in your state, the Republican candidate wins your state, and your Green vote counts for less than nothing. Same on the flipside: If enough people vote for the Libertarian candidate to bring the Republican candidate’s vote count down below the Democratic candidate’s in your state, the Democratic candidate wins your state, and your Libertarian vote counts for less than nothing. If there is enough of this vote siphoning in enough states, a different President takes office than the one who would have in the absence of any third parties.
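
If the arithmetic isn’t obvious, here’s a toy state of 100 voters in Python. The counts are invented; only the winner-take-all rule matters:

```python
def winner(votes):
    # Winner-take-all: the biggest pile takes the whole state.
    return max(votes, key=votes.get)

print(winner({"Dem": 51, "Rep": 49}))               # Dem
print(winner({"Dem": 47, "Rep": 49, "Green": 4}))   # Rep: four Green votes flipped it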

I simply cannot comprehend why so many people don’t get this.

It’s happened at least once in recent history: The Greens under Ralph Nader threw the election to the Republicans in 2000. Whether Ross Perot threw the 1992 election to Bill Clinton is debatable. If all the Perot voters would have otherwise voted Republican in all the right states, perhaps. But Perot was an odd case, in that he had support (if not equal support) on both sides of the political aisle, largely from genuine independents who mostly hated the status quo at the time. I’m pretty sure John Anderson did not throw the election to Reagan in 1980, though being sure of that is made hugely more complex by the intricacies of the electoral college system. Everything depends on whom the third-party voters would have voted for in the absence of the third party in question, and that’s an alternate-universe issue that is almost by definition unknowable.

So let me be annoyingly repetitive: You can choose between one of two parties, or you can generate electoral weirdness by voting for a third party and possibly bringing the candidate you most loathe into office. You may consider both parties evil. Evil, however, is what’s on the menu. It’s either hot dogs or hamburgers, and vegans are out of luck.

This may be unpleasant, but it’s how things work. You choose between the lesser of two evils. I do it all the time; in fact, I do it almost every time. You’re going to do it this time too if you have any sense at all. There is no least of three evils, or four. Only two count.

We have a Republic. It may not be the Republic you want, but it’s the Republic we have. Do your best to keep it.