Jeff Duntemann's Contrapositive Diary


In Search of the Great Unifier

I’ve been in book publishing since long before there were ebooks. Print was always primary, and you saw to print first. Once ebooks became practical, ebooks were derived from print book content. The tools were dicey, and the renderers (in ebook readers and apps) were very dicey. (I think they still are. Will any common ebook reader render a drop-cap correctly? If so, let me know. I have yet to see one that does.) The way publishing is currently evolving, this has to change. Ebooks are becoming the afterthought that wags the industry, and print, where it survives at all, looks to become an extra-cost option.

I’ve been watching for that change for some time, while continuing to use the same system I learned in the 1990s. I write and edit in Word, and then do layout and print image generation in InDesign, which I’ve used since V1.0. I’m willing to change the apps I use to generate books of both kinds, but it’s got to be worth my while.

So far, it hasn’t been. I do intuit that we may be getting close.

What rubbed my nose in all this is my recent project to clean up and re-issue my novel The Cunning Blood in ebook format. Although it was published in late 2005, I actually wrote the book in 1998 and 1999. Even when you’re 62, sixteen years is a long time. I’ve become a better writer since then, and beyond a list of typos I’ve accumulated some good feedback from readers about booboos and awkwardnesses in the story that should be addressed in any reissue. So the adventure begins.

There’s a common gotcha in the way I create books: Final corrections to the text in a layout need to be recaptured when you return to manuscript to prepare a new edition. I was in a hurry and careless back in 2005. I made literally dozens of changes to the layout text but not to the Word file. To recapture those changes to the manuscript I’ve had to go from the layout back to a Word file, which with InDesign, at least, is not easy. I don’t intend to make that mistake again.

That said, avoiding the mistake may be difficult. Word processors are marginal layout programs, and layout programs are marginal word processors. The distinction is really artificial in this era of eight-core desktops. There’s no reason that one program can’t maintain two views into a document, one for editing and one for layout. The marvel is that nobody’s succeeded in doing this. My only guess is that until very recently, publishing drew a fairly bright line between editing and layout, with separate practitioners on each side of the line. Few individuals did both. What attempts I’ve seen are shaped by that line.

Consider InCopy. Adobe introduced InCopy with CS1. It’s a sort of allied word processor for InDesign. It never caught on and is no longer part of CS. (Only one book was ever published about InCopy CS2, which is the surest measure of failure on the part of an app from a major vendor.) I have CS2 and can guess why: InCopy requires a great deal of what my Irish grandmother would call kafeutherin’ to transfer copy between the two apps. InCopy was designed for newspaper work, where a lot of different writers and editors contribute to a single project. I consider it a multiuser word processor, for which I have no need at all. For very small press and self-publishing, we need to go in the opposite direction, toward unification of layout and editing.

There is a commercial plug-in for InCopy called CrossTalk that sets up InDesign and InCopy for single-practitioner use, but the damned thing costs $269 and may no longer support CS2.

I’m still looking. A couple of my correspondents recommended I try Serif’s PagePlus. I might have done so already, but the firm’s free version installs crapware toolbars that most people consider malware. The paid version does not; however, I’ll be damned if I’ll drop $100 on spec just to test something.

I know a number of people who have laid out whole books entirely in Word, and I could probably do that. With Acrobat CS2, I could generate page image PDFs from a Word file. Atlantis edits Word files and generates good-quality .epub and .mobi files from .docx. That’s not a bad toolchain, if what you want is a chain. I already have a chain. What I want is a single edit/layout app that generates page images, .epubs, and .mobis.

Etc. The tools are definitely getting better. Solutions exist, and one of these days soon I’m going to have to choose one. As I said, I’m still looking. I’ll certainly hear suggestions if you have some.

The Manhattan Hardcover Conundrum

Judging by the online commotion, people are still arguing about whether Amazon or Hachette (and by implication, the rest of the Big Five) will win the current fistfight over ebook pricing. The media has generally positioned Hachette as the plucky little guy trying to take on Saurazon by getting everybody in the Shire to stand up, face east, and yell, “Huzzah!” It’s not that easy, heh. But then again, nothing is.

My position? I think the fight may already be over. The Big Five lost. I say that for several reasons:

  1. The Feds are against them. The whole fight is about how to keep ebook prices from falling; antitrust law regards propping up prices as harm to the public, and it becomes actionable when producers collude to do it. Even the appearance of collusion will start that hammer on its way down again. Hachette has one leg in a sling before the kicking contest even begins.
  2. The public has already decided that ebooks can’t be sold at hardcover prices. In fact, this decision was made years ago. Although the issues are subtle, it’s completely true that producing ebooks is considerably less costly than producing print books, especially hardcovers. What publishers have tried to declare the floor ($10) is probably now the ceiling. That ship has not only sailed, it’s folded into hyperspace.
  3. Monopsony power (one buyer facing many sellers; e.g., Amazon) is not illegal. I’ve read in several places that Section 2 of the Sherman Antitrust Act does not outlaw monopsonistic practices unless the monopsony position was acquired through exclusionary conduct. There’s not a lot of settled case law about what sorts of conduct are considered exclusionary for a goods retailer, as opposed to an employer. Future cases may change this, but it’s going to be a near-vertical climb.
  4. Virtually all recent technology works in Amazon’s favor. Ebook readers, cheap tablets, fast ubiquitous broadband, POD machinery, thermonuclear sales data collection, online reviews, you name it: Amazon has almost no legacy baggage.
  5. Almost everything works against print publishing generally, and the Big Five in particular. I’ll come back to this.

I’m still not entirely sure how I feel about the whole business. In general, letting publishers set their own prices via agency agreements with retailers is a good thing because it allows startups to undercut them. The value to the public of any individual publisher (or conglomerate) is low, as long as startups have access to markets and can replace them. Access to bricks’n’mortar retail shelves has always been and still is tricky. Access to other retail channels has never been easier. If I were ten years younger I might be tempted to try again.

Now, why is Big Print in such trouble? Somebody could write a book (and I wish Mike Shatzkin in particular would) but here are some hints:

  1. Trade book print publishing is a big-stakes wager against public taste. It’s hard to predict what the public will want even in categories like tech. Literary fiction? Egad. Guess wrong, and you’ve lost what might be a million-dollar advance plus the full cost of the press run and any promotional efforts.
  2. The economics of trade book publishing are diabolical. Trade books are basically sold on consignment, and can be returned by the retailer at any time for a full refund. This makes revenue projection a very gnarly business. Books assumed to be sold may not stay sold.
  3. Online used bookselling reduces hardcover sales. Buying a hardcover bestseller soon after release is a sort of impatience tax. The impatient recover some of the tax by listing the book on personal retailing sites like eBay or Amazon Marketplace at half the cover price. The patient get a basically new book for half-off, and then sometimes sell it again…for half the cover price. This would not be possible if online searches of used book inventory weren’t fast and easy.
  4. Related to the above: Remaindering teaches the public that new hardcovers are cheap. Most print books are eventually remaindered. The remainders are generally sold online for as little as three or four bucks. They’re new old stock books with a marker swipe on one edge. The more publishers guess wrong about press runs (see Point #1) the more books are remaindered, and the more hardcovers lose their mystique and (more important) their price point.
  5. Fixed costs for the Big Five are…Big. There is a very strong sort of “Manhattan culture” in trade book publishing. Big publishers are generally in very big, very expensive cities, which carry high premiums for office space and personnel. My experience in book publishing suggests that none of that is necessary, but as with Silicon Valley, it’s a cultural assumption that You Have To Be There, whatever it costs.

Bottom line: The Big Five need the $25 (and up) hardcover price point to maintain the business model they’ve been evolving for 75 years. If hardcover sales ramp down, they need ebook sales to make up the difference. Ebooks are cheaper to produce and manage (i.e., no print/bind costs, shipping, warehousing, or returns) and it’s quite possible that a $20 ebook price point could stand in for a $30 hardcover price point. However, Amazon has trained the public to feel that an ebook shouldn’t cost more than $10. Indies have put downward pressure on even that, and the demystification of hardcovers via used and remainder sales hasn’t helped.

What options do the Big Five have? Culture is strong: They’re not going to cut the glitz and get out of Manhattan. (That may not be invariably true; Wiley US moved from Manhattan to New Jersey some years ago. Wiley, however, does not publish trade fiction and has never been deep into glitz. I doubt, furthermore, that they would have moved to Omaha.) A reliable midlist might help, but midlist titles now exist mostly as ebooks. Most publishers, big and small, have long since outsourced design and production to third parties, and are already doing a great deal of printing in China. Beyond that, I just don’t know.

Don’t misunderstand: My sympathies are with publishers, if not specifically large publishers. I was in the trenches and I know how it works. Books can only be made so cheap before quality suffers, especially ambitious nonfiction like Steven Pinker’s The Better Angels of Our Nature. We may be in a race-to-the-bottom that cannot be won by either side. What I’d really like is honesty in all quarters about the issues and (especially) the consequences. Rah-rah tribalism helps no one.

Both sides have points in their favor. Amazon has done something not well-appreciated: It’s made it possible for self-publishers and indie publishers to reach readers. Physical bookstores have long been barriers to entry in publishing. Quality remains a problem, but hey, is that a new problem? Traditional publishers claim that they guarantee quality, even though “quality” is a very tough thing to define. Most of my life I’ve abandoned a fair number of print books every year as unreadable, not because I dislike the approach or the topic but because the writing is bad. This is supposedly the value that publishers add. The adding is, shall we say, uneven.

My suggestions sound a little bit banal, even to me:

  • Publishers need to pay more attention to objective quality. Bad writing is a fixable problem; you either don’t buy it, or you fix it after you buy it if you judge the work important enough to go forward. This is the edge traditional publishers have over the indies.
  • Amazon needs to consider that book publishing is an ecosystem in which many players have important roles. Market share won’t matter if you kill huge segments of the market. They may not care; there’s plenty of money in selling thumb drives and diapers.
  • Readers need to meditate on the realities of writing. Writers need to be paid. Cover price isn’t everything. Quality matters.
  • The hardcover as the core of trade publishing must die. Hardcovers need to become a luxury option. If I read an ebook or paperback of a truly excellent work, I may want a hardcover, and we’re very close to having the machinery to do hardcover onesies at reasonable cost. I’ve upgraded to hardcover many times, but generally on the used market, since by the time I read a paperback the hardcover may already have been remaindered and unavailable new.
  • Publishers need to ask themselves if Manhattan and San Francisco really deliver benefits commensurate with their astronomical cost.
  • Amazon is a given. The Internet leans toward channel capture. If it weren’t them it would be someone else. Grumble though we might, we need to start there and figure out the best way forward.

In the meantime, remember: There are countless sides to every argument, and no easy answers to anything. You are always wrong. And so am I. Get used to it.

Print and Ebook from the Same PDF

I’ve been tinkering with a recast of my 1993 book Borland Pascal 7 From Square One since 2008, for the excellent FreePascal compiler. One reason I set the project aside after a year or so is that I wanted to see if the Lazarus IDE would mature a little. I had originally planned to use the text-mode IDE bundled with the compiler, but it had what I considered dealbreaker bugs. Besides, if Lazarus became usable, I could create a tutorial for it as well. Lazarus is now at V1.2.2 and (at least to the limits of my tests so far) works beautifully. I’ve gone back to the FreePascal From Square One project, yanking out all mention of the text-mode IDE, and deliberately tilting it toward a prequel tutorial for a future book on OOP and creating GUI apps in Lazarus, Delphi-style.

One of my goals with the project has been to create a single PDF that can be used as both an ebook and a print image. I’ve experimented with implementing it as an epub, with disastrous results. Layouts containing lots of art just don’t work as reflowable text. On the other hand, PDF images are painful to read and navigate unless the reader device can render a full page legibly. Back in 2008, we didn’t even have iPads yet, and the target display for my PDF ebooks was my 2005-era IBM X41 Tablet PC. You could read the PDF…barely. I spun a couple of page layouts that used smaller pages and larger fonts. They were readable, but looked bizarre (almost like children’s books) when printed to paper.

I left it there for some years. Come 2010 we met the iPad, with the multitude of Android slabs hot on its heels. Displays improved radically. I got an Asus Transformer Prime in 2012, and found the 1280 X 800 display startling. I took the page design that I had originally created for my X41 and tweaked it a little. The page size is A4 rather than letter or standard computer trim, for three reasons:

  • Although some POD houses can give you computer trim, computer-trim sheets aren’t readily available at retail and thus can’t be printed at home.
  • A4 paper is the default paper size in Europe, where I suspect that most of my readers will be. It can be had in the US from the larger office stores, and modern laser printers will take it. Lacking A4 paper, the book can be printed to letter sheets with only a little bit of reduction.
  • A4 paper is taller and narrower than letter, and maps a little better to the wide-format displays that dominate the non-iPad tablet segment.
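The “little bit of reduction” mentioned above is easy to quantify. A minimal Python sketch (the sheet dimensions are the standard ISO and US sizes; the code is just illustrative arithmetic, not part of any toolchain discussed here):

```python
# Standard sheet sizes in millimeters.
A4 = (210.0, 297.0)
LETTER = (215.9, 279.4)  # 8.5 x 11 inches

# A uniform scale factor that fits an A4 layout onto a letter sheet.
# Letter is slightly wider than A4 but shorter, so height is the constraint.
scale = min(LETTER[0] / A4[0], LETTER[1] / A4[1])
print(f"print at {scale:.0%}")  # about 94%
```

So “Fit to page” on a letter-sheet printer shrinks the A4 layout by only about six percent.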

The layout still looks odd to me. Half a century of reading has made me used to fine print, so the larger type jars a little. However, the layout has plenty of room for technical art and screenshots, and full pages read very well on the Transformer Prime. (This is not true on the much smaller Nook Color.)

With print publishers struggling terribly, I’m guessing that this is one possible future for technical publishing: new layouts that allow the same PDFs to be either printed or rendered as ebooks. More and more specialty books that I buy are POD (I know how to spot them) and the customary high prices on computer books leave plenty of margin to make POD copies profitable.

You can help me out a little. The ebook is far from finished, but I’ve posted the PDF here. I’d be curious to know how legibly it renders on other tablets, particularly those smaller than 10″ but with 1280 X 800 or better resolution. Again, it’s not a complete book and there are plenty of typos and layout glitches in it. What I want to know is whether or not technical readers will find it usable on modern tablets. Thanks in advance for any feedback you can provide.

Odd Lots

  • This is where we stayed on Grand Cayman last week. Unless I misrecall, it was about $150 a night. Note, though, that it was not air conditioned.
  • For deep reading, print may be the way to go, for reasons we don’t yet understand. In looking back a year or so, I realize that I generally read fiction on my Transformer Prime, and nonfiction on paper. It wasn’t a conscious decision–and may simply be due to a reluctance of nonfiction publishers to issue ebooks–but it was probably the correct one.
  • Here’s yet another reason why I’ve decided to let the Sun actually reach my skin.
  • It’s starting to look like diet has little or no effect on cancer risk. This has been my suspicion for a long time. Obesity, yes. Diet itself, no. (Thanks to Bruce Baker for the link.)
  • Ohh, Ancel Key’s beautiful wickedness is all starting to unravel. Saturated fat has nothing to do with heart disease. This has also been a suspicion of mine for some time, along with the suspicion that eating fat will make you lose weight more quickly than simply going low-carb. It certainly worked that way for me. I now weigh only eight pounds more than I did when I was 24, and a good deal of that is probably muscle I put on via ten years of weight training. (Thanks to Trevor Tompkins for the link.)
  • Interesting paper on why the Neanderthals died out. They didn’t necessarily die out because they were inferior. (Maybe they didn’t die out at all but are still here, pretending to be ugly Saps.) If I had to guess, I’d say their skulls got so big as to make childbirth problematic. But what were they doing with all that gray matter? (Thanks to Erik Hanson for the link.)
  • I stumbled on a year-old article that pretty much captures my reaction to weather.com. I will add, however, that weather.com beats the living hell out of The Weather Channel.
  • I’m still waiting for reports of cataclysmic pwnage on XP machines. The number “2000” comes to mind.
  • Speaking of which, I still need XP because my HP S20 slide scanner has no driver that will run on Windows 7. Haven’t tried the VM trick yet, but ultimately that’s the way I’ll have to go.
  • I knew there was a reason I only lived in Baltimore for 23 months.

Daywander

This entry will be a hodgepodge, or as they say in some circles, a “hotch potch.” (I think it’s a Britishism; Colin Wilson used that spelling many times.) Stuff has been piling up in the Contra file. Carol and I have been slighting housework for these past six months, she laid up after surgery on both feet, and me writing what has doubtless been the most difficult half-a-book I’ve ever written. We’ve been cleaning up, putting away, and generally getting back to real life. Real life never tasted so delicious.

One reason is rum horchata. I’m not one for hard liquor, mostly, and generally drink wine. (Beer tastes far too bitter to me.) But Rumchata got me in a second. It’s a dessert cordial no stronger than wine, with the result that you can actually taste the other ingredients, like vanilla, cream, and cinnamon. Highly recommended.

People ask me periodically what I’ve been reading. After soaking my behind in computer science for the past six or eight months, I’ve been studiously avoiding technology books. That said, I do endorse Degunking Windows 7 by my former co-author Joli Ballew. I actually used it to learn some of the Win7 details that weren’t obvious from beating my head on the OS. I wish it were a Coriolis book, but alas, it’s not. That doesn’t mean it’s not terrific.

True to my random inborn curiosity about everything except sports and opera, I’ve developed an interest in the chalk figures of southern England. The next time we get over there (soon, I hope, though probably not until summer 2015) we’re going to catch the Long Man of Wilmington, the White Horse of Uffington, and that very well hung (40 feet!) Cerne Giant. Other chalk figures exist, many of them horses. Some can be seen from Google Earth. A reasonable and cheap intro is Lost Gods of Albion by Paul Newman. The book’s been remaindered, and you can get a new hardcover for $3. I wouldn’t pay full price for it, but it was worth the hour and change it took to read. My primary complaint? It needs more pictures of chalk figures, duhh.

Quick aside: While researching kite aerial photography with my found-in-the-bushes GoPro Hero2 sports camera, I came upon an impressive video of the White Horse of Westbury taken from a double bow kite (rokkaku). I have the cam, and loads of kites. All I need now is a chalk figure. (I suspect I could coerce my nieces into drawing one for me.)

Far more interesting than Lost Gods of Albion was Gogmagog by Thomas Lethbridge. I lucked into a copy of the 1957 hardcover fairly cheap, but availability is spotty and you may have to do some sniffing around. If you’re willing to believe him, Lethbridge did an interesting thing back in the 1950s: He took a 19th century report that a chalk giant existed on a hillside in Wandlebury (near Cambridge) and went looking for it. His technique was dogged but straightforward: For months on end, he wandered around the hillside with a half-inch metal bar ground to a point, shoving it into the ground and recording how far it went in before it struck hard chalk. His reasoning was that the outlines of a chalk figure would be dug into the chalk, and thus farther down than undisturbed chalk. In time he had literally tens of thousands of data points, and used them to assemble a startling image of two gods, a goddess, a chariot, and a peculiar horse of the same sort as the Uffington White Horse.

Not everybody was convinced. Even though Lethbridge was a trained archaeologist, his critics claimed that he was a victim of pareidolia, and simply seeing the patterns he wanted to see in his thousands of hillside holes. The real problem was that Lethbridge was a pendulum dowser, and a vocal one: He published several books on the subject, which make a lot of claims that aren’t easily corroborated. Lethbridge claims that most people can dowse, and hey, it’s an experiment that I could make, if I decided it was worth the time. (It probably isn’t.)

The third book in my recent readings is The Physical Phenomena of Mysticism by Herbert Thurston, a Jesuit priest who spent a good part of his life collecting reports of peculiarly Catholic weirdnesses (stigmata, levitation, inedia, odor of sanctity, etc.) and presenting them in a manner similar to that of Charles Fort, if better written. Most of the articles were originally published in obscure theology journals, but were collected in 1952 in a volume that I’ve never seen for less than $100. Last year it was finally reprinted by White Crow Books and can be had for $18. I’m not sure what one can say about reports of people who have not eaten for forty years. Mysticism is a weird business, but physics is physics. The book is entertaining, and it’s given me some ideas for stories, particularly since I have a spiritually butt-kicking psychic little old Polish lady as a major character in Old Catholics. (Vampires are just so 2007.)

If three books doesn’t seem like much, consider my habit of going back to books I’ve read and liked, and flipping through them to see what notations I’ve made in the margins. We all make them; when was the last time you deliberately went back to read and reconsider them? I’ve been dipping into Gary Taubes’ Good Calories, Bad Calories, Steven Pinker’s The Better Angels of Our Nature, Colin Wilson’s A Criminal History of Mankind, and Matt Ridley’s The Rational Optimist, and arguing with my own marginal notes. One can learn things arguing with oneself, and I’ve been known to change my mind based on things I scribbled in other people’s books ten or twelve years ago. (Before that I was too young to have anything like informed opinions.)

For example, I’ve gone back to calling it “global warming.” Climate is always changing, and the assumption that we know all the forces propelling those changes is just wrong–and in tribalist hands, willfully dishonest. Carbon dioxide has exactly one climate trick in its bag: It warms the atmosphere. That’s it. If the discussion is about carbon dioxide, it’s about global warming. Why climate changes is still so poorly understood (and so polluted by political hatred) that it may be decades before we even know what the major forcings are. In the meantime, I want predictions. If your model gives you climate data out fifty years, it will give you data out five. Publish those predictions. And if they prove wrong, be one of those people who really do #*%^*ing love science and admit it. Being wrong is how science works. Being political is how science dies.

I have a long-delayed electronics project back on the bench: Lee Hart’s CDP1802 Membership Card. I started it last summer, and set it aside when the Raspberry Pi gig turned up. It’s basically a COSMAC Elf in an Altoids tin. I had an Elf almost forty years ago. I programmed it in binary because that’s all there was in 1976. And y’know? I can still do it: F8 FF A2. F8 47 A5…
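Those bytes decode mechanically, too. A small illustrative sketch, covering only the two opcodes quoted above (on the 1802, F8 nn is LDI, load immediate into the D accumulator, and A0–AF are PLO R0–RF, store D into a register’s low byte); this is not his actual Elf program, just the quoted fragments:

```python
# Decode a tiny subset of CDP1802 opcodes: LDI (F8 nn) and PLO Rn (An).
def decode_1802(code):
    out, i = [], 0
    while i < len(code):
        op = code[i]
        if op == 0xF8:                      # LDI: load next byte into D
            out.append(f"LDI #{code[i + 1]:02X}")
            i += 2
        elif 0xA0 <= op <= 0xAF:            # PLO Rn: D -> low byte of Rn
            out.append(f"PLO R{op & 0x0F:X}")
            i += 1
        else:
            raise ValueError(f"opcode {op:02X} not handled in this sketch")
    return out

print(decode_1802([0xF8, 0xFF, 0xA2, 0xF8, 0x47, 0xA5]))
# ['LDI #FF', 'PLO R2', 'LDI #47', 'PLO R5']
```

In other words: put FF in the low byte of R2, then 47 in the low byte of R5, the sort of register setup that opened many an Elf program.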

Some things really are eternal.


An Ebook Piracy Mystery

For the most part, the ebook pirates have forgotten about me. Five or six years ago, I was all over the pirate sites. Now I’m not even on the Pirate Bay, and haven’t been for some time. Binsearch shows that the last time I was uploaded to Usenet was almost a year ago. It’s enough to give a guy a complex. (It’s certainly enough to make me feel like I need to write more books.)

So last week the backchannel sent me a link to an article about how several major textbook publishers have subpoenaed a couple of Usenet service providers demanding the identities of two prolific Usenet uploaders operating under the pseudonyms Rockhound57 and HockWards. Both upload technical books to a certain newsgroup devoted to technical nonfiction.

Boy, do they.

I fired up my newsreader and took a look. I’d been there before, and have gladly downloaded crufty scans of old Heathkit and classic tube gear manuals and the occasional supreme oddity, like the German-language service manual for the Nazi V-1 flying bomb. There are scans of military field manuals and much other odd junk, plus all the spam, trollery and asshattery we’ve been accustomed to seeing in newsgroups since, well, there were newsgroups. (I first got on Usenet in 1981.) Rockhound57’s posts are, for the most part, academic science books of almost vanishing narrowness. If you’re ever curious about Dipeptidyl Aminopeptidases in Health and Disease, well, Rockhound57’s got it. Ditto Automorphisms and Derivations of Associative Rings. I actually thought that “cobordism” in Algebraic Cobordism was a typo. Then I looked it up. Man, if you can make head or tail of that one, you’re a better geek than I.

If you think about what those books (and they are indeed books, and not articles) have in common, you may understand some panic on the part of the big presses: Those books have very, very small audiences and very, very high cover prices. Algebraic Cobordism has a cover price of $99. Small potatoes. Hold on to your manifolds: Automorphisms and Derivations of Associative Rings will cost you $269. I’m not exaggerating when I suggest that there are maybe 500 people on Earth who might conceivably buy such books, most of them starving graduate students. (I suspect that the publishers make what money they make selling to university libraries.) Having perfect PDFs flitting around on Usenet is an academic publisher’s worst nightmare.

But that leads us to a very important and completely unanswered question: Where did all those perfect PDFs come from? Not a single one of the titles I spot-checked is available as an ebook on Amazon. These copies are not slap-it-on-the-glass pirate scans. They are as perfect as the print images we used to generate for our books at Coriolis and Paraglyph. If they’re not being sold, how did the pirates get them to begin with?

I can think of a couple of possibilities:

  1. They’re DRM-stripped versions of e-texts that aren’t sold on Amazon but rather through heavily protected textbook sales channels like Adobe’s.
  2. They’re the print book equivalent of “screeners,” sent out for review, proofing, indexing, etc.
  3. They’re downsampled print images lifted somewhere along the pipe leading from the publisher to the printing presses.

My gut is going with #2, though #3 is certainly possible. Publishing services have been thoroughly commoditized. Most publishers use freelancers for proofing and editing, and many outsource layout itself. Any time a print image goes outside a publisher’s doors, there’s the chance it will “get legs.”

That said, I boggle at how many perfect PDFs were uploaded by those two chaps. We’re talking literally tens of thousands. Are there that many leaks at the major presses? Or is something else going on here? I still can’t quite figure it. I do know that a number of backchannel sources have told me that ever more file sharing is being done locally and off-Net, often by passing around now-cheap 1TB SATA hard drives. There’s no stopping that. Publishers need to start taking a very close look at their own internal processes. Pulling production back in-house might help, but it wouldn’t be a total solution, at least as long as desktops have USB ports. Problems don’t always have solutions, and piracy is probably one of those increasingly common nuisances.

There are times when I miss being in publishing. Alas, there are other times when I’m glad I’m not.


Novel Compression Schemes

I’ve been selling my writing professionally since I was an undergrad, now literally forty years ago. I’ve had to do remarkably little selling. My first story and first article both sold to the first places I sent them. I’ve never had a publisher turn down a computer book proposal. (Granted that selling books to a publisher you co-own is rarely a challenge.) My fiction has been a mixed bag, but in general a story either sells quickly or not at all.

All changed. This is the toughest market for novel-length SFF since, well, forever. I’ve just spent two years writing Ten Gentle Opportunities, and now the selling begins. This is a new thing for me. I’ve historically considered tireless self-promoters to be tiresome self-promoters, and now I are one. I hate to go that way, and if there were another way I’d already be taking it.

It begins this weekend, when I have a chance to pitch to a major SF publisher at the Pikes Peak Writers Conference. The pitch happens in a time slot literally eight minutes long. I have eight minutes to make a bleary editor hungry to read my book. No pressure.

The primary challenge is to summarize the novel in synopses of various sizes, from 5,000 words down to…140 characters. Various markets and agents prefer synopses of various sizes, so they’d all better be right there on the shelf, ready to go at a moment’s notice.

This is harder than it looks; nay, it’s diabolical. The story itself is insanely complicated to begin with: One of my beta testers described it as “a Marx Brothers movie with twice as many Marx Brothers.” That’s just how I write, as anyone who’s read The Cunning Blood will understand. I have a mortal fear of not giving my readers their money’s worth, and a venial fear of being boring.

The way to write synopses of five different lengths is to write each one from scratch, starting with the longest. In other words, don’t write the longest one and then try to cut it down to the next smaller size. This is like trying to turn hexacontane into propane by pulling carbon atoms randomly out of the middle; sooner or later the molecule has too many holes and falls apart.

It’s work, but it works. I finished the 300-word synopsis earlier this morning, and then set my hand to the gnarliest task of all: the “elevator pitch,” AKA logline. I get to summarize a manic 94,200-word story in 140 characters. I’ve actually been trying and failing to nail this for literally six months, since I finished the first draft. I first thought it would be easy, as I used to write cover copy for early Coriolis books. Heh.

The solution, as I said, is to start from the beginning. Each time I wrote a synopsis from scratch, I was forced to take two more steps up the ladder, and look down at the story from a little more height. You literally tell it again, each time with half the words you had last time. In the process, you get a clearer sense for what the story is about, and what the major themes are. Finally you end up with something you can say in an elevator between two adjacent floors:

A spellbender flees to our world with ten stolen nuggets of magic, and a crew of AIs helps him battle a repo spirit sent to retrieve him.
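As a rough arithmetic check on that ladder (the intermediate lengths here are my illustration, not anything from the post): halving the word count at each rewrite gets from a 5,000-word synopsis down to logline territory in well under a dozen passes.

```python
# Illustrative halving ladder: each synopsis is rewritten from scratch
# at roughly half the previous length, from 5,000 words down to a
# logline (~140 characters, i.e. roughly 25 words).
length = 5000
steps = [length]
while length > 25:
    length //= 2
    steps.append(length)

print(steps)                       # [5000, 2500, 1250, 625, 312, 156, 78, 39, 19]
print(f"{len(steps) - 1} rewrites")  # 8 rewrites
```

In practice you wouldn’t halve quite so mechanically; the point is only that the distance from full synopsis to elevator pitch is a short, finite ladder of complete retellings.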

Will this work? Dunno. I guess I’ll find out this weekend.

Why I Like Old Software

I still use Office 2000. I still use Visio 2000. I still use InDesign 2.0. I still use VMWare Workstation 5. Hell, I still use Windows XP. Am I lazy? Am I cheap? Am I nuts? No, no, and hell no. Every piece of software I use is the result of a calculated decision and a certain amount of research. I am by no means averse to paying for software, and I do so regularly. But I don’t always upgrade, especially if the upgrades cost money and/or deliver 80% of their value to the vendor. By that I mean software designed to win what I call “pip wars” (feature-comparison charts on review sites) or place new restrictions on installation and use ostensibly to limit piracy. (Mostly what anti-piracy features do is massage titanic corporate egos.) There are loads of people who will stand there drooling in wait of the next major release, money in hand, never suspecting that the largest single reason for the upgrade is to keep the revenue stream flowing.

The longer I use my Old, Old, Software, the better I like it. Here are a few reasons why:

  • It’s already paid for. The longer I use it, the more hours of use I get for my buck.
  • By and large, old software (at least pre-2002 or so) doesn’t activate. The benefits of activation flow entirely to the vendor, at least in circumstances where the benefits are not entirely imaginary. Most of the time, they are. Activation delivers nothing but annoyance and occasionally downtime to the end user–and in doing so, trains many otherwise honest users to be pirates.
  • Old software is smaller and faster on new machines. Bloat is real, even if it’s not the result of fighting pip wars somewhere. Office 2000 seems almost supersonic on my quadcore, doubly so when run from my new SSD.
  • Old software respects the skills I’ve developed over the years. Most of the changes I’ve seen across major upgrades are gratuitous, and don’t add any value over the old versions. UI changes in particular had better deliver spectacular new value, because while I learn them they slow me way down.
  • Tutorial books on old software can be had for almost nothing. I routinely buy books on early-to-mid-2000s software for $5 or so…books that had original cover prices in the $40-$50 range. Many of these books are unused remainder copies that are essentially new.
  • The arguments I hear when I make this point mostly boil down to one: Isn’t it eventually obsolete? That depends on what “obsolete” means. Backward compatibility is usually retained, because people rebel when it isn’t. (Windows 8? Are those peasants carrying registered torches and pitchforks?) The only significant thing that Word 2000 doesn’t do is handle docx files. I bought Atlantis to convert any docx files I might need to keep on hand. (Atlantis also creates extremely clean epubs.) Word 2000 is weak on PDF skills, so I bought PDF Xchange Pro to handle that, and as a bonus eliminate any need for the exploit farm we know as Adobe Reader.

I do upgrade software when I see a need. Windows XP eventually replaced Win2K here, even though it activates, because it had certain things I eventually judged useful. I’ve purchased InDesign three times, because I make money laying out books and the new features were useful, but I stopped when Adobe added their uniquely paranoid activation. (Interestingly, I haven’t felt any compelling need to upgrade since V2.0, and I’m interviewing Scribus.) I dumped Dreamweaver when I wanted to move my Web pages to CSS, because Komposer did CSS as well as I needed it to, for free and without activation. It pains me to say it, but with Delphi pricing now up in four figures (and encumbered by activation) I’ve moved all my Pascal programming to Lazarus 1.0.6.

This last issue is important. Open source has changed a great many things. I used to pay for email clients, including Eudora and Poco Mail. Since I discovered Thunderbird, I’ve stayed with Thunderbird. Why? Email is a mature technology. I’m not sure there’s much innovation left to be wrung from it. This is less true of Web browsers, and I now use Chrome most of the time. But man, what’s new in word processing? What? Lemme think for a second… Hang on, it’ll come to me…

This is a key point: The basic mechanisms of computing are mature. There has been time for the slower dev cycles of open source to catch up with commercial software. The action is out on the edges, in speech recognition, automated translation, vertical markets of many kinds, and niche-y mobile apps. We’re still seeing some useful evolution in Web browsers, but there’s damned little in releases of Office past 2000 that I find compelling. Most of the new features are UI tweaks and useless gimmicks.

Old software still has fizz: The best we could want already is!