Jeff Duntemann's Contrapositive Diary

Ideas & Analysis

Contrarian Wisdom: Butter

Still on my several-day antihistamine high, but this short entry might be useful:

One of the odder things I uncovered during my ongoing research on the Carb Wars was the divide in American thought on whether butter needs to be refrigerated. You’d think something that simple and that wide, er, spread would have a simple yes/no answer that everybody accepted.

Not so: See the Yahoo Answers forum on the question, “Does Real Butter Spoil?” Almost as many people thought that butter left out even overnight would spoil as thought it could be left out at room temperature for some time.

The first answer on Yahoo Answers is correct: You can leave butter out for weeks and it won’t spoil. I don’t know precisely how long butter will last at room temperature, but it’s at least six weeks. I know that because on one of our trips to Chicago, Carol and I forgot to put the butter in the fridge before we left. When we got back a month and a half later it was fine, and we finished it.

In truth, we weren’t worried. 35 years ago, Carol’s grad school roommate was an Iowa farm girl, and Connie simply left the butter out in a covered dish in the middle of the kitchen table. It never went bad. Seeing (and tasting) is believing, and ever since then our butter has lived on the kitchen counter unless we knew we’d be away for a week or more.

I could never quite understand the confusion (nor the product category of “spreadable butter”) until I read Barry Groves’ uneven but worthwhile jeremiad Trick and Treat, which describes how margarine mostly replaced butter in American households after WWII, at first because it was cheap, and later because it was supposedly healthier. Margarine does go rancid if left at room temperature for more than a day or two, and in time margarine’s conventional wisdom replaced butter’s.

Butter is always spreadable unless you stick it in the fridge. And it makes almost anything taste better. Best of all, it isn’t shot full of chemistry-set goodies, like (of all things) nickel. (See this shrill but scary description of how margarine is made, and from what. They’re not exaggerating; I’ve seen similar descriptions elsewhere.)

Make peace with butter. The science that condemned it was weak, and little by little it’s being exonerated by more recent (and more honest) research.

The Biggest American Place You’ve Never Been

I’m still too groggy to tackle anything cerebral in this space (and it’s uncouth to let your nose drip down into your expensive buckling-spring keyboard) but this self-directed question came up while I was pondering the life I had documented versus the life I remembered: What was the largest American city I had never set foot in?

My guess going in was Minneapolis/St. Paul, but I was wrong, and wrong by a lot. Working from this list of the largest American cities by population, I discovered that the most populous American city I had never visited was…Jacksonville, Florida.

WTF?

Jacksonville is not only larger than either Minneapolis or St. Paul, it’s larger than both of them put together, by almost 150,000 people. I didn’t know that. Nor did I think that Indianapolis was as big as it is (800,000) nor Pittsburgh as small (310,000).

My top three were Jacksonville, Memphis, and El Paso, assuming you don’t treat Minneapolis and St. Paul as a unit. (In practical terms, most people do.) If you do consider them as a unit, the Twin Cities come in third, narrowly ahead of El Paso. Of the top 100 cities listed, I had visited 66. I guess I’m not as much of a hermit as I thought I was, though I’ll admit it took me 57 years to get there.

It’s an interesting exercise, and if any of you are inclined to do the test yourself, I’d be interested in seeing your results in the comments. Now I need to pop another decongestant and lie down again, so that I can get back to real work (of which there is much) tomorrow.

The Impersistence of Memory, Part 1

The other day, I had dinner with my high school locker partner and college friend Tom Barounis. He handed me something that he had found among his own things: a college-era non-SF story manuscript of mine, a typewritten original and not a Xerox copy, complete with comments by an unknown third party who sounded like a college prof. On the back of the last sheet, in my own distinctive block printing, was the date: 4/30/72.

There were two things wrong with this: 1) I don’t remember having my Selectric typewriter in the spring of 1972, and 2) I don’t remember writing the story itself.

Point 1 is checkable. I used to date typewritten manuscripts, and I have two moving boxes full of them back home, so as soon as we get back to Colorado I can haul out my writer’s trunk and see when exactly I made the transition from Smith-Corona to IBM. I recall it being a year later, as I was ramping up for the Clarion SF workshop, which I attended in the summer of 1973.

Point 2 is more peculiar. I vaguely remember writing a story with that title, but the story I remember writing was nothing like the story I read last night, for probably the first time in 37 years. I know what probably happened with the manuscript: After getting it back from the prof I wrote it for, I passed it on to Tom to read, and it has remained with him ever since the spring of 1972.

But why do I remember the story being about something else entirely?

I remember the story being a failed experiment, about two (male) friends who experience a physical attraction between them and don’t know how to deal with it. Instead, it was about two male friends stressing about the draft lottery, and how one of them runs to Canada when he pulls number 5. Furthermore, it was not a failure but a pretty decent story, considering that I only wrote “mainstream” (non-SF) fiction with a gun to my head in those days. (I’d even consider sending it out for publication, except that I don’t think anybody remembers what the draft lotteries were about anymore.)

It’s a headscratcher. It’s also the latest in a series of headscratchers that have turned up here and there as I’ve grown older, and have realized that a growing number of things that I remember happening did not happen anything like the way I remember them. Some did not, in fact, happen at all. I’ve begun to wonder what other memory holes are waiting for me to discover, and how much the life that I remember living resembles the life that I actually lived.

More in coming days.

Big Brother’s Ebooks

An interesting thing happened the other day: People turned on their Kindles to discover that several books they had purchased were just…gone. Amazon had without warning or explanation reached down the devices’ Whispernet connections and wiped all traces of the books, which were by George Orwell. I’m not sure anyone has ever spelled “irony” more clearly than this.

Amazon refunded the full price of all books to all those who had purchased them, of course, or this would have been theft. (Many think, with some justification, that it was still theft.) Yea, the world of Copyright Deathwish is getting stranger all the time.

What I find intriguing is that there are two versions of the story out there:

  1. The rightsholders of the books changed their minds and decided they didn’t want ebook editions on the market, and demanded that Amazon pull them.
  2. The people who licensed the ebook editions to Amazon did not have the right to do so.

Story #1, if true, reflects badly on both Amazon and the Orwell rightsholders. Books are published under contract, and if the author/rightsholder can negate a contract simply by changing his mind, it wasn’t much of a contract. On the other hand, if Amazon won’t hold a rightsholder to the terms of a contract, Amazon isn’t much of a publisher.

Story #2, if true (and I think it’s more likely), reflects badly on copyright law as we have it here in the US. It’s entirely possible that Amazon did what it considered due diligence on the purported rightsholders and decided that they were legitimate. Alas, US copyright law makes it diabolically difficult (and in many cases, simply impossible) to determine who the legal rightsholders to a work actually are. Rights change hands all the time, especially for popular works that have been around for a few decades, and doubly so for works by authors now deceased. Someone who once had rights to a work may not currently have them, or the rights may have been divided by medium, or the rights may be under dispute between heirs and former licensors, or among the heirs themselves. Michael Jackson bought the rights to the Beatles’ canon in the US years ago; those rights are now “in play,” as they say.

The core of the problem is that there is no public record of ownership for copyrights, as there is for “real” property, like land or even cars. And in today’s environment of cheap server space, there’s no reason for that to be true. It should be possible to trace ownership of IP from the date it was registered down to the current day, with a legal requirement that changes in ownership be recorded, for copyright to be enforceable. There should be no ambiguity whatsoever about who owns what works in what media, and that record should be available to the general public. As long as it is not, incidents like this will continue to occur.
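The record imagined above, a public, append-only chain of title per work and per medium, can be sketched in a few lines. Everything here is hypothetical (the party names, dates, and field layout are mine, not any real registry's); it's only meant to show the shape of the idea:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Transfer:
    """One recorded change of hands for one bundle of rights."""
    work: str
    medium: str      # e.g. "ebook", "print", "audio"
    from_party: str
    to_party: str
    recorded: date

# Hypothetical chain of title for the ebook rights to one work:
chain = [
    Transfer("1984", "ebook", "Orwell Estate", "Licensor A", date(1995, 3, 1)),
    Transfer("1984", "ebook", "Licensor A", "Licensor B", date(2004, 6, 15)),
]

def current_owner(chain, work, medium):
    """The last recorded transfer for a work/medium pair wins."""
    owners = [t.to_party for t in chain if t.work == work and t.medium == medium]
    return owners[-1] if owners else None

print(current_owner(chain, "1984", "ebook"))  # Licensor B
```

With a legal requirement that every transfer be recorded before the rights are enforceable, the "who actually owns this?" question becomes a single lookup instead of a lawsuit.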

Amazon has pledged that they won’t do this again, but the damage has been done, both to Amazon’s Kindle system and to the idea of copyright itself. People who bought and paid for a book in good faith had that book taken away by copyright holders without notice or explanation. It may have been legal in the narrowest sense of “legal,” but that doesn’t matter. The incident adds yet another brick to a growing edifice of public opinion seeing copyright holders as arrogant, greedy bullies who can harass individuals on little or no evidence, and take back what they’ve offered to the public on a whim. Whether the perception is true or not (and to what degree) doesn’t really matter. Copyright, especially in an era of fast pipes and massive electronic storage, operates primarily on the honor system, which requires honor on both sides, and a legal framework making it possible for that honor to flourish. No honor, no copyright–and we’re much farther down that road than most people think.

The Twilight of the Ad Era

I made a decision late last year without saying much about it: I won’t be using AdSense ads anymore. Now, I’m not going to remove them from existing pages, and I’m not going to shut my account down, but as you might have noticed if you’ve perused my articles over on junkbox.com, my new layouts do not contain ads.

There’s not a lot of point. The curve is heading in the wrong direction.

When I first used AdSense in 2006, my goal was to bring in a dollar a day on average, and I either met or beat that for the rest of 2006 and the first few months of 2007. After May, 2007 things went into a slow decline. My page impressions grew slowly, but revenue slumped, and over 2007 I averaged only 85c per day, which is still worth pursuing. Across 2008 I was averaging only 61c per day, even though page impressions were higher than they had ever been. People just seemed to stop clicking on ads. (“Ad-numb” is a coinage that I’ll offer here if no one else has.) 2009 has earned me an average of about 20c per day, and that’s really not enough to warrant the effort of designing ad spaces into my layouts, especially if it’ll be down to 10c per day next year.
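Strung together, those daily averages make the slope obvious. A quick back-of-the-envelope, using the per-day figures from the paragraph above and treating each year as 365 days:

```python
# Daily AdSense averages from the text, converted to rough yearly totals.
daily_avg = {"2006": 1.00, "2007": 0.85, "2008": 0.61, "2009": 0.20}
yearly = {year: round(avg * 365, 2) for year, avg in daily_avg.items()}
print(yearly)  # roughly $365 -> $310 -> $223 -> $73
```

Each year brings in about two-thirds of the year before; extrapolate that curve and the "design ad spaces into the layout" effort stops paying for itself very quickly.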

An interesting thing has happened over the course of 2009 so far: Google-tracked page impressions have plunged, even though my overall page hits continue to climb. Some of this is doubtless the rearrangement of my Web content that I began last fall, but it was also true for individual pages (like my Homebrew Radio Gallery) that had not changed significantly since 2006. Daily AdSense page impressions for that single page were always up in the high 30s to low 50s, and are now down to 15-20 tops.

I didn’t start doing anything differently. I’ve never worked at building traffic to my site, and in fact the only way AdSense makes sense to me is if you don’t have to screw with it. Spending time and effort trying to drum up traffic for the sake of ad clicks is time and effort I can’t spend researching and writing new articles (or heaven knows, fiction) so I’ve never bothered.

I think I know what happened: Malware delivered from Web ads has gotten enough publicity that people in large numbers are starting to install ad blockers. This is the only way I can reconcile imploding AdSense page impressions with steadily growing traffic to my site as a whole. Google only counts a page impression when an ad is served; block the ads, and viewing the page does not generate an AdSense page impression.

I’ve never used an ad blocker before, and it was eerie surfing around using the Iron browser, which blocks ads from a huge number of major ad sites (including AdSense) by default. Eerie–and fast. Malware isn’t the only issue with Web ads: Overloaded ad servers slow down page render time, sometimes hugely. This is not new news, but until I saw it myself I couldn’t appreciate the scale of the problem. Iron may not be intrinsically faster than IE or FF, but it looks faster because it doesn’t wait on ads.

Blocking ads still makes my conscience twinge a little; here is an interesting discussion on whether it’s wrong to block Web ads. The tipping issue is malware: If all it costs me is time to render your ads, then that may be the cost of viewing your pages. But if there is some significant chance that your ads are serving malware (whether you knew anything about it or not) I feel that I have a right to protect my system and my network. Remember that I can’t tell if your site even has ads before I go there, and if your ads serve malware, my system gets nailed faster than I can back out. The only way I can reliably protect myself against ad-served malware is to block ads entirely, so until each browser instance is a thoroughly isolated VM, there’s no other way.

Thus fades the Great Hope of “free” content supported by ads. What replaces it is obscure. One barely hears the term “micropayments” anymore, and those sites that have retreated behind paywalls don’t seem to be doing well. Among the pubs I read, The Atlantic Online dropped its paywall last year, and the only paywalled site I still read is The Wall Street Journal. Money does need to be involved somehow: I write better material when I get paid for it, and when I pay for material, I have higher standards for it than for what’s lying around free. That being the case, I intuit that a paid Web would be a smaller but far more useful thing than a free Web groaning under the weight of pages (you see them all the time) that exist solely to serve ads. Still, I’ll be damned if I can see the way there.

The Cloud in Your Pocket

We’ve been getting rained on a lot this week, in more ways than one. Carol’s garden is going gangbusters, and I’ve never seen an explosion of wildflowers along my accustomed hiking paths as I’m seeing right now. There’s a bee shortage somewhere in the country, I’ve heard, but the little buggers are thronging the wildflowers here. Temps are deliciously cool, for June, which seems to be a trend this year.

On the flipside, some dorks broke into my hosting directory a few days ago and inserted porn spam links into all my static HTML. They tried to modify the PHP in my instances of the Gallery photo manager for purposes unclear, but Gallery stopped working and I had to delete both instances. (I have backups of all the photos and captions and will reinstall as time permits.)

That whole adventure happened while I was on deadline reading copyedits on the first five chapters, and it did not endear me to cloud computing. I’ve had some time to think about the whole sorry mess, and some larger questions arise:

  1. How do we keep crap like that from happening? (This is a mostly rhetorical question; I’m not sure that we can.)
  2. Apart from portability (i.e., accessing your data on the road) what’s the real value-add in cloud computing? Remember to figure in the cost-benefit of having to find and sometimes pay dearly for a broadband connection to use it.
  3. And if portability is the only value-add, why screw with something as inherently pricey and dicey as the Cloud?

Why not put the Cloud in your pocket?

I just ordered my very first 32 GB thumb drive. I skipped the 16 GB size entirely, because my trusty and much-missed 2001 ThinkPad X21 had a 32 GB hard drive, and I never filled it up. It contained all my major apps (Word, Excel, PowerPoint, MapPoint, InDesign) and all my Internet apps, plus a scattering of smaller utilities. It also contained a great deal of data, including everything I had written electronically since 1979, though in truth much of the bulk lay in MP3s. Document files are remarkably small.

One of the most significant trends of the last two years is the explosion in “portable” apps, meaning software that does not require a formal installation process beyond unzipping it into a directory. Nothing goes into the Registry, nor into \windows\system32. The whole app lives in a single folder. What a brilliant idea! (Wait…all software used to be like that…)

There have always been portable apps, but for the last fifteen years or so it’s been seen as declasse to produce them. Why? Think for a second: Once installed, a conventional Windows app can’t simply be lifted out of its folder and copied to another machine. It was one of the earliest forms of stealth DRM, invented, I suspect, specifically to keep MS Office from wandering.

No more. There are now lots of portable packages, many or most of them completely free. See PortableApps and 100 Portable Apps for Your USB Stick. You can get OpenOffice, the Gimp, Thunderbird, Firefox, Kompozer, and just about anything else you might need in portable versions. You can unzip them into directories on a thumb drive, and execute them from File Manager. (There are also portable app managers like CodySafe that give you a separate UI for your portables and stick data.) Portables run like conventional Windows apps, except that they don’t crap in your machine.

I got into portable apps while thinking about degunking for Windows. The Registry is Gunk Central, and much havoc is caused by duelling and mis-versioned DLLs dropped like softball-sized hailstones into the system32 directory. When I got my new Core 2 Quad last summer, I resolved to install only what conventional apps I absolutely needed, and use portable apps for everything that I could. The results? I have a cleaner-running machine that boots fast and has a remarkable lack of line items in Task Manager’s Processes tab. I’ve tried to stick with FOSS apps, because commercial apps are always down there in your taskbar popping up nag balloons, trying to upsell you or force updates down your throat.

It’s worked very well. What I want to try is having a single largish thumb drive containing not only data but also the programs used to manage it. Other people have been doing this for years, and it’s time I gave it a try. In the meantime, my view of the Cloud cooks down to this:

Take from the Cloud what can only be had from the Cloud–and keep the rest in your pocket.

Marvell’s SheevaPlug

Two years ago, I discovered PowerLine networking and have used it ever since, first to cover a CAT5E “dead spot” in my Colorado house, and more recently to finesse Wi-Fi outages at Carol’s sister’s house. The Linksys PLE200 Ethernet bridges work fantastically well within our house, and have sufficient bandwidth to stream HD video. With one unit near my router downstairs, I can take the other unit and plug in to the Internet anywhere in the house where there’s a power outlet, and there are power outlets every six or eight feet on every wall in the place. So whereas it’s not quite Internet Anywhere, it’s pretty damned close.

I remember thinking with a smirk back when I first got the units that it wouldn’t be too long before somebody made Linux run on it. And suppose somebody did? What would be the use of that?

Well, a use occurred to me a few months later, though it wasn’t anything I felt like discussing at the time. But this morning I saw something on Slashdot that made me change my mind. It’s the Marvell SheevaPlug. It’s a 1.2 GHz ARM-based Linux box in a wall wart, and bears a striking resemblance to the various PowerLine brick bridges that I’ve seen in the last few years. It’s got a gigabit Ethernet port and a USB 2.0 connector, but no other interfaces. You talk to it through the Ethernet port, and can use the USB port for external mass storage or whatever. It takes its power from the wall outlet it’s plugged into. It’s only missing one (obvious) thing: PowerLine connectivity.

One of these plugged into a wall is cute, but not a major win. Equipping them with jelly-bean PowerLine logic changes everything: One plugged into a wall downstairs with a terabyte hard drive on the USB port, and three or four plugged into the wall upstairs acting as USB peripherals to computers, and you’ve got a media distribution system for cheap, with no dependence on CAT5 or even Wi-Fi. You can do that now via Wi-Fi, piecing together a system from components. Products based on the SheevaPlug (which is actually an OEM-able hardware platform) already allow this, with more or less kafeuthering, usually more. (See HipServ and PogoPlug.) My take is that if the idea is in fact to make a cheap and simple media distribution component for home use, PowerLine is a no-brainer.

The SheevaPlug does not have PowerLine connectivity, but someday it or something like it will. And a cheap (in my view, ~$80) implementation could turn an entire hotel into a LAN party–a LAN party where nobody knows precisely who or where anybody else is.

I’m not sure if that’s important to gamers or not. I’m not a gamer and have never been to a LAN party. I have read online, however, that there are LAN parties at which the games are almost a secondary attraction, behind the unusual ability to share files at high speeds with few (if any) concerns about Big Media’s enforcers. At public LAN parties, it’s always possible that the MPAA could plant a mole at the party. But if everybody’s sitting quietly in their hotel rooms either gaming or sharing files (or both) any moles tuning in with their own Sheevas would have a hard time knowing whom to call the cops on. Unlike Wi-Fi, it’s hard to get a directional fix on a PowerLine node, and without routable IP addresses, there’s no way to connect a node to a particular person.

This may or may not be technically feasible; it’s an SF concept for me, and I have a couple of story ideas that follow from it or at least make use of it. Much depends on how hotels are actually wired–and if something like this catches on, I’m guessing that they’ll begin soldering a low-pass filter on the 110v feed to every room.

But in the meantime, it’s cool to see my long-time prediction that computers will eventually become bulges between peripherals moving toward actualization. (I did not guess that computers would become their own wall-warts.) And there’s much more to say about what I call “backnets,” which are networks that happen in unexpected ways, often parasitically on other connections. Backnets may be the third coming of pirate radio, in which tweaking the Man is often more important than accomplishing anything useful. (Is there any fiction about pirate radio out there that you know of? Drop titles in the comments if you’ve got any. Thanks!)

An Outrageous Experiment, Part 3

(Continued from yesterday’s entry; the series began on 1/25/2009.)

Recapping: After losing five pounds by not eating Cheerios every morning for breakfast, I tried replacing those calories with protein and fat, to see if those five pounds would return. I deliberately ate more to see if I could accelerate the process, but what I ate more of was limited to eggs, meat, and cheese. It backfired, and I lost two more pounds in ten days.

When I told Carol on the phone that I was down to 148, she told me to knock it off and go back to my Cheerios. So on the 11th day I called a halt to the experiment. Most of the meat and cheese was gone by then, and I’d had to get another dozen eggs and more yogurt. But I started cooking carbs again: primarily rice, and some conventional pasta. Since I was still batching it, I did weird things like having a bowl of Cheerios as my carb course at supper, next to a yummy plate full of formerly frozen shrimp and a side of creamy cole slaw.

That was only about a week ago, and as of this morning, stark naked and dripping wet, I weighed 151. It only took a week of slamming carbs again to gain three pounds. Carol got home last night. I’m a much happier guy, and will be returning to eating like a real human being. The only long-term change is that I’m having an egg for breakfast instead of Cheerios. Keeping my edge all morning has been delicious.

This experience didn’t surprise me too much. I’ve run into the effect before, although I never had the opportunity to do anything quite this gonzo to test it. Back when I was in college, I weighed about 125 pounds and was mostly skin and bones. Over the years I gradually put on weight, as people do. By the time I was 45 I weighed 170, and Carol told me that I was starting to look several months’ pregnant. Then something interesting happened: I threw a bad kidney stone, which forced me to stop drinking three or four Snapple bottled sweetened iced teas every day. I stopped drinking anything but water while the stone was being analyzed, and I lost several pounds almost immediately. This intrigued me, and when I started drinking sodas again, I drank only diet. The weight stayed off, and started drifting slowly downward. (None of this is news to my long-time readers.)

The next event happened a year or so later when I stopped eating rice bowls down at the corner for lunch every day. I switched to sandwiches or pizza (and no longer ate a softball-sized wad of white rice on a daily basis) and lost another slug of weight very quickly. My weight since then has wandered between 155 and 160. Once I started weight training in 2004, it drifted down to 155 and has been remarkably consistent since then…until last summer, when I stopped eating Cheerios for breakfast.

And now the experiment is over. So…what did I learn? Mostly, this: The conventional wisdom of Fat Bad, Carbs Good is not unassailable, and the whole business is hugely more complex than most people think. It’s not an issue of thermodynamics, as far too many people believe. We do not “burn” calories in the same sense that we burn leaves out in the alley. Metabolism is an enormously complex biological mechanism, one that we still don’t understand as well as we should–or even as well as we think we do.

I was certainly struck by this: Changes happened a lot more quickly than our conventional understanding of calories and weight gain/loss would explain. If it were simply a matter of wadding on weight when we eat more than we burn, or losing weight when we burn more than we eat, it should take a lot longer. A pound, after all, represents 3,500 calories, and my intake deltas were nowhere near large enough to account for the changes I saw as quickly as I saw them, both on the downswing and on the upswing. I’m aware from my reading of the tendency to shed water on low-carb diets. I took care to drink more water than I generally do, and did not notice myself losing any more than usual. Something else must be going on, and while I’m still researching it, I think the answers may lie in a book I read almost by accident a month ago, a book that triggered this whole crazy idea.
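To put numbers on that: the 3,500-calories-per-pound rule of thumb makes the simple calories-in/calories-out model easy to check. A minimal sketch, using a hypothetical 500-calorie daily surplus that is far larger than my actual intake deltas (which were close to a wash):

```python
KCAL_PER_POUND = 3500  # the textbook rule of thumb

def days_for_change(pounds, daily_delta_kcal):
    """Days the simple thermodynamic model predicts for a given weight change."""
    return pounds * KCAL_PER_POUND / daily_delta_kcal

# Even a generous 500 kcal/day surplus predicts three weeks for three pounds:
print(days_for_change(3, 500))  # 21.0 days -- not the single week I observed
```

The observed swings, in both directions, came in several times faster than even that generous assumption allows, which is exactly why the simple model looks inadequate.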

(To be continued as soon as I can manage it.)

An Outrageous Experiment, Part 2

Recapping Part 1 of this series, yesterday: Back in the summer of 2008 I stopped eating a bowl of Cheerios every morning, to see if I could avoid the “fuzzy” feeling that commenced half an hour after breakfast and lingered for an hour and sometimes longer. Within three weeks, I had lost five pounds. I also lost the fuzzy feeling.

I found this intriguing, since it meshed with a few other things that had happened years earlier when my diet changed abruptly for some reason. (I’ll save the deep history for Part 3.) I read a few books, some of which I will review in the near future. There is a very old and very contrarian position in the health field to the effect that if you eat more carbs, you gain weight, and if you eat fewer carbs, you lose weight. This seemed to be the case with me, though all the data that I could find had been gathered in the treatment of overweight people. I was not and had never been significantly overweight. (I have never weighed more than 170.) It was a head-scratcher, and the question would have remained purely academic, except that we have known since last fall that Carol was going to be in Chicago for two or three weeks in January. I was going to be cooking for myself and eating alone all that time.

Hmmm.

I had lost weight by dropping one daily bowl of Cheerios from my diet. The hypothesis was obvious: Suppose I replaced the calories represented by a bowl of Cheerios with an equivalent number of calories, but from protein and fat. Would I gain the weight back?

I went shopping on the way home from dropping Carol off at the Denver airport. I bought more almonds. I bought a dozen extra-large eggs. I bought lots of cold meat, cube steaks, bratwurst, and frozen shrimp. And cheese, wow: sliced Havarti, a wedge of Romano, grated Parmesan, and a package of those appallingly delicious artificial Swiss-flavored cheese slice substitutes. I bought a big container of creamy cole slaw. I bought several cups of Greek-style high-fat yogurt. I bought a pint of table cream for my coffee. As a coup de gras (heh) I bought half a pound of bacon.

Slightly daunted by all that unapologetic fat, I screwed up my courage, and I ate.

Now, a largish bowl of Cheerios with a half cup of 2% milk represents about 150 calories. An extra-large egg is 85 calories; frying it in butter brings it up to a little over 90. A tablespoon of cream for my cafe au lait is another 29 calories. 3 oz of Greek-style yogurt gave me 115 calories, over about 85 for the light yogurt I had been eating before, for a calorie delta of 30. It was close to a wash: 150 before, 150 after.
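For anyone checking my arithmetic, the breakfast swap tallies like this (all figures from the paragraph above):

```python
# The old breakfast versus the new, in calories.
before = 150               # bowl of Cheerios + half cup of 2% milk
egg_fried_in_butter = 90   # 85 kcal egg, a bit over 90 once fried
cream_for_coffee = 29      # one tablespoon, for the cafe au lait
yogurt_delta = 115 - 85    # Greek-style yogurt over the old light yogurt
after = egg_fried_in_butter + cream_for_coffee + yogurt_delta
print(before, after)  # 150 149 -- close to a wash
```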

That was breakfast. For lunch I had cold meat and cheese and occasionally an egg, and every couple of days, two strips of bacon. I did a lot of interesting things with the raw materials: I made a handcrafted Bacon Cheese Egg McMuffin. I made a new sort of ham and cheese sandwich, by sandwiching two slices of ham between two slices of Havarti cheese. I did not cut out carbs completely–I like them too much–but one Bays English muffin was it for lunch. For dinner I typically had a cube steak fried in walnut oil, another 3 oz of Greek-style yogurt with blueberries, some Romano cheese, and maybe a few Wheat Thins.

I made buffalo spaghetti sauce, enough for several nights, and served it over whole-wheat capellini. When I didn’t feel like cooking, I just thawed some shrimp and went nuts.

What I did not eat was sugar or refined carbs. I read labels like I generally read only SF, history, and theology, taking notes. There’s at least a smidge of sugar in almost everything, but if it was high-fructose corn syrup, I put it back on the shelf. I had no desserts, and I left the last two boxes of Christmas cookies in the pantry, unopened. I did not eat any potato chips. I did not eat any rice. I did not eat any white pasta. The only bread I ate was from a package of cracked-wheat bratwurst rolls. When I snacked at all, it was on dry roasted almonds.

I did not scrimp. I ate as much as I wanted; in fact, to accelerate the process (given that I only had a little over two weeks to regain my five pounds) I ate as much as I could stand. I probably ate about 25% less in terms of carbs than I generally do, but I ate a lot more protein and fat. I did not change my exercise regimen.

After ten days of this, I tallied the results: I felt great. I was never hungry.

And I had lost two more pounds. Oh, dear. If I wasn’t careful, I would be burned at the steak for heresy.

(To be continued tomorrow.)

An Outrageous Experiment, Part 1

Carol’s coming home tomorrow, finally, after two and a half weeks in Chicago helping her mom. This was nothing sudden, and I had had a crazy idea in reserve, at which I hinted in my 2009 plan file, which I posted on New Year’s Eve. Some of you mailed me, puzzled, about this item:

  • Eat Less Sugar. Eat More Meat. Lose More Weight. (More on this shortly.)

One woman, whom I’ve known for a number of years, scolded me: “You’re crazy! You don’t need to lose any weight!”

That’s true. I do not need to lose any weight. However, when I do lose weight, I damned well want to know why.

Ok. There is some backstory that I haven’t given you yet. This may take me a couple of days to get through, but I think it’s important. So let’s get underway.

For a number of years now, I’ve weighed 155, and I consider that my ideal weight. I’m 5’9″ tall and lightly built. My blood chemistry is good and I have no major health problems. I walk regularly, and do weight training once a week. This has been my regimen (such as it is) since we moved to Colorado in 2003.

My customary breakfast all this time has been a bowl of Cheerios in 2% milk, and half of a 6 oz cup of fat-free, low-sugar “light” yogurt, mixed with organic blueberries. (The organic is incidental. I don’t care how they were grown; they just taste better.) I’m used to a certain period of muzziness that follows breakfast, and assumed it was just my blood rushing to my stomach. Morning is my productive time for writing, and my post-breakfast fuzzies slowed me down. I resented that, but I considered it inevitable until I read something online about the phenomenon. Eating carbs for breakfast will do that to you. Hmmm. So some months back, I just stopped eating Cheerios in the morning, hoping that I would be mentally sharper until lunch. And wham! It worked. I got a little hungry at 10:30 AM, but I did not lose my edge after breakfast. I was writing more, and better, from 7 AM all the way until noon. So I bought dry-roasted almonds to snack on mid-morning, and kept to the regimen.

Well, something else happened: In about three weeks, I lost five pounds.

I did not think that I had five pounds to lose, but I shed another inch of waistline, and had to punch another hole in a couple of my belts. Carol told me she wanted me back at 155. However, I am unwilling to lose my morning edge. It was a bit of a conundrum, but I knew that, come January, I would be batching it again for almost three weeks, eating alone. So a totally outrageous experiment suggested itself…

More tomorrow.