When your own upper management can’t even accurately describe your project, you know you’re doomed.
A project was just getting off the ground several years ago. The senior manager responsible for the team developing the product decided to stoke the hype train by sending a message to the entire division. He laid out a number of detailed scenarios in which this new product would make your life so awesome that you’d wonder how you managed to survive without it for so long.
The scenarios and solutions laid out in the message struck me as rather strange. They weren’t even situations I considered to be problems. Consequently, the so-called solutions he proposed seemed unnecessary.
For example, this technology apparently kicked in when going out to dinner with friends at a restaurant: “Just by the act of walking into the restaurant, our technology controls the interactions with the restaurant for you. The technology is smart enough to build a joint menu for your group with choices you all like to eat.”
I neither needed nor wanted a computer to select dishes for me and my friends. Reading over the menu with my friends and having a back-and-forth discussion over what to order is just as much a part of the fun as eating with the group.
This technology reduced the highly social activity of eating at a restaurant with friends to an electronic transaction. (Maybe electrons enjoy meeting new electrons?) It removed the randomness that makes life interesting. Maybe I’ve never had the lemongrass chicken before, but I might decide to give it a try because I heard the song “The Smell of Lemongrass” on the radio, or I overheard the person at the next table order it for dinner.
Instead, the proposed technology “helpfully” removed lemongrass from the menu and steered me to the items it “knew” I would like. There would always be the suspicion that the restaurant was selfishly trying to steer my group to high-profit items, or perhaps the technology generated a menu with “special prices.”
Another scenario was when you visited a friend’s house. Upon entering the room, the music would automatically change to accommodate your musical tastes, and remain that way for the duration of your visit. (I thought I would stop over to do some knitting, or perhaps spend some time disassembling and reassembling her toaster just to see how it works.) The technology would delete all electronic music from the playlist. In that controlled world, you’d never discover that some Pet Shop Boys songs aren’t all that bad.
Every revolution has its holdouts. I guess I was one of the holdouts. Later on, the project was canceled. Everybody went about their business like they always did. If people ever found themselves in a situation described in the original message, they didn’t seem to mind. If they did mind, they just dealt with it the same way they’d been dealing with it for decades: collectively discussing menu choices at restaurants and risking being exposed to different types of music when they visited a friend.
Many years passed, and for some reason, I mentioned the bizarre dystopian universe promised by that project from days gone by. It just so happened that one of the people involved in the conversation worked on that project and took great offense at my characterization. “That’s absolutely nothing like what we were trying to do,” he said.
According to him, they had no intention of having their technology take over my interaction with a restaurant or censor the music I heard. Those scenarios were a complete fabrication, and everybody had gotten in a tizzy about nothing. My now-exasperated colleague continued, “I don’t know why everybody thought we were doing that.”
I responded, “We thought you were doing that because that’s what your very own upper management told everybody you were doing.”
You know your project is doomed when your own senior management can’t accurately describe it to anyone.
Raymond Chen’s Web site, The Old New Thing, and identically titled book (Addison-Wesley, 2007) deal with Windows history and Win32 programming. It’s not intended to diagnose, treat, cure or prevent any disease.