A computer is a general-purpose machine with which we engage to do some of our deepest thinking and analyzing. This tool brings with it assumptions about structuredness, about defined interfaces being better. Computers abhor error.
Internet voting is surely coming. Though online ballots cannot be made secure, though the problems of voter authentication and privacy will remain unsolvable, I suspect we'll go ahead and do it anyway.
With code, what it means is what it does. It doesn't express, not really. It's a very bounded conversation. And writing is not bounded. That's what's hard about it.
Human thinking can skip over a great deal, leap over small misunderstandings, can contain ifs and buts in untroubled corners of the mind. But the machine has no corners. Despite all the attempts to see the computer as a brain, the machine has no foreground or background.
If you've ever watched someone who is a mother talk on the phone, feed the dog, bounce the baby, it's just astounding to see someone manage, more or less well, to do all those things. But on a computer, multitasking is really binary. The task is either in the foreground, or it's not.
Before the advent of the Web, if you wanted to sustain a belief in far-fetched ideas, you had to go out into the desert, or live on a compound in the mountains, or move from one badly furnished room to another in a series of safe houses.
I think technical people now should learn literature, because literature teaches you a great deal about the depths and variety of human imagination.
Multitasking, throughput, efficiency - these are excellent machine concepts, useful in the design of computer systems. But are they principles that nurture human thought and imagination?
So many people for so many years have promoted technology as the answer to everything. The economy wasn't growing: technology. Poor people: technology. Illness: technology. As if, somehow, technology in and of itself would be a solution. Yet machine values are not always human values.
My mother told me that my birth mother got pregnant by a married man who didn't want to leave his wife.
Y2K is showing everyone what technical people have been dealing with for years: the complex, muddled, bug-bitten systems we all depend on, and their nasty tendency toward the occasional disaster.
The ability to 'multitask,' to switch rapidly among many competing focuses of attention, has become the hallmark of a successful citizen of the 21st century.
I hate the new word processors that want to tell you, as you're typing, that you made a mistake. I have to turn off all that crap. It's like, shut up - I'm thinking now. I will worry about that sort of error later. I'm a human being. I can still read this, even though it's wrong. You stupid machine, the fact that you can't is irrelevant to me.
It had to happen to me sometime: sooner or later, I would have to lose sight of the cutting edge. That moment every technical person fears - the fall into knowledge exhaustion, obsolescence, techno-fuddy-duddyism - there was no reason to think I could escape it forever.
No one in the government is seriously penalized when Social Security numbers are stolen and misused; only the number-holders suffer.
What happens to people like myself, who have been involved with computing for a long time, is that you begin to see how many of the 'new' ideas are simply old ones coming back into view on the swing of the pendulum, with new and faster hardware to back it up.
The web is just another stunning point in the two-hundred-thousand-year history of human beings on earth. The taming of fire; the discovery of penicillin; the publication of 'Jane Eyre' - add anything you like.
With all the attention given to the personal computer, it's hard to remember that other companion machine in the room - the printer.
I used to pass by a large computer system with the feeling that it represented the summed-up knowledge of human beings. It reassured me to think of all those programs as a kind of library in which our understanding of the world was recorded in intricate and exquisite detail.
Reading code is like reading all things written: You have to scribble, make a mess, remind yourself that the work comes to you through trial and error and revision.