The biggest problem is that people have stopped being critical about the role of the computer in their lives. These machines went from being feared as Big Brother surrogates to being thought of as metaphors for liberty and individual freedom.
— Ellen Ullman
I think that focusing all experiences through the lens of the Internet is an example of not being able to see history through the eyes of others, to be so enamored of one's present time that one cannot see that the world was once elsewise and was not about you.
I don't consider myself a Jewish writer.
I think technical people now should learn literature, because literature teaches you a great deal about the depths and variety of human imagination.
Our Constitution is designed to change very slowly. It's a feature, not a bug.
Writing was a way to get away from my life as a programmer, so I wanted to write about other things, but of course nobody wanted to publish another story about a family, unless it was extraordinary. When I began writing about my life as a programmer, however, people were interested.
'I am not adopted; I have mysterious origins.' I have said that sentence many times in the course of my life as an adopted person.
Technology does not run backward. Once a technical capability is out there, it is out there for good.
Watching a program run is not as revealing as reading its code.
Y2K has challenged a belief in digital technology that has been almost religious.
The act of voting, to put it in computing terms, is a question of user interface.
Multitasking, throughput, efficiency - these are excellent machine concepts, useful in the design of computer systems. But are they principles that nurture human thought and imagination?
The condition of my personal workspace is my own business, as I see it.
It's possible to let technology absorb what we know and then re-express it in intricate mechanisms - parts and circuit boards and software objects - mechanisms we can use but do not understand in crucial ways. This not-knowing is fine while everything works as we expected.
I used to pass by a large computer system with the feeling that it represented the summed-up knowledge of human beings. It reassured me to think of all those programs as a kind of library in which our understanding of the world was recorded in intricate and exquisite detail.
Human thinking can skip over a great deal, leap over small misunderstandings, can contain ifs and buts in untroubled corners of the mind. But the machine has no corners. Despite all the attempts to see the computer as a brain, the machine has no foreground or background.
A computer is a general-purpose machine with which we engage to do some of our deepest thinking and analyzing. This tool brings with it assumptions about structuredness, about defined interfaces being better. Computers abhor error.
My approach to being a self-taught programmer was to find out who was smart and who would be helpful, and these were both men and women. And without learning from my co-workers, I never could've gone on in the profession as long as I did.
I'm a pessimist. But I think I'd describe my pessimism as broken-hearted optimism.
I really don't like books in which characters are just bad or just good.
I don't like the idea that Facebook controls how people express themselves and changes it periodically according to whatever algorithms they use to figure out what they should do or the whim of some programmer or some CEO. That bothers me a great deal.
There's some intimacy in reading, some thoughtfulness that doesn't exist in machine experiences.
Through the miracle of natural genetic recombination, each child, with the sole exception of an identical twin, is conceived as a unique being. Even the atmosphere of the womb works its subtle changes, and by the time we emerge into the light, we are our own persons.
Programmers seem to be changing the world. It would be a relief, for them and for all of us, if they knew something about it.
With all the attention given to the personal computer, it's hard to remember that other companion machine in the room - the printer.
To be a programmer is to develop a carefully managed relationship with error. There's no getting around it. You either make your accommodations with failure, or the work will become intolerable.
Y2K is showing everyone what technical people have been dealing with for years: the complex, muddled, bug-bitten systems we all depend on, and their nasty tendency toward the occasional disaster.
After we have put our intimate secrets and credit card numbers online, what can prevent us from putting our elections there as well?
Introduced in the 1960s, multitasking is an engineering strategy for making computers more efficient. Human beings are the slowest elements in a system.
UNIX always presumes you know what you're doing. You're the human being, after all, and it is a mere operating system.
Productivity has always been the justification for the prepackaging of programming knowledge. But it is worth asking about the sort of productivity gains that come from the simplifications of click-and-drag.
It had to happen to me sometime: sooner or later, I would have to lose sight of the cutting edge. That moment every technical person fears - the fall into knowledge exhaustion, obsolescence, techno-fuddy-duddyism - there was no reason to think I could escape it forever.
Abhorring error is not necessarily positive.
Tools are not neutral. The computer is not a neutral tool.
Has Google appropriated the word 'search'? If so, I find it sad. Search is a deep human yearning, an ancient trope in the recorded history of human life.
I'm a dark thoughts writer.
I feel the best villains are the ones you have feelings for.
You can only get a beginner's mind once.
So many people for so many years have promoted technology as the answer to everything. The economy wasn't growing: technology. Poor people: technology. Illness: technology. As if, somehow, technology in and of itself would be a solution. Yet machine values are not always human values.
I like mysteries.
Computer programming has always been a self-taught, maverick occupation.
Reading code is like reading all things written: You have to scribble, make a mess, remind yourself that the work comes to you through trial and error and revision.
Computer systems could not work without standards - an agreement among programs and systems about how they will exchange information.
I like the little semi-competencies of human beings, I realize. Governance, after all, is a messy business, a world of demi-solutions and compromise, where ideals are tarnished regularly.
Internet voting is surely coming. Though online ballots cannot be made secure, though the problems of voter authentication and privacy will remain unsolvable, I suspect we'll go ahead and do it anyway.
The ability to 'multitask,' to switch rapidly among many competing focuses of attention, has become the hallmark of a successful citizen of the 21st century.
I came of technical age with UNIX, where I learned with power-greedy pleasure that you could kill a system right out from under yourself with a single command.
When knowledge passes into code, it changes state; like water turned to ice, it becomes a new thing, with new properties. We use it, but in a human sense, we no longer know it.
The human mind, as it turns out, is messy.
I hate the new word processors that want to tell you, as you're typing, that you made a mistake. I have to turn off all that crap. It's like, shut up - I'm thinking now. I will worry about that sort of error later. I'm a human being. I can still read this, even though it's wrong. You stupid machine, the fact that you can't is irrelevant to me.