Like civil-rights protesters who sang rousing hymns as they were carried off to jail, Twitterers are bearing witness to what's happening around them, and calling out into the darkness of cyberspace for confirmation. I'm here. You're here, too. We are present.
— Douglas Rushkoff
When Steve Jobs toured Xerox PARC and saw computers running the first operating system that used windows and a mouse, he assumed he was looking at a new way to operate a personal computer. He brought the concept back to Cupertino and created the Mac, then Bill Gates followed suit, and the rest is history.
While Google has given away pretty much everything it has to offer - from search and maps to email and apps - this has always been part of its greater revenue model: the pennies per placement it gets for seeding the entire Google universe of search and services with ever more targeted advertising.
What makes a great standalone piece of hardware is not the same thing as what makes a great networking device. One can work as an essentially closed system. The other is absolutely dependent on its openness.
Files on iTunes - and thus iPods - are incompatible with everything else. Applications for the iPhone may only be sold and uploaded through the App Store - giving Apple control over everything people put onto the devices they thought they owned.
Google is in a position where it doesn't even have to strive to become a hip, conscious choice. Brands are temporary fads. Functionality is forever. Google just has to 'be,' and everyone will end up there sooner or later.
The competitive advantage professional journalism enjoys over the free is just that: professional journalists, whose paid positions give them the time and resources they need to commit more fully to the task. If we can't do better, so be it.
Open source is a beautiful way of collaborating; but what's happening on the free Internet is more akin to the 'crowdsourcing' of journalists and other content creators by advertisers who no longer have to pay them - only the search engines that parse their articles.
Removed from 'Gmail' doesn't necessarily mean removed from all Google servers. In fact, your old emails are the data set from which Google models our behaviors - the real product it is offering its advertisers.
Virtual simulations allow post-traumatic stress disorder sufferers to re-experience the events that traumatized them, and then slowly desensitize themselves to their impact through repeated recreations involving not just sight and sound but even smell.
The cloud is still really just a bunch of servers, owned by someone or something, whose decisions and competence must be trusted. This applies to everything from Google Docs to Gmail: Putting our data out there really means putting it 'out there.'
The iPad - contrary to the way most people thought about it - is not a tablet computer running the Apple operating system. It's more like a very big iPhone, running the iPhone operating system.
Digital media are biased toward replication and storage. Our digital photos practically upload and post themselves on Facebook, and our deleted e-mails tend to resurface when we least expect it. Yes, everything you do in the digital realm may as well be broadcast on prime-time television and chiseled on the side of the Parthenon.
Everything we do in the digital realm - from surfing the Web to sending an e-mail to conducting a credit card transaction to, yes, making a phone call - creates a data trail. And if that trail exists, chances are someone is using it - or will be soon enough.
While we may blame the Internet for the ease with which conspiracy theories proliferate, the net is really much more culpable for the way it connects everything to almost everything else. The hypertext link, as we used to call it, allows any fact or idea to become intimately connected with any other.
It's not that MySpace lost and Facebook won. It's that MySpace won first, and Facebook won next. They'll go down in the same order.
Remember when those CD-ROMs from AOL came in the mail almost every day? The company was considered ubiquitous, invincible. Former AOL CEO Steve Case was no less a genius than Mark Zuckerberg.
To buy an Apple product is to bet on the longevity of the closed system to which we've committed ourselves. And that system is embodied - through marketing as much as talent - by Steve Jobs.
Our eyeball hours are scarce, indeed. That's why Google wants us to do as much as possible online, in range of their ads, and is willing to spend billions creating more reasons and ways for us to do so.
By turning every Yahoo search box into a Bing box, Microsoft may have bought itself the exposure it needs to be the next Google.
One argument against open systems is that they become open to everything, good and bad. Like a Richard Meier skyscraper, the anal retentive, Bauhaus elegance of the Mac does prevent the loose ends and confusion of a less sterile environment. But it also prevents fertility. Apple's development must come from within.
Microsoft's new OS, Windows 7, may finally be a worthy successor to XP, eliminating the clutter of Vista and letting users get to what they want to use without the fuss. All this, while remaining compatible with their IT departments' demands for scalability and custom implementations.
Apple enjoys 'Harry Potter'-like adoration and queues because it sells physical objects, limited by the pace of assembly lines in China. To own is to have, to have is to hold, and to hold is to show off.
As a professional journalist who nonetheless champions a 'people's' Internet, I am happy to compete against the thousands of amateur bloggers out there reporting and commenting on the same stories I do.
If money can't be made reporting and writing articles, then professionals simply can't do it anymore. Unless we adopt the position that the amateur blogosphere is really capable of taking on the role that the 'New York Times' and CNN play, then we do need solutions for paying for content.
Most of us still haven't grasped the fact that everything we commit to the digital space - not just our public blogs and broadcast tweets, but every private text message, email, and voicemail - is likely to be stored and accessible. Forever.
Like most early enthusiasts, I always thought the way the Internet encouraged multitasking made users less vulnerable to manipulation, while simultaneously exploiting even more of our brain's capacity than before. Apparently not.
As Apple continues to release new styles of netbooks, laptops, and even desktops with untold movie-watching and game-playing capabilities, I wouldn't be surprised to see the iPhone operating system running on them - and the Macintosh eventually becoming a thing of the past.
Beyond the hype, style, and speculation, the truth is that the iPad is really just another tablet device. A really big PDA, where a touchscreen does what a laptop's keyboard used to do.
Since the dawn of the Internet, I have always operated under the assumption that if the government or corporations have technological capability to do something, they are doing it - whatever the laws we happen to know about might say.
Once everyone is connected to everyone and everything else, nothing matters anymore. If everyone in the world is your Facebook friend, then why have any Facebook friends at all? We're back where we started. The ultimate complexity is just another entropy.
Like the diminishing beauty returns for a facially paralyzed Botox addict, the more forcefully we attempt to stop the passage of time, the less available we are to the very moment we seek to preserve.
Facebook's successor will no doubt provide an easy 'migration utility' through which you can bring all your so-called friends with you, if you even want to.
I find myself unable to let go of the sense that human beings are somehow special, and that moment-to-moment human experience contains a certain unquantifiable essence. I still suspect there is something too quirky, too paradoxical, or too interpersonal to be imitated or re-created by machine life.
Technology has moved away from sharing and toward ownership. This suits software and hardware companies just fine: They create new, bloated programs that require more disk space and processing power. We buy bigger, faster computers, which then require more complex operating systems, and so on.
Online advertising may not be much more successful than an old double-barrel, but - like a good spray of buckshot - it makes up for its lack of accuracy with sheer volume. There are 10 unique ads listed with every Gmail message in your queue, each tied to the message content. And a paying sponsor.
In spite of my own reservations about Bing's ability to convert Google users, I have to admit that the search engine does offer a genuine alternative to Google-style browsing, a more coherently organized selection of links, and a more advertiser-friendly environment through which to sell space and links.
The reason why Apple computers have worked so well over time is that, unlike Microsoft, they don't bend over backward to be compatible with every piece of hardware or software in the digital universe. To code or create for Apple, you follow Apple's rules. If you're even allowed to.
The new Zune may not be an iPod killer, but it does offer a clean interface, great industrial design, HD radio, and a subscription model for music, making it significantly less expensive for big users.
For Google, the problem with being a free, abundant, and rather infinite set of services is that it's hard to create much of a stir about anything. There are so many major software service options under the 'more' menu on the Gmail page that they've had to go and add a final item called 'even more.'
Just as infinite access to free music ultimately leads to no one making a living at music anymore, free journalism just doesn't pay for itself - particularly not when a search engine is serving all the ads.
Your next SMS will probably be around longer, and remain more legible, than your tombstone. For, unlike your tombstone or even your mortal coil, your texts may be worth something.
Brains are tricky and adaptable organs. For all the 'neuroplasticity' allowing our brains to reconfigure themselves to the biases of our computers, we are just as neuroplastic in our ability to eventually recover and adapt.
Not only have computers changed the way we think, they've also discovered what makes humans think - or think we're thinking. At least enough to predict and even influence it.
It feels as if ever since the iPhone was released, the Macintosh computer has become just another leverage point in this other operating system's marketing plan.
Google did a great job hacking the Web to create search - and then monetizing search with advertising. And Apple did a great job humanizing hardware and software so that formerly daunting computers and applications could become consumer-friendly devices - even a lifestyle brand.
Marketers use big data profiling to predict who is about to get pregnant, who is likely to buy a new car, and who is about to change sexual orientation. That's how they know what ads to send to whom. The NSA, meanwhile, wants to know who is likely to commit an act of terrorism - and for this, they need us.
New content online no longer requires new stories or information, just new ways of linking things to other things. Or as the social networks might put it to you, 'Jane is now friends with Tom.' The connection has been made; the picture is getting more complete.
No matter how invasive the technologies at their disposal, marketers and pollsters never come to terms with the living process through which people choose products or candidates; they are looking at what people just bought or thought, and making calculations based on that after-the-fact data.
Social media is itself as temporary as any social gathering, nightclub or party. It's the people that matter, not the venue. So when the trend leaders of one social niche or another decide the place everyone is socializing has lost its luster or, more important, its exclusivity, they move on to the next one, taking their followers with them.