I once saw a cartoon in the Red Herring, a glossy technology business magazine whose rise and fall mirrored that of the first generation of the dot-com world.
It appeared sometime during the magazine’s heyday and showed a frustrated writer staring at his computer screen, with a caption that read: “Writing is God’s way of showing you how sloppy your thinking is.”
This memory seems appropriate as I reflect on my years in the earliest days of the web. What surfaces from those years, now almost three decades distant, is not a unifying narrative but a series of powerful impressions, if somewhat diminished by time.
Modest beginnings
I was not an obvious choice to become a tech revolutionary. I didn’t take a single computer science course as an undergraduate. I didn’t even really love computers. Sure, I played video games as a kid – my parents, who were not wealthy, had splurged on an advanced-for-the-time Commodore 64 system soon after it came out in 1982. It even came with something called a “letter quality printer” – essentially a typewriter attached to the computer (and it made as much noise as a manual typewriter) that produced output far more professional-looking than that of the pixelated dot matrix printers of the day. What’s more, we had an actual modem and subscribed to the online service CompuServe – but at 300 bits per second. (For those old enough to remember painfully slow dial-up internet connections, those ran at 14,400 BPS or 28,800 BPS. Today, if we have less than 20,000,000 BPS to our home, my teenagers complain I’m committing a crime against humanity.) It was so slow, and so expensive for our limited budget, that while I tried it a few times, it was pretty much useless – even for transmitting text – and it didn’t make much of an impression.
But while I didn’t care much about computers, I did love reading and information, both of which computer networking, even in its primitive early-1990s state, could bring to me. And in a true blessing, when I arrived at college, our dorms at Yale had what passed for high-speed internet back then, which allowed me to explore this strange new world. I now had access to books without even going to the library, including online services like Lexis-Nexis. Add to that the ability to communicate with others around the world almost instantaneously (a love of mine since I had a brief spell doing ham radio in the 1980s as a kid) and I was hooked. I’d also, like a number of bright young men of my age, read Douglas Hofstadter’s magisterial and extremely complex “geek bible,” "Gödel, Escher, Bach," which had a huge effect on my interests and thinking. GEB is a difficult book to summarize, but it had the net effect, for me and many others, of making the world of computers seem exciting and important.
The summer after my sophomore year (1993), I was in Washington, D.C., pursuing my first love of politics and sharing a house in D.C. with a number of college friends. I went to Olsson’s Books in Georgetown (then the best bookstore in D.C.) and found the computer section because I wanted to learn more about the mysterious internet and, in particular, the web, which I had started using earlier that year. The store had a large selection of computer books but only one book about the internet: "The Internet Companion" by Tracy LaQuey, a senior employee at networking company Cisco Systems. The web was so new at this point that it was mentioned briefly on just one page of the book, which mostly discussed email, Gopher, FTP, and other technologies for accessing information online that will be a happy mystery to most readers younger than my 48 years of age.
And here is where I must, for the sake of truth, make what lawyers call a statement against interest: The foreword for this obscure book, published in 1992, was written by then-Senator Al Gore. Many of my generation or older may remember the mockery Al Gore received while he was the vice president mulling a presidential run and was quoted out of context as apparently claiming he “invented the internet.”
But not only was Gore far ahead of any politician of significance in understanding the internet, but various pieces of legislation he sponsored, some dating back to as early as the 1970s, were absolutely pivotal in advancing the internet and the web. The fact that Gore could not turn this weakness into a strength when he was attacked is little short of amazing.
It is fascinating to think about what the world might have looked like had he won his razor-thin race in 2000 – with no President Bush and no Bush family running the GOP, our future would have looked very different. Certainly, Gore’s knowledge of the internet might have meant fewer companies running circles around ignorant regulators. On the other hand, Gore was on Apple’s board, a senior adviser to Google, and a partner at ultra-blue-chip VC firm Kleiner Perkins. In a Gore-as-president scenario, perhaps the despotism we see from a few monopolizing Big Tech firms would have arrived even sooner.
Regardless, it is almost impossible to imagine Donald Trump as president except in response to the Bush legacy. The arc of history is long, but it bends toward mind-altering counterfactuals.
A couple of my Yale roommates genuinely loved computers, and they went along with me on this journey into the early days of the web. One of them is now a senior developer at Apple, the other a multimillionaire tech entrepreneur in Seattle. We’d sit there on our ancient Macs and get them to do cool things, including a primitive form of computer-to-computer instant messaging that could only be done with other students on the Yale network. These sorts of technologies, including email, were widely used by students – but few at the time realized their revolutionary potential.
When I first started using the web in 1993, on the first graphical web browser, Mosaic, developed by the National Center for Supercomputing Applications, there were about 130 websites in existence. I could surf essentially the entire web and basically did. Thereafter the number of websites multiplied rapidly. I soon designed my first personal website, an exercise in enthusiastic collegiate cringe. It was one of the first few thousand sites in the world. There were no automated ways to build web pages at the time. I learned to code by hand from NCSA’s “Beginner’s Guide to HTML 1.0.”
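For readers who never experienced it, “coding by hand” meant typing every tag into a text editor yourself. The sketch below is purely illustrative – a minimal page in the style of early-1990s HTML, with hypothetical content, not a reconstruction of my actual site:

```html
<!-- Illustrative only: a minimal hand-typed page in the style of
     early-1990s HTML. Tags were conventionally uppercase, and list
     items and paragraphs were often left unclosed. -->
<HTML>
<HEAD>
<TITLE>My Home Page</TITLE>
</HEAD>
<BODY>
<H1>Welcome to My Home Page</H1>
<P>A few links I like:
<UL>
<LI><A HREF="http://www.ncsa.uiuc.edu/">NCSA</A>
<LI><A HREF="http://info.cern.ch/">CERN</A>
</UL>
</BODY>
</HTML>
```

Every page on the early web was assembled this way, one tag at a time, which is part of why a personal site felt like a genuine act of craftsmanship.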
I wasn’t just interested in the technology itself but in the culture around it. I remember the first few issues of Wired magazine, a breath of fresh air compared to stodgy academic courses and mainstream magazines. I was captivated by the new information economy and by the patron saint of media technologies, Marshall McLuhan. I thrilled to the mantras of the Whole Earth Catalog’s Stewart Brand – “Information wants to be free” – and the Electronic Frontier Foundation’s John Gilmore – “The internet interprets censorship as damage, and routes around it” (an adage that subsequent events would prove rather oversold). What I lacked in sophistication about the world, I made up for with enthusiasm about the subject.
I remember reading Vannevar Bush’s article “As We May Think,” which postulated a proto form of hypertext years before the web existed. I was blown away by Neal Stephenson’s "Snow Crash," a novel that seemed to have foretold much of our virtual future. Available at the time only as a cheap paperback with a lurid cover befitting its “trashy” science fiction origins, it was more recently named by uber-establishment Time magazine one of the hundred greatest novels of the last 100 years – a telling example of how much, and how quickly, the internet changed our aesthetic and literary sensibilities.
I don’t generally enjoy things – or rather, I enjoy many things, but not often while I am doing them. Like many others with a more reflective character, my pleasure tends primarily to be retrospective, more “that was cool” than “this is cool.” But this was not true of my involvement in the earliest days of the web. Even at the time, I knew what we were doing was exciting and revolutionary, even if most people didn’t understand it.
In general, techno-libertarian optimism ran amok in those early days of the web. Anything seemed possible, and we were too naive to realize that the suits and the bean counters always get their piece of the action.
Great expectations
By the end of 1994, there were about 2,700 websites. This was the year I started writing professionally about the internet, for Michael Wolff’s "Net Guide" series of books. This was before Wolff became famous and wrote "Fire and Fury" about Trump. At the time, he was just a guy with an idea, like a lot of us. By 1995, when I graduated from college, there were 23,000 websites. Today, there are almost two billion.
Upon graduation in 1995, I joined the first publication dedicated to covering the internet business. Taking a job related to the internet was not normal or cool among Yalies, who were supposed to go into law school, management consulting, or investment banking. In fact, the notion of working in the internet industry didn’t really exist, since there really was no internet industry at the time. My starting salary offer was $28,000, which I negotiated up to $32,000. I lived with two roommates and work colleagues in a small apartment that was literally at the end of a freeway exit. If a car had a brake failure coming off the freeway, it would have plowed into our front door.
We were based outside New York City, but I would take frequent trips to Silicon Valley, which was rapidly asserting its primacy in the space. Mountain View, Menlo Park, Sunnyvale – a seemingly endless and largely characterless selection of office parks and mid-century suburban housing flew by as I took my rental car from meeting to meeting, with CEOs often not much older than I was. These were cities I had never heard of, yet any of them would have ranked among the biggest cities in North Carolina, where I grew up. There was not yet a universal feeling of affluence in the Valley. Palo Alto’s median housing price was just $450,000 in 1995. Today it is $3.3 million. In broader Santa Clara County, a normal family still had a chance; the median housing price was just $250,000. Today it is $1.3 million.
The internet, and as a fortunate consequence my early career, took off like a rocket ship. I ended up co-hosting a morning drive-time radio show about the internet, in which we tried to explain it to the masses who were just waking up to its potential. Barely out of college, I was getting offers to do TV and radio appearances, to consult and to speak at conferences. I have a video of myself at age 22 on a national television program, discussing politics on the web and explaining to the host that “this is a World Wide Web page.”
I wrote what I believe was the second-ever article written about Amazon.com — the company had not even been launched to the public and was still in private beta, but I thought the idea of a bookstore with 100,000 titles (soon ramped up to a million) was revolutionary. I interviewed Jeff Bezos, and my enthusiasm for his business idea showed through and probably got the better of my journalistic objectivity. Bezos offhandedly suggested coming out to Seattle and taking part in his start-up, but I had personal commitments that kept me from exploring that possibility further – probably the greatest “what if” in my professional life.
Bezos wasn’t the only Silicon Valley titan I encountered. I remember confronting Bill Gates at Comdex, then the biggest tech trade show in existence, about a technical matter on which he had been deceptive. I remember going back and forth with Mark Cuban, then a small-time entrepreneur who was desperate for our attention, when he was running Broadcast.com. Cuban, who knew the most important thing about investing is to buy low and sell high, sold his essentially minimal-value business to Yahoo for $5.7 billion near the height of the dot-com boom, then hedged against the impending crash in value of Yahoo shares.
I interviewed Marc Andreessen, who had written the first graphical web browser, Mosaic. I distinctly remember looking at the beta version of Google with one of our engineers and us both saying, “Wow, this is really a lot better than AltaVista,” the gold standard for search tech at the time. I got a Yahoo mail account, which I have kept to this day as my primary email, a small digital relic I can’t quite bring myself to consign to the dustbin.
I spent many nights hanging out with young millionaires and wannabe millionaires drinking expensive champagne. It was decadent, and absurd, and exciting for a guy who did not come from wealth and had been eating bargain-basement Chinese food in his dorm room just a year or so before.
My first article on politics and the internet came out in Internet World in August 1995, just two months after I graduated from college. It was entitled “Star Spangled Net: Politicos, Feds, Activists, Hate Groups and Anti-Hate Groups are Jumping Online” – showing not much has changed in more than 26 years. At the time I wrote it, however, online politics was still in its infancy, and just sixteen out of 535 members of Congress had even the most rudimentary web pages. I described with awe Mario Cuomo’s website, which actually posted his positions on the issues and even some of his campaign commercials. This was what passed for cutting edge.
In August 1995, none of the GOP candidates taking on Bill Clinton had their own web page. When Bob Dole emerged as the nominee, an unofficial page was created on his behalf, featuring little more than a brief biography and a few links to articles about the candidate. I noted that in 1992, it was considered a radical innovation that after the campaign you could email President Clinton at president@whitehouse.gov.
This was about the time of the third World Wide Web conference in 1995, the first on U.S. soil. The previous two had been held in Europe, where the web had been invented, but in that end-of-history context, nothing really happened until it happened here. The internet had, in a sense, arrived. Web inventor Tim Berners-Lee and Douglas Engelbart, inventor of the mouse and presenter of “the mother of all demos,” were in attendance and signed a special book given to attendees, which I have to this day.
The view from the inside
Eventually, I left the tech press and joined a company called RealNetworks that pioneered audio and video streaming on the internet. Some of the things we were trying to do in building businesses and content around streaming were literally more than a decade before their time. The bandwidth, infrastructure, institutions, and advertising dollars just weren’t there yet. But we still had a lot of success. The swings in the market were wild. Even as mid-level managers, our net worth might increase on paper by hundreds of thousands of dollars in a given day. When you are in your mid-twenties, that is what Joe Biden might call a “big f***ing deal.”
I had a good friend whom I worked with who didn’t understand the importance of hedging and dollar-cost averaging when selling our stocks (again, we were in our mid-twenties). He was desperate to buy a new car, so he sold a large allotment of shares. We ended up calling that car the $550,000 Corolla. Later, however, having learned his lesson, he sold a significant portion (but not all) of his shares to buy a nice townhouse in a fashionable part of Seattle. After the stock crashed, we called that the $27,000 townhouse.
The company did very well. We were in the midst of the first internet bubble, and even though it was relatively modest by the scale of what came after, to us it seemed huge. At our peak in the late 1990s, our company was worth around $30 billion. We had a number of large customers, but our biggest was Enron, which would later become America’s largest and most infamous corporate bankruptcy, the subject of the best-selling book "The Smartest Guys in the Room." We got full payment for our contract up front in cash — perhaps an early indication that they weren’t so smart after all.
Crash 1.0
In Hemingway’s "The Sun Also Rises," Bill Gorton asks Mike Campbell how he went bankrupt, to which Campbell replies: “Two ways. Gradually and then suddenly.” This pretty well describes my growing disenchantment with the internet business as the 20th century came to a close.
I don’t remember when exactly I decided I wanted out of the business, but one incident in particular sticks in my mind. Toward the end of my time at RealNetworks, I got a call from a billionaire investor. I was a young business development guy looking for companies for us to buy. We were considering purchasing a company largely held by this investor (whose name I will omit since I don’t like being sued). Our discussions over various provisions of the possible deal got somewhat heated.
Finally he raised his voice and yelled something to this effect: “I have two Picassos, three yachts, and five mansions each worth more money than you’ll ever make in your lifetime. I don’t need this deal!” Then he promptly hung up on me.
Fortunately, I was more amused and appalled than intimidated by his schtick, and we didn’t buy his company, which eventually went bankrupt.
Later I would see numerous profiles about how philanthropic and wonderful this billionaire was – he was one of Bill Clinton’s closest friends. It was a great introduction to fake news.
It was around this time I figured I needed to get out. I had been working in tech for only half a decade, but it felt like an entire era. I was swimming with the sharks. And I just wasn’t as enthusiastic a biter as some others. Our youthful idealism had given way to “the suits” – and the internet felt played out to me. Not so much in terms of the technology, but in terms of the genuinely naïve enthusiasm that had fueled its early days. Obviously, the commercial internet had several future dramatic acts to put on stage, but I still believe there was a huge difference between Web 1.0 and everything that came after. It was a purer, more exciting time. The horizon seemed endless. The venture capital world existed, but it was far less developed. Nobody thought at the time that Silicon Valley was the center of the universe. Those of us who got in really early had done it for the excitement and the novelty, not for the money, though the money was a nice bonus.
In October 2000, I quit my job (right before the dot-com bust, which I saw coming, though I underestimated its intensity and carnage) and headed for the Nepali Himalayas. I wasn’t there to “find myself” (God forbid) but just to fulfill a lifelong dream of seeing the world. Up to that point my international travel experience consisted of a single trip to Europe. If I was going to live up to the example of my childhood hero, the explorer and linguist Richard Burton, a polymath who was the greatest traveler of his age and a scandal of Victorian society, I had a lot of catching up to do.
I spent two years on the road traveling mostly through the Middle East and Asia, having adventures before coming back, settling down, getting married, and renewing my youthful love for politics. But I would later get back into the internet, sounding the alarm about internet censorship of conservatives in 2015, long before most others were paying attention. I recalled the original debates over the Communications Decency Act and Section 230, which I had covered as a young journalist in 1996. And I knew what these companies were saying now was not what they were saying then. So, at least on a part-time basis, I find myself back in the tech arena, with a career that has come full circle.
Reflections from afar
In some ways it is dispiriting to look back at those early days and look at the internet’s broken promise. John Perry Barlow’s “A Declaration of the Independence of Cyberspace” seems impossibly naive. We had that brief period of idealism only to see it crushed by the most crass and greedy forms of capitalism. And even that has been replaced by something more ominous – a Deep State/Big Tech leviathan greedily conquering every last inch of the internet, devouring our personal information to feed its ravenous appetites.
And yet, there is no doubt that for all of the ominous trends on the horizon, the internet also opened up unprecedented opportunities for normal people with great ideas, wherever they resided or whatever their previous social status. I have met countless friends and had my intellectual life incredibly enriched through my engagement with it. Like any powerful tool, it can be used for both good and ill. One should be wary of cheap nostalgia.
As Chuck Palahniuk observed, “Every generation wants to be the last. Every generation hates the next trend in music they can't understand. We hate to give up those reins of our culture. To find our own music playing in elevators. The ballad for our revolution, turned into background music for a television commercial. To find our generation's clothes and hair suddenly retro.”
That’s kind of how Web 1.0 is for me as I look back on it. It was a spectacular dream to live through, but all of what we hoped we were building has succumbed to the cheapening tendencies of the marketers and profit maximizers. The internet we believed in is long gone. As Neal Stephenson observed in "Snow Crash," “Software development, like professional sports, has a way of making even thirty-year-old men feel decrepit.” While I’m not yet a senior citizen, it’s been a long time since I first saw 30 in my rearview mirror. Best to look back fondly at what we accomplished, let the young folks take over, and realize that the most wonderful luxury is to dream about the future – and then create it.
Jeremy Carl is a senior fellow at the Claremont Institute. He lives with his family in Montana.