STEALTH TRENDS ON THE INTERNET

"Arguing in My Spare Time" No. 23

by Arnold Kling

Sept. 27, 1998

The purpose of this essay is to point out three trends that have sneaked up on many people who follow computers and the Internet. These trends are:

1. The fact that this is 1998, not 1996

2. The sudden death of the personal computer

3. The search engines' second round of failure

This is 1998, not 1996

The unseen importance of this fact was brought home to me by an article in a recent issue of Internet World magazine. The article listed the top 50 Internet companies, meaning the fifty largest companies whose business primarily involves the Internet.

What struck me is that the majority of these companies reported losses in their latest financial reporting period. This means that they still are in phase one of the McKinsey business plan, in which the company loses money initially in order to position itself for "hockey stick" growth later on.

The problem with this situation is that the exponential growth of the Internet is behind us. Surveys vary, but they generally show that between 30 and 40 percent of U.S. households access the Internet at least once a month. As a simple matter of arithmetic, that number can no longer triple: tripling even the low end of that range would put it over 90 percent of households. In fact, like a plane approaching cruising altitude, the number of households on the Net is increasing at a decreasing rate, as it asymptotically approaches the number of households where people have access to a computer either at home or at work.
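The saturation arithmetic can be sketched with a toy model. This is a hypothetical illustration, not survey data: the ceiling, the starting share, and the rate at which the remaining gap closes are all assumptions, chosen only to show that when growth is limited by a ceiling, the absolute gain shrinks every year.

```python
# Hypothetical illustration of saturating growth: each year, adoption
# closes a fixed fraction of the remaining gap to a ceiling, so the
# absolute gain shrinks.  All three numbers below are assumptions.

ceiling = 0.60   # assumed share of households with access to a computer
share = 0.35     # assumed share already online (mid-range of the surveys)
rate = 0.5       # assumed fraction of the remaining gap closed per year

gains = []
for year in range(1, 6):
    gain = rate * (ceiling - share)   # gap shrinks, so gain shrinks
    share += gain
    gains.append(gain)
    print(f"year {year}: online share {share:.3f} (gain {gain:.3f})")
```

Under these assumptions the yearly gain falls by half each year while the share creeps toward, but never reaches, the ceiling.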

Two objections could be raised to this analysis:

1. Internet use could increase faster than the number of users, because the intensity of use could increase.

2. International use of the Internet is not even close to saturation.

Both objections are valid. However, my guess would be that neither factor will be sufficient to cause the growth rate of the Internet to be as fast in the next few years as it has been in the last few years. In 1996, an Internet company's revenues could be expected to grow exponentially even if its market position held constant. That is no longer a given.

The Sudden Death of the PC

It is a safe bet that in the last ten years you have increased the percentage of phone conversations where you used a phone that was not plugged into a wall. Portability is an obvious trend in phones.

Computers may be poised to jump onto the same track. While professional programmers will continue to require powerful workstations, other knowledge workers may perform an increasing share of their communication and analysis chores on alternative devices. The Palm Pilot has finally introduced people to the idea that there are alternatives to desktop and laptop PCs.

In between the original concept of a Personal Digital Assistant and the stillborn concept of the Network Computer, there is a sweet spot of functionality that combines high portability with high connectivity. This market, which is just emerging, could turn out to be highly fragmented, unlike today's PC market. Microsoft, ever-agile, already is in the market. (I am typing this essay on a NEC MobilePro, a hand-held computer using Windows CE that boasts a usable keyboard.) However, there is no reason that other operating systems cannot be functional in this context.

With the advent of the Internet, PC owners have begun to value bandwidth over processing power. As more applications migrate to the Web, and the Sun Microsystems slogan "the network is the computer" becomes a reality, people will find that with sufficient connectivity they do not always need the all-purpose versatility of the standard computer. What people want are Web-enabled cell phones and digital notepads.

I recently had a conversation with someone who works in the marketing department for a Baby Bell's pay phone division, which is not exactly a hotbed of growth given what is happening with cell phones. Morale in the pay phone industry is not too high these days. If the trend toward portable information appliances gathers momentum, the personal computer industry may start to look equally unexciting in a few years.

The Search Engines Fail Twice

The search engines have reached the second stage of failure. First, their technological model failed. Next, their business model will fail.

In the first phase, Yahoo was unique. Yahoo began as a do-it-yourself card catalog for the Internet. It was do-it-yourself in that a site would only become listed if someone submitted it. (In the early days, surfers submitted sites that they found interesting; when Yahoo became a powerhouse, webmasters took over the submission process.) The submitters used Yahoo's system to categorize sites, and they helped to extend the classification system by making up new sub-categories.
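The submission mechanism described above can be sketched as a simple data structure: a nested catalog in which a submitter files a site under a category path, and any missing sub-categories are created along the way. The category names and site addresses here are made up for illustration.

```python
# Toy sketch of a do-it-yourself directory in the early-Yahoo style.
# A submitter files a site under a slash-separated category path;
# missing sub-categories are created as a side effect, which is how
# submitters extend the classification system.

catalog = {}

def submit(path, site):
    """File a site under a category path, creating levels as needed."""
    node = catalog
    for name in path.split("/"):
        node = node.setdefault(name, {})   # extend the classification
    node.setdefault("_sites", []).append(site)

submit("Computers/Internet/Search", "searchco.example")
submit("Computers/Internet", "netmag.example")

print(catalog["Computers"]["Internet"]["Search"]["_sites"])
# → ['searchco.example']
```

The editorial bottleneck the next paragraph describes is visible even in this sketch: nothing stops a submitter from filing a site in the wrong place, so every entry has to be policed by hand.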

Yahoo's approach has held up better than others, but it has not scaled well as the Internet has grown. The system requires that Yahoo's editors manually police the work done by the do-it-yourself site submitters. This has become impossible with millions of Internet sites, and with webmasters submitting multiple pages from each site for the catalog.

The other approach, which meets the more literal definition of search engine, is a robot that automatically surfs the Web, collects pages, and then stores aspects of those pages. When you query one of these search engines, it examines its database and returns results according to some algorithm for matching words in your query with information it has stored about Web pages.
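The crawl-and-match model just described can be reduced to a minimal sketch: the robot's harvest becomes an inverted index mapping each word to the pages that contain it, and a query returns the pages matching every query word. The pages and their contents here are hypothetical.

```python
# Minimal sketch of the robot-search approach: pages are reduced to an
# inverted index (word -> set of pages containing it), and a query is
# answered by intersecting the sets for its key words.

from collections import defaultdict

# Stand-ins for pages a crawling robot might have collected.
pages = {
    "a.example/growth": "internet growth trends",
    "b.example/phones": "pay phone industry news",
    "c.example/search": "internet search engine failure",
}

index = defaultdict(set)
for url, text in pages.items():
    for word in text.split():
        index[word].add(url)

def search(query):
    """Return the pages containing every word of the query."""
    words = query.split()
    hits = set(index.get(words[0], set()))
    for w in words[1:]:
        hits &= index.get(w, set())
    return sorted(hits)

print(search("internet search"))  # → ['c.example/search']
```

Note that the matching is purely mechanical: any page that repeats the right words will rank, which is precisely the opening that "key word spamming" exploits.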

This brute force robot search technology is what has failed completely. The typical user experience with these tools is not, "I found exactly what I was looking for." Instead, the typical experience is, "The query came back with 15,439 sites matching my key words, and it showed me the top ten. None of them had anything to do with what I was looking for, so I gave up."

One of Kling's Laws is that "Artificial intelligence is an oxymoron." The breakdown of the robot search engines under the weight of Web site volume, clever "key word spamming" by webmasters, and other factors illustrates this law.

If Kling's Law of artificial intelligence is correct, then the search engines cannot be fixed with better algorithms. No doubt there are experts in the field of artificial intelligence who believe that they can solve the problem, but I am skeptical.

In any case, we are not likely to see much research and development designed to improve search engines, because we have moved on to phase two of their failure. This is the phase in which the search engines have joined the long list of enterprises that have futilely tried to treat the Internet as though it is going to evolve in the direction of concentrated mass media.

The Internet search engines have abandoned their putative mission to help individuals to find information on the Internet. In response to a query, the search engines now present people with two sets of information. One set is content paid for by sponsors. The other set is the results generated by the search query. The business model of the search engine requires that the sponsored content attract the most attention. From the sponsor's perspective, the ideal set of results would be more-or-less random noise, which seems to be the result that the search engines most often provide these days. The search engines now are in a position where if they were actually to produce useful results in response to a query, their sponsors would suffer and their business model would collapse.

The search engines are the latest example of companies proceeding under the assumption that the Internet is going to end up like television, with a few major names dominating the market. The alternative concept of the Internet is that it is a decentralized architecture combining the communication features of a telephone system with the information retrieval features of a library.

The most important tools for finding information on the Internet are directories that are category-specific or location-specific. For example, when I was interested in finding a consulting firm to help with development of my Web site, I looked for directories of computer consulting firms in the Washington, DC, area and for directories of computer consulting firms with experience in the technologies that our site uses. An all-purpose search engine such as Yahoo can help me to find these smaller directories, but I no longer expect to find the end result using an all-purpose search engine.

Eventually, Internet search is likely to evolve in this direction of decentralization. A central search point is likely to be a catalog of catalogs, rather than a pseudo-comprehensive directory of the entire Internet. The lower-level catalogs are likely to be the most useful information sources.

Another possible evolutionary direction involves more systematic self-descriptions of Web site contents. Buzzwords for this include XML and "meta data." The idea is that each individual Web site can embed in itself a "homing signal" that enables browsers to find that site when the consumer is looking for information that can be found there.
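The "homing signal" idea can be sketched with the `<meta>` tags that HTML already supports, as a stand-in for the XML-based schemes the buzzwords point to: the page declares its own subject, and a tool reads the declaration directly instead of guessing from page text. The sample page and its tag values are made up for illustration.

```python
# Sketch of self-describing pages: a site embeds its own subject in
# <meta name="..."> tags, and a tool extracts the declared name/content
# pairs.  The sample page below is hypothetical.

from html.parser import HTMLParser

sample_page = """
<html><head>
<meta name="keywords" content="computer consulting, Washington DC">
<meta name="description" content="Directory of consulting firms">
</head><body>...</body></html>
"""

class MetaReader(HTMLParser):
    """Collect the name/content pairs from <meta> tags."""
    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if "name" in d and "content" in d:
                self.meta[d["name"]] = d["content"]

reader = MetaReader()
reader.feed(sample_page)
print(reader.meta["keywords"])  # → computer consulting, Washington DC
```

The appeal of the scheme is that the description travels with the page itself, so no central catalog has to be kept in sync; its weakness is the same one the robots have, since a webmaster can declare whatever subject attracts the most traffic.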

Yahoo and the other search engines have very creative managements, and they are putting up all sorts of interesting applications: free email, calendars, games, etc. These applications could lead to an interesting future. However, when we observe companies whose basic product does not meet the consumer's needs branching off into other products, and we see this occurring on the Internet, where consumers can search for the best-of-breed software, we have to ask, "What's wrong with this picture?"

The Internet architecture favors firms that are specialized and excellent, not firms that are eclectic and mediocre. The search engines seem to me to be going in exactly the wrong direction, which is why I see the "portal" concept as the second phase of failure for these companies.