This article was written in March 2000, when I was asked to organize a research group's web appearance, but the only server used at the institute was a Hyperwave server. I tried to understand what Hyperwave (and, before it, Hyper-G) was in 2000. Original links are kept even though they no longer point to valid documents.


Why Hyperwave?

The Enthusiastic View

With the Hyperwave Information Portal, Hyperwave offers a turnkey solution for enterprise-wide knowledge management ("Enterprise Information Portals" in marketing speak). It is based on the Hyperwave Information Server, an advanced multimedia content management system. Hyperwave calls itself the "Portal Authority".

Hyperwave was founded in 1997 and targets the corporate intranet market, which is projected to grow rapidly. It won BYTE Magazine's Best of Show award at CeBIT 1997, won the European IT Prize (worth 200,000 euro) eight months later, and quickly acquired quite a few (mostly European) clients. Hyperwave grew by 121% in 1999. About 80% of its turnover comes from license fees. Hyperwave has announced an IPO for 2000.

Hyperwave is based on many years of academic research that went into the development of what was then called Hyper-G. Hermann Maurer, the father of Hyper-G, is Chairman of the Board of Hyperwave. Frank Kappe, who was a leading developer of Hyper-G, is now Chief Technology Officer of Hyperwave. Hyperwave is a member of the World Wide Web Consortium and continues to participate in the innovation and standard-setting process.

The technological core, the Hyperwave Information Server (HIS), is said to be superior to ordinary web servers because it addresses the following problems of the World Wide Web:

  1. The "lost in hyperspace syndrome": It is difficult to get an overview of available information or to find specific information.
  2. Broken links and orphans: When documents are deleted or renamed, hyperlinks pointing to them break. For the person removing or renaming a document, it is costly to check all links to it. Removing documents may also orphan other documents, leaving no links pointing to them.
  3. Weak security model: As links are embedded into HTML documents, links inherit the access rights of their documents, although it would be advantageous if links had separate access rights.
  4. The "GOTO syndrome": HTML only knows one single type of link, when in fact there are structural links (pointing to documents that form a group like chapters of a book) and referential links (pointing to documents that are merely cited). (The fact that HTML only knows one type of link is likened to early programming languages with their overuse of the goto statement.)

The central idea behind Hyper-G was to separate the contents of documents from their (structural) links. The content of documents is stored separately, and the whole link structure is kept in a centralized database. During all check-in, rename, or deletion operations, the server can then maintain link integrity quite easily. Once there is a central link database, it is natural to allow queries like "Show me all documents pointing to this document". The idea of separating content and link structure solves problems 2-4. The first problem is solved by also storing metadata (authorship, keywords, ...) in the link structure database. (Note that problems 1-4 are only solved for documents residing on a single database/server.)
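To make this concrete, here is a toy sketch of such a link database (in Python, with made-up names; it does not reflect Hyper-G's actual data model or implementation). Content and links live in separate stores, renames and deletions rewrite the links in one central place, and backlink and orphan queries fall out naturally:

    # Toy model: content in one store, links in a separate, queryable one.
    class LinkDatabase:
        def __init__(self):
            self.documents = {}  # doc_id -> content
            self.links = set()   # (source, target, kind); kind is "structural" or "referential"

        def add_document(self, doc_id, content):
            self.documents[doc_id] = content

        def add_link(self, source, target, kind="referential"):
            # The server, not the document text, owns the link structure.
            self.links.add((source, target, kind))

        def rename(self, old_id, new_id):
            # A rename rewrites all affected links in one place;
            # no document content needs to be touched.
            self.documents[new_id] = self.documents.pop(old_id)
            self.links = {(new_id if s == old_id else s,
                           new_id if t == old_id else t, k)
                          for (s, t, k) in self.links}

        def delete(self, doc_id):
            # Deleting a document removes its links, so none can dangle.
            del self.documents[doc_id]
            self.links = {(s, t, k) for (s, t, k) in self.links
                          if s != doc_id and t != doc_id}

        def backlinks(self, doc_id):
            # "Show me all documents pointing to this document."
            return [s for (s, t, k) in self.links if t == doc_id]

        def orphans(self):
            # Documents that no link points to.
            targets = {t for (s, t, k) in self.links}
            return [d for d in self.documents if d not in targets]

    db = LinkDatabase()
    db.add_document("intro", "<p>...</p>")
    db.add_document("ch1", "<p>...</p>")
    db.add_link("intro", "ch1", kind="structural")
    db.rename("ch1", "chapter-1")     # no link breaks
    print(db.backlinks("chapter-1"))  # ['intro']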

Other goodies of the Hyperwave Information Server are:

  1. Searches over the link structure and the associated metadata are possible and - due to the data structure - faster than a comparable search over a collection of HTML documents.
  2. Integration with Windows clients is excellent (virtual folders, Hyperwave Office Extension, ODMA support).
  3. HTML documents are transformed according to PLACE macros and server-side JavaScript. This makes it possible to enforce an identical layout (corporate identity) on all documents; a sketch of this kind of transformation follows the list. (HTML documents submitted to the HIS have to be parsed for links and transformed anyway, so it is a small step to add other features automatically.)
  4. Fine-grained access rights for documents and links are possible.
  5. Scalability: Distributed instances of Hyperwave databases can be treated as a single database.
  6. Embedded versioning.
  7. Embedded discussion boards.
  8. Tools for automatic creation of CD-ROM versions of a Hyperwave database.
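To illustrate item 3: the sketch below shows generic server-side layout enforcement in Python. The template and names are invented for illustration, and it does not reproduce the actual PLACE macro syntax. The point is only that once the server transforms every document on delivery anyway, a uniform corporate layout comes almost for free:

    # Generic sketch: wrap every stored HTML fragment in a site-wide
    # template before delivery, enforcing the corporate layout centrally.
    CORPORATE_TEMPLATE = """<html>
    <head><title>{title}</title></head>
    <body>
    <div class="header">ACME Research Group</div>
    {body}
    <div class="footer">&copy; 2000 ACME</div>
    </body>
    </html>"""

    def render(title, stored_fragment):
        # The server already parses each document on check-in (to extract
        # links), so applying a template at delivery time is cheap.
        return CORPORATE_TEMPLATE.format(title=title, body=stored_fragment)

    print(render("Welcome", "<p>Project overview ...</p>"))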

An Alternative View

To put Hyperwave's claims of technological superiority into perspective, it is instructive to look at the parallel development of the World Wide Web at CERN and of Hyper-G at the Technical University of Graz. The two hypertext projects started at about the same time, namely 1989. Tim Berners-Lee's original proposal for a hypertext system starts from the observation of social aspects (relations at CERN are like a web that evolves over time) and then discusses the shortcomings of hierarchies and keyword-based systems. At its core, it proposes a decentralized web based on hypertext. The ideas behind Hyper-G, on the other hand, were based more on technical considerations and on earlier hypertext systems like Intermedia and Xanadu. In the end, the Hyper-G system was a centralized hypertext database.

The next steps were the development of servers and clients. CERN developed the CERN HTTP server between 1991 and 1993. First versions of the Hyper-G server appeared in 1992. First versions of Amadeus, the MS Windows client for Hyper-G, were released in 1993, and Harmony, the UNIX/X11 client, appeared (as version 0.84) a year later. Both Hyper-G clients reached version 1.0 in 1995.

The real breakthrough of hypertext on the Internet, however, came with the development of the HTTP client Mosaic and the HTTP server at the NCSA. Mosaic was implemented at breakneck speed: development began in June 1993. When I enrolled at Columbia University in September 1993, I used Mosaic from the first day on. Version 1.0 appeared on November 11, 1993. The NCSA HTTP server soon surpassed CERN's server in terms of popularity. Within one year of the appearance of Mosaic, the WWW went from zero to the most voluminous traffic and the most popular service on the Internet.

Throughout 1994 and 1995, the proponents of Hyper-G published articles in which they predicted that the WWW would "break down" under its own growth and presented Hyper-G as the superior, next-generation alternative.

(Look also at A Comparison of WWW and Hyper-G, for example.) Other "features" of the system were that a Hyper-G client could only speak Hyper-G and needed a Hyper-G server to proxy other (FTP, WWW, Gopher) requests. As a consequence, the Hyper-G server had full control over (and logging of) everything each single user did, even if she requested documents from the outside world. Wired ran an article on Hyper-G in 1996, discussing Hermann Maurer's book "Hyper-G: The Next Generation Web Solution" and the future prospects of Hyper-G quite optimistically.

In 2000, it can safely be concluded that the WWW did not "break down"; the Hyper-G system simply lost the race for the hypertext standard. Neither HTF, nor the Hyper-G clients, nor the protocol HG-CSP play any role anymore. Only the server, now called Hyperwave Information Server, has a niche market. Compared to the original Hyper-G system, however, the Hyperwave server has to bend over backwards: it has to interpret HTML though it preferred HTF, and it has to talk HTTP though it preferred HG-CSP. Needless to say, you can achieve all the fancy features of the Hyperwave Information Server by combining a classic WWW server like Apache with modules and other third-party products. And since Apache runs more than half of the whole WWW, the market for third-party products is big, be it indexing and searching, log file analysis, link checkers, dynamic content (PHP, database connectivity, server-side Java, server-side-what-you-want, turnkey e-commerce solutions), automatic load balancing, you name it.
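To illustrate the point about third-party tooling: a broken-link report over a plain Apache document root, for example, takes only a short script. The following Python sketch (the document root path and site layout are hypothetical) walks the file tree and flags relative links whose targets do not exist:

    # Minimal link checker for a static document tree, approximating one
    # Hyperwave feature (link integrity) on top of a plain web server.
    import os
    from html.parser import HTMLParser

    class HrefCollector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.hrefs = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                self.hrefs += [v for (n, v) in attrs if n == "href" and v]

    def check_links(docroot):
        # Report internal links that point to nonexistent files.
        broken = []
        for dirpath, _, filenames in os.walk(docroot):
            for fname in filenames:
                if not fname.endswith(".html"):
                    continue
                path = os.path.join(dirpath, fname)
                parser = HrefCollector()
                with open(path, encoding="utf-8", errors="replace") as f:
                    parser.feed(f.read())
                for href in parser.hrefs:
                    # Check only local, relative links; skip external URLs
                    # and pure fragment references.
                    if "://" in href or href.startswith(("mailto:", "#")):
                        continue
                    target = os.path.normpath(
                        os.path.join(dirpath, href.split("#")[0]))
                    if not os.path.exists(target):
                        broken.append((path, href))
        return broken

    for source, href in check_links("/var/www/html"):  # hypothetical docroot
        print(source, "-> broken link:", href)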

By now you are probably asking why the "technologically superior" Hyper-G lost the race for the standard. I don't know. But I have the feeling that the answer is related to the following observations:

Liberty is highly valued by most people. The PC was the liberation from the demi-gods of the mainframes. Linux was the liberation from monopolistically acting operating system vendors. The wall came down because people didn't want to live behind an iron curtain. My guess is that the decentralized WWW outran the centralized Hyper-G because the structure of a web fits most creative, collaborative efforts better than that of a hierarchy. (For more on this issue see The Cathedral and the Bazaar and The Magic Cauldron.)

Decision Check List for the IT Manager

When should you prefer Hyperwave over Apache?

Decision Check List for the Investor

When should you invest in Hyperwave instead of, say, SuSE?

References