Reinventing the Web

January 12, 2009 · Posted by Greg Lloyd

John Markoff wrote a really good Jan 11 2009 New York Times profile, In Venting, a Computer Visionary Educates, on Ted Nelson and his new book, Geeks Bearing Gifts: How the Computer World Got This Way. Markoff notes that Tim Berners-Lee invented the World Wide Web, but: "Lost in the process was Mr. Nelson’s two-way link concept that simultaneously pointed to the content in any two connected documents, protecting, he has argued in vain, the original intellectual lineage of any object... His two-way links might have avoided the Web’s tornado-like destruction of the economic value of the printed word, he has contended, by incorporating a system of micropayments."

I was one of the skeptics who thought that the World Wide Web, with its fragile one-way links, would never take off as a global hypertext platform. Classic hypertext systems (from HES and Augment through Xanadu, Plato, Intermedia, Lotus Notes, and Dynatext) went to great lengths to preserve the integrity of links, relationships, and content.

For an excellent first-hand history of the Web - and a linked data proposal which seems to share many of the simple, scalable properties of his original invention - see Tim Berners-Lee's Feb 2009 TED Talk on the 20th anniversary of the Web:

Tim Berners-Lee, The next Web of open, linked data, Feb 2009 (published Mar 2009)

Some comments on this talk's comment thread suggest that it's inappropriate for TBL to take credit for inventing the Web. I replied:

I believe that TBL is typically modest and accurate in saying he invented the Web. HTTP, the SGML-based definition of HTML, and the URL addressing scheme were quite literally his inventions - in concept and reduction to practice. Almost every concept of value on the Web (search engines, browsers, notification) is built over this simple, open, highly scalable architecture.

TBL does NOT claim to have invented hypertext or the underlying and pre-existing internet protocols which he used very effectively. He quite intentionally made an inspired set of tradeoffs. I suggest paying very careful attention to his linked data proposal.
March 14, 2009 | # | Greg Lloyd

The idea that any sensible person would rely on a global hypertext system - where links on one computer pointed at locations on another computer, and would break whenever the remote computer was unilaterally moved, renamed, taken offline, or abandoned - seemed absurd.

The idea that you would have no way to know what incoming links would break when editing or refactoring content seemed just as bad.
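To make the contrast concrete, here is a minimal sketch (not any particular system's API) of the bookkeeping a classic two-way link system maintains and the one-way Web omits: a reverse index that lets a document answer "who points at me?" before an edit or rename, so links can be repaired instead of silently broken.

```python
from collections import defaultdict

class TwoWayLinkStore:
    """Toy two-way link registry: every link is recorded in both directions."""

    def __init__(self):
        self.outgoing = defaultdict(set)  # source -> {targets}
        self.incoming = defaultdict(set)  # target -> {sources}

    def add_link(self, source, target):
        # Record the link in both directions at once.
        self.outgoing[source].add(target)
        self.incoming[target].add(source)

    def backlinks(self, target):
        # The query a one-way Web link cannot answer: who links here?
        return sorted(self.incoming[target])

    def rename(self, old, new):
        # Because incoming links are known, a rename repairs them
        # rather than leaving dangling references behind.
        for source in self.incoming.pop(old, set()):
            self.outgoing[source].discard(old)
            self.outgoing[source].add(new)
            self.incoming[new].add(source)

store = TwoWayLinkStore()
store.add_link("essay.html", "notes.html")
store.add_link("index.html", "notes.html")
print(store.backlinks("notes.html"))  # what would break if notes.html moved?
store.rename("notes.html", "ideas.html")
print(store.backlinks("ideas.html"))  # same links, repaired not broken
```

On the open Web, no such global reverse index exists; each server only knows its own outgoing links, which is exactly why moving or renaming a page breaks links you cannot even enumerate.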

The World Wide Web protocols looked like they would work for relatively small cooperative groups like CERN, which could keep things from breaking by having shared goals, and by using peer pressure plus out-of-band communication to keep distributed content alive.

Actually that intuition was pretty good, because the World Wide Web took off in a direction based on other incentives compatible with those assumptions - and grew like crazy because, unlike the alternatives, it was simple, massively scalable, cheap, and eliminated the need for centralized control.

1) The Web became a distributed publishing medium, not the fabric for distributed editing and collaboration that Tim Berners-Lee and others envisioned. People and Web publishing engines like Amazon created content and kept it online while it had economic value, historical value (funded by organizations), or personal value. Content hosting became cheap enough for individuals or tiny groups. Advertising-supported content became "free".

2) Search engines spanned the simple Web. Keeping content addressable now gained value, since incoming links not only allowed people to bookmark and search engines to index what you published (or sold), but also gained economic value through PageRank. This provided even greater motivation to edit without breaking links, and to keep content online while it retained some economic, organizational, or personal value.

3) People and organizations learned how to converse and collaborate over the Web by making it easy to create addressable content others could link to. The simple blog model let people just add content and have it automatically organized by time. The Wiki model required more thought and work to name, organize, and garden content, but also created stable, addressable islands of pages based on principles that reward cooperative behavior.

4) Search engines, syndication and notification engines built over the Web's simple, scalable protocols connected the Web in ways that I don't think anyone really anticipated - and work as independent and competing distributed systems, making rapid innovation possible.

Tim Berners-Lee made an inspired set of tradeoffs. Almost every concept of value on the Web (search engines, browsers, notification) is built over his simple, open, highly scalable architecture.

I believe it's possible to provide what TBL calls "reasonable boundaries" for sharing sensitive personal or organizational data without breaking the basic W3C addressable content protocols that make linking and Web-scale search valuable. That should be the goal for social and business software, not siloed gardens with Web-proof walls.

As TBL said in a Jan 2013 interview: “The web isn’t about just sharing everything, destroying privacy… [but] if I want to share something with you it shouldn’t be the technology that gets in the way.”

So when people ask what will deliver two-way links, fine-grained comments and tagging, traceable transclusion, and the promise of the Semantic Web, I suggest an approach which layers these hypertext capabilities over the basic Web in a way that exposes readable content which is absolutely compatible with the basic Web for all readers and existing engines.

Offer seamless collaborative editing, traceability, semantic search, and other capabilities by extending hypertext editing engines to support new layered protocols, transparently downsampling richer models to deliver basic Web content to clients that use basic Web protocols. Offer extended formats and services to clients or other servers with extended capabilities.
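The downsampling idea can be sketched in a few lines. This is a hypothetical illustration (the media type and field names are my invention, not any real protocol): the server keeps one richer internal model - paragraphs with stable ids, comments, backlinks - and renders it two ways. Basic Web clients get ordinary HTML whose paragraph ids survive as fragment anchors, so nothing breaks for existing readers or search engines; extended clients that ask for the layered format get the full model.

```python
import json

# Richer internal document model: paragraph-grain ids, comments, backlinks.
doc = {
    "title": "Reinventing the Web",
    "paragraphs": [
        {"id": "p1", "text": "Two-way links preserve lineage.",
         "comments": ["Agreed."], "backlinks": ["essay.html#p7"]},
    ],
}

def render(doc, accept="text/html"):
    """Render one internal model for two kinds of clients."""
    if accept == "application/x-layered+json":  # hypothetical media type
        # Extended protocol: expose the full hypertext model.
        return json.dumps(doc)
    # Basic Web: downsample to plain HTML. Paragraph ids become
    # fragment anchors, so the page stays fully addressable.
    body = "".join(
        f'<p id="{p["id"]}">{p["text"]}</p>' for p in doc["paragraphs"]
    )
    return (f"<html><head><title>{doc['title']}</title></head>"
            f"<body>{body}</body></html>")

print(render(doc))                                   # plain HTML for any browser
print(render(doc, "application/x-layered+json"))     # full model for rich clients
```

In a real system the `accept` switch would be HTTP content negotiation; the point is only that the richer layer is additive, and the basic-Web rendering loses nothing a plain client could have used.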

I'm sure that won't satisfy Ted, but until there's a sea change in the basic structure of the Web - which is what Nelson's and others' global visions require - I believe you'll have to be satisfied with stable islands in the Web's storm-tossed sea, and protocols that support robust connections among islands.

I believe it's even possible to implement Ted's micropayment transclusion model as a layered protocol. People's DRM aversion, rights contracting, and enforcement seem to be bigger issues than the technical barriers.

I also believe that Enterprise 2.0 secure collaboration and social networking provide the motivation to make this reinvention of the Web a reality.

Traction TeamPage was designed from the start to use layered principles, working with and over the Web without sacrificing (internal) two-way links, paragraph-grain comments, tagging and relationships, content journaling, spaces with role-based borders, and other capabilities that match and better the capabilities of classic hypertext systems. Consider TeamPage a proof of concept.

I hope that the evolution of Enterprise 2.0 platforms leads to the definition of layered protocols which extend valuable hypertext capabilities across hypertext systems - Traction's and others' - to extend the Web for everyone's use, remembering the lessons of simplicity, scalability, and innovation that the Web has taught us all.

For additional thoughts see Peter O'Kelly's comments on Markoff's profile, and Peter's excellent follow-up notes on the Web and Hypertext.

For more on how to intertwingle sites and services over the Web see Intertwingled Work and Enterprise 2.0 - Letting hypertext out of its box.

Update 14 Jul 2014: See Reinventing the Web II for follow on discussion and analysis.


I originally titled this post "Re: In Venting the Web" - but chickened out - grl
