Implemented our new, integrated graphics system, which supported remote display and manipulation of illustrative graphics on a Tektronix 4014 storage-tube display plugged into the line processor's printer port. The graphic data were embedded within the NLS structured-text files; an illustration produced as a plotter-driver file by any other graphics system could be picked up, attached to a specified location in an NLS document, and subsequently viewed and modified. Also, our Output Processor could direct that a properly scaled image of each such illustration be located appropriately within a multi-font page layout.


Gave up our high-performance, local display system for the line-processor-supported, remote display system -- to make ourselves live with the same remote services as our NIC clients and Utility customers. [On principle, we gave up our integrated, direct-view graphics and the fast response of our direct-memory-access, local display generator.]

Opened our "Workshop Utility Service" -- delivering NLS service over the ARPANET to DoD customers as pilot applications of office information service. We had gone out on bid for commercial time-sharing services and selected Tymshare Inc. of Cupertino, CA; their host, named Office-1, provided the computer service. We fielded special trainers and application-development staffs, and cultivated special customer representatives into a spirited community.


Brought up a Table Subsystem in NLS.

Designed our first, totally modular User Interface System. Got it running on a PDP-11 that talked to our TENEX through the network, via our Procedure Call Protocol.

Developed our Line Processor, as described by Don Andrews in (Pub-67-DispSel). It incorporated Intel's first microprocessor (the 4004) in a special box which was inserted in the communication line between a dumb display terminal and a modem. This made use of our Virtual Terminal Protocols, and managed a multi-window, two-dimensional screen using off-the-shelf, "dumb" display terminals. Our mouse and keyset input devices were plugged into the line processor, which appropriately translated their actions to control cursor position and special communications to the host. A printer port on the line processor provided local printout service; a special communication protocol allowed the host to send printer packets mixed in with display-support packets.
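
The mixing of printer and display-support packets on one line can be sketched as follows. This is a hypothetical modern illustration, not the actual Line Processor protocol: the tag values, the one-byte length field, and the packet framing are all invented for the example.

```python
# Illustrative demultiplexer for a host byte stream carrying two packet
# types on one communication line. Assumed (invented) framing: one tag
# byte, one length byte, then the payload.

DISPLAY, PRINTER = 0x01, 0x02  # assumed type tags, not the real ones

def route_packets(stream):
    """Split a host byte stream into (destination, payload) pairs."""
    out = []
    i = 0
    while i < len(stream):
        tag, length = stream[i], stream[i + 1]
        payload = stream[i + 2 : i + 2 + length]
        out.append(("display" if tag == DISPLAY else "printer", payload))
        i += 2 + length
    return out

# A display packet ("AB") followed by a printer packet ("C"):
# route_packets(bytes([0x01, 2, 65, 66, 0x02, 1, 67]))
#   -> [("display", b"AB"), ("printer", b"C")]
```

The point of the scheme is that one serial line serves both the screen manager and the local printer port, with the box in the middle doing the sorting.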

Finalized the specification for our Network Virtual Terminal, something which has become a key part of our architecture. The objective was, on the one hand, to free application programmers from worrying about the special features of different workstations, and on the other, to enable more flexible evolution by users of whatever workstations they might adopt to fit particular needs. As part of this, there was a terminal-independent Display Manipulation Protocol for communication from application program to terminal, and an application-independent Input Protocol for communication from terminal to application program.
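
The separation can be illustrated with a small sketch. Everything here is invented for the example (the operation names, the rendering target); the actual Display Manipulation Protocol's commands are not given in this text. The application emits abstract operations, and a per-terminal driver translates them for a concrete device:

```python
# Sketch of a terminal-independent display protocol: the application
# builds abstract operations; a driver renders them for one terminal
# family (here, an ANSI-escape-style device, purely as an example).

from dataclasses import dataclass

@dataclass
class MoveTo:          # abstract: position the cursor
    row: int
    col: int

@dataclass
class WriteText:       # abstract: emit characters at the cursor
    text: str

def render_ansi(ops):
    """One possible driver: translate abstract ops to ANSI-style escapes."""
    out = []
    for op in ops:
        if isinstance(op, MoveTo):
            out.append(f"\x1b[{op.row};{op.col}H")
        elif isinstance(op, WriteText):
            out.append(op.text)
    return "".join(out)
```

A different terminal gets a different `render_*` function; the application code never changes. The Input Protocol runs the same idea in the opposite direction, normalizing device events before the application sees them.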

Generalized the file structure of our document files to provide for generalized property structures associated with each addressable object; intended to accommodate composite integration of such media as graphics, digitized speech, scan-coded images, or any other arbitrary data form.
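
A minimal sketch of the idea: each addressable object carries an open-ended property list, so an arbitrary data form can attach to any statement. The class and field names below are illustrative only, not the actual NLS file format.

```python
# Each addressable object in the file carries an open-ended set of
# typed properties alongside its text and its structural children.

class Statement:
    def __init__(self, text):
        self.text = text
        self.properties = {}   # open-ended: any data form, keyed by kind
        self.children = []     # hierarchical document structure

    def attach(self, kind, data):
        """Attach an arbitrary data form (graphics, speech, image, ...)."""
        self.properties[kind] = data

doc = Statement("System overview")
doc.attach("graphics", b"<plotter-driver bytes>")
doc.attach("keywords", ["architecture", "display"])
```

Because the property set is open-ended, new media types need no change to the file structure itself; a viewer that does not understand a property simply ignores it.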


Began developing our first, integrated Help system.

Formulated the "AKW Architecture" -- implemented in stages.

Implemented the "shared-screen," televiewing mode of online collaboration between two or more NLS users.


Detailed use of NLS for internal management processes of ARC: cost records, working forecasts, purchase requisitions, etc.

In 1970 we began using the ARPANET to facilitate our re-programming of NLS for the forthcoming PDP-10 TENEX. The University of Utah had a TENEX on the network, and we used NLS on the 940 to write our new PDP-10 code; using our Tree-Meta compiler, we developed a cross-compiler for our 940 that produced PDP-10 relocatable binary code. We would ship that over the net for loading and debugging on Utah's TENEX. When the two computers and the intervening network link were all working properly (lots of flat tires in the early days of automobiles), our programmers would do all of this back-and-forth transitioning "through" the same workstation. I think that it was a record-making way of working; moreover, the NLS transport task was accomplished in remarkably short time (we attributed part of the efficiency to the network, and part to the use of NLS).

In late 1970 we brought NLS up on the PDP-10 TENEX with improved and new features (including multiple windows).

Began using our Mail/Journal system within our group. Integrated into NLS, this assumed that a mail item was a document -- so any part or all of an NLS document could be sent. Provided for permanent record in explicitly retrievable form (our Journal). As an electronic-mail system, this was quite advanced. It had a Directory service (our Ident System) to provide mail-relevant information about registered users; mail distribution was addressed by people's Idents, with no need to know or specify which host they used. Fields were provided for superseding other items, and for attaching keywords. An online index was provided for stored items.
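
The Ident-based addressing can be sketched as a simple directory lookup. The directory contents and field names here are invented for illustration; the actual Ident System records held more than this.

```python
# Sketch of Ident-based mail addressing: senders name people by Ident,
# and the directory service resolves each Ident to a delivery host, so
# users never need to know or specify hosts. Entries are hypothetical.

IDENT_DIRECTORY = {
    "DCE": {"name": "Doug Engelbart", "host": "SRI-ARC"},
    "WKE": {"name": "Bill English",   "host": "SRI-ARC"},
}

def resolve(idents):
    """Map a distribution list of Idents to (ident, host) delivery pairs."""
    return [(i, IDENT_DIRECTORY[i]["host"]) for i in idents]

# resolve(["DCE", "WKE"]) -> [("DCE", "SRI-ARC"), ("WKE", "SRI-ARC")]
```

The design choice worth noting is the indirection: because hosts are looked up at delivery time, a user can move between hosts without every correspondent updating an address.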


Began design of windowing capability for NLS.

Developed concept of a user "reaching through" his personal workplace (i.e. his familiar online working files and application programs) to access less basic, specialized data and application processes (and other people); i.e. the "reach through" should provide access to these, translated by the integrated support system, so as to appear as coherent parts of his familiar, personal workplace.

Specified our first mail and "Journal" system as part of an explicit pursuit of a "Dialog Support System," planning for it to be part of our ARPANET-NIC service.

Developed a document-outputting capability that processed our composite, text-graphic document files to drive a service-bureau, CRT-based, full-page Stromberg-Carlson photo printer, producing documentation with graphics and text mixed on the same pages.

Became the second host on the ARPANET with our SDS 940. (UCLA was first, UCSB next, then the University of Utah, then ....)


NLS by now had a full set of basic features that have since characterized it and AUGMENT (its commercial successor). E.g.: full-screen, integrated outline and text processing; in-file addressing; in-file and cross-file text or structure manipulation by address; basic repertoire of view-control commands; content filtering; generalized, computer-executable citation links; verb-noun, consistent command syntax with optional use of ultra-fast, concurrent control using the mouse and chord keyset. Also included was a calculator package, integrated into NLS: mouse-selecting operands; totaling columns; inserting accumulator contents at selected locations or replacing selected numbers in a file; executing user macros with pauses and prompts for users to select file variables or provide typed-in values.
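
The in-file addressing mentioned above can be sketched with NLS's familiar statement numbers, in which a string such as "2b1" names a path through the outline hierarchy (numbers select siblings, letters select children, alternating level by level). The statement-number convention is from NLS; the parsing and tree representation below are illustrative only.

```python
# Sketch of in-file addressing over a structured (outline) file:
# a statement number like "2b1" is parsed into a path of zero-based
# indexes, then walked through a nested [text, [children...]] tree.

import re

def address_to_path(addr):
    """'2b1' -> [1, 1, 0]: alternating number/letter tokens, zero-based."""
    path = []
    for token in re.findall(r"\d+|[a-z]+", addr):
        if token.isdigit():
            path.append(int(token) - 1)
        else:
            # 'a'->0, 'b'->1, ... (single letters suffice for this sketch)
            path.append(ord(token) - ord("a"))
    return path

def fetch(tree, addr):
    """Return the text of the statement named by a statement number."""
    node = ["", tree]
    for i in address_to_path(addr):
        node = node[1][i]
    return node[0]

outline = [
    ["Intro", []],
    ["Design", [["Display", []], ["Input", [["Mouse", []]]]]],
]
# fetch(outline, "2b1") -> "Mouse"
```

Cross-file manipulation by address follows directly: a citation link names a file plus such a statement number, and the same walk locates the target.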

Put together our home-designed, custom-built display system to run with the SDS 940. Two custom-built, random-deflection display generators were each time-shared to drive six small, 5" high-precision CRTs. In front of each of these CRTs was mounted a high-quality video camera so as to scan the CRT face. These twelve video lines were brought out to our work area, where each work station had a high-quality video monitor for its display. This gave us four sizes of alphanumeric characters, and arbitrary vector-graphic figures. The display generators were connected on a Direct Memory Access bus so that switching from one stored view to another occurred essentially in a thirtieth of a second.

Came online with an improved NLS on a time-shared, SDS 940; large swapping drum; special, home-made display system operating from direct-memory access, providing integrated text and graphics, and delivering video to up to twelve workstations out in our laboratory.

Made a public debut at the Fall Joint Computer Conference in San Francisco, December. For this event, we added another layer of new technology on top of NLS, a system that was already very complex for its day. It is worth an extra bit of description here. Bill English and I wrote a paper for this conference describing ARC's objectives, physical laboratory, and the current features of NLS. In the Spring, when the Program Committee was considering candidate papers and organizing its sessions, I also proposed that they let us have a full hour-and-a-half session to put on a video-projected, real-time presentation. After considerable deliberation, and no less than two site visits to our lab at SRI, they consented.

It was a considerable gamble, possibly an outright misuse of research funding. I have no illusions that it could possibly have been pulled off without Bill English's genius for getting things to work. Our new display system provided us with twelve video cameras; we left about half of them viewing their display CRTs, and used the others to provide video views of people, borrowing tripods and drafting all kinds of people as camera operators and prompters.

We leased two video links to send images from SRI to the Conference Center in San Francisco -- a direct distance of about 30 miles. It required temporarily mounting four pairs of dishes -- two atop our SRI building, two atop the Conference hall, and four on a truck parked on top of a relay mountain. We procured some video-lab equipment: frame splitters, switches, faders, and mixers. We made special electronics to get our mouse and other terminal signals from the podium to the 940 at SRI.

It required a special video projector, whose rental included a specialist from New York to set it up and operate it. He proved invaluable in making other things work that day, too. Two cameras were mounted on the stage where I sat at the special work station (which the Herman Miller Company had made for us, and donated).

I was on-stage as anchor man during the continuous, 90-minute presentation, and Bill sat in the canvas-enclosed, raised booth at the back of the auditorium, directing the participants according to the script that I had prepared. People in our laboratory had key roles, and Bill coordinated us all via a voice intercom, while he also did the switching and mixing and frame splitting to put together the projected images.

During that 90 minutes, we used the projected display images (composite text & graphics) both to present agendas and descriptive portrayals, and also to demonstrate what NLS could do and how we applied it to our planning, documenting, source-code development, business management, and document retrieval.

The audio and video of the entire presentation was captured on film (no portable video recorders in those days). We had ten prints made, and circulated free loaners to people for years. (This film has recently been converted to VHS video cassette form to facilitate viewing by other people.)


At Spring meeting of ARPA Principal Investigators, in Ann Arbor, during session kicking off ARPA's newly revealed Network development project, I volunteered to develop and operate a Network Information Center (NIC) -- as a special, prototype support service for the network-connected community of ARPA-IPTO researchers.

Developed our first explicitly separated command processor (I used the term "control" then, instead of "command"), supported with our "Control Meta-Language," a separate, compilable command-description language.

Developed our MOL 940, a Machine Oriented Language and compiler for the SDS 940.

Equipped a meeting room with specially built tables, video hookup, and computer controls for real-time support of meetings with our CDC-3100 version of NLS. Over twenty people could participate; each had a full view of a nearby video screen, and each could pick up a mouse to control a specially shaped "participant's cursor" so all could see what he was pointing at. One master control station provided discussion-recorder control for access, flexible viewing, and modification of project documents, agendas, formulation of measures to act upon, etc. We used this support system on 12-13 October, 1967, for a progress-review meeting with our research sponsors: Bob Taylor and Barry Wessler from ARPA, Fred Dion and Dean Bergstrom from RADC, and Gene Gribble from NASA.


All source-code development, maintenance and documentation now fully moved into the NLS environment (providing natural and powerful support for structured programming).

Began planning for move to time shared environment.


Developed the mouse as part of an explicit search for optimum screen-selection techniques in association with our online-application framework.

Moved NLS to a stand-alone CDC-3100, with online disk pack, 16K of 24-bit memory, line printer, paper-tape and punched-card I/O, custom built display. Full structured files, with in-file addressing and uniform text- and structure-manipulation commands.

Published (Rpt-62J), which included the following: Discussion of how the computer can give aid to the basic communication processes between the human and his external environment (which environment of course includes his kit of computer tools). Clear forerunner of our later User Interface System. Uses the example of introducing a chord keyset into a user's environment to bring out specific examples of the NEED-POSSIBILITY REVERBERATION phenomenon, which was introduced in 1962 (Rpt-62J), and which is a basic part of all my subsequent strategic thinking. Extensive discussion of the USER-SYSTEM, SERVICE-SYSTEM DICHOTOMY, including the characteristics of research in each domain. (CO-EVOLUTION, without yet naming it such.) Detailed description of the earliest version of our later FLTS (oFf-Line Text System), and what eventually became our DEX System (Deferred EXecution). I worked at a Model-33 Teletype, which made a punched-tape record of all that was typed. Escape and Command codes embedded anywhere in the text could cause the later "batch" process to correct any previous error or omission, including those in any prior commands. The Model-33 had only one case of alphabetic characters; this process enabled me to designate cases so that the later, processed printout on a two-case Flexowriter (paper-tape driven) would come out with proper alphabetic cases. (I made myself write most of that report this way. And my wife made me move out into the garage to do it because the Model-33 was so noisy. I remember the extra problem of typing with cold fingers.)

Showed a real-time movie of NLS to ARPA IPTO Principal Investigators at our May 16-17 meeting at MIT. Simple, structured-text manipulations, with very fast concurrent control (mouse and keyset), and very fast computer response (stand-alone CDC-3100). Illustrated the basic difference in perspective between our approach and the prevailing concepts of "time-shared computer support." At that evening's cocktail hour, Bob Taylor told me, "The trouble with you, Doug, is that you don't think big enough." Well, after he dragged out of me a description of what I'd really like, he encouraged me to formulate a proposal for it -- a multi-user laboratory, based on a time-shared machine, to get on with real bootstrapping.


Made explicit strategic decision to bypass the online typewriter and go directly to display workstations. In spite of most other online developers working with typewriters, I felt that the much higher "augmentation potential" of displays warranted early pursuit, and figured that cost would surely come down in a few years. Keep in mind that CRT workstations were very expensive then -- they weren't a consumer product by any means. The single, large-CRT workstation we attached to the CDC-3100 that year probably cost us over $100,000, and required a lot of custom work on our part. [But we have always kept an access mode open for "dumb terminals" and "glass teletypewriters," which can elicit as much service as possible within their limitations.]

Moved to stand-alone CDC 160A; paper-tape I/O to and from a Flexowriter paper-tape typewriter; first, primitive version of our online, structured-file editing. Had a simple, batch system, too: type in correction directions on a paper-tape-punching Flexowriter; feed that paper tape into the 160-A for pre-processing; results and the file to be corrected transferred via mag tape to SRI central processor (Burroughs 220, I think); processed results brought back for post-processing and printing on the 160A.


Developed the chord keyset as a one-handed alternative for character input in an interactive environment.

I participated in the ARPA-sponsored "summer study group" that kicked off Project MAC at MIT. Memories: Wrote a project memo about the dichotomy between "User System" and "Service System." Had my turn to address the whole group (50 or so people); not much reaction, but I do remember the group hoot when I said that we'd all be seeking to drive response time for many man-computer interactions down to at least a quarter of a second before the returns began to diminish.

Tried unsuccessfully to develop a workable system using a long-distance data link from Menlo Park to Santa Monica, with the SDC Q32 as a time-sharing host, and a CDC 160A as local communication manager and display driver.


Launched a long-term R&D program with the following basic Framework Principles:

There are many inventions, skills, methods, working conventions, organizational modes, etc. which have been integrated into the human's knowledge-work environment to "augment" his basic, inherited, biological capabilities. Let a working assemblage of all of these be called an Augmentation System; my Framework considers this whole system to be a valid object of study and improvement.

I broke the many parts of the Augmentation System into two main sub-systems: one contained all of the hardware, software and other artifacts -- the Tool System; and all the rest of it I called the Human System. Note that the Human System contains our natural languages, and the conceptualizations and formalisms of every discipline: an overwhelming network of invention. (Sometimes, in the early years, I called these the Service System and the User System).

The emergent information technologies promised such startling innovation in the Tool System that, according to my Framework, most of the Human System elements that are involved in our knowledge work are likely to be up for improvement or replacement. And the Framework further predicted very large improvements in human knowledge-work capability as a result -- after we have gone through a few generations of evolution.

The challenging strategic question for me was: how best to invest whatever resources are available in a manner that best serves the evolution toward truly high performance by our knowledge organizations?

One answer: begin consciously scouting for elements in our Human System that are candidates for being changed as a means of better harnessing the technology toward our human ends.

This led directly to such things as our structured-text, with its links and views, and also to the mouse and the keyset.

"Bootstrap" leverage was another answer. Look for the innovations that will boost not only our regular productive capability, but will add as much as possible to our capability to further improve our Augmentation System.

This kept me focused on documents -- which carry the knowledge, plans, arguments, etc. that are critical to helping us better climb the evolutionary hill. It also very directly pointed to the importance of developing improved support for collaboration among distributed workers; and from there to community support, etc.

The whole Augmented Knowledge Workshop concept emerged from the Bootstrap strategy -- working toward coherent, integrated access to an open-ended, evolving collection of resources and people. And doing this in a way that best enables evolutionary freedom of parts of the whole system without being unnecessarily anchored by the needs of other parts. From this arose such things as the Procedure Call Protocol, the application-independent User Interface System, the Network Virtual Terminal, etc.