Desktop Data Capture


The computers of the 1960s were mainly batch processors. Most organizations were still running paper systems; computerisation had scarcely touched them. There were no terminals, telecoms were rudimentary, and software was limited or non-existent. In the 1970s, mainframe operating software improved dramatically, and telecoms became more accessible and somewhat easier to use. Visual display units based on cathode ray tube technology, with reasonable cost and reliability, became available.

Real-time computing became a realistic, cost-effective proposition for larger organizations. The appearance of the minicomputer from the late 1960s provided a fast, relatively inexpensive system designed for telecom work that opened the door to widespread terminal usage.

One aspect of the increased terminal usage was the connection of terminals directly to the mainframe. For many mainframe users that was not a cost-effective solution, because direct connection needed new ‘state-of-the-art’ mainframes to operate efficiently. A more attractive proposition was to connect a minicomputer to the existing mainframe and use the minicomputer to drive the terminals. The minicomputers were much better telecom systems than their mainframe predecessors, partly because they usually had a monolithic operating software environment rather than separate operating system, data management and telecoms software. (This is the approach later copied in second-generation PCs.) When the minicomputer was connected by cable to the mainframe in the computer room it was called a Front-End Processor; when it was linked via telecoms it was called a Distributed Processor.

The key-to-disk system used in the punch room was itself a minicomputer. One of the main milestones in the development of desktop computing was the movement of data capture out of the punch room to the point where the input forms were created. This was done by moving terminals, or complete systems, to remote locations. These moves represented the beginning of the end for the punch room as historically conceived.


Later in the 1970s these minicomputer systems were upgraded with file processing, data management, high-level programming languages and, later, interactive telecom links with mainframes. This was the tipping point: desktop computing was born. Developments continued, and in the 1990s organizations re-engineered their work processes. The desktop terminal became a multi-function workstation, either as a PC or as a terminal, with full access to a complete range of facilities including internet and intranet connectivity.

The Case Studies in this section are those that have survived. They chart various approaches to distributed processing via data capture. Distributed processing and punch rooms co-existed; they were not mutually exclusive. New applications tended to start with distributed-processing data capture, then evolve into desktop computing and later into PCs/workstations, which often included scanning.