INIS Pioneering Information Management: The early days of computerized acquisition, processing and storage of data at the IAEA
In his Third Law, British author Sir Arthur C. Clarke states that “any sufficiently advanced technology is indistinguishable from magic”. When a technology becomes estranged through the inevitable process of obsolescence, it turns alien to the untrained eye, unfamiliar and obscure – elements that may well add to the original mystique of early computers. In this article we will take a look at a number of early information technologies – some gone, some forgotten, some still alive and well – to unveil the magic that has greatly contributed to INIS throughout its 45-year journey in the realm of nuclear culture for peaceful uses.
First computer of the IAEA
FIG. 1. An IBM System/360 together with peripherals; the console is visible in the forefront.
In the beginning, there was the computer: heralded as a “Giant Brain” by the press, the first computers of the 1940s, such as the ENIAC, were a jumble of vacuum tubes, crystal diodes, relays, resistors and capacitors, with over a million hand-soldered joints, filling an entire room: all with roughly the computing power of a modern pocket calculator. When INIS began its journey in the 1960s, computers had already come a long way. Since 1951, when John von Neumann’s machine was completed at the Institute for Advanced Study (IAS), the complexity of computer circuits had doubled steadily year after year, as first noted in 1965 by Gordon Moore, co-founder of Intel and author of the famous law bearing his name (an observation he later revised to a doubling every two years). By the time Moore made his observation, the world of computers had already transitioned from vacuum tubes to transistors, the public had seen the birth of the PDP-1, the ancestor of the modern personal computer, and the paradigm of general-purpose computers had won its battle against that of single-purpose, highly specialized systems. The appeal of an all-purpose machine convinced the Agency’s management to explore this emerging technology and to purchase a computer able to tackle different problems, such as handling the specialized duties required by INIS while, at the same time, solving financial or statistical problems. This machine was the IBM System/360 Model 30.
More than a computer, the System/360 was Big Blue's biggest gamble since its foundation. With the legendary Thomas J. Watson, Jr. at the helm of the company, IBM’s bet was to launch not one, but a full family of six mutually compatible computers, together with more than fifty peripherals that modularly expanded the features of the computer unit. With an initial investment of $5 billion, the System/360 was part of Tom Jr.’s vision to move IBM’s major source of revenue from punched-card equipment to electronic computer systems and to make the final transition from discrete transistors to integrated circuits. Aggressively pursuing success with its new system, IBM made the bold move of discontinuing all five of its previous computer product lines, including the successful 1400 series, leaving customers no option but to embrace the new System/360. IBM, however, made the transition less traumatic through an unprecedented pursuit of compatibility: thanks to emulation features, the System/360 could run applications written for its predecessors, something IBM had never before endorsed. The System/360 Model 30 quickly became an incredibly popular mainframe. Orders climbed to 1000 units per month within two years of its launch, and by the early 1970s IBM held a peak of 70% of the world’s mainframe market, with customers including NASA, Ford and Volkswagen. In his memoir, Tom Jr. would recall: “The System/360 was the biggest, riskiest decision I ever made, and I agonized about it for weeks, but deep down I believed there was nothing IBM couldn’t do”. But how did this computer work and what did it look like? A Model 30 would normally include a console, a card reader/punch, a printer and a number of magnetic tape drives. With its 64 KB of RAM, 8-bit microarchitecture and the ability to perform circa 33 000 additions per second, the System/360 family even helped NASA land the first man on the Moon.
There was no doubt that it would be a perfect companion for the birth of INIS, the Agency’s first computerized database on the peaceful uses of nuclear energy.
FIG. 2. A close-up of an IBM System/360 console
One of the IAEA’s goals in investing in a computer system was to address the information management requirements related to the development and implementation of INIS. The deployment of a computer had an impact at different levels of the Organization: not only did it transform the way people worked, but it changed the Agency itself by triggering a change in its organizational structure. In fact, along with the deployment of the IBM System/360 Model 30, a whole new Section was born within the Division of Scientific and Technical Information (STI). The Computer Section, whose legacy is carried on today by the Software Development and Support Group (SDSG) in the Nuclear Information Section, used the IBM System/360 to develop and run applications for checking the inputs submitted by Member States, in compliance with the decentralized concept of the United Nations International Scientific Information System (UNISIST). The computer soon proved essential in processing the inputs and producing the outputs on magnetic tape, together with the files for the printed version of the INIS Atomindex, a monthly bulletin containing bibliographic references to the items of literature in the INIS Collection, distributed to INIS Member States. The system’s printer, an IBM 1403, whose output quality would remain unsurpassed until the advent of laser printing technology in the 1970s, allowed the Agency to print the INIS Atomindex in-house, in complete autonomy and without having to rely on the support of third parties. The experience gained by the Agency in establishing, running and maintaining a bibliographic information system was so vast that, when FAO considered establishing its own information system for the agricultural sciences in 1973, namely AGRIS, INIS was clearly the model to follow.
Thus began many years of cooperation between the IAEA and FAO, continuing today on different projects, with the Agency sharing its IBM computer technology to process the records of agricultural literature and to produce the FAO Agrindex publication.
To the modern computer user, the IBM System/360 might look cumbersome and primitive in design, its size laughable and its hardware and software specifications ridiculous; however, this computer diligently served the Agency for decades. According to Big Blue, it would take an estimated 33 million hours of operation for one of the solid-logic technology modules on a System/360 to fail. The IBM System/360 made several significant contributions to the world of computing and mainframes. First and foremost was compatibility, through the standardization of input and output interfaces and through the first operating software ever written for a family of compatible processors. The IBM System/360 was also responsible for the introduction of the 8-bit byte and 8-bit architecture, which, thanks to the staggering success of computers like the Commodore 64, the best-selling home computer in history, reached millions of home users worldwide during the 1980s, launching the era of “computers for the masses” in which we live today, a concept envisioned by pioneering entrepreneurs such as Jack Tramiel, Sir Clive Sinclair, Steve Jobs and Bill Gates. On a more colourful note, it was actually possible to order these huge IBM computers and peripherals in different colours, just as it is possible today to order a laptop or a mobile phone in different colours – testimony that the IBM System/360 was meant to be more than a simple technological step in the history of computers, a “sharp departure from concepts of the past in designing and building computers”, in the words of Thomas Watson Jr. himself.
Having introduced the IBM System/360 Model 30, the reader might wonder how bibliographic records were loaded into the computer. At first, this very time-consuming job was done manually by subject specialists and professional typists. Then, in 1973, INIS decided to pioneer another technology: Optical Character Recognition (OCR).
The idea of introducing OCR technology to INIS was an effort to optimize the workforce at the INIS Secretariat, whose mandate was to process the contributions arriving under the UNISIST decentralized model. OCR, in essence, converts typewritten text into a digital format understandable by a computer, relieving computer users of the burden of long manual input sessions. Thanks to the rising popularity of inexpensive flatbed scanners, people nowadays are familiar with this technology, and some use it on a daily basis: by feeding scanners with all kinds of forms and templates, OCR software on the computer converts, in a matter of seconds, text written on paper into an easily editable electronic format. In the 1970s, however, this technology was still immature and most companies or organizations could only dream of having such options at their fingertips. The first true OCR machine had been installed at Reader’s Digest in 1954 to process typewritten sales reports and transform them into punched cards for the computer. The second generation of OCR reading machines appeared between the mid-1960s and early 1970s, with IBM presenting its first model, the IBM 1287, in 1965 at the World’s Fair in New York. Investing in OCR technology at that time first required the purchase and deployment of a highly expensive OCR machine; then the understanding and adoption of its input formats; and lastly, the purchase of suitable machinery compatible with those formats. In the early 1970s, INIS found its OCR reader of choice in the Compuscan 170, a monster machine weighing 548 kg, created by the American company Compuscan Inc. of Teterboro, New Jersey, whose fame had risen greatly thanks to the endorsement of the US Air Force, which successfully deployed a Compuscan setup in its Foreign Technology Division (Machine Translation) facilities.
Unlike modern OCR software, the Compuscan could properly process documents only in specific, standardized character typefaces.
FIG. 3. Picture of a complete Compuscan system
FIG. 4. An IBM Selectric "Golf Ball"
Originally, these typefaces came in only two kinds for Latin characters: OCR-A and OCR-B. Easily readable by both humans and computers, these were sans-serif, monospaced fonts released in 1968, respectively by the American Type Founders and by the Swiss designer Adrian Frutiger. At first, the OCR-A and OCR-B typefaces could only be printed by the IBM Selectric typewriter, another revolutionary IBM technology: an electric typewriter that used easily interchangeable rotating typeface spheres in place of the regular typebars found in mechanical typewriters. These rotating ‘golf balls’, with reverse-image letters moulded on their surface, could print the fonts understood by the Compuscan, enabling the use of standardized OCR sheets as input to the INIS production cycle. The Compuscan could produce an output of only 100 characters per second, circa 1/15th of what current architectures offer. Such throughput nonetheless represented a substantial improvement over manual input, considering that a professional typist types only between 50 and 80 words per minute (wpm). Later on, alongside the Compuscan, the IAEA made use of another OCR system, the ECRM 5200 Autoreader, provided by ECRM Inc. on loan, free of charge. The ECRM 5200 could process 500 wpm of English text and would have cost the astronomical sum, by today’s standards, of $37 500. As of 1974, both the Compuscan and the ECRM 5200 officially offered the ability to process the Cyrillic alphabet, a feature of great interest to INIS, considering the prominent role of the USSR in the submission of INIS-related materials.
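The throughput gain over manual typing can be checked with a little arithmetic (a rough sketch, assuming the conventional five characters per “word” behind wpm figures):

```python
# Rough throughput comparison: Compuscan OCR output vs. a professional
# typist. Assumes the conventional 5 characters per "word" behind wpm.

CHARS_PER_WORD = 5

def wpm_from_cps(chars_per_second: float) -> float:
    """Convert a characters-per-second rate into words per minute."""
    return chars_per_second * 60 / CHARS_PER_WORD

compuscan_wpm = wpm_from_cps(100)   # Compuscan: 100 characters/second
typist_low, typist_high = 50, 80    # typical professional typist (wpm)

print(f"Compuscan throughput: {compuscan_wpm:.0f} wpm")
print(f"Speed-up over a typist: {compuscan_wpm / typist_high:.0f}x to "
      f"{compuscan_wpm / typist_low:.0f}x")
```

At roughly 1200 wpm, a single OCR reader could replace the output of some 15 to 24 typists, which helps explain why such expensive machines were worth the investment.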
Before becoming obsolete, the Compuscan and the ECRM machines significantly improved the efficiency and speed of INIS operations at the Secretariat and, most importantly, reduced the manual workload of its staff. However, these machines were far from automatic. Human supervision was still required to correct mistakes caused by damaged characters, with corrections performed using an uncomfortable mini-viewer and a built-in cathode-ray tube display. These days, OCR at INIS is carried out entirely by software, with the aid of modern digitizing equipment. Highly advanced modern software products, like ABBYY FineReader, InftyReader and Adobe Acrobat, offer the Secretariat the ability to convert scanned images of handwritten, typewritten or printed text of any kind into machine-encoded text, enabling INIS users and Member States to perform simple searches throughout converted documents, in only a matter of seconds, using the INIS Collection Search (ICS) engine.
Data storage and distribution at INIS
FIG. 5. Microfilm and Microfiche formats
FIG. 6. IBM 2401 magnetic tape drives, in blue.
Since the dawn of human history, man has sought the means to distribute and preserve data, information and knowledge. Assuming that knowledge is what we know, and data is the description of certain facts of the world around us, then information is what allows us to expand our knowledge. For example, in the case of a cave painting, the painting is the information, what is depicted is data, and the wall of the cave, the canvas where the information is stored, is what we would call a medium. Paraphrasing Canadian literary critic Herman Northrop Frye, for thousands of years the most technologically efficient medium for divulging information has been, simply, the book. A book needs no machinery to work: it requires only the eyes of the reader to transfer the information stored in it. Why, then, would we ever bother giving up books in favour of other media? Partly because, as a medium, a book has its limitations: a book cannot exceed a certain size, the amount of text written on a page is bound by the resolution of the human eye, and storing a large collection of books requires ample storage space. But what if we could overcome these limitations by adjusting size and resolution to our liking? This is the concept behind microforms, micro-reproductions of documents at one twenty-fifth of the original size.
A microform is unreadable to the naked eye and requires magnification to reveal the information contained within it. The use of this technology goes back to the mid-19th century, with one of its most peculiar uses taking place during the Franco-Prussian War in 1870, to deliver information via pigeon post. Both microfilm and microfiche belong to the family of microforms, with the latter being the main technology of choice at INIS for over thirty years for the storage, reading, printing and transmission of its non-conventional literature. The difference between microfilm and microfiche is that microfiche images are distributed on sheets of film rather than on a roll. A microfiche contains several miniature reproductions of pages, in the standard size of 105 mm by 148 mm, commonly on a polyester base, a material whose lifetime is estimated to be as long as 500 years, much longer than any optical media on today’s market, such as CD-ROM or DVD-ROM. Thanks to its size, for many years microfiche offered a reliable and cheap means for INIS to increase the resources in its full text collection without the physical requirements of additional storage areas. It was the advent of computers, however, that slowly, but inexorably, made this technology both obsolete and impractical. Microfiche is surely cheap, compact, safer and more portable than paper, but the advantages of today’s digital document storage outweigh all the above. Digital documents, which can be processed by computers, have searchable text, making the content easily accessible and the distribution immediate. The space required for a digitized book is calculated in bytes, with information density on storage media increasing every year in step with Moore’s law.
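To see how a 25× reduction translates into capacity, here is a back-of-the-envelope sketch; the A4 source page is an assumption, and real fiche layouts, which reserve margins and a title strip along one edge, hold fewer frames than this theoretical maximum (a common grid is 98 frames):

```python
# Theoretical frames per microfiche, ignoring the margins and title
# strip that real fiche layouts reserve (hence real fiche hold fewer).

FICHE_W, FICHE_H = 148.0, 105.0   # standard fiche size (mm)
PAGE_W, PAGE_H = 210.0, 297.0     # assumed A4 source page (mm)
REDUCTION = 25                    # one twenty-fifth of original size

frame_w = PAGE_W / REDUCTION      # 8.4 mm per miniature page
frame_h = PAGE_H / REDUCTION      # 11.88 mm per miniature page

cols = int(FICHE_W // frame_w)
rows = int(FICHE_H // frame_h)
print(f"{cols} columns x {rows} rows = {cols * rows} frames per fiche")
```

Even with generous margins, a single postcard-sized sheet replaces the better part of a hundred full pages, which is exactly the storage saving the text describes.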
For this reason, a project was started in 2003, in cooperation with INIS Member States, in which more than 83% of the entire microfiche collection was digitized over a period of 12 years: over 288 000 full text documents, more than 14.2 million pages, amounting to more than 350 gigabytes of data, were converted into electronic format. Microfiche still survives today at INIS, where staff are busy converting the remaining 3 million pages before finally retiring the format.
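A few average sizes follow directly from the figures above (an illustrative calculation, taking one gigabyte as 10^9 bytes):

```python
# Average sizes implied by the digitization figures quoted above,
# taking 1 gigabyte = 10**9 bytes.

documents = 288_000
pages = 14_200_000
total_bytes = 350 * 10**9

pages_per_doc = pages / documents         # ~49 pages per document
kb_per_page = total_bytes / pages / 1000  # ~24.6 kB per scanned page

print(f"~{pages_per_doc:.0f} pages per document")
print(f"~{kb_per_page:.1f} kB per scanned page")
```

A few tens of kilobytes per page is consistent with compressed black-and-white scans of text, and underlines how modest the storage cost of the whole collection is by modern standards.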
While paper and polyester were common sources of INIS inputs, magnetic media were, for years, the medium of choice at INIS for the distribution of information, with magnetic tape the technology since relegated to oblivion. Since the 19th century, when Oberlin Smith succeeded in recording audio on a wire, magnetic storage has always been around us, long outlasting other popular technologies, such as punched cards. In the form of tapes, floppy disks and hard drives, magnetic storage survives nowadays mostly as hard drives in IAEA data centres, with the preferred medium of distribution being the Internet or the cloud. The first IAEA computer, the IBM System/360 Model 30, was equipped with two kinds of magnetic drives: the IBM 2314, a primordial hard drive with a stunning capacity of 25.87 MB on removable disk packs, with a maximum of eight drives supported by the main unit; and the IBM 2401, a magnetic tape drive that could handle 9-track, 800 bpi tapes recording information in either EBCDIC (8-bit) or ASCII (7-bit) formats. It was the IBM 2400 series product line that launched the 9-track tape format, a standard that dominated offline storage and data transfer for over 30 years, with the last drive manufacturer abandoning the format only in 2003. At its highest later densities, such a reel had a storage capacity of less than 200 megabytes (circa one third of the capacity of a modern CD-ROM) and consisted of a 2400 ft long reel of magnetisable oxide coating on a very thin plastic strip. For many years, it was precisely this format that INIS used for its outputs: magnetic tapes prepared at the Secretariat in Vienna, carrying descriptive information and subject analysis of INIS bibliographic products, were distributed to INIS centres worldwide, until the Internet, more than anything else, killed this type of media.
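The raw capacity of such a reel follows directly from its length and recording density. The sketch below ignores the inter-record gaps that reduced usable capacity considerably in practice; the 6250 bpi figure refers to the later maximum density of the 9-track format, not to the 800 bpi of the IBM 2401:

```python
# Raw (gap-free) capacity of a 2400 ft 9-track reel: each "frame"
# across the 9 tracks stores one byte (8 data bits + 1 parity bit),
# so the bits-per-inch density is effectively bytes per inch.

REEL_INCHES = 2400 * 12   # 2400 ft reel, in inches

def raw_capacity_mb(bpi: int) -> float:
    """Gap-free capacity in MB (10**6 bytes) at a given density."""
    return REEL_INCHES * bpi / 1_000_000

print(f"800 bpi (IBM 2401):   ~{raw_capacity_mb(800):.0f} MB")
print(f"6250 bpi (later max): ~{raw_capacity_mb(6250):.0f} MB")

# The 2401 recorded either EBCDIC or ASCII; the same character maps to
# different byte values in the two encodings (cp037 is an EBCDIC page):
print("'A' in ASCII:  0x" + 'A'.encode('ascii').hex())  # 0x41
print("'A' in EBCDIC: 0x" + 'A'.encode('cp037').hex())  # 0xc1
```

The jump from roughly 23 MB at 800 bpi to about 180 MB at 6250 bpi shows how the same physical reel kept pace with growing data volumes over the format's 30-year reign.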
Throughout its years of operation, INIS produced a large number of tapes for distribution to its members on a semi-monthly basis, without missing a beat, until 1997, when the last tape was produced. Tapes continue to be used worldwide today, although in different formats from the 9-track reels that INIS used to produce, and they are currently enjoying a resurgence, with sales in the last quarter of 2014 amounting to $121.5 million, according to the Santa Clara Consulting Group in California, USA. This data medium resists obsolescence because it still has several advantages over other solutions, such as speed, reliability, cost savings and a much longer lifespan. In a recent interview with The Economist, Alberto Pace, head of data and storage at CERN, claimed that, according to their calculations, extracting data from tape is still four times quicker than reading from a hard disk. As for reliability, tapes are more resistant to failure: a tape can snap, but it can be quickly fixed by simply splicing it back together; if a hard drive fails, there is no guarantee that the data can be fully recovered. Finally, with an average lifespan of over 30 years, tape remains the cheapest physical medium on the market, costing only about $17 per terabyte, compared to $35 for a common desktop hard drive. Although INIS has shifted to faster and cheaper methods of distribution to its Member States, tape is still doing quite well, considering that it has been around for 60 years and is still actively used at the IAEA for the long-term preservation of data.
This article has covered a lot of history in the evolution of computers and information management; however, we should never forget that, no matter how advanced or fascinating a technology is, it would be nothing without the people who operate it. As Elbert Hubbard said, “One machine can do the work of fifty ordinary men. No machine can do the work of one extraordinary man”. There is no doubt that, in the past 45 years, INIS and NIS have seen extraordinary staff come and go: visionaries, innovators, pragmatists and technicians, who surely made the machines work, but who, most importantly, magnified with their own unique human talents the impact and contribution that machines have made to the success of INIS and its operation.