Transcript
Do not bend, staple, or mutilate this card. The slogan of the computer age. A University of Illinois radio service presents a series of programs about you and the computer. From banks to hospitals and from airlines to music, its applications abound, and these programs will give you a glimpse of these countless applications and what they mean to you. Do Not Fold begins the story of the modern computer in early history. For centuries man has added and subtracted. He has counted the days in a year and the sheep in his herd. At first man used his fingers; the ten fingers of a human created a decimal basis for his counting system. When trade grew and numbers frequently reached higher than ten, the hand was slow, so the abacus was developed. The clacking of the beads in each row, as tens were carried to the hundreds, signaled the
start of a new generation of adding tools used by man. In 1642 the son of a tax collector devised a machine that used gears and would add together numbers on a decimal basis: Blaise Pascal had invented the forerunner of today's digital computer. The real father of the computer, however, was Charles Babbage. In 1812 Mr. Babbage planned a complex adding machine dependent upon automatic parts. Despite financial aid from the British government, he never finished the project. The technology of the times was not yet ready for his ideas. Mechanical parts were too slow for his calculating engine; Babbage needed electronic components not yet invented. The United States Bureau of the Census provided the next impetus for the development of today's digital computer.
The law of the land provided that the federal government must count the people every tenth year. As the nation grew, so did the problems of conducting the census. More doors had to be opened by more citizens in more communities. "Good afternoon, ma'am. I'm a representative of the United States Census Bureau, and I'd like to ask you a few questions." "Yes?" "First of all, I would like to know how many people reside in this house." "Well, there's my husband and myself and our six children. And my aunt Stephanie. And my sister Josie and my cousin Joyce." Right. That census taker will have no easy task. Dr. Herman Hollerith hoped to improve the situation; in 1890 he conceived the idea of putting information on punched cards. This would speed tabulation of information on a special machine, and it would allow rapid computation of results.
The intense years of World War Two created a special interest in the capabilities of computing machines. The advanced technology of 1940 allowed men much more freedom than Charles Babbage had known in 1812, over 100 years earlier. While Churchill urged courage and Hitler planned invasions, Harvard University, International Business Machines, and Bell Laboratories joined forces to construct the Mark I. Small electromechanical relays were the main components of this computer. These cumbersome devices took up to five seconds to do a multiplication. When electron tubes were added to the hardware, the newly developed computers' results were more efficient. ENIAC, or the Electronic Numerical Integrator and Calculator, was designed to compute trajectory tables for firing shells. Experts in ballistics used complex calculations to determine where a shell might land if of a certain size, fired at a certain angle from a certain location.
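Such a trajectory question can be illustrated with the drag-free range formula R = v² sin(2θ)/g. This is only a sketch of the idea: ENIAC's actual tables accounted for air resistance, shell size, and weather, which is precisely why they required so much computation.

```python
import math

G = 9.81  # gravitational acceleration in m/s^2

def range_no_drag(speed, angle_deg):
    """Horizontal distance a shell travels, ignoring air resistance."""
    theta = math.radians(angle_deg)
    return speed ** 2 * math.sin(2 * theta) / G

# A shell fired at 300 m/s flies farthest at a 45-degree elevation:
for angle in (30, 45, 60):
    print(f"{angle:2d} degrees -> {range_no_drag(300, angle):8.1f} m")
```

Note that 30 and 60 degrees give the same range in this idealized model; real ballistics tables, computed shot by shot, do not behave so neatly.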
The tubes were not foolproof, however. Their rate of failure was high, and yet each of the 18,000 tubes had to be operative or the system would not work. In 1946 ENIAC blew out several hundred of its vacuum tubes during a futile attempt to divide by zero, because no one had remembered to instruct the computer that the task was impossible. In 1949 the University of Illinois built a special computer, labeled the ILLIAC, run for the Aberdeen Proving Ground. When the university built a second just like the first for its own use, it became the first mass producer of computers. The most important advance in technology for the computer was yet to come. In 1948 J. Bardeen, W. Brattain, and W. Shockley at Bell Laboratories invented the transistor, an electronic component that would replace the vacuum tube in the modern computer. New materials at hand after the war made possible this startling development. Now the wires were connected and the transistors soldered in; the computer spewed out
information. Where was this information coming from? How did the computer handle it? What does a computer look like? Today's vast computer is encased in steel cabinets, which may be as small as a file cabinet. Inside the doors of these cabinets are the three main portions of the computer. The memory area stores information for later use by the computer and is the repository for all incoming information. The central processor takes the data stored in the memory and operates on it according to the instructions provided. The input-output area of the computer readies data for the actual calculations involved and then returns this data in the proper form. Let's trace the path of that data through the modern digital computer. When a problem is being prepared for today's computer, it must first be put into the proper form. The machine does not understand human language without translation.
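The three portions just described, memory, central processor, and input-output, can be caricatured in a few lines of code. This is only a toy model with invented names; a real machine of the era worked on binary words and wired circuits, not named variables.

```python
class ToyComputer:
    """A caricature of the three-part machine described above."""

    def __init__(self):
        self.memory = {}  # the memory area: repository for all incoming information

    def input_data(self, name, value):
        # input-output area: readies data and deposits it in memory
        self.memory[name] = value

    def process(self, instruction, target, operand):
        # central processor: operates on stored data per the instructions provided
        if instruction == "ADD":
            self.memory[target] = self.memory[target] + self.memory[operand]
        elif instruction == "SUB":
            self.memory[target] = self.memory[target] - self.memory[operand]
        else:
            raise ValueError(f"unknown instruction: {instruction}")

    def output(self, name):
        # input-output area: returns the result in the proper form
        return self.memory[name]

machine = ToyComputer()
machine.input_data("A", 40)
machine.input_data("B", 2)
machine.process("ADD", "A", "B")
print(machine.output("A"))  # prints 42
```

Data enters through input, is transformed by the processor against the memory, and leaves through output, exactly the path the narration traces next.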
The digital computer typically operates on the basis of only two numbers. For example, in the memory a vast system of crossed wires creates thousands of points within the web. At the junction of two wires a ferrite core can be magnetized in either a positive or negative direction. This point could represent either the number one or the number zero. The term bit is a contraction of the words binary digit and implies information stored on one of these cores in the memory. Binary is a reference to a number system of base two, and the word digit is derived from the Latin word digitus, or finger, man's first counting device. The idea of counting with only two figures, or two digits, may seem impossibly difficult to a person familiar with ten numbers, but Roger Johnson, research assistant at the Coordinated Science Laboratory at the University of Illinois, explains how it actually works. Let us consider the problem of counting and performing arithmetic in a number system which
consists of only two symbols or numbers. For example, let us use the two symbols 0 and 1. Any number in this system therefore must consist of some combination of ones and zeros. The three-digit binary number 1 0 0 is an example. In the decimal number system the digit positions of a number from right to left are given the values ones place, tens place, hundreds place, et cetera. The number 5 2, or 52, is represented as two ones and five tens. In the binary number system the digit positions from right to left are given the values ones place, twos place, fours place, eights place, et cetera. One can derive the value of these digit positions by counting the number of different possible combinations of arranging the symbols 1 and 0. Therefore the binary number 1 0 0 can be thought of as no ones,
no twos, and one four; the binary number 1 0 0 is equivalent to the decimal number four. In this binary system the numbers 0 and 1 have the same meaning as in the decimal system. The decimal number two is represented by the two-digit binary number 1 0, which can be thought of as no ones and one two. The decimal number three is represented by the two-digit binary number 1 1, that is, a 1 in the ones place and a 1 in the twos place. Since we have now used up all of the combinations of two-digit binary numbers, the decimal number four must be represented by the three-digit binary number 1 0 0: no ones, no twos, and one four. One can easily continue in a similar manner to describe all of the decimal numbers in terms of the equivalent binary numbers. Since a given combination of these binary digits can represent letters or numbers, they can
be fed into a computer in more complex forms. The programmer, a person who sets up a series of instructions for the computer, can use this basic computer language, often called machine language, or rely on other languages more similar to English. With such a language, FORTRAN, one might construct a program to process your gasoline credit card. Roger Johnson of the Coordinated Science Laboratory at the University of Illinois explains how this might be set up. In a program of this type, cards containing instructions would be read into the machine; then these instructions would be carried out by the computer. The first instruction might read as follows: READ CARD, 100, ID, AMOUNT, DATE, IDIS. This statement causes the computer to read a punched card according to certain rules and to place the data on this card into specific memory locations in the computer.
For example, the purchaser's identification number is placed into the memory location labeled ID. The amount of the purchase is placed into a memory location labeled AMOUNT. The date is placed into a memory location labeled DATE, and the distributor's identification number is placed into a memory location labeled IDIS. The second computer statement might read: TEMP = BALANCE(ID). This statement causes the computer to take the customer's account balance, which is stored in the memory location BALANCE(ID), and place it into another location labeled TEMP. The next statement would read: TEMP = TEMP + AMOUNT. This statement adds the previous balance to the amount of the purchase and stores the result in the memory location labeled TEMP. The next computer statement might read: BALANCE(ID) =
TEMP. This statement causes the computer to place the new balance, stored in TEMP, into the memory location labeled BALANCE(ID). Thus the customer's account has now been updated. Using similar statements, the program would credit the proper distributor for the sale and then read the next card. New languages are constantly being developed with different capabilities and characteristics. We asked Dr. Steven Fenves, professor of civil engineering at the University of Illinois, to comment on the need for these new languages. Initially, when computers first became available, each computer had its own set of instructions, and the only thing the programmer could
do was to write down the individual instructions to that particular machine. This was a time-consuming activity, and also there was no choice as to what kind of language a person could use. Then in the early fifties some people began to recognize that communication could be raised to a different level: the person could state larger chunks of computation, which to him meant the unit of processing, and let the computer convert them into the individual instructions. This obviously was a great improvement in communication, and furthermore it had the advantage that one could write programs for different machines, essentially machine-independent programs. As long as the different computers had a translating program, called a compiler, which could interpret the statements and
translate them into machine instructions, you really didn't care about the machine used. Now I think this kind of trend is coming to an end. Government, for one, and users are very concerned about this proliferation, and now we have standard FORTRAN and standard COBOL, but these are government standards like any other standards. And most of the languages are moving towards greater generality, picking up the better parts of everything and incorporating them into one package. Once the data has been couched in a particular language, how does it get into the computer? At first programmers used punched cards. These cards, which are still in common use, contain 80 vertical columns, which have 10 rows each. Each column represents a character. Thus the first row in column 1, when punched, would mean the number 1, and the fifth row in column 2, when punched, would mean the number 5. The holes in each card allow current to pass through the card only at that point. An
electrically charged roller is on one side of the punched card, and a metallic brush is on the other side. The card acts as an insulator between the roller and the brush, so that an electrical impulse flows only through a hole. Why do these cards that we encounter every day carry the constant reminder, do not bend, staple, or mutilate this card? Any alteration of the physical characteristics of the card may affect the reading of the card. An electrical impulse may flow through a torn hole or a staple and render the information on the card totally inaccurate. A bank account may be credited or debited by a thousand dollars because of a pinprick in the wrong column. A card may fail to run through the reading device and have to be repunched, creating further possibilities of error. So the warning do not fold is an important danger sign of the computer age.
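The roller-and-brush reading scheme can be mimicked in a few lines. This sketch follows the simplified ten-row description above (real 80-column cards had twelve rows and zone punches for letters); the function name and encoding are invented for illustration.

```python
def read_column(punched_rows):
    """Decode one card column under the simplified ten-row digit scheme.

    punched_rows is the set of row positions (0 through 9) where a hole
    lets current flow from the charged roller to the metal brush; the
    intact card insulates everywhere else.
    """
    if len(punched_rows) != 1:
        # a torn hole, staple, or pinprick adds an unintended current path
        raise ValueError("column does not contain exactly one clean hole")
    return next(iter(punched_rows))

# The narration's example: row 1 punched in column 1, row 5 in column 2.
print([read_column(col) for col in [{1}, {5}]])  # prints [1, 5]

# A pinprick in the wrong place makes the column unreadable:
try:
    read_column({5, 7})
except ValueError as err:
    print("card rejected:", err)
```

Rejecting the damaged column outright is the lucky case; the narration's thousand-dollar debit happens when the stray hole still decodes as a valid, but wrong, character.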
This is the sound of punched holes on a paper tape read in a special reading device. However, a much more effective input mechanism is magnetic tape or the magnetic disc. A punched card reader may average 250 cards a minute, or 20,000 characters a minute; magnetic tape can feed in tens of thousands of characters per second. Even the typewriter can be used to input information. Typewriters are much slower but more familiar to the average person who might want to use a computer. Instructions for the computer can be typed on special typewriters step by step. Typewriters can also be connected through special terminals with TV tubes. Information typed into the computer may appear on the screen of the cathode ray tube, and then data calculated by the computer may be presented on this tube. Even drawings may be
sketched on the tube with a special light pen. A recent development in visual displays may prove even more versatile than the cathode ray tube. Engineers at the University of Illinois have devised a plasma display panel, less than a quarter inch thick, which has thousands of crossing wires invisible to the naked eye. Each point on this panel may be lighted or unlighted, producing an image in black and white or color. Dr. Donald Bitzer, one of the inventors of the plasma display panel, explains why this device may compare favorably with the cathode ray tube for computer work. It has the ability to retain its own image. This is one of its important aspects. Consequently, once you turn a little spot on the screen, it stays on without the computer telling it to go on each time, thirty times every second. Or you can turn the spot off, and you do not disturb any of the neighboring spots. Another important aspect of the screen is that it's transparent, and we want to be able to superimpose pictures
along with the computer-generated graphics. This allows us to do it. We can project on the back of the same screen, and then the student can view this projection through the screen. So the student can see the computer-generated graphics along with the picture graphics, which can be of much higher quality than a television picture, on the same screen in a very economical and useful way. Some of the other aspects which we think are important, but perhaps not so essential, are that we can produce these things in different colors, and so we're able to have a high-quality graphic color presentation as well as a monochromatic presentation. Also very important, of course, is that our screen responds directly to digital computer signals; it need not go through conversion devices, such as cathode ray tube devices have to go through, in order to produce the image. And so the computer can write, or turn on, a spot on the screen accurately, with complete
reliability, complete accuracy, without having to worry about high-quality electronic circuits which may get out of adjustment as time goes on, thus making the terminal more reliable for computer home use or for other computer applications. And of course last, but certainly not the least important aspect, is that for computer applications we can probably produce this device at a fraction of the cost necessary to produce equipment to do the same job by any other method. The next step in communication with the computer may be human speech, and engineers are studying this problem in hopes that a computer can be instructed to understand the sounds of the human voice. The human brain is able to understand these patterns, so why not the computer? It knows that many different pronunciations of the same word mean the same thing, but the computer cannot be stored with these millions of variables in human speech patterns. As a matter of fact, scientists
suggest that a person's voice may be as distinct as his fingerprint, unlike any other's. One corporation has devised a simple system to recognize sounds: as a man says the word slow into a microphone, a typewriter connected to the computer prints out s-l-o-w. Its vocabulary is limited to 100 words at the moment. Though computers may have difficulty understanding human speech, it is easy for them to speak, in a sense; they may activate recorded messages on any topic. The last time you dialed an incorrect number, you heard a message activated by a computer. Research also shows that a computer may learn to speak itself. Scientists at Bell Laboratories in Murray Hill, New Jersey, have given the computer information about the function of the human speech organs. At the
outset, the computer speaks slowly as it does hundreds of calculations to determine how particular sounds are formed. After it has decided the basic elements of the sound pattern, it speaks quickly. But without the phrasing of human speech, the sentence sounds flat. With timing, the computer sounds better, but still pitch is missing. Scientists have even programmed computers to sing popular melodies and provide musical accompaniment.
The primary vehicle for all information put into the computer is the program. Within the confines of these specialized instructions, the computer can manipulate figures and data and provide answers. Actual problems put into the form of computer programs take hours of preparation. One reason for using a computer is to avoid excessively complex calculations by hand. Still, the logic of these calculations must be conveyed to the computer so that it will know how to handle incoming data. The many revisions of a computer program are labeled debugging. Some suggest that this title may have originated with Dr. Grace Hopper, a pioneer programmer, who told of a lengthy testing session at the Harvard computation lab in which an apparently erroneous program was checked and rechecked. Finally someone located a fly stuck in one of the relays, preventing electrical contact. So insects may cause just as much trouble to computers as they do to human beings.
Now the computer has been put together, and information has been fed into the memory via punched cards or magnetic tapes. The computer goes to work. Thousands of points within the memory of the computer that are either 1 or 0 are compared with, added to, or subtracted from each other. The end results of such electrical mathematics are printed on a high-speed printer that can put out up to a thousand lines of type a minute. Calculations that might have taken days may be available in a few minutes because of today's computer. The language of the computer contains many unfamiliar terms. Some we've considered already: a bit is a binary digit, or part of a two
number system; a memory is a vast system of crossed wires that are electrically magnetized; a program is a set of instructions for the computer. But what about batch programming, real time, and on-line systems? Let's take them one at a time. In batch programming, problems can be gathered and solved all at the same time using the same program. Companies may do all their calculations of the monthly payroll on one day instead of updating information in the memory as an employee works another hour each day. Batch programming is done when all the necessary information is at hand, perhaps several hours, days, or even years after the data was originally collected. Other computers, however, are on line and work in real time. Information is not gathered but fed directly into the computer as it is accumulated. The input device is hooked directly to the computer, or on line. Some banks have installed terminals at every teller's window, and as deposits are made the customer's account is immediately credited. New data constantly updates files in the memory of the
computer. Well, that seems clear, but what about the term multiprogramming? Sometimes a computer has more capabilities than those in use at a particular time. It is possible for others to use those free capabilities and solve their own problems at the same time. It may appear that the computer is serving everyone at once; actually it is sharing its time, in millisecond doses, with everyone. Each user is delayed a small fraction of a second while the computer handles input and calculations in turn. Few users notice this delay, because it is measured in such small portions of a second. We've been describing the modern digital computer, which handles the largest part of the load of electronic data processing today. Another kind of computer, the analog computer, is also used to solve special problems, primarily in engineering. A digital computer operates on the basis of two numbers, which are indicated by the positive or negative direction of magnetization; an analog computer deals in
voltages. Voltages may range across a wide scale and indicate changes in measurement. However, voltage readings may tend to be slightly inaccurate, and therefore digital computers are preferred for most modern applications. In later programs in this series we'll learn where analog computers play an important role in electronic data processing. We'll also examine the applications of digital computers in industry, finance, government, medicine, and other areas. Major companies across the United States are continuing to develop computers with new capabilities for the demands of today. Competition among such firms as International Business Machines, Control Data Corporation, Scientific Data Systems, Sperry Rand, and Burroughs has produced ever larger and more complex installations of computers. Public and private agencies have cooperated in the development of computers over the last three decades, from the earliest ENIAC to the latest third-generation equipment.
Today we followed the development of the computer from Babbage's calculating engine to the present-day digital computer. In upcoming programs in this series we'll see how computers can be used to tabulate sorority rush, arrange dates, fly airplanes, bake cakes, and write music. On our next program we'll discover how the future may bring a war without man. Each week the University of Illinois radio service brings you the meaning behind the slogan of the computer age: do not bend, staple, or mutilate this card.
Series
Do Not Fold
Episode Number
1
Producing Organization
University of Illinois
Contributing Organization
University of Maryland (College Park, Maryland)
AAPB ID
cpb-aacip/500-b56d6468
Description
Other Description
"Do Not Fold" is a program about the growing applications of computer technology. Each episode focuses on how different professions and sectors are using computers to explore new possibilities in their line of work. Interviewees discuss how they are incorporating new technology into their work, what these innovations mean for the future of their field, and how they may affect the general public.
Date
1969-02-18
Genres
Documentary
Topics
Education
Technology
Media type
Sound
Duration
00:29:15
Credits
Producer: Johnson, Jiffy
Producing Organization: University of Illinois
Production Designer: Haney, Edna
AAPB Contributor Holdings
University of Maryland
Identifier: 69-19-1 (National Association of Educational Broadcasters)
Format: 1/4 inch audio tape
Duration: 00:29:00
Citations
Chicago: “Do Not Fold; 1,” 1969-02-18, University of Maryland, American Archive of Public Broadcasting (GBH and the Library of Congress), Boston, MA and Washington, DC, accessed October 24, 2021, http://americanarchive.org/catalog/cpb-aacip-500-b56d6468.
MLA: “Do Not Fold; 1.” 1969-02-18. University of Maryland, American Archive of Public Broadcasting (GBH and the Library of Congress), Boston, MA and Washington, DC. Web. October 24, 2021. <http://americanarchive.org/catalog/cpb-aacip-500-b56d6468>.
APA: Do Not Fold; 1. Boston, MA: University of Maryland, American Archive of Public Broadcasting (GBH and the Library of Congress), Boston, MA and Washington, DC. Retrieved from http://americanarchive.org/catalog/cpb-aacip-500-b56d6468