About Science: About Computer Languages
Transcript
This is About Science, produced by the California Institute of Technology and originally broadcast by station KPPC in Pasadena, California. The programs are made available to this station by National Educational Radio. This program is about computer languages, with host Dr. Robert Meghreblian and his guest, Mr. Stephen Kane. Here now is Dr. Meghreblian. I think, Stephen, before we begin to talk about computer languages per se, it might be helpful to look back a ways, some years, as to how computers started and the kinds of problems and approaches we used with computers in the past. Yes, I think that would be a good idea. Computers as we know them today, or think of them today, really began to develop just prior to the Second World War; work primarily at Harvard and at Bell Laboratories produced several prototype computers which were basically electromechanical devices using relays and motor counters,
which were really stepping relays. And these were the kinds of devices, computer systems, that were used through the period of the war. And shortly afterwards? That's right; a lot of use was made of them in ballistic table computation for the Ordnance Department. And perhaps even for missiles; I remember that vaguely myself. What was the first major advance beyond this electromechanical stage of computer concepts? I think really the first point we can look at was the development of a stored-program computer using vacuum tubes. This allowed, of course, much faster switching times, and while the memory elements were still slow and primitive, it was possible to produce, for example, the ENIAC computer, which was developed soon after the war: a reasonably powerful device, at least by the standards of those times. Looking back on it
today, it's extremely primitive, and one would really be horrified to be faced with a computer of that limited power today, but it was a landmark at the time. You said that the memory storage techniques were not much changed in that first period. What was the original approach to retaining information in the computer? The early approach, using the electromechanical devices, was usually to use relays or some other electromechanical storage element, so of course these were extremely slow. And numbers, constants, might be entered into the machine simply by throwing toggle switches on a panel. Two-position switches, so that one position was zero and the other position was one? Yes, which is really what we do in modern systems, but with much smaller devices. OK, then moving on from the vacuum tube technology, the next step was what?
The next real step? Well, after the development of the ENIAC, a number of other machines were developed by various manufacturers, all of course at that time using vacuum tubes, starting to use drums or disks for the memory units, beginning to produce the idea of backing store or backup storage, where you would have maybe a very fast drum for the main memory, fast but small, and then slower drums or tapes for large mass storage. And then came the idea of the magnetic core, which I think really played a great part in producing computers of the speed we have now, because cores are capable of storing large amounts of data with very rapid access times. When was it, about, that these began to come in? I guess around 1955, maybe a little before. There had been some partial attempts to use cathode-ray tubes for
storage, and these had problems, such as trucks driving by and shaking bits off your memory. Right. All right, then moving on from there? The really important step, the one that brings us to the second generation of computers, was the introduction of solid-state circuitry. And by that you mean transistors. Yes, and the machines, the computers that we really think of today, the large ones, are all transistorized machines. The next step, and what brings us into the present day as the third generation of computers, is the use of integrated circuitry: again solid state, but micro-miniaturized. So this brought a great reduction in volume, in heating and cooling problems, and so on. Yes. Along with the
advances in the technology of the circuitry, the way in which one approaches handling a problem on a machine has also changed very markedly over these several decades. You mentioned, actually, the way the machine was organized. The actual concepts of organization have changed very little; they are still based upon the general stored-program concept of von Neumann. Which is to say, then, that you prepare a program, which is a set of instructions, and the instructions consist of a series of simple arithmetic steps. Yes, and this would then be fed into the machine, and the machine would respond and carry out these instructions. You say these concepts did not change very much. They really didn't; the basic idea remained that the machine was simple-minded, that it could only do basic arithmetic. A very simple-minded machine. Right, very simple. It got faster as time went on, and it was able to store, or remember, more and more instructions at one time, but still, all it could really do is add or subtract. Were there any other advances in terms of its functional capability? We speak here of simple arithmetic processes, addition and subtraction and the like; are there any logical functions that the machine could perform? Yes, it was almost immediately necessary to provide some way for the machine to make logical decisions. Now of course, the logical decisions we speak of in this context are very simple, again simple-minded: is a number negative, or is it greater than some other number? These allow a computer to perform loops or branches; they allow it to take different processing paths depending upon the results of previous computation. So it would complete a computation, come to a position where it had to make a decision, and depending upon the outcome of the previous result it could act in one way or another. That's right.
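[To illustrate the stored-program idea in modern terms: a minimal sketch in Python of a simple-minded machine whose only powers are arithmetic and a sign-test branch. The instruction names and format are invented for illustration; machines of the period used numeric codes, but the principle, that one conditional jump over stored instructions yields loops and branches, is the same.]

    # A toy stored-program machine: instructions are data in memory, and one
    # conditional jump ("JGZ", jump if greater than zero) provides loops.
    memory = {"total": 0, "n": 5}       # data cells, addressed by name here

    program = [
        ("ADD", "total", "n"),          # total := total + n
        ("SUBI", "n", 1),               # n := n - 1
        ("JGZ", "n", 0),                # if n > 0, jump back to instruction 0
        ("HLT",),                       # halt
    ]

    pc = 0                              # the program counter
    while True:
        op = program[pc]
        if op[0] == "ADD":
            memory[op[1]] += memory[op[2]]; pc += 1
        elif op[0] == "SUBI":
            memory[op[1]] -= op[2]; pc += 1
        elif op[0] == "JGZ":            # the machine's one "logical decision"
            pc = op[2] if memory[op[1]] > 0 else pc + 1
        elif op[0] == "HLT":
            break

    print(memory["total"])              # prints 15, i.e. 5 + 4 + 3 + 2 + 1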
Well, developing these techniques, one is led to the notion of a computer language, I gather. We spoke earlier about this numerical coding as a kind of language. That's right. This is commonly called the machine language, and in the final analysis it is the only language the machine can interpret. For example, a numeric code of 50 in a particular position might mean: add a particular number to some other particular number somewhere in the memory. Now, as the machines became more complex, the number of possible instructions in them went up, to the point where on a large machine today there may be several hundred different kinds of requests you can make. It became more and more difficult for a programmer to be efficient at all in producing a program. So if you have a long, complicated series of computations you want to make, to sit down and in detail lay that out, step by step, arithmetic process by arithmetic process, becomes a tremendous burden? It does indeed. Many large programs today are on the order of hundreds of thousands of distinct instructions, and for a programmer to produce these while having to remember the exact numeric configuration of each kind of instruction would just be too much of an imposition. So the volume, the number of distinct individual steps, becomes the essential element in developing languages which will abbreviate some of these. Yes. A first step above the machine language, and one which is
used extensively today, is known as the assembler language. This is a language very close to the individual machine language, but, for example, instead of the number 54 for add, a programmer might write down the letters ADD, and this would then be translated, by another computer program called an assembler, into machine language. So you put in a simple abbreviation for add, and another routine that's already in the machine as part of its standard capability translates that into the machine's arithmetic elements? That's correct. Are there other examples that come to mind here in regard to assembly language? Oh, there are. You might say TMI for test minus; this is a plus-or-minus determination. Usually there is a register in the computer called the accumulator, similar to the upper dials on a desk calculator, and the results of arithmetic operations usually remain in that register until you place them in the memory; this is normally where the tests and decisions are made. Do you think of any other features of assembly language, beyond these simple abbreviations? Yes; I think really the largest advantage of the assembler is that it allows the programmer, instead of remembering that the number SIZE is in cell 4751, simply to give it the name SIZE and let the assembler allocate memory locations for his variables. Symbolic addressing. Symbolic addressing, so that as far as he's concerned, whenever he wants to multiply by SIZE, he just writes MPY SIZE. This would then be translated into the multiply command and the particular memory address that the assembler assigned. So the programmer gains a high order of freedom from the machine's internal details. Right, exactly.
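[A minimal sketch in Python of what the assembler just described does: translate mnemonics into numeric operation codes, and allocate a memory cell for each symbolic name so the programmer never has to remember that SIZE is cell 4751. The code 54 for add and the cell 4751 follow the interview; the other opcode and the two-column instruction format are assumptions for illustration.]

    # Toy two-pass assembler: pass 1 allocates cells for symbolic names,
    # pass 2 translates mnemonic + name into opcode + address.
    OPCODES = {"ADD": 54, "MPY": 55}    # 54 = add per the interview; 55 assumed

    source = ["ADD SIZE", "MPY RATE", "ADD SIZE"]

    symbols, next_cell = {}, 4751       # starting address per the interview's example
    for line in source:                 # pass 1: build the symbol table
        _, name = line.split()
        if name not in symbols:
            symbols[name] = next_cell
            next_cell += 1

    machine_code = []                   # pass 2: translate to numeric form
    for line in source:
        mnemonic, name = line.split()
        machine_code.append((OPCODES[mnemonic], symbols[name]))

    print(symbols)                      # {'SIZE': 4751, 'RATE': 4752}
    print(machine_code)                 # [(54, 4751), (55, 4752), (54, 4751)]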
Going on from this assembly-language approach to a more abbreviated language, what was the next step? The next major step was the introduction of the problem-oriented language. This is a kind of language oriented toward a particular class of problems, so that the programmer, if he's an engineer or a mathematician for example, doesn't have to sit down and learn the machine or the assembly language. He can instead write down an algebraic statement. For example, to add three numbers, you might write down X = A + B + C. And you actually write these down, write down an algebraic equation? Yes, and this would then be translated by a compiler into the basic machine language. But here the programmer is thinking and communicating in terms of the language that is natural to him, namely mathematics.
Where does the notion of subroutines come in here, for simple mathematical functions like cosine and sine, certain very common functions? Of course, that is not really part of this preceding step; the concept of the subroutine actually came out quite early, along with the concept of the assembler. It was realized that it was not really very efficient for each programmer to have to write his own sequence of instructions to take the square root of a number; it would be better to have someone sit down, spend some time, and produce an efficient set of code to do this. A standard square-root program. Yes, and you could always have that in the machine, and when you said, give me a square root, it just automatically used that program. That's correct, and these programs are called subroutines because they are subsections of a larger program.
In problem-oriented languages one is also able to use subroutines in an even more convenient manner; this is one of the outstanding attributes of the problem-oriented languages. If you want to take the square root of X and assign that to some variable Y, you can simply say Y = SQRT(X). Yes. You are implying by your remarks that, in contrast to what one does at the desk, where if you need the sine of an angle to high precision you look it up in a table, in a computer you don't hold a table; what you do is compute it by a series expansion, so that you literally compute the value of that simple function by some series of processes. And in almost all cases, for almost all types of problems, this is actually faster and takes less core space than having to hold a table and do interpolation to many, many digits. Yes.
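[A sketch in Python of computing a function by series expansion rather than table lookup, as just described, using the familiar series sin x = x - x^3/3! + x^5/5! - ... Library square-root and sine subroutines use more refined approximations, but the principle is this one.]

    import math

    def sine(x, tolerance=1e-12):
        # Sum the Taylor series; each term is the previous one
        # times -x^2 / ((n+1)(n+2)), where n is the current odd power.
        term, total, n = x, 0.0, 1
        while abs(term) > tolerance:
            total += term
            term *= -x * x / ((n + 1) * (n + 2))
            n += 2
        return total

    print(sine(1.0))       # agrees with math.sin(1.0) ...
    print(math.sin(1.0))   # ... to roughly 12 digits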
OK, you were talking previously, before I interrupted you, about problem-oriented languages. These are specifically tailored for the ease of the user himself, in his problem area. That's correct. Are these generalized to particular classes or types of computers? Well, it depends. Certainly when we speak of the machine language and the assembler language, these are tied intimately, often to a specific kind of computer or to a line within a certain manufacturer's available machines, and these we tend to call machine-dependent languages. But one of the most pressing problems in the computer industry today is the problem of conversion. You write a large series of programs for a computer, you run them successfully for a period of time, and then you outgrow the machine and you need another one. Oftentimes you have to go to a machine which has a
completely different machine language, and this often requires, we're talking about, hundreds of man-hours of reprogramming. Yes. So a lot of work has been done to produce languages which are in many respects machine-independent. And there are translators? Yes; a very common one is the Fortran translator, the name standing for formula translation, or ALGOL, for algorithmic language. So you can take a program designed for one machine and, with Fortran or ALGOL, convert it automatically to a form which is acceptable to another machine? That's correct, and both of these are problem-oriented languages of the general class of arithmetic languages, because they are primarily oriented toward mathematics or engineering types of work, where we are interested in manipulating numbers. What kinds of problems do these handle,
for example? Oh, there are vast numbers of them. For example, we can handle large matrix operations, matrix inversions, eigenvalue determinations, and usually these are then built into larger sets of programs for doing such things as solving a complicated differential equation. Yes, indeed. And of course the recent films that came back from the last probe, the Mariner probe, were all analyzed with a battery of machines, much of the programming actually written in Fortran, to remove the noise and try to resolve the pictures into more detailed form.
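[The matrix work mentioned here, inversion and eigenvalue determination, would have been done through Fortran library subroutines; as a modern sketch of the same operations, in Python with NumPy (assumed available) standing in for that library:]

    import numpy as np

    a = np.array([[4.0, 1.0],
                  [2.0, 3.0]])

    print(np.linalg.inv(a))       # matrix inversion
    print(np.linalg.eigvals(a))   # eigenvalue determination: 5.0 and 2.0 here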
And this, I presume, is a fairly up-to-date picture of where the profession stands with the tools that it has. Now, looking ahead, and attempting to develop better languages, more efficient or easier-to-use languages, what are some of the needs, the deficiencies, of the existing systems? Well, there tend to be a number of them. One, of course, is that with languages such as Fortran or ALGOL it takes quite a while for a person to learn the language and become able to program in it. How long, for example? A typical case with Fortran: to become reasonably proficient in it may take on the order of a month or two; at the least it's going to take two or three weeks to be able to write working programs. So a researcher working in a particular field, who wants to use a computer to solve a complicated problem, would have to invest a month, several months? Yes, certainly, to become acquainted with the language and get accustomed to using it and working
with the computer. That's a large investment. It's a very large investment, and of course most researchers are properly not interested in the computer as an end unto itself; they want it to solve a problem. A tool. That's right, and they want to make as little investment as possible in learning how to use the tool. Exactly. So we are highly motivated to find a language which is easy to learn. That's one need. What else? Another problem, with the classical methods of presenting programs to the machine, is that of turnaround time. What does that mean, turnaround? This is a piece of jargon that's used in the industry. The turnaround time is measured from the time you take your program to a computing center, for example, and hand it to a machine operator to be loaded into the machine, to the time that your printed results are available to you. Now, while in almost all cases the time to translate your Fortran program into machine language, and even to execute it and produce your results, is
quite rapid, often on the order of a minute or two for even very complex problems at today's machine speeds, there is such a demand for almost every existing computer that you have to wait in line to gain access to it; you may often have to wait a day or two. So you want a system which is adapted to diminish this turnaround time. Exactly right. Well, you mentioned that there is a great demand for computers, so access to the machine then becomes a problem too. Yes, it does. People are literally waiting in line to use it. Indeed. The most common approach now is what's called batch processing, in which jobs are submitted one at a time to the computing system, where the assembler, the compiler, and the library of subroutines are available, and
the problems are run through one at a time under control of a piece of program, or software, known as the monitor or executive system, whose purpose is to sequence the jobs so that there is little dead time between jobs. Is it going to handle several jobs at a time? Well, most of the time in batch processing it's one job at a time actually in the computer being processed. Now the next approach, and here we're beginning to talk in the experimental areas of the field, is that of multiprogramming, where we try to fit several programs into the memory of the machine at one time; with current memory sizes this is often quite possible to do. Then, while one program, for example, is waiting for some data to be read in from a magnetic tape, and is therefore stalled, another program may be computing. Right, so a different part of the computer is being used. That's exactly right.
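[A minimal sketch in Python of the multiprogramming idea just described: while one job is stalled waiting on input/output, simulated here with a sleep, another job uses the processor. Two threads stand in for two programs resident in memory together.]

    import threading, time

    def job_a():                        # an I/O-bound job
        print("job A: waiting on the tape unit...")
        time.sleep(1.0)                 # stands in for a slow magnetic-tape read
        print("job A: data arrived")

    def job_b():                        # a compute-bound job
        total = sum(i * i for i in range(1_000_000))
        print("job B: done computing:", total)

    a = threading.Thread(target=job_a)
    b = threading.Thread(target=job_b)
    a.start(); b.start()                # B computes while A is stalled on "I/O"
    a.join(); b.join()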
Again, an attempt to maximize the amount of productive time of the machine, so that all parts of the computer are being used. That's correct, all the time. That would be the maximum. So there is, then, the question of access to the machine, and of arranging the internal operation of the machine so that it's utilized all the time; you want to maximize utilization. Yes. How about the fact that we have a number of these assembly languages and problem-oriented languages? I should think that the variety and multiplicity of the special rules that apply in all of them would be a problem. This is indeed a large problem. With the techniques we know about, or are even really thinking about seriously today, we just aren't capable of taking a natural language such as English and translating a statement in that form into machine language, even if we wanted to. So what we have
done in producing languages such as assemblers and compilers and the problem-oriented languages has been to develop what are commonly called formal languages. These are languages where we can define explicitly, in as few rules as possible, a complete grammar, or syntax, for that language. Now, there is a different syntax depending on the language, and since specific languages may be more usable for a given project, one programmer may be writing and working in two or three languages at one time, and he's always having to remember: do I need a comma here? Just then he's in a language where the comma can't be at that point; to save time he puts down the comma because he thinks it ought to be there, he submits his program, and when he comes back he finds the comma was the wrong thing, and he gets an error message. So, as in human relations, it would be nice to have a universal language, or just a few standard languages. So that's another need. And we talked a little bit about this problem of sharing the machine's time, so I guess that rounds out the list of present needs. I think it does. Well, what are some of the attempts to get around these problems, to resolve some of them and reach a new plateau of capability? What we're trying to do now is a sort of twofold effort: to produce languages which are simple to learn and to use, and also to enhance access to the machine so that it can handle a number of programmers, or programs, at one time. So you're taking care of a number of these needs simultaneously, a sort of frontal assault. And we hope we win through.
The approach we have been using has been to develop what's called a time-sharing system, which is capable of running a number of programs through the machine. Of course, at any given instant only one program is in control of the machine, but by having a very fast timer running, at microsecond intervals, we can switch the computer's control from program to program, with the programmers themselves sitting oftentimes blocks, or miles, or even all the way across the country from the computer. Each sits in front of, most often, just a common electric typewriter, and gives instructions to the machine on the typewriter. Now, one such language that we've developed is known as the CITRAN language; it's a language specifically designed for ease of use and ease of learning.
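[A minimal sketch in Python of the time-sharing idea: control switches among programs a slice at a time, so several users each seem to have the machine. Generators stand in for user programs; each yield marks the point where the interval timer takes the processor away. Real schedulers, then and now, are far more elaborate.]

    from collections import deque

    def user_program(name, steps):
        for step in range(1, steps + 1):
            print(f"{name}: step {step}")
            yield                       # "timer interrupt": give up the processor

    ready = deque([user_program("alice", 3), user_program("bob", 2)])
    while ready:                        # round-robin over the resident programs
        program = ready.popleft()
        try:
            next(program)               # run one time slice
            ready.append(program)       # then back to the end of the line
        except StopIteration:
            pass                        # that program finished; drop it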
It's also designed to be used from the remote terminal, so that the system is always keeping the programmer advised of any errors he makes, sort of helping to steer him onto the right track. To get back to this learning business: you mentioned that Fortran, for example, would take a month or two to learn. How quickly can one learn CITRAN? We have found, in actual cases, where we have requested that all of our freshmen at Caltech learn it, so they'll be able to use it in their freshman physics and chemistry courses: we present them with a manual which is about thirty-some-odd pages in length, and without any formal instruction let them sit down and read the manual, then sit them in front of one of these typewriters and turn them loose, if you will. And we have found that for almost all of the students, an hour to an hour and a half is sufficient for them to have gained a working command of the thirty-page manual.
That's right, and there are examples in it, and if you want to know how something works there's no need to ask somebody; just type it in and see whether it works, or see what results you get. With that kind of credential I think most anyone would be willing to tackle it; a month or two is rather formidable, but an hour seems like a trivial investment. Yes, we think that this will be of great advantage. You mentioned also the remote utilization of the computer that this particular technique allows; of course, since you say it's simple, you can sit at a typewriter at some remote installation and type out your instructions. Right; the typewriters are connected by wire, or telephone lines, to the machine. CITRAN, of course, being a formal language, has rules of grammar and syntax, but we think that these are all defined in such a manner as to keep them consistent and make it easy to remember where the
comma should go. But one is bound to make mistakes. He might type in a statement with too many parentheses. This goes to the CITRAN processor, and the computer will detect it and immediately type out a message: you've got too many left parentheses. So it monitors your instructions and tells you whether or not they are meaningful. That's right, and if they're not, it's not going to let you proceed until it can understand what you're trying to tell it. It's almost like having your teacher looking over your shoulder. It is, in many respects. Is this a very fast language to use? It's easy to learn, but is it a rapid process? It's easy to learn and it's easy to use; however, it's just not a good language for fast computing: if you have large amounts of computing, it's too slow. That's right; many of the techniques that are used, particularly those which allow close monitoring of what you're doing, so that you don't make mistakes, even logical ones, during execution of your program, slow the process down.
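[A sketch in Python of the kind of immediate check described: scan a typed statement for unbalanced parentheses before accepting it. The quoted message about too many left parentheses is from the interview; everything else here is an invented stand-in for CITRAN's actual processor.]

    def check_parentheses(statement):
        depth = 0
        for ch in statement:
            if ch == "(":
                depth += 1
            elif ch == ")":
                depth -= 1
                if depth < 0:
                    return "You've got too many right parentheses."
        if depth > 0:
            return "You've got too many left parentheses."
        return None                     # statement accepted

    print(check_parentheses("y = sqrt((x + 1)"))   # too many left parentheses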
So we find that while many researchers are capable of solving simple one-shot problems with CITRAN directly, many of them will also develop or test their algorithms with CITRAN in simple cases, and when they're convinced that they have an algorithm they like, they will then program it, say, in Fortran, and move it to the 7094. So this is a very neat language to use to become experienced with a computer, and also for checking out a particular computation. That's correct, or at least parts of it, elements of it; and then you can go to a faster computer if you have a lengthy program. Exactly, when you run large cases, for example. This was About Science, with host Dr. Robert Meghreblian and his guest, Mr. Stephen Kane. Join us again for our next program, when Dr. Peter Lissaman will lead a discussion about weather modification.
About Science is produced by the California Institute of Technology and originally broadcast by station KPPC in Pasadena, California. The programs are made available to this station by National Educational Radio. This is the National Educational Radio Network.
Series
About science
Episode
About computer languages
Producing Organization
California Institute of Technology
KPPC
Contributing Organization
University of Maryland (College Park, Maryland)
AAPB ID
cpb-aacip/500-4b2x7606
Description
Episode Description
This program focuses on the development of computer languages.
Series Description
Interview series on variety of science-related subjects, produced by the California Institute of Technology. Features three Cal Tech faculty members: Dr. Peter Lissaman, Dr. Albert R. Hibbs, and Dr. Robert Meghreblian.
Broadcast Date
1967-07-25
Topics
Science
Media type
Sound
Duration
00:30:28
Credits
Host: Hibbs, Albert R.
Producing Organization: California Institute of Technology
Producing Organization: KPPC
AAPB Contributor Holdings
University of Maryland
Identifier: 66-40-46 (National Association of Educational Broadcasters)
Format: 1/4 inch audio tape
Duration: 00:30:08