This is About Science, produced by the California Institute of Technology and originally broadcast by station KPCC, Pasadena, California. The programs are made available to this station by National Educational Radio. This program is about artificial intelligence, with host Dr. Robert McGregor and his guest Dr. Gilbert McCann, professor of applied science and director of Caltech's computing center. Here now is Dr. McGregor.

We define as intelligent thought all of man's efforts to derive order, concepts, or laws from observations of himself and the world around him, by focusing attention successively on small segments of the total body of information and attempting to enlarge the field of his concepts by correlating these studies. In the context of our topic this evening we speak, of course, in terms of artificial intelligence, and that, translated into modern language, primarily means computers. Could you tell us something about computers in terms
of how they developed, and the historical background?

Yes. The real practical development of computers started during World War II. One of the first important computers was actually developed at MIT by Vannevar Bush.

This was when, about?

About 1944 or 1945 it was finished. This computer was designed for a very specific task: to solve the differential equations describing the trajectories of ballistic missiles, and the automatic control problems of gun controls and fire control devices, which was a technology of World War II. The first real digital computer, which deals with numbers and formal mathematics, was built by Dr. Howard Aiken at Harvard University, another university that has pioneered in the development of computers. His Mark II computer was an attempt to mechanize the desk calculator. The first truly modern concept of a rather high-speed, large-scale computer was developed, also in the latter stage of the war, by two professors at the University of Pennsylvania, Mauchly and Eckert. They developed the first vacuum-tube, truly electronic computer, called the ENIAC. Now, out of this technology, fostered by the war effort, we had at the end of World War II in 1946 an industrial computer activity that amounted to little more than 7 or 8 million dollars a year. In the last 20 years this has burgeoned into about a seven-billion-dollar industry, a growth of roughly a factor of a thousand, with tremendous impact on all of our everyday life.

I noticed in making your remarks that you indicated, in the case of the second step in computer development, the notion of a digital computer.
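Machines like Bush's were built to integrate equations of motion of exactly this kind. As a rough modern illustration (not the analyzer's actual mechanism), a projectile trajectory with simple air drag can be stepped forward with Euler's method; every parameter value here is invented for the sketch:

```python
# Toy illustration: numerically integrating a ballistic trajectory,
# the kind of differential-equation problem the early wartime machines
# were built to solve. Euler's method; drag coefficient, launch speed,
# and step size are all invented values.
import math

def trajectory(v0=300.0, angle_deg=45.0, drag=0.001, g=9.81, dt=0.01):
    """Return (range, flight_time) for a projectile with simple air drag."""
    vx = v0 * math.cos(math.radians(angle_deg))
    vy = v0 * math.sin(math.radians(angle_deg))
    x = y = t = 0.0
    while True:
        speed = math.hypot(vx, vy)
        # Drag decelerates the projectile opposite to its velocity;
        # gravity pulls straight down.
        vx += -drag * speed * vx * dt
        vy += (-g - drag * speed * vy) * dt
        x += vx * dt
        y += vy * dt
        t += dt
        if y < 0:  # projectile has returned to the ground
            return x, t

rng, t = trajectory()
print(f"range = {rng:.0f} m after {t:.1f} s of flight")
```

The drag term is what makes the equations awkward by hand and attractive to automate: with drag set to zero the loop reproduces the textbook closed-form trajectory, but with drag included there is no simple formula.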
This distinguishes that type of computer from what other type?

The analog computer is the common name for the other class of computers, which simulate a problem by mechanical principles or by the principles of electricity, whereas the digital computer works in terms of a mathematical formulation of the problem. You have to put real numbers into it in some form, and it manipulates these numbers by the laws of arithmetic.

Now, these are the computers in preponderant use today?

Yes, by far. Of that seven billion dollars' worth of industry, probably 99 percent is in the form of large-scale digital computers. The analog computer is still an element of our need and interest, but only a very small part, and it has not grown at anywhere near the rate the other has.

How would you describe the functions of a computer, even these that work in terms of arithmetic concepts?

Well, we can think of this first in terms of what most of us are familiar with, the desk calculator. A desk calculator is an elementary form of a digital computer. You can put numbers into it; you can multiply or add two numbers and produce a product, which is an arithmetic operation; but you have to do each of these steps yourself, step by step. Now, the modern digital computer differs from the desk calculator first in the fact that every one of these arithmetic operations can be performed in a few millionths of a second. In addition to this, by comparison, the modern high-speed digital computer has a large memory. For instance, it may have a memory made of little magnetic cores that can store tens of millions of numbers or words, and it can get at these numbers, bring them in, and do its arithmetic processing in a few millionths of a second. So the two prime functions, then, are first the business of data processing, the arithmetic steps that the computer takes, and second its ability to store
information and retrieve that information for further processing or manipulation. One important part of this memory is that it also stores the instructions, so that the machine may work in an automatic way: many, many thousands of these individual steps can be done automatically by what is known as a program, which is also information, words or instructions, stored in these same high-speed memories.

In talking about this automatic process: does the computer in fact work in terms of the same kind of arithmetic that we know, the number systems that we know, or are there other approaches in use?

Well, mathematics has developed a large number of number systems. Most people, for instance, think in terms of the decimal system, with the characters 0 through 9 as the numbers. Actually, most computers are more efficiently designed to work in what is called the binary system: just two numbers, 0 and 1. But there are relationships between the binary system and the decimal system, and it is easy for a computer, for instance, to take information that a human gives in decimal form, convert it to binary form, perform its arithmetic operations in binary form, and convert it back to decimal form, giving you information in the system that you understand.

Is this a convenience because of the electrical mechanization of the system?

Yes, it is more efficient: you can build faster, smaller, and cheaper computers if you use the binary system.

You indicated this large growth in computer utilization. What are, in fact, some of the applications, and the areas of our social interest, that utilize computers?

You know, Bob, one of the biggest applications is our federal government, and state governments, and even our municipal governments. It's amazing; it would stagger the imagination to find out how many millions and millions of words of information, vital statistics gathered from all the citizens of the
country, are stored away in the memories of computers possessed and operated by our various governments. They are probably the largest user.

So this is by and large a storage and processing of specific kinds of data. It's not so much a manipulation of the information analytically as merely a storage and library function?

It's not only that; there is quite a bit of processing for planning ahead of various logistic operations. There is a lot of what might be called really sophisticated mathematics that is necessary to plan a modern government operation. One of the second big users, becoming more and more important to our everyday life, is the large automated factory. The automobile industry is becoming highly automated, and the petroleum and chemical industries have large automatic factories, which are having an impact on us humans because of the potential effect on our labor situation. Replacing, let's say, human labor by automated factories is a very important development in the field of automation and computers. It's also interesting to look at some of the things that are being done in terms of the use of computers for our health. The computerized hospital is here, a practical thing.

What do you mean by computerized hospital?

Well, a number of hospitals have now very successfully developed methods by which computers keep track of the records of the patients being treated in the hospital, and the records of the drugs that have been given. The computer determines when the nurse should give new drugs, so that there will be no mistake, let's say, made in the treatment of the patient. Then there are the laboratory tests, for instance urine tests and blood tests and X-ray tests. Nowadays, in many hospitals, these data are taken and sent immediately to computers, which analyze them and actually perform very
effective diagnoses themselves, and tell the doctor what the disease is.

So we have here a very nice observer and consultant, which has a perfect memory, so to speak, and a large memory as a matter of fact, to make comparisons and correlations. Is that the kind of thing?

Yes, and in this particular case, since the process can be well defined, let's say, by people who think about what the process should be, the very high reliability of these computers makes them very dependable in this sort of function. Another example: stock brokerage information is sent across the country, through Western Union or the telephone company, to central computers, and decisions to make transactions in the New York stock market are controlled by these computers, even to the point of making analyses to determine the best time to buy or sell stocks for the most profitable type of operation.

This means, however, that they must have some rules by which to instruct the computer.

That's right, and a lot of theoretical research has been done to develop, let us say, the proper mathematics for this type of business. It also implies an understanding of what the process is. Well, it's true that there are a lot of things one can't predict about the trends of the stock market, so that sometimes these computers don't make the right decision. Airlines, too, have done a very good job in automating their operations. Many of us have probably been impressed by the very great efficiency of the ticketing situation these days: how easy it is for you to obtain a reservation, how reliably your reservation is kept for you, and so on. The whole control of ticketing, and all of the operations of the large airlines including flight control, are highly automated with large-scale computers.

I remember many a time standing at a counter and asking if there was a flight available from so-and-so at such a time, to which I wanted to change, and in moments I could get an answer. It wasn't just a casual telephone conversation.
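The decimal-to-binary round trip described earlier (take decimal input from a human, convert it to binary, do the arithmetic in binary, convert the result back to decimal) can be sketched in a few lines. This is a minimal illustration of the principle, not of any period hardware:

```python
# Sketch of the decimal -> binary -> arithmetic -> decimal round trip.
# Real machines do this in circuitry; this only shows the principle.
def to_binary(n: int) -> str:
    """Convert a non-negative decimal integer to a binary digit string."""
    bits = ""
    while n:
        bits = str(n % 2) + bits   # peel off the lowest-order bit
        n //= 2
    return bits or "0"

def from_binary(bits: str) -> int:
    """Convert a binary digit string back to a decimal integer."""
    value = 0
    for b in bits:
        value = value * 2 + int(b)
    return value

# Add two numbers given in decimal by carrying them through binary.
a, b = 19, 23
total_bits = to_binary(from_binary(to_binary(a)) + from_binary(to_binary(b)))
print(to_binary(a), "+", to_binary(b), "=", from_binary(total_bits))
# prints: 10011 + 10111 = 42
```

The two conversion routines are inverses of each other, which is exactly the property Dr. McCann relies on: the human never needs to see the binary intermediate form.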
Actually, the agent probably picked up a telephone and dialed a number and talked to something; perhaps there was a person at the other end pushing buttons to get information out of a computer. In some cases, however, they have a direct connection to the computer, and the agent can in fact address the question to the computer.

How about other areas of our society?

Well, some of the more glamorous ones, which we read about in the newspapers and so on, are the military and our space research programs, which have had a very profound impact on the fundamental development of new computer technology. There is the need for automatic control, obviously, in outer space, in situations where human beings cannot be effective, or perhaps cannot even be present. Out of this has come a very profound amount of very fine basic research in automation and computers. Perhaps, however, some of the more interesting applications of computers today are in education, and in basic research in educational institutions such as ours at Caltech and such notable places as, I mentioned earlier, MIT and Pennsylvania and Harvard.

What is the business of research in computers? Is it the development of computers per se, or their utilization, or is it both?

Well, actually, first there is the present technology of computers, whose beginnings we saw coming. The modern computer, as I say, has been developed largely through the application of modern electronic technology and the use of formal mathematics. These types of computers, just as we know them today, and as they work, let's say, effectively, are very useful in basic research for processing the data of modern research. It's interesting to note that in some of our modern areas of research, such as nuclear research, or research in modern biology and the complex questions we wish to ask about
living nervous systems, the approach is by and large the definition of a problem in mathematical terms, and then asking the computer to perform the mathematics. Modern basic research is so complex, and gathers so much information, that even the storage and keeping track of the information, so you don't lose it, may require these large memories; and then you have to, let's say, examine the data and correlate many different bits of data. There are many situations where existing formal mathematics is quite adequate; the technology, in other words the mathematics that exists, is quite adequate. However, there is developing in a number of institutions, and Caltech is one of them, a basic research program which is focusing its attention on what we call information science. Now, this is research which is trying to get at, let's say, a better understanding of how human beings, or any living nervous system, understand and process information and do creative thinking, rather than do formal things defined, let's say, by mathematics, in order to determine how we might build a next generation of computers which could be even more fascinating in what they can do, and which might be what we will call the products of automatic or artificial intelligence of the future.

The language of the computer today is mathematics, essentially as we know it, in virtually all applications. Are you talking here about different kinds of mathematics, richer languages so to speak?

Well, there are many kinds of mathematics which deal purely with numbers, and in terms of, let's say, a language which describes them, they are not too complex. These languages are nowhere near as rich, say, as the languages by which we converse, the conversational languages. We call these languages algebraic languages; they deal with small units of numbers. You may denote the numbers by symbols, and the language is typically one of taking a given number and multiplying it by another number, through a formal series of these sorts of things. Out of this kind of rather simple language one can take arithmetic, expand arithmetic to algebra and geometry, one can approximate calculus by finite arithmetic, finite-difference arithmetic, and one can deal with stochastic mathematics, the theory of probability. These are the more common formal mathematics used by the present types of computers.

So you're talking really about going beyond this?

Yes. Of course, in our basic research that seeks a better understanding of artificial intelligence, we are trying to find, let's say, a richer kind of language that can deal with more sophisticated concepts, and we feel that the basic concepts by which we can do this are here. The biggest problem, let's say, is not so much one of
building a computer that can do something more creative, but one of understanding what creativity is. If one can define it in terms of a richer language, one can build a computer that can perform more creative thought.

What are the elements of creative thought, in the context of our present remarks? What are the things that you would call creative, as it affects a computer, or what the computer can do?

Well, one good example of this, Bob, is the problem of vision. In fact, this is one of the areas of research that we're engaged in, to try to get a better understanding of what we call creative thought.

I'm kind of curious that you selected the example of vision. Does this have a particular uniqueness or advantage?

Well, one advantage of it is that the visual system of man starts with the sight sensory system of the eyes. The information, let's say, which comes into the living nervous system through the sight sensory system can be well defined: it consists of known visual patterns, and the process, let's say, of trying to probe into the visual nervous system is easier than going higher up in the nervous system. The biologist has learned, let's say, how to get inside the visual nervous system, without unduly damaging a living organism, to actually see how it functions, to get information on how it functions. So from the biological point of view there is an advantage: you can get at it, understand it, and perhaps profit from that experience. Now, it's also interesting that in humans, for instance, the sight sensory system is the most sophisticated of all our sensory organs.

So you must be talking about lower forms of life?

Well, even in humans we're interested in vision. For instance, 90 percent of all the information that comes in through the sensory organs of man comes in through the eyes. Each eye, for instance, has over 100 million rods and cones; in other words, there are over 100 million little points of
detail in a visual pattern, let's say, that is projected on the retina of an eye, each like a dot on a television set, but 100 million of them rather than a few thousand. Now, because this type of sight sensory system, the human system, is so complex, it's also important to do research on simpler organisms. The research program at Caltech, which is seeking to understand the principles by which a visual nervous system abstracts visual patterns, is being done not only on humans but on insects and many different, successively more complex, animals.

I believe I've heard of work going on with the frog.

Yes, there has been a great deal of work at MIT on the frog, because it has been discovered to have one interesting property which has enhanced understanding of this process of abstracting visual patterns. In the eye of the frog, it's been discovered that right behind the rods and cones, where the first light information is transformed from light energy into electrical signals (which are the messages, the physical form in which all information flows in the nervous system), there are several million little tiny neurons. These neurons take, through very fine dendritic connections to the rods and cones, the information from many rods and cones, produce new electrical signals, and send them on up to another point in the nervous system. This is the first stage of data processing, of abstracting a concept: they take signals from a large number of these light-sensing elements in the eye, organize the information in some particular way, and then send it on to other processing centers; they send it up into the brain of the frog. But, interestingly enough, this first stage of data processing, or
computing, abstracts certain important properties that the frog needs for survival. There is, in effect, a kind of program already right there in the retina; one can even think of the retina of a frog as a small initial computer, very rapidly computing certain things necessary, let's say, for the survival of the frog. For instance, some of these neurons recognize a fly when it's flying across the field of vision, simply by noting the fact that there is a small dark object, that it is moving, and the direction in which it is moving. This determines for the frog that this may be a fly, flying in a certain direction, within his range, and he sticks out his tongue and tries to grab it.

So that's the final signal that comes back out of the computer: the signal that will have the machine, the frog, act in a certain way. Is that the idea?

Yes. As a matter of fact, we think we know enough about this principle: small dark objects might be thought of as insects, while a very large dark object the frog might think of as a hawk, in which case he should jump in the water instead of sticking out his tongue. Enough is known, for instance, from research of this sort on the principles of that type of pattern extraction, that a number of people have actually built electronic pattern recognition devices which can function as well as the frog's.

So one element of this artificial intelligence is the business of pattern recognition, or organizing information?

Yes. This principle of how you abstract important patterns, concepts, out of tremendous amounts of information, millions and millions of bits of visual information, is probably the same process by which the central cortex of the brain abstracts other kinds of concepts from non-visual information. It's hoped that this is the case, and that therefore, by learning how vision works, we will learn how the central cortex does creative thinking.

What other important element makes up this
notion of artificial intelligence?

Well, to actually understand it in a conceptual manner requires what we think of as a language, an adequate language of description. In fact, information science is based upon the theory of languages, mathematical linguistics. Now, the formal languages of mathematics, the algebraic languages, the formal types of mathematics that have been developed today, are extremely simple in terms of their capacity to describe something as complex as, let's say, a human nervous system. We think of mathematics today as very complex, but the mathematics of algebra and calculus and stochastic mathematics is really not adequate to describe these very complex processes.

Is it not rich enough in terms of the amount of information it can handle, or in the complexity of the language?

We can suggest what we mean by a richer language by saying, let's say, that a noun describing an object carries the amount of information contained in the real description of the object. A single number may be ten digits, but the recognition of a person as Bob may require millions and millions of bits of information, as the mind scans through its memory to finally say, yes, that's Bob. So "Bob" is a much richer word than, say, a simple number or digit. This is what we mean by richer language.

Well, how has this interest in examining and gaining experience with the lower animal forms assisted developments in computer technology, both in terms of pattern recognition and in terms of language development?

Can we talk about the long haul? Actually, we've only made a beginning in really understanding the living nervous system. We've learned a certain number, a very small number
relatively speaking, thousands of bits, of important information, but that's only a very small percentage of the hundreds of millions of things, let's say, that we have to know before we truly understand a living nervous system. Nevertheless, the research that has been done up to the present time is extremely important. It has laid the foundations for new strategies of research, new theoretical strategies particularly, that must be developed in order to create, let's say, these richer languages. And, interestingly enough, mathematical linguistics has now been developed to the point where one can indeed approximately model some of the simpler aspects of the mind of man as he thinks in a conversational language. One of the professors at Caltech has actually developed, through the extension of the theory of logic in mathematical linguistics, a concept of a computer memory and a language structure whereby it really can think in English. It is told simple information, slowly, in terms of English, and it builds up knowledge just as a baby perhaps learns to recognize its first words. There is literally a learning process, and it is indeed simulating, in the first stages at least, a true mathematical concept of a rich conversational language, and perhaps some of the basic features of the way in which the mind works.

Is not this business of the learning process part of the creative element, or at least something related to it, a corollary to it?

Well, it's been well established that the living nervous system, most particularly of the higher-order animals such as man, is something which develops continually from birth. The living nervous system is continually changing the way in which it functions as a result of sensory experiences and new information which comes into it, and it is continually learning how to think and function in terms of richer and richer languages which describe its true behavior.

I get the impression that at the moment we're very much in the beginning stages of the learning process ourselves, in the development of computers as the sources of artificial intelligence. Would you say that we are at the crawling stage, so to speak?

We're just beginning to crawl, but we're making progress. And perhaps we can look forward someday, in the not too distant future I hope, to devices which have languages in which to work and communicate, and which can handle larger quantities of information and extract other information from them.

Well, thank you very much.

This was About Science, with host Dr. Robert McGregor and his guest Dr. Gilbert McCann, professor of applied science at the California Institute of Technology. Join us again for our next program, when two more prominent scientists will discuss a subject of interest. About Science is produced by the California Institute of Technology and is originally broadcast by station KPCC in Pasadena, California. The programs are made available to this station by National Educational Radio. This is the National Educational Radio network.
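The frog-retina "bug detector" described in the interview, neurons that fire when a small dark object moves across the visual field, can be caricatured in a few lines of code. This is purely an illustrative sketch: the grid frame format and the size threshold are invented, not taken from the biological work.

```python
# Caricature of the frog's retinal "bug detector": fire when a small
# dark object has moved between two frames. Frames are grids of
# 0 (light) and 1 (dark); max_size is an invented threshold.
def dark_blob(frame):
    """Return the set of dark-cell coordinates in a frame."""
    return {(r, c) for r, row in enumerate(frame)
                   for c, cell in enumerate(row) if cell}

def bug_detector(frame_a, frame_b, max_size=3):
    """Fire (return True) iff a small dark object is present and moving."""
    a, b = dark_blob(frame_a), dark_blob(frame_b)
    small = 0 < len(a) <= max_size and 0 < len(b) <= max_size
    moving = a != b            # the dark cells changed position
    return small and moving

still   = [[0, 0, 0], [0, 1, 0], [0, 0, 0]]
shifted = [[0, 0, 0], [0, 0, 1], [0, 0, 0]]
print(bug_detector(still, shifted))   # True: small dark object, moving
print(bug_detector(still, still))     # False: dark object, but stationary
```

The point of the caricature is the one McCann makes: an enormous amount of raw input (every cell of the grid) is reduced to a single meaningful bit, "fly" or "no fly," before anything reaches the brain.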
About science
About artificial intelligence
Producing Organization
California Institute of Technology
Contributing Organization
University of Maryland (College Park, Maryland)
If you have more information about this item than what is given here, or if you have concerns about this record, we want to know! Contact us, indicating the AAPB ID (cpb-aacip/500-8s4jr24n).
Episode Description
This program focuses on the science behind artificial intelligence. The guest for this program is Dr. Gilbert D. McCann, California Institute of Technology.
Series Description
Interview series on variety of science-related subjects, produced by the California Institute of Technology. Features three Cal Tech faculty members: Dr. Peter Lissaman, Dr. Albert R. Hibbs, and Dr. Robert Meghreblian.
Broadcast Date
1966-11-21
Media type
Guest: McCann, Gilbert D.
Host: Hibbs, Albert R.
Producing Organization: California Institute of Technology
Producing Organization: KPCC
AAPB Contributor Holdings
University of Maryland
Identifier: 66-40-12 (National Association of Educational Broadcasters)
Format: 1/4 inch audio tape
Duration: 00:28:18
If you have a copy of this asset and would like us to add it to our catalog, please contact us.
Chicago: “About science; About artificial intelligence,” 1966-11-21, University of Maryland, American Archive of Public Broadcasting (GBH and the Library of Congress), Boston, MA and Washington, DC, accessed March 2, 2024.