The Machine That Changed the World; Interview with Mitch Kapor, 1990

- Transcript
Can you tell me your first recollection of your first experience with computers? It was at school. I was back in high school, and in fact this was during the summer, at a summer program sponsored by the National Science Foundation for high school juniors, and they actually had a computer, an old 1950s-vintage computer that looked like an oil burner. It was in a cabinet about that same size. It had a little ammeter on the front. It had some absolutely tiny amount of memory, but we could program it. What did you think of it? Well, I thought it was a great toy. I mean, I was aware that people did scientific calculations on it, and one of the other computers that we had access to was a big CDC 6600, one of these machines that filled a whole room. You could write a program by punching punch cards, one at a time, and submitting your deck of cards. They took us through a tour of the computer center and they gave us a demo of computer music.
But it was pretty clear that the real use in those days was to do batch processing for big corporations, to run their financial accounts and things. What about when you were in college? Well, I had more opportunity to play with computers. I took a couple of programming courses in college, but there was a skill set that you needed to have to be a really good programmer. You had to be very careful and meticulous and disciplined, because if you made even the smallest error in syntax, if you misplaced a period or omitted a parenthesis, your entire program wouldn't run. And it would take hours for you to submit the deck of cards and to get it back, so the little mistakes could completely cripple your productivity. I just didn't have that kind of skill or patience, and so I found programming on the old big machines to be unbelievably frustrating, and I used to get
really angry about it, because I couldn't understand why the things weren't easier to use. So what had to change for computing to become interactive? Well, pretty much everything had to change. They had to shrink in size, they had to shrink in cost so that an individual could afford one. They had to come with the capability to let people who weren't extraordinarily technically gifted use them, which meant that there had to be all kinds of new software developed, from programming languages to text editors. And the users had to change, to justify a whole industry that made personal computers. So in the late 70s things began to happen? Yeah, well, sure. I mean, in the early to mid 70s, even before there were actual personal computers, people like Ted Nelson were writing
about them as though they already existed. And then in the mid 70s the first kits came out that you could assemble; you had to solder things together to make them work, and since I flunked soldering in high school I kind of had to pass on that period. But by the late 70s machines like the Apple II had come out, and they worked like consumer electronics: you took them out of the box, you plugged them in, and they had great color graphics and sound and games, and I became totally hooked. So what was the first thing that changed? What happened was that the microprocessor was invented. People at Intel had no idea what it was going to be used for; they thought it would make intelligent refrigerators and microwaves. But a bunch of hobbyists, really a bunch of kids out on the West Coast, saw the opportunity to build their own computers using these very inexpensive microprocessors, which were
the heart of the system. And that was the first big enabling technology: cheap chips. That's where it got started. So this sort of quantitative change made possible a qualitative change in the whole experience? Yeah. Well, what happened was that these hackers on the West Coast, the Homebrew Computer Club, not only wanted to build their own computers, which they could finally do given microprocessors, but they were also doing this originally not for any commercial intent but as a hobby, for the sheer joy of it, because they loved doing it. And because they had that kind of feeling about what they were doing, they always shared their information very freely. And the idea was not to build a proprietary system but to build one system after another, each of which incorporated
everything everybody knew about building these devices, so they got to be better and better very quickly. And they also came with a lot of information about how you could extend a machine, how you could program it, how you could add more hardware to it. So the whole idea of open systems, that a machine almost doesn't come as a finished product, that other people can come along and make it work better or add new functions to it, is really a product of that original hacker mentality. And that idea persists? Well, sure, because the whole computer industry now believes in open systems, meaning ones in which anybody can come along and write a piece of software for a given piece of hardware. And if people out there in the marketplace like the software, then it will succeed. And so it's easy to get into the computer
business or the software business. Not that many people succeed, but because there is the promise of a great reward if you do succeed, it attracts a lot of the best minds and the most creative efforts. And the notion of building a computer industry around open systems really does come from that original group of hackers. You were one of those people who got into the business. How did that happen? Well, in the late 70s there was no personal computer software business. There was a software business in mainframes, but that was Mars or Jupiter or some other planet. And in fact the business community and the software industry thought of personal computers as toys. I mean, they ran games, they were cute, but you couldn't do real work with them, they didn't have real software, and
they were widely discounted. I saw an opportunity. I thought that the key was that the personal computer changed the relationship between the user and the machine. In the old days of mainframes, they were kept in special rooms. You couldn't even get into the room, you couldn't touch the machine, you had to do all of your work through an intermediary. With a PC there was no intermediary. You could sit down yourself, you didn't have to have a degree in electrical engineering, and you could begin to do useful work. It really freed people, especially people in the business world, from dependence on data processing staffs. So it was very empowering, and I thought that that empowerment of the individual user could give rise to a whole industry of new products that were aimed
specifically at the non-technical professional. That's interesting, because the computer was invented as an arithmetic engine, and in the first cases it was used for calculation. OK, well, the original uses of computers, going all the way back to World War II, were to calculate the trajectories of projectiles. But I think it was very quickly recognized in computer science that the computer was capable of manipulating not only numbers but symbols in general, and certainly the training that I had reinforced that. In fact, the first couple of programs I wrote were much more mathematical or statistical in nature,
but the idea of using the computer as a medium of information and communication was never very far from my mind. Can you talk about the program that first presented these ideas, the spreadsheet? Well, it's interesting that every once in a while you can point to an example where a single individual invents a concept, heretofore unsuspected, that has a profound and revolutionary effect on how people work, and Dan Bricklin's invention of the spreadsheet counts as one of those. He was at the Harvard Business School at the time, in a classroom where the professor was having the students repeatedly run a financial simulation on a calculator. And so he would say: well, suppose
the interest rate were 12 percent, not 10 percent; then would you make the investment? And what the students would have to do is punch in many hundreds of numbers to rerun the entire calculation each time one of the critical variables changed. Bricklin saw a way to take the kinds of calculations that were done on a calculator and put them into a format on a computer screen that resembled the paper spreadsheets that accountants used to do financial projections, in a way that radically simplified the whole problem. Because with a computer you could just change one single number, you could change the number that said 10 percent to 12 percent, press a button, and the computer would do all of the recalculation. It was a radical breakthrough. No one had ever thought to set up a program that worked that way. And it just caught on like wildfire.
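To make the recalculation idea concrete, here is a minimal sketch in Python; the function name and figures are illustrative, not anything from VisiCalc itself. The whole financial model lives in one routine, so changing a single assumption reruns the entire calculation that the students had to re-key by hand.

```python
def investment_value(principal, rate, years):
    """Compound a principal at a fixed annual interest rate."""
    value = principal
    for _ in range(years):
        value *= 1 + rate
    return value

# The hundreds of re-keyed calculator entries become one changed argument:
print(investment_value(100_000, 0.10, 5))  # the original 10 percent assumption
print(investment_value(100_000, 0.12, 5))  # "suppose the rate were 12 percent"
```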
Everybody in business who ever had to do a budget or a financial analysis recognized this as a fundamental improvement. Did it sell? Well, yes. In fact, because VisiCalc, which is what they called it, the first spreadsheet, came out on the Apple II originally, it gave Apple an enormous boost, and it made Apple a credible entrant into the business market. It was the thing that really made people take personal computers seriously as a business tool. What makes a spreadsheet a spreadsheet? Well, one of the funny experiences that happens to me, and this has been happening for almost 10 years now, is that every few months somebody comes up to me and says: well, I was around in the era of minicomputers, before VisiCalc, and I actually invented the spreadsheet. And I always have to laugh, because what
they generally mean is that they worked on some sort of financial modeling language, where you would use a very cryptic language of formulas and equations of a very technical nature to create some budget or financial analysis. The essence of what a spreadsheet is is different. It's not in the formulas; it has to do with the fact that a spreadsheet has, first, a visual appearance. When you look at the computer screen you see before you a tableau of rows and columns, and each cell in this tableau might have a number in it or might have a label in it. In fact it corresponds very closely to a paper spreadsheet, except you see it on a computer screen. There's also a cursor, a pointer, a highlight. One of the cells at any given time has a
highlight on it, and you can move that pointer or highlight around, either using keys on the keyboard or a pointing device like a mouse. You can type things in, and what you type in is reflected up on the screen. And when you actually build your financial model, the way you do that is you essentially say: I want this number here to be the sum of that number and that other number. And you literally do that by pointing. So one bit at a time, one relationship at a time, you build up the meaning of the spreadsheet, but all the time you're interacting directly; you're pointing to spots on the screen, you're typing things in. And what happens is that you endow the spreadsheet not only with its visual appearance but with behavior, and the behavior is that thing which models the financial situation you're looking at: profit and loss, or a forecast of how a business is going to do.
The point is that the way you build up the spreadsheet, and its appearance and its behavior, are all unique to the spreadsheet and are very different from the way you would work with an old-style financial modeling language. A spreadsheet has much more direct interaction, direct manipulation. You work with it iteratively. You don't have to specify 100 formulas in advance. You do something, you try it; if it's not right, you fix it. And so the term that Ted Nelson coined to refer to this essence of what makes a spreadsheet a spreadsheet, or what makes a program a program, is what he called a virtuality: a little world unto itself that has an appearance, that has behavior, that has ways that you as the user interact with it, and that has meaning.
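As a rough illustration of the mechanics being described, here is a minimal spreadsheet sketch in Python, with cells holding either plain values or formulas that refer to other cells. The cell names and structure are hypothetical, not VisiCalc's or 1-2-3's internals.

```python
# Cells hold either a plain value or a formula over other cells.
cells = {
    "A1": 100_000,                                   # principal
    "A2": 0.10,                                      # interest rate
    "A3": lambda get: get("A1") * (1 + get("A2")),   # "A1 grown by A2"
}

def get(ref):
    """Evaluate a cell, following formula references recursively."""
    value = cells[ref]
    return value(get) if callable(value) else value

print(get("A3"))    # 110000.0
cells["A2"] = 0.12  # change one assumption...
print(get("A3"))    # ...and the dependent cell recalculates: 112000.0
```

The behavior (recalculation) rides along with the appearance (the grid of cells), which is the pairing the term "virtuality" is pointing at.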
So a virtuality is an illusion? Yes, of course it's an illusion. Why does it have leverage? The reason that it has leverage is that that particular virtuality is well suited to use by non-technical professionals, by regular human beings, because it manages to eliminate a whole set of very daunting requirements. You don't have to memorize some intricate formulas or particular language syntax. You don't have to be so precise and careful, because there isn't a huge penalty for making a mistake. If you leave something out, it will immediately tell you, and in fact it will essentially show you how to correct it. It produces instant gratification,
because it keeps up with you: as you change the assumptions in the spreadsheet, you can see the results changing as you go along. So all in all, behaviorally, it's much more suited for human purposes. Imagine an analogy to buildings. Suppose we didn't have any homes. Suppose we just had big warehouses that weren't divided up into separate rooms, and there was no carpeting. I mean, they provided shelter, but they had a concrete floor and were very uncomfortable. And then somebody came along and all of a sudden said: well, you know, we could divide this up into rooms, we could put carpeting down, and you don't actually have to cook outdoors, you could cook indoors and have a stove, and you could have indoor plumbing. What you would say to describe why a house is so much better than a factory for living in is that houses are much better suited for human purposes of living
than warehouses. And a spreadsheet is much more like a comfortable house for doing financial analyses than a factory. And the fact is that people in computer science and in engineering had been used to building factories, because they had an industrial clientele and that was what the demand was for. The new enabling technology of PCs made it possible to build comfortable homes. How did Lotus come about? Well, after VisiCalc came out, IBM announced its entry into personal computers, and of course IBM had enormous credibility with the business market. They came out with a much more powerful machine than the Apple II: more memory capacity, higher speed. And I saw an opportunity to create a commercial product that would include a
spreadsheet among some other things, that was really designed to take full advantage of an IBM PC that was five times more powerful than the Apple II. And that product became Lotus 1-2-3. And it was successful. Yes; I think it's still the single most widely used application in the world, and the total number of users must be approaching 10 million at this point. So there was a burgeoning new PC software industry. But it would still be true to say, wouldn't it, that many people found operating personal computers quite difficult? Yeah, I think it's fair to say that even as the PC software industry was expanding enormously, people continued to find the very products that made them more productive
also difficult to use and frustrating at the same time. And a variety of interesting adaptations have been worked out that have enabled us to get to the point where there are tens of millions of PCs. For instance, virtually all large corporations, ones that have tens of thousands of computers, now have staffs of hundreds of specialists who go around and install software and solve problems and provide technical support, and without that core of specialists people wouldn't be able to use spreadsheets. It's like once they get into the spreadsheet, they're fine. But these things are still very difficult to use and very frustrating. The original dream of personal empowerment has really taken much, much longer to realize, and is still not there by any means.
Can you explain what an interface, a user interface, is? OK, well, you can get a room full of experts arguing day and night about what an interface or user interface is. But in just commonsense terms, there will be some set of visual elements on the screen that characterize how you work with the machine. They may be menus, dialog boxes, buttons. Those elements, and how you use them, constitute the user interface. Functionally, they're the equivalent of the knobs and buttons and switches that you would use to operate a piece of physical machinery. Here the machinery is, quote, virtual, and so the user interface is really the virtual knobs and buttons and switches.
Now, there were people who, well before there were personal computers, dreamed of what interfaces software could provide. When you look at the history of these ideas, why did they take so long to arrive? Well, it has taken 15 years or more, 20 years really, for some of the very key ideas about making interfaces easier to filter from the original research labs into commercial practice. Now, it's kind of an assumption that that's, quote, a long period of time. I mean, there's one school of thought that says that any time you have an invention which is going to involve changing human behavior, it takes 20 to 30
years to move from conception to being very widespread; it seems to be almost a law of nature. In the particular case of computers, I can point to several reasons why it's taken this long. One reason is the requisite hardware that you need to run an easier-to-use interface. It's funny, in a way, that you need much more powerful hardware to create a simpler interface, but that's the way it works, and the hardware that you need to make a really simple interface simply hasn't been available at a low enough cost until fairly recently. That's one reason. A second reason it's taken so long is that the culture of the computer industry is very
engineering oriented. It by tradition has not been extraordinarily sensitive to the needs of users, since for a long time the consumers of computing devices were just as technically sophisticated as the people who were making them. Making things easy to use was just not a high priority, and it takes a long time to shift a cultural tradition. That's happening in the computer industry, but it doesn't happen overnight; it happens over a generation. And I think that's the real reason. Who are the important people in this history? Doug Engelbart is the single most important person in the history of computing, at least as far as end-user computing is concerned. And it's shocking how few people have actually
heard of him. It would be as if we all used electricity and light bulbs but nobody had heard of Thomas Edison. He was at Stanford Research Institute for a long time, through the 60s and into the early 70s, and he and his group invented and put into working form most of the innovations that people now find to be important. For instance, he invented the mouse, the pointing device that is used in graphical user interfaces. They invented the word processor. They invented and created a working hypertext system. They developed the first working collaborative software, that is, software which was specifically designed to enable a group of people to work together and communicate more effectively.
And what happened was that a number of the key people and ideas moved from Engelbart's lab to a lab at Xerox called Xerox PARC, and from there the next major realization was at Apple, in the Lisa and the Macintosh. And now it's in the process of becoming mainstream. Are you going to show any of that '68 Engelbart tape? Do you think historians will look at that demonstration as a watershed, because it made these ideas visible? What were the really good ideas in it? One can see in it one of the most important ideas: direct
manipulation. I mean, if you think about it, we have a variety of ways of indicating what we want. Suppose you're in a foreign country and you don't speak the language and you get a menu. You're going to find it very difficult to order off that menu, and it will be very frustrating. On the other hand, if you're in Japan at a restaurant where they actually have all of the dishes made up in these wonderfully clever plastic forms, you can point to what you want; you don't have to speak a word of Japanese. So that goes to show that sometimes pointing to what you want is a lot simpler and easier than forming a linguistic expression. Same thing with computers. On the Macintosh you have direct manipulation: you get a pointing device called a mouse, and as you move it on the surface of a table, a pointer on the screen moves around, and you can just point to what you want. So the Macintosh
really represented a fundamental simplification, because most of what you do is done by a form of direct manipulation. You point to what you want, and then some very simple action, like clicking on a mouse, is done to indicate: that's it, or give me this one, or I want that. And so that really unburdened the user from having to memorize some kind of arbitrary command syntax. Many more of the functions are brought out onto the surface; you can see them on the screen with a Macintosh, and that makes a big difference. Another big innovation, without which direct manipulation really wouldn't be effective at all, is the fact that they use a screen which is made up of graphical dots, as opposed to characters. And that means the image, the icon, the picture, the figure, the illustration can
be a constituent as basic as the word, which was just not the case in commercial computers before the Macintosh. It means you have a visually richer environment. You can divide the screen up into separate regions or areas, and you can use the skills of a visual designer much better to create a structured environment, a structured virtuality, on the screen, simply because you've got a much richer visual vocabulary out of which to compose the elements. So that made a difference. The third big innovation, I would say, is the notion that Steve Jobs himself brought to it, which was, in the context of the Macintosh, a ruthless demand for simplicity. There were many machines of a research variety that had a mouse and pointing devices and graphical user interfaces and so on.
What the Macintosh did was simplify all that. Other devices had a mouse with three buttons on it, and you'd have to go and see: do I press the left button to get the menu, and the right button to extend the selection? Steve decreed: we will have a one-button mouse, and it's tough to go wrong when there's a single button. So they systematically went through and simplified the environment, at a bit of a cost in terms of flexibility, but it was so important at that point in time to swing the pendulum toward simplicity that it was clearly the right design choice, because it made it possible for my mother, and everybody's mother, to use a computer for the first time. Did this simplicity have to be paired with powerful hardware? Very much so. The hardware certainly had to be powerful enough to support this kind of graphics,
and the software too: you need a lot of complexity in the software that shields the user from the details of the bits and the bytes of the information. People still tend to think that it's the machine, that a Mac is a different machine. Yeah, but actually, as we've seen recently with Windows 3, it's really software. Increasingly, more and more of the differentiating factors of a computer come from the software, not the hardware. We're used to thinking about it as the machine, how is this machine different from that one, but in fact you're quite right to observe that it's software that makes the huge difference. Let's talk a bit about what software and computers actually are. As we said, they started as calculating instruments, and
then people started to talk about them as mental tools, but there's something else as well, which is that they may behave like media. Can you tell me about that? You know, I tend to think of computers as digital media: digital because they involve digital microprocessors and other kinds of electronics. But there's a very interesting comparison to be made between computers on the one hand, as digital media, and print media, like books and magazines, on the other. We're just on the threshold of the era of digital media, so it's a bit hard to see what it's really going to be like, but there are some interesting contrasts that can be made. When you read a book, for instance, the information is all fixed in its form. You know that whatever is on page 230 is what is on page 230, and that's the way it was
printed. Every single copy of that book has the same thing on that page. In a computer, the information is somehow much more mutable, much less fixed in its form. It doesn't have to have a single sequential order. Books have pages that are numbered one, two, three, four, five, six, seven, front to back, but on a computer, in principle, you could take the same overall collection or body of information and organize it differently for different purposes. You could even create a program that lets the user choose their own path through a forest of information. Computer media are much more interactive. A book doesn't do anything different based on what you do, because the information is fixed, but you can program a computer to respond differently based on choices or selections that you make, or
behavior that you put forward. So compared to print media, digital media are interactive, they're non-sequential. Digital media are also connected to the world much more directly than books and magazines. For instance, let's say you get your checking account statement in the mail. It's published; it's a form of print media, and you can see your balance there. In a computerized form of your checking account statement, you could not only see a balance, but there could be a button on the screen, and you could click the button, and that might mean: transfer more money into my account. In some way that you wouldn't have to be concerned with as a user, that would initiate a transaction that would result in more money being put into your account. Whereas you could imagine somebody drawing a button on your paper checking account statement, and you could tap it and click it as much as you want, and you know that nothing would happen.
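A hedged sketch of that contrast in Python: the Button class and transfer_funds routine here are hypothetical stand-ins, not any real banking system's API. The point is only that a screen button carries behavior behind its appearance, where ink does not.

```python
class Button:
    """A screen button: an appearance wired to behavior."""
    def __init__(self, label, on_click):
        self.label = label
        self.on_click = on_click   # the behavior behind the appearance

    def click(self):
        self.on_click()

def transfer_funds():
    # In a real system this would initiate a network transaction.
    print("Initiating transfer into the checking account...")

statement_button = Button("Transfer money", on_click=transfer_funds)
statement_button.click()  # the paper button does nothing; this one acts
```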
The paper just isn't connected to the world. Computers can be connected to the world, because of networks and connections to other computers, in a way that traditional media aren't, and this connectedness can also make an enormous difference. And digital media can emulate other media? Well, yes, there is a universality of representation with digital media, because you can store and present text and images and sound and video. So digital media are universal in a way that other, more specialized media aren't, and that's also extremely important. We're at that threshold, as you say, but we haven't got very good ways of thinking about this medium. We still tend to use it to imitate existing media, maybe print
so far. Well, when a new medium is developed, it typically emulates an old one in its early stages. The earliest movies were just filmed stage plays, and it took a while before cinema developed its own vocabulary. We don't have a really good conceptual framework for digital media that would let productions in digital media stand on their own. We're still groping around, we're still imitating old forms, we're trying to invent some new forms. And because advancing in this area really requires a lot of feedback from users, and is fundamentally a social process, not a purely technological process, it takes time, and I expect the next couple of generations really to be ones of
experimentation. Human generations? Yes. And there's a further complicating factor, which is that the technology itself is a moving target. It keeps getting better and better and cheaper and cheaper, and so it's always tempting to put off making a product to wait for the next latest and greatest generation of full-motion-video digital desktop teleconferencing hardware to come along. How does an industry based on digital media differ from an industry based on physical goods? Well, this goes back to the hackers again. The great thing about open systems is that there is a very low barrier to entry for a new firm to come into the business.
So let me give you a contrast with the old days, with traditional industrial products. If you want to make a component that goes into an automobile, with a few exceptions like tires and radios, you have to deal directly with Ford or GM, and they have tens of thousands of vendors that they deal with, but there's no real opportunity to just come into the marketplace, come out with a product, and see how it does, because you kind of can't hook into the car. I mean, the engine is the engine that the manufacturer puts into it. Whereas what happened in personal computers is that Apple or IBM puts out its basic machine, and they publish all of the specifications that tell you how to make software for it. And they publish all the specifications that tell you
how to make hardware that works with it: peripherals, printers, disk drives, monitors, and so on. What that means is that to enter the business you don't have to have a contract with Apple or IBM or the manufacturer. You don't have to have a relationship, you don't have to be approved by them. You just go out there and do it. A low barrier to entry. And that means you get thousands, tens of thousands, of entrants, all competing with each other to make software and peripherals. And what happens is the marketplace decides which are the good ones and which are not the good ones, by voting with the pocketbook. And so some products succeed; those are widely imitated in the next generation, and very quickly the overall quality level of products becomes very responsive to the needs of the people. It's a form of innovation that is
mediated not by the manufacturer but by the marketplace. Now, I've thought about this a bit, and I think the punch line is that it's possible in digital media to publish specifications that let you go and write software if you want to, whereas I don't think it would be possible to write specifications for an automobile that would make it possible for third parties on the outside to do engines or, you know, carburetors. And it has to do with the fact that there's a certain precision to digital media. I mean, it works the way it works and can be completely described symbolically, whereas physical objects, cars and houses, have this inherent sloppiness: a sixteenth of an inch here, a millimeter there, this is out of alignment with that. They just don't compose themselves together out of pieces the way you can with digital
media. So the bottom line of this, I think, is that in digital media it's possible to have a whole economy that's based on a very different type of relationship between the parties than in an industrial arrangement. So I think there's a new ecology of innovation in digital media that, if it's properly used, can be very creative and productive and beneficial to people. What's it like organizing a big software project? Does it resemble engineering? It's like going to hell and back. A big software project is like building a building, because it requires a concerted
effort by many different people with different specialties. You not only have the equivalent of carpenters and electricians, but doing software well requires understanding and collaboration between designers and engineers, in the same way that building a building requires collaboration between architects and builders. The architect or designer is really concerned with: how is this thing going to work for people? How are they going to be able to use it? Does it meet their needs? What do the users really care about? So the architect or designer looks at the project from the outside in, and makes sure that the suitability of the final product is going to be there.
The programmer or engineer takes a position looking from the inside out. They have to build it; they have to make sure that the code works, that the building stays up, that it's sturdy, that it performs the basic functions it needs to perform. And the best efforts are ones that are genuinely collaborative, where the outside-in approach and the inside-out approach kind of meet in the middle. Typically it's not smooth sailing, because there are different values, different working styles, and a whole bunch of incredibly messy real-world factors, like the fact that big projects require not one architect and one builder but large teams of people, who all have to coordinate in the exchange of information and let each other know what the others are doing, and it can be a very taxing process.
Was that true of 1-2-3? Well, the peculiar thing is that the original version of Lotus 1-2-3, the one that was developed and programmed by Jonathan Sachs and the one in which I acted as designer, was programmed by one person. Overall it was a two-person effort: one designer, one programmer. It had approximately twenty thousand lines of code. That would mean, if you actually listed all the instructions, it might be a book of about 100 pages. And Jonathan wrote that himself over the period of a year. Fast forward six or seven years later, and look at a project like the third major release of Lotus 1-2-3.
At that point, the product already had several million users, and what that meant is that the new version had to be letter-for-letter compatible with all of the previous versions, or you'd have a hostile lynch mob on the banks of the Charles River. So there was a demand for compatibility. There was also a need to be compatible with hundreds of different printers and display devices, where originally we had worried about only a small fraction of that. Bottom line: Release 3 of 1-2-3 involved hundreds of people, if you include all the programmers, the people who wrote documentation, and the people who tested it, and the total amount of code was probably, all in all, 50 times larger than the original. So it was a big project, though I should say not one that I was personally directly involved with at
all. But Lotus was fairly open in discussing the magnitude of the effort. Is it true that because software isn't governed by the laws of physics, problems are harder to trace? Flaws in physical systems are, relatively speaking, easier to debug, because in general the behavior of a physical system works on the principle of proximity, i.e., the things which are closest to other things have the greatest effect on them. So if you've got a problem in a spot right here, you start by looking in the immediate vicinity of that spot. If you've got something that's not holding its weight, well, you look to see if the joint is tight, if the screws are tight, and you don't have to go and analyze the whole building.
Well, software doesn't work like that; there's no law of proximity. If a problem manifests itself in a particular function, where you can see it on the screen when you attempt to execute a certain command, there is no simple and direct way of knowing which part of the code could have the problem. In some sense it could be almost anywhere, and so the detective problem of hunting down the source of the problem is enormously harder than in physical media, because digital media don't obey the same simplifying law of proximity of cause and effect. How do we deal with that? Well, I think we are going to have to develop our methods of working with digital media much further
before we can have the confidence that comes from understanding the materials. The physical world provides us with a whole set of natural constraints that we've used to guide the technology of building buildings. We don't have natural constraints when it comes to digital media, because code is not physical stuff, it's virtual stuff. So we have to, in fact, invent a set of social constraints. We have to discipline ourselves. We have to agree that not everything ought to be possible when it comes to building code, in order to save ourselves from being overwhelmed by insurmountable possibilities. In other words, we as people have to supply what nature has not, and without a certain number of constraints in building a program
that we live by, it cannot be successful. Let me give you an example. In every software project I've ever been involved with, there is an overwhelming temptation in the last days to go and change how it works. You can't do that with a building; you can't do that with hardware, because if you build a five-story building and you're a week away from being done, you can't add another story. Yet in software, at least at the abstract level, the possibility is there. Programmers say: it's just code, I could go in and add that routine, and we could add that new function. But what they don't take into account is the side effects, the consequences, the possibility of introducing other problems, or simply the inability to predict what the real effect will be of changing something. So the only way that I know to prevent that
from happening is to say: we will not make changes in the last week. We could, but it would be wrong. Nature doesn't stop us from making the change; we have to have the strength and the discipline not to make the change ourselves. But surely you can test the thing, you can debug it? Right, but there's enormous complexity in software, and I think the thing you have to understand is that programmers work the way medieval craftsmen built cathedrals: one stone at a time. There is no equivalent of industrial methods of production. We don't have the software equivalent of interchangeable parts. We don't have the equivalent,
in building software, of the assembly line. It's all handwork, and that means you don't have the precision or the reliability to be able to safely say: well, having built this, we think it works according to the specifications. Instead you have to exhaustively test it, and there are always particular combinations of circumstances that somehow have eluded the test plan. The way this problem will eventually be overcome is when computer scientists create methods of writing software that are more, quote, industrial: when they come up with the equivalent of building blocks, of interchangeable parts, of assembly lines, so that you don't have to handcraft every line of code. And that is again on the horizon, but it's not here yet. Is object-oriented programming the answer? Well, technically, object-oriented programming is
the approach which is seen as having, at least currently, the most promise in this area, but there are also things like visual programming languages and other techniques, data flow diagrams, that are all part of this broader undertaking of improving the methods by which software is created.
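As a small illustration of the "interchangeable parts" promise mentioned here, a Python sketch: a component written once and reused across programs, the way buildings assemble standard doors and windows. The ScrollBar class is purely illustrative.

```python
class ScrollBar:
    """A reusable interface part: written once, assembled into many programs."""
    def __init__(self, length):
        self.length = length
        self.position = 0

    def scroll_to(self, fraction):
        """Move the thumb to a fraction of the bar's length."""
        self.position = int(self.length * fraction)

# Two different programs assemble the same prefabricated part.
editor_bar = ScrollBar(length=500)
spreadsheet_bar = ScrollBar(length=1200)
editor_bar.scroll_to(0.5)        # position 250
spreadsheet_bar.scroll_to(0.25)  # position 300
```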
One area we want to talk about is the legal issues: software is protected by copyright law, so what does it resemble, judging from the cases? Well, there's a long tradition in the U.S. of applying copyright and patent to new media, and we've seen a progressive extension of copyright over the years to include motion pictures, for instance, which were certainly not contemplated by the framers of the Constitution. The current debate about how best to protect intellectual property centers around the issue of whether it is possible to successfully apply and extend copyright and patent, which are the tools we've had for a long time, to protect software. Keeping in mind, of course, that the constitutional reason for protecting software is not to enrich the people who make it; that's a means to an end. It is to stimulate progress in the arts and sciences, and the limited monopoly that people who make software get is in service of that broader goal.
And what I think is happening is that we're facing a fundamental crisis, which is that it is no longer clear that we can easily or gracefully extend copyright and patent to cover software in a way that reflects the original purpose, which is to be pro-innovation. Digital media may be sufficiently different in their characteristics and behaviors as to stretch the existing framework beyond recognition. And if that's the case, then we're going to need to go through a process, which I'm sure is going to be very difficult and painful at times, in which all of the parties in government and industry and academia sit down and try to go back to first constitutional principles about how we promote useful progress through digital media by offering the proper protection. So is this really something new, if you
will? Well, one way of looking at what's unique about digital media is that they somehow combine the characteristics of a traditional medium, like print, with the characteristics of a machine, in the following sense. Think of any machine: a videocassette recorder, a telephone answering machine. It has knobs and buttons and switches, and you do things with the knobs and buttons and switches, and it produces some sort of behavior or effect. Well, a computer is in some sense the same way. You've got a program that has an interface, it has some knobs and some buttons, and you do things to it and things happen. You can create a spreadsheet that way, or a document, or you could control the lights in your house. But what makes a computer program unlike a traditional machine is that traditional machines don't have content; they're not
information-bearing. They don't have meaning and semantics, whereas most of what you see on a computer screen is words and images. Ted Nelson wrote a book with the title Literary Machines, and I think it's appropriate to borrow that: computers and computer software are this new hybrid that's sort of part machine, part media. It's got machine-like characteristics, it has publication-like characteristics, and they're seamlessly interwoven into a single whole. Trying to make sense out of that fundamentally new type of beast is one of the real challenges of digital media. Is this the next big thing? Well, I'm always reluctant to proclaim that anything is the next big thing, because we've been repeatedly disappointed by too many next big things.
But the more I look at this, the more credibility I think the claim has that digital media have at least the potential to be as important as the printing press was in its time; in other words, a major turning point for society. When you look back at what's been achieved, and at what remains to be done? Clearly, for some tens of millions of people, computers have proven themselves to be very useful: for spreadsheets, for word processing, and for a variety of other tasks around the office and, to a lesser extent, at home. That said, I think I have to confess my own impatience and frustration. I think
computers are still much too difficult to use. They're extraordinarily frustrating if you step beyond doing very simple things, and I don't think they have to be that way, but yet they are. And at the same time, the potential of the computer and software, in the context of digital media, to help us be better informed and to communicate with each other: we really haven't begun building those kinds of products yet for commercial distribution. So, in a sentence: a lot of progress to date, enormous frustration because they're still so hard to use, and most of the great potential of these devices still unexplored. Do you think networks will be part of that? I think networks, both local
ones and wide-area ones that span the whole country or the whole world, are the new frontier. They are the electronic frontier. And I'm optimistic that there are amazing new services and products and ways of communicating with each other that will come about, or can come about, as we are increasingly networked together. You have to know the particular command and all that, and the information just scrolls by very rapidly; if you want to see something that's halfway up the list, you have to be very nimble with your fingers. OK, so tell me about this. This is the basic interface that most of the tens of millions of people who use personal computers see every day. It's Microsoft
DOS, and it's character-oriented. There are no pictures or images; you just see text. If, for instance, I want to find out what files I have here, I have to know the particular command to type in. The right magical incantation is d-i-r, and then I see all this information scrolling by me, actually faster than I can manage to read it, and so I have to know the other proper incantation to stop the thing in midstream and start it and then stop it again. And every time I want to do something, I have to know a particular expression to use. If I want to copy a particular file from one place to another, I have to know how to type the particular commands, in the right order and the right syntax, and why am I typing a backslash here? It turns out that there's a lot of memorization, and it's also very arbitrary. It wasn't designed to be easy to use.
It was designed to run, originally, on very small machines that weren't capable of acting very smart. What happened, though, is that more powerful software was developed, which runs on exactly the same machine, and which completely changes both the look and the feel of the machine. So: the same exact hardware, but a completely different way of interacting with the machine. If I bring this up, and it'll take a minute to come up, I'm going to see a graphical screen, I'm going to see some color. I have a pointer that I can move around on the screen; I've got a mouse here that I can click to make things happen. And now I see these objects called windows, each of which has icons in it, and each icon stands for a particular program or document. And just by simply clicking, I can make things happen. It's the same exact machine, but given different software it
creates an entirely different style of interacting with it. Can you do that again? Well, this interface is familiar to tens of millions of people. This is Microsoft DOS, and as you can see it is character-oriented: no pictures, no images. If I want to tell it to do anything, I have to know the particular command to type in, its name, and the particular syntax to use. So if I want to copy a file, for instance, I have to know what the meaning of a colon and a backslash is, among other things, and if I mistype a single character, then it will give me some sort of obscure error message. Here I got one that says: not ready error reading drive A, abort, retry, fail. It's kind of cryptic, and it's certainly not very friendly. It's a typical old-style interface. The interesting thing is that simply by changing the software, not the hardware,
the exact same hardware, an environment can be built that is much simpler, much friendlier, much more graphical, and I'm going to bring that up right now. Take a look at it. One thing to notice is that I now have a pointing device. As I roll this along the desktop, you can see a pointer moving along on the screen, and when I find what I want I can just click on it and make something happen. And in order to get access to a program, I don't need to know any special commands. I can just find its icon, this little image here, and I can point to it, and simply by clicking, or in this case double clicking, I can invoke the program and start using it. Exactly the same hardware, but the experience is totally different. The look of what you see is different, and the feel of how you interact with it is different.
What's happening when you click on the mouse? Most of the time the program is actually just sitting there waiting for something to happen. It's sitting in a loop that keeps asking: has anything happened? No. OK, has anything happened? No. OK, has anything happened? Somebody clicked on the mouse. Really? Where did he click? Well, he clicked at spot 200 comma 75. Is that on top of anything, or is it in blank space? Yes, it's on top of the icon. It looks simple, but what's actually happening? Well, once the computer has constructed this display, it's basically sitting there waiting for something
to happen. It's kind of like the program is sitting there asking itself: has anything happened? No. Has anything happened? No. It is doing this tens of thousands of times a second, and it waits for something to happen. So, for instance, if I move the mouse, now something is happening, which is that it's tracking the position: as I move the mouse over here, it un-draws the pointer over there and redraws it over here. Another event that it can detect is when you click on the mouse. So if I click, what happens is it selects that particular icon. Now, actually, it's doing quite a lot of computation to figure that out, because what it needs to do, after it receives the event "he clicked the mouse," is find out where. There's a coordinate system that describes the screen, and there's a pair of numbers that describes the location, and then it has to see if that location corresponds to any of the objects on the screen. So it essentially has to
have a little way to calculate which object, if any, the click is on top of. In this case it was this one, this thing called Review Notes. And once it figures out which object, then it has to figure out which action to take. If the event was a click, the action is just to highlight, or select, the object. But suppose I didn't just click the mouse once; suppose I clicked it twice. That's a different event, and when you double click on an object, it actually opens it up. I'll have to do that again, because that's the wrong document; I don't want to open up somebody's review. Let me go back to this.
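A minimal Python sketch of the loop and hit test being described here; the icon names, bounds, and event format are illustrative, not the Macintosh Toolbox's actual structures.

```python
# Each icon's bounds on screen: (left, top, right, bottom).
icons = {
    "Review Notes": (180, 60, 220, 90),
    "Trash":        (400, 300, 440, 330),
}

def hit_test(x, y):
    """Return the icon, if any, whose bounds contain the point."""
    for name, (left, top, right, bottom) in icons.items():
        if left <= x <= right and top <= y <= bottom:
            return name
    return None  # the click landed in blank space

def event_loop(next_event):
    while True:                # "has anything happened?"
        event = next_event()   # e.g. ("click", 200, 75), or None
        if event is None:
            continue           # nothing happened; ask again
        kind, x, y = event
        if kind == "click":
            target = hit_test(x, y)
            if target is not None:
                print("highlight", target)  # a single click selects

print(hit_test(200, 75))  # the click at "spot 200 comma 75" -> Review Notes
```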
Well, one of the other things the computer has to figure out, in addition to which object you might be operating on, is which action you want to perform. If you click once, that means highlight the object, so there's a graphical routine that understands how to highlight an object. If you click twice on an object, it has a very different meaning, which is: go open up the document, show it to me so that I can work on it. In order to do that, on the Macintosh, it has to go and figure out which application made the document. Is this a word processing document, or is it a spreadsheet? This happens to be a word processing document, and so the machine has to maintain, for every single document, what type it is, which is to say which application made it. So when I double click, it gets the document name, looks up its type, finds out what application made it,
opens up that application, actually passes control to that application, which then goes and opens up the document. So it's actually going through sequences of many hundreds of discrete events that could be described in this fashion, every time you do even the simplest operation. So take it down from there: the user clicks twice; in software terms, how many lines of code lie behind something like that? I would say probably a small number of thousands. And we could take a look at what the code actually looks like. It's very cryptic. This is actually base-16 arithmetic, hexadecimal: the digits 0 through 9 and the letters A through F. And you have many hundreds or thousands of these instructions actually being executed.
Yes, this is machine language. And each one of those lines is an action carried out by the computer? Well, if we look at the bottom-most line of the screen I'm looking at, it says MOVE.L, and on either side of it there's more stuff. Each one of those things refers to a single instruction executed by the microprocessor. MOVE.L means move one item of information, typically a character or a cell, from some location to some other location inside the computer's memory, and it will literally take thousands of such instructions to actually perform an event that has meaning to the user. It's moving things, it's adding, it's subtracting, it's shifting, it's comparing, all at a very atomic level.
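A toy Python illustration of that point; this models the idea that one user-visible action decomposes into long runs of atomic move instructions, and is not the 68000 instruction set.

```python
memory = [0] * 64              # a tiny pretend memory
icon_base, screen_base = 32, 0
memory[icon_base:icon_base + 8] = [10, 20, 30, 40, 50, 60, 70, 80]

moves = 0
for offset in range(8):        # "highlight the icon" becomes...
    # ...one atomic move per word: copy an inverted pixel to the screen.
    memory[screen_base + offset] = 255 - memory[icon_base + offset]
    moves += 1

print(moves, "atomic moves for one tiny 8-pixel highlight")
```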
So in other words, beneath the simplicity there's enormous complexity? In the same way that a small number of different kinds of atoms can give rise to an enormous number of different kinds of molecules, which in turn give rise to all of the different varieties of matter, a very small number of building blocks, in terms of simple instructions, are used to create the programs that do all these different things. OK, we'll get the desktop up and keep running. What's the computer doing now? Well, at this point it's really just sitting in a loop, which is technically called wait-next-event. It is waiting for an event. So it goes: is anything happening? Is anything happening? No. Is anything happening?
I moved the mouse, and it's doing that, you know, many thousands or tens of thousands of times a second. If I click on the mouse, now, we've seen the effect: it highlights that particular icon. There's actually a lot of computation going on. The first thing that happens after it picks up the click is it has to figure out: well, where did he click? And that answer comes in first as an x-y coordinate: you know, it's at x position 200, y position 300. That's the easy part. The harder part is, given that position, to figure out which of these objects, if any, corresponds to it. So there has to be a kind of internal map that shows the boundaries of each object, and a computation that says: well, this particular coordinate falls inside this object. Note that if I move the window, it all still works properly, so it's not hardwired that this coordinate corresponds to that object. The other thing that it has to do
is to figure out not only which object is involved but which action I want to perform. A single click just means highlight, but a double click has a totally different meaning, and that means: go and give me the document that that icon represents, and let me work on it. Now, in order to do that, it has to figure out which application created that document. This is a word processing document; it's just text here. But that means the computer has to have some way of getting from a particular document to the creator of it, and find where that creator is on the disk, and launch it, in other words get the application going, and then pass control to the application, which knows how to open up a document to work on. That's thousands of lines of code, right? Oh yeah, absolutely.
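A minimal sketch of that lookup in Python; the type tags and application names are hypothetical, not Apple's actual creator-code mechanism.

```python
creators = {
    "text": "WordProcessor",   # which application made this type
    "wks":  "SpreadsheetApp",
}

documents = {
    "Review Notes": "text",    # every document carries its type
    "Q3 Budget":    "wks",
}

def open_document(name):
    doc_type = documents[name]  # look up the document's type
    app = creators[doc_type]    # find the application that made it
    print(f"Launching {app}, passing control, opening '{name}'")

open_document("Review Notes")   # what a double click ultimately triggers
```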
You can see this is actually a sample of the instructions that were being most recently executed when we were working on the machine, and each one of these is a very atomic instruction. It will be something like: take this one particular character and move it from one particular memory location to a different memory location. And there will be hundreds or thousands of these that have to be strung together in order to perform even the simplest discernible action. So for a simple double click, it could be thousands? Oh, absolutely: thousands, or tens of thousands.
- Raw Footage
- Interview with Mitch Kapor, 1990
- Producing Organization
- WGBH Educational Foundation
- AAPB ID
- cpb-aacip-15-6m3319s60d
- Description
- Episode Description
- Full-length interview with Mitchell David Kapor. Portions of this interview were featured in episodes from the WGBH/BBC series The Machine That Changed The World, a five-part series chronicling the personalities and events of the computer revolution. The program traced the history of the computer's development through the modern personal computer and on to future developments on the horizon, with a focus on the history of computers from the 19th century to the PC, present-day applications, and future developments. Mitchell Kapor founded the Lotus Development Corporation in 1982 with partner Jonathan Sachs. Kapor also designed Lotus 1-2-3, a spreadsheet program that had a large following in the 1980s. Select metadata for this record was submitted by John Campopiano.
- Created Date
- 1990-07-17
- Asset type
- Raw Footage
- Topics
- Technology
- Subjects
- Interactive Computing; information technology; Personal Computers; Computer software developers--United States; Kapor, Mitchell David; Computer software--Development--United States; Computer software--Development--History; Microprocessors--United States--History; Lotus Development Corporation; Lotus 1-2-3
- Rights
- Rights Credit: WGBH Educational Foundation; Rights Type: All
- Media type
- Moving Image
- Duration
- 01:15:46
- Credits
Interviewee: Kapor, Mitchell David
Producing Organization: WGBH Educational Foundation
Publisher: A WGBH Boston/BBC TV coproduction in association with NDR Hamburg
- AAPB Contributor Holdings
-
Identifier: cpb-aacip-7f20307f79e (unknown)
Format: video/quicktime
Color: Color
Duration: 00:00:00
-
Identifier: cpb-aacip-f4667026cca (unknown)
Format: video/mp4
Generation: Proxy
Duration: 01:15:46
-
Identifier: cpb-aacip-916c9b889ae (unknown)
Format: video/mp4
Duration: 01:15:46