Rich Pasco

From Computing Pioneers
Revision as of 00:31, 6 May 2015 by Computingpioneers (talk | contribs) (Created page with "This is a transcript of an audio interview. This transcript may contain errors - if you're using this material for research, etc. please verify with the original recorded inte...")

This is a transcript of an audio interview. This transcript may contain errors - if you're using this material for research, etc. please verify with the original recorded interview.

Source: ANTIC: The Atari 8-Bit Podcast

Source URL:

Interviewer: Kevin Savetz

This interview is with Rich Pasco, who was Atari's manager of VLSI Development (Very Large Scale Integration development), where he worked on the FREDDIE memory management chip, which was used in the Atari XL and XE series of computers. He worked at Atari from November 1982 through May 1983. He lobbied management to create some products for the Atari 8-bit line, including a mouse and an 80+ column display system, which were not developed. Prior to his time at Atari, he was a member of the research staff at Xerox Palo Alto Research Center (PARC). This interview took place March 27, 2015.

Kevin Savetz: I'm mostly interested in talking to you about what you did in VLSI development at Atari, but before we get there I notice that you started things off at Xerox PARC. I was hoping you could tell me a little bit about that.

Rich Pasco: I was at Xerox PARC from 1978 to 1981. I had three years of work there. My research at Stanford had been in the subject of VLSI design, Very Large Scale Integrated circuit design, and I went to Xerox PARC with the idea of extending the capability of VLSI design to the masses. I worked with Lynn Conway, who in partnership with Carver Mead had written a book called "Introduction to VLSI Systems." It's interesting, back in those days there was small scale integration, which meant a chip that had a handful of transistors.

There was medium scale integration, which was a larger number of transistors. You're probably more familiar with TTL, transistor-transistor logic. Then there was large scale integration, which had around 2,000 transistors on a chip.

And very large scale integration, which had tens of thousands of transistors on a chip. So most of the chips that are used today would be considered to be VLSI by those standards.

Carver Mead and Lynn Conway, Carver Mead of Caltech and Lynn Conway of Xerox PARC, had a belief that any electrical engineer could design their own integrated circuit chips. It's a matter of writing code and printing it out to a device.

It's not really that simple because it required some understanding of the physics of the device and electrical design rules to make the circuit work, physically.

But they had come up with a simplified set of design rules that could be taught to any engineer fairly quickly, and developed a course around that. While I was working for Xerox PARC, I designed several medium scale integration chips, an Ethernet controller for example.

It was interesting because I came at this not as a seasoned designer, but as an algorithm oriented guy. Yet it was possible to put devices on chips. The other thing I learned by working at Xerox PARC was how wonderful it was to have a mouse in my hand.

I had at my desk an Alto, A-L-T-O, workstation, which was, by today's standards, about like a PC. It had 64K of RAM and a 606x808 pixel display, but what made it different from the average home computer of those days was that it had a mouse and a bitmap display.

Every pixel on the display corresponded to a location in memory that could be written directly under software control, which was a break from the traditional glass-teletype style of terminal that most computers had. It was at Xerox PARC that the whole concept of mice and menus and overlapping windows was developed.

Ironically, one of my many assignments while I was there was to give a demonstration to a bunch of young whippersnappers from a start-up company called Apple, who later took those ideas and went off and developed something... the Macintosh.

So I became not only a VLSI designer, but I became enamored of the technology of the whole mice-and-menus thing. Xerox had developed the technology and given it away to the world, to Apple. Xerox had this thing where they said, "We are not a computer company. We've been in that business before."

They had bought a mini-computer company and lost their shirts. "We are an office-automation company." From a corporate point of view, Xerox was single-mindedly focused on making a better copier and, "Maybe we can put this computer technology into a copier."

Then they came around to the idea of, "Maybe we'll do desktop document publishing systems," but they never really wanted to get into the business of making computers. It's unfortunate, because Xerox had this corporate disinterest in computer technology, and yet they had a laboratory.

Working at Xerox PARC, I often characterized it: "It was the best graduate school I ever attended." They brought together some of the finest computer scientists from around the world, who were doing wonderful computer technology that the corporation didn't want, so it leaked out all over the industry.

With my background in VLSI design from working with Lynn Conway and Carver Mead, and my being enamored of the technology of the mice and menus and all of that, I went to Atari hoping to move Atari away from the video game business and more into the personal computer business, to develop something which would've looked a lot like a Macintosh.

Kevin: How did you get hired by Atari? How did that whole thing start? Rich: Alan Kay was a researcher I had known, and I really respected his wisdom. When he went to work at Atari, I wanted to follow him. I really wanted to work with him. I wanted to work in his research lab. At that time, there wasn't any position working directly with Alan, so I took a position in the home computer division development department instead. Alan was in the Atari labs, so it was different on a corporate level but at least I was in the same company, and I got to work with him.

Alan Kay, while working at Xerox, had developed a concept called the "Dynabook." He was a man of great vision who had envisioned that someday everybody would have a home computer that they could use for everyday things like ordering stuff online and paying utility bills.

Stuff that we routinely do online today, and this is in the late 1970s. Alan was envisioning a machine which today we would call a laptop, and he called it a Dynabook. We didn't have the technology at that time to make a machine that small.

The Dynabook concept was implemented at Xerox PARC, and also a lot of the software was written that we'd find quite familiar today, but we're talking about a desktop machine.

Not even a desktop machine, a desk machine the size of a portable dishwasher that would sit under your desk with a monitor on the desk and a mouse.

The idea was like envision that you're doing this on a laptop, and what can we do in that environment? Alan went to Atari hoping that Atari would be the company which would allow him to develop that, and I agreed with his vision and followed him.

That's why I came to Atari, why I looked at Atari. As I said, he was a man of great vision. He was looking 20 years into the future as to what would be there, and spot-on vision. I was a little more practically oriented.

I wanted to get something done now. "What can I do now?" So I went into the home computer division with the idea of making a better home computer, and frankly the machine I had in mind would've looked a lot like a Macintosh.

I got into the home computer division. "What do we need? I'm here to work. What can we do?" At that point, Atari was looking at the follow-on to the popular 400 and 800 series of home computers.

Kevin: You got there around November, 1982. Is that right? Rich: I got to Atari in 1982, yeah. Kevin: OK. Rich: The machine that they were wanting to do, unfortunately, was not the Dynabook. As I said, the Atari 400 and 800 had been a very popular line of home computers. They had a few useful applications, but you could also play video games on them. The controller for the video games was a four-switch joystick. You could move it up, down, left or right, and it was all or nothing. You were either pushing it or you weren't. There was no proportional control.

The product under development, as I said, was the Atari 1200, which was supposed to be the next in that 400, 800 series. It was to provide a sleeker desktop machine that was slightly more business oriented, but would still be compatible with all the old existing games and be cheap enough that you could still sell it in Walmart.

At that point, the Atari 800 system, which I had at my desk and used as my workstation, was $400 for the system unit. Another thing: there was no internal disk drive. The floppy drive was external, and a printer and a this and a that. By the time you put together the whole system, it was a couple of thousand dollars.

Kevin: Sure. Rich: Atari had in mind that the way we can sell more computers is to make this same machine cheaper, which I disagreed with. I thought we needed to have something that had nice menus and windows; I don't mean the Windows operating system, I mean overlapping windows on the screen. Nevertheless, my assignment was to help with the development of this cheaper version of the 800. A little more memory, we went from 48K up to 128K, and reduced the price point from a $400 system unit to a $79.95 system unit, and sell them in Walmart.

That is the way we were going to sell gazillions of these cheap computers. My assignment, because I had the VLSI design background, was to take the memory control functions, which had been a bunch of separate small scale integration parts, and come up with a single VLSI chip which would do the entire memory management function.

By the way, it was customary at Atari to name chips under development after the developer's girlfriend. Chips had various women's names on them. At the time, I was dating a woman whose name was Gay.

They started calling my chip the Gay chip. But people who were a bit homophobic objected to that name and it was nixed, that it could not go into production under the name the Gay chip.

I was the one person who was not allowed to name the chip after my girlfriend, because her name triggered the ire of some homophobic individuals.

Kevin: What name did that chip ultimately have? Do you remember? Rich: Freddy. Kevin: Freddy. Rich: It was named Freddy. Why? I asked my manager, "Well, what can we name it?" He said, "Oh, call it Freddy." The name Freddy stuck. There wasn't any individual named Freddy at all. But that was... Kevin: So... I am sorry, go ahead. Rich: You were asking a question. Kevin: How were you involved with developing Freddy? What did you do on it? Rich: I commanded the team. It was the first management job I had. I didn't intend to be a manager, but it was more than one person could do. I commanded a team of about three chip designers. We did the logical architecture for it and took it down to a gate-by-gate level. Gee, it was so long ago I don't remember all the details. But we eventually moved the chip into production. It was designed into the computer.

Kevin: Cool. Rich: From concept to a little bug with legs on it that you could plug into the PC board. Kevin: So it's a RAM address multiplexer. I assume that means that the 1200 was going to address RAM in a different way than the 400 and 800 did. You needed to account for that? Rich: Yeah. The problem was, and again I am relying on technical memories from over 30 years ago, so excuse me if I am a little vague. The problem being solved was it had to be software compatible with the old 400 and 800, but it had to address more than 64K, which was the upper limit on the memory address capability of the old architecture. In the old architecture, of that 64K, we reserved 16K for control functions, memory-mapped I/O and such things, which only left 48K available. 48K was the maximum physical memory you could equip the old design with.

The intention was to address more than that. The machine shipped with 128K. It was a paging type memory system, not too different from the memory system that was used in the PC later, although not software compatible. The address space was divided into a page number and an address within the page.

The page switching was all done transparently in the hardware. I don't know that I can say much more about it than that. Anyway, that was my official job responsibility. But I still never let go of the notion that we should have a machine that was a lot more capable.
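The paging scheme described here, a 16-bit CPU address space extended by switching banks of physical RAM into a fixed window, can be sketched in a few lines of Python. This is an illustrative model only: the window location, bank-select register address, and bank layout below are hypothetical, not FREDDIE's documented design.

```python
# A minimal sketch of bank-switched ("paged") memory: a CPU limited to a
# 64K address space reaches 128K of physical RAM through a switchable
# 16K window. Addresses and layout here are hypothetical illustrations.

class BankedMemory:
    """128K of physical RAM seen through a 64K (16-bit) CPU address space.

    A 16K window at WINDOW_BASE is mapped onto one of four 16K banks of
    extended RAM, selected by writing a bank number to a control register.
    The CPU's addressing never changes; the hardware redirects the window.
    """
    WINDOW_BASE = 0x4000          # start of the switchable 16K window
    WINDOW_SIZE = 0x4000          # 16K
    BANK_REG = 0xD301             # hypothetical bank-select register

    def __init__(self):
        self.ram = bytearray(128 * 1024)   # 64K main + 64K extended
        self.bank = 0                      # currently selected bank (0-3)

    def _physical(self, addr):
        """Translate a 16-bit CPU address into a physical RAM address."""
        if self.WINDOW_BASE <= addr < self.WINDOW_BASE + self.WINDOW_SIZE:
            offset = addr - self.WINDOW_BASE
            return 0x10000 + self.bank * self.WINDOW_SIZE + offset
        return addr                        # the rest maps straight through

    def read(self, addr):
        return self.ram[self._physical(addr)]

    def write(self, addr, value):
        if addr == self.BANK_REG:          # writes here switch banks, not RAM
            self.bank = value & 0x03
        else:
            self.ram[self._physical(addr)] = value
```

Reading or writing the same CPU address before and after a bank switch touches different physical bytes, which is how a 64K-limited processor addresses 128K while existing software keeps working unchanged.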

I'll have to use the word: Macintosh-like functions. I still wanted to produce a machine with a mouse and menus and windows and things. I should mention that such a machine did eventually come out, I believe under the name 1040, but it was long after I left Atari.

Kevin: The ST machines, you are referring to? The 520 and the 1040ST? Rich: Yeah, those. Kevin: The 16-bit machines that were Mac-like in interface. Rich: Those machines finally implemented what I had been arguing for. I don't know how much credit I can take for them, because I had left Atari in frustration at my inability to do that some time before they came out. I didn't have any direct responsibility for those, other than perhaps some inspiration that this is what has got to be done.

Kevin: Yeah. Rich: But I would like to share a couple of stories of my experiences along the way. Kevin: Please. Rich: Yeah. One of them was the frustration I had at Atari with the corporate mindset that we are making low-cost machines that plug into TV sets. It was common wisdom, something everybody knows, that you can't put text on a TV screen. And this was long before HD television; this was all standard-definition, NTSC-era television.

When you are putting information into a TV set by going in through the antenna terminals with an RF signal, you can't put more than 40 columns of text on the display and have it be legible. That was the common wisdom that everybody knew, except that I didn't believe it. I didn't agree with it.

What was wrong with that idea? Where that idea came from is the old teletype model of rows and columns of text in fixed-pitch characters. If you took a standard 40-column display (and the IBM PC was coming in about this time) and connected it to a TV, it was perfectly legible.

But if you took an 80-column display, which for example was seen on the old monochrome screen of the PC, and put that 80-column type of video in through an RF modulator into the antenna terminals of the TV, it was illegible. The characters were too small or too blurry to read.

So everybody knew that you couldn't put more than 40 columns on a computer screen and have it be legible. I didn't believe that. I made a bet with my boss.

I said, "I believe that I can produce a device which will put text on the screen of an unmodified TV connected only to the antenna terminals and I will invite you to read it and you will be able to read it perfectly well and it will have more than 80 characters per line."

My boss said, "How long will that take?" I said, "Well, I'm proposing that we develop a custom video controller chip, which might take a year or so in development and thousands of dollars in expense. As a proof of concept, if you allow me to make a mock-up, a black box that will do that, I can do it in one day, as long as you don't constrain me on how big the box is or how much it costs or how much is in it. I can make a mock-up in one day."

He said, "OK, you're on." I went off to the lab and left him to go back to his administrative things. The next day I said, "OK, I want to show you my device, but first I want you to look at this TV."

I led him by the hand into the lab, and he looks at the TV, and I said, "Now read what's on the screen." What was on the screen was a page of the manual for the Atari computers.

He read it out loud. Read it fine, like he was reading it out of a book. I said, "OK, now go count the characters on each line." He counts: 1, 2, 3, 4... 79, 80, 81, 82, 83. 83 characters on that line. I said, "Great, now count the characters on the next line."

1, 2, 3... 79. OK, count the characters on the next line. 1, 2, 3... 86. So you see, I'm getting an average of more than 80 characters per line. He said, "OK, now show me the device that's producing this." I opened the door of the next room, and there was a video camera on a tripod focused on a printed book.

Kevin: [laughs] OK. Rich: That was my device. It was putting more than 80 characters per line. Why was it working? It was working because of several things. First of all, the text in the book was not in a rigid matrix of rows and columns of fixed-width characters. Kevin: It was proportionally spaced. Rich: Proportionally spaced. We weren't wasting a lot of space on little skinny letters like 'I' just because we wanted to use the same space for big wide letters like 'M'; we were proportionally spacing them as necessary. Secondly, and equally importantly, the camera and its optics were performing an automatic anti-aliasing function. That is to say, it was blurring the text just enough to keep the video signal within the bandwidth of what the video channel allows.

The blurring was not so bad as to interfere with reading, but it was enough to keep the bandwidth of the video signal compatible with what went into the TV. After all that's what the camera is made to do.
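The anti-aliasing effect attributed to the camera here can be illustrated numerically: low-pass filtering a sharp one-bit scan line shrinks its pixel-to-pixel transitions (a rough stand-in for the bandwidth an RF channel must carry) while the stroke remains visible. This is only a toy sketch of the idea, not an NTSC-accurate model.

```python
# Toy illustration of anti-aliasing for bandwidth: a skinny text stroke
# is one full-swing pixel; a small blur trades a little sharpness for
# gentler transitions that fit through a band-limited channel.

def box_blur(row, radius=1):
    """Simple moving-average low-pass filter over a row of pixel values."""
    n = len(row)
    out = []
    for i in range(n):
        lo = max(0, i - radius)
        hi = min(n, i + radius + 1)
        out.append(sum(row[lo:hi]) / (hi - lo))
    return out

def max_step(row):
    """Largest pixel-to-pixel jump: a crude proxy for bandwidth demand."""
    return max(abs(a - b) for a, b in zip(row, row[1:]))

# A skinny vertical stroke: one bright pixel between dark ones.
sharp = [0, 0, 0, 1, 0, 0, 0]
soft = box_blur(sharp)

print(max_step(sharp))  # full-swing transition in a single pixel
print(max_step(soft))   # about a third of that: gentler on the channel
```

The blurred row still contains the stroke (its total energy is preserved), but no single transition is as steep, which is exactly the trade the camera optics were making for free.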

So I said, "All we have to do is develop a chip that does this. Instead of a printed page of text, we will have characters in memory, and instead of the video controller we have now, we will have a new piece of silicon that turns those characters into video that looks like what the camera produces."

Oh yeah, that sounds like a great idea, and we should do that. By the way, I'll note for you now that there was a piece of software that later emerged on the market which did exactly that on the IBM PC, and the name of the software was Adobe Acrobat.

Adobe Acrobat does in software exactly what I was saying: it takes characters, proportionally spaced if that's what the PDF document calls for, and uses the display to produce a nice-looking page on the computer.

Another way I could have done that would have been to take an IBM PC, put Acrobat on it, and put text on it. But again, I'm thinking in the mindset of: we build a chip for a low-end, mass-produced computer.

It doesn't necessarily have a lot of CPU horsepower, but we can do a display controller that does that stuff in hardware. The idea was a good idea, but it never got to development. One little anecdote.

Kevin: Why didn't it go to development? Political? It's always political, right? Rich: Yeah, political. I would say the main reason was that despite my compelling demo, I couldn't really convince anybody in the management chain to put in the resources it would take to develop that piece of silicon. It would have taken a staff of two or three engineers about a year to do it. Or maybe four or five engineers, six months...

Kevin: Yeah. Rich: The overall response I got whenever I proposed anything like that was, "Why would we want to do that? We are a video game company. How will that make our games play any better?" Very frustrating indeed. Kevin: Right. It's like when you're working for a copy machine company. Rich: When you're working for a copy machine company trying to make computers. Kevin: Right. Rich: OK, so we're a video game company. Let me hit the management where it makes business sense. Let me try to get them interested in using a mouse instead of this four-function joystick to control the video games. So again, I had had a mouse in my hands for years working at Xerox. I go to my management at the time, and they say, "What's a mouse?" I took an Atari 800 home computer and I bought a Hawley mouse, which is the same mouse that Xerox was buying from Jack Hawley for use with the Alto.

Jack Hawley ran a small home business, but he got into the business of manufacturing mice for Xerox. His primary customer was Xerox. The Hawley mouse was about the standard size of a mouse, with a steel ball on the bottom.

Internally there were two shaft encoders, one for vertical and one for horizontal, that converted the motions of the ball into X and Y coordinates for the computer. It also had three microswitches on the top for the three buttons on the mouse.
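Shaft encoders of this kind are quadrature encoders: each axis emits two phase-offset signals, and the order in which they change tells the direction of rotation, one step per change. The table-driven decoder below is a generic illustration of that scheme, not Atari's or Hawley's actual circuit.

```python
# Generic quadrature decoding sketch. Each sample is a 2-bit value
# (A << 1 | B) from one axis's two phase-offset signals. Valid motion
# steps through a 2-bit Gray code; the direction of traversal gives
# the sign of the movement.

GRAY_SEQ = [0b00, 0b01, 0b11, 0b10]   # one full cycle, clockwise

def decode(samples):
    """Accumulate signed position from a stream of 2-bit (A, B) samples."""
    pos = 0
    prev = samples[0]
    for s in samples[1:]:
        if s == prev:
            continue                          # no change, no motion
        pi = GRAY_SEQ.index(prev)
        if GRAY_SEQ[(pi + 1) % 4] == s:
            pos += 1                          # stepped clockwise
        elif GRAY_SEQ[(pi - 1) % 4] == s:
            pos -= 1                          # stepped counter-clockwise
        # any other transition skipped a state; a real decoder flags it
        prev = s
    return pos

# Rolling the ball one way, then back:
forward = [0b00, 0b01, 0b11, 0b10, 0b00]
print(decode(forward))                   # prints 4
print(decode(list(reversed(forward))))   # prints -4
```

A trackball reports the same two signal pairs, which is why the Missile Command trackball support described below could drive a mouse unchanged: only the mechanical packaging differs.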

Atari had a game product called Missile Command, which was available for the 800 home computer. In the game of Missile Command, you're in control of a world where there are aircraft dropping bombs on your city.

Kevin: Right. Could you let the listeners of this podcast know what Missile Command is? Rich: We know what Missile Command is. I found playing Missile Command with a four-function joystick to be a most frustrating experience, because the obstacle in playing the game was not the game itself but the difficulty of communicating my intention through a switch that had only five positions: neutral, up, down, left and right. How could I make this more user friendly? It turned out that the software for Missile Command already had built into it the support for a trackball controller, which was available as an optional product for Atari home computers.

I thought, if it can do a trackball it can also do a mouse, because after all, a trackball is a mouse turned upside down, or a mouse is a trackball turned upside down.

So I built a small hardware interface to allow one to plug a Hawley mouse into an 800 computer, put the 800 computer into its trackball mode, and lo and behold, I could sit down and play Missile Command with a mouse.

I immediately found, within seconds after plugging in my mouse, that my score in Missile Command doubled or tripled. Finally it was a playable game. So, excited about my new discovery, I ran back into the office of the Vice President who had said, "What's a mouse?"

I said, "Here, try Missile Command with your joystick." And he did, and he got an incredibly good score. And I said, "Now try it with this mouse." It was the first time he had ever had a mouse in his hand. He was very awkward at it.

He didn't know how to use it, and ended up doing much worse with the mouse than he had done with the four-function joystick. He said, "Well, it's a nice toy, but I don't see why we should sell that." End of class. So Atari was not going to release a mouse for the Atari 800 home computer or its successor, the 1200, which is what I was working with. And based on that one experience, where the VP had a strange and foreign experience with the mouse after letting go of his comfortable four-switch joystick, the company decided not to do mice.

Kevin: I feel frustration for you 30 years later. I'm sorry. Rich: So around this time the market was leaving Atari. One of the reasons the market was leaving Atari... there was a great book called "De Re Atari." Kevin: Sure, Chris Crawford. Rich: That book was published for a while and then withdrawn, because someone in corporate management got the idea that, "You know, we really want to have a monopoly on selling software for our computers. We don't want the whole world to sell software for our computers, because we want to have a monopoly on that business." So they withdrew "De Re Atari" and stopped supporting external developers in developing software, because "Why should we have a small percentage of the software for our computers? We'll have it all."

The problem was, while this was going on, Apple came out with a book called "Inside Macintosh," and they encouraged people to develop software for their computer. And I put "Inside Macintosh" right opposite "De Re Atari."

Caroline Rose, who was the editor and chief technical writer for "Inside Macintosh," and I are good personal friends, so I have watched the whole business go on as Apple opened their Mac to outside software and Atari said, "Nope, we're gonna have a monopoly on it.

Nobody else writes software for our Mac but us... for our 1200 computer, sorry." And the software that Atari was offering was frankly inferior to what could be obtained on the open market for the Macintosh, and that contributed to the death of Atari.

So the company which had been spending millions of dollars on great bashes a year earlier had to start tightening its belt. They had to start making some layoffs, and people were let go in waves. And I watched a phenomenon I have seen so many other times in Silicon Valley, which is this:

When you start reducing your staff by having waves of layoffs, the people who are really desperate for a job cling very tightly. The very best people, the key people, the people that you want most on your team, they're the ones who start circulating their resumes.

Because they don't want to wait for the layoff; they want to move now, while they can. And they do. So the threatened waves of layoffs meant we lost a lot of key players. The team was dwindling, not so much from the layoffs which were happening, but from the people who were leaving in fear of being laid off. This got worse as time went on; as key people left and the company lost more money, they had more layoffs, and it became a sad situation.

In the spirit of dark humor, I suppose you would call it, I made up a phony press release and stuck it to the door of my office. It said, "Sunnyvale: Atari Incorporated announced today...

Sorry, former video game giant Atari Incorporated announced today from its Sunnyvale headquarters the layoff of approximately one third of its [inaudible 0:36:13] based workforce. Neither of the remaining employees were available for comment."


Rich: That didn't put me in very good stead with the management, but then again, it was all falling apart. So it wasn't too long after that that I ended up turning in my resume and getting a better job. Turned in my resignation, and I went to, again, the Hamilton Research Center in southern [inaudible 0:36:43], where I continued my work in VLSI design. So that's the next stage of my resume. So I've concluded the things I wanted to share with you, but I'm sure you have some questions, so please ask.

Kevin: No, I came in here with very few questions, 'cause all I knew was that you were doing VLSI at Atari, and you've answered a lot of my questions. So is there anything else you can tell me about the development of the Freddie chip? Maybe features you wanted to add that didn't get added? Rich: I don't remember too many of the technical details of the chip. It was fun. We did complete the development in a timely way and got it into the products. But the product itself was unfortunately a lame duck. It was too little, too late when it landed in the marketplace. And most people saw it for exactly what it was: a cost-reduced, slightly feature-enhanced version of an obsolete computer. And why would anyone want to buy that when they could buy a Macintosh?

Kevin: Right. Rich: So that realization came too late, from a corporate point of view. I had already left, and Atari went ahead and turned out the 1040 and 520 ST. I remember the 1040 because it was the number of a tax return form, and that was a deliberate choice of number. Kevin: Huh, interesting. Rich: I'm afraid if you want to have a conversation about the technical aspects of the Freddie chip, I'm not really in a good position to do that, because I don't remember too much about the details. Kevin: No, that's fine. I had to ask. If you could send a message to the Atari computer users who still exist, and you can right now, what would you tell them? Rich: Oh, good question. Let's see. Atari was an interesting chapter in the development of Silicon Valley and the whole concept of home computers in general. If you look carefully at the Atari story, there are a lot of lessons on how to run a computer company and how not to run a computer company. What both Xerox PARC and Atari have in common is that they were both computer companies that had the world in their hands and let it escape because of some poor decisions. So if you have an Atari computer, even though it's hopelessly obsolete by today's standards, it's a bit of history.

It's a little bit like owning any other historical artifact, hold on to it and love it and treasure it because of what it tells about the history of the world.

Kevin: Good answer, thank you. I've heard so many stories, in different interviews, of Atari management just not getting it and doing the wrong thing. Every company messes up now and again, but it seems like Atari management constantly would make the wrong choice, over and over again. Rich: Yep, that is what they did. I wish I could have combined the best features of the Xerox PARC environment and the Atari environment, and maybe the result would have been a company that would have had a little better result than either one did. Kevin: So what were the best features of both? Rich: The best feature of Xerox PARC was that the company poured money, big money, into a research center; they recruited world-class computer scientists from around the world, gave them a budget and a lab, and left them alone. And as a result, some of the best technology came out of that lab. The downside, which is the one that more people talk about, is that they gave away the technology and never developed it into a product.

But as far as the energy involved in so many highly competent computer researchers collaborating in an environment where they were pretty much left alone, the results in technology were awesome.

It made an incredible difference, if not for the company Xerox then certainly for the world of personal computing. That was the best thing about Xerox PARC: it was a great grad school.

Rich: The best thing about Atari was that they had begun by capturing the attention and the mind of the general public. For hundreds of thousands of American families, the original Atari 2600 game machine was the first computer they ever had. They didn't think of it as a computer, they thought of it as a game machine, but it was in fact a computer. And because it was a computer, people got to know the notion that you can have this thing, you can put programs into it, and depending on the program you put into it, you can do different things with it.

They had the public's attention with that, and then they came out with the home computer. Again, what they did with the home computer after that was pretty bad, but Atari had the potential of being where Apple is today, and again they blew it.

But there was that potential and they positioned themselves very well in that market. They didn't have the product to follow through that the Apple Macintosh was.

If Atari had only done the 1040 machine a couple of years earlier, like during the time I was there, they might have taken a good chunk of the market share that Apple took. But again, it was too little, too late.

Rich: So I have another story, which is not really about my work; it's about the video game machine, the 2600. Kevin: Great. Rich: While all of this was going on, I had a 2600 machine at home, and my son, who was only five years old, loved to play it. He got to be a master of the four-switch joystick. Again, Atari had the same philosophy with it as they did later with the 800: "We will have a monopoly on software for our machine." Their insistence on that led to them not being very kind to outside developers who wanted to develop software for it. Atari also had this image of, "We are good, clean, wholesome, family fun."

So given these two corporate stances, I'm sure you can imagine how negative the reaction was when a company came out with an adult video game for the 2600.

When I say an adult video game, the nature of the game was that you had a naked man and a naked woman, and with the joystick you moved the naked man around and made him jump on the naked woman.

Kevin: So maybe we are talking about "Custer's Revenge". Rich: Yeah, that's the one. And Atari's response was to sue them and say; "You can't make software for our machine without stealing our intellectual property. Besides that you are tarnishing our good, clean, wholesome, family fun image." The lawsuit brought a lot of attention to Atari and a lot of attention to "Custer's Revenge"...and thank you for reminding me of the name. Because of all that attention sales of the Atari Game Machine skyrocketed, and so did sales of the "Custer's Revenge" software.

And everybody was buying it, mostly to see what the hullabaloo was all about, so Atari ended up making a bunch of money. The lawsuit failed and Atari lost, so they couldn't stop the company from making it, but they achieved what they wanted.

One, they preserved their "We stand for good, clean, wholesome, family fun" image, and two, they boosted the sales of their machine to a market that they never intentionally got into anyway. So the whole thing was a win-win for everybody.

The outside publisher got a lot of publicity for their product, and Atari opened up a whole new market they never thought they would get into. Everyone was making a lot of money on the sales; it really didn't matter who won the lawsuit 'cause the outcome was best for everybody.


Kevin: I've heard bits of it, sure, but never from that perspective of everyone wins. It's ironic that on one level Atari was right: part of what ended up hurting...when the video game world crashed, was that there was so much crappy software out there. Terrible games from companies that would pump out as many games as possible, with no regard to quality. So Atari was right, to some level.

Rich: I disagree with that. Yes, there is crap out there and [inaudible 0:47:28] but what's the most popular operating system today? It's Windows. And why is it so popular? It's certainly not that Windows is technically good, but that everyone publishes stuff for it. If Microsoft had held on to the ability to program for Windows like Atari held onto theirs, Windows would have died a long time ago. Success comes from getting everybody else to play in your ballpark.

Kevin: OK. Rich: We may agree to disagree on that, but that's my opinion. Kevin: No, I see your point there. You're right. It's common belief...knowledge in the retro game world that part of the reason for the crash was that there was so much bad software. People would spend 40 bucks for a thing and would end up with a terrible game for their 2600 and got frustrated.

Rich: I don't believe that...I don't believe that for a minute. Kevin: OK. Rich: Yeah, there was a lot of crap out there for the 2600; there's a lot of crap out there for the 400 and 800. But that could have been managed better by Atari. Atari could put the stuff they publish out there with a name like "Genuine Atari" or whatever, somewhere on it. There is crappy software for every platform. There's crappy software for Windows, there's crappy software for Android. I have three machines in my everyday life, two run Windows and one runs Android. There's crap for both platforms, and I'm sure there's crap for Macintosh as well.

What happens in today's world is that there are lots of reviews: columnists review things and users review things, on review sites, and the stuff that's good gets a good reputation and the stuff that's crappy gets a bad reputation, and the market votes with their dollars and feet.

That's far better than having any company claim a monopoly on software for their own product. I'm not making a moral judgment; if any company wants to be the sole source for software and they want to keep their architecture private so nobody can design...sure, let them.

But I certainly wouldn't want to have stock in that company; I don't think that's a viable strategy today. What's the most popular mobile platform? It's Android. And why is Android so popular? It's because Google is encouraging the world to write for it. And they are doing the right thing there.

The most popular desktop platform is still Windows, and why? It's because Microsoft encourages the world to write for it. And if I had any computer product of my own I would encourage the world to write for it; that's the way you get it...what do I know, I'm an engineer.


Kevin: That was great, that's all I need. Thank you very much. Rich: You are very welcome. I appreciate the opportunity to talk with you today. Bye-bye.