The History of Interactive Computer Graphics - Part 4

Motion capture

The History of Interactive Computer Graphics

Computer graphics, animations and interactions with digital equipment are now taken for granted. You just have to pick up your smartphone, tablet, desktop computer or whatever device and you intuitively know when to swipe, click, drag or pinch to zoom. You also expect nothing less than nice interfaces with smooth animations.

In this blog series, of which this is the fourth part of six, I'd like to take you on a journey through time, focusing on the developments before and during the creation of computers, digital graphics, animations, graphical interfaces, graphics software, interactivity, 3D, a pinch of the first games, the creation of the internet and a touch of virtual reality. Mentioning even the majority of influential events would be a world achievement, so that's simply impossible. Instead, I'd like to point out a number of events that I think have made an important contribution to getting where we are now, sometimes with a slight side road to a development that indirectly made an important contribution or that reflects the spirit of the times and the relations between events. Although I personally find audio and music very important and interesting, and I have always been involved in producing music, I have decided to omit audio developments from this series to keep it somewhat concise and focused.

I have made more than 110 illustrations for this series and also provide each part with at least one interactive to bring the events to life as well as possible for you.

If you haven't read part one, part two and part three of this series yet, it's worth reading those first.

Part 4

In this fourth part we start at the beginning of the 1980s. We will see some important relationships between companies, individuals and events and discover that many things are directly related to each other, more often than you might think.

Well-known star players in the field of technical innovation, graphics and animation appear more and more often in the timeline, because they keep coming up with new techniques. During this period, a stable technical foundation was laid for everything that is related to graphics, animations, interactives and computers today. Because we are moving ever closer to the present day, this period will evoke nostalgia for many people. That was certainly the case for me while writing this part!

There are some interesting situations ahead this time, so let's get started quickly. And this time we start in Canada, at Simon Fraser University.

Motion capture

1980-83: Motion capture for animations

More and more laboratories began to study human movement, and these laboratories increasingly used computers to analyze those movements.

Between 1980 and 1983, Dr. Tom Calvert, professor of kinesiology and computer science at Simon Fraser University (SFU), who was also interested in computer animation, was working on the first motion capture (mocap) for computer animation. Motion capture in itself was not new and had already been done in the late 1970s, but it was first used for computer animation in the early 1980s.

He did this by fitting exoskeletons to the body and mounting potentiometers on them. Joint movements changed the electric current, and this analog signal, once digitized, was used to drive computer characters. This was done for choreography studies and for studying movement abnormalities, but the techniques and equipment used for it soon found their way into the growing computer graphics industry.
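
To make the principle a bit more tangible, here is a minimal TypeScript sketch of the kind of mapping such a setup needs: a digitized potentiometer reading is converted into a joint angle that can be applied to an animated character. All names, ranges and numbers are my own illustrative assumptions, not Calvert's actual setup.

// A potentiometer on an exoskeleton joint produces an analog voltage; an ADC digitizes it.
// This maps that digitized reading (for example 0..1023) to a joint angle in degrees.
function readingToAngle(
  reading: number,
  minReading: number, maxReading: number, // calibration: readings at the two extreme poses
  minAngleDeg: number, maxAngleDeg: number // joint angles at those extreme poses
): number {
  const t = (reading - minReading) / (maxReading - minReading);
  return minAngleDeg + t * (maxAngleDeg - minAngleDeg);
}

// Example: an elbow potentiometer calibrated from 120 (arm straight) to 900 (fully bent).
const elbowAngle = readingToAngle(510, 120, 900, 0, 140);
// elbowAngle is 70 degrees; apply this rotation to the elbow joint of the computer character.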

Principles of Animation

1981: 'The Twelve Principles of Animation'

The animation industry became increasingly professional, and in 1981 two of Disney's leading animators, who were also lifelong friends, released the book 'The Illusion of Life'. These animators were Ollie Johnston and Frank Thomas.

The book clearly explained the animation processes at Disney and described what Disney saw as the twelve principles of animation. In fact, now in 2020, the book is still the bible for animators worldwide, and the principles are still widely used and increasingly applied in animated computer interfaces and websites.

It is beyond the scope of this blog series to go deeper into the principles, but there is a lot to be found on the internet about them. With Wigglepixel I always take these principles into account, while also keeping an eye on the performance of the interactive animations in browsers.

You're not supposed to animate drawings. You're supposed to animate feelings.

Ollie Johnston
Image morphing

1982: First digital photo morphing

In 1982, a short film was shown for the first time at the SIGGRAPH conference in which two digital raster images, i.e. images made up of pixels, were morphed into each other. That had never been seen before.

The demo showed a woman transforming into a lynx. The technique used was conceived and executed by animator Tom Brigham and his colleague Douglas Smythe of the NYIT Computer Graphics Lab (CGL), which we already saw in the third part and which played an important role in the development of many important computer graphics techniques.
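
To give a rough idea of what happens under the hood, here is a tiny TypeScript sketch of the cross-dissolve half of a morph: blending two raster images pixel by pixel as a value t goes from 0 to 1. A real morph, like the one Brigham and Smythe showed, also warps both images so that corresponding features (eyes, nose, ears) line up before blending; that geometric part is left out of this sketch.

// Blend two grayscale images (flat arrays of 0..255 values, same width and height).
// t = 0 gives image A, t = 1 gives image B, values in between give the dissolve.
function crossDissolve(
  imageA: Uint8ClampedArray,
  imageB: Uint8ClampedArray,
  t: number
): Uint8ClampedArray {
  const out = new Uint8ClampedArray(imageA.length);
  for (let i = 0; i < imageA.length; i++) {
    // Linear blend of the two source pixels.
    out[i] = (1 - t) * imageA[i] + t * imageB[i];
  }
  return out;
}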

In his further career, Tom Brigham also developed many other visual effects for cinema, television and experimental theater performances.

Video game crash

1982-83: E.T. and the Video Game Crash of '83

Due to the worldwide success of the movie E.T. the Extra-Terrestrial, Atari, in collaboration with Universal Pictures, brought the game E.T. to market for the Atari 2600 at the end of 1982. Because Atari wanted to release the game before Christmas, time pressure was high: the game was barely tested for bugs or playability and was therefore launched far too quickly.

Atari had expected skyrocketing sales and had therefore overproduced game cartridges. But even though sales were initially strong, the game never reached the high figures Atari expected. As a result, many cartridges ended up unsold, and many of the games that were sold came back from dissatisfied buyers who had complaints about the gameplay and serious bugs. Atari reportedly lost $100 million.

The E.T. game is still considered one of the worst computer games ever released. But with that, it also became an important and valuable lesson for the games industry. Atari literally buried this bit of history when it dumped ten to twenty truckloads of remaining games in a deep pit in New Mexico in 1983.

Partly due to a saturated market of first-generation game consoles with very similar games, and partly because the demand for game consoles decreased with the rise of personal computers, the games industry in the United States went through a recession between 1983 and 1985. Many people think that the dramatic E.T. game was an important impetus for this Video Game Crash of '83.

Colecovision

1982: ColecoVision and Donkey Kong

Just before that crash, in 1982, the American company Coleco built a home game console around a full-fledged computer processor, something that had become affordable now that chips were getting cheaper, partly thanks to the success of home computers such as the Commodore 64. The ColecoVision was also the first device to offer truly innovative competition to the Atari 2600, which had been successful since 1977.

In the years leading up to 1982, there was a Japanese company that had been making playing cards since 1889, had later moved into electronic toys and, partly because of the 1973 oil crisis, had switched to an emerging market: computer games. This company, Nintendo, in collaboration with Mitsubishi, which supplied the hardware, had among other things marketed a clone of Atari's Pong, like many other companies. In part three of this blog series we already read how many different companies tried to capitalize on Pong's success.

In 1981, Nintendo's Shigeru Miyamoto designed the arcade game Donkey Kong. It was the first time that a computer game was created not by a programmer, but by an industrial designer. That also affected the quality and success of the game. With Donkey Kong, Nintendo even started a whole new games category: platform games.

In 1982 Nintendo had been trying for some time to gain a foothold in the United States, but that wasn't working out very well, despite the fact that a bidding war had erupted in the United States over the US distribution rights for Nintendo's Donkey Kong game. The bidding was mainly between Atari and Coleco and was eventually won by Coleco.

Partly because of Nintendo's Donkey Kong game, which now, with the acquisition of the rights, came bundled with the ColecoVision game console, and partly because a hardware plug-in made it possible to play Atari 2600 games on the console, the ColecoVision became a great success. Atari sued Coleco, but without success, due to the lack of legislation in the new computer games market.

The ColecoVision was part of the second generation of game consoles and, together with other new brands, models and techniques, it eventually helped bring the video games market back to life after the Video Game Crash of '83. But it would still take some years after 1983 before the crash was over.

Tron

1982: First solid 3D CGI on film

In parts two and three of this blog series, we saw that 3D computer graphics began to emerge in short films. But those were usually just short films made to demonstrate something scientific or a new computer animation technique. We also mainly saw wireframes, and certainly no feature-length cinema films with 3D graphics.

This changed in 1982. Walt Disney's science fiction feature film Tron was the first feature film to feature solid 3D computer graphics. The film was directed by Steven Lisberger and is based on a story by Lisberger and Bonnie MacBird. The story is about a computer programmer who is transported into the software world inside a computer. The programmer then tries to communicate with the software programs around him in order to escape.

The making of the film had already started six years earlier, in 1976, when Lisberger was intrigued by Atari's game Pong, about which we already read in parts two and three.

SGI and IrisGL

1982: Silicon Graphics and IrisGL

In 1982 associate professor of electrical engineering James H. Clark left Stanford University to start a computer business with a group of graduate students. He gave his company the name Silicon Graphics, which would later be renamed SGI.

Silicon Graphics started building and selling high-end computers with a major focus on graphics. They used their own graphics API, IrisGL, which stood for Integrated Raster Imaging System Graphics Library. Both 2D and 3D graphics could be produced with this library. IrisGL would eventually play a very important role in graphics APIs we still use today. More about that in the next part.

In 1983, Silicon Graphics launched its first hardware product, the IRIS 1000: a graphics terminal that had to be connected via a network to a computer, such as a Digital Equipment Corporation VAX, and that did the calculations for raster graphics. The power that this graphics terminal and its successors brought to the creation of 2D and 3D graphics was unprecedented and gave the graphics industry a significant boost.

Particles

1983: Development of particle systems

The possibilities for 3D graphics started to become considerable in 1983. But creating complex shapes and animations, such as fire and explosions, was not yet possible. William 'Bill' Reeves, who had earned his PhD at the University of Toronto and had been hired into the Lucasfilm Computer Division's club of graphics stars, whom we have seen before and will see more often later, had therefore been working in his spare time on a revolutionary new rendering technique: 'particles'.

He first used the term 'particle system' when he applied his particles method to create the fictional 'Genesis effect' for the film 'Star Trek II: The Wrath of Khan'. With this revolutionary system, he exploded a virtual bomb on the surface of a planet, causing a conflagration in the film. That was groundbreaking and a very important development for computer graphics as a whole. In 2020, particle systems are still widely used and indispensable for making lifelike simulations.
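
To give you a feel for the idea, here is a minimal TypeScript sketch of a particle system as you could write one today: particles are emitted with a random velocity, move under gravity, age and die. The structure and all numbers are my own illustration, not Reeves' original implementation.

// One particle: a position, a velocity and a remaining lifetime in seconds.
interface Particle {
  x: number; y: number;
  vx: number; vy: number;
  life: number;
}

const particles: Particle[] = [];

// Emit a burst of particles from a point, each with a random direction and speed.
function emit(x: number, y: number, count: number): void {
  for (let i = 0; i < count; i++) {
    const angle = Math.random() * Math.PI * 2;
    const speed = 20 + Math.random() * 80;
    particles.push({ x, y, vx: Math.cos(angle) * speed, vy: Math.sin(angle) * speed, life: 1 + Math.random() * 2 });
  }
}

// Advance the simulation by dt seconds: apply gravity, move, age and remove dead particles.
function update(dt: number): void {
  const gravity = 98;
  for (const p of particles) {
    p.vy += gravity * dt;
    p.x += p.vx * dt;
    p.y += p.vy * dt;
    p.life -= dt;
  }
  for (let i = particles.length - 1; i >= 0; i--) {
    if (particles[i].life <= 0) particles.splice(i, 1);
  }
}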

Motion blur

1983: Development of Motion Blur

In the same year, in addition to particle systems, Reeves also invented the Motion Blur algorithm. The Computer Graphics Lab group that later moved to Lucasfilm as its Computer Division, of which he was a member, had realized that computer graphics could never be used in a real movie if they didn't at least look like the image of a camera, especially by simulating a slow shutter speed.

If something passes by at full speed in real life and is filmed with a camera, you see a blur: you no longer perceive a passing cyclist or car sharply. Reeves wanted to simulate this motion blur in computer graphics as well, because it made computer graphics look much more realistic and less artificial.
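
To make that concrete, here is a small TypeScript sketch of motion blur by temporal supersampling: the scene is sampled at several moments within the shutter interval and the results are averaged per pixel. This only illustrates the principle; it is not Reeves' actual algorithm, and the moving disc and all numbers are purely hypothetical.

// Position of a moving object at time t (hypothetical: constant horizontal speed).
function positionAt(t: number): { x: number; y: number } {
  return { x: 100 + 400 * t, y: 120 };
}

// Brightness of one pixel, accumulated over the shutter interval [tOpen, tClose].
function blurredPixelValue(px: number, py: number, tOpen: number, tClose: number, samples: number): number {
  let sum = 0;
  for (let i = 0; i < samples; i++) {
    // Pick a moment within the shutter interval (jittered to avoid banding).
    const t = tOpen + ((i + Math.random()) / samples) * (tClose - tOpen);
    const pos = positionAt(t);
    // Does the object (a disc with a radius of 10 pixels here) cover this pixel at that moment?
    sum += Math.hypot(px - pos.x, py - pos.y) < 10 ? 1 : 0;
  }
  // The pixel value is the fraction of the exposure during which it was covered: the blur.
  return sum / samples;
}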

Nintendo Famicom

1983: Nintendo Famicom

On July 15, 1983, Nintendo, which already had some success in the arcade games market, released an 8-bit home console on the Japanese market. This was given the name Family Computer (Famicom).

This game console was special because of its high-resolution sprites, larger color palettes than the competition used, and tiled backgrounds. In short, the graphics were more beautiful and more detailed than anything that had been on the market until then.

The Famicom had two controllers that already looked like the typical Nintendo controllers of the better-known Nintendo Entertainment System that would only come on the market later. And it shipped with Nintendo's own games, including Donkey Kong.

After some teething problems, the console became the best-selling game console in Japan by late 1984.

ATARI-Nintendo join

1983: Cooperation Atari and Nintendo

Because some of their arcade games that were successful in Japan weren't successful in the United States, Nintendo decided in April 1983 to contact the American company Atari, which was already a well-known name worldwide.

For Atari, struggling with the Video Game Crash, Nintendo was interesting, and after many conversations between the two companies Atari would distribute the 8-bit Nintendo Famicom in all countries outside Japan. All details were worked out and all that was left was to sign a four-year contract that Nintendo had negotiated hard over. But that never happened.

At the Consumer Electronics Show (CES) in June that year, Atari saw that the American company Coleco was also showing Nintendo's Donkey Kong on its new Adam computer. Coleco did have the license rights for game consoles, but not for personal computers, because Atari held those, and in addition to being a game console, the Adam was also a full-fledged personal computer. Atari was therefore quite angry and felt stabbed in the back by its negotiating partner Nintendo.

Nintendo's boss was also angry with Coleco, held an emergency meeting with Coleco that same day and demanded that the game not be released on the Adam. Coleco said it hadn't been aware of any problems arising from a deal between Nintendo and Atari, and in September that year Nintendo, Atari and Coleco sat down together and finally resolved the conflict.

But because Atari had already lost more than $500 million due to the Video Game Crash, Atari fired many employees, its parent company closed many Atari offices and the deal with Nintendo was canceled. Nintendo therefore decided to do it themselves, but it was a close call: otherwise these blogs might have been about the Atari Entertainment System (AES) instead of the NES.

There are, however, people who are convinced that Atari never intended to actually sign the contract with Nintendo, because they believe Atari considered its own successor, the Atari 7800, superior to the Nintendo Famicom.

Apple Macintosh

1984: Apple Macintosh

As we saw in part three, the Apple Macintosh, the Mac, was certainly not the first computer with a graphical interface and a mouse. The Apple Lisa, the predecessor of the Mac, already had a GUI and a mouse, and the Mac's graphical interface was even adapted from the Xerox Alto.

Nevertheless, the Mac was an important step in history, because it was the first personal computer with a GUI and mouse that also became popular, and thus it accelerated further technical developments in the graphics field. After the Lisa and Lisa 2, it was the first popular personal computer with a Motorola 68000 chip, whose instruction set was much better suited to graphics work than that of, for example, the IBM PC.

Development of the Mac started five years earlier, in 1979, but back then it didn't have a graphical interface. It was only after Steve Jobs' visit to Xerox PARC, where Xerox showed their GUI, that Apple took these ideas into their computers and really started creating the Mac and Mac OS. Ultimately, this computer has proved to be very important for the development of the digital graphics industry. In fact, with the Mac, a graphical operating system was widely marketed for the first time.

The people who are crazy enough to think they can change the world are the ones who do.

Steve Jobs
PostScript

1984: PostScript

When we talk about graphics software nowadays, many people immediately think of Adobe Photoshop, Flash and After Effects. Unfortunately, not everybody knows that Adobe is not the original creator of these programs; in fact, Adobe often obtained these and other products through corporate acquisitions. What Adobe did develop itself, and what put the company on the map, was PostScript, the first version of which appeared in 1984.

When talking about PostScript, many designers think of PostScript fonts. And PostScript Type 1 is indeed a font file format introduced by Adobe in 1984 as part of PostScript. Still, PostScript is much more than that. But what exactly is PostScript?

PostScript is a page description programming language. PostScript became important because several types of printers had come onto the market that worked differently internally but were expected to produce the same prints. PostScript made it possible to store the graphical layout of pages using mathematical formulas and logic. This allowed a page to be read by different printers and interpreted in their own way, and it also made the graphic content scalable. Each printer could read these PostScript scripts and, after executing and processing them, print the result in a way that suited that printer's technology. Adobe co-founder John Warnock thus found a good solution to a growing problem caused by the increasing number of different printers.

Where we still see a lot of PostScript today is, in simplified form, within the PDF file format, which was also developed by Adobe. Part of PostScript is also still used within EPS files. EPS is an abbreviation of Encapsulated PostScript.

The example below is taken from the Wikipedia page on PostScript to give you an impression of a PostScript script:

%!
/Courier findfont
20 scalefont
setfont
50 50 moveto
(Hello World) show
showpage
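
Reading the example line by line: the %! header marks the file as PostScript, /Courier findfont looks up the Courier font, 20 scalefont scales it to 20 points, setfont makes it the current font, 50 50 moveto moves the current point to the coordinate (50, 50) measured from the bottom-left corner of the page, (Hello World) show paints the text there, and showpage finally outputs the finished page.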
Alpha channel

1984: Alpha Channel and Compositing Algebra

Developments in computer graphics moved really fast in the '80s, and graphic design with computers finally became easier, better, more accessible and more affordable for many more people. In the same year that PostScript was published, Thomas K. Porter and Tom Duff released their paper 'Compositing Digital Images', in July 1984.

Porter and Duff also worked at Lucasfilm's Computer Division, the club led by Ed Catmull that we see over and over again because it brought forth so many important developments. So they certainly belonged to the core group of people who helped advance the CGI world.

In their paper, important graphics ideas were introduced, such as an alpha channel for storing transparency information per pixel in a graphics file. They also came up with a whole set of image-combining operations they called compositing algebra. You might recognize these as the blend modes in many graphics editors, such as Adobe Photoshop and Affinity Photo. But you will also come across these techniques if you work with 3D shaders or the HTML <canvas> element.
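
The best-known operator from this compositing algebra is 'over', which places one image on top of another using the alpha channel. As a minimal TypeScript sketch, the formula with premultiplied alpha looks like this (the pixel type and the example values are my own illustration):

// One pixel with premultiplied alpha, all channels in the range 0..1.
interface RGBA { r: number; g: number; b: number; a: number }

// Porter-Duff "over": result = foreground + background * (1 - foreground alpha).
function over(fg: RGBA, bg: RGBA): RGBA {
  const k = 1 - fg.a;
  return { r: fg.r + bg.r * k, g: fg.g + bg.g * k, b: fg.b + bg.b * k, a: fg.a + bg.a * k };
}

// Example: a 50% transparent red pixel over an opaque blue pixel.
// Premultiplied red at 50% alpha is (0.5, 0, 0, 0.5).
const result = over({ r: 0.5, g: 0, b: 0, a: 0.5 }, { r: 0, g: 0, b: 1, a: 1 });
// result is { r: 0.5, g: 0, b: 0.5, a: 1 }: an even mix of red and blue.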

How cool is it that you can still find this article on the Pixar website! It is here.

Adventures of Andre and Wally

1984: First fully CGI-generated film

In 1984, the Lucasfilm Computer Division was supplemented by an ex-animator from Disney who quickly became very important to the company: John Lasseter.

At Disney he had put forward the idea of combining camera footage with computer graphics, but Disney was not impressed and saw no future in it. At the Computer Division, an animator like him was more than welcome, because the club consisted mainly of technicians and they had noticed they could use an experienced professional in the field of animation.

When the Computer Division brought in John Lasseter, they made the very first fully CGI-generated animated film with 3D graphics that same year: The Adventures of André and Wally B. The short film gave the outside world a first look at many new graphics techniques, such as the Motion Blur technique developed by the group itself. Because of his animation knowledge and his Disney background, John Lasseter was responsible for the animations in the film. This was the first time that the well-known Disney animation principles, which we just discussed, were applied in computer animation. This greatly benefited the animations and generated many positive reactions.

The art challenges the technology and the technology inspires the art.

John Lasseter
Data glove

1984: Data Glove

In 1984, developments in the field of virtual reality didn't exactly stand still either. Jaron Zepel Lanier, an American computer scientist who was also a computer philosopher, visual artist and classical composer (what a great combination!), is now considered one of the foremost developers of virtual reality. Together with Thomas G. Zimmermann, who like him had worked at Atari, he founded VPL Research Inc. in 1984.

This was the very first company to sell VR glasses and VR gloves. That same year they released the Data Glove. This glove sensed human movements and orientation using sensors and fiber-optic techniques. The Data Glove was initially developed to control computers, but it was eventually also used for virtual reality by the United States military.

Lanier's versatility also led to significant contributions to many leading companies later on. Besides working at Atari and pursuing virtual reality with VPL Research Inc., he worked on applications for Internet2, and as a scientist he contributed to Silicon Graphics and Microsoft Research.

NASA Virtual Env Workstation

1985: The Virtual Environment Workstation

NASA was also interested in virtual reality and in 1985, in collaboration with VPL Research Inc., developed the 'Virtual Environment Workstation' (VIEW). This was a set with VR glasses with which you could view stereoscopic images supplied by computers or by remote video cameras. The Data Glove was also included in the set. When the user moved his fingers, a hand animated in the virtual environment, with which the user could grab virtual objects.

The Data Suit included in the set, also developed by VPL Research, likewise used sensors to transmit movements of the rest of the body to the computer, for an even more natural experience of movement in the virtual world. VPL also wrote the software. A funny detail, I think, is that VPL Research gave their helmet system with viewer the name EyePhone. What does that remind us of nowadays?!

ATARI vs Amiga

1984-85: Atari vs Amiga

Atari's parent company Warner Communications thought Atari's 8-bit game consoles would remain profitable for a while and therefore didn't want to innovate. When employees indicated that they wanted to move to 16-bit computers and to work with the new Motorola 68000 chip that was also used in Apple's Macintosh, the company wasn't open to that and the designers didn't feel heard. As a result, computer chip designer Jay Miner and game designer and programmer Larry Kaplan, among others, left Atari.

A while later, Larry contacted Jay and, after finding investors, in 1982 they started a company later called Amiga Corporation, at that time still under the name Hi-Toro. The intention was to make games and accessories.

A year later Larry went back to his old employer after he got an offer from Atari, so Jay suddenly had his former partner as a direct competitor. Jay Miner could now fulfill his dream and design a chipset and computer, which he named Lorraine and which was based on the Motorola 68000 chip. Initially it was conceived as an extensible game console, but later as a full personal computer, because that was more likely to succeed given the Video Game Crash of '83. Meanwhile, the money ran out and they went looking for new investors. Steve Jobs of Apple also came by a few times.

In January 1984 Amiga showed an improvised demo of a bouncing ball at the CES fair; there was no OS yet. This attracted investors, including Tramel Technology, but also Atari and Commodore. Amiga was pretty desperate and went for a deal with Atari: Atari gave Amiga a $500,000 loan, and in return Atari would receive exclusive design rights for one year to use the chips in its own video game consoles. After that year, Atari would also be allowed to add a keyboard and mouse.

At the same time, Warner was negotiating the sale of Atari, because Atari was suffering serious losses of millions. Not even the CEO of Atari knew about the takeover plans. In July 1984, Jack Tramiel and his company Tramel Technology acquired Atari Inc.'s Consumer Division, which became the new company Atari Corporation. From here on I'll simply call this Atari again, otherwise it gets a little complicated.

When the Amiga Corporation heard of these negotiations with Jack Tramiel, they started negotiating with Commodore. They also knew Jack Tramiel, because he had previously shown interest in Amiga but had wanted to change the company too drastically. Many Commodore employees had recently left for Atari, so Commodore was so interested in the Amiga Corporation that they bought off Amiga's $500,000 loan and, instead of going for a licensing deal on techniques and products, took over the Amiga Corporation as a whole. The Lorraine chipset was therefore never delivered to Atari.

On August 13, 1984, Jack Tramiel, on behalf of Atari, sued Amiga for breach of contract. At the time, Atari was unaware that Amiga wasn't just working on a chipset, but secretly on a whole computer around it. Meanwhile, Atari and its ex-Commodore employees started developing their own personal computer...

ATARI 520-ST

1985: Atari 520-ST

So this computer was born out of the rivalry between Atari and the Amiga Corporation. Due to the delay at Amiga and because Atari now employed former Commodore employees, Atari could come out with its computer earlier than Amiga. In the year in which IBM PC compatibles only got Windows 1.0 in November, which tended somewhat towards a graphical interface, Atari had already announced its much more powerful and more graphical computer in January: the Atari 520-ST.

The ST was set up much more graphically because, unlike the IBM PC but like the Apple Mac and thus the upcoming computer of the Amiga boys, it used a Motorola 68000 chip with a 16-bit external bus and a 32-bit internal bus. The ST in the name stands for Sixteen/Thirty-Two, after these bus sizes.

The Atari ran an operating system from Gary Kildall's Digital Research: GEM (Graphics Environment Manager). As we read in the previous part, Gary was also the original developer of CP/M, the operating system on which DOS was later modeled. With the GEM operating system, the Atari was among the very first personal computers on the market to work with a GUI built on bitmap graphics. As a result, the interface looked much more graphical than the interfaces of the competition.

In addition to graphics applications, the Atari 520-ST remained extremely popular with music producers for many years after its release. This was because Atari had made an important decision: they had put MIDI ports on the motherboard, which allowed users to control musical instruments with the computer and thus record music as MIDI data. The MIDI ports only cost a few cents extra in the production process, but they made this computer very popular among music producers.

Sales of the 520-ST started in July 1985 and saved Atari from a downfall that had seemed inevitable because of the '83 Video Game Crash.

Because of the built-in MIDI ports, music programs were also released for the Atari ST, such as Steinberg's well-known MIDI sequencer (at that time called Pro 24, the predecessor of Cubase). Steinberg had previously written a simple MIDI sequencer for the Commodore 64, but when the Atari ST came on the market, writing a new version of the program for the Atari was an obvious step. Today, Cubase is considered the mother of all software MIDI sequencers and DAWs (Digital Audio Workstations).

The Atari ST made a name for itself as the ultimate music computer, at a time when making music with a computer was certainly not self-evident. Atari STs remained in use for many years, changing hands for a lot of money even long after Atari itself no longer existed, because of their stable MIDI timing.

As a side note, Jack Tramiel, the boss of the new Atari Corporation, didn't have a good reputation in the industry; many companies had bad experiences with him. This was perhaps why, shortly after the takeover of Atari, the Atari ST quickly got the nickname 'Jackintosh': because of Jack's name and because the Atari was reminiscent of Apple's Macintosh.

Amiga 500

1985: Commodore Amiga

But the expected and formidable competition also arrived that year, as on July 23, 1985, Commodore unveiled 'The Amiga from Commodore', which was later renamed the Amiga 1000. This was quite a show, complete with orchestra, Blondie singer Debbie Harry and a creative demo by Andy Warhol.

Presales started in August and the first units were delivered in November, later than Atari and too late for the Christmas sales. The Amiga 1000 was way ahead of its time and more powerful than the Atari ST, but its new features were so new that many people didn't even understand what they could do with them, since there was no software available yet that demonstrated those features. Commodore wasn't happy with the results and hired a new CEO. The plan then became to bring out a high-end and a low-end model: the Amiga 2000 and the Amiga 500, both of which came on the market in March 1987.

For the sake of the overall picture, I'm stretching the story a little here and including the Amiga 500, even though it only came on the market in 1987. That model showed very well the possibilities the Amiga 1000 already had, but it was much more popular and affordable, better marketed, and it had much more impact.

The Amiga 500 was a direct competitor to the Atari ST and also ran on the Motorola 68000 chip, but with the Lorraine chipset that Amiga had developed. The Amiga 1000 was a more powerful and higher-quality computer than the Atari 520-ST, but also more expensive. The 500 therefore became the entry-level model that could compete directly with the Atari ST.

In contrast to the IBM PC and also the Mac, the Amiga could display no fewer than 4096 colors on screen. In comparison: the PC could at that time display 16 colors, and the Mac even fewer: only two!

Although audio falls outside the scope of this blog series, I think it is important to mention here that the earlier Commodore 64 was the first personal computer to have a dedicated sound chip on board: the SID chip (Sound Interface Device). The Amiga went a step further and could even play 8-bit audio samples on four separate mono channels without additional hardware. Mod trackers (music trackers), multi-track software for making music with samples on those four mono channels, became very popular. Many house tracks were made with module trackers, and partly because of this, and partly because of the Atari ST with its MIDI ports, a new type of music producer was born: the home producer.

The Commodore Amiga was therefore very suitable for playing games, making music and graphic design. With the LightWave 3D program it was also possible to create 3D video clips. The computer became especially popular among gamers and the 3D demo scene. The Amiga also took on an important role in video productions and show control.

But the original employees of the Amiga Corporation, including Jay Miner, weren't satisfied with the device after the company was acquired by Commodore, because the computer was less than they had intended. So they left the company.

Years later, Atari was unable to keep up with the market. The Commodore Amiga did well for a while, and some new models were released after the 500, but in the end it couldn't keep up with IBM PCs with their competitive Windows 3.1, VGA graphics and Sound Blasters.

The discontent of the original Amiga employees was so great that it could even be found in a series of computers that were sold. In the very first batch of Amiga computers, all staff signatures were proudly molded into the inside of the case, but the following action was less positive.

The employees expressed their dissatisfaction by literally hiding it in the computer as an Easter egg. When a certain combination of eight keys was pressed while a floppy disk was simultaneously inserted into the drive, a large message flashed on and off on the screen that made clear what the employees thought of Commodore:

We Made Amiga... THEY F*CKED IT UP.

When Commodore found out about this, they were of course not happy, and many computers were recalled. This caused a delay of several months in the production process.

I thought it was important to cover this period of Amiga and Atari history through to after 1987 here, to give a better picture of the whole story, and also because it is such an interesting part of history. But now let's take a step back to where we left off: back to 1985!

Nintendo NES

1985: Nintendo Entertainment System (NES)

We had just met the new player in the game console market, Nintendo, and we already read about their Famicom game console, which was a big step forward, and about how they wanted Atari to distribute their product outside Japan. The following development certainly should not be missing from this article about the history of interactives and computer graphics.

In October 1985 the well-known Nintendo Entertainment System (NES) was launched in the United States. The NES, an improved Famicom in new packaging and part of the third generation of game consoles, quickly became very popular and one of the best-selling game consoles worldwide. This also ended the Video Game Crash that had lasted since 1983.

The NES was also special because it not only put Nintendo on the map worldwide, but with it Nintendo also introduced an important new business model for game consoles that changed the industry. They didn't make all the games for their system themselves, but licensed other game producers to make games for the NES. In doing so they took an important step towards a more varied and wider range of games, and perhaps even avoided unnecessary competition.

With this, they actually solved important problems that had previously led to the Video Game Crash of '83; Nintendo had learned from the mistakes Atari made earlier. They also beat the Atari 7800 in terms of sales. Remember? That was the game console that Atari reportedly considered superior to the Nintendo Famicom and the future of gaming... With today's knowledge, we know it turned out slightly differently.

With the NES, perhaps the most famous platform game ever was also launched: Super Mario Bros.

Inverse kinematics

1985: Inverse Kinematics in Animation

To put, for example, a skeleton in an animation into a certain pose, simple Forward Kinematics was used at the time. If you wanted to place an arm in a certain position, you had to rotate every part into position, from the upper arm all the way down to the hand. This was time-consuming and not a pleasant or intuitive way of animating.

What animators preferred was to directly move a hand or a foot, with the rest of the arm or leg moving along naturally. The same challenge existed in robotics, where robots have to be able to grab something with their arms.

A solution to this challenge came with Inverse Kinematics. Although much work had already been done on it before, in 1985 Anthony A. Maciejewski and Michael Girard of Ohio State University (OSU) published their paper on Inverse Kinematics in animation. This made the animation of computer characters considerably simpler and more intuitive, and ultimately led to more natural poses. So this was a very important development for both robotics and computer animation.
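
For the curious, here is a minimal TypeScript sketch of one of the simplest IK cases: a two-bone arm solved analytically with the law of cosines. It is only meant to show the kind of calculation an IK solver performs; it is not the method from the 1985 paper.

// Joint angles in radians for a two-segment arm (shoulder at the origin).
interface Angles { shoulder: number; elbow: number }

function solveTwoBoneIK(targetX: number, targetY: number, l1: number, l2: number): Angles {
  // Clamp the target distance to what the arm can physically reach.
  const d = Math.min(Math.max(Math.hypot(targetX, targetY), Math.abs(l1 - l2)), l1 + l2);

  // Law of cosines: interior elbow angle of the triangle with sides l1, l2 and d.
  const cosElbow = (l1 * l1 + l2 * l2 - d * d) / (2 * l1 * l2);
  const elbow = Math.PI - Math.acos(Math.min(1, Math.max(-1, cosElbow)));

  // Angle between the shoulder-to-target line and the upper arm.
  const cosOffset = (d * d + l1 * l1 - l2 * l2) / (2 * d * l1);
  const offset = Math.acos(Math.min(1, Math.max(-1, cosOffset)));

  // Aim the upper arm at the target and rotate back by the offset (one of the two elbow poses).
  return { shoulder: Math.atan2(targetY, targetX) - offset, elbow };
}

// Forward kinematics for comparison: given the two joint angles, where does the hand end up?
function handPosition(a: Angles, l1: number, l2: number): [number, number] {
  const ex = l1 * Math.cos(a.shoulder);
  const ey = l1 * Math.sin(a.shoulder);
  return [ex + l2 * Math.cos(a.shoulder + a.elbow), ey + l2 * Math.sin(a.shoulder + a.elbow)];
}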

To illustrate the difference between Forward Kinematics and Inverse Kinematics, I made an interactive for you. In the interactive below you can clearly experience the difference between the two methods. Follow the instructions in the caption and you will see that Inverse Kinematics is a very convenient method to use in animations.

Select a kinematics method with the switch. With Forward Kinematics (FW) you have to rotate all arm parts, starting from the base. Inverse Kinematics (INV) works the other way around: in that mode you can click and drag anywhere on the field, and the arm will shape itself to reach the mouse cursor or your finger on a touch screen.

Conclusion

And with that we have come to the end of this part, with quite a bit of information. Some of the situations in this period are so interesting and full of related events that this time I chose to go deeper into some of them, especially because they have also proved to be very decisive for further technical developments.

What particularly struck me when writing this part was the prominent role Atari played during this period: in negotiations with Nintendo and the Amiga Corporation, in a series of decisions by parent company Warner that in hindsight turned out rather badly, in the loss of key employees due to a lack of innovation while at the same time welcoming Commodore employees, and in competition and lawsuits. I don't think anyone can deny that Atari and also Commodore, especially with the exchange of their employees, played important roles in the development of graphically oriented computers. It is also clear that the role played by the Motorola 68000 chip was significant.

We are now getting closer to home and to the present day. In the next part we will enter the internet age, which was also a time when many interesting new things happened.

Did you find this interesting, or do you want to ask or comment on something? Let me know below and share the blog on social media; it motivates me to keep writing quality blogs like this one. After clicking like or dislike you also have the option to add a comment (optional). Thanks and 'till next time!

Continue reading part 5 >

About me

Maarten de Haas

Sr. Front End Engineer / (Interactive) Motion Designer Wigglepixel

With over 21 years of experience as a professional, I am full-time and passionately committed to designing and developing Interactive Maps and Christmas Cards for Websites and Touch Screens for companies.

Previously, I was graphically, creatively and technically responsible for graphics and sports and game software for many well-known television productions in the Netherlands and abroad for more than 13 years.

I know the graphical and technical capabilities in 2D and 3D, as well as the hiccups of different browsers and optimizations in order to get the best possible online performance.

I make sure your project both graphically and technically produces the best results, so you don't need to worry about that.

Worked for and collaborated with