


16 movies of the 20th century that defined the history of CGI: Part 1

August 31, 2021

In just 50 years, the computer-generated visual effects industry went from a short film showing a schematic cat making simple movements to visually stunning movies like Blade Runner 2049 and Ready Player One. How did that happen? Let's trace how CGI technology evolved over the years and which movies mattered most along the way.

Kitty, 1968

In the late '60s, a group of Soviet mathematicians led by Nikolai Konstantinov designed a mathematical model that could digitally create a realistic animation of a walking cat. On a BESM-4 computer, Konstantinov's team built a mathematically computable model that replicated the physics of a moving cat. The computer produced thousands of frames and then printed them on paper using alphabet symbols (printing pixels was not possible at the time).
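The idea of "printing" a picture with characters is easy to reproduce today. Here is a minimal toy sketch of the concept (not Konstantinov's actual code, which ran on a BESM-4 and a line printer): map each pixel's brightness to a character of increasing density.

```python
# Toy character-based "printing": map brightness values (0.0 = dark,
# 1.0 = bright) to progressively denser characters. This illustrates
# the idea only; the 1968 team worked on very different hardware.
RAMP = " .:-=+*#@"  # characters ordered from sparse to dense

def frame_to_text(frame):
    """Render a 2D list of brightness values as printable text."""
    lines = []
    for row in frame:
        lines.append("".join(RAMP[int(v * (len(RAMP) - 1))] for v in row))
    return "\n".join(lines)

# A tiny 4x8 test frame: a bright blob on a dark background.
frame = [
    [0.0, 0.0, 0.2, 0.5, 0.5, 0.2, 0.0, 0.0],
    [0.0, 0.3, 0.8, 1.0, 1.0, 0.8, 0.3, 0.0],
    [0.0, 0.3, 0.8, 1.0, 1.0, 0.8, 0.3, 0.0],
    [0.0, 0.0, 0.2, 0.5, 0.5, 0.2, 0.0, 0.0],
]
print(frame_to_text(frame))
```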

The individual prints were converted into a cine-film reel that could be played on a projector.

Even though other computer animations were created before Kitty, it's this short film that many recognize as the first incarnation of the technology that eventually evolved into the CGI we know today.

Metadata, 1971

A few years later, artist Peter Foldes created an experimental 2D animated short film, Metadata, using a digital tablet of the time and the world's first keyframe animation software (invented by Nestor Burtnyk and Marceli Wein).

Keyframing is a digital animation process in which the animator draws the starting and ending poses of a movement, and the computer generates a smooth transition between them (the "in-betweens").
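At its core, keyframing is just interpolation. A minimal sketch of the idea (illustrative only, not Burtnyk and Wein's actual software): given two key poses, the computer fills in the frames between them by blending the poses.

```python
# Minimal keyframe in-betweening: given a pose at the start and end of
# a movement, generate intermediate frames by linear interpolation.
# A "pose" here is just a list of 2D points; real systems interpolate
# whole drawings, curves, or skeleton joints.

def lerp(a, b, t):
    """Linearly interpolate between a and b at parameter t in [0, 1]."""
    return a + (b - a) * t

def inbetween(pose_start, pose_end, t):
    """Blend two poses (lists of (x, y) points) at parameter t."""
    return [
        (lerp(x0, x1, t), lerp(y0, y1, t))
        for (x0, y0), (x1, y1) in zip(pose_start, pose_end)
    ]

# Two key poses of a three-point "arm": the computer fills in the motion.
start = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]   # arm lying flat
end   = [(0.0, 0.0), (1.0, 1.0), (1.0, 2.0)]   # arm raised

frames = 5
for i in range(frames + 1):
    t = i / frames
    print(f"frame {i}: {inbetween(start, end, t)}")
```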

These two short films introduced the two key components of CGI to the world of movie-making: algorithm-driven image generation and computer-automated in-betweening.

Computer animated hand, 1972

One year later, grad students Ed Catmull, the future co-founder of Pixar, and Fred Parke introduced the first 3D digital rendering technique. They first made a plaster model of Catmull's left hand, divided its surface into 350 small triangles and polygons, and created digital counterparts of them in the computer. Using 3D animation software developed by Catmull, they made the hand swirl, point, and perform other movements.
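The digital counterpart of that plaster hand is what we now call a polygon mesh. A minimal sketch of the data structure (a tiny tetrahedron standing in for the 350-polygon hand, not Catmull's actual software): store the surface as a list of vertices plus triangles that index into it, and animate by transforming the vertices each frame.

```python
import math

# A polygon mesh in its simplest form: 3D vertex positions plus
# triangles that index into the vertex list.
vertices = [
    (0.0, 0.0, 0.0),
    (1.0, 0.0, 0.0),
    (0.0, 1.0, 0.0),
    (0.0, 0.0, 1.0),
]
triangles = [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]  # a tetrahedron

def rotate_y(v, angle):
    """Rotate a vertex around the Y axis; applied per frame, this animates the mesh."""
    x, y, z = v
    c, s = math.cos(angle), math.sin(angle)
    return (c * x + s * z, y, -s * x + c * z)

# One animation "frame": the whole mesh rotated by 10 degrees.
frame = [rotate_y(v, math.radians(10)) for v in vertices]
for tri in triangles:
    print("triangle:", [frame[i] for i in tri])
```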

At the time, there was no hardware available to play the rendered frames back at speed. Catmull and Parke had to transfer the clip to film manually, taking long-exposure Polaroid shots of the computer monitor and then joining them together into a physical film.

Westworld, 1973 and Futureworld, 1976

The movie Westworld became the first feature film to use 2D computer animation. Written and directed by Michael Crichton, the film is about androids living among humans. To show a robot's point of view, the filmmakers used a computer-processing technique that pixelates photographs, creating a moving bitmap image out of them.
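The pixelation effect itself is conceptually simple. A rough sketch of the idea (not the actual 1973 process, which worked frame by frame from film scans): split the image into blocks, average each block, and fill the block with that average.

```python
# Block-averaging pixelation, the basic idea behind the robot-vision shots.

def pixelate(image, block):
    """Replace each block x block region of a 2D grayscale image with its mean."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for by in range(0, h, block):
        for bx in range(0, w, block):
            ys = range(by, min(by + block, h))
            xs = range(bx, min(bx + block, w))
            mean = sum(image[y][x] for y in ys for x in xs) / (len(ys) * len(xs))
            for y in ys:
                for x in xs:
                    out[y][x] = mean
    return out

# A 4x4 gradient pixelated into 2x2 blocks.
img = [[x + 4 * y for x in range(4)] for y in range(4)]
for row in pixelate(img, 2):
    print(row)
```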

Three years later, the same team released Futureworld, featuring 3D computer animation. In one of the scenes, viewers can see Catmull's digitized animated hand and a moving 3D face on a screen.

Thanks to these two movies, CGI was introduced to the entertainment industry. 

Star Wars: A New Hope, 1977

The success of Westworld and Futureworld inspired George Lucas to use 3D wireframe graphics in his first Star Wars movie. He essentially refined Catmull's 3D technique and used it in the trench-run briefing scene of Star Wars: A New Hope. The innovative effects were part of what made A New Hope such a hit.

Looker, 1981

The next decade became the heyday of CGI in movies. In 1981, the movie Looker introduced the first realistic CGI 3D human. To create the scene, actress Susan Dey's body was manually measured and digitized using wireframe rasters with topographical recognition. This technology could generate shaded imagery.

Star Trek II: The Wrath of Khan, 1982

The shading innovation introduced in Looker paved the way for the effects used in Star Trek II: The Wrath of Khan.

In 1982, the audience saw a realistic fractal-generated landscape for the first time, in the "Genesis" sequence created by Lucasfilm's computer graphics division. The scene became possible thanks to software developer Bill Reeves, who created a new graphics technique called "particle systems": an innovative method of modeling fuzzy objects such as fire, clouds, and water. It generates a large number of very small sprites with randomized positions and randomized local oscillation patterns. Without this algorithm, the creators would have had to draw thousands of tiny sprites individually and program each one's movement by hand.
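The core of a particle system fits in a few lines. A toy sketch in the spirit of the technique (not Reeves's actual implementation): spawn many short-lived particles with randomized birth states, then update each one per frame with a simple rule plus random jitter.

```python
import random

# Toy particle system: lots of tiny, short-lived particles with
# randomized birth states and a simple per-frame update rule stand
# in for a fuzzy object such as fire or spray.

def spawn(n, origin=(0.0, 0.0)):
    """Create n particles near the origin with randomized velocity and lifetime."""
    ox, oy = origin
    return [
        {
            "pos": [ox + random.uniform(-0.1, 0.1), oy],
            "vel": [random.uniform(-0.5, 0.5), random.uniform(1.0, 2.0)],
            "life": random.randint(10, 30),  # frames until the particle dies
        }
        for _ in range(n)
    ]

def step(particles, dt=0.1, gravity=-9.8):
    """Advance every particle one frame: move, apply gravity, jitter, age."""
    alive = []
    for p in particles:
        p["pos"][0] += p["vel"][0] * dt
        p["pos"][1] += p["vel"][1] * dt
        p["vel"][1] += gravity * dt
        p["vel"][0] += random.uniform(-0.05, 0.05)  # random local oscillation
        p["life"] -= 1
        if p["life"] > 0:
            alive.append(p)
    return alive

particles = spawn(1000)
for frame in range(20):
    particles = step(particles)
print(f"{len(particles)} particles still alive after 20 frames")
```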

Tron, 1982

The same year, the film Tron was released. Director Steven Lisberger presented the first extensive use of 3D CGI: 15 minutes of the movie were fully computer-generated. Tron also included very early computer-generated facial animation. The software behind these effects was based on the same wireframe rasters used in Looker.

The movie was also the first to combine CGI and live action in a single sequence.

Tron looked strikingly innovative at the time and quickly entered the pantheon of timeless classics. The movie was nominated for Oscars, but, surprisingly, not for visual effects: in 1982, CGI was considered "too easy" compared to traditional animation.

 

Want to know more about the history of CGI? Stay tuned – the second part is coming next month! 
