It’s no coincidence that Michael Crichton’s name appears so often on the earliest portions of the timeline of CGI in film and television. After his 1973 film Westworld pioneered 2D computer animation in a feature film, its 1976 sequel Futureworld continued the trend with the first use of 3D computer graphics to animate a human hand and face. Crichton’s 1981 venture Looker, which he wrote and directed, claims a similarly important milestone: the first CGI human character. Her name was Cindy, and she’s a kind of digital australopithecus that, ironically enough, seems only to have evolved into Andy Serkis playing bigger monkeys.
So why are Tron and Star Trek II: The Wrath of Khan lauded to this day for basically doing what Looker did a year earlier? Simple: because Looker is awful. END OF REVIEW.
Though known primarily as a novelist, Michael Crichton made a name for himself in Hollywood not only through popular adaptations of his books such as Jurassic Park and The Andromeda Strain but also by directing films for more than a decade. Westworld was both Crichton’s feature directorial debut (barring the ABC made-for-TV film Pursuit) and one of his earliest original screenplays. Despite production woes from the start, Westworld is renowned today as a major landmark in science-fiction cinema and an important advance in film technology.
As David A. Price writes in this New Yorker piece, computer-generated imagery is commonplace at the movies these days. Star Wars gets a lot of the credit for sparking the technological revolution in Hollywood (although there have been a few technological advances since then), and it’s certainly true that the effects team behind that space saga deserves most of the commendation in which it basks. But if the question is where all of this started, the lineage behind Star Wars and Avatar and every other CGI-laden movie of the past thirty years, then the answer is almost certainly Westworld.