How 1970s and 1980s Film Innovation is Still Impacting the Industry Today
June 24, 2013
American filmmaking is constantly changing. In the past decade alone, it has been shaped by HD, Blu-ray and UltraViolet technology. Yet earlier innovations are still competing with, and working alongside, these newer technologies.
According to A Brief History of Film, the American studio system had collapsed by the end of the 1960s due to corporate takeovers, changing aesthetic tastes and a decline in theater attendance. With the old ways diminished, up-and-coming directors experienced a new sense of independence. Filmmakers such as Francis Ford Coppola and Martin Scorsese began to experiment.
Independent studios began to grow and original filmmaking gained traction. Furthermore, new technology began to find its way into the hands of these innovative directors.
The technology of the 1970s and 1980s, emerging after the collapse of the old studio system, enabled directors to achieve something truly new in film. Today, directors are still using these three important film innovations:
The late 1970s saw the adoption of early computer-generated imagery (CGI) by movie production companies and their special effects teams.
The technology was originally developed in research labs at various universities, where researchers were attempting to convert their computer data into pictures.
Yul Brynner’s robotic gunslinger in Westworld (1973) is considered the first instance of 2-D CGI in a feature film. It was followed by films such as Tron (1982) and Young Sherlock Holmes (1985).
Today, there are few directors who have not used CGI at least once in their careers. It appears in all genres, from action to science fiction to animation.
Sony introduced digital camera technology in the late 1980s.
However, it took a while for the technology to catch on. Many directors felt that film stock produced better image quality than digital, even though digital was more convenient and the cameras were easier to handle.
In 2002, George Lucas became the first director to release a major motion picture shot entirely with digital cameras. His Star Wars Episode II: Attack of the Clones is considered the first HD film. Since Lucas’s adoption of the technology, more and more directors have begun to experiment with digital filmmaking.
Cameras have gotten smaller and more compact, allowing even more experimentation.
Cameraman Garrett Brown solved the problem of shaky shots in 1976 when he invented the Steadicam.
Before his innovation, there were only two ways to get a steady shot: mount the camera on a dolly or hold it on your shoulder. Brown thought it would be better to combine these two techniques: he wanted the steadiness of a dolly with the freedom of movement of a handheld camera.
Brown developed a system of counterweights that distributes the camera’s weight across the operator’s body, allowing the operator to move freely with the actors.
Directors are still using the Steadicam today. It has been combined with platform cranes, low-mode extensions and gyroscopes to achieve various types of shots.
Garrett Brown’s own Steadicam work includes landmark sequences in Rocky, The Shining and Return of the Jedi.
These three innovations created a foundation for filmmakers to experiment with technology. The 1970s and 1980s saw a new age of filmmaking. Experimentation with technology continued through the 1990s and into the new millennium.
As a student of film, you should be aware of how these innovations changed the industry. You should also be aware of how today’s directors continue to use these innovations in new and interesting ways. Learn more and experiment with filmmaking at Brooks Institute.