Abstract

Computer-generated imagery, or CGI, is a body of digital visualization and image-manipulation practices that, following its emergence in the late 1960s, quickly came to hold a privileged relationship to film production—affecting in particular visual effects, animation, and the big-budget blockbuster. In these areas, digital imaging is consistently pushed to its limits by an ever-advancing state of the art. In a view promoted by the industry through plentiful “making-of” coverage, CGI is strongly identified in the popular imagination with spectacular visual effects that demonstrate Hollywood’s prowess at realizing fantastic visions. But CGI plays a more significant, if quieter, role in its so-called invisible effects, which begin with the unnoticeable retouching of filmed “truth” and ripple outward to what some have warned is the destabilization of the cinematic medium itself, threatening to replace the industry at every level, from production to exhibition and distribution, with its digital other. Academic attention to CGI grew slowly alongside its emergence as a powerful if alien force in filmmaking and film culture during the 1980s and 1990s, but it took off after the crucial year of 1999, when The Matrix heralded the fusion of analogue and digital cinema. CGI has since become a key focus of popular “behind the scenes” coverage and an industrial entry point for fledgling filmmakers with access to cheap digital production tools. But even as it extends the powers and profits of the film industry, CGI has challenged established practices and definitions, destabilizing film’s ontological base, its indexical relationship to reality, the tenets of classical narrative structure, and even the boundaries separating film from other media such as video games, experimental art, and virtual reality.
