Showrunner, the Amazon-backed AI studio, has announced plans to reconstruct the missing 43 minutes of Orson Welles' 1942 classic, The Magnificent Ambersons. To restore what many film historians consider one of cinema's greatest lost works, the project will combine newly shot live-action footage, archival materials and artificial intelligence. However, Warner Bros. Discovery and Concord have denied the firm the rights to the film, so the reconstructed cut will not be released commercially. Instead, the effort is being framed as an academic exercise aimed at answering questions that have lingered for over 80 years.
AI Reconstruction Plans by Showrunner
Showrunner CEO Edward Saatchi has confirmed that the effort is a pilot to explore how AI can transform Hollywood production. He stated that the company's most recent AI model is capable of producing lengthy, intricate stories, with the ultimate goal of developing full-length films. He acknowledges that the model cannot currently sustain a story beyond a single episode, yet sees this project as an opportunity to test the boundaries of AI in storytelling.
The reconstruction of Welles' film will rely on face and pose transfer tools combined with conventional filmmaking techniques. Scenes will be acted out by live performers, and AI will superimpose the likenesses of the original 1940s cast onto the footage. The studio will also draw heavily on the vast archive of set photographs, using them as a cornerstone for recreating lost scenes. Saatchi said the goal is for the 43 minutes to live on after 80 years of people wanting to know: could this have been the best film ever made in its original form?
A Tragic History of Lost Footage
The Magnificent Ambersons was shot in 1941 at the Gower Street Studios in Los Angeles, now known as Sunset Gower Studios. Welles' original cut ran 131 minutes. However, after Welles lost his final-cut rights, RKO executives made extensive edits to the film. To free up space in the vaults, almost a third of the negatives were burned; what survived was an 87-minute theatrical release, which critics nonetheless praised as artistically successful.
The Hollywood Reporter review of the time called the film a screen offering of exalted artistic values and declared that Welles' talent made him worth his weight in gold. Over the decades, however, historians and film lovers have theorised about the impact of the excised material, which included a darker ending and longer scenes that Welles had intended to keep in the film. That lost footage has never been seen again, and its destruction is frequently cited as one of the greatest losses in cinema history.
Technical Challenge with Film Director Brian Rose
Film director Brian Rose is leading the reconstruction effort, having worked on Ambersons for the past five years. Rose has used 3D modelling, archival research and script matching to reconstruct 30,000 lost frames of the film. He explained that he tried to frame and time each of the lost scenes based on set photos and production records. One example Rose described is a four-minute moving camera shot with no cuts, whose loss he called a tragedy. The shot traversed a ballroom and featured a dozen characters moving in and out of frame, weaving subplots together in a single continuous take.
Only the last 50 seconds of the scene survive in the released version. Rose called it ahead of its time, and his reconstruction has now become the blueprint for Showrunner's AI-based re-creation. Also joining the team is VFX specialist Tom Clive, who has worked on face-swapping and de-ageing. Clive is a former employee of Metaphysic, a company specialising in AI-generated digital doubles, and joined Showrunner to work on the Ambersons project. His role will be crucial in keeping the AI overlays faithful to the likenesses of original cast members such as Joseph Cotten, Dolores Costello, and Tim Holt.
AI in Hollywood’s Future
Since 2022, generative AI has been used in film for visual effects, dubbing, and storyboarding. The Showrunner initiative extends beyond these supporting roles and seeks to make AI part of the very fabric of storytelling and production. Saatchi envisions Showrunner becoming a kind of Netflix of AI, in which users could create episodes or even entire movies from a few typed words. The company unveiled its new AI model on Friday, saying it can produce long-form, live-action-style narratives.
This model will drive the Ambersons reconstruction over the next two years and will also be tested on original projects. Saatchi admitted that AI's use of copyrighted content is controversial, but described the Welles reconstruction as an experiment rather than a commercial product. Rights remain the key impediment: Showrunner has not licensed the film from Warner Bros. Discovery or Concord. Saatchi said that if those companies see a market for it and a way to release it beyond an academic setting, then of course the rights are theirs. Until then, the re-created footage will remain little more than a testament to the technology.