How do you place visual effects into a shot when doing a photochemical finish?

31 replies to this topic

#1 Reuel Gomez
  • Basic Members
  • 257 posts
  • Other
Posted 19 July 2013 - 06:53 AM

From what I understand, digital compositing never really became commonplace until the digital intermediate process became popular in the early 2000s. So how were visual effects (not special effects) placed into a shot if you weren't doing a D.I.?

#2 Mark Dunn
  • Basic Members
  • 2427 posts
  • Other
  • London

Posted 19 July 2013 - 07:42 AM

Optical compositing, multiple exposure, motion control, stop-motion, go-motion, travelling matte, foreground miniatures- this is a huge field.  There's no simple answer to your question.


Edited by Mark Dunn, 19 July 2013 - 07:43 AM.


#3 Reuel Gomez
  • Basic Members
  • 257 posts
  • Other

Posted 19 July 2013 - 08:39 AM

Optical compositing, multiple exposure, motion control, stop-motion, go-motion, travelling matte, foreground miniatures- this is a huge field.  There's no simple answer to your question.

I'm referring to CGI, not anything that can be shot practically. Sorry I didn't specify

#4 Phil Rhodes
  • Sustaining Members
  • 11943 posts
  • Other

Posted 19 July 2013 - 09:13 AM

I don't think there have been more than a few movies that rendered CGI and then composited it optically. Any that did would have been very early examples. The Kodak Cineon system was invented to get around exactly this sort of situation in the early '90s and was obsolete by '97. So those sections of the movie which had CG effects in them were scanned, composited and filmed out, although the purpose of doing so was to add effects, not to do grading. Something like Jurassic Park would have been done this way.

 

Before that, Tron famously did CGI optical compositing, using a horrendously manual technique that involved originating on 65mm, then shooting every frame out to large Kodalith sheet film and rotoscoping manually to produce mattes with which to drop in the backgrounds. However, most of the stuff that's in Tron is actually compositing of airbrushed backdrops - I'm not sure how much actual live action/CGI integration there is. There was certainly no moving-camera integration - it was all locked off. The DVD extras famously include one of the people involved saying "it had never been done that way before and it will never be done that way again", and I would hazard the opinion that he's right.

 

Some CGI was done for The Last Starfighter. I'm not sure if there was any live action integration.

 

Theoretically you could still render your CG, film it out, and drop it in using an optical printer, and the process wouldn't be any different to what was done for, say, Star Wars, just using CG elements as opposed to models. The practical complexities of making things look properly integrated and part of the same scene in an optical printer are notorious, involving complex exposure, filtration and timing concerns. I'm not sure this was ever done.

 

P



#5 David Mullen ASC
  • Sustaining Members
  • 19765 posts
  • Cinematographer
  • Los Angeles

Posted 19 July 2013 - 10:48 AM

We had a similar question back in 2004, and this was my response then:

 

http://www.cinematog...pic=1067&page=4

"Terminator 2" was blown-up conventionally using an optical printer. What helps his blow-ups is that Cameron asks his cinematographers to go for a very dense negative. Also, Adam Greenberg did a great job of lighting for contrast, which gave the image a nice snap, which always helps a blow-up.

However, it was one of the first films to have its digital efx shots composited digitally, with the finished shots then laser-recorded back to film (Super-35). ILM previously had recorded out the separate digital efx elements to film and then did the compositing with the background plates in an optical printer (as they did for the water snake in "The Abyss"). At a lecture I attended, Dennis Muren said he pushed ILM to get digital compositing and outputting to film ready in time for "Terminator 2."

 

 

"Terminator 2" came out in 1991.  So the 1990's were the age of digital effects composites recorded out to film (usually an I.N.) and then cut into the film negative, and that was the norm all the way past "Lord of the Rings" (2001) until D.I.'s had become commonplace by the mid to late 2000's.  It's still done today, as in the case of "The Dark Knight" films, which did not go through a D.I.

 

Before that, elements (models, CGI, matte paintings, live-action, etc.) were composited in an optical printer.



#6 Phil Rhodes
  • Sustaining Members
  • 11943 posts
  • Other

Posted 19 July 2013 - 11:24 AM

 ILM previously had recorded out the separate digital efx elements to film and then did the compositing with the background plates in an optical printer

 

Yikes, I stand corrected. That sounds like a pain. The period during which that was the standard approach must have been fairly short, though.



#7 David Mullen ASC
  • Sustaining Members
  • 19765 posts
  • Cinematographer
  • Los Angeles

Posted 19 July 2013 - 11:53 AM

CGI elements were rare in the 1980's other than "The Last Starfighter" -- the stained glass knight in "Young Sherlock Holmes" was 1985 and the water snake in "The Abyss" was 1989.

 

There were attempts at digital compositing of a few shots even back then -- "Flash Gordon" (1980) tried some electronic compositing which had to be recorded back to film. See:

 

http://www.theasc.co...rum2/page1.html

 

 

The DI process was born out of a rather recent marriage between visual effects and motion-picture film scanner- and telecine-based color grading. Of course, digital imaging began impacting the motion-picture industry a long time ago. While at Information International, Inc. (Triple-I), John Whitney Jr. and Gary Demos (who now chairs the ASC Technology Committee's Advanced Imaging subcommittee) created special computer imaging effects for the science-fiction thriller Westworld (1973) and its sequel, Futureworld (1976). The duo subsequently left to form Digital Productions, the backbone of which was a couch-sized Cray X-MP supercomputer that cost $6.5 million. With that enormous hunk of electronics (and an additional, newer supercomputer that the company later acquired), Whitney and Demos also produced high-resolution, computer-generated outer-space sequences for the 1984 feature The Last Starfighter. The substantial computer-generated imagery (CGI) in that film was impressive and owed a debt of gratitude to a groundbreaking predecessor: Tron. That 1982 film, to which Triple-I contributed the solar sailer and the villainous Sark's ship, featured the first significant CGI in a motion picture — 15 minutes' worth — and showed studios that digitally created images were a viable option for motion pictures.
 
During that era, computers and their encompassing “digital” aspects became the basis of experiments within the usually time-consuming realm of optical printing. Over 17 years, Barry Nolan and Frank Van Der Veer (of Van Der Veer Photo) built a hybrid electronic printer that, in 1979, composited six two-element scenes in the campy sci-fi classic Flash Gordon. Using both analog video and digital signals, the printer output a color frame in 9 seconds at 3,300 lines of resolution. If optical printing seemed time-consuming, the new methods weren’t exactly lightning-fast, either, and the look couldn’t yet compete with the traditional methods.
 
In 1989, Eastman Kodak began research and development on the Electronic Intermediate System. The project involved several stages: assessing the closed-loop film chain; developing CCD-based scanning technology with Industrial Light & Magic; and, finally, constructing a laser-based recording technology and investigating the software file formats that were available at the time. The following year, Kodak focused on the color space into which film would be scanned. The project’s leaders determined that if footage were encoded in linear bits, upwards of 12-16 bits would be necessary to cover film’s dynamic range. Few file formats of the day could function at that high a bit rate. Logarithmic bit encoding was a better match for film’s print density, and Kodak found that 10-bit log could do a decent job (more on this later). The TIFF file format could handle 10-bit log, but was a bit too “flexible” for imaging purposes (meaning there was more room for confusion and error).
 
Taking all of this into account, Kodak proposed a new format: the 10-bit log Cineon file format. The resulting Cineon system — comprising a fast 2K scanner capable of 4K (which was too slow and expensive to work in at the time), the Kodak Lightning laser recorder and the manageable Cineon file format — caused a radical shift in the visual-effects industry. In just a few short years, the traditional, labor-intensive optical process died out. Though Kodak exited the scanner/recorder market in 1997, the Cineon file format is still used today.
 
---
 
I seem to recall reading in Cinefex that early CGI shots like the Genesis Planet demo in "Star Trek II" were basically photographed frame by frame off of a hi-res b&w CRT monitor, using color filters to build up RGB, which was how a lot of the early video-to-film transfer devices worked.
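As a rough illustration of the 10-bit log encoding described in the Kodak/Cineon passage quoted above, here is a minimal sketch of the conversion as it is commonly implemented; the constants (code 685 for reference white, code 95 for black, 0.002 printing density per code value, 0.6 negative gamma) are the usual published Cineon conventions, assumed here rather than taken from this thread.

import math

# Commonly cited Cineon 10-bit log conventions (assumptions, not quoted above):
REF_WHITE = 685                # code value for reference white
REF_BLACK = 95                 # code value for black
CV_TO_DENSITY = 0.002          # printing density per code value
NEG_GAMMA = 0.6                # assumed negative gamma
STEPS_PER_DECADE = NEG_GAMMA / CV_TO_DENSITY                          # 300 code values per decade of exposure
BLACK_OFFSET = 10.0 ** ((REF_BLACK - REF_WHITE) / STEPS_PER_DECADE)   # linear value that maps to code 95

def lin_to_cineon(lin):
    """Scene-linear value (1.0 = reference white) to a 10-bit Cineon code value."""
    lin = max(lin, 0.0)
    cv = REF_WHITE + STEPS_PER_DECADE * math.log10(lin * (1.0 - BLACK_OFFSET) + BLACK_OFFSET)
    return int(round(min(max(cv, 0.0), 1023.0)))

def cineon_to_lin(cv):
    """10-bit Cineon code value back to a scene-linear value."""
    return (10.0 ** ((cv - REF_WHITE) / STEPS_PER_DECADE) - BLACK_OFFSET) / (1.0 - BLACK_OFFSET)

for lin in (0.0, 0.18, 1.0, 2.0):
    print(lin, "->", lin_to_cineon(lin))   # 0.0 -> 95, 0.18 -> ~468, 1.0 -> 685, 2.0 -> ~775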


#8 Phil Rhodes
  • Sustaining Members
  • 11943 posts
  • Other

Posted 19 July 2013 - 12:08 PM

There's an interesting coda to this, about that filtered-CRT trick.

 

From what I read, one of the issues they had on Tron was that one of the companies involved (Information International among them, coincidentally) was doing vector graphics, filling in areas with repeated raster-like passes. The vectors are visible in the "journey to the computer world" sequence, where objects are clearly hashed in with lines as the camera flies close.

 

The other guys were doing filled polygons. The vector stuff included Sark's carrier, but the carrier was required to appear in other scenes which were being polygon-rendered. The polygon people were therefore required to emulate the look of the vector graphics using very thin poly objects...

 

All of it was, as far as I know, shot off mono CRTs with RGB filters. Since there was no form of preview and no way to store the images, this was the only way to do it - render it to a framebuffer and shoot it out to film. I don't think the vector stuff was even rendered to a framebuffer first - it was just drawn onto a CRT and integrated as a photographic exposure.

 

And I thought Imagine 3.0 on the Amiga circa 1994 was painful. That solar sailer was beautiful.

 

P
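To make the filtered-CRT trick described above concrete, here is a toy sketch (an illustration only, not how any particular system was actually coded) of how three sequential exposures of a monochrome display through red, green and blue filters add up to one colour frame on the film.

import numpy as np

def expose_frame_through_filters(rgb_render):
    """Toy model of one film frame built from three passes of a monochrome CRT.

    rgb_render: float array (H, W, 3) in 0-1, the frame as the renderer computed it.
    Each pass displays one channel on the mono CRT and exposes the same piece of
    film through the matching colour filter; the film simply accumulates the light.
    """
    film = np.zeros_like(rgb_render)                 # the as-yet-unexposed colour frame
    for channel in range(3):                         # red, green, blue passes
        pass_exposure = np.zeros_like(rgb_render)
        pass_exposure[:, :, channel] = rgb_render[:, :, channel]  # CRT image seen through this filter
        film += pass_exposure                        # exposures add; nothing is erased
    return np.clip(film, 0.0, 1.0)

# e.g. frame = expose_frame_through_filters(np.random.rand(480, 640, 3))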



#9 Robert Houllahan
  • Sustaining Members
  • 1584 posts
  • Industry Rep
  • Providence R.I.

Posted 19 July 2013 - 03:20 PM

Modern CRT film recorders from Celco and Lasergraphics work the same way, with a monochrome CRT and color filters; they just work a lot faster. I think the elements for "The Dark Knight", etc. were recorded on a Celco at 4-5 sec/frame.

 

-Rob-
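Purely as back-of-envelope arithmetic on that 4-5 sec/frame figure; the shot length and reel length below are made-up examples, not figures from the thread.

FPS = 24
SECONDS_PER_FRAME = 4.5            # midpoint of the 4-5 sec/frame figure quoted above

def recorder_hours(screen_seconds):
    """Rough film-recorder time for footage of the given screen duration."""
    return screen_seconds * FPS * SECONDS_PER_FRAME / 3600.0

print(recorder_hours(60))          # a hypothetical 60-second efx shot: 1440 frames, 1.8 hours
print(recorder_hours(20 * 60))     # recording a whole 20-minute reel would take about 36 hours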



#10 Reuel Gomez
  • Basic Members
  • 257 posts
  • Other

Posted 19 July 2013 - 09:55 PM

So they used film scanners/recorders on the TDK films?

#11 David Mullen ASC
  • Sustaining Members
  • 19765 posts
  • Cinematographer
  • Los Angeles

Posted 19 July 2013 - 10:58 PM

What did you think they used?

#12 David Mullen ASC
  • Sustaining Members
  • 19765 posts
  • Cinematographer
  • Los Angeles

Posted 19 July 2013 - 11:20 PM

The Dark Knight trilogy did not go through a D.I., but not only did every visual effects shot have to go through the process of being scanned, composited with the efx, and recorded back to a 35mm anamorphic I.N. to be intercut with the non-efx live-action footage shot in 35mm anamorphic... it got even more complicated with the second two films because of the IMAX footage.

 

The first one, "Batman Begins", had a photochemical contact-printed post (other than the efx shots) but was then also blown up to IMAX by taking a color-timed 35mm I.P., scanning that, running it through the IMAX DMR process (basically a secret sauce of digital grain reduction and sharpening so that it looks decent on a large IMAX screen), and then recording the results to 15-perf 65mm negative.

 

But for the other two, the filmmakers did not want to do a D.I. and then just record out 35mm, DCP, and IMAX versions from one digital master; they wanted a film master using original negative for each release format (similar to what had to be done with "Little Buddha", which mixed 35mm anamorphic and 5-perf 65mm). So essentially the IMAX footage had to be reduced to 35mm anamorphic and cut into the 35mm original negative, and the 35mm anamorphic footage had to be blown up to IMAX and cut into the IMAX negative, so two original-negative masters exist to contact-print from, one in 35mm anamorphic and one in IMAX. There are chunks of the movie that went through a D.I.-type process for both the 35mm and IMAX versions, but overall, whenever possible, if original negative could be contact-printed from, that's what they did.

 

It's one reason the IMAX sections stand out, as compared to the IMAX sections in the second "Transformers" movie or the last "Star Trek" movie -- in those movies, the IMAX footage went through a D.I. and had a lot of efx added to it (and in the case of "Star Trek: Into Darkness" it also got converted into 3D), so you don't get to see contact-printed IMAX negative for any non-efx moment as you can in the second two "Dark Knight" movies.



#13 David Mullen ASC
  • Sustaining Members
  • 19765 posts
  • Cinematographer
  • Los Angeles

Posted 19 July 2013 - 11:31 PM

I'm referring to CGI, not anything that can be shot practically. Sorry I didn't specify

 

A CGI element like a monster could be recorded out to a piece of film, probably against a black background, along with the two hold-out mattes needed to be able to composite the element into a background element in an optical printer.

 

Basically positive versions of the elements are loaded into the projector side of the optical printer, along with any hold-out mattes, and are exposed onto a dupe negative in the camera side of the optical printer.

 

See:

http://en.wikipedia....iki/Compositing
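The arithmetic the optical printer performs photographically is the same thing a digital compositor does with a matte; a minimal sketch follows, using made-up array names for the plates and matte.

import numpy as np

def matte_composite(background, element, holdout_matte):
    """Drop a foreground element into a background plate using a hold-out matte.

    background, element: float arrays (H, W, 3), 0-1; the element was recorded
    against black along with its matte.
    holdout_matte: float array (H, W), 1.0 wherever the element covers the frame.

    This mirrors the printer's two passes onto the same dupe negative: the
    background printed through the hold-out matte so the element's area stays
    unexposed, then the element printed into that unexposed hole.
    """
    m = holdout_matte[..., np.newaxis]              # broadcast the matte across RGB
    held_out_background = background * (1.0 - m)    # pass 1: background with a "hole"
    return held_out_background + element * m        # pass 2: the element fills the hole

# e.g. comp = matte_composite(live_action_plate, cg_monster_on_black, monster_matte)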



#14 dan kessler
  • Basic Members
  • 172 posts
  • Other

Posted 20 July 2013 - 03:55 PM

A lot of interesting background, but I think the simple response is that the OP's original understanding was incorrect. Digital compositing was in wide use well before D.I.'s. It was standard for CGI elements through most if not all of the '90's.



#15 Reuel Gomez
  • Basic Members
  • 257 posts
  • Other

Posted 20 July 2013 - 05:23 PM

A lot of interesting background, but I think the simple response is that the OP's original understanding was incorrect. Digital compositing was in wide use well before D.I.'s. It was standard for CGI elements through most if not all of the '90's.

What happened was I was watching a featurette on the T2 DVD, and when I heard them talking about the VFX shots and optical compositing, I took it to mean that Jim Cameron was saying they had composited the shots optically.

#16 Reuel Gomez
  • Basic Members
  • 257 posts
  • Other

Posted 20 July 2013 - 05:26 PM

The Dark Knight trilogy did not go through a D.I., but not only did every visual effects shot have to go through the process of being scanned, composited with the efx, and recorded back to a 35mm anamorphic I.N. to be intercut with the non-efx live-action footage shot in 35mm anamorphic... it got even more complicated with the second two films because of the IMAX footage.

The first one, "Batman Begins", had a photochemical contact-printed post (other than the efx shots) but was then also blown up to IMAX by taking a color-timed 35mm I.P., scanning that, running it through the IMAX DMR process (basically a secret sauce of digital grain reduction and sharpening so that it looks decent on a large IMAX screen), and then recording the results to 15-perf 65mm negative.

But for the other two, the filmmakers did not want to do a D.I. and then just record out 35mm, DCP, and IMAX versions from one digital master; they wanted a film master using original negative for each release format (similar to what had to be done with "Little Buddha", which mixed 35mm anamorphic and 5-perf 65mm). So essentially the IMAX footage had to be reduced to 35mm anamorphic and cut into the 35mm original negative, and the 35mm anamorphic footage had to be blown up to IMAX and cut into the IMAX negative, so two original-negative masters exist to contact-print from, one in 35mm anamorphic and one in IMAX. There are chunks of the movie that went through a D.I.-type process for both the 35mm and IMAX versions, but overall, whenever possible, if original negative could be contact-printed from, that's what they did.

It's one reason the IMAX sections stand out, as compared to the IMAX sections in the second "Transformers" movie or the last "Star Trek" movie -- in those movies, the IMAX footage went through a D.I. and had a lot of efx added to it (and in the case of "Star Trek: Into Darkness" it also got converted into 3D), so you don't get to see contact-printed IMAX negative for any non-efx moment as you can in the second two "Dark Knight" movies.

So anything that contained CGI was scanned/recorded back onto 35/65mm but everything else was contact printed, did I get that right?

#17 Reuel Gomez
  • Basic Members
  • 257 posts
  • Other

Posted 20 July 2013 - 05:28 PM

 
A CGI element like a monster could be recorded out to a piece of film, probably against a black background, along with the two hold-out mattes needed to be able to composite the element into a background element in an optical printer.
 
Basically positive versions of the elements are loaded into the projector side of the optical printer, along with any hold-out mattes, and are exposed onto a dupe negative in the camera side of the optical printer.
 
See:
http://en.wikipedia....iki/Compositing

What's a dupe negative?

#18 David Mullen ASC
  • Sustaining Members
  • 19765 posts
  • Cinematographer
  • Los Angeles

Posted 20 July 2013 - 08:13 PM

What happened was I was watching a featurette on the T2 DVD, and when I heard them talking about the VFX shots and optical compositing, I took it to mean that Jim Cameron was saying they had composited the shots optically.

 

For one thing, even today people loosely use the word "opticals" to refer to certain transitional effects such as dissolves, even though they are all done digitally now. Second, even though ILM tried to do all the efx compositing digitally for T2, the film itself was shot in Super-35 and optically blown up to anamorphic, plus there were probably some optical effects added to the negative -- occasional shots get farmed out to other efx companies, little stuff, but it may involve optical printing. Plus perhaps ILM's decision to do all the compositing digitally hadn't been made while they were shooting, so a comment made on the set would refer to "optical compositing". And at crunch time, maybe ILM had to push some stuff to their optical printer department to finish; they just couldn't get it all done in their digital system.

"Dupe negative" ("dupe" being short for "duplicate") is used interchangeably with "internegative" (though technically I think "internegative" refers to a negative made from a reversal positive original) -- either way, what I really mean is the use of intermediate duplication stock, a low-contrast film stock with a color mask (that brick-orange color that negatives have) used for duplicating a piece of film with minimal increase in contrast and grain. If you copy a camera negative onto this stock, you end up with a positive image, which is called an interpositive (I.P.); if you copy that interpositive onto the same dupe stock, you end up with a negative image, and this is called a "dupe negative" or "internegative" (I.N.).

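A tiny sketch of the naming logic only (not any real lab workflow): each printing pass onto intermediate stock inverts the image, so copying the camera negative gives a positive (the I.P.), and copying that gives a negative again (the dupe negative / I.N.).

from dataclasses import dataclass

@dataclass
class FilmElement:
    name: str
    is_negative: bool
    generation: int          # duplication passes away from the camera original

def print_onto_intermediate_stock(source, new_name):
    """Each printing pass onto dupe stock inverts the image and adds a generation."""
    return FilmElement(new_name, not source.is_negative, source.generation + 1)

ocn = FilmElement("original camera negative", is_negative=True, generation=0)
ip = print_onto_intermediate_stock(ocn, "interpositive (I.P.)")
dupe_neg = print_onto_intermediate_stock(ip, "dupe negative / internegative (I.N.)")

for element in (ocn, ip, dupe_neg):
    polarity = "negative" if element.is_negative else "positive"
    print(f"{element.name}: {polarity}, generation {element.generation}")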


#19 David Mullen ASC
  • Sustaining Members
  • 19765 posts
  • Cinematographer
  • Los Angeles

Posted 20 July 2013 - 08:16 PM

So anything that contained CGI was scanned/recorded back onto 35/65mm but everything else was contact printed, did I get that right?

The effects shots needed to be scanned and recorded back, but they also had to digitally reduce the IMAX footage down for the 35mm negative cut, and digitally blow up the 35mm footage to IMAX for the 15-perf 65mm negative cut.
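For a rough sense of what those reduction and blow-up steps mean optically, here is a back-of-envelope frame-area comparison; the aperture dimensions are approximate published camera-aperture sizes, not figures from the thread.

# Approximate camera-aperture sizes in millimetres (assumed nominal values).
ANAMORPHIC_35MM = (21.95, 18.60)     # 35mm anamorphic (Scope) camera aperture
IMAX_15_PERF_65MM = (70.41, 52.63)   # 15-perf 65mm (IMAX) camera aperture

def frame_area(dimensions_mm):
    width, height = dimensions_mm
    return width * height

ratio = frame_area(IMAX_15_PERF_65MM) / frame_area(ANAMORPHIC_35MM)
print(f"IMAX frame area is roughly {ratio:.0f}x that of 35mm anamorphic")
# A 35mm-to-IMAX blow-up spreads the same negative over roughly nine times the
# area, while the IMAX-to-35mm reduction squeezes it down by the same factor.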



#20 Reuel Gomez
  • Basic Members
  • 257 posts
  • Other

Posted 20 July 2013 - 10:23 PM

The effects shots needed to be scanned and recorded back, but they also had to digitally reduce the IMAX footage down for the 35mm negative cut, and digitally blow up the 35mm footage to IMAX for the 15-perf 65mm negative cut.

When the 35mm footage is blown up, isn't there a loss in quality?