I watched the test in the best quality possible, and what does it really say? I'd say film and the Alexa in this carefully planned test are virtually indistinguishable, but Steve knows what he's doing; I've seen the LUT he designed on Danny Collins, and it still looks like digital. I just don't buy it. Take this year's films shot on film, like Bridge Of Spies, Steve Jobs, Carol (even though Super 16 is a different thing again), Love & Mercy, The Big Short, Joy, The Hateful Eight, The End Of The Tour, and Star Wars: The Force Awakens: could they be replicated on digital? No. I don't care how much work goes into an emulation of film; film cannot be emulated, because it is a chemical reaction. It's unpredictable, it's alive. How can you account for that?
When I watch something like Joy or The Hateful Eight, or Interstellar, or The Dark Knight, or Lord Of The Rings, and then I watch something like The Social Network, or Sicario, or Her, or Nightcrawler, or whatever, does it feel the same? No. And those DPs don't shoot digital on those movies trying to make it look like film; they shoot the format for what it is. Anyway, I'm already starting another film vs digital debate, which this thread is bound to spark anyway, but Steve Yedlin's theory is just flawed: a couple of carefully tuned tests don't prove anything (not that he's trying to prove anything, or so he says).
Edited by Manu Delpech, 09 January 2016 - 06:31 PM.
All Yedlin is doing is showing how great today's finishing tools are. What he captured with the Alexa didn't look like that in camera; it looked like that after careful manipulation. Plus... it's all in a digital world! Once you take film and scan it to digital, it's no longer film; in my book it's no different from what's shot with the Alexa. The real test can't be done with an MPEG file on your home computer or with a DCP at a theater. It has to be done with a film projector and a photochemical process versus a digital projector and an all-digital process. It's easy to do split screens this way: simply install a mask on the projectors. I guarantee you, the difference between the actual photochemical film and the DCP is night and day. Not only will the film have far better blacks, but a far better contrast ratio. Shots that had blown-out skies on the digital would probably be perfect on the film print (if done right).
In Yedlin's discussion (http://www.yedlin.net/160105_edit.html), he's asked if one (film) is real and digital is "simulated". He responds: "This is quite a value judgment. Are you saying that film is “real” and digital isn’t? I don’t even quite know what it means to relegate digital to a lesser status of existence than film. If you take the terms literally, then of course digital is real and the distinction between “real” and “simulated” doesn’t exist. If you take the terms more figuratively, then I’m not sure what you’re getting at other than expressing an a priori belief that is at odds with the empirical evidence rather than making any kind of statement about the empirical evidence itself."
My answer to the question is that of course digital is simulated. It doesn't exist in a form our senses are capable of viewing. We can't touch it, smell it, see it or hear it. If you gave a solid-state drive to a monkey, he'd probably smash it and try to eat it. Digital requires translation from reality into simulated non-reality and then back again.
Film, or any physical analog asset, exists in a form our "analog" body can work with. We can touch a paint brush, feel the layers of dried paint and smell the oils as well. We can hear the paint brush hitting the canvas, and of course we can see it as we do it. Same with anything that exists in our three-dimensional world. Film is only slightly different: we can't really hear film, but we can sure as hell touch it, we can absolutely smell it, and of course we can see the image, without any translation of any kind. It's a "real" physical item that exists right in front of your face.
Here is another great comment... "What I myself am propounding is that we perceive it as magic precisely because of a prior held belief that it’s magic and not because of any actual (as opposed to imagined) attribute. It’s just like the computer-composed music study you quoted — it’s a placebo effect. People believe they can “see” and “feel” film because it has some kind of “soul” that digital doesn’t, but the fact that they believe it does not prove that it’s a physical property of the objects rather than a psychological projection of an imagined truth. "
Umm, I'll gladly hold a picture of me touching an actual frame of film. It's a physical thing!
I think this statement is also funny; "Also, why the mention of “24fps,” that doesn’t distinguish digital from film acquisition since they’re both quite literally 24fps."
But see, they aren't! Film projectors have black gaps between frames, while the next frame is pulled into the gate. Digital projectors play the frames back seamlessly; there is no gap between frames, so each frame stays on screen longer per second than with film.
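To put rough numbers on that claim (a back-of-envelope sketch only; the 50% duty cycle and two-blade shutter are my assumptions, and real projectors vary quite a bit):

```python
# Back-of-envelope numbers for light-on-screen time at 24 fps.
# Assumes a two-blade film shutter with a ~50% duty cycle; real projectors
# vary, and many flash each film frame two or three times to reduce flicker.
frame_period = 1.0 / 24  # seconds each frame occupies

film_light_time = frame_period * 0.5     # gate is dark roughly half the time (assumed)
digital_light_time = frame_period * 1.0  # sample-and-hold: the frame stays up

print(round(film_light_time * 1000, 1), "ms of light per frame (film, ~50% duty)")
print(round(digital_light_time * 1000, 1), "ms of light per frame (digital hold)")
```

Under those assumptions a hold-type digital projector keeps each frame lit about twice as long as the film projector does, which is the gap being described.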
I love this as well: "“Superstition" seems more precise to me, though, for describing the (outspoken) belief that the perceptual attributes traditionally associated with film are always and only seen in photochemically acquired images and the perceptual attributes traditionally associated with “video” are always and only seen in digitally acquired images. That’s because “superstition” very pointedly connotes an unfounded belief, especially one that ascribes a causal role to a non-essential object or phenomenon. The word also implies a preference for cherry-picked belief-confirming anecdotal evidence over rigorous unbiased empirical evidence."
No... it's because digital has far less bit depth than film, PERIOD. Technically, we don't have the technology today to reproduce film identically from acquisition through projection. We could easily produce a 24-bit 8K image from film (the theoretical max bit depth of film is 32-bit). However, computers and drives aren't fast enough to work with that media in real time. So we compress the color space down to 12 to 16 bits and, even worse, we shrink the resolution down to 4K or even 2K. Pixar doesn't even have acquisition, and they finish everything in 12-bit 4K. They COULD finish in 32-bit like film, but they don't, because it's unnecessary and overly time-consuming. Why? Simple! The projection technology can only accept 12 bits; the image reproduction systems (DLP chips) don't have enough steps in their movement to differentiate. They only move a few degrees from full black to full white. It's wonderful technology, but it's nowhere near the quality of our capture devices today. Digital still cameras and some film scanners can easily capture 24-bit color, which is still less than film.
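For what it's worth, the code-value arithmetic behind those bit-depth figures is easy to check. This only shows how the number of tonal steps scales per channel; it doesn't settle the contested "24-bit" or "32-bit film" claims above, which are the poster's own:

```python
# Per-channel code values at various bit depths.
# The "film is 24/32-bit" figures in the post are claims, not established fact;
# this just shows how tonal steps scale with bits per channel.
for bits in (8, 10, 12, 16, 24):
    levels = 2 ** bits
    step = 1.0 / (levels - 1)  # smallest representable tonal step on a 0..1 scale
    print(f"{bits:2d}-bit: {levels:>10,} code values, step {step:.2e}")
```

The point the arithmetic does make: each extra bit doubles the number of tonal steps, so the jump from a 12-bit projection pipeline to a 16-bit one is a factor of 16 in code values, not a factor of 4/3.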
Yedlin sounds smart, but I just don't think he understands the technology behind what he talks about. There is more to it than "look", because anyone can mimic something in a computer environment; that's easy. The hard part is getting the digital process to work like the photochemical one. I couldn't care less what digitally scanned film looks like; it still looks nothing like an actual, complete photochemical workflow.
A proper test is to project film on a film projector next to a digitally faked film on a digital projector. But it's difficult to discuss in words, because our entire language around imaging has become digital. For example, we speak of film resolution today as if film were composed of pixels. We might say, for example, that the resolution of 35mm is 4K. It wasn't that long ago that we spoke of 35mm as 2K. But film is simply not expressible in terms of kilo-pixels in the first place. The only reason we do so is to give a rough idea of film in terms that are familiar to your everyday Homo digitalis.
The digital illuminati, so enamoured of digital metrics, have no idea at all how analog images work, let alone how one might theorise them, i.e. what words, terms, or concepts to use. And then they try to fake it instead. It's completely stupid.
I work as a digital image technician. That's my day job. But as an artist it is the analog domain that intrigues me far more. I have no interest in faking a film look. To what end is such craziness? Who is one trying to fool? And why fool anyone anyway? If anyone were to fool me with a digitally synthesized image I took for film, the only thing that would prove is that the person who fooled me gets off on doing so, on fooling people. It doesn't say anything about me (other than that I've been fooled: whoopee, big effen deal) or about the nature of analog images. All it does is express something about the nature of digital images. Something that might very well be interesting in its own right, but nothing whatsoever about the nature of analog images. Because basically it's not an analog image; it's a digital one.
The error here is the idea that images are illusions and that the way you create an illusion doesn't matter. But images are not illusions. That is an Ancient Greek misconception (or indeed an ancient superstition), rebirthed during the Renaissance and perpetuated to this very day.
Edited by Carl Looper, 09 January 2016 - 10:08 PM.
"I’m trying to keep the focus of the demo on the overall “look,” and not on the historical details of the making of the demo. " - Yedlin
This is because the author believes the historical details are irrelevant - that the look (the image) is all that matters. And in many ways that is also very true. But the historical details are neither irrelevant nor a bias. They will provide details with which we can further analyse the image - and understand the nature of images in general. We would be better informed. The author is not interested in that. Not interested in informing us.
What is of interest (at least to me) is not what we can't see, but what we can see. But if what we can see (or could have seen) has been erased or repressed, such as by repressing historical details and making two shots indistinguishable from each other, then what are we looking at?
And that's the problem right there, in a nutshell. In this evacuated, imploded, nihilistic space where nothing is being made visible, and not even nothing is being demonstrated, not even nothing is being said.
Well not entirely.
That guy walking to the window is a bit too self-conscious. One can see it in his wobbly knees. He needs to get over that.
Edited by Carl Looper, 09 January 2016 - 10:43 PM.
Yedlin's argument reflects a famous argument between the Stoics and the Platonists in Ancient Greece.
The Stoics argued that what we see is an extension of what is actually there. Our brain is like a blank sheet of paper onto which nature writes. If we see a horse, for example, it is because nature (in the form of a horse) has impressed that in our mind. We see what is there. There is no distinction to be made between a horse and an image of a horse. The real horse is an image. And the image is a real horse. As strange as this sounds to the modern mind.
The Platonists, on the other hand, treated the image not as an extension of what is there but, at best, as a representation of what is there. At worst, an illusion. Or a deception. In Platonic philosophy the world is divided into reality on the one hand and images on the other. Images could inform us or equally deceive us. Behind the images was reality, which would be some sort of mind.
A typical example would be a mirage in the desert. If we mistake such a mirage for water this proves (or so it will be argued) that the image (the mirage) has deceived us.
The Stoics counter-argued that it is not the image which deceives us. It is our interpretation of the image (our mind) which is at fault. The Stoics' advice was to study the image (rather than our mind). For example, one could walk to where one thought there was water and see that there is no water there. The image we originally had is not in any way altered by this investigation. Rather, what is altered is how we interpret the image. Instead of describing it as water (or a representation of water), we can describe it in other ways, e.g. as the refraction of light in hot air.
But neither way of describing a mirage alters the underlying image. The image (a shimmer on the horizon) remains the same image.
A lot of this debate disappeared during the Middle Ages (or Medieval times).
And during the Renaissance, it would be the Platonic side of the argument that would be rediscovered and adopted, and determine the shape of the modernism to come. So much so that we barely know how to understand the Stoic position.
If we regard images as illusions, the images we create in that context will tend to confirm our bias in that regard. We will be drawn into the creation of illusions - such as the creation of a digital image pretending to be an analog one. Or an analog one pretending to be a digital one (the human figures in Tron 1). We have this compulsion to reinforce the idea that images are illusions. And we will tend to edit out anything that might interrupt this idea of an illusion.
The Stoic position is an alternative one. Importantly it is not just the flip side of the Platonic schema - as if images were to be understood as the reality behind images. In Stoicism there is no reality deeper than the image. The image itself is the primary reality. What is less real (so to speak) is how we might understand the image. In what way might we theorise it.
One way to theorise an image is to create an image rather than just talk about images.
And how we create such an image then expresses the theory we're otherwise proposing, but in terms of how we created it rather than how someone otherwise reads it. So if we create an image in such a way that it fools someone, then the theory we're elaborating is that images fool people - whether it actually does so or not (fool people) becomes irrelevant. The very intention gives away the thesis. Now one might very well hide one's intentions, in order to keep the spotlight on one's guinea pigs, but there is one guinea pig you can't hide information from and that is yourself.
Interesting discussion, thanks for posting this. For the most part the images are very close. I'm curious about the parts of the comparison test where there are larger differences in color and highlight rendering, specifically the flashlight shot and the night exterior. Seems to me that there is also some suppression of celluloid characteristics in the 35mm footage going on in the highlight halation of the red sensitive layer, though Yedlin mentions that he modeled for that in the Alexa footage.
Regardless, I like his focus on objective analysis through the scientific method. In light of our recent discussion here on lens fetishism in the 'favorite lens package' thread, I probably need to re-examine some of my emotional attachments to certain tools and dig a little deeper on why exactly I like them so much.
Apples and oranges. One cannot compare one-micron, three-dimensional, randomly dispersed film particles to a flat matrix of rigid seven-micron, two-dimensional pixels. There are over a trillion film particles per frame. Sometimes people confuse film grain with particles: grain is actually clusters of fundamental particles. Digital sensors are at best a binary approximation of an analog phenomenon. Check out the Nyquist theorem and MTF.
Edited by Nicholas Kovats, 10 January 2016 - 09:55 PM.
We're given two images, in the same domain (the digital) where all phenomenological differences have been intentionally erased, and asked to understand this as saying something about the difference (or lack thereof) between analog and digital images.
An artist can present us with an orange, and an apple painted orange, and suggest to us there is no difference. And we can certainly agree with that. We can say, "Yes, you are right - there is no difference."
But another artist could take the same apple, painted orange, and wash off the paint to reveal the apple, suggesting to us the opposite thesis. And funnily enough, we can also agree with that.
Resolving the difference between these two works shouldn't be that hard. But it often is. The answer is always to consider the bigger picture in which any picture is otherwise couched. Even if it requires imagining such. For it's easy to edit this big picture out. And to suggest there is no bigger picture. But if we want to be even a little bit scientific about this we have to edit some of that bigger picture back in. Even if it's just an imagined one.
But it's up to the artist, be they into elaborating difference, or rinsing it out.
Edited by Carl Looper, 11 January 2016 - 06:11 AM.
"I’m pretty confident on the concept that film is not made out of some sort of magic that can never be studied or understood but must forever be described in vague religious terms." - Yedlin
The term "magic" has a number of meanings. Today we have "magic shows", and "black magic" cameras, and Harry Potter, in relation to special effects, saying "I love magic". But far more interesting than contemporary use of the term is its historical usage. There is to be found, if anyone cares to look, a dense history of magic. Science finds its origin in magic. Science also finds its opponents in those opposed to magic. Magicians (scientists) will suffer at the hands of those who will be the arbiters of magic. And the arbiters will be the Church (the religious). The Church will decide between magic with which it agrees (such as angels) and magic it doesn't (women flying at night).
Magic is not superstition. It is not the superstitious who practice magic. Magic is a word used to describe something for which there is not yet a better description. In the case of the religious there is not any better way of describing magic: it is either the work of God or the work of Satan. For the religious the only question is which. For the magician this is not the case. There will be additional work to be done, which, today we call science.
Now there are at least two types of magic (two types of science). One is towards a description of magic in terms of language (such as mathematics). And the other is in terms of how one might do magic, which we can otherwise call technology. There is also a third type of magic (if not more) in which magic is not investigated in any way but used (or abused) in the description of something other than magic. For example, it might be used to describe an angel. Or an atom. It is based on the idea that magic represents something (such as God or Satan).
It is not incorrect to describe the analog image as magic. Nor is it incorrect to describe the digital image as magic. Digital images have only the virtue of originating within the very descriptions we might otherwise develop in relation to them, making it very easy to re-describe such magic. And very easy to make the mistake of describing the analog image using the same terms.
The question is how to describe the analog image in terms that are appropriate to it.
The main argument in favour of digital is ease of use. I use digital all the time for that very reason. And I'm sure almost everyone else does as well. Why shoot film?
From experience, although it requires a lot more work to obtain, the resulting "data" from film exposure (for want of a better expression) works so much better in post than that which is digitally sourced. The appropriate scientific disciplines to employ in this are both mathematics and statistics, where the latter is the significant one.
In mathematics there is no such thing as noise. In statistics, noise is that which it intimately takes for granted.
Now, noise is not something you would normally treat as a virtue. One would like to begin without any noise. But there is no such thing as a noiseless image. Furthermore, noise is not something added to an otherwise noiseless image. A signal on the one hand, and noise on the other, are only separable in concept, not in reality. The separation is based on the idea that there must be some mathematical explanation for the 'noise', i.e. that if we knew what the explanation was, there would be no true noise; there would be just those for whom the nature of the signal was unknown, only appearing as noise. They'd be calling it "noise" only because they didn't know to what signal it corresponds.
God does not play with dice, as one notable physicist said.
But there is a fundamental noise for which there is no mathematical explanation (no God), simply because mathematics is the incorrect language in which to describe noise. Mathematics is completely blind in this respect. If God is a mathematician he is a completely blind God. To understand fundamental noise we have to turn to statistics.
And in the context of such one will find why an image created by means of film exposure serves as a far better master than one created digitally.
But it all depends. If one is not at all interested in exploiting what film is capable of encoding (and subsequently decoding), then one need not shoot film. It's as simple as that.
And demonstrating one's indifference in the way Yedlin does simply demonstrates how one might express indifference. It doesn't in any way prove that a lack of indifference elsewhere must therefore be biased, misguided, superstitious, etc.
Edited by Carl Looper, 15 January 2016 - 07:32 PM.
When scanning noise/grain there is a phenomenon (visible effect) called "grain aliasing", where grain appears larger than it otherwise would be.
As you increase the definition of a scan, the apparent size of the grain shrinks! This is because what we're seeing in a scan is not the native grain but an "alias" of it. There isn't any known way of alleviating this grain aliasing other than by increasing the definition of the scanner. It has nothing to do with the definition of the signal component: one could start with a completely out-of-focus signal and the situation would be the same; one would still need to increase the definition of the scan to decrease the size of the grain aliasing.
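A rough numerical sketch of the effect, using purely synthetic noise (this is not a model of any real film stock or scanner, just an illustration of the statistics): a coarse scan that point-samples a fine grain field keeps the full grain variance, so at the scan's resolution the noise reads as blobs k times larger, while a scanner pixel that integrates over its aperture knocks the variance right down.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Grain": zero-mean synthetic noise on a fine grid, standing in for the
# film's native grain structure (illustrative only, not a real grain model).
fine = rng.normal(0.0, 1.0, size=(2048, 2048))

k = 8  # ratio of scanner pixel pitch to grain scale

# A coarse scan as naive point sampling: each scan pixel lands on a single
# grain value, so the full grain variance survives -- but it is now rendered
# at k times the physical size. That's the aliased, larger-looking grain.
point_sampled = fine[::k, ::k]

# A scanner pixel that integrates over its (larger) aperture instead:
# variance drops by roughly k*k, so the residual noise is far finer in tone.
aperture_avg = fine.reshape(256, k, 256, k).mean(axis=(1, 3))

print(round(float(fine.std()), 3))           # ~1.0
print(round(float(point_sampled.std()), 3))  # ~1.0 (full-strength aliased grain)
print(round(float(aperture_avg.std()), 3))   # ~1/k, i.e. around 0.125
```

The sketch agrees with the post's claim in spirit: what shrinks the apparent grain is more definition (finer sampling or a better-behaved aperture) at the scanner, not anything about the focus of the image signal itself.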
In large-gauge film the aliasing shrinks to a size at the boundaries of conscious perception, or below it (where it would otherwise subconsciously do its thing).
Now, this grain (be it aliased grain or something closer to the native grain) is not just some sort of effect added to a signal (as if the signal existed in some universe without this effect). It is a function of the way a signal is created in the first place. We otherwise speak of sampling a signal, i.e. as if the signal actually were noise-free in the first place. But this is just a way of speaking. And in this way of speaking, the fewer samples we have, the noisier the result. As we increase the number of samples, the less noise there is in the result. In the limit we will have created a pure signal. But where we begin is at the other end: in pure noise.
Edited by Carl Looper, 15 January 2016 - 08:27 PM.
Prior to film exposure we can postulate a continuous signal, where the signal is not in any way quantised in terms of depth. By depth I just mean the analog "equivalent" of bits per pixel. Or the "colour" if you like. So where we might otherwise talk in the digital domain about 1 bit images, or 8 bit images, or 24 bit images (etc), the analog signal would have a quasi-infinite number of bits per pixel (so to speak).
But as previously mentioned, this continuous signal is just a way of speaking. In practice (observation) we have instead photon detections. These detections are nevertheless statistically correlated with our otherwise postulated continuous signal. We are able to see the continuous signal despite the fact that it is seemingly composed of particle detections. The pattern of particle detections induces an apparition, if you like, and that apparition is of a continuous signal. Or a "ghost wave", as one notable physicist called it. And so clear are studies of this ghost wave, in controlled experiments, that we are able to formulate a precise mathematical function (a continuous function) to describe it, and to which the particle detections insist on precisely conforming! This mathematical function has infinite depth. Since the observable pattern conforms precisely to the mathematical pattern, the observable pattern also has infinite depth. But the observable pattern is not mathematically related to the mathematical function. It is statistically related. Or, more simply, it is noisy. But the more particle detections you obtain, the less noisy it becomes! In the limit (which you can never reach) you would make visible the mathematical function, i.e. as if the mathematical function were the reality and the image a description of it.
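The convergence being described can be sketched statistically. This is a toy sketch, assuming Bernoulli "photon detections" whose probability follows a smooth function I've made up; the actual physics is far richer, but the statistical point (the average of detections converging on the continuous function) survives:

```python
import numpy as np

rng = np.random.default_rng(1)

# A postulated continuous signal: a smooth intensity function on [0, 1],
# scaled so it can serve directly as a detection probability per exposure.
x = np.linspace(0.0, 1.0, 200)
signal = 0.5 + 0.4 * np.sin(2 * np.pi * x)  # values stay within [0.1, 0.9]

def averaged_detections(n_frames):
    """Accumulate n_frames of 0/1 'photon detections' and average them."""
    hits = rng.random((n_frames, x.size)) < signal
    return hits.mean(axis=0)

# The RMS gap between the detection average and the underlying function
# shrinks roughly like 1/sqrt(n): the continuous "ghost wave" emerges.
for n in (10, 100, 10000):
    err = float(np.sqrt(np.mean((averaged_detections(n) - signal) ** 2)))
    print(n, round(err, 4))
```

Each individual exposure is pure black-or-white noise; only the accumulated statistics trace out the continuous function, which is the "in the limit" claim above in miniature.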
But the mathematical function is the description. It describes what we see. It is not what we see that describes the mathematical function. We see a continuous signal - even in incredibly noisy images.
Now, given an area of film in which the image signal is out of focus, we will nevertheless have a signal (noisy or otherwise) which has infinite depth. So the more bits we allocate to that area, the better they will describe the signal we can otherwise see by looking at the film itself. One way to allocate more bits to a given out-of-focus area of film is to assign more pixels to it. The sharpness of the signal is not in any way increased (since the signal is out of focus), but the depth is increased. Its colour. And the noisiness of the signal actually helps to induce a more finely graded sense of the depth. Where we might have difficulty determining the colour at any single point (when masking out all other points), the moment we enlarge our mask to include the distribution of detections over an area, the more we are able to see (or "hallucinate") the colour at any single point. For example, we can see grey tones in an otherwise 1-bit image where each pixel is only black or white. We see the signal. We don't see the individual pixels except through an effort to do so (by blocking out all the other pixels in such an image). Film is quite nice in controlling this distribution of values, and the resulting hallucination, because it doesn't interfere with it. It has a statistically neutral appreciation of the signal, which it is able to transfer to digital, i.e. where an otherwise digital-only version of the same signal would produce the jaggies. In a sense, film can be regarded as a very good "pre-filter" stage for a digital signal. It is able to "modulate" the original signal in a way that helps the digital signal exhibit greater depth than it otherwise would (had one not pre-filtered it with film).
I otherwise call all of this "magic", but this is not to suggest any supernatural forces at work. On the contrary, it is completely natural. If we use the term "magic" in relation to this, it's to express in simpler terms what is otherwise visible to the eye. The technical elaboration of this magic is only required when one wants to exploit it further. If one is otherwise happy with just a straightforward appreciation of this magic, the term "magic" need not be clarified any further.
In each of the following images the number of bits per pixel is exactly one. One bit per pixel. Here we've rendered such a pixel (0 or 1) as black or white (although the reproduction here on the page may not be so clear)
The first image is close to how a one bit digital camera would reproduce the original image.
The others introduce some noise prior to what the one bit camera would see: 5%, 10% and 15%.
What we see is that as the noise is increased, we can see more shades of grey!
This is how originating on film, and scanning from such would differ from just scanning an image directly. One can get more shades of grey in the result - despite the fact that in all cases the final digital result (the deliverable) has exactly the same number of bits per pixel.
It's important to note that one can't introduce the noise after the fact; the effect doesn't work. You can't get what film gets by starting from a digital image. You have to intervene prior to it. While demonstrated here using one bit per pixel, the principle holds for any number of bits per pixel.
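For anyone who wants to reproduce the effect, here is a minimal sketch of the same idea. It is my own reconstruction, not the code behind the posted images: I use uniform dither because it makes the expected column averages exactly linear, whereas the images above used 5/10/15% noise.

```python
import numpy as np

rng = np.random.default_rng(2)

# A smooth grey ramp, black (0.0) on the left to white (1.0) on the right.
ramp = np.tile(np.linspace(0.0, 1.0, 256), (1024, 1))

def one_bit(image, dither):
    """Quantise to one bit per pixel. If dither is True, add noise
    BEFORE the threshold -- the ordering is the whole point."""
    if dither:
        image = image + rng.uniform(-0.5, 0.5, image.shape)
    return (image > 0.5).astype(float)

hard = one_bit(ramp, dither=False)     # bare threshold: a black/white step
dithered = one_bit(ramp, dither=True)  # noise first: averages track the ramp

# Average down each column to see which "greys" each version can express.
target = ramp[0]
print(round(float(np.abs(hard.mean(axis=0) - target).mean()), 3))      # large
print(round(float(np.abs(dithered.mean(axis=0) - target).mean()), 3))  # small
```

The hard threshold collapses the ramp to a step at mid-grey, while the noisy version's local averages recover the full ramp, even though both deliverables are exactly one bit per pixel. Adding the same noise after thresholding would not help, which is the "intervene prior" point.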
Edited by Carl Looper, 16 January 2016 - 04:20 AM.
Yedlin's expression of indifference requires that all of this be repressed: that one treat such insights as unimportant (by making a demo in which they have been repressed), and suggest there is some sort of unscientific bias, superstition, or prejudicial forensics going on in any alternative.
"If the attributes exist, they can be studied and described precisely without ambiguous, mystical, quasi-religious language. " - Yedlin
Yes, they can. But if you repress any such attributes in a demo, the demo must surely be compromising the task of describing such attributes.
"To insist that film is pure magic and to deny the possibility of usefully modeling its properties would be like saying to Kepler in 1595 as he tried to study the motion of the planets:“Don’t waste your time, no one can ever understand the ways of God so don’t bother. You’ll never be able to make an accurately predictive mathematical model of the crazy motions of the planets — they just do whatever they do.” - Yedlin
Yet Yedlin insists on repressing the very thing that Kepler himself would have been using in a study of the planets: a forensic approach. If we are to repress forensics, we would have to deny Kepler the use of optics in his work. By repressing forensics we might get away with suggesting there's no empirical difference between planets and gods:
"On the one hand, I believe that there are forensic clues in the demo that can be used to identify which is which to a trained eye, but on the other hand my whole point is that that doesn’t matter in the slightest — it’s the overall look and not the inconsequential trace forensic evidence that matters. " - Yedlin
Part of the problem is putting the onus on the last observer, in a complex chain of processes, as the correct arbiter. But it's not any one demo in front of any one group of guinea pigs that gives us insight into how we might otherwise mediate an image. It is work which includes forensics, thought, models and experiments aimed at teasing out information (rather than repressing it) that allows us to develop more interesting ways of mediating or otherwise creating an image.
"A priori belief has a long history of leading people far afield of the empirical truth. People's absolute certainty in a truth that they believe but refuse to put to the empirical test has yielded things like The Inquisition, but hasn’t yielded too many advances in human knowledge of the physical world." - Yedlin
Yes indeed, but on which side of the fence is Yedlin sitting? That of the scientist/magician, or that of the Inquisition?
Edited by Carl Looper, 16 January 2016 - 04:28 PM.
There is actually a big issue with the whole film and video thing which is not talked about much.
Film and video are very different processes but these days they are both usually shown digitally whether that be as a DCP file or perhaps on a DVD.
In theory the digital formats can only show data in a limited way. It's what Tyler is talking about earlier in this thread.
DCP and DVD represent the film as a digital file that can only do a certain combination of things. The fact that you can store a film on a DVD means you should be able to create the same data that looks the same way, as it is just 0s and 1s. If a video camera could create those 0s and 1s in the same way, it would look just like film scanned to DVD.
There is a big problem, however, that maybe Steve even hints at slightly: it's still really easy to tell the difference between something shot digitally and something shot on film, even when reproduced on a DVD or a YouTube video. I would argue that the film looks way better, although I know this is my judgement. What's more, I don't think we are getting closer to the way film looks; I actually think we are getting further and further away.
The original Varicam had colour and a look much closer to film than the Alexa is now able to achieve. The Sony F35 was closer than the F65. So if we are trying to make things look like film, then we are actually going backwards. The trouble is that I don't think the goal for video cameras is to make them look more like film anyway. They are generally trying to improve the specifications of the video, not to make it look like this older format.
There is another issue too. It was long a common thing for owners of Red cameras to say that raw meant they could make the image look like anything, including film, and yet the output of movies shot on Red cameras never remotely looked like film. In addition, the Red and Alexa cameras both impart a specific look to their images. You might be able to move it about a lot, but it still has a tendency in a certain direction.
There is an Alexa look and there is a Red look and they are somewhat different and both of them are a gulf away from the way film looks.
It may be possible to make video cameras that have a look a lot closer to film in theory but in practice is this actually going to happen?
What Steve talks about is mostly what is theoretically possible.
I haven't been able to see the demo, and the fact that he has been working hard on a process to try and make video look more like film kind of underlines everything I'm saying. I'm impressed that he has decided to use a camera with such a well-known look as the Alexa, so it will be interesting to see what he has come up with.