


Phil Rhodes

Member Since 26 Dec 2003
Offline Last Active Today, 04:47 AM

Posts I've Made

In Topic: 4k Ultra HD rant

Yesterday, 12:47 PM

There are standards to meet, but in terms of creative intent, including critical colour, there isn't really much to say. Getting things past a distributor's quality control process basically means getting them past someone the distributor trusts, who sits and watches it intently, and yes, that can absolutely be done on a desktop computer monitor (I've done it, and could refer you to other people who have done it).

 

You have to keep some understanding in mind of what they're looking for, which is obvious technical problems: dead pixels on cameras, compression artefacts, focus errors or obviously soft shots from a GoPro or whatever, skies being green, or excessive noise, clipping or crushing. Otherwise, the reality is that as long as it's in the ballpark, the microscopic few-delta-E shades that everyone obsesses over simply will not be noticed and will not cause anything to fail any sane QC, because it isn't wrong, it's just perhaps not precisely what you intended. That's not great, but it doesn't make your project unsaleable.

 

These days, more or less anything from the midrange of desktop monitors upward, plus a basic calibration with an affordable probe, will get you more than close enough.

 

P


In Topic: 4k Ultra HD rant

Yesterday, 03:57 AM

Which, to me at least, says that even though those films weren't shot with HDR in mind, you could, in theory I suppose, create an HDR of nearly every commercial film.

 

You could. I'm sure they will, if it gets any traction, and they'll want you to buy it again. The usefulness of this is currently dubious in a situation where even TVs advertised as "HDR ready" might struggle to exceed about 600-700 nits, which is practically as bright as many computer monitors anyway.

 

Which leads one to ask, why not do that in the first place?

 

If you want the technical answer: because the displays wouldn't have been able to deal with it; it would have been like watching a log picture on a normal TV. You could ask why they don't put some simple hardware in the player to crunch the highlights down with something like a 709 curve, so that the HDR signal looks reasonable given what your TV is actually capable of, and that's basically what Dolby Vision does.
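Just to make the "crunch the highlights down" idea concrete, here's a rough Python sketch of a simple highlight roll-off. The knee, the peaks and the curve itself are all made up for illustration; this is nothing like Dolby Vision's actual processing or the real Rec. 709 transfer function.

```python
# Purely illustrative sketch of a highlight roll-off, with made-up parameters;
# this is NOT Dolby Vision's processing or the Rec. 709 curve.
# Values are relative luminance with 1.0 = SDR reference white.

def roll_off_highlights(x, knee=0.8, display_peak=1.0, source_peak=10.0):
    """Pass the signal through below the knee, then squeeze everything
    between the knee and the source peak into the remaining headroom,
    reaching display_peak with a soft shoulder."""
    if x <= knee:
        return x
    headroom = display_peak - knee                 # output range left above the knee
    overshoot = (x - knee) / (source_peak - knee)  # 0..1 across the compressed region
    return knee + headroom * (1.0 - (1.0 - overshoot) ** 2)

for v in (0.5, 1.0, 2.0, 5.0, 10.0):
    print(f"{v:5.1f} -> {roll_off_highlights(v):.3f}")
```

The point is only that everything up to a knee passes through untouched and everything above it gets progressively squashed into whatever headroom the display has left, which is the general shape of what the player-side processing is doing.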

 

HDR is nice but current consumer TV attempts at it, in my view, are not good enough to make it worthwhile.


In Topic: 4k Ultra HD rant

15 November 2018 - 06:06 PM

The thing is, a lot of what George is saying is completely correct.

 

For instance:

 

I'll even venture to say that it probably looks better now than the raw footage straight from the negative (if that's possible).

 

It is quite possible. Assuming the 4K was scanned from the original camera negative, it'll be sharper than the print-from-interneg-from-interpos-from-negative you saw on the 35mm prints at the time of release. Conventionally made release prints like that tend to have a resolution well under 2K. The 4K probably won't have the same wide colour gamut as film, but that's not hugely significant on most material.
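For a sense of why the duplication chain loses so much, here's a back-of-the-envelope sketch using the common 1/R² rule of thumb for combining stage resolutions. The per-stage figures and the frame width are purely illustrative assumptions, not measurements of any real stock or printer.

```python
# Back-of-the-envelope sketch of resolution loss through a duplication chain,
# using the common 1/R^2 rule of thumb for combining stages.
# Every number here is illustrative, not a measurement.

def chain_resolution(stage_lp_mm):
    """Combine per-stage resolutions (line pairs per mm) into a system figure."""
    return sum(1.0 / r ** 2 for r in stage_lp_mm) ** -0.5

# Hypothetical figures: camera negative (incl. lens), interpositive,
# internegative, release print (each including its printing step).
stages = [80.0, 60.0, 60.0, 50.0]
lp_mm = chain_resolution(stages)

# Assume roughly 21 mm of usable frame width; 2 pixels per line pair.
print(f"system: {lp_mm:.0f} lp/mm, roughly {2 * lp_mm * 21:.0f} pixels across the frame")
```

Even with fairly generous per-stage numbers, the system figure lands well under 2K across the frame, which is the point being made about conventional release prints.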

 

Of course, lots of OCNs probably don't resolve 4K either, given the lenses of the time and so on, but that's another issue.

 

But the general thrust of this discussion seems to be that improving technical quality beyond a certain point isn't that helpful, which in my view is absolutely true.

 

I'm not sure about the technological progress of the 90s being uninteresting, though. I think tech progress, particularly in computing, has become less interesting right now, specifically because computers are now more than good enough to do most of the jobs they've become established in doing. In the 90s, big performance gains were common, and we watched things like nonlinear editing become possible, then affordable, then affordable and slick, over a matter of a few years. When HD came along, one of the downsides was that it made our workstations feel slow again. Same with 4K, and that's happened at a time when the tech progress is much slower.

 

But in general, I'm starting to feel the same way. I turned 40 a few days ago, so I'm not quite ready to describe myself as older, but I'm starting to tire of pixel peeping and tech porn in much the same way. Maybe it's a concomitant of middle age.


In Topic: Voltage & Wattage Question

11 November 2018 - 03:25 AM

You can even do it with MR-16s, which are most often 12V each.

 

Horrifically inefficient, by modern standards, but it works.
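The thread context isn't quoted here, but assuming the usual series-string approach (wiring low-voltage lamps in series until they add up to mains voltage), the arithmetic looks something like this; the lamp wattage and mains voltage are just example figures.

```python
# Purely illustrative arithmetic, assuming a series string of MR-16s across
# 120 V mains (an assumption; the original thread isn't quoted above).

lamp_voltage = 12.0    # volts per MR-16, nominal
lamp_power = 50.0      # watts per lamp, a common MR-16 rating
mains_voltage = 120.0  # volts

lamps_in_series = round(mains_voltage / lamp_voltage)  # 10 lamps
string_power = lamps_in_series * lamp_power            # 500 W
string_current = string_power / mains_voltage          # about 4.2 A

print(f"{lamps_in_series} lamps in series, {string_power:.0f} W, "
      f"{string_current:.1f} A; one dead lamp takes out the whole string")
```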


In Topic: Voltage & Wattage Question

06 November 2018 - 06:51 PM

Yes.

 

My thoughts: I wish that Iwasaki still made their 150W PAR36 metal halide stuff; it does this sort of thing so well, and at about 25% of the power.

 

If you wanted to homebrew it, it's increasingly plausible to build your own LED units at this sort of power level, depending on what sort of radiation pattern you wanted.

 

P

