I'm officially employed again as of Friday. It's basically a puny, underfunded company that was started by picking up the remains of the last company at its Chapter 11 proceedings. This will be interesting.
I gave them a list of my requirements (computers and four weeks vacation) and they accepted it. So at least I won't be using an antique computer like last time. Not that there's anything wrong with an antique. I'm one myself.
The new TV standard uses MPEG-2 to compress video. I think this is going to have serious negative effects upon cinematography.
MPEG, as most of you know, does inter-frame compression, and it works best when two adjacent pictures in the video are similar. On a static image with lots of detail, the detail becomes crystal-clear almost immediately. By contrast, if there's lots of stuff happening dynamically in the video, the compression loses so badly that the blocks in the video become visually apparent. MPEG subdivides the picture into individual coding blocks, and when there's a lot going on, about all the encoder can do is tell you what each block's color and brightness should be.
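The principle can be sketched in a few lines of toy code. This is not real MPEG (no DCT, no motion vectors, just hypothetical pixel lists), but it shows why a static scene is cheap to code and a busy one isn't:

```python
# Toy illustration of inter-frame coding -- NOT real MPEG, just the idea.
# A "frame" here is a flat list of pixel values; we encode each new frame
# as only the (index, value) pairs that changed since the previous frame.

def encode_delta(prev_frame, frame):
    """Return just the pixels that differ from the previous frame."""
    return [(i, v) for i, (p, v) in enumerate(zip(prev_frame, frame)) if p != v]

static_a = [10, 20, 30, 40, 50, 60, 70, 80]
static_b = [10, 20, 30, 41, 50, 60, 70, 80]   # nearly identical: one pixel moved
busy_b   = [80, 10, 55, 12, 99, 3, 44, 70]    # everything changed

print(len(encode_delta(static_a, static_b)))  # 1 -- almost free to code
print(len(encode_delta(static_a, busy_b)))    # 8 -- costs as much as a full frame
```

When nearly every pixel changes, the "difference" is as big as the frame itself, and a real encoder in that situation has no choice but to spend its fixed bit budget coarsely, block by block.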
So what does this do to cinematography?
Well, I've seen horrific shows like Boohbah (a kids' show which is roughly Teletubbies on a bad acid trip, which is really saying something when you consider that Teletubbies is pretty weird to start with). Being a kids' show, there are segments where an actor and a few props are messing about in the frame. And to keep things calm for preschoolers, the shot is outdoors with scenery, with a fixed camera position and zoom, and few scene cuts.
This works fabulously in HDTV. The scenery, which was pretty to start with, is just amazing. You see incredible details in the trees, beach, plants, desert, or whatever in the background. And with just the actor and props moving about there's plenty of bits available for coding all that nicely enough.
I've also seen stuff like the darkened-room-in-a-lightning-storm kind of shot. Since MPEG doesn't have a way of coding "change the DC brightness by X", when the lightning flashes you are treated to a room full of blocks. Oh yeah, that looks sooo good. The same thing happens for explosions where the brightness changes dramatically -- it just looks blocky.
The principle here is a bit more broad than just simple brightness explosions and flashy lights. The Smallville title sequence (ya know, the part with the theme song and all that) now has meteors flying about, fast cuts, multiple video sequences being layered into different locations on screen simultaneously, and all that. It looks bad. Really bad. The first meteor comes in, and as it gets close it becomes too large and dynamic, and turns to blocks.
DVDs don't have this problem as badly because their bitrate is more flexible. If they need more bits for a lightning storm sequence, they can steal those bits from some quieter part of the movie. Because HDTV is broadcast over a bitrate-limited medium, the encoder can only steal bits from the few seconds surrounding the bit-intensive scene.
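Here's a toy sketch of that bit-budget difference, with made-up numbers (the scene demands and the 3 Mbit/s rate are hypothetical, just for illustration):

```python
# Toy bit-budget sketch with hypothetical numbers: a DVD-style variable-bitrate
# encoder can shift bits from quiet scenes to a lightning flash; a fixed-rate
# broadcast encoder hands every second the same budget, needed or not.

scene_demand = [2, 2, 9, 2, 2]        # Mbits each one-second scene "wants"; scene 2 is the flash
total_budget = len(scene_demand) * 3  # 3 Mbit/s average in both cases

# VBR (DVD-like): spend the shared budget in proportion to demand.
vbr = [total_budget * d / sum(scene_demand) for d in scene_demand]

# Fixed rate (broadcast-like): flat allocation every second.
cbr = [3] * len(scene_demand)

print([round(b, 1) for b in vbr])  # the flash gets the lion's share
print(cbr)                         # the flash gets starved at 3 Mbits
```

Same average bitrate in both cases; the difference is purely in where the bits are allowed to go.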
So the future? If HDTV really becomes predominant as I expect, I'd bet cinematography and editing will change. Fewer scene changes. More static shots so you can see the background. Say goodbye to the use of darkened rooms with flashy lights in Halloween screamfests.
It looks to me like pretty much every proposition is going to be defeated. That at least is a bit of a victory — there has been far too much interference-by-proposition in the state government. We typically have a dozen propositions in each General Election. Maybe this will discourage every Tom, Dick, and Enron from financing an initiative every time the whim strikes.
So I did my civic duty today and voted. But looking over my votes, I'm puzzled.
When, exactly, did I become a Republican?
Are there any solvents that can wash that off?
The complete list of hurricane names for the Atlantic season has been used up. The next storm, forecast to form tomorrow, will be Tropical Storm Alpha, as they now go through the Greek alphabet for names. This amuses me immensely.
The current list only had enough names to match the most storms ever seen (which happened during the 1930s). You'd think prudence would dictate a margin of error -- allocating a few names beyond the worst season known.
In programming, we normally try to avoid fixed-size allocations that could be theoretically exceeded. I think this software principle should be hereinafter known as the "Hurricane Alpha Principle".
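A sketch of the principle, with the hurricane list as the running joke (the name lists here are truncated samples, not the real National Hurricane Center rosters):

```python
import itertools

# The "Hurricane Alpha Principle": don't hard-code a list sized to the worst
# case ever seen -- provide an overflow path so the allocation can't run out.
# Sample names only; the real seasonal lists are longer.

PLANNED = ["Arlene", "Bret", "Cindy"]        # the pre-allocated seasonal list
GREEK = ["Alpha", "Beta", "Gamma", "Delta"]  # the overflow scheme

def storm_names():
    """Yield planned names first, then fall back to Greek-letter names."""
    yield from PLANNED
    yield from ("Tropical Storm " + g for g in GREEK)

# The fourth storm of this toy season overflows gracefully instead of failing:
print(list(itertools.islice(storm_names(), 4)))
# ['Arlene', 'Bret', 'Cindy', 'Tropical Storm Alpha']
```

The generator never runs dry until both lists are exhausted, which is the whole point: the fixed-size part is just a fast path, not a hard limit.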
The overnight weather forecast amuses me enormously. I told the littleuns about it, and now the four-year-old is afraid there will be thunder, which she's apparently heard is supposed to scare her.