I noticed while I was doing post on this that I should’ve recolored the shuttlebay’s lights red, to match the movie. And I just realized that the other shuttle was in the bay during this scene. Oh, well. Next time.
Hooray! This is my first image made since I’ve returned to the world of the gainfully employed, and I think you can see that I was a bit tired and desperate when I sat down to lay it out (the decision to plop in an HDRI photo background rather than lighting the scene at all was particularly lazy). I will defend the composition with the fact that my square images are meant to be used as phone and tablet wallpapers, so I give a little more “safe area” on the edges since they’ll be cropped down in those uses.
I’m trying to figure out how I’m going to make it through the second half of this project now that my days are no longer my own. The first idea I’ve come up with is to lay out several scenes in advance over the weekend, so I only have to worry about rendering and post-work during the week. We’ll see how that works over the next couple of days.
This image is based on a painting by Andrew Probert, depicting the launch of the Enterprise-D.
An effect I wanted to try out is something called a “split diopter”: a half-lens mounted over part of the camera lens that acts as a sort of camera bifocal, for when you want two (or more) subjects at different distances to be in focus at the same time.
I revisited one of the passes I did for my 50th Anniversary picture to get better acquainted with linear lighting. Light mixes and builds differently in linear space, so I had to readjust the intensities of all the scene lights (and then further play with the contrast in Photoshop to get some of the drama back). I’m going to have to practice a lot more to get things down. Surface settings react differently, too. That’s most apparent on the inboard grill of the warp engine, where you can see the blue glow reflecting from its twin on the other side of the ship.
At the urging of Rhys over at Foundation3D, yesterday I finally figured out how to adapt my workflow to use linear color.
There are many explanations online for what a linear workflow entails. It’s fairly technical, but the upshot is that computers gamma-encode images to get them to look right on a display. That’s great, except when you process those images, say by using them as texture maps in a 3D rendering calculation. All of the renderer’s math is thrown off, because it’s doing lighting arithmetic on data that was encoded for display rather than on actual light values, and things come out looking wrong. Everything is wrong!
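To make the problem concrete, here’s a minimal sketch in Python (not part of my actual pipeline, just an illustration using the standard sRGB transfer curve) showing why mixing colors in display-encoded space gives the wrong answer:

```python
# Sketch of why a linear workflow matters: sRGB images are
# gamma-encoded for display, so a renderer has to decode them
# to linear light before doing any math on them.

def srgb_to_linear(c):
    """Decode one sRGB channel value (0-1) to linear light."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    """Re-encode a linear light value (0-1) back to sRGB for display."""
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

# A 50/50 blend of black (0.0) and white (1.0):
wrong = (0.0 + 1.0) / 2  # averaged in gamma space: 0.5, too dark
right = linear_to_srgb((srgb_to_linear(0.0) + srgb_to_linear(1.0)) / 2)
print(round(wrong, 3), round(right, 3))  # 0.5 vs ~0.735
```

The linear-space blend comes out visibly brighter (~0.735 instead of 0.5), which is the physically correct result; that difference is exactly why my old light intensities all needed readjusting.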
A panicky, last minute render. I ended up recalling a shot from the Star Trek: The Next Generation episode “The Best of Both Worlds.”
I intended to try doing my post work in After Effects rather than Photoshop, but that didn’t work out (apparently, After Effects, 5K frames, and eight-year-old laptops don’t mix).