You Don’t Know How Lucky You Are!

Well, okay, most of you probably really do. Anyway, I’m talking about just one little corner.

Anybody in love with film, whether all excited to be discovering this fascinating retro medium or still working happily with old favorites, might not want to read further. However, I do want to say explicitly that I wish you well, and hope you achieve what you want with your chosen medium.

So, now, here it is:

[Film crop]
[Digital crop]

Any questions?

The film image is from March of 1994 and was shot on Kodak “Gold 200-2 5096,” according to the edge marks. It looks like the roll was processed at Proex. Given the date, I probably shot it with an Olympus OM-4T (I decided to switch to AF and got the Nikon N90 that fall). The negatives were scanned on a Nikon Coolscan 5000 ED at full resolution, with no grain reduction, and then the curves were adjusted to make the picture look decent.

So, here’s the full film image (click through for somewhat bigger version):

The above crops are both “100%”, i.e. 1:1 pixels from my digital file (as are the full-size versions you get when you click through above).

Not horrible, certainly; zooming in to “100%” is what gives pixel-peeping a bad name, and is rarely good for anything except making relative comparisons.

Here’s a digital example, taken earlier this month with my D700. It was shot at ISO 200, just like the film. I made minor curves adjustments to make it look a bit better. The full-size version, sized down (click through for 900-pixel version):

And that is why I personally am not a fan of film, at least in comparison to digital.

You really don’t want to see the comparison at ISO 1600.  I scanned some Ektapress 1600 Professional (PPC) last night.  Scary boulder grain! Amazing electric blue sparklies through all the shadows!

A Major Transition

Sometime in the last decade or two we crossed over something that would have been viewed, in 1970 say, as a huge transition.

In fact, we hardly noticed as we crossed over it.

When I started developing software professionally, I wrote assembly language—actual machine instructions, with syntactic sugar to make them easier to read and remember, like mnemonics for op-codes, and the ability to assign names to memory locations.

From there, things progressed to compiled languages, which turned statements in something like Fortran or Cobol or C into actual machine instructions.
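
Just to make the compile-then-run idea concrete, here’s a toy sketch of my own, in Python, with a made-up PUSH/ADD/MUL stack machine standing in for real machine instructions: the program text gets translated once into a flat list of instructions, and only that list ever gets executed.

# Toy illustration of compile-then-run: translate the expression once into a
# flat instruction list for an invented stack machine (a stand-in for real
# machine code), then execute that list.
import ast

def compile_expr(node, code):
    """Emit pretend stack-machine instructions for one parsed expression node."""
    if isinstance(node, ast.Expression):
        compile_expr(node.body, code)
    elif isinstance(node, ast.Constant):
        code.append(("PUSH", node.value))
    elif isinstance(node, ast.BinOp) and isinstance(node.op, (ast.Add, ast.Mult)):
        compile_expr(node.left, code)
        compile_expr(node.right, code)
        code.append(("ADD",) if isinstance(node.op, ast.Add) else ("MUL",))
    else:
        raise ValueError(f"unsupported construct: {node!r}")
    return code

def run(code):
    """Execute the compiled instruction list on a simple value stack."""
    stack = []
    for op, *args in code:
        if op == "PUSH":
            stack.append(args[0])
        else:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b if op == "ADD" else a * b)
    return stack.pop()

program = compile_expr(ast.parse("2 + 3 * 4", mode="eval"), [])
print(program)       # [('PUSH', 2), ('PUSH', 3), ('PUSH', 4), ('MUL',), ('ADD',)]
print(run(program))  # 14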

There were also interpreted languages around—BASIC being the most famous one (well, or LISP, I guess).  They didn’t make your program into machine instructions at all; instead they just parsed and executed it directly.  This was slower, but easier to implement, and much easier to port to another environment.
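
And here’s the interpreted version of the same thing, for contrast (again just a toy illustration of my own): nothing resembling an instruction list is ever produced; the evaluator walks the parsed program and computes the answer as it goes, which is a big part of why interpreters were so much easier to write and to port.

# Toy illustration of direct interpretation: parse the same expression, then
# walk the tree and evaluate it immediately.  No instruction list, no machine
# code; just parse and execute.
import ast

def evaluate(node):
    """Evaluate a parsed expression node on the spot, producing no instructions."""
    if isinstance(node, ast.Expression):
        return evaluate(node.body)
    if isinstance(node, ast.Constant):
        return node.value
    if isinstance(node, ast.BinOp) and isinstance(node.op, (ast.Add, ast.Mult)):
        left, right = evaluate(node.left), evaluate(node.right)
        return left + right if isinstance(node.op, ast.Add) else left * right
    raise ValueError(f"unsupported construct: {node!r}")

print(evaluate(ast.parse("2 + 3 * 4", mode="eval")))  # 14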

Interpreted languages were widely viewed as toys (largely because most people had no idea how big some of the systems built in LISP were).

Well, sometime over the last couple of decades, I’m pretty sure we crossed the boundary, and most new lines of code written today are in interpreted (or at least not natively compiled) languages—mostly Java and C#, which run as bytecode on a virtual machine, though there’s plenty of Perl and Python and Ruby being written.

And it’s passed almost completely without remark.

(Closely based on a comment I made in rec.arts.sf.written)