"Momma, don't take my Kodachrome away!" -- PaulSimon?
As of 30 December 2010, the last place on earth that processed Kodachrome completed its last roll and shut down the line. This came after Kodak ended production of the film on 22 June 2009. If you have any exposed but unprocessed Kodachrome on hand, you officially have zombie film.
While claiming that *all* chemical photography is obsolete is silly, the long-term (think 10 years) prospects of 35mm are not good.
[moved here from ZombieTechnologies]
Cameras - "Chemical" vs. Digital [although the actual term should be "film" camera]
Really? Where can I buy a $100 (US) digital camera with the resolution of 35mm film?
- Chemical analog cameras
- obsoleted by: Digital Cameras
- Counter query - How can I take 100,000 film pictures for less than $2000? -- MikeWarot
- but looking closer: Analog allows control and resolution.
- but looking again: Film has already been obsoleted by CCDs in astronomy (for which CCDs were invented in fact). WithinTwentyYears, the mass market will catch up to astronomy.
Where can I buy a $100 (US) digital camera with the resolution of 35mm film? Vinyl is obsoleted by CDs because CDs are universally available for essentially the same price. If it costs me $2000 for a hi-res digital camera, but $100 for the same res chemical camera, the chemical one is not obsolete.
There's another issue that comes up. The digital camera from 3 years ago is inferior to the digital camera of today available at the same price. Too bad that means you have to replace the entire camera. Back in the 80s, Kodak came up with a new emulsion technology they branded T-MAX. Instantly, any camera ever made that used 35mm, 4x5, or 120/220 film got an upgrade. One digital camera where the back with CCD is a separate piece that can be changed out is the Hasselblad. Try pricing that one.
Also, what's available in digital cameras with interchangeable lenses? Hundreds of new and used film cameras available at all price ranges but the very cheapest have lenses you can change. Lens technology improvements + film improvements mean that a camera made in the 50s -- or even earlier -- can perform as well as a newly-minted model fresh off the assembly line.
More disposable technology. --StevenNewton
Total Cost Of Ownership
Digital cameras are not generally more expensive than chemical cameras. Their purpose is to take pictures, and producing a picture the chemical way can be (and generally is) much more expensive. You have to consider these costs; the cost ratio is then a function of how many pictures you'll take over the life-span of the camera.
This has already been beaten to death on rec.photo.digital. I have a 2.1 megapixel digital, and several reasonable 35mm cameras. The digital cost $450 a year ago, plus probably the same amount again for more memory, accessories, CD-burner, blank CDs, high quality printer paper, colour printer etc. However, I have taken over 1000 digital pictures since then, and maybe shot one roll of 35mm. For me, 35mm is obsolete. Your mileage may vary on this one of course, but digital camera resolution is getting better, and prices are going down, and neither of these are happening for 35mm. In the sciences, digital imaging, especially due to the linear response to signal, made chemical film a niche use item several years ago. -- AndyPierce
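The break-even arithmetic behind this kind of claim is easy to sketch. The prices below (a $450 camera plus roughly the same again in accessories, versus about $15 per roll purchased and developed) are illustrative assumptions in the spirit of the figures above, not exact quotes:

```python
# Rough break-even sketch: at what shot count does a digital camera pay
# for itself versus film? All prices here are illustrative assumptions.

def film_cost(shots, camera=100.0, roll=5.0, develop=10.0, exposures=36):
    """Total cost of shooting `shots` frames on a film camera."""
    rolls = -(-shots // exposures)  # ceiling division
    return camera + rolls * (roll + develop)

def digital_cost(shots, camera=450.0, accessories=450.0):
    """Total cost of shooting `shots` frames digitally (per-shot cost ~ 0)."""
    return camera + accessories

def break_even():
    """Smallest shot count at which digital becomes cheaper than film."""
    shots = 0
    while digital_cost(shots) >= film_cost(shots):
        shots += 36  # step one roll at a time
        if shots > 100_000:
            return None  # never crosses over under these assumptions
    return shots
```

With those assumptions, digital pulls ahead after roughly 54 rolls (about 1,900 frames). Change the prices and the crossover moves, which is exactly why mileage varies here.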
Update on digital camera technology (Dec 2002): Foveon's X3 digital sensor provides resolution equal to or surpassing 35mm film. Initial consumer camera will be the Sigma D-SLR for around $3000. See http://www.dpreview.com/news/0202/02021103foveonx3preview.asp
[Update on digital SLR prices: Canon has begun a national TV ad campaign in the USA to sell the EOS Digital Rebel SLR (6.3 megapixel) with a medium zoom for just under $1000. Yowza! My next camera.]
With my chemical reflex cam, I use up two 36 shot rolls for one 'good' picture I'd like to keep, on average. I have to have them all developed so I can find a good one. Often others are also usable, but need digital correction. I have to scan them, manipulate them, and print them out again. With my digital cam, I take many more shots, because shooting costs essentially nothing, and I also keep more in my archive, because archiving and retrieving is much more convenient and also costs essentially nothing. As a result, I end up with more 'good' pictures. Most of them stay in the digital domain anyway, but when I do need a hardcopy, that's no problem either. True, inkjetting is expensive, but shops who develop on real photo paper from digital media are becoming ubiquitous and prices are getting ever lower. I still sometimes use different chemical cams, because the results look different. -- NeKs
One individual, or one industry, migrating from one technology to another does not make for obsolescence. As a chemical analog photographer and a programmer, let me give a counter example. I installed Linux on my home machine in 1998. The last time I used a version of Microsoft Windows at home was to run a US income tax preparation program in early 2000, but otherwise I have done all my work in 3 years in Linux. In February of 2001, I put together a new top-end computer from components and installed Linux - the machine never had a Microsoft Operating system. In Open Source development, Linux is the primary development platform. Yet whatever your personal biases may be, it would be absurd for anyone to assert that this makes Microsoft operating systems obsolete zombie technology. -- StevenNewton
Heh. The comments above tickle me. I'm a chemical photographer, I work mostly with medium and large format. How soon until I can buy a digital camera that can capture the amount of image information that my 4X5 loaded with ASA50 can for the $400 that it cost me? By the way, silver-halide negatives and prints (the kind I mostly do) aren't analogue, really. Dye based media kind of are, but a silver grain is black or it isn't. On the other hand, all the image processing is done with analogue computers.
That argument is invalid. The world is made out of atoms, each of which can be in only a discrete, not continuous, quantum state (skipping issues of quantum superposition, or if you prefer, after measurement collapses the wavefunction). Your argument would equally well lead to the conclusion that digital computers are actually analog computers.
No, the correct conclusion is that macroscopic analog and digital phenomenon are a matter of the macroscopic states exhibited, not of the systems they are reducible to.
In other words, silver-halide film is indeed analog, not digital; it captures what appears to be a continuous scale of grey tones. Furthermore there are no digital artifacts such as Mach banding. So long as you're using it in the linear portion of its S-curve response, of course. -- DougMerritt
What about film grain? There are limits to the resolution one can achieve with any medium, and film's limits are hit much sooner than solid state sensors as soon as you get out of the nominal light exposure curve.
The other thing to consider is camera quality. Sure, film has higher resolution than digital, but the optics of most consumer cameras aren't good enough to take advantage of that. I find that my 5-megapixel Nikon digital takes better pictures than my 35mm Yashica or my 24mm (APS) Canon Elph, both of which are quality cameras. -- StefanVorkoetter
"Blurring is bad." "You're wrong." "No, you're wrong."
Photography is a balance of science and art, in its nature, its physical limitations, and its expression. While some photographers use their cameras for artistic purposes (for their emotive and expressive capabilities), others use photography as a recording or timeline tool. Whichever way you view this, be aware that others may not use photography the same way you do. Below is an exchange between two people who have different expectations of what photography should be used for and how.
One significant difference is that depth of field is very different between digital and chemical cameras (at least, for most "affordable" ones). Basically, for a given aperture setting, things will be in focus over a _much_ wider range of distances with a digital camera. In some respects this is good: it's easier to get your whole picture in focus, and you can work with wider apertures (i.e. let light into the camera "faster"). But in some respects it's bad: with a digital camera it is very hard to deliberately get some of the image, e.g. the background, out of focus. Of course, the way a digital photographer would blur part of an image (or apply numerous other special effects) is to take a clean picture and perform the effect in Photoshop.
Not really. Blurring done in PS doesn't look the same as the kind of blurring you get from a good lens. Things farther away are blurrier than close objects when done in-camera. You might be able to do this in PS, but why bother when it's easy to take the picture in its desired form? Conversely, if you want it all sharp, use a smaller aperture. The point is, a good camera gives you control over the picture, instead of making everything sharp whether that's what you want or not.
From here it is hard to salvage a meaningful conversation that is relevant to the topic. I think moving the following argument to another (OffTopic) page would help... or just deleting it entirely. It's semi-flameful anyway.
- This raises the question of why anyone would ever want to create blur in a digital picture. It's not like visual blur represents anything in reality or human perception. It merely represents an artifact of clearly inferior technology. And why in hell would anyone want to hold on to inferior and outdated technology? HorselessCarriageThinking no doubt.
- Photography-as-art does lots of things that aren't "realistic". Why do photographers and (especially) cinematographers always use blue filters, for instance, when shooting scenes in New York City (or some other location which is intended to be NYC)? Blur in photography is an established artistic convention; trying to justify artistic conventions with scientific arguments/explanations is often pointless.
- Bad example. They use blue filters to shoot NYC to make it more cold and depressing. They use red filters in California to make it hotter and cheery. Nothing could be simpler.
- I know why they do it; that's not the point. The point is that arguing about artistic decisions and conventions on a purely scientific basis is silly. If the artist wants to defocus the background, it's up to him and him alone; others shouldn't berate him for it on the grounds its unscientific. (One may, of course, evaluate the quality of his art, but arguments based on scientific correctness probably don't come into play there).
- I wonder who uses defocusing for valid aesthetic reasons? Or whether they use it for invalid reasons like making explicit reference to known conventions? Most uses of monochrome are to draw attention to the conventions of cinema; "this is an olde-style moving picture". Heavy-handed references to other movies aren't indicators of a good movie. And it doesn't get any more heavy-handed than making your movie monochrome. Except of course for blurring, which is even more intrusive than monochrome.
- Blurring is often used by amateur photographers who don't know enough about composition. Blurring is like putting a big flashing neon sign around the object you want to draw attention to, or painting that object in dayglo neon colours. It's artificial. The old technology made it inevitable, which made it seem natural to generations of people habituated to it. Now that we're no longer limited by that technology, people still employ these techniques and call them aesthetic. It is hard to change people's minds when they are so used to the old technology.
- The decision on aesthetic grounds should be an issue for photographers to resolve. And as with many things in art, there may be more than one acceptable answer. Why should photographers all adopt the same technique? Should technique, as opposed to the end result, even matter?
- Blurring may be easy to do so any photography student can do it - but so what? Just because something is taught in photography 101 doesn't make it off-limits or inappropriate for serious photographers. The most important elements of a photograph - subject/scenery, lighting, etc. - have nothing to do with the camera anyway. I'm not necessarily disagreeing with you - I'm asking "who cares"?
- Of course technique matters! It affects the end result. The distinction between means and ends is specious.
- Everything it seeks to achieve, drawing attention to an object, is better achieved using other means. Blurring is used merely because the other means are much more difficult to master.
- Depth of field is a valid compositional tool. It can be abused, but it can also be used well. You raised the question of "why" and it was answered. The motivation is irrelevant.
- Using a shorter depth of field is a standard way to draw the eye to a subject. Visual blur simulates the effect of the macula in human vision. We can only see fine detail in a small part of our field of vision.
- Which effect is never seen in practice because the human eye constantly zips across the field of vision, dozens of times a second. The only way to see the effect is to stare fixedly at something and let your attention wander to your peripheral vision. Not exactly common practice.
- Let me put it another way. When humans are highly stressed and the fight or flight response kicks in, one thing that can go is colour vision. So you end up seeing in black and white and in slo mo. Out of all of the black and white films produced nowadays, what proportion of them use the black and white to refer to this artifact of human perception (a good reason) as opposed to obsolete film technology (a bad reason)?
- That's not what I meant. If you look at a woman standing in front of a crowd at a bus stop, you see her in sharp focus with your macula. If you want to convey that kind of experience through a photograph, a common technique is to use a short depth of field so that the crowd is out of focus. The viewer's eye will be drawn to the woman, just as yours was.
- There are no bad reasons for using black and white.
- One reason to create blur in a digital picture is to draw the viewer's attention to a region inside the frame. When you responded that the effect was never seen in practice, it was apparent to me that you didn't understand what I meant. The effect is always seen in practice. We can't see the detail of an entire crowd at once the way a camera can. Our attention focuses on a small area defined by the macula. I don't mean that people are aware of this, or that their eyes can't move, but that one of the differences between vision and photography is that a photograph can show the same level of detail over an entire image while vision can't.
- People are almost never aware of it. And why? Because the human eye zips across the field of view dozens of times a second. Exactly like I explained above.
- Actual tracking of human eye movements indicates that a person observing a "scene" has their eye resting upon and moving between various focal interest points. Put another way, the field of view is sampled by eye very unevenly. Effectively the "interesting" parts are in focus, and the "uninteresting" parts are out of focus. There is no raster-scan type movement of the eye, which would be the only way for the scene to have evenly focused material throughout.
- Long before an image hits consciousness, it's a unified, continuous and complete visualization. The only times a person is aware of uneven details is when they're staring fixedly at an object.
- This misses the point. People focus on particular areas in the field of view and the rest is largely ignored. You can't do this in a photo because it's too small. Using depth of field is one way to have this come out in your pictures. If you don't like it, don't use it. If other people like it, why give them a hard time? It is silly to assert that people get a continuous complete visualization. A person observing a scene cannot later recollect all aspects of it with equal efficiency. Whether parts are "blurred" or "ignored" or "forgotten" doesn't matter - the effect is the same.
- Of course people get a continuous and complete visualization. Or at least, they think they do, which is what really matters. Lack of detail is only obvious some of the time, and when it is it sticks out like a sore thumb. The problem with blurred photos is precisely that; they stick out like a sore thumb.
- The point is that justifying recreating in reality something that's an artifact of the lower visual system is blatantly stupid. It's as blatantly stupid as creating a high-level goto concept and then justifying it by appealing to its existence at the machine level. You don't recreate something that you never notice in places where it sticks out like a sore thumb.
- It doesn't matter if people are aware of it. Are you arguing that photographers don't use depth of field to draw the viewers eye to the subject? Are you arguing that they do but they shouldn't? What are you arguing?
- I'm arguing that they shouldn't. At least, not if technology makes it avoidable, which was what started off this argument. Because once it's avoidable, failure to avoid gross visual imperfections like blurring is a sign of sloppiness and laziness. Technology is progressing, raising expectations. People who fail to meet those higher expectations are sloppy and lazy.
- Technology made it avoidable long ago. Deciding on depth of field is neither sloppy nor lazy. Own your value judgements.
- Or correspondingly, to de-emphasize the blurred parts of the picture. In other words, for aesthetic reasons. Non-aesthetic photography, such as crime scene documentary photos, probably never have any blurred portions in the ideal case.
- [The original paragraph above also is highly misleading. The underlying issue is primarily the physics of lenses (including aperture), not of the imaging element. Wider aperture creates a smaller depth region in focus, but captures more light (and higher order Fourier terms, potentially giving more detail). Only a zero-width aperture can have infinite depth of field, but it captures zero light and zero image detail (in practice the limit is the wavelength of the light being captured). Only an infinitely wide aperture captures all possible image light and detail, but in turn has no in-focus region.
- The question of the speed and resolution of film versus digital imaging elements is very secondary after that (which is why the original poster said "for a given aperture setting"), and the two technologies have more overlap than not.]
This is a symptom of the size of the image sensors that are currently used in most digital cameras. See http://www.wrotniak.net/photo/dof/
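The sensor-size effect can be made concrete with the standard thin-lens depth-of-field approximations. The specific numbers below (a full-frame circle of confusion of 0.030 mm, and a compact camera modeled as roughly a 5x crop) are illustrative assumptions:

```python
# Depth-of-field sketch using the standard thin-lens approximations.
# Circle-of-confusion values and the "5x crop compact" are assumptions.

def depth_of_field(focal_mm, f_number, subject_mm, coc_mm):
    """Return (near, far) limits of acceptable focus, in mm.
    `coc_mm` is the circle-of-confusion diameter for the sensor/film."""
    h = focal_mm ** 2 / (f_number * coc_mm) + focal_mm  # hyperfocal distance
    near = subject_mm * (h - focal_mm) / (h + subject_mm - 2 * focal_mm)
    if subject_mm >= h:
        far = float("inf")  # everything to infinity is acceptably sharp
    else:
        far = subject_mm * (h - focal_mm) / (h - subject_mm)
    return near, far

# Same framing, same f/2.8, subject at 2 m:
# 35mm film with a 50mm lens, CoC ~ 0.030 mm:
film_near, film_far = depth_of_field(50, 2.8, 2000, 0.030)
# Small-sensor compact (~5x crop) with the equivalent 10mm lens, CoC ~ 0.006 mm:
digi_near, digi_far = depth_of_field(10, 2.8, 2000, 0.006)
# The compact's zone of sharpness is several times deeper, which is why
# deliberately throwing the background out of focus is so hard on it.
```

Under these assumptions the film camera holds roughly a 26 cm zone of sharpness around the subject, while the small-sensor compact holds about 1.5 m, several times deeper at the identical f-stop and framing.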
Other differences between film and digital cameras
Another difference applies when you have a wide range of brightnesses in your image. From what I've read (I don't have much film camera experience), it is best to expose analog photos "correctly" for the darker parts of the image, and then tone down the brighter sections in development if necessary. With digital, you must do the opposite, because otherwise the bright bits will end up 100% white and there'll be nothing you can do to salvage them in Photoshop.
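A toy model shows why the two media fail differently in the highlights. The film "shoulder" curve here is an invented stand-in for a real characteristic curve, purely to illustrate the asymmetry:

```python
# Sketch of why digital exposure must protect the highlights: a linear
# sensor clips hard at full scale and the detail is gone, whereas film's
# response compresses (but still records) bright detail.
# The shoulder curve below is a toy model, not a real film curve.

import math

def digital_response(exposure):
    """Linear sensor: clips hard at 1.0. Two different bright exposures
    map to the same clipped value, so no tool can tell them apart."""
    return min(exposure, 1.0)

def film_response(exposure):
    """Toy shoulder curve: approaches 1.0 asymptotically, so bright
    detail is compressed but remains distinguishable."""
    return 1.0 - math.exp(-exposure)

# Two overexposed highlights, e.g. cloud detail at 1.5x and 2.5x full scale:
assert digital_response(1.5) == digital_response(2.5)  # detail destroyed
assert film_response(1.5) != film_response(2.5)        # detail compressed
```

Once both exposures land on the same clipped value, no amount of Photoshop work can separate them again, which is the "expose for the highlights" rule in miniature.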
what about motion picture cameras?
On a related note, how long before film-based motion picture cameras are replaced with digital camcorders? In the world of television the conversion is happening already, as more and more high-value content is shot with digital TV cameras (HDTV in particular) and less with film. (Many programs intended only for television broadcast, and not cinema distribution, are nonetheless shot on film - or at least have been until recently.) AttackOfTheClones was shot digitally - though many are of the opinion that it was still a waste of celluloid. :) [I thought very little television was done with film... isn't most television done with videotape today?] For quite a while, many prime-time shows have been shot on film and then transferred to tape for air; film cameras have long been of higher quality than video cameras (digital or analog). For direct-to-air material like news or sports, video cameras are of course required; but for programming produced well ahead of broadcast, film is still commonly used.
Motion-picture film is expensive, and so are prints of film-based motion pictures. Digital storage is cheap - the only expensive thing nowadays is HD cameras, and those will surely come down in price. The remaining issue is the fidelity of digital imaging technology vs. film - film still has advantages in both resolution and color depth. But as with still cameras, this gap narrows every year.
In a sure SignOfTheApocolypse?, Kodak recently announced that they would no longer manufacture 35mm non-disposable film cameras for the UnitedStates.
Yes, but, Kodak's stock has been in the toilet for a long time now and their management's track record of bad business decisions does not inspire much confidence that this new decision is correct either. :/
- Something about the water in Rochester, NY that seems to cause bad management decisions (see XeroxCorporation).
I'm a big fan of DigitalCameras because I'm lazy and cheap.
However, I recently took a photo (analog) from an old album and scanned it, blew it up about 8x, and extracted details that I'd never been able to see before.
When I tried that same stunt with a digital photo, I had significant data loss at the same mag.
How was the digital photo produced - i.e. what did you print it on? (And what camera was it shot on, and what was done to it in the digital domain?) A professionally developed photograph (digital or analog) will be of quite a bit higher quality than something printed off on a $200 Epson, after all...
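A back-of-envelope calculation shows why an 8x enlargement can hold up on scanned film and fall apart on an early digicam. The 4000 dpi scan resolution and the 2.1-megapixel (1600-pixel-wide) sensor are assumptions for illustration:

```python
# Back-of-envelope sketch of the enlargement gap described above.
# The 4000 dpi scan and 1600-pixel-wide sensor are assumed figures.

def effective_ppi(pixels_wide, print_width_in):
    """Pixels available per printed inch at a given output size."""
    return pixels_wide / print_width_in

ENLARGEMENT = 8
FRAME_WIDTH_IN = 1.42  # 36mm frame width of 35mm film, in inches

# 35mm frame scanned at 4000 dpi, then enlarged 8x:
film_ppi = effective_ppi(FRAME_WIDTH_IN * 4000, FRAME_WIDTH_IN * ENLARGEMENT)
# 2.1 MP digital (1600 px wide) printed at the same ~11.4 in width:
digi_ppi = effective_ppi(1600, FRAME_WIDTH_IN * ENLARGEMENT)
```

Under these assumptions the scan still delivers 500 pixels per printed inch, comfortably above the roughly 300 ppi usually quoted for photographic quality, while the 2.1-megapixel file drops to about 141 ppi and visibly falls apart.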
I have a better DigitalCamera now, but:
- the shutter speed is still not good enough to shoot planes landing and taking off
- the detail recoverable on enlargement still does not match what I can scan from an analog photo
It concerns me a bit that Kodak (and presumably others) are looking to lose this (film) technology. When detail and speed counts, the same performance in digital costs a whale of a lot more.
I have found a digital camera that's now the object of my TechnoLust. Canon makes a 6-megapixel unit which sells for around $1,000. One of the guys with whom I work bought one, and took up-close-and-personal shots of the recent forest fire here in Northern (Western) Nevada (Carson City) and the antics of planes and helicopters fighting it.
The shutter speed is good enough that it froze 'copter rotors in rotation and isolated the rotating blades of a small firefighting plane. Awesome camera. Hell, by next year or the year after, I might even be able to afford one!
Oh, and the Canon 35mm that my daughter wants - the $500 unit - has been reduced to $200. There may be a pattern here.
- Yes I think that anyone who is recommending analog over digital for any reason other than artistic should easily be convinced that digital is better if they take a look at Canon's Digital Rebel 6 Megapixel SLR. Awesome. -- PeterLynch
This page is a heavy RefactoringCandidate
Update Feb 2006:
From the NewYorkTimes
- ... there's the astonishing collapse of the film camera market. By some tallies, 92 percent of all cameras sold are now digital. Big-name camera companies are either exiting the film business (Kodak, Nikon) or exiting the camera business altogether (Konica Minolta). Film photography is rapidly becoming a special-interest niche.
As of April 14, 2006, my strong objections to chemical photography as a zombie technology have evaporated. While there's still room for artistic and special interest uses of film and paper, it's clear that the industry is seeing a dramatic change. I now feel like the cabinetmaker who practices the craft with hand tools - admirably skillful and fully capable of producing something of use and value, but genuinely practicing an archaic form. -- StevenNewton