Focus-less camera

I know a fair bit about photography, but still: what’s the difference between this and focusing to infinity?

If you focus to infinity, there’s no way to make the foreground sharp.

“When the camera reaches infinite improbability, it simultaneously takes an image of every point in the universe.”

I just clicked around on some of the photos really quickly, and to me it just seemed like three different focus points: near, medium, and far. Why not allow a sliding scale instead of the weird click-here-to-focus?

How close the plane of focus begins will depend on the lens, but yeah, I guess that’s true.
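For reference, that near limit when focused at infinity is just the hyperfocal distance, which depends on focal length, aperture, and the circle of confusion. A quick back-of-the-envelope sketch (function name and example numbers are mine; 0.03 mm is the usual full-frame circle-of-confusion figure):

```python
# Hyperfocal distance: H = f^2 / (N * c) + f
# f = focal length, N = f-number, c = circle of confusion (all in mm).
# Focused at infinity, everything from roughly H outward is acceptably sharp.

def hyperfocal_mm(f_mm, aperture, coc_mm=0.03):
    """Hyperfocal distance in millimetres (illustrative, thin-lens approximation)."""
    return f_mm ** 2 / (aperture * coc_mm) + f_mm

# A 50 mm lens at f/8: the foreground closer than ~10.5 m can't be sharp.
print(round(hyperfocal_mm(50, 8) / 1000, 2), "m")  # → 10.47 m
```

Stopping down pushes the near limit closer, but it never reaches the camera, which is why an infinity-focused shot can’t have a sharp foreground.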

I wonder if there is any way to use this tech for astronomy? Back when I was into astrophotography, doing a three-hour exposure only to have it come out slightly out of focus was maddening.

Not sure if the optics could work with this technique though.

I’m curious what additional optics do to images from this device also, albeit from the other end of the size scale. If you mounted a camera like this to the trinocular of a dissection microscope, would you be able to focus in any plane using the post-processing, or just the plane that was in focus from the scope’s optics?

Looks like the technical reviews are coming through.

Here are the bits I find interesting:

This isn’t a conventional camera that somehow lets you set the focus after shooting. That can be done, but doing so leaves you with a 1.2MP point-and-shoot camera.

Creative mode is a recent addition to the camera that allows a little more control over the process. In this mode you can force the camera to refocus its lens closer than in Everyday mode, such that the zone over which the image can later be refocused is concentrated around your chosen point. This allows you to gain greater quality at the point you’ve selected but means you restrict the range over which the output can be refocused.

As a result of someone hacking the version of the ‘light field’ that is sent up to the Lytro site, it has become known that the camera analyses the depth information in each image and the desktop software renders a series of JPEGs representing the key depths in the image. This shouldn’t be taken as evidence that the LFC is conducting some kind of Photoshop-esque sleight-of-hand to achieve its effects (demos of parallax-shifting processing disprove that), but does give some idea of the capabilities and limitations of the files that get uploaded.

Sadly, though, we found that getting the best results out of the Lytro often required rather contrived compositions.

The final word

The Lytro LFC is so unlike any conventional camera that it doesn’t make sense to score it in comparison to them. Ultimately, though, we’re not convinced that the Lytro either solves any existing problem or presents any compelling raison d’être of its own. If it were higher resolution, or allowed greater separation, or could produce single-lens 3D video, it might generate a lot more excitement. As it is, it feels like a product arriving before the underlying technology is really ready.

All of which is a great shame, because Lytro has done a great job of making a credible consumer product out of a piece of fairly abstract scientific research. It’s quite possible that in the hands of the right people it will result in some interesting creations but we just don’t yet see it as a mass-market device.

Sounds like they found it really interesting, but could not find a compelling reason to use it.

A co-worker got one today and I fooled around with it, but the dpreview impression is exactly right: it feels like a product arriving before the underlying technology is really ready. I can see how this would be great in a standard camera body with decent resolution, especially for close or mid-distance shots or stuff with lots of fore- and background action, but that’s quite a bit in the future, since even a high-res shot would be ungodly huge in file size as a raw. The camera itself is overly simple and kind of funky to hold and manipulate. It’s more like… a concept than a practical device. Oh, and you can’t even hook it up to a PC yet.

However, it certainly works. I can imagine nearly every camera using this technology or something like it in 10-15 years.

— Alan

Well, the new, more professional-looking version is out now:


http://www.dpreview.com/news/2014/04/22/lytro-announces-illum-light-field-camera

I’m still a bit skeptical about this, but if it can deliver it looks to be pretty cool.

I think even with the (presumed) vast improvements it’s still early days on this technology. Output is only 5 megapixels. I also wonder if they are cheaping out on the lens since they are doing so much work in software.

I think the lens is pretty good, since it is capturing way more than 5 MP of data (it’s only 5 MP when you flatten the images for print). They are claiming 40 “Megarays” of data being captured, and I would think you need a pretty good lens for that.
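The arithmetic behind that gap is simple: if the claimed 40 million captured rays flatten down to a 5 MP output, each output pixel is synthesized from roughly eight rays (illustrative figures only, taken from the marketing claim):

```python
# Rough ratio of captured light-field samples to flattened output pixels.
megarays = 40_000_000      # rays Lytro claims the Illum captures
output_pixels = 5_000_000  # pixels in the flattened 2D image

rays_per_pixel = megarays / output_pixels
print(rays_per_pixel)  # → 8.0
```

That ~8× angular oversampling is exactly what gets spent on the refocusing trick instead of on spatial resolution.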

I can’t get excited about Lytro. You basically need huge-ass sensors to get crappy image quality. Even worse is that you can only share them via Lytro viewers. The beauty of something like JPG is that almost everything can read them, making them easy to share.

This is a perfectly sensible post, in light of the fact that technology never gets better, and it’s surely impossible for the post-processing software to implement some sort of crazy “Save as JPEG” function.

Mmm, that’s good sarcasm.

More generally, I wonder what it will mean for photography. Photography as it exists now is about a person capturing an image as he or she sees it (or wants it to be seen) and then transmitting that image to others. This technology lets users change that vision as they see fit. Of course, an artist can restrict that vision to a certain extent, but can’t dictate it the way he or she can now with a single image.

Computational photography is a rapidly evolving field. There’s nothing inherent in Lytro that can’t be achieved in software using other sensors. In a sort of reverse effect, Google just released a new version of its camera app that adds blur to a perfectly focused image, emulating the depth-of-field effects of large sensors. I think we’ll see apps like Photoshop implement the ability to do this sort of thing from multiple images, as they do today with HDR.
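The software-blur trick can be caricatured in a few lines: given a depth estimate per sample, blur each one in proportion to its distance from a chosen focal plane. A toy 1-D sketch in pure Python (function and parameter names are mine; real apps do this in 2-D with estimated depth maps):

```python
# Toy synthetic depth-of-field: box-blur each sample with a radius that
# grows with its distance from the chosen focal depth. Illustrative only.

def synthetic_dof(signal, depth, focal_depth, strength=1):
    """Blur samples in proportion to |depth - focal_depth|."""
    out = []
    n = len(signal)
    for i in range(n):
        radius = int(strength * abs(depth[i] - focal_depth))
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        window = signal[lo:hi]
        out.append(sum(window) / len(window))
    return out

# Samples at the focal depth pass through unchanged; samples on the
# other depth layer get averaged with their neighbours.
sharp = [0, 0, 10, 0, 0, 0, 10, 0, 0]
depth = [1, 1, 1, 1, 3, 3, 3, 3, 3]   # two depth layers
print(synthetic_dof(sharp, depth, focal_depth=1))
```

The catch, of course, is that this needs a depth map from somewhere, which is what the Google app estimates from a slight camera motion.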

Focus stacking software for macro photography does this sort of thing on a much smaller scale, but does require a relatively large number of images.
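The core selection step of focus stacking is easy to sketch: for each position, keep the value from whichever exposure is locally sharpest. A 1-D toy in pure Python (names mine; real stacking tools work on aligned 2-D images with much better sharpness measures):

```python
# Minimal focus-stacking sketch: per position, pick the sample from the
# slice with the highest local contrast. Illustrative only.

def sharpness(signal, i):
    """Local contrast: absolute differences to the neighbouring samples."""
    left = signal[i - 1] if i > 0 else signal[i]
    right = signal[i + 1] if i < len(signal) - 1 else signal[i]
    return abs(signal[i] - left) + abs(signal[i] - right)

def focus_stack(stack):
    """For each index, take the sample from the sharpest slice."""
    n = len(stack[0])
    return [max(stack, key=lambda s: sharpness(s, i))[i] for i in range(n)]

# Two exposures: the first is sharp on the left, the second on the right.
near = [0, 9, 0, 2, 2, 2]
far  = [1, 1, 1, 0, 9, 0]
print(focus_stack([near, far]))  # → [0, 9, 0, 0, 9, 0]
```

Hence the "relatively large number of images": you need one exposure per depth slice you want sharp, where the light-field camera gets them all in one shot.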

The thing is, I either shoot with bokeh or not… if I want part of the picture to be blurred, I make that decision when I take the shot. I just don’t see the utility in adjusting the depth of field/focus point after the fact. Maybe if I screwed up the focus, but I’d rather just shoot multiple photos than buy a specialized camera that lets me fix an occasional mistake.

I’m pretty sure this is wrong. The whole point of light-field photography is that you have angular measurement on incoming light, meaning you retain much more information about the scene than standard sensors, which only measure local intensity. You could replicate light-field photography with other sensors only if you combined a ton of them, much like the early experimental setups that used 100 DSLRs for one picture.
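The basic refocusing trick that the angular information buys you is shift-and-sum: each sub-aperture view sees the scene from a slightly different angle, so a point at a given depth shifts between views by an amount proportional to the view offset. Summing the views with a chosen per-view shift brings that depth into focus and smears everything else. A 1-D toy in pure Python (names mine; a real plenoptic camera does this in 2-D over many views):

```python
# Toy shift-and-sum refocus over 1-D sub-aperture views. "disparity"
# selects the focal depth; wrap-around indexing keeps the sketch short.

def refocus(views, offsets, disparity):
    """Shift each view by offset * disparity and average the results."""
    n = len(views[0])
    out = []
    for i in range(n):
        vals = []
        for v, off in zip(views, offsets):
            j = i + round(off * disparity)   # shift for the chosen focal depth
            vals.append(v[j % n])
        out.append(sum(vals) / len(vals))
    return out

# Three views of a single bright point at disparity 1 (offsets -1, 0, +1):
v_left   = [0, 0, 9, 0, 0, 0, 0]
v_centre = [0, 0, 0, 9, 0, 0, 0]
v_right  = [0, 0, 0, 0, 9, 0, 0]
views, offsets = [v_left, v_centre, v_right], [-1, 0, 1]

print(refocus(views, offsets, disparity=1))  # → [0.0, 0.0, 0.0, 9.0, 0.0, 0.0, 0.0]
print(refocus(views, offsets, disparity=0))  # → [0.0, 0.0, 3.0, 3.0, 3.0, 0.0, 0.0]
```

Refocused at the right disparity the point snaps back to a single sample; at the wrong one its energy stays smeared across three, which is exactly the out-of-focus blur. A conventional sensor sums the views at capture time, with no way to undo it.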

For one thing, I’d imagine a lot of people are not as skilled at achieving perfect DoF and focus at first try… and having to shoot dozens of photos just to make sure one comes out right is a nuisance (and sometimes impossible, if the scene changes).

Moreover, the Illum only has a big traditional lens because it’s early technology, and because it’s designed to appeal to traditional DSLR users. The promise of light-field photography is that you get rid of the big frontal lens entirely, along with the viewfinder and the knob-twiddling needed to make sure you take the right picture. You’d just have a smartphone-sized lens array that you point in the general direction of the scene, and literally everything there was to see is stored instantly and can be extracted into a 2D image at your discretion.

That said, the Illum itself doesn’t much appeal to me, either. Light-field cameras will get more exciting when they’re closer to the goal outlined above.