CNET News

Most folks think of a photo as a two-dimensional representation of a scene. Stanford University researchers, however, have created an image sensor that also can judge the distance of subjects within a snapshot.

To accomplish the feat, Keith Fife and his colleagues have developed technology called a multi-aperture image sensor that sees things differently than the light detectors used in ordinary digital cameras.
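
The article doesn't spell out the math, but a multi-aperture sensor can in principle recover distance the way a stereo pair does: a nearby subject shifts slightly between neighboring apertures, and that shift (the disparity) falls off with distance. Below is a minimal Python sketch of that textbook triangulation; the function and the numbers are illustrative assumptions, not the Stanford group's actual algorithm.

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classic triangulation: nearer subjects shift more between apertures."""
    return focal_px * baseline_m / disparity_px

# Illustrative values only (not from the paper): a 1400-pixel focal length,
# apertures 0.5 mm apart, and a subject that shifts 0.35 pixels between views.
print(depth_from_disparity(focal_px=1400.0, baseline_m=0.0005, disparity_px=0.35))
# -> 2.0 (meters)
```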
[…]
“In addition to the two-dimensional image, we can simultaneously capture depth info from the scene,” Fife said when describing the technology in a talk at the International Solid State Circuits Conference earlier this month in San Francisco.

The result is a photo accompanied by a “depth map” that describes not only each pixel’s red, blue, and green light components but also how far away the subject at that pixel is. Right now, the Stanford researchers have no specific file format for the data, but the depth information can be attached to a JPEG as accompanying metadata, Fife said.
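
Since Fife says there is no specific file format yet, any concrete layout is a guess. One plausible shape for the data, sketched in Python: an ordinary RGB JPEG plus a per-pixel depth array saved alongside it, standing in for the "accompanying metadata" the article mentions. The file names, the sidecar approach, and the float-meters depth encoding are all assumptions.

```python
import numpy as np
from PIL import Image

h, w = 480, 640
rgb = np.zeros((h, w, 3), dtype=np.uint8)         # red, green, blue per pixel
depth_m = np.full((h, w), 2.5, dtype=np.float32)  # assumed: distance in meters per pixel

# The photo itself stays an ordinary JPEG...
Image.fromarray(rgb).save("shot.jpg", quality=90)
# ...while the depth map travels as a sidecar file until a real format exists.
np.save("shot_depth.npy", depth_m)
```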

It appears that this new technology will be used to create higher-quality images rather than 3D prints.




  1. amodedoma says:

Sooner or later they were going to have to do something besides add more megapixels. Even if they don’t use the info for 3D, it certainly could be used for stereographic 3D.

  2. grog says:

    wait till the pr0n industry gets a hold of that tech

  3. edwinrogers says:

A similar technique (multi-element optics) was used for automatic range finding. The new cleverness is in using spatial optics the same way.

  4. badcowboy says:

Now we can fix depth-of-field and focusing errors in Photoshop?

  5. GregA says:

Hmmmm, and here I was being humbled by this development. All I want is a snapshot camera that performs like an SLR but still fits in my pocket.

  6. Ron Larson says:

    Could be great for face recognition apps.

