PROJECTS

Light field imaging is an emerging research area thanks to the new capabilities it enables, including post-capture refocusing, aperture control, and 3D modeling. Single-shot, single-sensor light field cameras must balance the fundamental trade-off between spatial and angular resolution. The spatial resolution achieved with such cameras is typically far from satisfactory, which limits the widespread adoption of light field cameras. In this project, we present a hybrid-sensor light field camera that combines, with minimal optical components, a regular image sensor and a micro-lens array (MLA) based light field sensor to produce a high-spatial-resolution light field. The use of a single main lens and matched image planes avoids complexities, such as occlusions, that multi-lens systems suffer from. In our experiments, we demonstrate that the proposed hybrid-sensor camera leads to improved depth estimation in addition to increased spatial resolution.
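
The sketch below illustrates, under simplifying assumptions, one way the high-resolution sensor image could guide the upsampling of the light field's sub-aperture views. It is not the reconstruction pipeline used in the project; the function name, the array shapes, and the naive detail-transfer step are illustrative only.

```python
# Minimal sketch of hybrid-sensor fusion: upsample each low-resolution sub-aperture
# view and add a high-frequency detail layer taken from the high-resolution image.
import numpy as np
from scipy.ndimage import zoom, gaussian_filter

def upsample_light_field(lf, hi_res, scale):
    """lf: (U, V, H, W) grayscale light field; hi_res: (H*scale, W*scale) image."""
    lf = lf.astype(np.float64)
    hi_res = hi_res.astype(np.float64)
    U, V, H, W = lf.shape
    out = np.empty((U, V, H * scale, W * scale), dtype=np.float64)
    # High-frequency detail layer extracted from the high-resolution image.
    detail = hi_res - gaussian_filter(hi_res, sigma=scale)
    for u in range(U):
        for v in range(V):
            base = zoom(lf[u, v], scale, order=3)  # cubic per-view upsampling
            # A full method would warp the detail layer per view using estimated
            # disparity; here it is added directly for simplicity.
            out[u, v] = base + detail
    return out
```

In this simplified form every view receives the same detail layer, so parallax in the high-frequency content is not modeled; the point is only to show how data from the two sensors could be fused behind a shared main lens.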

By capturing the spatial and angular distribution of radiance, light field cameras introduce capabilities that are not possible with conventional cameras. So far, the light field imaging literature has focused on the theory and applications of a single light field capture. By combining multiple light fields, it is possible to gain new capabilities and enhancements, and even to exceed physical limitations of the imaging device, such as its spatial resolution and aperture size. In this project, we present an algorithm to register and stitch multiple light fields. We exploit the regularity of the spatial and angular sampling in light field data and extend techniques developed for stereo vision systems to light fields. Such an extension is not straightforward for a micro-lens array (MLA) based light field camera because of its extremely small baseline and low spatial resolution. By merging multiple light fields captured with an MLA-based camera, we obtain a larger synthetic aperture, which improves light field capabilities such as depth estimation range and accuracy and the extent of perspective shift.
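
As a rough illustration of the registration and stitching idea, the sketch below aligns two light fields by phase-correlating their central sub-aperture views and pastes them onto a common canvas. It is a simplified stand-in for the project's actual algorithm: it handles only an integer in-plane translation and ignores rotation, angular alignment, sub-pixel refinement, and blending. All function and variable names are illustrative.

```python
# Minimal sketch: estimate the translation between two light fields from their
# central views via phase correlation, then place both on a shared canvas.
import numpy as np

def phase_correlation_shift(a, b):
    """Estimate the integer (dy, dx) shift that aligns image b to image a."""
    F = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
    corr = np.fft.ifft2(F / (np.abs(F) + 1e-8)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map peak indices past the midpoint to negative shifts.
    if dy > a.shape[0] // 2:
        dy -= a.shape[0]
    if dx > a.shape[1] // 2:
        dx -= a.shape[1]
    return dy, dx

def stitch_light_fields(lf_a, lf_b):
    """lf_a, lf_b: (U, V, H, W) grayscale light fields with overlapping content."""
    U, V, H, W = lf_a.shape
    dy, dx = phase_correlation_shift(lf_a[U // 2, V // 2], lf_b[U // 2, V // 2])
    # Apply the same translation to every view; paste lf_b first, then let
    # lf_a overwrite the overlap (no blending in this sketch).
    canvas = np.zeros((U, V, H + abs(dy), W + abs(dx)))
    by, bx = max(dy, 0), max(dx, 0)     # offset of lf_b inside the canvas
    ay, ax = max(-dy, 0), max(-dx, 0)   # offset of lf_a inside the canvas
    canvas[:, :, by:by + H, bx:bx + W] = lf_b
    canvas[:, :, ay:ay + H, ax:ax + W] = lf_a
    return canvas
```

Applying a single shift to all views is only valid when the two captures differ by a pure lateral translation of the camera; the actual method also has to cope with the tiny baseline and low view resolution of MLA-based capture noted above.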