Space-from-Time Imaging: Acquiring Reflectance and Structure Without Lenses

Traditional cameras use lenses to form an optical image of the scene, thereby establishing spatial correspondences between the scene and the film or sensor array. These cameras do not sample the incident light fast enough to record transient variations in the light field. This talk introduces space-from-time imaging, a signal processing framework for imaging using only omnidirectional illumination and sensing. We show that images can be constructed by computationally processing samples of the response to a time-varying illumination. We also present a range sensing system that requires neither scene scanning by laser (as in LIDAR) nor multiple sensors (as in a time-of-flight camera). These technologies depend on novel parametric signal modeling and sampling theory.

Short biography: Vivek Goyal completed his Ph.D. degree at the University of California, Berkeley. After working at Bell Laboratories and Digital Fountain, he joined the Massachusetts Institute of Technology, where he is currently Associate Professor of Electrical Engineering. Prof. Goyal was awarded UC Berkeley's Eliahu Jury Award, the IEEE Signal Processing Society Magazine Award, and an NSF CAREER Award. He is a faculty co-author of papers that have won student best paper awards and a co-author of the forthcoming textbook Fourier and Wavelet Signal Processing (available at FourierAndWavelets.org). Along with serving on several technical program committees, he is a TPC Co-Chair of IEEE ICIP 2016 and a Conference Co-Chair of the SPIE Wavelets and Sparsity conference series. He will present a tutorial on teaching signal processing at IEEE ICASSP 2012.