Technology always surprises us by making the impossible possible. Did you ever imagine a lensless camera? Did you ever wish for a ‘real-virtual’ world like the Matrix?
A camera without a lens!
Imagine a highly efficient camera without lenses that can see through materials like clothes, wood, rain and dust, and provide an alternative to expensive LiDAR systems for mapping. It may soon be a reality. Scientists at Duke University, North Carolina, US, have devised a metamaterial that uses microwaves to image objects or scenes in real time.
Conventional imaging systems acquire information as pixels or vectors and perform compression in software. Metamaterials, by contrast, perform hardware compression during image acquisition. By leveraging metamaterials and compressive imaging, the researchers developed a low-profile aperture capable of microwave imaging without lenses, moving parts or phase shifters. The innovative aperture allows image compression to be performed at the physical hardware layer rather than in the postprocessing stage.
The researchers subsequently developed a device, using thousands of tiny apertures arranged in a strip 40 cm in length, which records images in 2D — one dimension across the strip and the other for depth. This device illuminates objects with K-band microwave radiation (18.5 to 25 GHz). Image acquisition is accomplished with a 40:1 compression ratio.
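The idea behind compressive imaging is that a scene with relatively few bright scatterers can be recovered from far fewer measurements than pixels. The sketch below is a scaled-down illustration, not the Duke team's actual reconstruction pipeline: a random matrix stands in for the aperture's frequency-diverse measurement modes (an assumption for demonstration), the scene is compressed 4:1 into 20 numbers, and a standard greedy algorithm (orthogonal matching pursuit) recovers it.

```python
import numpy as np

rng = np.random.default_rng(0)

n, m, k = 80, 20, 3  # scene size, measurements (4:1 compression), sparsity

# Random measurement matrix standing in for the metamaterial aperture's
# measurement modes (an illustrative assumption, not the real hardware).
A = rng.standard_normal((m, n)) / np.sqrt(m)

# A k-sparse "scene": a few bright scatterers, the rest empty.
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.uniform(1.0, 2.0, size=k)

y = A @ x  # hardware-layer compression: m numbers instead of n pixels

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily pick the column most
    correlated with the residual, then re-fit by least squares."""
    residual, idx = y.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))
        idx.append(j)
        coef, *_ = np.linalg.lstsq(A[:, idx], y, rcond=None)
        residual = y - A[:, idx] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[idx] = coef
    return x_hat

x_hat = omp(A, y, k)
print(np.allclose(x_hat, x, atol=1e-6))
```

At the device's reported 40:1 compression ratio the same principle applies, only with many more measurement modes and a physically derived measurement matrix.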
Future imager vs conventional cameras
Conventional cameras contain chips that carry millions of silicon-based detectors. Each detector records the intensity of light hitting it, producing information corresponding to one pixel of the image. The future imager, by contrast, is built of metamaterial: a strip of metal patterned with elements, each of which resonates at a specific frequency to steer radiation. The engineers placed the strip on top of a separate, plastic-covered metal sheet. This small metal strip replaces the lenses, multi-pixel detectors and moving parts of a conventional millimetre- or microwave-imaging system.
The technology is being tested in various applications. For example, a smaller microwave aperture is being used in self-driving cars to see through fog and dust and sense obstacles in front of the vehicle. The findings of the research, supported by the Air Force Office of Scientific Research, were reported in the journal Science.
Comparison of different kinds of imaging
Blurring the line between virtual & real world
The evolution of technology has changed the way we lead our lives, bringing an abundance of readily available information to our fingertips. Whether we like it or not, our computers, apps and the cyber world know us better than we would like to believe. Amid this evolution, technologies such as augmented reality are blurring the line between the virtual and real worlds, and Microsoft's IllumiRoom is a great example of this.
IllumiRoom is an augmented reality, peripheral projection technology that could be one of the key features of the Xbox 720 or Kinect 2.0. By combining a Kinect camera and a projector, IllumiRoom augments the area around the television to increase immersion in the game or the movie. Kinect captures the appearance and geometry of the room and then this data is used to adapt the extra visuals that are projected against the wall and furniture around the TV.
IllumiRoom is a prototype system based on Microsoft's patent for an 'immersive display experience', which the US Patent and Trademark Office granted in September 2012. The patent refers to 'an immersive display experience within a display environment' that includes a 'primary display' (referring to the television) and a 'peripheral image' that would seemingly be projected onto the environment around users. It also includes 'a peripheral input configured to receive depth input from a depth camera', referring to Kinect.
The company claims that user enjoyment of video games and related media experiences can be increased by making the gaming experience more realistic. Previous attempts to make the experience more realistic have included switching from two-dimensional to three-dimensional animation techniques, increasing the resolution of game graphics, producing improved sound effects, and creating more natural game controllers.
In addition, the next Kinect sensor will be able to detect the dimensions of the room a player is in, including its depth. A new ‘camera component’ could ‘include a depth camera that may capture a depth image of a scene’ and use infrared light ‘to determine a physical distance from the capture device to a particular location on the targets or objects in the scene.’ In other words, Kinect will be able to identify the size of the room around users.
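One common way a depth camera turns infrared light into distance is time-of-flight: an IR pulse is emitted, and the round-trip delay of its reflection gives the range. The snippet below is a minimal sketch of that arithmetic only, not Kinect's actual sensing pipeline (which the patent does not detail).

```python
C = 299_792_458.0  # speed of light in metres per second

def tof_distance(round_trip_s: float) -> float:
    """One-way distance from the round-trip time of an IR pulse:
    the light travels to the target and back, so halve the path."""
    return C * round_trip_s / 2.0

# A reflection arriving ~20 nanoseconds after emission
# corresponds to a target about 3 metres away:
print(round(tof_distance(20e-9), 2))  # -> 3.0
```

Timing differences this small are why such sensors need fast, specialised hardware: a 1 cm depth change shifts the round trip by only about 67 picoseconds.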
Microsoft isn’t the only company trying to stake out territory in the new world of augmented reality gaming technology. Oculus’s Rift, a virtual reality headset, is gearing up to offer gamers a chance to immerse themselves in artificial worlds.
Though IllumiRoom’s design calls for external projection of computer-simulated images rather than a headset, the concept of inserting the gamer into the world as an active participant rather than an observer remains powerful. Enterprising visual technologies like the Rift and IllumiRoom offer a more full-bodied interactive experience that breaks through the boundaries of boxed-in images.
The figure shows how the device could “project a peripheral image in a 360-degree field around [the] environmental display”.