System Bits: March 18

A silicon chip developed by Caltech researchers acts as a lens-free projector—and could one day end up in your cell phone; Stanford researchers turn a smartphone into a diagnostic tool.

Bending light with a tiny chip
Imagine your smartphone being able to project a bright, clear image from a presentation or a video onto a wall or a big screen – all made possible with a light-bending silicon chip developed by Caltech researchers.

Traditional projectors pass a beam of light through a tiny image, using lenses to map each point of the small picture to corresponding, yet expanded, points on a large screen. The Caltech chip uses an integrated optical phased array (OPA) to project the image electronically with only a single laser diode as light source and no mechanically moving parts.

The researchers were able to bypass traditional optics by manipulating the coherence of light, a property that allows them to “bend” light waves on the surface of the chip without lenses or any mechanical movement. If two coherent waves traveling in the same direction are also in phase, meaning that the peaks and troughs of one wave are exactly aligned with those of the second wave, the waves combine into a single beam with twice the amplitude and four times the energy of either initial wave, moving in the same direction as the original waves, they explained.
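That amplitude-doubling, energy-quadrupling behavior can be checked with a quick numerical sketch (not the researchers' code; just the textbook superposition the passage describes):

```python
import math

# Sample one period of a sine wave, then superpose two identical,
# in-phase copies of it point by point.
N = 1000
wave = [math.sin(2 * math.pi * k / N) for k in range(N)]
combined = [s + s for s in wave]  # peaks align with peaks, troughs with troughs

peak_ratio = max(combined) / max(wave)

# Energy is proportional to the mean squared amplitude.
energy_single = sum(s * s for s in wave) / N
energy_combined = sum(s * s for s in combined) / N
energy_ratio = energy_combined / energy_single

print(round(peak_ratio))    # 2: twice the amplitude
print(round(energy_ratio))  # 4: four times the energy
```

Out-of-phase waves would instead cancel, which is exactly the lever the chip uses: adjusting relative phase decides where the combined beam is bright.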

By changing the relative timing of the waves, the direction of the light beam can be changed.

Using a series of pipes for the light—called phase shifters—the OPA chip similarly slows down or speeds up the timing of the waves, thus controlling the direction of the light beam. To form an image, electronic data from a computer are converted into multiple electrical currents; by applying stronger or weaker currents to the light within the phase shifter, the number of electrons within each light path changes—which, in turn, changes the timing of the light wave in that path. The timed light waves are then delivered to tiny array elements within a grid on the chip. The light is then projected from each array in the grid, the individual array beams combining coherently in the air to form a single light beam and a spot on the screen.
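The timing relationship the phase shifters impose can be sketched for a simple uniform linear array. The element count, spacing, and wavelength below are illustrative assumptions, not details of the Caltech chip:

```python
import math

def steering_phases(n_elements, spacing_m, wavelength_m, angle_deg):
    """Phase delay (radians) each element of a uniform linear array needs
    so its beam combines coherently at `angle_deg` off boresight."""
    theta = math.radians(angle_deg)
    # Light from element n travels n * d * sin(theta) farther toward the
    # target direction, so its phase is shifted to compensate.
    return [2 * math.pi * n * spacing_m * math.sin(theta) / wavelength_m
            for n in range(n_elements)]

# Hypothetical numbers: 8 elements at half-wavelength spacing, 1550 nm light.
wl = 1550e-9
phases = steering_phases(8, wl / 2, wl, angle_deg=10)

# With zero phase difference, the beam fires straight ahead.
boresight = steering_phases(8, wl / 2, wl, angle_deg=0)
print(all(p == 0 for p in boresight))  # True
```

Changing `angle_deg` changes only the electrical phase pattern, which is why the beam can be redirected with no moving parts.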

As the electronic signal rapidly steers the beam left, right, up, and down, the light acts as a very fast pen, drawing an image made of light on the projection surface. Because the direction of the light beam is controlled electronically rather than mechanically, it can sweep out a line very quickly. And because the beam redraws the image many times per second, the eye perceives a single steady picture instead of a moving point of light.

Instagram for the eye
In another interesting example of smartphone-related research developments, researchers at Stanford turned a smartphone into an inexpensive tool for doing eye examinations in the field.

The idea is to use the smartphone’s built-in camera to take diagnostic images of the retina, optic nerve and other eye tissues. To accomplish this, the researchers developed an adaptor that holds the smartphone and a magnification lens. The lens peers into the eye. The adaptor holds the smartphone camera at just the right distance from the lens to take a sharp picture of the magnified image of the inner eye.
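The "just the right distance" is set by the focal length of the magnification lens. A thin-lens sketch with assumed numbers (the article does not give the adaptor's actual optics):

```python
def image_distance_mm(focal_mm, object_mm):
    """Thin-lens equation 1/f = 1/do + 1/di, solved for the image distance di."""
    return 1 / (1 / focal_mm - 1 / object_mm)

# Assumed values for illustration: a 20 mm focal-length lens held
# 25 mm from the eye tissue being imaged.
di = image_distance_mm(20, 25)
print(round(di, 1))  # 100.0 mm: roughly where the adaptor would fix the camera
```

The point of the 3D-printed adaptor is to hold that geometry rigid, so a non-specialist gets a sharp image without manual focusing.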

The first prototype was built on one researcher’s personal 3D printer; that researcher then worked with the medical team to refine the adapter using equipment at the Stanford Product Realization Laboratory. The system allows ordinary medical practitioners to take eye scans and transmit them online to specialists for diagnosis.

The researchers asserted that the device has obvious benefits in the developing world, where it puts an inexpensive tool in the hands of ordinary medical practitioners in remote locations, but said it could be just as relevant here in settings where time is of the essence.