New Single-Pixel Imaging System Inspired by Animal Vision

In a paper published recently in Science Advances, University of Glasgow (Glasgow, United Kingdom) researchers describe their method for creating video using single-pixel cameras. Taking inspiration from how animals’ eyes work, the researchers say they have found a way to instruct cameras to prioritize the most important objects in an image, much as an animal’s brain decides where to direct its attention.

The sensor is the latest in a string of single-pixel breakthroughs from the university’s optics group, whose line of computer-controlled cameras uses just one light-sensitive pixel to build up moving images of objects placed in front of it.

Single-pixel sensors are much cheaper than the dedicated megapixel sensors found in digital cameras, the researchers say, adding that they can build images at wavelengths, such as infrared or terahertz frequencies, that are expensive or impossible to capture with conventional cameras.

The resulting images are square, with an overall resolution of 1,000 pixels.

In conventional cameras, those pixels would be spread evenly in a grid. The team’s new system, however, can allocate its pixel budget to prioritize the most important areas of the frame, sharpening the detail in some sections while sacrificing it in others, and this distribution can change from one frame to the next.
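To make the idea concrete, the following is a minimal sketch, in Python with NumPy, of how a fixed budget of roughly 1,000 measurements could be divided unevenly across a frame. The allocate_resolution function, the weight map, and the 4x4 grid of regions are illustrative assumptions; the paper’s actual allocation scheme is not reproduced here.

```python
import numpy as np

def allocate_resolution(weight_map, budget=1000):
    """Split a fixed budget of single-pixel measurements across coarse
    frame regions in proportion to an importance weight map.
    (Illustrative helper, not the published algorithm.)"""
    w = np.asarray(weight_map, dtype=float)
    w = w / w.sum()                           # normalise importance scores
    cells = np.floor(w * budget).astype(int)  # provisional share per region
    cells = np.maximum(cells, 1)              # every region keeps at least one cell
    # hand any cells lost to rounding to the most important regions
    leftover = budget - cells.sum()
    order = np.argsort(w.ravel())[::-1]
    i = 0
    while leftover > 0:
        cells.ravel()[order[i % order.size]] += 1
        i += 1
        leftover -= 1
    return cells

# Example: a 4x4 grid of regions in which the centre of the frame
# (where a moving object is assumed to sit) is weighted most heavily.
weights = np.ones((4, 4))
weights[1:3, 1:3] = 10.0
print(allocate_resolution(weights, budget=1000))
print(allocate_resolution(weights).sum())     # the budget is fully spent
```

In this toy example the four central regions each receive roughly ten times as many measurements as the regions at the edge of the frame, while the total stays within the fixed budget.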

The research was led by David Phillips, an engineering research fellow at the school.

“Initially, the problem I was trying to solve was how to maximize the frame rate of the single-pixel system to make the video output as smooth as possible,” Phillips explains. “However, I started to think a bit about how vision works in living things, and I realized that building a program which could interpret the data from our single-pixel sensor along similar lines could solve the problem. By channeling our pixel budget into areas where high resolutions were beneficial, such as where an object is moving, we could instruct the system to pay less attention to the other areas of the frame.

“By prioritizing the information from the sensor in this way, we’ve managed to produce images at an improved frame rate, but we’ve also taught the system a valuable new skill,” Phillips adds. “We’re keen to continue improving the system and explore the opportunities for industrial and commercial use.”
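Phillips’ description of channeling the pixel budget toward moving objects suggests an importance map driven by frame-to-frame change. The sketch below, again an assumption-laden illustration rather than the published method, derives such a map from the difference between two consecutive low-resolution preview frames; the motion_weights helper, the 4x4 grid, and the floor parameter are hypothetical.

```python
import numpy as np

def motion_weights(prev_frame, curr_frame, grid=(4, 4), floor=0.05):
    """Build an importance map from frame-to-frame change.
    (Hypothetical helper illustrating the idea, not the paper's code.)

    prev_frame, curr_frame : two consecutive low-resolution preview images
    grid                   : number of coarse regions (rows, cols)
    floor                  : fraction of the peak weight kept everywhere,
                             so static regions are never ignored entirely
    """
    diff = np.abs(curr_frame.astype(float) - prev_frame.astype(float))
    rows, cols = grid
    h, w = diff.shape
    weights = np.zeros(grid)
    for r in range(rows):
        for c in range(cols):
            block = diff[r * h // rows:(r + 1) * h // rows,
                         c * w // cols:(c + 1) * w // cols]
            weights[r, c] = block.mean()      # average change inside the region
    # keep a small baseline everywhere so the rest of the scene stays visible
    return np.maximum(weights, floor * weights.max() + 1e-9)

# Example: an object "moves" in the upper-left corner between two frames,
# so that corner ends up with the largest share of the importance.
rng = np.random.default_rng(0)
prev = rng.random((64, 64))
curr = prev.copy()
curr[:16, :16] += 0.5                         # simulated motion
print(motion_weights(prev, curr))
```

A map produced this way could feed an allocation step like the one sketched earlier, so that regions where the scene is changing receive the finest sampling while static background is imaged more coarsely.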

For more information, visit gla.ac.uk.