We human beings tend to take eyesight for granted. Most of us do not care much about the mechanics of how vision works; we only care that it does. But when you get right down to the science behind human vision, it is actually quite fascinating. That science is now being applied to give robots electronic eyes that can ‘see’ even better than human eyes. Much of what these electronic eyes can now do is a direct result of passive sensors.
A passive sensor for electronic vision relies on light arriving from an external source rather than emitting a signal of its own, explains California-based Rock West Solutions. As a company that specializes in data analysis and sensor development, Rock West has designed and built sensors for a variety of commercial, military, law enforcement, and healthcare applications. The company says passive sensors are all around us, even though we may not recognize them.
Tracking Movement Around a Space
Some passive sensors rely on light as their main source of data; others rely on radio frequencies. Either way, the sensors are useful across a wide range of applications. For example, Rock West has developed passive RFID sensors capable of tracking movement around a given space. Imagine what that makes possible.
Visit the headquarters of any large corporation and you are likely to be given an ID badge to clip to your shirt. That badge gives you access only to the areas of the building you need in order to conduct your business. Now imagine that badge is equipped with an embedded RFID chip.
As you move through the building, passive RF sensors communicate with the RFID chip in your badge and send data back to a computer system. This lets security know where you are in the building at any given moment. If you enter areas you are not supposed to be in, someone will know. This is just one example of how the technology can be used.
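To make that idea concrete, here is a minimal sketch in Python of how a back-end system might log badge reads and flag entries into restricted areas. The reader locations, badge IDs, and access rules are invented for illustration; they do not describe any particular Rock West system.

# Hypothetical sketch: logging RFID badge reads and flagging restricted-area entries.
# Badge IDs, area names, and access rules below are illustrative assumptions.

from datetime import datetime

# Which areas each badge is allowed to enter (assumed access policy).
ACCESS_RULES = {
    "BADGE-1042": {"lobby", "floor-2"},
    "BADGE-2077": {"lobby", "floor-2", "server-room"},
}

# Last known location of each badge, updated as reads arrive.
last_seen = {}

def handle_read(badge_id, reader_location):
    """Record a badge read from a reader and flag unauthorized entries."""
    timestamp = datetime.now().isoformat(timespec="seconds")
    last_seen[badge_id] = (reader_location, timestamp)

    allowed_areas = ACCESS_RULES.get(badge_id, set())
    if reader_location not in allowed_areas:
        print(f"ALERT {timestamp}: {badge_id} detected in restricted area '{reader_location}'")
    else:
        print(f"{timestamp}: {badge_id} seen at '{reader_location}'")

# Simulated stream of reads as someone moves through the building.
handle_read("BADGE-1042", "lobby")
handle_read("BADGE-1042", "floor-2")
handle_read("BADGE-1042", "server-room")   # triggers an alert

In a real deployment the reads would arrive from networked readers rather than function calls, but the basic bookkeeping, mapping badge reads to locations and checking them against a policy, looks much like this.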
Helping Robots See
Getting back to electronic vision for robots, passive sensors are helping them see things in their environment. Their electronic eyes work much like human eyes in the sense that they receive information via light waves. Those light waves are converted to electrical signals that are sent to a computer microprocessor for analysis.
Human vision works much the same way. Our eyes receive light waves that are then converted to electrical signals sent to the brain. It is the brain that makes sense of those signals, telling us what we see in front of us.
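Reduced to a few lines, the robot's side of that pipeline might look something like the sketch below: the sensor hands the processor an array of light-intensity readings, and software turns those numbers into something the robot can act on. The pixel values and threshold are made up for illustration.

# Minimal sketch of the light-to-signal pipeline described above.
# The sensor readings and the threshold are invented for illustration.

# Raw intensity values from a (tiny) image sensor: one number per pixel,
# produced when incoming light is converted to an electrical charge.
pixel_readings = [
    [12,  15,  14,  13],
    [11, 210, 205,  12],
    [13, 198, 220,  14],
    [12,  13,  15,  11],
]

BRIGHT_THRESHOLD = 100  # assumed cutoff separating a lit object from background

# The processor's job: turn raw numbers into a useful statement about the scene.
bright_pixels = [
    (row, col)
    for row, line in enumerate(pixel_readings)
    for col, value in enumerate(line)
    if value > BRIGHT_THRESHOLD
]

if bright_pixels:
    print(f"Object detected across {len(bright_pixels)} pixels: {bright_pixels}")
else:
    print("Nothing bright enough in view")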
Obviously, robots do not see exactly the way human beings do, and they cannot discern for themselves what their electronic eyes are picking up. But they can register wavelengths of light that the human eye cannot, which means they can process visual information that is otherwise invisible to us.
A properly programmed robot, for instance, is not inhibited by darkness. Where the human eye struggles in low-light conditions, robotic eyes can make use of other parts of the light spectrum, such as infrared, and still see at night.
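As a toy illustration of that point: in a dark room, a visible-light frame carries almost no information, while an infrared frame of the same scene can still reveal a warm object. The numbers below are invented; real infrared imaging uses calibrated sensors rather than hand-written arrays like these.

# Toy comparison of a visible-light frame and an infrared frame of the same dark scene.
# All values are invented for illustration.

# Visible-light readings in a dark room: essentially noise near zero.
visible_frame = [
    [2, 1, 3, 2],
    [1, 2, 2, 1],
    [3, 1, 2, 2],
]

# Infrared readings of the same scene: a warm object still stands out.
infrared_frame = [
    [20, 21, 22, 20],
    [21, 95, 97, 22],
    [20, 22, 21, 20],
]

def brightest(frame):
    """Return the highest reading in a frame and where it occurs."""
    value, position = max(
        (frame[r][c], (r, c))
        for r in range(len(frame))
        for c in range(len(frame[0]))
    )
    return value, position

for name, frame in (("visible", visible_frame), ("infrared", infrared_frame)):
    value, position = brightest(frame)
    print(f"{name}: peak reading {value} at {position}")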
The long and short of it is that passive sensors are doing for robots what eyes do for human beings. Though they don’t see exactly the same way, they are able to respond to information obtained from light waves and make sense of that information in useful ways. It’s all made possible by small sensors that are gradually changing everything – including the way humans interact with machines.