Shield AI Fundamentals: On Perception

What does perception refer to in robotic systems?

We’re referring to a robot’s ability to sense the environment around it, process the information coming in from its sensors, and use that information to understand the environment, figure out where it is and how it’s operating relative to that environment, and make decisions or take actions accordingly.

Can you elaborate on what concepts fall within the overarching concept of perception?

We generally consider perception to include sensing, state estimation, mapping, and localization.

Sensing refers to the robot’s ability to observe the environment around it, as well as its state within that environment, through the use of its onboard sensors. 

State estimation is the process through which a robot determines its state, based either on direct sensor measurements or on inferences drawn from a combination of different sensors. A robot’s state describes its status at any particular instant in time. It defines both where the robot is in the world — its position, orientation, linear and angular velocities, and so forth — and its internal characteristics, such as its battery levels and system wear-and-tear. State estimation is sometimes referred to as sensor fusion, since it requires fusing data from multiple sensors together in a consistent manner. State estimation allows a robot to understand where it is relative to the world around it.
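
To make that concrete, here is a minimal sketch of the idea in Python: a one-dimensional Kalman-style update that fuses a predicted position with a noisy sensor measurement. This is an illustration of the general technique rather than Shield AI’s implementation, and the sensor names and noise values are assumptions chosen for the example.

```python
def kalman_update(x_est, p_est, z, r):
    """Fuse a prior estimate (mean x_est, variance p_est) with a
    measurement z whose noise variance is r."""
    k = p_est / (p_est + r)          # Kalman gain: how much to trust z
    x_new = x_est + k * (z - x_est)  # blend prediction and observation
    p_new = (1.0 - k) * p_est        # fused estimate is more certain
    return x_new, p_new

# Illustrative values: fuse an inertial position prediction with a GPS fix.
x, p = 10.0, 4.0   # predicted position (m) and its variance
z, r = 10.8, 1.0   # GPS measurement (m) and its noise variance
x, p = kalman_update(x, p, z, r)
print(f"fused position: {x:.2f} m, variance: {p:.2f}")
```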

Mapping refers to the robot’s ability to create a model of the world around it. There are many different ways of representing the environment, and mapping is the challenge of taking sensor observations and transforming them into a consistent environment representation. From an algorithmic perspective, it allows the robot to build an environment state model. The problem of mapping can be approached in a variety of ways. One is a grid-based approach, which discretely cuts the world up into small blocks and models whether or not some kind of matter occupies a particular block’s location. Another, more sophisticated, representation may treat the world as continuous and build a consistent point cloud from the observation samples the system takes.
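
The grid-based approach can be sketched in a few lines of Python. The following toy occupancy grid uses a standard log-odds update; the cell size, update step, and sensor reading are illustrative assumptions rather than values from any real system.

```python
import math

class OccupancyGrid:
    """Toy 2-D occupancy grid: the world is cut into square cells, each
    holding a log-odds estimate of whether matter occupies that cell."""
    def __init__(self, width, height, resolution=0.5):
        self.res = resolution
        self.cols = int(width / resolution)
        self.rows = int(height / resolution)
        self.logodds = [[0.0] * self.cols for _ in range(self.rows)]

    def update(self, x, y, hit, step=0.9):
        """Shift a cell's log-odds toward occupied (hit) or free (miss)."""
        c, r = int(x / self.res), int(y / self.res)
        if 0 <= r < self.rows and 0 <= c < self.cols:
            self.logodds[r][c] += step if hit else -step

    def occupancy(self, x, y):
        """Convert a cell's log-odds back to a probability of occupancy."""
        c, r = int(x / self.res), int(y / self.res)
        return 1.0 - 1.0 / (1.0 + math.exp(self.logodds[r][c]))

# Illustrative use: a range sensor twice reports an obstacle at (3.2, 4.7).
grid = OccupancyGrid(width=10.0, height=10.0)
grid.update(3.2, 4.7, hit=True)
grid.update(3.2, 4.7, hit=True)   # repeated hits increase confidence
print(f"P(occupied) = {grid.occupancy(3.2, 4.7):.2f}")
```

Point-cloud representations avoid this discretization by keeping the raw sample locations themselves, at the cost of more storage and processing.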

Localization refers to the robot’s ability to determine where it is relative to that environment model. Often you’ll hear people talk about localization and mapping as one problem (commonly called SLAM, for simultaneous localization and mapping) because they are done at the same time. The robot builds its map as it moves through the environment while figuring out where it is relative to the map it’s building.
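
As a toy illustration of localizing against a known map (a sketch, not a production algorithm), the Python below scores candidate poses along a one-dimensional corridor by how well each explains a range measurement to the nearest landmark. The landmark positions and noise model are invented for the example.

```python
import math

# Hypothetical 1-D map: positions of known landmarks along a corridor (m).
LANDMARKS = [2.0, 5.5, 9.0]

def likelihood(pose, measured_range, sigma=0.3):
    """How well a candidate pose explains the measured distance to the
    nearest landmark, under Gaussian sensor noise."""
    expected = min(abs(lm - pose) for lm in LANDMARKS)
    return math.exp(-0.5 * ((measured_range - expected) / sigma) ** 2)

# The robot measures the nearest landmark at 0.5 m; score candidate poses
# along the corridor and pick the most consistent one.
candidates = [i * 0.1 for i in range(100)]
best = max(candidates, key=lambda p: likelihood(p, 0.5))
print(f"most likely pose: {best:.1f} m")
```

Note that several poses can explain a single measurement equally well; accumulating observations over time, as the map is built, is what resolves that ambiguity.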

How do a robot’s sensors impact its perception?

A robot’s sensors provide it with information that allows the robot to model the world and its position within it. The fidelity of that model is directly contingent on the fidelity of the information. 

Think about how we, as people, navigate through environments. We use our sensors — eyes, ears, the sense of touch — to move through that world. And if a particular sense is performing poorly, our ability to navigate might be diminished. For instance, you might stub your toe at night because it is difficult to see. Because of this, we rely on all of our senses to provide different sources of information in order to build up an environmental model. This allows us to de-prioritize low-fidelity sensor observations in favor of other sources of information that provide higher fidelity.

In robotics, we borrow from the field of information theory to reason about these concepts. The quality of a robot’s environment model and state estimate is a function of the type and quality of sensors the robot is equipped with and its processing capabilities. The more resources the system has available to it, the better its ability to represent its state. For instance, with higher quality sensors, the robot will gather more accurate sensor data, enabling it to build increasingly accurate, high-fidelity models of the world.
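
One way to see this quantitatively is the standard Gaussian-fusion result sketched below (with made-up numbers): as a sensor’s noise variance drops, the variance of the fused estimate shrinks faster, which is exactly the higher-fidelity model a better sensor buys you.

```python
# Posterior variance when fusing a prior estimate with one measurement
# (standard Gaussian fusion): better sensors shrink uncertainty faster.
def fused_variance(prior_var, sensor_var):
    return (prior_var * sensor_var) / (prior_var + sensor_var)

prior = 4.0  # variance of the robot's current position estimate (m^2)
for sensor_var in (4.0, 1.0, 0.25):  # decreasing noise = higher quality
    print(f"sensor var {sensor_var:4.2f} -> fused var "
          f"{fused_variance(prior, sensor_var):.2f}")
```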

What impact do environmental characteristics have on perception?

Similar to sensor quality and fidelity, environmental characteristics influence the information that a robot can perceive. It’s not hard to imagine that a robot that depends heavily on visual perception will have a tough time in darkness, smoke, or fog.

Another example is clutter. It is easier to find a pen on a table that has just a few things on it than on a table piled high with many different items. It’s easier to detect an Arctic fox in green grass than in a snowy landscape. The factors that affect our ability to disambiguate and discriminate given different levels of information have a similar effect on a robot’s perceptual acuity.

Want To Learn More?

Get in touch with the Shield AI team today.