Sensing and perception
This module covers commonly used sensors and touches upon perception in the form of computer vision and deep learning.
Objectives:
- Account for what is measured by the sensors listed below
- Explain the basic measurement principles for the sensors
- Discuss the pros and cons with these sensors and what they can be used for
- Use a pinhole camera model to map a point in the world to a point in the image
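The last objective can be sketched in a few lines of code. This is a minimal illustration, not course code: the focal lengths and principal point below are made-up values, and the point is assumed to already be expressed in the camera frame (Z pointing forward).

```python
# Minimal pinhole projection sketch (all numbers illustrative).
# A 3D point (X, Y, Z) in the camera frame projects to pixel (u, v) via
#   u = fx * X / Z + cx,   v = fy * Y / Z + cy

def project(point_cam, fx, fy, cx, cy):
    """Project a 3D point in the camera frame to pixel coordinates."""
    X, Y, Z = point_cam
    return fx * X / Z + cx, fy * Y / Z + cy

# A point 2 m in front of the camera, 0.5 m to the right, 0.2 m up:
u, v = project((0.5, -0.2, 2.0), fx=800, fy=800, cx=320, cy=240)
print(u, v)  # -> 520.0 160.0
```

Note that depth is lost in the division by Z: all points on the same ray map to the same pixel, which is why a single image cannot recover 3D position on its own.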
Lecture material
Data from the sensors in a mobile phone moving on a table. The first one when the phone was moving back and forth and the second when the phone was moving in a circle.
- sensorLog_20230920_bf.txt
- sensorLog_20230920_circ.txt
Pre-lecture material
- https://en.wikipedia.org/wiki/Accelerometer (first section)
- Gyroscopes are used in robotics to estimate rotation. These days MEMS versions are used. Read the section on applications here
- How MEMS Accelerometer, Gyroscope and Magnetometer Work (First 3 minutes)
- Read this about LiDAR (The GPS and IMU that they say are part of a LiDAR are there to handle motion).
- RGB-D using structured light explained (2min)
- Time of flight camera
In the following SHoR refers to the Springer Handbook of Robotics 2nd edition (and 1st edition in parenthesis).
Sensing
The main learning outcome in this part of the module is to get a basic overview of the types of sensors that are available, what they measure, how they work, what they are typically used for, and their pros and cons.
There are many ways to cluster different types of sensors. Three examples are:
- Proprioceptive vs Exteroceptive, i.e. measuring internal or external states
- Ex: wheel encoder and camera.
- Active and Passive
- Ex: Camera with a flash vs camera without a flash
- Electromagnetic radiation sensors, Inertial sensors and Environmental & contact sensors
Study Table 4.1 in SHoR to get an overview (5.1 in 1st edition)
Proprioceptive sensors
- IMU (SHoR 29.2-4 (20.2-4))
- Gyro (SHoR 29.2 (20.2))
- What does it measure?
- Limitations and challenges?
- Accelerometer (SHoR 29.3 (20.3))
- What does it measure?
- Limitations and challenges?
- Compass/Magnetometer (SHoR 52.5.1 (43.2.2))
- What does it measure?
- Limitations and challenges?
- IMU (SHoR 29.4 (20.4))
- Main components
- Rough idea of how it works
- Importance of subtracting gravity accurately (what happens to the estimated position and velocity in Fig 29.7 (20.7) if the orientation is not correct?)
- Encoders
- Used for?
- Different principles: Optical, Magnetic, potentiometers (resistance)
- Incremental vs absolute
- https://en.wikipedia.org/wiki/Rotary_encoder.
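The question above about subtracting gravity accurately (Fig 29.7) can be made concrete with a toy calculation. This is an illustrative sketch, not anything from SHoR: it assumes a constant 1-degree orientation error, so a slice of gravity leaks into the "horizontal" accelerometer axis and is then double-integrated.

```python
import math

# Why gravity must be subtracted accurately before integrating
# accelerometer data: a small orientation error leaks part of gravity
# into the horizontal axis, and that residual acceleration integrates
# into large velocity and position errors. All numbers are illustrative.

G = 9.81                                  # gravity, m/s^2
tilt_error = math.radians(1.0)            # assumed 1-degree orientation error
residual_acc = G * math.sin(tilt_error)   # ~0.17 m/s^2 of leaked gravity

dt, T = 0.01, 10.0
vel = pos = 0.0
for _ in range(int(T / dt)):
    vel += residual_acc * dt   # integrate acceleration -> velocity drift
    pos += vel * dt            # integrate velocity -> position drift

print(f"after {T:.0f} s: velocity error {vel:.2f} m/s, position error {pos:.1f} m")
```

Even this tiny orientation error produces a velocity error of about 1.7 m/s and a position error of over 8 m after only ten seconds, which is why pure IMU dead reckoning drifts so quickly.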
Exteroceptive sensors
Exteroceptive sensors tell us about the surrounding world. Often we use them to measure some quantity z in order to estimate some other quantity x. For example, we measure z = an image and want to estimate x = a position.
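A minimal example of this z-to-x pattern, using the basic sonar principle (an assumed speed of sound of 343 m/s in 20 °C air; the function name is just for illustration): we measure the echo round-trip time z and estimate the range x.

```python
# Measure z (echo round-trip time, seconds), estimate x (range, metres).
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C

def range_from_echo(t_round_trip):
    # The pulse travels to the object and back, hence the factor 1/2.
    return SPEED_OF_SOUND * t_round_trip / 2.0

print(range_from_echo(0.01))  # a 10 ms echo -> 1.715 m
```

Note that the estimate inherits the model's assumptions: if the actual speed of sound differs (temperature, humidity), the estimated x is biased even when z is measured perfectly.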
- LIDAR (SHoR 32.2.2 (22.1.3))
- The two basic principles of operation?
- Single beam versus multi beam
- What an autonomous car sees with its LIDAR (1min)
- OPTIONAL
- Using LIDAR to oversee machines
- Using LIDAR for safe motion in human environments
- The TED talk by Chris Urmson from Google about how a self-driving car sees the world is interesting. Don't miss the duck at 11min10sec!
- Sonar (SHoR 30.1 (21.1))
- Basic principle?
- Basic principle (4min)
- Limitations?
- Use cases?
- Examples: Side view assist (2min) and Parking assistant (2min)
- Multi-beam sonar simulation (1min)
- Examples: Medical imaging using sonar (2min) and Some more sonar basics (2min) and a longer medical sonar video (7min)
- Camera
- Pinhole camera model (Fig 5.9 (4.9); note that Fig 5.10 (4.10) is from Teknikringen 14, KTH)
- What are intrinsic and extrinsic camera parameters?
- Stereo-vision (SHoR 31.2.4 (22.1.2))
- https://andor.oxinst.com/learning/view/article/rolling-and-global-shutter
- Video showing the distortion from a rolling shutter
- Extra:
- Example: Multi-camera car system (5min)
- RGB-D camera (SHoR 65.2, paragraph "Structured-Light Distance Sensors")
- Radar
- Basic principle by James May (5min)
- Example:
- Radar looking at a street (2min)
- Extra
- Stealth technology (4min)
- Air traffic control (4min)
- Examples for adaptive cruise control: Honda (3min) and Bosch (4min)
- Barometer / Pressure altimeter
- Measures pressure
- Can be used to estimate altitude (height), often used in drones
- Note that the pressure can change for many reasons other than a change in altitude, for example opening a door or a window
- Typical accuracy of consumer grade barometric sensors (1min)
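The pressure-to-altitude step above can be sketched with the standard-atmosphere (ISA) barometric formula. This is a textbook model, not sensor-specific code; the constants are the usual ISA sea-level values, and real barometers must additionally be referenced to the local sea-level pressure.

```python
def pressure_altitude(p_pa, p0_pa=101325.0):
    """Altitude (m) above the reference pressure p0, ISA barometric formula."""
    T0 = 288.15      # sea-level standard temperature, K
    L = 0.0065       # temperature lapse rate, K/m
    R = 8.31447      # universal gas constant, J/(mol K)
    g = 9.80665      # gravity, m/s^2
    M = 0.0289644    # molar mass of dry air, kg/mol
    return (T0 / L) * (1.0 - (p_pa / p0_pa) ** (R * L / (g * M)))

print(pressure_altitude(100000.0))  # a ~1.3 kPa drop -> roughly 111 m
```

Near sea level this works out to roughly 8 m of altitude per 100 Pa, which shows why the door-and-window effect mentioned above matters: a pressure blip of a few tens of pascals already looks like metres of altitude change.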
Perception
Perception is about making sense of the sensor signals. This is a huge area of research. Speech recognition and computer vision are examples of subdomains of perception.
Computer vision
Computer vision deals with the problem of making computers "see". Common tasks include recognising objects and people. It is a vast area and we will not have time in this course to dig deeper into it.
Deep learning for perception
Deep learning is quickly becoming a hammer that should be in every engineer's toolkit. You could argue that deep learning is not well understood and that it is a black box. However, perception has taken huge leaps in the past 5-6 years because of it. If you have time, there is a massive number of tutorials out there.