Stanford EE259 I Ultrasonic sensor (Sonar) principle of operation & architecture I 2023 I Lecture 8

23 Jan 2024

The video begins with a review of the previous lecture on gyroscopes: how their internal components are constructed, the primary (drive) axis and the sense axis, the fact that they measure Coriolis acceleration, and the factors that determine sensitivity.

It then turns to inertial navigation. The problem is set up in a global reference frame, and the goal is to find the robot's orientation relative to that frame. Orientation tracking is formulated using direction cosines; because the body frame is time-varying, the orientation is propagated from gyroscope angular-velocity measurements via the matrix exponential formula. Gyroscope bias and noise introduce errors that accumulate over time and limit orientation tracking.

Position tracking follows: accelerometer measurements are transformed from the body frame to the global frame, the gravity component is subtracted, and the result is integrated once to obtain velocity and again to obtain position. Bias and noise again cause drift, which motivates sensor fusion with other sensors such as GPS and visual odometry.

The video concludes with mapping: offline versus online mapping, high-definition versus standard-definition maps, and live maps such as point clouds and 3D maps. Simultaneous localization and mapping (SLAM) accumulates frames over time to generate dense live maps.
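The gyroscope-based orientation update described above can be sketched in a few lines. This is a minimal illustration, not the lecture's code: `skew` and `update_orientation` are hypothetical helper names, and the matrix exponential of the skew-symmetric angular-velocity matrix is evaluated in closed form via Rodrigues' rotation formula.

```python
import numpy as np

def skew(w):
    """Skew-symmetric matrix [w]_x such that [w]_x @ v == np.cross(w, v)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def update_orientation(R, omega, dt):
    """Propagate a body-to-global rotation matrix R over one time step dt,
    given the gyroscope angular-velocity reading omega (rad/s, body frame).
    Uses exp([omega*dt]_x) computed with Rodrigues' formula."""
    theta = np.linalg.norm(omega) * dt
    if theta < 1e-12:
        return R  # negligible rotation this step
    K = skew(omega / np.linalg.norm(omega))  # unit-axis skew matrix
    dR = np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)
    return R @ dR  # right-multiply: increment is expressed in the body frame
```

In practice the step rotation `dR` is applied at every gyroscope sample; bias and noise in `omega` make the integrated orientation drift, which is why the lecture emphasizes error accumulation.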
The video briefly mentions other sensors used for mapping, such as sonar, radar, and cameras, and ends with a preview of the next topic: sonar.
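The position-tracking pipeline summarized above (rotate body-frame acceleration into the global frame, subtract gravity, integrate twice) can likewise be sketched. This is a simplified illustration under stated assumptions: `dead_reckon` is a hypothetical name, the accelerometer is assumed to report specific force in the body frame, and plain Euler integration is used.

```python
import numpy as np

def dead_reckon(rotations, accels_body, dt):
    """Dead-reckon global velocity and position from accelerometer samples.
    rotations[k]   : 3x3 body-to-global rotation matrix at step k
    accels_body[k] : specific-force reading in the body frame (m/s^2)
    Returns arrays of per-step global velocity and position."""
    g_up = np.array([0.0, 0.0, 9.81])  # assumed gravity along global +z
    v = np.zeros(3)
    p = np.zeros(3)
    vs, ps = [], []
    for R, f in zip(rotations, accels_body):
        a = R @ f - g_up      # transform to global frame, remove gravity
        v = v + a * dt        # integrate once: velocity
        p = p + v * dt        # integrate again: position
        vs.append(v.copy())
        ps.append(p.copy())
    return np.array(vs), np.array(ps)
```

Because acceleration errors are integrated twice, bias and noise produce position drift that grows rapidly with time, which is why the lecture turns to fusing in GPS and visual odometry.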
