What is Obstacle Avoidance?

Obstacle avoidance is a safety mechanism that allows a UAV to detect obstacles along its path and steer around them. The term ‘Obstacle Avoidance’ does not signify a single technology. Instead, it is a broad term for multiple complex technologies employed in unison to prevent drone collisions.

Objective of Obstacle Avoidance

Drones are generally fragile and susceptible to damage from collisions, whether caused by pilot error or by autonomous flight. Collisions often result in costly repairs that last only until the next crash. Obstacle avoidance is implemented mainly to cut these unnecessary costs and to avoid relying on manual supervision alone for safe flight.

Drones avoid obstacles in different ways. Most simply notify the pilot, who steers the drone away. Some hover near the obstacle until the pilot gives further instructions, while others identify the obstacle and move sideways or upwards until the path ahead is clear. Advanced drones compute the best route around the obstacle on their own.

Mechanism of Obstacle Avoidance Drones

Obstacle avoidance drones embed several technologies that work together to keep the flight free of collisions.

An Obstacle Avoidance Algorithm is employed: a carefully written set of rules that processes the sensor readings in order to detect and avoid obstacles effectively.

Simultaneous Localization and Mapping (abbreviated ‘SLAM’) builds a map of the surroundings on the fly while simultaneously estimating the drone’s position within it, so the drone can orient itself even without a pre-existing map.

Sensor Fusion combines readings from multiple physical sensors into a single estimate that is more accurate than any individual sensor can provide.
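
As an illustration of the idea (a minimal sketch in Python, not any particular drone’s filter), a complementary filter blends a gyroscope’s fast but drifting rate with an accelerometer’s noisy but drift-free angle into one pitch estimate:

```python
# A minimal sketch of sensor fusion (not any specific drone's filter): a
# complementary filter blending a gyroscope's fast-but-drifting rate with an
# accelerometer's noisy-but-drift-free angle into a single pitch estimate.

def complementary_filter(pitch_prev, gyro_rate, accel_pitch, dt, alpha=0.98):
    """Fuse gyro and accelerometer readings into one pitch estimate (radians)."""
    gyro_pitch = pitch_prev + gyro_rate * dt                # integrate angular rate
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch   # blend the two sources

# Example: start level; gyro reports 0.1 rad/s, accelerometer reads 0.004 rad.
pitch = 0.0
pitch = complementary_filter(pitch, gyro_rate=0.1, accel_pitch=0.004, dt=0.01)
print(pitch)  # a value between the two sources, dominated by the gyro term
```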

After receiving information from the sensors, the Flight Controller instructs the drone to hover, or to fly sideways or upwards, to avoid the obstacle.
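
A deliberately simplified, hypothetical version of that decision might look like the following; real flight controllers are far more sophisticated, and the safe-distance threshold here is assumed purely for illustration:

```python
# A hypothetical, simplified version of the decision a flight controller makes
# once fused sensor data arrives (real controllers are far more sophisticated).

SAFE_DISTANCE = 2.0  # metres, an assumed threshold for this sketch

def choose_maneuver(dist_ahead, dist_left, dist_right, dist_up):
    """Return a simple avoidance command based on clearance in each direction."""
    if dist_ahead > SAFE_DISTANCE:
        return "continue"                      # path ahead is clear
    if max(dist_left, dist_right) > SAFE_DISTANCE:
        return "fly_left" if dist_left >= dist_right else "fly_right"
    if dist_up > SAFE_DISTANCE:
        return "fly_up"                        # climb over the obstacle
    return "hover"                             # wait for pilot input

print(choose_maneuver(1.2, 3.5, 0.8, 4.0))     # -> "fly_left"
```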

Types of Obstacle Avoidance Sensors

Most drones mount Ultrasonic Sensors on the underside. Just as bats and dolphins navigate using ultrasonic waves, drones equipped with ultrasonic sensors work in the same manner: each sensor has two openings, one that sends out high-frequency sound and another that receives the returning echo.

LiDAR, short for Light Detection and Ranging, has a simple working mechanism: it fires a laser pulse at the obstacle and measures the time the reflection takes to return to the source. LiDAR works incredibly fast and computes distances with high precision. It is also used extensively in oceanography.

For a better understanding of this technology, click here.
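
Both ultrasonic and LiDAR ranging rest on the same time-of-flight arithmetic: the wave travels to the obstacle and back, so the one-way distance is the wave speed multiplied by half the elapsed time. A minimal sketch, with nominal (not sensor-specific) speeds:

```python
# Both ultrasonic and LiDAR ranging use the same time-of-flight arithmetic:
# the wave travels to the obstacle and back, so the one-way distance is
# wave speed * elapsed time / 2. The speeds below are nominal values.

SPEED_OF_SOUND = 343.0        # m/s in air at about 20 degrees C (ultrasonic)
SPEED_OF_LIGHT = 299_792_458  # m/s (LiDAR)

def distance_from_echo(elapsed_s, wave_speed):
    """Convert a round-trip echo time (seconds) into a one-way distance (metres)."""
    return wave_speed * elapsed_s / 2

print(distance_from_echo(0.01, SPEED_OF_SOUND))   # ~1.7 m from a 10 ms ultrasonic echo
print(distance_from_echo(20e-9, SPEED_OF_LIGHT))  # ~3.0 m from a 20 ns LiDAR return
```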

Infrared is a comparatively inexpensive yet effective method of obstacle detection. The working principle is based on how infrared waves dissipate with distance: if the source emits infrared waves and there is no obstacle ahead, the waves simply weaken and fade; if there is an obstacle, the reflected beam returns to the receiver. Such an avoidance system consists of an infrared transmitter, an infrared receiver and a potentiometer that sets the detection threshold.
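
A hypothetical sketch of how such a module behaves (the pin numbers and the read_adc helper are illustrative placeholders, not a real driver): the receiver’s reading rises when the emitted beam is reflected back, and the potentiometer sets the level at which an obstacle is reported.

```python
# Hypothetical sketch of an IR obstacle module: the receiver's analog reading
# rises when the emitted beam is reflected back, and the potentiometer sets
# the threshold at which "obstacle" is reported. Pins and read_adc() are
# illustrative placeholders, not a real driver.

IR_RECEIVER_PIN = 0     # illustrative analog pin wired to the IR receiver
POT_THRESHOLD_PIN = 1   # illustrative analog pin wired to the potentiometer

def read_adc(pin):
    """Placeholder for an analog read (0-1023); real code would talk to hardware."""
    fake_readings = {IR_RECEIVER_PIN: 612, POT_THRESHOLD_PIN: 400}
    return fake_readings[pin]

def obstacle_detected():
    reflected = read_adc(IR_RECEIVER_PIN)    # strength of the returned IR beam
    threshold = read_adc(POT_THRESHOLD_PIN)  # sensitivity set by the potentiometer
    return reflected > threshold             # strong reflection -> obstacle ahead

print(obstacle_detected())  # True: the emitted beam came back above the threshold
```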

Drones using Monocular Vision technology are often the most budget-friendly. This technology reconstructs a 3-D model of the surroundings from images captured by a single camera.


Image depicting a Monocular Vision based obstacle avoidance system (distance measurement and self-localization)
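
As a hedged illustration of how a single camera can recover distance at all, the pinhole-camera relation estimates the range to an object of known real-world size; the numbers below are made up, and full 3-D reconstruction pipelines (e.g. structure from motion) are far more involved.

```python
# A hedged illustration of one way a single camera can recover distance: the
# pinhole-camera relation, applied to an object of known real-world size.
# The focal length and sizes here are made-up example values.

def monocular_distance(focal_px, real_height_m, pixel_height):
    """Estimate the distance to an object of known height via the pinhole model."""
    return focal_px * real_height_m / pixel_height

# A 1.7 m tall person occupying 170 pixels, with an 800-pixel focal length:
print(monocular_distance(800, 1.7, 170))  # -> 8.0 metres away
```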

Recent years have seen flash floods and devastation caused by natural catastrophes, and it becomes difficult for rescue parties to search the affected areas thoroughly. This is where intelligent aerial vehicles come into action. Inaccessible areas also tend to have unreliable GPS, which makes piloting and manual supervision ineffective. In these situations, drones have to identify their positions and detect and avoid obstacles autonomously.

Some of the best obstacle avoidance drones are the Skydio 2, Kespry 2, DJI Mavic Air, Walkera Vitus and DJI Matrice 200.

You can find the comprehensive list of their features here.

Conclusion

Vision-based obstacle avoidance drones might be costly, but they are a one-time investment. These drones can be used extensively in disaster-stricken areas thanks to the intelligent technology and obstacle avoidance algorithms programmed into them. They also provide more reliable autonomous flight and cut down the expenses incurred due to impacts.

The recent Inter-IIT Tech Meet witnessed an exceptional performance from IIT Kharagpur. One of the problem statements was to design a vision-based obstacle avoidance drone to navigate through disaster-stricken areas and help the rescue forces reach the affected people. There were three Gazebo worlds of increasing difficulty. You can skim through the statement here.

IIT Tech Ambit had the opportunity to interview Yash Soni, a 3rd year undergraduate at IIT Kharagpur and an essential contributor to the winning team. The team comprised nine members including Yash: Team Captain Praneet Jain (5th year), Mohit Singh (4th year), Manthan Patel (4th year), Archit Rungta (3rd year), Shreyansh Darshan (3rd year), Satwik Chappidi (2nd year), Jaskaran Singh Sodhi (2nd year) and Sambhaw Kumar (2nd year).

Despite the COVID-19 pandemic, the team managed to collaborate effectively. ‘We used to have sessions on Google Meet to share the progress and discuss our ideas’, Yash said.

When asked about the obstacle avoidance technologies commonly used today and which of them the team employed in their drone, Yash told us that the problem statement required them to use a depth camera along with a downward-facing RGB camera (a camera through which coloured images of people and objects are acquired). ‘We incorporated a vision-based planner which used the data from the depth camera to create a point cloud of its surroundings and search for a navigable region’, Yash told Tech Ambit. The team experimented with various planning techniques and some in-house algorithms; they selected the most effective one on the basis of the accuracy achieved as well as the time taken to complete the task.
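
To make the idea concrete (a hedged sketch, not the team’s actual planner): given a depth image, one can search for a window of pixels that is entirely farther away than a clearance threshold and treat it as a navigable region.

```python
# A hedged sketch (not the team's actual planner) of the idea described above:
# search a depth image for a window of pixels that is entirely farther away
# than a clearance threshold, i.e. a navigable region to steer toward.
import numpy as np

def find_navigable_window(depth, clearance=3.0, win=40):
    """Return (row, col) of the top-left corner of a win x win patch whose
    every depth reading exceeds `clearance` metres, or None if there is none."""
    rows, cols = depth.shape
    for r in range(0, rows - win, win):
        for c in range(0, cols - win, win):
            if np.all(depth[r:r + win, c:c + win] > clearance):
                return r, c
    return None

# Fake 480x640 depth frame: mostly 1.5 m away, with a clear corridor on the right.
frame = np.full((480, 640), 1.5)
frame[:, 500:] = 10.0
print(find_navigable_window(frame))  # -> (0, 520): steer toward that region
```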

‘We used the simulated drone in Gazebo, which is operated using ArduPilot SITL’, Yash said when asked about the sensors employed on the drone. The problem statement constrained which sensors could be utilized. ArduPilot SITL (Software In The Loop) provided the drone’s coordinates and state estimates from simulated sensors such as GPS and accelerometers, to name a few, augmented by a front-facing depth camera and a bottom-facing RGB camera.
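
One common way to read such state from an ArduPilot SITL vehicle is over MAVLink; the snippet below is a generic pymavlink example, not the team’s code, and the UDP endpoint is the usual SITL default, assumed here for illustration.

```python
# Generic pymavlink example of reading state from an ArduPilot SITL vehicle;
# not the team's code. The UDP endpoint is the common SITL default, assumed
# here for illustration.
from pymavlink import mavutil

# Connect to the simulated vehicle (SITL usually forwards MAVLink on UDP 14550).
master = mavutil.mavlink_connection('udp:127.0.0.1:14550')
master.wait_heartbeat()
print("Heartbeat from system", master.target_system)

# Read a fused position estimate (GPS + IMU) published by the autopilot.
msg = master.recv_match(type='GLOBAL_POSITION_INT', blocking=True)
print("lat:", msg.lat / 1e7, "lon:", msg.lon / 1e7,
      "alt:", msg.relative_alt / 1000, "m")
```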

The three worlds mentioned in the problem statement differed distinctly in complexity and obstacles, which made it difficult to design a generalized algorithm that would work in all three. They were, however, built using similar mesh material.

When we asked about technical difficulties encountered while designing the algorithm, Yash told us that issues pertaining to the provided environments were resolved through the Slack channel. Apart from this, the hardest challenge to manage was setting up the environment consistently across everyone’s systems for smooth collaboration. ‘This kept bugging us for some time as we occasionally faced issues regarding differing performance on different systems as we could not work on a centralized system’, Yash said.

The interview concluded with Yash thanking the Contingent team, which included Harsh Maheshwari, Krishnam Kapoor and Mayank Priyadarshi, for its efficient management: problems were quickly addressed and updates were briskly given. Although this was the first time the Inter-IIT was conducted online, it was a major success, and the aforementioned team made a major contribution to it.

IIT Tech Ambit earnestly congratulates the winning team and wishes them all the best for their future endeavors!