KS Nagla

National Institute of Technology Jalandhar
India

Title: Multi-sensor data fusion in mobile robots

Abstract

Over the last two decades, alongside industrial robots, considerable research has appeared on service mobile robots: robots serving humans, robots servicing equipment, and other service robots. Modern mobile robots are faster, lighter, more responsive, and equipped with multiple sensors; a suitable example is the recently tested driverless car. Such autonomous mobile robots are able to explore unknown and partially known environments and can work in complex surroundings. The addition of smart sensors and fast computing technology has given these robots considerable intelligence and speed, enabling them to navigate in static or dynamic environments. To complete complex tasks, mobile robots have to integrate different elements of knowledge, such as mechanical and electronic design, control algorithms, sensor-based perception, mapping of the environment, artificial intelligence, and path planning. Sensor-based perception and mapping of the environment is an emerging area of research in which sensors play a pivotal role. For mobile robot mapping, the fundamental requirement is the conversion of range information into a high-level internal representation. Several sensor modalities are commonly brought to bear on these tasks, such as vision sensors, laser range finders, and ultrasonic and infrared sensors. However, sensor failure, inaccuracy, noise, and slow response are the major causes of error in mobile robot mapping.

To achieve improved accuracy and reliability in mobile robot mapping, multi-sensor data fusion has been found to be an effective solution. Sensor data fusion offers robustness, economical perception through a dedicated processor, improved adaptability in worst-case scenarios (drift, sensor failure, etc.), and reduced environmental influence. It also facilitates real-time data analysis by combining N independent observations, and in some cases the parallel processing of fused data provides fast navigational decisions. Fusion has become an essential process for enhancing the capabilities of intelligent mobile robot systems.

In 2014 we developed a new sensor fusion architecture that makes the map more robust and reliable. The architecture consists of three main segments: a) pre-processing of sensory information, b) fusion of information from heterogeneous sensors, and c) post-processing of the map. Past experience shows that specular reflection of the sonar sensor is the fundamental source of error in mapping. To overcome this problem, a pre-processing stage for sonar information was developed in which a fuzzy logic algorithm discards specular readings. Implementing this fuzzy technique for sonar mapping increases the average performance of the resultant grid map by 6.6%, and removing specular reflections also reduces computation time. Qualitative comparisons show the improvement: the overall occupied and empty areas of the resultant map are very close to those of the reference map. Such techniques will be required for future autonomous mobile robots.
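The abstract does not detail how the fuzzy pre-processing stage decides which sonar readings are specular, so the following Python sketch is only an illustration of the general idea: fuzzy membership functions score each reading, and readings that look specular are discarded before mapping. The incidence-angle and near-maximum-range cues, the membership shapes, and all thresholds are assumptions made for illustration, not the authors' actual design.

```python
import math

def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function: rises a->b, flat b->c, falls c->d."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

def specularity(range_m, incidence_deg, max_range_m=5.0):
    """Fuzzy degree to which a sonar reading looks like a specular artefact.

    Heuristic (assumed, not from the talk): readings taken at a steep
    incidence angle to the surface, or that saturate near the sensor's
    maximum range, are likely produced by specular reflection.
    """
    steep_angle = trapezoid(incidence_deg, 30.0, 55.0, 90.0, 90.1)
    near_max = trapezoid(range_m, 0.8 * max_range_m, 0.95 * max_range_m,
                         max_range_m, max_range_m + 0.1)
    # Fuzzy OR (max): either cue alone is enough to flag the reading.
    return max(steep_angle, near_max)

def filter_scan(scan, threshold=0.6):
    """Keep only (range, incidence) readings below the specularity threshold."""
    return [(r, a) for r, a in scan if specularity(r, a) < threshold]

if __name__ == "__main__":
    scan = [(1.2, 10.0), (4.9, 20.0), (2.5, 70.0), (0.8, 5.0)]
    print(filter_scan(scan))  # the 4.9 m and 70-degree readings are dropped
```

In a full pipeline, the surviving readings would then be passed to the occupancy-grid update; only the rejection rule, not the mapping itself, is sketched here.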
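The fusion segment combines independent observations from heterogeneous sensors into a single map. A standard way to realize this kind of grid-map fusion is log-odds occupancy updating, in which independent observations simply add; the sketch below shows that generic technique under an assumed inverse sensor model (one occupancy probability per reading), and is not a description of the specific framework reported here.

```python
import math

def prob_to_logodds(p):
    return math.log(p / (1.0 - p))

def logodds_to_prob(l):
    return 1.0 - 1.0 / (1.0 + math.exp(l))

class OccupancyGrid:
    """Minimal log-odds occupancy grid fused from several independent sensors."""

    def __init__(self, width, height, prior=0.5):
        self.l0 = prob_to_logodds(prior)
        self.logodds = [[self.l0] * width for _ in range(height)]

    def update(self, x, y, p_occupied):
        """Fuse one sensor observation for cell (x, y).

        Because independent observations add in log-odds form, readings from
        sonar, laser, and infrared sensors can all be fused with the same
        rule; only p_occupied (the inverse sensor model) differs per sensor.
        """
        self.logodds[y][x] += prob_to_logodds(p_occupied) - self.l0

    def probability(self, x, y):
        return logodds_to_prob(self.logodds[y][x])

if __name__ == "__main__":
    grid = OccupancyGrid(10, 10)
    grid.update(3, 4, 0.7)   # sonar: probably occupied
    grid.update(3, 4, 0.9)   # laser: strongly agrees
    grid.update(3, 4, 0.6)   # infrared: weakly agrees
    print(round(grid.probability(3, 4), 3))  # well above the 0.5 prior
```

The additive form is what makes the "N independent observations" argument work: each new sensor reading shifts the cell's log-odds by its own evidence, regardless of which sensor produced it.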