Research

UAV-based Multispectral Sensing Solution (UMS)

Measuring soil health indicators (SHIs), particularly soil total nitrogen (TN), is an important and challenging task that affects farmers' decisions on the timing, placement, and quantity of fertilizers applied in their farms. Most existing methods to measure SHIs are in-lab wet-chemistry or spectroscopy-based methods, which require significant human input and effort, are time-consuming and costly, and are low-throughput in nature. To address this challenge, we develop an artificial intelligence (AI)-driven, near real-time, unmanned aerial vehicle (UAV)-based multispectral sensing solution (UMS) to estimate soil TN in an agricultural farm. TN is an important macro-nutrient and SHI that directly affects crop health. Accurate prediction of soil TN can significantly increase crop yield through informed decision making on the timing of seed planting and the quantity and timing of fertilizer application. The ground-truth data required to train the AI approaches is generated via laser-induced breakdown spectroscopy (LIBS), which can be readily used to characterize soil samples, providing rapid chemical analysis of the samples and their constituents (e.g., nitrogen, potassium, phosphorus, calcium). Although LIBS has previously been applied to soil nutrient detection, there is no existing study on the integration of LIBS with UAV multispectral imaging and AI. We train two machine learning (ML) models, multi-layer perceptron regression and support vector regression, to predict soil nitrogen using a suite of data classes: multispectral characteristics of the soil and crops in the red (R), near-infrared (NIR), and green (G) spectral bands, computed vegetation indices such as the normalized difference vegetation index (NDVI), and environmental variables including air temperature and relative humidity (RH).
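One of the computed inputs named above, NDVI, is derived directly from the R and NIR bands as (NIR − R) / (NIR + R). A minimal sketch of that computation over per-pixel reflectance arrays (the array names and values here are illustrative, not from the actual UAV imagery pipeline):

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - R) / (NIR + R).

    Inputs are assumed to be per-pixel reflectance arrays extracted
    from the multispectral imagery; shapes and values are illustrative.
    """
    nir = nir.astype(float)
    red = red.astype(float)
    denom = nir + red
    # Guard against division by zero on pixels with no signal.
    return np.where(denom == 0, 0.0, (nir - red) / denom)

# Example: a 2x2 patch of reflectances.
nir_patch = np.array([[0.6, 0.5], [0.4, 0.3]])
red_patch = np.array([[0.1, 0.2], [0.2, 0.3]])
print(ndvi(nir_patch, red_patch))
```

Per-pixel NDVI maps like this can then be aggregated (e.g., averaged per plot) into the feature vectors fed to the ML models.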
To generate the ground-truth (training) data for the machine learning models, we determine the N spectrum of soil samples collected from a farm using LIBS and develop a calibration model from the correlation between the actual TN of the soil samples and the maximum intensity of the N spectrum. In addition, we extract features from the multispectral images captured while the UAV follows an autonomous flight plan at different growth stages of the crops. The ML models' performance is tested on a fixed configuration space for the hyper-parameters using various hyper-parameter optimization (HPO) techniques at three different wavelengths of the N spectrum.
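The training setup above (SVR and MLP regressors tuned over a fixed hyper-parameter configuration space) can be sketched with scikit-learn. This is an illustrative skeleton only: the feature matrix is synthetic, and the hyper-parameter grids stand in for the actual configuration space used in the study.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Synthetic stand-in for the real dataset: each row is a feature vector
# [R, NIR, G, NDVI, air temperature, RH]; targets mimic LIBS-derived TN.
X = rng.uniform(size=(120, 6))
y = X @ rng.uniform(size=6) + 0.05 * rng.normal(size=120)

# Grid search over a fixed hyper-parameter space; the specific grids
# below are illustrative, not those from the study.
svr = GridSearchCV(
    make_pipeline(StandardScaler(), SVR()),
    {"svr__C": [0.1, 1, 10], "svr__epsilon": [0.01, 0.1]},
    cv=5,
)
mlp = GridSearchCV(
    make_pipeline(StandardScaler(), MLPRegressor(max_iter=2000, random_state=0)),
    {"mlpregressor__hidden_layer_sizes": [(16,), (32, 16)]},
    cv=5,
)
for name, model in [("SVR", svr), ("MLP", mlp)]:
    model.fit(X, y)
    print(name, "best CV R^2:", round(model.best_score_, 3))
```

In the actual pipeline, the same search would be repeated for each of the three N-spectrum wavelengths used to produce the LIBS-calibrated targets.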

[GitHub]

Automated Medical Assistant (AMA)

AMA piloting test at BSMMU

This paper introduces a mobile robot designed to assist medical professionals in hospitals with simple daily tasks, allowing staff in turn to optimize their resources. A prototype of this robot was deployed at Bangabandhu Sheikh Mujib Medical University in Bangladesh with funding from the Government of Bangladesh, USAID, and UNDP. The robot performed its functions as an Automated Medical Assistant (AMA) by delivering items to and interacting with patients and medical staff during its trial phase in 2018. The paper presents the robot's intuitive user interface, user-friendly mode of communication, smart command-and-control features, security, and energy optimization. The robot can navigate to specific patients, avoid obstacles, and return to its charging station once its battery is depleted.
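The energy-optimization behavior described above (serving patients until the battery runs low, then returning to the charging station) can be sketched as a small state machine. All mode names and thresholds here are hypothetical, for illustration only, and are not taken from the deployed system.

```python
from enum import Enum, auto

class Mode(Enum):
    SERVE = auto()    # delivering to / interacting with patients
    RETURN = auto()   # heading back to the charging station
    CHARGE = auto()   # docked and charging

LOW_BATTERY = 0.20    # hypothetical low-battery threshold (20%)
FULL_BATTERY = 0.95   # hypothetical "charged enough" threshold

def next_mode(mode: Mode, battery: float, at_dock: bool) -> Mode:
    """Decide the robot's next mode from battery level and dock status."""
    if mode is Mode.SERVE and battery <= LOW_BATTERY:
        return Mode.RETURN
    if mode is Mode.RETURN and at_dock:
        return Mode.CHARGE
    if mode is Mode.CHARGE and battery >= FULL_BATTERY:
        return Mode.SERVE
    return mode

print(next_mode(Mode.SERVE, 0.15, at_dock=False))  # → Mode.RETURN
```

A real implementation would also fold in navigation goals and task queues, but the three-mode loop captures the charging behavior the paper describes.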

[GitHub]

Eye-Arm Interaction

This project uses the emerging technology of eye tracking to give people with limited mobility more options. The designers' intention was to implement a prototype and test its feasibility for helping poor and physically challenged people in Bangladesh. The working prototype was designed in two versions. The first interfaces video-based eye-tracking technology with a laptop so that a bedridden patient can control electrical appliances such as a robotic arm. The second version controls the same robotic arm with a standalone head unit, giving the user a greater degree of freedom of movement. Both versions were developed to help people with disabilities and improve their quality of life.
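At its core, an interface like this maps where the user is looking on screen to a command for the arm. A minimal sketch of such a mapping, assuming a hypothetical four-quadrant screen layout (the command names and layout are illustrative, not the prototype's actual interface):

```python
def gaze_to_command(x: float, y: float) -> str:
    """Map a normalized gaze point (x, y in 0..1) to an arm command.

    The screen is split into four quadrants, each bound to one
    hypothetical command; a real eye-tracking UI would typically also
    require a dwell time before triggering, to avoid accidental input.
    """
    if x < 0.5:
        return "arm_left" if y < 0.5 else "grip_open"
    return "arm_right" if y < 0.5 else "grip_close"

print(gaze_to_command(0.2, 0.3))  # → arm_left
print(gaze_to_command(0.8, 0.9))  # → grip_close
```

The same mapping works for both versions of the prototype; only the source of the (x, y) gaze estimate changes between the laptop-based tracker and the standalone head unit.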

[PDF]

ARC-71

ARC-71 rover at Mars research desert station, Hanksville, Utah.

ARC-71 is a rover designed by 17 undergraduate students of American International University-Bangladesh, a team called the AIUB Robotic Crew (ARC). On 21 March 2016, under the supervision of Asst. Prof. Ebad Zahir, the 17 minds achieved exactly what they had dreamed of for over a year: the chance to compete with the top 30 university teams from all over the world. I led the Hardware and Software Interface team.

ARC is the first team from Bangladesh to qualify for the semifinal stage at its very first attempt. For ARC, the design combined the rocker-bogie V-suspension with a double-wishbone suspension system to give the rover maximum stability while traversing rough terrain. The performance of the rover's arm, as well as communication between the rover and the base station, are both complex when different tasks must be performed remotely. All components were designed and all systems developed according to the URC 2016 guidelines. The rover arm is designed to easily lift objects weighing up to 7 kg and up to 5 cm thick. Along with the gripper, an excavator bucket was built to collect samples from more than 5 cm deep and deliver them to the "Microlab" box, where sensors and devices are placed. For communication, the 2.4 GHz frequency band is used to maintain a link between the rover and the base station over distances of up to 1 km.

[PDF]