
Review on the Real-time Implementation of IoT-enabled UAV in Precision Agriculture and the Overview of Collision Avoidance Strategies

Tamilselvan Ganesan1*, Niresh Jayarajan2, and P. Sureshkumar3

1Research Scholar, Department of Automobile Engineering, PSG College of Technology, Coimbatore 641004, India

2Assistant Professor, Department of Automobile Engineering, PSG College of Technology, Coimbatore 641004, India

3Assistant Professor, Department of Mechanical Engineering, JCT College of Engineering and Technology, Pichanur 641105, India

*Corresponding author: [email protected]

Drone-based monitoring is very convenient and effective in the modern era for monitoring large and dense areas that humans cannot monitor efficiently. Precision agriculture (PA) unmanned aerial vehicle (UAV) monitoring techniques make farmers better protected and more knowledgeable about their fields. Modern PA drones employ 5G-enabled Internet of Things (IoT) devices that capture field data and transmit it to the cloud with extremely low latency to enable quick decisions. Farmers can keep an eye on their farms from anywhere, and they have a choice of manual or automated methods for executing the proper data-driven actions. Smart farming is substantially more efficient than traditional farming. This article gives an in-depth analysis of the real-time deployment of UAV hardware, software, sensors, and IoT in agriculture for crop monitoring, weed identification, and collision avoidance. This research also covers the wide variety of collision avoidance algorithms utilized in both outdoor and indoor conditions.

Keywords: collision avoidance, IoT, hardware, precision agriculture, real-time implementation, UAV

INTRODUCTION

Agriculture plays a vital role in the economy of India.

Over 70% of the rural population depends on agriculture for its livelihood, and the Indian Ministry of Finance reported agricultural GDP growth of around 3.5% in 2022–2023 (Ministry of Finance n/d). India is the world’s second-largest producer of rice, wheat, pulses, milk, and other commodities. Agricultural development is highly significant for social and economic progress, since it raises production, strengthens the national economy, and benefits rural households. Unmanned aerial vehicles (UAVs) are the main

technology that is increasing agricultural output. The use of these UAVs allows for more effective field monitoring and data collection at a lower cost. UAVs are not a new technology; they were originally employed for military monitoring purposes in mountainous and densely covered regions (Tsouros et al. 2019). With advancements in hardware technology and image processing techniques, they now have applications in agriculture as well. Smart farming traces its origins to the British Agricultural Revolution, which brought major gains in yield through more efficient methods. A four-field approach and selective cross-breeding were used to increase crop size and production rate, upgrading the earlier three-field method; as a result, yield rates improved. The second agricultural revolution occurred in the year 1930, when mechanized agriculture became common and each farmer produced enough crops to feed around 25 people. Soil management and new agricultural technologies were also part of this revolution. The Green Revolution followed in the 1990s: genetically modified crops were developed during this period as a result of advances in agricultural science. These crops have insect resistance and use less water. Yield rates rose, and this revolution enabled each farmer to feed around 150 people. The third great revolution, sometimes known as the Green Revolution, is the adoption of genetically engineered crops by every farmer to improve productivity and fulfill demand.

The food supply must be expanded to satisfy global demand as the world’s population grows. Farmers require increasingly sophisticated technology with limited manpower and automation to get the most out of their land to satisfy that productivity need (Dora et al. 2020;

Lipinski et al. 2016). It increases the quantity, as well as the quality, of the crops. The development of UAV-based remote sensing techniques has taken agriculture one step further in terms of production. It leads to PA (Grogan 2012; Mylonas et al. 2019), which can provide field data in a faster, more precise, and more effective way.

UAVs are considered the technology of the future for remote sensing in PA (Mulla 2013). This type of UAV can fly at low altitudes, so it provides ultra-high-resolution images of crops. Such an approach enhances image processing capabilities. In addition, these UAVs are capable of monitoring different types of crops in a short time and in a non-destructive way, which is very important (T. Zhao et al. 2018). Weeds are unwanted plants that grow alongside the crops. Since people first attempted the cultivation of plants, they have had to battle the invasion of weeds into areas chosen for crops.

Some plants once classified as weeds were found to have virtues not initially suspected, were removed from that classification, and were taken under cultivation. Total annual losses in agriculture are attributed to weeds (45%), insects (30%), diseases (20%), and other factors (5%), as shown in Figure 1, which is taken from the Tamil Nadu Agricultural University Agritech Portal (TNAU Agritech Portal n/d). This motivates integrating PA and communication technologies, paving the way for the fourth revolution in agriculture.

The Internet of Things (IoT) includes sensors and drones that are linked over the internet and communicate data in order to boost efficiency and predictability. To identify agricultural difficulties, the data collected from the different sensors is analyzed by machine learning utilizing computers. The incorporation of this technology simplifies farming. This study discusses the deployment of IoT in

agriculture (Atzori et al. 2010; Lee et al. 2013; Ojha et al.

2015; Sadowski and Spachos 2020). Collision avoidance tactics are required for an autonomous UAV to make the UAV and its surroundings safer. This study also gives a full assessment of the various types of collision avoidance systems available for both outdoor and indoor environments.

LITERATURE REVIEW

In this section, we highlight related work in the field of IoT-based UAVs for PA. Gaffey and Bhardwaj (2020) conducted a study of modern UAV applications in snowy climate conditions.

UAVs offer additional advantages in terms of data acquisition, data processing, attachment of various sensors, viewing angles, and flight altitude (Bhardwaj et al. 2016; Gaffey and Bhardwaj 2020; Näsi et al. 2018; Yang et al. 2017) as compared to conventional acquisition systems (Bartesaghi Koc et al. 2018; Garaba and Dierssen 2018; Zhang et al. 2016). According to the survey, various multirotor and fixed-wing UAV platforms have been used in applications worldwide. The most common sensors were red-green-blue (RGB) cameras, and the applications relied on high-quality video transmitted to the ground control station (GCS). The study by Mohammed Abdulrazaq et al. (2020) shows how diverse and fast the use of UAV applications is evolving in daily life. They also propose IoT-based drone technology to develop a system capable of automatically and rapidly identifying coronavirus cases from thermal images without human contact. Drones are less costly and have proven useful in many areas of agriculture, such as spraying for pests, identifying weeds, and monitoring crops (Mogili and Deepak 2018). The timely and reliable information provided by UAVs on production, yield, and crop management is beneficial to stakeholders such as farmers and distribution units to ensure food safety (Martos et al. 2021). UAV technology in agriculture could

Figure 1. Annual agricultural loss in India (TNAU Agritech Portal n/d).


potentially provide comprehensive crop monitoring from the beginning of the growing season to the completion of harvest (Silver et al. 2017). In their review study, Rani et al. (2019) suggest that UAVs are becoming more popular as an important aspect of PA to maintain the sustainability of agriculture. UAV communication over 5G/6G has also been widely studied in recent years. Li et al. (2019) conducted a thorough investigation of UAV communication over 5G/B5G cellular networks.

The authors provided an overview of the recent research efforts in UAV communications involving 5G/B5G techniques at the different protocol layers such as the physical layer, the network layer, and the communication layer. To provide a solid foundation for 5G/6G wireless networks, the authors also studied specific urban areas.

A review article by Zeng et al. (2019) examined various issues in implementing UAV communications beyond 5G wireless networks. Fotouhi et al. (2019) presented an overview of most of the factors that

enable the smooth integration of UAVs into cellular networks. Future networks such as 5G are expected to be better prepared to deal with UAV-related difficulties. The introduction of 5G technology was expected to deliver more robust and reliable networks. The contributions of the various literature works are listed in Table 1.

UAV HARDWARE

Drones are controlled by the flight control system (FCS), or flight controller, using an embedded computer. The microcontroller in the FCS is loaded with the control software for various drone movements. The onboard sensors are mounted on the drone frame (Spoorthi et al. 2017). The frames are made of lightweight composite materials that provide a limited amount of space for mounting the sensors. For PA, the drone must

Table 1. Literature contribution to topics.

Year | Reference | Summary
2015 | Simelli and Tsagaris (2015) | This survey explains the overview of the NDVI technique used in agriculture to monitor crops
2017 | Goudos et al. (2017) | This survey explains IoT combined with 5G for future technology applications
2018 | Singhal et al. (2018) | This study presents a comprehensive survey of various UAV classifications, applications, and challenges
2019 | Tsouros et al. (2019) | This article presents an overview of UAV-based precision agriculture and the challenges in its implementation
2019 | Huang et al. (2019); Shakhatreh et al. (2019) | This review article briefly explains the monitoring activity for civil applications and the challenges of operating in a static environment
2020 | Yasin et al. (2020) | This study is dedicated to the types, approaches, applications, and challenges of collision avoidance
2021 | Alsamhi et al. (2021) | This survey provides an overview of recent techniques and ideas for achieving green IoT for a sustainable future
2022 | Boursianis et al. (2022) | This review describes the importance of UAV technology in smart agriculture by examining UAV applications in various scenarios such as irrigation, fertilization, pesticide usage, weed management, and plant development
2022 | Velusamy et al. (2022) | The studies on the classification of UAVs and their applications in crop disease detection are systematically evaluated and compared
- | Our survey | This article gives an in-depth analysis of IoT-enabled UAVs for precision agriculture, 5G technology in smart farming, and collision avoidance strategies; finally, the challenges in implementing UAVs are also discussed


carry sensors such as a multi-spectral camera, an RGB camera, biological sensors, meteorological sensors, etc., to gather information during flight. At the same time, these sensors need to be lightweight. The information gathered from the sensors is partially processed and transferred to the ground station for further processing. Among open-source hardware projects, the Paparazzi flight controller by ENAC is the oldest; ENAC later released the Paparazzi Chimera autopilot. Then came the Pixhawk flight controller, which integrates sensors such as an accelerometer and a barometer (Kale et al. 2015). Open-source platforms such as the CC3D are also used for flight control.

The UAV is expected to collect and record high-frequency signals for monitoring and surveillance. Using an autonomous drone equipped with a combination of the GCS and software, users can perform path planning, collect radio data, and calculate location. The system combines an avionics payload for science and flight that can create flight plans and store data on board. The synchronization of the vehicle data with the incoming radio signal is done by hardware in the UAV.

In addition, this system has a local Wi-Fi network that allows easy remote connections in the field. It is important to distinguish between the two subsystems of the UAV – flight and science avionics – as shown in Figure 2.

Figure 2. UAV hardware components. Picture reprinted from https://uavrt.nau.edu/index.php/system_overview/

State Monitoring

The FCS requires information about the attitude, velocity, and position of the UAV. For system state monitoring, the inertial guidance system (IGS) is used. For attitude monitoring, systems such as inertial measurement units (IMU) and inertial navigation systems are used (Rawashdeh et al. 2017; Tang et al. 2017). The IMU consists of three accelerometers, three magnetometers, and three orthogonal gyroscopes, which monitor the acceleration, the orientation of the system, and the angular velocity, respectively. The IGS has its own decision-making technique known as dead reckoning: this method calculates the location of the UAV relative to its original position. Location estimation is provided by at least four satellites of a global navigation satellite system (GNSS).
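
To make the dead-reckoning idea concrete, the following minimal Python sketch integrates gyroscope yaw rate and ground speed into a position estimate relative to the starting point; the sample values and update rate are hypothetical, not taken from the cited systems.

```python
import math

def dead_reckon(samples, dt=0.05):
    """Integrate heading rate and speed samples into a 2D position
    relative to the starting point (simple dead reckoning)."""
    x, y, heading = 0.0, 0.0, 0.0          # start at the origin, facing east
    for yaw_rate, speed in samples:        # yaw_rate [rad/s], speed [m/s]
        heading += yaw_rate * dt           # integrate the gyroscope yaw rate
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
    return x, y, heading

# Hypothetical IMU-derived samples: (yaw rate, ground speed) at 20 Hz
samples = [(0.02, 5.0)] * 100              # 5 s of nearly straight flight
print(dead_reckon(samples))
```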

State Estimation

State estimation is important for estimating the velocity, position, and altitude needed to control a UAV. The onboard sensor values are fed into the autopilot software to estimate the state of the UAV. The estimation is necessary because the systems are prone to unwanted noise and vibrations. This leads to inaccuracy in the measurements used for coordinate transformation (Noor-A-Rahim et al. 2019). The noise is produced by atmospheric conditions, GPS buffering, and reflections caused by nearby structures. To reduce system noise, multiple sensors are used in the UAV, and their data are fused for precise estimation. Information such as attitude, velocity, and GPS position is transferred at a relatively high frequency, generally above 20 Hz for small drones (Al-Mashhadani 2019). For low-frequency noise reduction, techniques such as the Kalman filter are used to compute the state estimate, as shown in Figure 3. For more accurate results, the unscented or extended Kalman filter is used. In addition, the model compensation method improves gyroscope readings by modeling the random noise of the gyroscope and offsetting it accordingly. UAV state estimation with the Kalman filter proceeds as follows: the model prediction is computed using Equation 2, the Kalman gain (Equation 5) is calculated by incorporating the process noise, and the corrected state is obtained from the measurement noise and the prediction values (Equation 6).

Figure 3. Kalman filter recursive algorithm.

Linearized Motion Model

$x_k = F_{k-1} x_{k-1} + B_{k-1} u_{k-1} + w_{k-1}$  (1)

Prediction

$\check{x}_k = F_{k-1} \hat{x}_{k-1} + B_{k-1} u_{k-1}$  (2)
$\check{P}_k = F_{k-1} \hat{P}_{k-1} F_{k-1}^{T} + Q_{k-1}$  (3)

Linearized Measurement Model

$y_k = H_k x_k + v_k$  (4)

Optimal Gain

$K_k = \check{P}_k H_k^{T} (H_k \check{P}_k H_k^{T} + R_k)^{-1}$  (5)

Corrections

$\hat{x}_k = \check{x}_k + K_k (y_k - H_k \check{x}_k)$  (6)
$\hat{P}_k = (I - K_k H_k) \check{P}_k$  (7)

Here, $x_k$ represents the current state of the UAV, $x_{k-1}$ the previous state, $y_k$ the measurement, $w_{k-1}$ the process noise, $v_k$ the measurement noise, $u_{k-1}$ the input, $\check{x}_k$ the prediction at time k, and $\hat{x}_k$ the corrected prediction at time k; $F_{k-1}$ and $H_k$ are the linearized motion and measurement matrices, $Q_{k-1}$ and $R_k$ are the process and measurement noise covariances, and $\check{P}_k$ and $\hat{P}_k$ are the predicted and corrected state covariances.
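
To make the predict-correct cycle concrete, here is a small linear Kalman filter sketch in Python using NumPy; the constant-velocity model, noise covariances, and measurement values are illustrative assumptions, not values from any cited system.

```python
import numpy as np

dt = 0.05                                   # 20 Hz update, as in the text
F = np.array([[1.0, dt], [0.0, 1.0]])       # motion model (position, velocity)
B = np.array([[0.5 * dt**2], [dt]])         # input model (acceleration command)
H = np.array([[1.0, 0.0]])                  # only position is measured (e.g. GPS)
Q = 0.01 * np.eye(2)                        # process noise covariance (assumed)
R = np.array([[0.5]])                       # measurement noise covariance (assumed)

x_hat = np.zeros((2, 1))                    # corrected state estimate
P_hat = np.eye(2)                           # corrected covariance

def kalman_step(x_hat, P_hat, u, y):
    # Prediction (Equations 2 and 3)
    x_chk = F @ x_hat + B @ u
    P_chk = F @ P_hat @ F.T + Q
    # Optimal gain (Equation 5)
    K = P_chk @ H.T @ np.linalg.inv(H @ P_chk @ H.T + R)
    # Corrections (Equations 6 and 7)
    x_hat = x_chk + K @ (y - H @ x_chk)
    P_hat = (np.eye(2) - K @ H) @ P_chk
    return x_hat, P_hat

# One cycle with a hypothetical acceleration input and noisy position reading
x_hat, P_hat = kalman_step(x_hat, P_hat, u=np.array([[0.2]]), y=np.array([[1.1]]))
print(x_hat.ravel())
```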

Autonomous Control

Most drones now used for agricultural and research purposes are GPS-based autonomous drones that follow a specified path on their own. They use GPS waypoint navigation to control the UAV beyond the pilot’s sight, and the UAV can also be controlled from the GCS through a graphical user interface. Autonomous control involves height, attitude, roll, pitch, and other activities such as spraying in the case of an agricultural drone (Capello et al. 2017; Kwak and Sung 2018). The GPS waypoints provide sequential coordinates, which contain the location and the height of the UAV above ground level, and the UAV follows a pre-planned path through them.
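
As a rough illustration of GPS waypoint navigation, the sketch below computes the distance and bearing from the current fix to the next waypoint and advances through a pre-planned list once inside an acceptance radius; the coordinates and radius are made-up values.

```python
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (m) and initial bearing (deg) between two fixes."""
    R = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    dist = 2 * R * math.asin(math.sqrt(a))
    brg = math.degrees(math.atan2(math.sin(dlon) * math.cos(p2),
                                  math.cos(p1) * math.sin(p2) -
                                  math.sin(p1) * math.cos(p2) * math.cos(dlon)))
    return dist, brg % 360

# Pre-planned mission: (latitude, longitude, altitude in m) -- illustrative values
waypoints = [(11.0168, 76.9558, 30.0), (11.0172, 76.9562, 30.0)]
acceptance_radius = 3.0                    # m; switch to the next waypoint inside this

def next_command(current_fix, wp_index):
    lat, lon = current_fix
    wlat, wlon, walt = waypoints[wp_index]
    dist, brg = distance_and_bearing(lat, lon, wlat, wlon)
    if dist < acceptance_radius and wp_index + 1 < len(waypoints):
        wp_index += 1                      # waypoint reached, advance to the next one
    return wp_index, {"heading_deg": brg, "target_alt_m": walt, "dist_m": dist}

print(next_command((11.0169, 76.9559), 0))
```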

Microcontrollers

The FCS has a set of onboard sensors for state estimation and uses peripherals for data transfer and communication (Chao et al. 2010). STMicroelectronics produces 32-bit STM32 microcontrollers that run Cleanflight and Betaflight software. Other boards use 8-bit processors such as the Atmel ATmega1284. Qualcomm recently announced a chipset specifically for drones called Snapdragon Flight. It has a quad-core ARM Cortex architecture and an NVIDIA Kepler-based GPU, and it comes with communication ports such as USB, HDMI, and Ethernet (Patel et al. 2013). For small and lightweight drones, Arduino boards are used. Arduino is an open-source project capable of reading inputs from various sensors (Emran and Najjaran 2018). Arduino boards come in a variety of sizes with various processors. The Arduino Nano, for example, can collect flight orientation data and transmit it via a transmitter to the GCS.

Sensor Components

Various components are incorporated into agricultural UAVs to enable motion control according to environmental variations. The different parts and their purposes are listed in Table 2.

REAL-TIME IMPLEMENTATION

Specially built drone software runs on a computer at the GCS. It gives the user full control over the drone for state estimation and for taking actions remotely. User input and sensor data are combined to operate the UAV actuators. The onboard sensor data determine the drone’s position and attitude. A stereo camera is used to estimate velocity and for obstacle avoidance. Some advanced UAVs use vision and IMU sensors for navigation and automatic landing.

Differential Global Positioning System (DGPS)

Accurate localization of the drone is estimated using DGPS. Localization and mapping is the method of locating, estimating, and building a 3D model of the surroundings using the onboard sensors. The 3D model is estimated by multiple onboard cameras in a technique called multi-camera parallel tracking and mapping (PTAM) (Nissimagoudar et al. 2016; Turci 2017; Wang et al. 2019). The estimation concept integrated into PTAM comes from camera ego-motion. The cited work proposed a novel extrinsic parameter calculation method to eliminate overlap between camera views.

Mobile Phone as a Hardware

Currently, the most used localization sensors are vision sensors, laser sensors, and the IMU. Most modern UAVs are equipped with these sensors, but they weigh too much for easy use on small drones. Small UAVs are preferred for their cost and maneuverability (Kang et al. 2017). Considering the cost of a small drone, heavy sensor equipment makes little sense, so micro-electro-mechanical systems (MEMS) are used instead. MEMS sensors are an alternative because they are cheap and lightweight (Molaei et al. 2020). Mobile phones have the advantage of having various sensors like gyroscopes, magnetometers,


Table 2. Sensors and their functions.

References | Component | Purpose
Costa et al. (2012); Marinello et al. (2016) | Accelerometer | To measure the acceleration of the system
Kale et al. (2015); Yallappa et al. (2017) | IMU | To measure the angular rates and forces acting on the UAV
van Blyenburgh (1999); Vanitha et al. (2016) | Hyper-spectral camera | Provides images of narrow spectral bands
Vardhan et al. (2014); Vanitha et al. (2016) | Telemetry | Used to get live UAV data
Kabra et al. (2017); Spoorthi et al. (2017) | BLDC | Motor for motion
Berner and Chojnacki (2017); Huang et al. (2014) | Barometer | To measure the atmospheric pressure
Herwitz et al. (2002); Tang et al. (2017) | Humidity indicator | To measure moisture content
Qin et al. (2018) | Anemometer | To measure wind speed
Anthony et al. (2014); Marinello et al. (2016) | WSN | To sense the environmental condition
Herwitz et al. (2004); Tang et al. (2018) | Digital temperature sensor | Temperature monitoring
Giles and Billing (2015); Tang et al. (2017) | ESC | For BLDC speed regulation
Vardhan et al. (2014) | Altimeter | For measuring the altitude of the UAV
Reinecke and Prinsloo (2017); Huang et al. (2019) | 2D laser scanner | To capture 2D images
Kale et al. (2015); Natu (2016) | Video camera | To record video for crop monitoring
Achtelik et al. (2011); Berner and Chojnacki (2017) | GPS | Geo-location
Achtelik et al. (2011); Reinecke and Prinsloo (2017) | Air pressure sensor | To measure the air pressure around the UAV
Achtelik et al. (2011) | Thermal camera | Used for low-light photography
Kale et al. (2015); Natu (2016) | PWM controller | Used to give a pulsating signal to the BLDC
Reinecke and Prinsloo (2017); Shilin et al. (2017) | Camera | Capturing high-resolution images
Qin et al. (2018) | Nozzle | For spraying pesticides and herbicides
Hassanein et al. (2018) | Flight controller | The ECU of the UAV

barometers, and GNSS. They also have multicore processors for faster processing speeds. This compatibility makes them suitable for small drones. Further, B. Zhao et al. (2018) proposed using a smartphone to run the drone control algorithm. Smartphone technology conveniently packs various sensors into a single, organized unit, reducing the weight of the sensor components. The smartphone implementation for small UAVs is shown in Figure 4.

Figure 4. Smartphone as an onboard controller.

Scheduling Algorithm

The scheduling of processes is important to avoid simultaneous access to resources among tasks. Meeting real-time performance deadlines and using resources efficiently requires a specific scheduling algorithm. Scheduling algorithms can be broadly divided into two types: offline scheduling and online scheduling. In offline scheduling, the schedule is computed before execution; this is also known as pre-run-time scheduling, and the precomputed schedule is then followed during UAV run time. The offline scheduler orders tasks by earliest deadline first (Singh et al. 2019; Tobita and Kasahara 2002). An online scheduling algorithm schedules tasks during run time; event-driven and periodic rolling strategies for online scheduling are discussed in Panda and Jana (2015).
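
As an illustration of the earliest-deadline-first idea mentioned above, the following minimal Python sketch simulates a non-preemptive EDF dispatcher; the task set (release times, execution times, deadlines) is invented purely for illustration.

```python
import heapq

# Hypothetical task set: (release time, execution time, absolute deadline, name)
tasks = [(0, 2, 5, "attitude_loop"), (1, 1, 3, "telemetry"), (2, 3, 10, "camera_capture")]

def edf_schedule(tasks):
    """Non-preemptive earliest-deadline-first: always run the ready task whose deadline is nearest."""
    pending = sorted(tasks)                 # ordered by release time
    ready, timeline, t, i = [], [], 0, 0
    while i < len(pending) or ready:
        while i < len(pending) and pending[i][0] <= t:
            rel, exe, dl, name = pending[i]
            heapq.heappush(ready, (dl, exe, name))   # ready tasks ordered by deadline
            i += 1
        if not ready:                       # idle until the next release
            t = pending[i][0]
            continue
        dl, exe, name = heapq.heappop(ready)
        timeline.append((t, t + exe, name, "missed" if t + exe > dl else "ok"))
        t += exe
    return timeline

for start, end, name, status in edf_schedule(tasks):
    print(f"{start:>2}-{end:<2} {name:15s} {status}")
```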

Real-time Operating System (RTOS)

The RTOS literature for UAV control systems is very limited, and scheduling studies on UAVs are also minimal. An RTOS provides microcontroller-based programs with a real-time kernel. The real-time kernel guarantees that algorithms are scheduled in time to meet the UAV’s needs (Zheng and Xiao 2019). The real-time operating system provides a crucial operating environment for various missions. FreeRTOS is the most commonly used RTOS. Studies were conducted by Alvear et al. (2017) and Khosiawan et al. (2019) comparing the functional changes in the software across versions ranging from v2.4.2 to v10.0.0.

IOT IN AGRICULTURE

Agricultural sectors are shifting toward IoT technologies to boost crop production capacities. In terms of production and monitoring tactics, this technology takes farming to the next level. IoT includes a variety of sensors that are integrated to execute a variety of activities. Farmers are given varied information regarding monitoring, soil detection, weed detection, yield prediction, disease detection, and pest spraying technologies to boost agriculture using IoT technology. IoT sensors are installed on agricultural fields. IoT-connected devices detect nutrient and water deficiencies. In addition, unmanned drones and satellites capture real-time images of individual crops and store this data in the cloud. Predictive analytics software for PA with machine learning combines sensor data and image data to help farmers select which fields

Figure 5. Basic overview of IoT in smart farming. Picture reprinted from https://netstratum.com/blog/how-iot-smart-farming-improves-revenue-for-farmers/

to irrigate, ideal planting and harvesting times, crop rotation, and soil management. All of this information is very useful as each crop is carefully tended, which is only possible with the help of this IoT implementation.

The data collected and stored in the cloud is called Big Data. It can also tell you where to plant a particular crop, which fields to treat, and how much water, herbicides, and fertilizer to use for each plant. The basic overview of IoT implementation in smart agriculture is shown in Figure 5.

Land maps are developed digitally and utilized for seeding various crops and for crop-related characteristics such as crop depth. Several sensors are also added to monitor soil humidity and water content in order to minimize soil erosion, acidity, and pollution (Adamchuk et al. 2004; Placidi et al. 2020). IoT-based water irrigation is chosen over traditional methods because precisely controlling the flow of water across all sections of the land lowers water waste. Data such as nutrient levels determine the precise amount of fertilizer required, since excess application harms the land. Although the initial cost of this technology is relatively high, in the long term it boosts output, reduces soil contamination, and lowers fertilizer costs (Harun et al. 2015; Wang et al. 2017). The data from IoT devices are uploaded to a cloud-based server for further processing. To communicate between the UAV and the ground-based system, this data transmission employs technologies such as 3G/4G/5G, Bluetooth, SigFox, Narrowband IoT (NB-IoT), or Wi-Fi Direct.
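
As a rough sketch of pushing field readings to a cloud repository over any of these links, the snippet below packages hypothetical soil readings as JSON and posts them with the third-party `requests` library; the endpoint URL, token, and field names are placeholders rather than part of any system described here.

```python
import time
import requests  # third-party HTTP client

# Hypothetical soil/weather readings gathered by field sensors
payload = {
    "device_id": "field-node-07",
    "timestamp": int(time.time()),
    "soil_moisture_pct": 31.4,
    "soil_ph": 6.8,
    "air_temp_c": 29.2,
}

def push_to_cloud(payload, url="https://example.com/api/v1/readings", token="PLACEHOLDER"):
    """POST one JSON reading to a (hypothetical) cloud endpoint and report success."""
    try:
        resp = requests.post(url, json=payload,
                             headers={"Authorization": f"Bearer {token}"},
                             timeout=10)
        return resp.status_code == 200
    except requests.RequestException:
        return False   # store locally and retry when the link recovers

print(push_to_cloud(payload))
```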

UAVs can be utilized as mobile base stations in smart


environments to offer services to monitoring regions.

Some smart gadgets, however, may be positioned outside the UAV coverage area. To extend coverage to devices not covered by UAVs, relay hops and UAVs can be used together. The intensity of the signal is critical in determining the area coverage.

3G/4G/NB-IoT Technology

Wireless technologies like 3G/4G/NB-IoT enable data sharing among IoT-based devices for data transfer, estimation, etc., in agriculture, as shown in Figure 6. The existing 4G network provides reasonable quality but forms a heterogeneous network due to varying channel conditions, data rates, handoff issues, etc. (Akyildiz et al. 2010; Huang et al. 2012). Mobile phones are equipped with multiple antennas and transmitters to cope with fluctuating network signals. This translates into high power usage, but an IoT-based drone has a considerably smaller battery to power the vehicle. Since all smart agricultural technologies are battery powered, they cannot stay in the field for long. If battery size increases, the payload on the system increases, which affects its performance (Hassebo et al. 2018; Tamilselvan and Niresh 2022). A massive amount of research is therefore going on to improve communication speed, processing capability, and specific energy, which will lead to improvements in precision agriculture and monitoring time. For fast data transfer rates, the available 4G band is not fast enough because of the

Figure 6. Uses of the cellular network in the agricultural field.

bandwidth and latency issues (Martin et al. 2011). IoT devices require fast performance with ultra-low latency, and this can be achieved with a high-performance, low-cost network. The current 4G network cannot support such features. These shortcomings of the current cellular network pave the way for the next-generation 5G. The uses of the cellular network in the agricultural field are shown in Figure 6.

5G in Agriculture

For agricultural applications, a connection that is both reliable and fast is required for the IoT to function. The current generation of rural networks has poor connectivity, and urban connectivity also suffers due to the high demand among users (Ivancic et al. 2019). In most countries, the network connection in rural areas is insufficient, and even in some developed countries, large farms with many IoT devices require large amounts of data to be stored in the cloud. The current network cannot cope with this demand. To increase the feasibility of smart agriculture, researchers are working on network slicing and massive multiple-input multiple-output (MIMO) to provide better connectivity. So 5G connectivity comes into play to fulfill this goal for smart farming by enabling low cost, low energy consumption, high speed, and high spectrum efficiency. The first autonomous system for crop seeding, crop monitoring, and applying water and pesticides was introduced in 2017 (Faraci et al. 2018). The whole process is fully autonomous, without the inclusion of any labor in the field. In 2018, a project called Hands-Free Hectare


reported a good cultivation rate. With the inclusion of 5G technology, PA becomes practical: using 5G, one can monitor farms, livestock, etc. from home. 5G also enables IoT sensors to support various technologies in smart farming (Razaak et al. 2019). The applications of 5G extend to UAVs, predictive maintenance, artificial intelligence (AI) robots, and cloud repositories. This technology makes it easy to manage IoT devices and farms. Table 3 compares the various wireless network protocols used for IoT devices.

Data Analytics and Cloud Repository

Data are quite possibly the main driver of the technological advances behind the smart agriculture industry. The data – gathered from various sensors, drones, and farms – are stored in a cloud-based repository. Edge computing and 5G enable high-speed data transfer so that real-time control and machine-to-machine interfaces are possible. These data support crop monitoring, weed detection, early disease detection, weather prediction, yield prediction, pest and fertilizer recommendations, and soil and crop mapping. Such data (ranging in size from megabytes to terabytes) must be moved to the cloud repository and back to the GCS. For instance, the UAV detects a pest in the crop, sends that data to the cloud, and the result is returned to the farmer

Table 3. Different types of wireless protocols used for IoT applications.

Technology | Frequency band | Maximum range | Data rate | Channel bandwidth | Reliability | Low cost | Low latency | Scalability | SLA support | Mobility support | Low power
Wi-Fi | 2.4-60 GHz | 100 m | 10 Mbps | 20 or 40 MHz | Yes | Yes | Yes | Yes | No | No | Yes
NB-IoT | 700, 800, 900 MHz | 15 km | 200 kbps | 200 kHz | Yes | Yes | Yes | Yes | Yes | Yes | Yes
ZigBee | 902-928 MHz | < 1 km | 250 kbps | 2 MHz | No | Yes | No | No | No | No | Yes
Bluetooth | 2.402-2.480 GHz | Classic: 100 m; BLE: 240 m | Classic: 3 Mbps; BLE: 2 Mbps | 15 MHz | Yes | Yes | Yes | No | No | No | Yes
LTE-M (Rel. 13) | 1.7-2.1 GHz, 1.9 GHz, 2.5-2.7 GHz | 12 km | 1 Mbps | 1.4 MHz | Yes | Yes | Yes | Yes | Yes | Yes | Yes
LTE-M (Rel. 13) | 1.7-2.1 GHz, 1.9 GHz, 2.5-2.7 GHz | 12 km | 4 Mbps | 5 MHz | Yes | Yes | Yes | Yes | Yes | Yes | Yes
5G | < 6 GHz | 200 m | 1 Gbps | 400 MHz | Yes | Yes | Yes | Yes | Yes | Yes | Yes
B5G/6G | 0.1-0.3 THz | 320 m | 10 Gbps | 160 MHz | Yes | Yes | Yes | Yes | Yes | Yes | Yes

to alert him or her to act on the pest-spraying recommendation.

Cloud-based computing is widely utilized in smart agriculture. The cloud can serve as a data center or host for the UAV control services used for navigation and data processing, as shown in Figure 7. This information is analyzed immediately by agricultural intelligence to create AI-based guidance for the variable-rate application of plant protection. Edge computing utilizes GPUs in cloud edge servers (Hassan et al. 2019); as a result, the size, weight, power consumption, and cost of the UAVs are substantially reduced. The bandwidth required to transfer and process this much data is exceptionally high, around 120 Mbps; only 5G networks can provide this with minimal to no latency. All control services for navigation and data processing are located in the cloud, running in data centers or on dedicated hosts. 5G will be considerably faster and will improve the user experience compared with the existing network (Song et al. 2019). With speeds of 10–30 Gbps, large amounts of information can be transferred across various devices with minimal data loss, i.e. the data transfer time is reduced and retransmission of data can be avoided. This enables maximum crop productivity in smart agriculture. Cloud computing exploits 5G technology fully, providing high-speed data transfer with minimal latency and faster data processing.


Figure 7. Cloud computing mechanism. Picture reprinted from Tang et al. (2021).

AUTONOMOUS COLLISION AVOIDANCE

In recent years, several solutions for collision avoidance have been presented. The main idea underlying collision avoidance algorithms is to provide a control signal that leads to a conflict-free trajectory for the vehicle. Global trajectory planning techniques can therefore also be used for collision avoidance. Because the use of geometric concepts to create collision avoidance algorithms differs from the other groups, collision cone approaches and vision-based approaches are classified as geometric guidance techniques. Several techniques emphasize motion planning in a multi-UAV team.

Path planning involves finding the shortest path in a field or database where the edges of obstacles are already known. Conflict resolution systems consider obstacle vehicles with known trajectories and determine a trajectory for the controlled vehicle within a given time period. Model predictive control approaches

use a vehicle model to predict future state behavior to maximize control while constraining inputs and states to acceptable quantities and bounds. Geometric steering approaches attempt to provide reactive avoidance control based on conflict geometry.

An autonomous UAV can sense and perceive obstacles, as well as analyze, plan, communicate, and make decisions using algorithms designed specifically for the vehicle. Because no human is involved in controlling the UAV, it may collide with either a static or a moving object, causing serious damage to the UAV. This is the main concern when it comes to autonomous UAVs.

Albaker and Rahim (2009) surveyed many collision avoidance algorithms for solving the autonomous collision problem. Although more sophisticated algorithms are available, the major approaches are geometric, potential field, path planning, and vision-based.


Geometric Approach

The geometric approach is one of the collision avoidance techniques used to avoid a collision and to operate in a safer region. It is a preplanning approach to avoiding obstacles and is mainly used for path planning (Chakravarthy and Ghose 1998; Mujumdar and Padhi 2010; Park et al. 2008). There are three types of approaches using this technique: the point of closest approach (PCA), the Dubins path, and the collision cone.

Point of Closest Approach (PCA)

PCA is a method based on the geometric approach. It models each UAV as a point mass operating at a constant speed, with all UAVs in the airspace linked through a shared database – namely, automatic dependent surveillance-broadcast (ADS-B). ADS-B enables UAVs to exchange information about their speed, altitude, and position with one another. In the PCA approach, the ADS-B data are used to calculate the distance between UAVs. If this value is less than the minimum allowed value, then the UAV is in a conflict region, as shown in Figure 8.

Figure 8. Point of closest approach.

The relative distance between UAVs can be calculated using Equation 13 to determine the conflict condition under PCA:

$d_{rel} = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2 + (z_2 - z_1)^2}$  (13)

where $(x_1, y_1, z_1)$ and $(x_2, y_2, z_2)$ are the ADS-B positions of the two UAVs.
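
As a rough sketch of how a PCA-style conflict check might be computed from ADS-B-like data, the snippet below finds the time of closest approach and the predicted miss distance between two UAVs and flags a conflict when it falls below an assumed minimum separation; the states and threshold are illustrative values, not taken from any cited system.

```python
import numpy as np

def pca_conflict(p1, v1, p2, v2, d_min=10.0):
    """Point-of-closest-approach check from ADS-B-style states.
    p*, v*: 3D position (m) and velocity (m/s) of each UAV."""
    r = np.asarray(p2, float) - np.asarray(p1, float)   # relative position
    v = np.asarray(v2, float) - np.asarray(v1, float)   # relative velocity
    if np.allclose(v, 0.0):
        t_cpa = 0.0                      # same velocity: separation never changes
    else:
        t_cpa = max(0.0, -np.dot(r, v) / np.dot(v, v))  # time of closest approach
    miss = np.linalg.norm(r + v * t_cpa)                # predicted minimum separation
    return miss < d_min, t_cpa, miss

# Two UAVs converging on nearly parallel tracks (made-up states)
conflict, t_cpa, miss = pca_conflict(p1=[0, 0, 50], v1=[10, 0, 0],
                                     p2=[200, 30, 50], v2=[-10, 0, 0])
print(conflict, round(t_cpa, 1), round(miss, 1))
```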

Dubins Path Approach

The Dubins path (Dubins 1957) is an obstacle avoidance approach based on the geometric approach. It is the shortest path connecting two given points subject to a minimum turning radius. The major drawback of this approach is the discontinuity at the switching points (Shanmugavel et al. 2010). A Dubins path is formed either by two circular arcs with a tangential line between them or by three circular arcs. A UAV using the Dubins approach has three controls: “turn left at maximum” (L), “turn right at maximum” (R), and “go straight” (S). Combining these three commands gives six path words: LSL, LSR, LRL, RSR, RSL, and RLR. In recent research, this terminology has been further reduced to mainly two types – CSC and CCC – as shown in Figure 9, where C means “curve.” These two approaches are mainly used for stationary obstacle avoidance, so dynamic obstacle avoidance still needs improvement.

In Equations 8-12, θ is the pitch angle, γ is the aileron angle, and φ is the yaw angle.
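
For illustration, the following sketch computes the length of just the LSL word (left arc, straight segment, left arc) of the CSC family for a common turning radius; it is a simplified geometric sketch under that assumption, not a complete Dubins planner, and the poses and radius are arbitrary.

```python
import math

def lsl_length(start, goal, r):
    """Length of the LSL Dubins word: left arc, straight tangent, left arc.
    start, goal: (x, y, heading in radians); r: minimum turning radius."""
    x0, y0, th0 = start
    x1, y1, th1 = goal
    # Centers of the left-turn circles (90 degrees to the left of each heading)
    c0 = (x0 - r * math.sin(th0), y0 + r * math.cos(th0))
    c1 = (x1 - r * math.sin(th1), y1 + r * math.cos(th1))
    dx, dy = c1[0] - c0[0], c1[1] - c0[1]
    straight = math.hypot(dx, dy)               # length of the tangent segment
    phi = math.atan2(dy, dx)                    # heading along the tangent
    arc1 = (phi - th0) % (2 * math.pi)          # counter-clockwise arc to reach phi
    arc2 = (th1 - phi) % (2 * math.pi)          # counter-clockwise arc to reach th1
    return r * (arc1 + arc2) + straight

# Example: fly from the origin heading east to a point 40 m away heading north
print(round(lsl_length((0, 0, 0.0), (40, 10, math.pi / 2), r=5.0), 2))
```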

Figure 9. The Dubins path approach. Picture reprinted from https://gieseanw.wordpress.com/2012/10/21/a-comprehensive-step-by-step-tutorial-to-computing-dubins-paths/

Collision Cone Approach

The collision cone approach is used to predict possible collisions between UAVs and obstacles, and it can also be used to design collision avoidance maneuvers. In this approach, a circle is created around each obstacle to be avoided, and tangent lines are drawn from the UAV to that obstacle circle, as shown in Figure 10. From these, the distance between the circle and the tangent line is calculated to predict the distance of the obstacle from the UAV.

(12)

Figure 10. Collision cone approach.

If the obstacle lies between the two tangent lines, then the UAV is in a critical zone (Chakravarthy and Ghose 1998).
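
A hedged sketch of the tangent-line test: for a safety circle of radius R around the obstacle at distance d, the cone half-angle is asin(R/d), and the UAV is on a collision course when its relative velocity points inside that cone. The radius and state values below are assumptions chosen for illustration.

```python
import math
import numpy as np

def in_collision_cone(p_uav, v_uav, p_obs, v_obs, r_safe):
    """True if the relative velocity falls inside the cone of tangents
    drawn from the UAV to the safety circle around the obstacle."""
    los = np.asarray(p_obs, float) - np.asarray(p_uav, float)   # line of sight
    v_rel = np.asarray(v_uav, float) - np.asarray(v_obs, float) # closing velocity
    d = np.linalg.norm(los)
    if d <= r_safe:
        return True                        # already inside the safety circle
    if np.linalg.norm(v_rel) == 0.0:
        return False                       # no relative motion, no closing geometry
    half_angle = math.asin(r_safe / d)     # cone half-angle set by the tangent lines
    cos_gap = np.dot(los, v_rel) / (d * np.linalg.norm(v_rel))
    gap = math.acos(max(-1.0, min(1.0, cos_gap)))
    return gap < half_angle                # velocity vector lies between the tangents

print(in_collision_cone(p_uav=[0, 0], v_uav=[8, 1], p_obs=[60, 5], v_obs=[0, 0], r_safe=6))
```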

Potential Field Approach

The potential field approach was proposed by Khatib (1985) to avoid collisions for ground robots. This method can also be used for collision avoidance between UAVs and static obstacles. It uses repulsive force fields, which cause the UAV to be repelled by obstacles. The potential field approach uses two forces: an attractive force that pulls the UAV toward the desired path, and a repulsive force that keeps the UAV away from obstacles. This approach needs considerable calculation time and power, which makes it unsuitable for small UAVs in real-life applications (Mujumdar and Padhi 2010).
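
A minimal sketch of the attractive and repulsive force computation in the spirit of Khatib (1985); the gains, influence distance, and positions are assumed values chosen only to show the structure of the two forces.

```python
import numpy as np

def potential_field_step(pos, goal, obstacles, k_att=1.0, k_rep=100.0, d0=10.0):
    """Return the combined steering force: attraction to the goal plus
    repulsion from every obstacle closer than the influence distance d0."""
    pos, goal = np.asarray(pos, float), np.asarray(goal, float)
    force = k_att * (goal - pos)                         # attractive component
    for obs in obstacles:
        diff = pos - np.asarray(obs, float)
        d = np.linalg.norm(diff)
        if 0.0 < d < d0:                                 # obstacle inside influence zone
            force += k_rep * (1.0 / d - 1.0 / d0) * (1.0 / d**2) * (diff / d)
    return force

# Made-up scene: UAV at the origin, goal ahead, one obstacle slightly off the path
print(potential_field_step(pos=[0, 0], goal=[50, 0], obstacles=[[8, 1]]))
```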

Path Planning Approach

Path planning in UAVs is the act of finding a way from the initial position to the goal location. It provides a continuous collision-free path that links the initial position to the target point. The workspace and the obstacle geometry are represented in either 2D or 3D, and the approach typically uses a grid-based representation. From this grid, the obstacles can be read out and path planning performed: geometric methods calculate the desired path for collision avoidance, and a collision-free path through the generated grid is found using a search algorithm, as shown in Figure 11. On the downside, path planning must address drawbacks such as computational time and path length; in a dynamic environment, the UAV needs the flexibility to make decisions quickly to avoid obstacles (Alexopoulos et al. 2013).
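
The grid-based search can be illustrated with a small A* sketch in Python; the occupancy grid, start, and goal below are toy values, with 1 marking an obstacle cell.

```python
import heapq

def astar(grid, start, goal):
    """A* on a 4-connected occupancy grid (1 = obstacle). Returns a list of cells."""
    rows, cols = len(grid), len(grid[0])
    h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])   # Manhattan heuristic
    open_set = [(h(start), 0, start, None)]
    came_from, g_best = {}, {start: 0}
    while open_set:
        _, g, cell, parent = heapq.heappop(open_set)
        if cell in came_from:
            continue                                           # already expanded
        came_from[cell] = parent
        if cell == goal:                                       # rebuild the path
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_best.get((nr, nc), float("inf")):
                    g_best[(nr, nc)] = ng
                    heapq.heappush(open_set, (ng + h((nr, nc)), ng, (nr, nc), cell))
    return None                                                # no collision-free path

grid = [[0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0],
        [0, 1, 1, 0]]
print(astar(grid, (0, 0), (3, 3)))
```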

Vision-based Approach

The vision-based obstacle avoidance technique uses a small camera to detect obstacles. In recent times, this topic has attracted much research toward the vision-based

Figure 11. Path planning approach.

approach. A limitation arises when obstacles span a larger area: detecting them well requires a larger sensor, which increases UAV cost and weight. This approach uses machine learning techniques such as ANNs and CNNs to process the data and make a sound decision. Moreover, the camera can also provide information for obstacle identification and segmentation. The major disadvantage of this approach is the complexity of the data processing: because small UAVs have very small processors, they are often incapable of performing the processing onboard. As a result, either the GCS is required to control the small UAV very precisely to avoid the obstacles in its path, or the UAV-integrated hardware must be capable of doing that processing. Sedaghat-Pisheh et al. (2017) proposed a computer vision-based approach to avoid obstacles in the path of the UAV. They used a cascade classifier technique based on machine learning to detect the oncoming UAV and the CamShift algorithm to track it. This algorithm determines the coordinates of the center of an object moving toward the UAV. If an obstacle is detected within 5 ft of the UAV, the collision avoidance algorithm is executed. Gageik et al. (2015) proposed a simple solution employing infrared and ultrasonic range finders for obstacle avoidance and path planning; it requires little computation, which leads to low time consumption and a small memory footprint. Lyu et al. (2016) proposed a vision-based approach in which images from sensors such as RGB and hyper-spectral cameras are processed by the onboard processor and sent to the GCS. The GCS has a set of algorithms pre-programmed into the system that are executed, and the commands are transmitted to the UAV autopilot for obstacle maneuvering.
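
A rough OpenCV sketch of the detect-then-threshold idea described above: a pretrained cascade classifier (the XML file, obstacle width, and focal length here are hypothetical placeholders) detects an obstacle in a frame, its distance is approximated from the bounding-box width, and avoidance is triggered inside roughly 5 ft (about 1.5 m).

```python
import cv2  # OpenCV

# Hypothetical calibration values: assumed real obstacle width and camera focal length
REAL_WIDTH_M = 0.5          # assumed physical width of the obstacle class
FOCAL_PX = 700.0            # focal length in pixels (from a one-off calibration)
TRIGGER_DIST_M = 1.52       # ~5 ft, as in the cited study

detector = cv2.CascadeClassifier("obstacle_cascade.xml")   # hypothetical trained cascade

def check_frame(frame):
    """Detect obstacles, estimate distance from box width, decide whether to avoid."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    boxes = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in boxes:
        dist = (REAL_WIDTH_M * FOCAL_PX) / float(w)   # pinhole-camera distance estimate
        if dist < TRIGGER_DIST_M:
            return "execute_collision_avoidance", dist
    return "continue_mission", None

cap = cv2.VideoCapture(0)                  # forward-facing camera
ok, frame = cap.read()
if ok and not detector.empty():
    print(check_frame(frame))
cap.release()
```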


Figure 12. UAV collision zones. Picture reprinted from Sawalmeh and Othman (2018).

Indoor Collision Avoidance

Indoor collision avoidance algorithms are more complex than those for outdoor environments. This type of collision avoidance technique is used in indoor agriculture/indoor farming. The use of GPS is limited to the outdoor environment; the indoor environment is generally called a GPS-denied environment. Thus, indoor collision avoidance is a challenging task with stricter requirements. In indoor avoidance, RF signals cannot be relied on because they are reflected and degraded by walls and obstacles.

Instead, vision-based techniques are used that incorporate sensors such as IR cameras, optical sensors, ultrasonic sensors, and laser scanners for obstacle detection. These smaller sensors provide precise information about the obstacles in the UAV path, leading to accurate collision avoidance in the indoor environment. In many articles (Alvarez et al. 2016; Luo et al. 2013; Schmid et al. 2013), researchers use this vision-based technique to propose fully autonomous UAV control in an indoor environment. They use a forward-facing monocular camera that detects obstacles and produces a dense depth map to provide a collision-free trajectory for UAVs. Image capture at a frame rate of 30 frames/s helps give better visibility of the environment. This method uses off-board control to make decisions: the data are processed and sent to the UAV in real time using a Wi-Fi network or other low-latency networks, as discussed above. Off-board image processing, however, is outdated because of its latency. To solve the latency problem and make quick decisions, Mustafah et al. (2012) proposed a UAV with two cameras and an onboard computational unit. It uses a Gaussian mixture probability hypothesis density filter for color segmentation of the images captured by these cameras. But problems arise in smoky and dark environments because these cameras are light-sensitive; in such cases, these collision avoidance strategies fail. To tackle this, a sensor-based obstacle avoidance system was proposed by Chee and Zhong (2013) and Gageik et al. (2012). The algorithm consists of two modules: obstacle detection and collision avoidance. It uses 12 ultrasonic sensors positioned in a circular array to detect obstacles around the UAV. These 12 sensors give 360-degree coverage, improving the accuracy of obstacle detection near the UAV. The area around the UAV is divided into three main zones, as shown in Figure 12; a simple classification of these zones is sketched after the list below.

[i] Green zone indicates a safe zone. The UAV is far from the obstacle (obstacle distance > X + Y).

[ii] Yellow zone indicates a close zone (X < obstacle distance < X + Y). The speed of the UAV is reduced and the close-area state is activated.

[iii] Red zone indicates a danger zone (obstacle distance < X). The danger-zone state is activated to keep the UAV away from the obstacle.
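
A minimal sketch of the three-zone logic above, assuming the design distances X and Y have already been chosen for the sensor range; the numeric values are placeholders.

```python
def classify_zone(obstacle_distance, x=1.0, y=2.0):
    """Map an ultrasonic range reading to the green/yellow/red zones described above."""
    if obstacle_distance > x + y:
        return "green"     # safe zone: keep normal speed
    if obstacle_distance > x:
        return "yellow"    # close zone: reduce speed, activate the close-area state
    return "red"           # danger zone: activate avoidance to hold off the obstacle

# Example readings (m) from the 12-sensor ring
for d in (4.0, 2.5, 0.6):
    print(d, classify_zone(d))
```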

Deep Reinforcement Learning

In this part, we describe a deep reinforcement learning (RL) approach for collision avoidance. Unlike model-based techniques, the RL-based deep learning approach does not need any prior knowledge. This makes RL a more appropriate framework for dealing with the unexpected and uncorrelated motion of objects near UAVs. UAVs can predict the upcoming movement of any movable object and avoid collisions thanks to bootstrapping and sampling in deep reinforcement learning. The following parts must be properly described in order to simulate collision avoidance in the RL framework.

[a] Agent. The deep reinforcement learning agent controls the trajectory of the UAV to prevent collisions while consuming the least amount of energy by taking the shortest path to its destination.

[b] Environment. The environment includes a collection of UAVs and static items (e.g. walls). The agent moves within this limited 2D space.

[c] Actions. The controlled UAV has four actions: A = {Aileron, Elevator, Throttle, Rudder}.

[d] Reward. The agent receives a positive reward of 100 if it succeeds in reaching its desired location. If the UAV collides on its way to the destination, the agent receives a negative reward of −100. Finally, to encourage the agent to choose the shortest path, there is a penalty of 0.1 for each step the agent takes until it reaches its target.

[e] State. The state consists of two parts:

[i] The distance vector $P_{dis}$ to the destination:

$P_{dis} = (x_{dis}, y_{dis})$  (14)

where

$x_{dis} = x_d - x_c$  (15)
$y_{dis} = y_d - y_c$  (16)

Here, $x_d$, $y_d$ represent the destination location and $x_c$, $y_c$ represent the present location. Note that if $(x_{dis}, y_{dis}) = (0, 0)$, the UAV is at its destination. Because the agent is made independent of the scenario size in this way, the solution is more universal, so the same training dataset may be used in a variety of settings.

[ii] The distribution of mobile and immobile items throughout a grid square centered on the UAV. The grid is described as a binary matrix: each element shows whether a static or movable item (intruder) is present (= 1) or not (= 0) within the relevant cell. This allows the agent to examine the UAV’s immediate proximity, which is the most important region for avoiding collisions, and reduces the state space by aggregating multiple observations into a single state. As a result, quicker learning and convergence are observed. A minimal sketch of this state and reward construction follows the list.
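
The state and reward in items [a]-[e] can be sketched as plain Python functions; the grid size and array layout are assumptions, while the reward values follow the description above.

```python
import numpy as np

GRID = 9                      # side of the occupancy window centered on the UAV (assumed)

def build_state(uav_xy, dest_xy, obstacle_cells):
    """State = distance vector to the destination + binary occupancy grid around the UAV."""
    x_dis = dest_xy[0] - uav_xy[0]
    y_dis = dest_xy[1] - uav_xy[1]
    occ = np.zeros((GRID, GRID), dtype=np.int8)
    half = GRID // 2
    for (ox, oy) in obstacle_cells:                   # static or moving intruders
        gx, gy = ox - uav_xy[0] + half, oy - uav_xy[1] + half
        if 0 <= gx < GRID and 0 <= gy < GRID:
            occ[gy, gx] = 1                           # mark the occupied cell
    return np.concatenate(([x_dis, y_dis], occ.ravel()))

def reward(reached_goal, collided):
    """+100 on arrival, -100 on collision, -0.1 per step otherwise."""
    if reached_goal:
        return 100.0
    if collided:
        return -100.0
    return -0.1

state = build_state(uav_xy=(5, 5), dest_xy=(12, 9), obstacle_cells=[(6, 5), (4, 7)])
print(state[:2], reward(False, False))
```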

CHALLENGES

Although the usage of UAVs for PA is growing, some restrictions keep them from being widely used. In the absence of established workflows, ad hoc techniques are used to set up PA applications, which does not satisfy key stakeholders. In addition, since PA requires data-intensive techniques for image analysis, competent and experienced persons familiar with this technology are required. This implies that a typical farmer may require training or may be compelled to hire professionals to assist with image processing, which can be very costly. This problem may prevent independent farmers with few and small agricultural plots from adopting UAV technologies.

Another barrier is the high cost of purchasing UAVs.

Besides that, most commercially available UAVs for PA have a short flight time, ranging from 20 min to 1 h, and cover only a relatively limited area. UAVs with longer flight times are somewhat more expensive.

Big data processing and storage have proven to be a major challenge in the IoT, especially when using AI approaches to extract meaningful information from massive amounts of data for various activities. As a result, mobile edge computing has emerged as an efficient technology delivered by edge devices that provides network performance superior to cloud computing networks, solving difficult problems such as latency and compute-intensive work in IoT-powered drones. Drone edge intelligence has issues with security and decentralized management, which limits its usefulness in supporting green smart environments. Blockchain is a reliable technology that enables decentralized data exchange

while maintaining anonymity. Speed, energy efficiency, and processing frequency are just some of the issues that need to be addressed in blockchain-based systems.

The 5G-enabled drone has shown that it is possible to offload flight control at the edge of the network while enabling low-latency control over 5G. Such an approach can greatly offload the UAV from resource-intensive tasks, ensuring energy savings on the one hand while allowing computationally intensive tasks to be performed at the edge of the network, thus enabling autonomous UAV flights based on data analysis from sensors mounted on the UAV. This would help UAVs autonomously collect data and transmit it to the base station within a fraction of the time. Such advances toward next-generation B5G/6G- capable UAVs with high computational power, as well as optimized energy consumption and flight decisions, can help realize a variety of use cases, including creating ad hoc networks in urban areas where no network is available.

Collision avoidance systems must be designed to handle and complete difficult tasks. Although various systems have been developed and evaluated, certain difficulties remain in collision avoidance systems.

In most collision avoidance algorithms, the minimum safe distance is required as a design parameter. This requires understanding the worst-case safety condition, i.e. the drone’s ability to stop completely before hitting an obstacle. The final limitation is that there are no uniform regulations for UAV systems and operations. Uniform laws for UAV systems and operations are critical to facilitate the globalization of UAVs for various applications.

CONCLUSION

UAV technology for PA has developed considerably in recent years, increasing crop yield in the agricultural sector. Chemical toxicity and labor constraints drive the development of UAV-based spraying systems, which use processed images and GPS coordinates to apply fertilizer and herbicides more precisely to crops and weeds, respectively. This has the potential to decrease chemical waste. At the same time, UAVs for PA are still in their early stages.

By implementing the IoT-enabled UAV for monitoring in the field, farmers can improve production with ease.

Smart farming can let farmers spend less time in the field while still improving crop yields. IoT-based 5G technology applications enable farmers to collect useful data with almost no latency, which can be utilized to increase agricultural efficiency and yield in the future.

Furthermore, the various collision avoidance techniques are presented in detail in order to make both the UAV and the surroundings safer. This will pave the way for future opportunities in UAV development for smart farming.


Progress in UAV technology, battery management techniques, image processing techniques, flight time, cost-effectiveness, and enhanced spraying systems is expected to make UAVs one of the major technologies of the future. Research is now being conducted to increase UAV efficiency and improve monitoring technologies, opening up the possibility of UAV-based smart farming and PA.

AUTHORS’ CONTRIBUTION

The authors confirm their contribution to the paper as follows:

• study conception and design: Tamilselvan Ganesan, Niresh Jayarajan and Sureshkumar P.;

• data collection: Sureshkumar P.; and

• draft manuscript preparation: Tamilselvan Ganesan and Niresh Jayarajan.

All authors reviewed the results and approved the final version of the manuscript.

CONFLICT OF INTEREST

All authors declare that they have no conflicts of interest.

REFERENCES

ACHTELIK MC, STUMPF J, GURDAN D, DOTH KM.

2011. Design of a flexible high performance quadcopter platform breaking the MAV endurance record with laser power beaming. In: IEEE International Conference on Intelligent Robots and Systems 2011: 5166–5172.

ADAMCHUK VI, HUMMEL JW, MORGAN MT, UPADHYAYA SK. 2004. On-the-go soil sensors for precision agriculture Comput Electron Agric. 44(1):

71–91.

AKYILDIZ IF, GUTIERREZ-ESTEVEZ DM, REYES EC. 2010. The evolution to 4G cellular systems: LTE- Advanced. Physical Communication 3(4): 217–244.

ALBAKER BM, RAHIM NA. 2009. A survey of collision avoidance approaches for unmanned aerial vehicles. In:

International Conference for Technical Postgraduates 2009, TECHPOS. p. 1–7.

ALEXOPOULOS A, KANDIL A, ORZECHOWSKI P, BADREDDIN E. 2013. A comparative study of collision avoidance techniques for unmanned aerial vehicles. In: Proceedings – 2013 IEEE International

Conference on Systems, Man, and Cybernetics, SMC.

p. 1969–1974.

AL-MASHHADANI MA. 2019. Optimal control and state estimation for unmanned aerial vehicle under random vibration and uncertainty. Meas Control (United Kingdom) 52(9–10): 1264–1271.

ALSAMHI SH, AFGHAH F, SAHAL R, HAWBANI A, MOHAMMED AL-QANESS, LEE B, GUIZANI M.

2021. Green internet of things using UAVs in B5G networks: a review of applications and strategies. Ad Hoc Networks 117: 102505.

ALVAREZ H, PAZ LM, STURM J, CREMERS D. 2016.

Collision avoidance for quadrotors with a monocular camera. In: Springer Tracts in Advanced Robotics. p.

195–209.

ALVEAR O, ZEMA NR, NATALIZIO E, CALAFATE CT. 2017. Using UAV-based systems to monitor air pollution in areas with poor accessibility. J Adv Transp 2017: 8204353.

ANTHONY D, ELBAUM S, LORENZ A, DETWEILER C. 2014. On crop height estimation with UAVs. In:

IEEE International Conference on Intelligent Robots and Systems 2014: 4805–4812.

ATZORI L, IERA A, MORABITO G. 2010. The Internet of Things: a survey. Comput Networks 54(15):

2787–2805.

BARTESAGHI KOC C, OSMOND P, PETERS A, IRGER M. 2018. Understanding Land Surface Temperature Differences of Local Climate Zones Based on Airborne Remote Sensing Data. IEEE J Sel Top Appl Earth Obs Remote Sens 11(8): 2724–2730.

BERNER B, CHOJNACKI J. 2017. Use of Drones in Crop Protection. Farm Machinery and Processes Management in Sustainable Agriculture, Lublin, Poland. p. 46–51.

BHARDWAJ A, SAM L, AKANKSHA, MARTÍN- TORRES FJ, KUMAR R. 2016. UAVs as remote sensing platform in glaciology: present applications and future prospects. Remote Sens Environ 175:

196–204.

BOURSIANIS AD, PAPADOPOULOU MS, DIAMANTOULAKIS P, TSAKALIDI AL, BAROUCHAS P, SALAHAS G, KARAGIANNIDIS G, WAN S, GOUDOS SK. 2022. Internet of Things (IoT) and Agricultural Unmanned Aerial Vehicles (UAVs) in smart farming: A comprehensive review.

Internet of Things (Netherlands) 18: 100187.

CAPELLO E, GUGLIERI G, RISTORTO G. 2017.

Guidance and control algorithms for mini UAV


autopilots. Aircraft Engineering and Aerospace Technology 89(1): 133–144.

CHAKRAVARTHY A, GHOSE D. 1998. Obstacle avoidance in a dynamic environment: A collision cone approach. IEEE Trans Syst Man, Cybern Part A Systems Humans 28(5): 562–574.

CHAO H, CAO Y, CHEN Y. 2010. Autopilots for small unmanned aerial vehicles: A survey. Int J Control Autom Syst 8(1): 36–44.

CHEE KY, ZHONG ZW. 2013. Control, navigation, and collision avoidance for an unmanned aerial vehicle. Sensors Actuators A Phys 190: 66–76.

COSTA FG, UEYAMA J, BRAUN T, PESSIN G, OSORIO FS, VARGAS PA. 2012. The use of unmanned aerial vehicles and wireless sensor network in agricultural applications. In: International Geoscience and Remote Sensing Symposium (IGARSS) 2012: 5045–5048.

DORA M, WESANA J, GELLYNCK X, SETH N, DEY B, DE STEUR H. 2020. Importance of sustainable operations in food loss: evidence from the Belgian food processing industry. Ann Oper Res 290(1–2): 47–72.

DUBINS LE. 1957. On Curves of Minimal Length with a Constraint on Average Curvature, and with Prescribed Initial and Terminal Positions and Tangents. Am J Math 79(3): 497–516.

EMRAN BJ, NAJJARAN H. 2018. A review of quadrotor: an underactuated mechanical system. Annual Reviews in Control 46: 165–180.

FARACI G, RACITI A, RIZZO S, SCHEMBRA G. 2018. A 5G platform for Unmanned Aerial Monitoring in Rural Areas: Design and Performance Issues. In: 2018 4th IEEE Conference on Network Softwarization and Workshops 2018: 237–241.

FOTOUHI A, QIANG H, DING M, HASSAN M, GIORDANO LG, GARCIA-RODRIGUEZ A, YUAN J. 2019. Survey on UAV Cellular Communications: Practical Aspects, Standardization Advancements, Regulation, and Security Challenges. IEEE Commun Surv Tutorials 21(4): 3417–3442.

GAFFEY C, BHARDWAJ A. 2020. Applications of unmanned aerial vehicles in cryosphere: latest advances and prospects. Remote Sens 12(6): 948.

GAGEIK N, BENZ P, MONTENEGRO S. 2015. Obstacle detection and collision avoidance for a UAV with complementary low-cost sensors. IEEE Access 3: 599–609.

GAGEIK N, MÜLLER T, MONTENEGRO S. 2012. Obstacle Detection and Collision Avoidance Using Ultrasonic Distance Sensors for an Autonomous Quadrocopter. Proc UAVweek Workshop Contrib.

GARABA SP, DIERSSEN HM. 2018. An airborne remote sensing case study of synthetic hydrocarbon detection using short wave infrared absorption features identified from marine-harvested macro- and microplastics. Remote Sens Environ 205: 224–235.

GILES DK, BILLING RC. 2015. Deployment and performance of a UAV for crop spraying. Chem Eng Trans 44: 307–312.

GOUDOS SK, DALLAS PI, CHATZIEFTHYMIOU S, KYRIAZAKOS S. 2017. A Survey of IoT Key Enabling and Future Technologies: 5G, Mobile IoT, Sematic Web, and Applications. Wirel Pers Commun 97(2): 1645–1675.

GROGAN A. 2012. Smart farming. Eng Technol 7(6): 38–40.

HARUN AN, KASSIM MRM, MAT I, RAMLI SS. 2015. Precision irrigation using Wireless Sensor Network. In: 2015 International Conference on Smart Sensors and Application 2015: 71–75.

HASSAN N, YAU KLA, WU C. 2019. Edge computing in 5G: a review. IEEE Access 7: 127276–127289.

HASSANEIN M, LARI Z, EL-SHEIMY N. 2018. A new vegetation segmentation approach for cropped fields based on threshold detection from hue histograms. Sensors (Switzerland) 18(4): 1253.

HASSEBO A, OBAIDAT M, ALI MA. 2018. Commercial 4G LTE cellular networks for supporting emerging IoT applications. In: 2018 Advances in Science and Engineering Technology International Conferences 2018: 1–6.

HERWITZ SR, JOHNSON LF, ARVESEN JC, HIGGINS RG, LEUNG JG, DUNAGAN SE. 2002. Precision agriculture as a commercial application for solar-powered unmanned aerial vehicles. In: 1st Technical Conference and Workshop on UAV, Portsmouth, Virginia 2002: 1–7.

HERWITZ SR, JOHNSON LF, DUNAGAN SE, HIGGINS RG, SULLIVAN DV, ZHENG J, LOBITZ BM, LEUNG JG, GALLMEYER BA, AOYAGI M, SLYE RE, BRASS JA. 2004. Imaging from an unmanned aerial vehicle: Agricultural surveillance and decision support. Comput Electron Agric 44(1): 49–61.

HUANG J, QIAN F, GERBER A, MAO ZM, SEN S, SPATSCHECK O. 2012. A close examination of performance and power characteristics of 4G LTE networks. In: MobiSys'12 – Proceedings of the 10th International Conference on Mobile Systems, Applications, and Services 2012: 225–238.

HUANG S, TEO RSH, TAN KK. 2019. Collision avoidance of multi unmanned aerial vehicles: A review. Annu Rev Control 48: 147–164.

HUANG Y, HOFFMAN WC, LAN Y, FRITZ BK, THOMSON SJ. 2014. Development of a Low-Volume Sprayer for an Unmanned Helicopter. J Agric Sci 7(1): 148–153.

IVANCIC WD, MURAWSKI RW, MATHEOU K, DOWNEY AN. 2019. Flying Drones beyond Visual Line of Sight Using 4G LTE: Issues and Concerns. In: Integrated Communications, Navigation and Surveillance Conference 2019: 1–13.

KABRA TS, KARDILE AV, DEEKSHA MG, MANE DB, BHOSALE PR, BELEKAR AM. 2017. Design, Development & Optimization of a Quad-Copter for Agricultural Applications. Int Res J Eng Technol 4(7): 1632–1636.

KALE SD, KHANDAGALE SV, GAIKWAD SS, NARVE SS, GANGAL PV. 2015. Agriculture Drone for Spraying Fertilizer and Pesticides. Int J Adv Res Comput Sci Softw Eng 5(12).

KANG Y, JOO W, LEE S, SHIN D. 2017. Priority-driven spatial resource sharing scheduling for embedded graphics processing units. J Syst Archit 76: 17–27.

KHATIB O. 1985. Real-time obstacle avoidance for manipulators and mobile robots. In: Proceedings – IEEE International Conference on Robotics and Automation 1985: 500–505.

KHOSIAWAN Y, PARK Y, MOON I, NILAKANTAN JM, NIELSEN I. 2019. Task scheduling system for UAV operations in indoor environment. Neural Comput Appl 31(9): 5431–5459.

KWAK J, SUNG Y. 2018. Autonomous UAV Flight Control for GPS-Based Navigation. IEEE Access 6: 37947–37955.

LEE M, HWANG J, YOE H. 2013. Agricultural production system based on IoT. In: Proceedings – 16th IEEE International Conference on Computational Science and Engineering. p. 833–837.

LI B, FEI Z, ZHANG Y. 2019. UAV communications for 5G and beyond: recent advances and future trends. IEEE Internet Things J 6(2): 2241–2263.

LIPINSKI B, HANSON C, LOMAX J, KITINOJA L, WAITE R, SEARCHINGER T. 2016. Toward a sustainable food system: reducing food loss and waste. World Resources Institute.

LUO C, MCCLEAN SI, PARR G, TEACY L, DE NARDI R. 2013. UAV position estimation and collision avoidance using the extended Kalman filter. IEEE Trans Veh Technol 62(6): 2749–2762.

LYU Y, PAN Q, ZHAO C, ZHANG Y, HU J. 2016. Vision-based UAV collision avoidance with 2D dynamic safety envelope. IEEE Aerosp Electron Syst Mag 31(7): 16–26.

MARINELLO F, PEZZUOLO A, CHIUMENTI A, SARTORI L. 2016. Technical analysis of Unmanned Aerial Vehicles (drones) for agricultural applications. In: Engineering for Rural Development 15: 870–875.

MARTIN J, AMIN R, ELTAWIL A, HUSSIEN A. 2011. Limitations of 4G wireless systems. Proc 2011 Virginia Tech Wirel Symp, Blacksburg, VA.

MARTOS V, AHMAD A, CARTUJO P, ORDOÑEZ J. 2021. Ensuring agricultural sustainability through remote sensing in the era of agriculture 5.0. Appl Sci 11(13): 5911.

MINISTRY OF FINANCE. n/d. India Agricultural GDP. Retrieved from https://pib.gov.in/PressReleasePage.aspx?PRID=1895288

MOGILI UR, DEEPAK BBVL. 2018. Review on Application of Drone Systems in Precision Agriculture. In: Procedia Computer Science 133: 502–509.

MOHAMMED ABDULRAZAQ AN, ASLAMIAH ISTIQOMAH N, SALMAN AL-ZUBAIDI S, ABDUL KARIM S, MUSTAPHA S, YUSUF E. 2020. Toward a novel design for coronavirus detection and diagnosis system using IoT based drone technology. Int J Psychosoc Rehabil 24(7): 2287–2295.

MOLAEI F, RAHIMI E, SIAVOSHI H, AFROUZ SG, TENORIO V. 2020. A Comprehensive Review on Internet of Things (IoT) and its Implications in the Mining Industry. Am J Eng Appl Sci 13(3): 499–515.

MUJUMDAR A, PADHI R. 2010. Nonlinear geometric guidance and differential geometric guidance of UAVs for reactive collision avoidance. In: AIAA Guidance, Navigation, and Control Conference 34(1): 303–310.

MULLA DJ. 2013. Twenty five years of remote sensing in precision agriculture: Key advances and remaining knowledge gaps. Biosystems Engineering 114(4): 358–371.

MUSTAFAH YM, AZMAN AW, AKBAR F. 2012. Indoor UAV positioning using stereo vision sensor. In: Procedia Engineering 41: 575–579.

MYLONAS P, VOUTOS Y, SOFOU A. 2019. A collaborative pilot platform for data annotation and enrichment in viticulture. Inf 10(4): 149.
