
Design of Cartesian Type Manipulator for Automatically Capturing Plant Images Inside Greenhouse



I Dewa Made Subrata1 and Jacklyn Melania2

1,2Department of Mechanical and Biosystems Engineering, Faculty of Agricultural Technology, IPB University, Bogor, INDONESIA

Vol. 12, No. 3 (2023): 545-558
DOI: http://dx.doi.org/10.23960/jtep-l.v12i3.545-558

Article History:
Received: 1 February 2023
Received in revised form: 28 April 2023
Accepted: 14 April 2023

Keywords: Automatic, Continuous, Cartesian manipulator, Digital image, Hydroponic plants

Corresponding Author: dmsubrata@apps.ipb.ac.id

ABSTRACT

Scientific activities often require large amounts of digital image data, so a device capable of capturing images automatically is needed. This study aims to design a Cartesian-type manipulator with two translational movements for capturing and storing hydroponic plant images automatically and continuously.

The manipulator is programmed to capture and store plant images from 15 different positions for seven consecutive days, with two cycles a day, namely at 07:00 and 17:00. The SolidWorks 2020 simulation yields a maximum von Mises stress of 13.783 MPa and a minimum safety factor of 6.869. The manipulator was tested using step period treatments of 0.002, 0.003, and 0.004 seconds. The best result was obtained with the 0.002-second treatment: the average positional errors on the x-axis and y-axis were 0.380 cm and 0.076 cm, and the average translation speed was 8.96 cm/second. The positioning accuracy on the x-axis and y-axis is 98.9% and 99.8%. The movement stability around the set point is quite good, with error ranges of -0.1 to +0.9 cm on the x-axis and -0.065 to +0.15 cm on the y-axis. The system response is less than 1 ms and the energy consumption is 16.132 watt-hours/cycle. The manipulator is able to work according to the design objectives.


1. INTRODUCTION

Digital images are a data source widely used in scientific research because they contain visual information similar to what humans see directly. In the field of agricultural engineering, digital images are used to determine fruit ripeness in sorting machines (Siskandar et al., 2020) as well as for non-destructive measurements based on image color (Saputra et al., 2022). In the fields of biology and plant breeding, digital image data are widely used to study the phenotypes of breeding plants (Bayati & Fotouhi, 2018; Gao et al., 2018; Zhang et al., 2016; Subramanian et al., 2012). However, the application of phenomics in plant breeding requires large datasets that are difficult to obtain in the field in a timely manner if collected manually,



making it difficult to capture important information that influences plant growth and development. The application of a robotic platform capable of collecting data on plant phenotypic traits will help plant breeding scientists begin to uncover genetic mechanisms and thereby increase crop production to meet growing human needs.

In addition to research on plant phenotypes, large numbers of digital images are also needed in artificial intelligence research, which is currently growing rapidly. Many researchers in the field of artificial intelligence use secondary data in their work (Yuliany et al., 2022; Wahid et al., 2021) because the amount of digital image data they have is limited, so an image augmentation process is needed to increase the accuracy of the CNN models they are developing. The augmentation process is done by flipping the image horizontally, rotating it, and enlarging it randomly. From these artificial intelligence research activities, it can be seen that researchers tend to use secondary data because capturing images manually takes a long time and is tiring. Therefore, it is necessary to develop a device capable of capturing images periodically and automatically to obtain sufficient data so that the accuracy of artificial intelligence models can be increased. One device capable of recording images regularly and automatically is a robot platform.
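
As an illustration of the augmentation operations mentioned above (horizontal flipping, random rotation, and random enlargement), a minimal OpenCV sketch is given below; the function name, rotation range, and zoom range are illustrative assumptions, not values taken from the cited studies.

```python
import random
import cv2
import numpy as np

def augment(image: np.ndarray) -> np.ndarray:
    """Apply the augmentations described in the text:
    horizontal flip, random rotation, and random enlargement (zoom)."""
    h, w = image.shape[:2]

    # Horizontal flip
    image = cv2.flip(image, 1)

    # Random rotation about the image center (assumed +/-15 degree range)
    angle = random.uniform(-15, 15)
    M = cv2.getRotationMatrix2D((w / 2, h / 2), angle, 1.0)
    image = cv2.warpAffine(image, M, (w, h))

    # Random enlargement (zoom in), then center-crop back to the original size
    scale = random.uniform(1.0, 1.2)
    enlarged = cv2.resize(image, None, fx=scale, fy=scale)
    eh, ew = enlarged.shape[:2]
    top, left = (eh - h) // 2, (ew - w) // 2
    return enlarged[top:top + h, left:left + w]

# Example: augment one captured plant image (hypothetical file name)
img = cv2.imread("plant.jpg")
cv2.imwrite("plant_aug.jpg", augment(img))
```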

The development of robotics technology in agriculture can be considered quite advanced, although for some types of work, such as harvesting, there are still many obstacles, so commercially viable fruit-harvesting robots remain very limited.

However, robots with simpler functions and tasks are already well developed. For example, Bhogavalli et al. (2021) developed a gantry-type robot for cultivating vegetable plants independently. They reported that the robot was able to grow vegetables independently and provide irrigation water automatically for several weeks of experiments without human involvement. Anh et al. (2020) designed a gantry-type manipulator equipped with a machine vision system for harvesting pineapples.

They reported a success rate of 95.55% for pineapple harvesting, with a harvesting time of 12 seconds per fruit. Permadi et al. (2021) designed a gantry-type robot manipulator to demonstrate the application of robot platforms in agriculture. The manipulator design chosen in that study was a Cartesian configuration with three linear axes, because the gripper position control is more precise and the structure is more robust.

Based on the descriptions above, this research aims to design a Cartesian-type robot manipulator with two horizontal translational axes for capturing plant images automatically and continuously over a certain number of cycles or a certain period of time.

2. MATERIALS AND METHODS

The equipment used in this study included: a Xiaomi Xiaovv HD webcam with a resolution of 1920 × 1080 pixels and a viewing angle of 150 degrees, and an ASUS TUF Gaming A15 computer (8 GB RAM, NVIDIA GeForce GTX 1650 Ti) running Python 3.10.4, OpenCV 4.6.0, Visual Studio 1.70.2, SolidWorks 2019, and SolidWorks 2020. The materials used include manipulator construction materials and materials for plant cultivation. The manipulator construction materials include: 20 mm × 40 mm aluminum V-slot profile; 3-cm-diameter galvanized iron; NEMA 17HS4401 stepper motors with a shaft rotation resolution of 1.8 degrees/step, a supply voltage of 12 volts, and a current consumption of 1.7 A; A4988 stepper motor drivers with an output drive capacity of up to 35 V and 2 A; solid V-wheel gantry plates; a 6 mm GT2 timing pulley; a 6 mm GT2 timing belt; and a Raspberry Pi 3 with a 64-bit quad-core ARM Cortex-A53 processor,


a clock frequency of 1.2 GHz, and 1 GB of RAM; a 5 V 3 A adapter (model XBS 0530) for the Raspberry Pi; and a 12 V 5 A adapter (model SK A372573) for the stepper motors. The materials for cultivating plants include water spinach seeds, rockwool, and AB mix nutrient solution. This research was conducted at the Instrumentation and Control Laboratory and the Siswadhi Soepardjo Field Laboratory greenhouse, Department of Mechanical and Biosystems Engineering, IPB University, from May 2022 to October 2022.

2.1. Robot Manipulator Structural Design

The robot manipulator made in this study is a Cartesian-type manipulator, often also called a gantry type, with two axes of movement, namely the x-axis and the y-axis, as shown in Figure 1.

Figure 1. Construction of Cartesian type manipulator

The rail mechanism for the x- and y-axis traverse is made using a V-slot aluminum profile with dimensions of 20 mm × 40 mm and is supported by galvanized iron with a diameter of 3 cm. The dimensions of the manipulator are: length 240 cm, width 130 cm, and height 100 cm, matching the hydroponic installation that already exists in the greenhouse.

2.2. Design Analysis

The manipulator structure that has been drawn is then analyzed statically using SolidWorks 2020 software to determine the displacement, safety factor, and von Mises stress. The analysis is carried out on the manipulator frame model by applying a continuous force according to the actual loading (Sungkono et al., 2019) (Figure 2).

Figure 2. Structural analysis of the robot manipulator using SolidWorks 2020


2.3. Hardware Control System

The robot manipulator is controlled using a Raspberry Pi 3 to move the camera to each plant image capturing position, carry out the image capturing process, and save the captured plant images. The control system hardware is shown in Figure 3 and the block diagram of the control system is shown in Figure 4.

Figure 3. The hardware of the manipulator control system

Figure 4. Block diagram of the control system

2.4. Control Algorithm

The robot manipulator is designed to capture images of hydroponic plants at 15 different positions in one work cycle. The 15 position images are taken to ensure that each image contains more than 6 plant clumps. The image capturing cycle takes place automatically every day at 07:00 and 17:00. The applied control algorithm is as follows:

The control system first checks whether it is time to perform an image capturing cycle. When the time has come, the camera is moved towards position number 1, that is, away from the origin of the axis coordinates by 1250 steps in the x-axis direction and then by 1460 steps in the y-axis direction. The stepper motor has a shaft rotation resolution of 1.8 degrees/step and is connected to a timing pulley with a diameter of 12.5 mm, so each step moves the camera translationally by 0.197 mm. Because the stepper motor is controlled in an open loop, a rotation of 1250 steps of the stepper motor shaft is equivalent to a translational camera movement of 1250 × 0.197 mm = 246 mm


and a rotation of 1460 steps is equivalent to 288 mm. After the plant image is recorded and stored from that position, the camera is moved to position 2, that is, away from the origin by 2170 steps in the y-axis direction, followed by capturing and storing the plant image, and so on. The complete control algorithm for one image capturing cycle is shown in Figure 5.
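
As a rough illustration of this open-loop scheme, the sketch below pulses one A4988-driven axis with a fixed step period and then captures an image with OpenCV. It is a minimal sketch only: the GPIO pin numbers, camera index, and file naming are assumptions and do not come from the actual control program, while the 0.197 mm/step figure and the step counts are taken from the text above.

```python
import time
import cv2
import RPi.GPIO as GPIO

STEP_PIN, DIR_PIN = 20, 21      # assumed BCM pins wired to the A4988 driver
MM_PER_STEP = 0.197             # 1.8 deg/step on a 12.5 mm timing pulley
STEP_PERIOD = 0.002             # best-performing step period (seconds)

GPIO.setmode(GPIO.BCM)
GPIO.setup(STEP_PIN, GPIO.OUT)
GPIO.setup(DIR_PIN, GPIO.OUT)

def move_axis(steps: int, forward: bool = True) -> float:
    """Open-loop move: send 'steps' pulses and return the nominal travel in mm."""
    GPIO.output(DIR_PIN, GPIO.HIGH if forward else GPIO.LOW)
    for _ in range(steps):
        GPIO.output(STEP_PIN, GPIO.HIGH)
        time.sleep(STEP_PERIOD / 2)
        GPIO.output(STEP_PIN, GPIO.LOW)
        time.sleep(STEP_PERIOD / 2)
    return steps * MM_PER_STEP

def capture(position: int) -> None:
    """Grab one frame from the webcam and store it, named by position number."""
    cam = cv2.VideoCapture(0)
    ok, frame = cam.read()
    cam.release()
    if ok:
        cv2.imwrite(f"plant_pos_{position:02d}.jpg", frame)

# Position 1: 1250 steps on the x-axis (~246 mm), 1460 steps on the y-axis (~288 mm)
move_axis(1250)   # x-axis (a second driver/pin pair would be used for y in practice)
move_axis(1460)   # y-axis
capture(1)
GPIO.cleanup()
```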

Figure 5. Flowchart of the control algorithm for one image capturing cycle

2.5. Functional Test

Functional tests were carried out to find out whether the developed robotic manipulator system was capable of moving the camera to the plant image capturing positions. The functional test was carried out using three treatments of the stepper motor controller step period, namely 0.002 seconds (500 Hz), 0.003 seconds (333 Hz), and 0.004 seconds (250 Hz). The parameters tested include: pulley slip, theoretical and actual movement distance of the camera position, traveling time, positioning accuracy, stability, system response, and system energy consumption. Pulley slip is calculated using Equations 1 and 2:

$$S = \frac{X_t - X_a}{X_t} \times 100\% \quad (1)$$

$$S_{av} = \frac{1}{n}\sum_{i=1}^{n} S_{pi} \quad (2)$$


where S is the pulley slip (%), Sav is the average pulley slip (%), Xa is the actual movement distance (cm), Xt is the theoretical movement distance (cm), Spi is the pulley slip for the first, second, and subsequent image capturing positions (%), and n is the number of data points.

The average movement error for all image capturing positions is calculated using Equations 3 and 4:

$$K = X_a - X_t \quad (3)$$

$$K_{av} = \frac{1}{n}\sum_{i=1}^{n} K_{pi} \quad (4)$$

where K is the position error (cm), Kav is the average error (cm), and Kpi is the movement distance error for the first, second, and subsequent positions (cm).

The average speed of the camera movement is calculated using Equations 5 and 6:

$$V = \frac{X}{t} \quad (5)$$

$$V_{av} = \frac{1}{n}\sum_{i=1}^{n} V_{pi} \quad (6)$$

where V is the speed (cm/s), Vav is the average speed (cm/s), Vpi is the movement speed to the first, second, and subsequent positions (cm/s), X is the actual movement distance (cm), and t is the actual traveling time (s).
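
The functional-test metrics defined by Equations 1–6 reduce to simple arithmetic. The following sketch illustrates them using the reconstructed formulas above and hypothetical sample measurements; it is not the authors' analysis script.

```python
def pulley_slip(x_theoretical: float, x_actual: float) -> float:
    """Equation 1: slip as the shortfall of actual vs. theoretical travel, in %."""
    return (x_theoretical - x_actual) / x_theoretical * 100.0

def position_error(x_theoretical: float, x_actual: float) -> float:
    """Equation 3: signed error in cm (negative when actual travel is shorter)."""
    return x_actual - x_theoretical

def speed(x_actual: float, travel_time: float) -> float:
    """Equation 5: translational speed in cm/s."""
    return x_actual / travel_time

def mean(values):
    """Equations 2, 4, 6: average over all image capturing positions."""
    return sum(values) / len(values)

# Hypothetical measurements for three capturing positions: distances in cm, times in s
theoretical = [24.6, 28.8, 43.0]
actual      = [24.4, 28.7, 42.6]
times       = [2.7, 3.2, 4.8]

slips  = [pulley_slip(t, a) for t, a in zip(theoretical, actual)]
errors = [position_error(t, a) for t, a in zip(theoretical, actual)]
speeds = [speed(a, dt) for a, dt in zip(actual, times)]
print(f"avg slip {mean(slips):.3f}%  avg error {mean(errors):.3f} cm  avg speed {mean(speeds):.2f} cm/s")
```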

2.6. Performance Test

Performance tests were carried out to analyze the performance of the manipulator in capturing and storing images of hydroponic plants automatically. The robotic manipulator was programmed to capture plant images and was then operated for 7 consecutive days. Every day the robotic manipulator executed two image capturing cycles, namely at 07:00 and 17:00. In this study, water spinach seeds were used as the experimental material.
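
A minimal sketch of how this twice-daily schedule might be implemented on the Raspberry Pi is shown below. The cycle hours and 7-day duration follow the text, while run_capture_cycle() is a hypothetical stand-in for the movement-and-capture routine described in Section 2.4.

```python
import time
from datetime import datetime

CYCLE_HOURS = (7, 17)   # image capturing cycles at 07:00 and 17:00
TEST_DAYS = 7           # performance test duration in days

def run_capture_cycle() -> None:
    """Hypothetical stand-in: move to the 15 positions and capture/store images."""
    print(f"{datetime.now():%Y-%m-%d %H:%M} capturing cycle started")

cycles_done = 0
while cycles_done < TEST_DAYS * len(CYCLE_HOURS):
    now = datetime.now()
    if now.hour in CYCLE_HOURS and now.minute == 0:
        run_capture_cycle()
        cycles_done += 1
        time.sleep(60)   # avoid re-triggering within the same minute
    time.sleep(1)
```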

3. RESULTS AND DISCUSSION

As stated in the methodology section, before the robot manipulator was built, an analysis of von Mises stress, displacement, and safety factor was carried out using the SolidWorks 2020 application.

Table 1. Properties of aluminum alloys 6063 T5


| Property | Value | Unit |
|---|---|---|
| Elastic Modulus | 69 | GPa |
| Poisson's Ratio | 0.33 | N/A |
| Shear Modulus | 25.8 | GPa |
| Mass Density | 2700 | kg/m³ |
| Tensile Strength | 185 | MPa |
| Yield Strength | 145 | MPa |


The material used for the von Mises stress simulation is aluminum alloy 6063-T5, with the specifications shown in Table 1. The calculated loads acting on the robot manipulator frame are presented in Table 2.

Table 2. Load acting on the robot manipulator frame

| Number | Force Type | Force Value (N) | Description |
|---|---|---|---|
| 1 | Gravity | 196.20 | Prototype total weight |
| 2 | Motorized-edge weight on x-axis | 25.84 | Stepper motor, 3 limit switches, y-axis V-slot, and stepper motor mounting plate |
| 3 | Non-motorized-edge weight on x-axis | 21.18 | Y-axis V-slots |
| 4 | Weight of gantry and camera | 5.10 | Gantry and webcam |

As mentioned earlier, the effect of the external loading applied to the structure on the resulting stress distribution is examined using von Mises stress analysis. Before the analysis, the data that need to be input into the simulation include: the technical drawings of the device, the value and location of the loads acting on the structure, and the specifications of the materials used in the structural design. After the data are entered, the stress distribution in the structure is displayed by SolidWorks as shown in Figure 6. The von Mises stress simulation results (Figure 6) show a maximum value of 13.783 MPa, which is in the safe category because it is far below the yield strength of 145 MPa (Pamungkas et al., 2020).

The green objects in Figure 6 indicate fixed positions that will not move while the robot manipulator operates. The simulation of the manipulator frame produces a highest displacement value of 1.081 mm, located at the camera position, as shown in Figure 7.

Figure 6. Results of the von Mises stress analysis on the robot manipulator frame



Figure 7. Results of displacement analysis on the robot manipulator frame

The Tresca failure criterion uses a material's shear strength, taken as half of its tensile strength, to calculate the safety factor. Based on this criterion, the shear strength is 185 MPa / 2 = 92.5 MPa. The minimum safety factor is calculated as the ratio between the shear strength and the maximum von Mises stress, namely 92.5 MPa / 13.783 MPa. The safety factor obtained from the SolidWorks 2020 analysis is therefore a minimum of 6.869 and a maximum of 10, as shown in Figure 8. The higher the safety factor, the more safely the structure can withstand the load.

Figure 8. The results of the safety factor analysis of the robot manipulator frame

The results of the safety factor analysis show that the frame is able to withstand dynamic loads and even impact loads, because the minimum safety factor required for a material to withstand impact loads is in the range of 3 to 5 (Imran & Kadir, 2017; Wibawa & Diharjo, 2019). Islami et al. (2022) conducted a structural analysis of aluminum V-slot material for a 3D printer frame and found a safety factor of 15, which is better than that of the Cartesian manipulator in this study. However, the safety factor of 6.869 is still in the safe category.

3.1. Robotic Manipulator Prototype

A robotic manipulator for capturing images of plants inside a greenhouse has been successfully built according to the isometric projection dimensions simulated


using SolidWorks 2020. The prototype is shown in Figure 9a. The robot manipulator has a length of 240 cm, a width of 130 cm, and a height of 100 cm.

Figure 9. (a) Robot manipulator, (b) control circuit, (c) camera installation

3.2. Functional Testing

The pulley slip test results for the three step period treatments of the stepper motor controller are shown in Table 3. From Table 3 it can be seen that the lowest average pulley slip was obtained with the 0.002-second treatment. The slip on the x-axis is greater than the slip on the y-axis because the loading on the x-axis is greater than that on the y-axis.

Table 3. Pulley slip with webcam loading

| Treatment of stepper motor step period (seconds) | Pulley slip, x-axis (%) | Pulley slip, y-axis (%) | Average slip (%) |
|---|---|---|---|
| 0.002 | 0.982 | 0.171 | 0.576 |
| 0.003 | 3.301 | 0.143 | 1.722 |
| 0.004 | 5.787 | 1.032 | 3.409 |

3.2.1. Positioning Accuracy

The position errors of the camera movement are shown in Table 4. A negative sign in the table indicates that the actual distance is shorter than the theoretical one.

From the average errors in Table 4, the accuracy of the image capturing position can be calculated with the equation:

$$A = \left(1 - \frac{|K_{av}|}{\bar{X}}\right) \times 100\% \quad (7)$$

where A is the positioning accuracy (%) and $\bar{X}$ is the average translation distance (cm).

The x-axis error of 0.380 cm corresponds to an average translation distance of 32 cm, so the relative error is 1.2%. The y-axis error of 0.076 cm corresponds to an average translation distance of 43 cm, so the relative error is 0.2%. Thus, the positional accuracy is 98.9% for the x-axis and 99.8% for the y-axis.

Table 4. Average webcam stop position errors

| Treatment of stepper motor step period (seconds) | Error, x-axis (cm) | Error, y-axis (cm) |
|---|---|---|
| 0.002 | -0.380 | -0.076 |
| 0.003 | -1.200 | -0.064 |
| 0.004 | -2.107 | -0.444 |


The average capturing position error for each cycle was measured to ensure that errors do not accumulate during manipulator operation. The error fluctuations for the three tested step period treatments of the stepper motor controller are shown in Figures 10 and 11. At slower stepper motor rotation speeds, the back EMF in the windings tends to decrease, causing the current in the load to increase and resulting in missed steps. Therefore, the positioning error becomes greater at the step period of 0.004 s than at the step period of 0.002 s.

Figure 10. Stability of image capturing position on the x-axis

Figure 11. Stability of image capturing position on the y-axis

From Figures 10 and 11 it can be seen that the 0.002-second step period treatment produces the smallest error fluctuations, and there appears to be no accumulation of error on either the x- or y-axis over the 14 plant image capturing cycles. The positional accuracy of agricultural robots is generally lower than that of industrial robots, but it can still provide good performance. Jiang et al. (2020) used an artificial neural network algorithm to increase the positioning accuracy of an industrial robot from 0.8497 mm to 0.0490 mm. Yu et al. (2020) studied a strawberry harvesting robot with an average position error of 2 mm and a maximum of 4 mm, and stated that this error is still acceptable for the positioning accuracy of a fruit harvesting end-effector. Gong et al. (2022) studied a fruit harvesting robot with a position accuracy of 8.21 mm that was still able to harvest with a success rate of 73.04%.

3.2.2. System Stability

From Figure 10 it can be seen that the translational movement on the x-axis is quite stable around the set point with an error ranging from -0.1 cm to +0.9 cm for the


stepper motor step period of 0.002 seconds. Meanwhile, the 0.004-second step period shows stability around the set point with an error ranging from -0.25 cm to +2.4 cm, which is lower than that of the 0.002-second step period. From Figure 11 it can be seen that the translational movement on the y-axis is also quite stable around the set point, with an error ranging from -0.065 cm to +0.15 cm for the 0.002-second step period. Meanwhile, the 0.004-second step period shows stability around the set point with an error ranging from +0.015 cm to +0.40 cm, which is again lower than that of the 0.002-second step period.

3.2.3. System Response

The system response was measured using a linear potentiometer, starting when the motor shaft rotation command was executed; the results are shown in Figure 12. From Figure 12 it can be seen that the stepper motor's response to the shaft rotation command is less than 1 ms.

Figure 12. Time response of the stepper motor shaft rotation

3.2.4. Electric Power Consumption

The electric power consumption of the manipulator is calculated using the formula:

$$P = V \times I \quad (8)$$

where P is the power (watts), V is the supply voltage (volts), and I is the current drawn (amperes); the energy consumption per cycle is the total power multiplied by the duration of one cycle.

The power consumption consists of: the stepper motor adapter, 12 V × 5 A = 60 watts; the A4988 stepper motor drivers, with a load power of 12 V × 2 A × 2 units = 48 watts and a logic power of 5.5 V × 0.00002 A × 6 lines = 0.00066 watts; the stepper motors, 12 V × 1.7 A × 2 phases × 2 units = 81.6 watts; the Raspberry Pi 3, 5 V × 2.5 A = 12.5 watts; the Raspberry Pi 3 power adapter, 5 V × 3 A = 15 watts; and the Xiaomi Xiaovv camera, 5 V × 0.16 A = 0.8 watts. The total electric power consumption is therefore about 218 watts. One plant image capturing cycle takes 0.074 hours, so the energy consumption is 16.132 watt-hours/cycle.
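
The energy figure above is simply the sum of the V × I products multiplied by the cycle time. The short sketch below reproduces that bookkeeping using the component values listed in the text; the dictionary layout is illustrative only.

```python
# Component electrical loads as (voltage in V, current in A, units), taken from the text above
components = {
    "stepper motor adapter":     (12.0, 5.0, 1),
    "A4988 driver load":         (12.0, 2.0, 2),
    "A4988 driver logic":        (5.5, 0.00002, 6),
    "stepper motors (2 phases)": (12.0, 1.7, 2 * 2),
    "Raspberry Pi 3":            (5.0, 2.5, 1),
    "Raspberry Pi 3 adapter":    (5.0, 3.0, 1),
    "Xiaomi Xiaovv webcam":      (5.0, 0.16, 1),
}

total_power_w = sum(v * i * n for v, i, n in components.values())
cycle_time_h = 0.074                       # one image capturing cycle (4 min 24 s)
energy_wh_per_cycle = total_power_w * cycle_time_h

print(f"total power  : {total_power_w:.1f} W")         # about 218 W
print(f"energy/cycle : {energy_wh_per_cycle:.3f} Wh")  # about 16.1 Wh per cycle
```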

The average translational speed of the camera is calculated based on the camera's traveling distance and traveling time. The calculation results show that the highest average speed, about 8.96 cm/second, was obtained with the 0.002-second step period, as shown in Table 5.

From Table 5 it can be seen that the highest translation speed of 8.96 cm/s was obtained for the 0.002-second step period treatment. Therefore, the 0.002-second treatment was chosen in this study. Chancharoen et al. (2019) designed a robotic manipulator built on V-slot rails with a positioning accuracy of 0.45 mm. Their manipulator was tested for repeatability at two translational speeds, namely 20 cm/minute and 240 cm/minute, and they concluded that the higher the translation speed, the lower the repeatability accuracy.

Table 5. Average translational speed of the camera

| Treatment of stepper motor step period (seconds) | Average speed (cm/sec) |
|---|---|
| 0.002 | 8.96 |
| 0.003 | 6.24 |
| 0.004 | 4.87 |

3.3. Manipulator Performance Test

Performance tests were carried out inside the greenhouse to capture images of hydroponic plants using the best stepper motor step period from the functional testing, namely 0.002 seconds. The manipulator was operated continuously for 7 days with two image capturing cycles per day, at 07:00 and 17:00. In every cycle, images were captured from 15 different positions as described in the methodology. The time required for one image capturing cycle is 4 minutes 24 seconds. As an example, the images from one cycle were arranged according to the capturing position as shown in Figure 13.

Figure 13. The arrangement of plant images captured by the manipulator in one cycle

From Figure 13 it can be seen that the camera recorded at least 6 plants at each position. However, because the total number of plants in the hydroponic tub is only 50 clumps, there are overlapping images between adjacent positions, so further processing or rearrangement of the image capturing positions is necessary to obtain different plant images for each position. As stated in the title of the article, this research focuses on designing a manipulator that is capable of recording images at predetermined positions and times automatically, without further image processing.

In addition, the camera used has a convex lens, so the plant images obtained from this study need to be further processed before they can be used in digital image-based analysis.

4. CONCLUSIONS

A Cartesian-type robotic manipulator with two translational axes has been successfully designed and built for automatically capturing and storing plant images in greenhouses. Among the three tested step period treatments of the stepper motor controller, the best performance was obtained with the 0.002-second step period: the average pulley slip is 0.57%, the average position error is 0.380 cm on the x-axis and 0.076 cm on the y-axis, and the average translation speed is 8.96 cm/sec. Thus, the Cartesian-type robotic manipulator that has been designed is capable of capturing and storing hydroponic plant images automatically and continuously over the 7-day experiment with two capturing cycles per day. The average time per cycle is 4 minutes 24 seconds, or about 0.074 hours.

ACKNOWLEDGEMENT

The author would like to thank the Department of Mechanical and Biosystems Engineering, Faculty of Agricultural Technology, IPB University, for facilitating the supervised student so that this research could be carried out properly.

REFERENCES

Anh, N.P.T., Hoang, S., Tai, D.V., & Quoc, B.L.C. (2020). Developing robotic system for harvesting pineapples. 2020 International Conference on Advanced Mechatronic Systems (ICAMechS). https://doi.org/10.1109/ICAMechS49982.2020.9310079

Bayati, M., & Fotouhi, R. (2018). A mobile robotic platform for crop monitoring. Advances in Robotics & Automation, 7(1), 1–7. https://doi.org/10.4172/2168-9695.1000186

Bhogavalli, R., Tech, B., & Krishnaswamy, S. (2021). Automated farming using gantry robot. International Research Journal of Engineering and Technology (IRJET), 8(7), 3547–3553. https://www.irjet.net/archives/V8/i7/IRJET-V8I7613.pdf

Chancharoen, R., Veerakiatikit, P., Kriathkungwalkai, L., Daraseneeyakul, P., Loetchaipitak, T., & Prayongrat, M. (2019). An accuracy and repeatability of a robot made with V-slot extrusion with built-in linear rails. IOP Conference Series: Materials Science and Engineering, 635(2019), 1–5. https://doi.org/10.1088/1757-899X/635/1/012025

Gao, T., Emadi, H., Saha, H., Zhang, J., Lofquist, A., Singh, A., Subramanian, B.G., Sarkar, S., Singh, A.K., & Bhattacharya, S. (2018). A novel multirobot system for plant phenotyping. Robotics, 7(4), 1–15. https://doi.org/10.3390/robotics7040061

Gong, L., Wang, W., Wang, T., & Liu, C. (2022). Robotic harvesting of the occluded fruits with a precise shape and position reconstruction approach. Journal of Field Robotics, 39(1), 1–84. https://doi.org/10.1002/rob.22041

Imran, A.I., & Kadir. (2017). Simulasi tegangan von mises dan analisa safety factor gantry crane kapasitas 3 ton. DINAMIKA Jurnal Ilmiah Teknik Mesin, 8(2), 1–4. https://doi.org/10.33772/djitm.v8i2.2378

Islami, L.A., Mardiyana, D., & Ridha, F.F. (2022). Analisis struktur aluminium profile V-slot sebagai desain rangka mesin 3d printer. Jurnal Teknik Mesin, Industri, Elektro dan Informatika (JTMEI), 1(2), 30–44. https://doi.org/10.55606/jtmei.v1i2.505

Jiang, Y., Yu, L., Jia, H., Zhao, H., & Xia, H. (2020). Absolute positioning accuracy improvement in an industrial robot. Sensors, 20(16), 1–14. https://doi.org/10.3390/s20164354

Pamungkas, G.A., Priambadi, I.G.N., & Komaladewi, A.A.I.A.S. (2020). Analisis defleksi pada rangka alat pembuat briket sampah organik. Jurnal METTEK, 6(2), 121–128. https://doi.org/10.24843/METTEK.2020.v06.i02.p06

Permadi, Y., Prayogo, S.S., & Kusuma, T.M. (2021). Robot edukasi pertanian agrobot-i: rancangan elektronika dan sistem penggerak. Jurnal Ilmiah Informatika Komputer, 26(1), 1–12. https://doi.org/10.35760/ik.2021.v26i1.2696

Saputra, T.W., Wijayanto, Y., Ristiyana, S., Purnamasari, I., & Muhlison, W. (2022). Non-destructive measurement of rice amylose content based on image processing and Artificial Neural Networks (ANN) model. Jurnal Teknik Pertanian Lampung, 11(2), 231–241. https://doi.org/10.23960/jtep-l.v11i2.231-241

Siskandar, R., Indrawan, N.A., Kusumah, B.R., Santosa, S.H., Irmansyah, & Irzaman. (2020). Penerapan rekayasa mesin sortir sebagai penentu kematangan buah jeruk dan tomat merah berbasis image processing. Jurnal Teknik Pertanian Lampung, 9(3), 222–236. https://doi.org/10.23960/jtep-l.v9i3.222-236

Subramanian, R., Spalding, E.P., & Ferrier, N.J. (2012). A high throughput robot system for machine vision based plant phenotype studies. Springer, 24, 619–636. https://doi.org/10.1007/s00138-012-0434-4

Sungkono, I., Irawan, H., & Patriawan, D.A. (2019). Analisis desain rangka dan penggerak alat pembulat adonan kosmetik sistem putaran eksentrik menggunakan solidwork. Prosiding Seminar Nasional Sains dan Teknologi Terapan, 575–580. http://ejurnal.itats.ac.id/sntekpan/article/view/658

Wahid, M.I., Mustamin, S.A., & Lawi, A. (2021). Identifikasi dan klasifikasi citra penyakit daun tomat menggunakan arsitektur InceptionV4. Konferensi Nasional Ilmu Komputer (KONIK) 2021, 5(1), 257–264. https://prosiding.konik.id/index.php/konik/article/view/61

Wibawa, L.A.N., & Diharjo, K. (2019). Desain, pemilihan material, dan faktor keamanan stasiun pengisian gawai menggunakan metode elemen hingga. Jurnal Teknologi, 11(2), 97–102. https://doi.org/10.24853/jurtek.11.2.97-102

Yu, Y., Zhang, K., Liu, H., Yang, L., & Zhang, D. (2020). Real-time visual localization of the picking points for a ridge-planting strawberry harvesting robot. IEEE Access, 8, 116556–116568. https://doi.org/10.1109/ACCESS.2020.3003034

Yuliany, S., Aradea, & Rachman, A.N. (2022). Implementasi deep learning pada sistem klasifikasi hama tanaman padi menggunakan metode Convolutional Neural Network (CNN). Jurnal Buana Informatika, 13(1), 54–65. https://doi.org/10.24002/jbi.v13i1.5022

Zhang, C., Gao, H., Zhou, J., Cousins, A., Pumphrey, M.O., & Sankaran, S. (2016). 3D robotic system development for high-throughput crop phenotyping. IFAC-PapersOnLine, 49(16), 242–247. https://doi.org/10.1016/j.ifacol.2016.10.045
