
Vision Based Navigation of an Autonomous Robot


Academic year: 2023


We are grateful to MAHBOOB KARIM, Instructor Class 'A', and Fahim Hasan Khan, Assistant Professor, Military Institute of Science and Technology (MIST), Dhaka, Bangladesh, for their constant supervision, loving guidance, and great encouragement and motivation. We are especially grateful to the Department of Computer Science and Engineering (CSE) of MIST for providing their full support during the thesis work.

LIST OF ABBREVIATION

INTRODUCTION

  • Background of Thesis
  • Our Proposed System

Therefore, robotic systems must either control themselves (autonomous) or be controlled by a user (remote control), depending on the goal and the position of the robot. The input devices capture images of the real scene, and the data is processed by a computer that ultimately performs object detection. In this system, an Arduino microcontroller serves as the brain of the robot, controlling its subsystems.

Figure 1.1: Basic flow of the System

LITERATURE REVIEW

Robotics and Robots

In fact, the first use of the word "robot" was in a 1921 play about mechanical men, built to work on factory assembly lines, who rebel against their human masters: Karel Čapek's 'Rossum's Universal Robots'. The play described robots as tools that perform simple, repetitive tasks, but when the robots in the story were used in battle, they turned against their human owners and took over the world. Earlier, in 1892, a motorized crane with a gripper had been built for removing castings from a furnace.

An intelligent AGV delivers goods without the use of lines or landmarks in the work area. Mobile robots, following markers or wires in the floor, or using vision or lasers, are used to transport goods around large facilities such as warehouses, container ports, and hospitals. This type of robot includes many quite different devices, such as robot vacuum cleaners, sweepers, gutter cleaners, and other robots that can perform various household tasks. Some surveillance and telepresence robots can also be considered household robots if used in that environment.

These can be various robots for collecting data, robots made for demonstrating technologies, robots used for research, etc. Often, robots originally created for military purposes can be used in law enforcement, search and rescue, and other related fields. This type would include robots used on the International Space Station, the Canadarm used in the Shuttles, as well as Mars rovers and other robots used in space.

They aid in exploration and data collection and are often used in space applications, operations and housekeeping tasks.

Figure 2.1: Articulated welding robots used in a factory, a type of industrial robot.

Robotic Vision

In addition to geometric features, texture, markings and color also contribute to the computer's representation of the scene, which is then used in applications such as obstacle detection and avoidance, navigation, and planning. Each element of the digitized image (pixel) has a value that corresponds to the brightness of that point in the captured scene. Digital image acquisition systems primarily require a photosensitive element, usually a photosensitive matrix arrangement in the image sensor (CCD, CMOS, etc.).

The number of elements in the photosensitive matrix determines the spatial resolution of the captured image, while the number of bits used to store image information determines the resolution in image intensity. One basic point operation creates a new image that is the inverse of the input image, like the negative of a photograph; this is useful for images created by absorption of radiation, e.g. in medical image processing.
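As a minimal sketch of the inversion operation described above (function and type names are illustrative, not from the thesis), each 8-bit pixel value p is simply mapped to 255 - p:

```cpp
#include <cstdint>
#include <vector>

// Point operation: each output pixel q depends only on the corresponding
// input pixel p. Here we invert an 8-bit grayscale image, producing the
// photographic negative: q = 255 - p.
std::vector<std::uint8_t> invert(const std::vector<std::uint8_t>& img) {
    std::vector<std::uint8_t> out(img.size());
    for (std::size_t i = 0; i < img.size(); ++i)
        out[i] = static_cast<std::uint8_t>(255 - img[i]);
    return out;
}
```

Applying the function twice recovers the original image, which is a quick sanity check for any point operation of this kind.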

In point operations, a pixel q of the output image depends only on the corresponding pixel p of the input image; in grouped (neighborhood) operations, each output pixel q depends on a neighborhood of the corresponding input pixel p. If an 8-neighborhood is considered, a weighted sum is computed over the 8 neighbors of the corresponding pixel p, and the result is assigned to pixel q. To define the weights, masks or kernels with constant values are generally used; the values of these masks determine the final result of the output image.
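The grouped operation described above can be sketched as a 3x3 weighted sum over the center pixel and its 8-neighborhood (the function signature and border handling are illustrative assumptions, not taken from the thesis):

```cpp
#include <cstdint>
#include <vector>

// Grouped (neighborhood) operation: each output pixel q is a weighted sum
// of the corresponding input pixel p and its 8 neighbors, using a 3x3 mask
// (kernel). Border pixels are copied unchanged for simplicity.
std::vector<std::uint8_t> convolve3x3(const std::vector<std::uint8_t>& img,
                                      int w, int h,
                                      const float kernel[3][3]) {
    std::vector<std::uint8_t> out(img);  // borders keep input values
    for (int y = 1; y < h - 1; ++y) {
        for (int x = 1; x < w - 1; ++x) {
            float sum = 0.0f;
            for (int ky = -1; ky <= 1; ++ky)
                for (int kx = -1; kx <= 1; ++kx)
                    sum += kernel[ky + 1][kx + 1]
                         * img[(y + ky) * w + (x + kx)];
            if (sum < 0.0f) sum = 0.0f;       // clamp to the valid
            if (sum > 255.0f) sum = 255.0f;   // 8-bit intensity range
            out[y * w + x] = static_cast<std::uint8_t>(sum);
        }
    }
    return out;
}
```

With a uniform 1/9 kernel this is a box blur; other mask values yield sharpening or edge-detection filters from the same weighted-sum machinery.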

The algorithm not only searches for an exact match of the pattern, but also tolerates a certain degree of variation from it.

Figure 2.3: Neighborhood of a pixel

Some robotics terminology

The robot locates the distance to obstacles based on how long it takes the signal to bounce back. A vision system perceives light from the environment using a two-dimensional field of receptors. Whole situations can thus be recognized and expectations for their further development in the near future can be formed.

In the late 1970s, the idea of motion recognition and vehicle control with computer vision appeared in the context of planetary vehicles [5]. This approach emphasized the importance of a high image rate (greater than 10 Hz) to exploit the temporal and visual continuity of the scene. Modeling the change in camera location makes it possible to predict the positions of elements in the images, drastically reducing the search space.

Extending the well-established 1960s methods for recursive estimation from incomplete measurement data, he introduced a 4-D approach, a complete state reconstruction in 3-D space and time, which was eventually widely accepted in the autonomous vehicle community [6].

The rotation speed of the robot is determined by the difference between the speeds of its two motors. If both motors rotate in the same direction at the same speed, the robot moves forward or backward according to the direction of wheel rotation.

If the speed of one motor is higher than the other, the robot will turn in the direction of the slower motor.
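The differential-drive behavior described above can be sketched as follows (the struct and parameter names are illustrative; `trackWidth` is the distance between the left and right wheels):

```cpp
// Differential-drive kinematics: body velocity from the two wheel speeds.
// Equal speeds -> straight motion (omega = 0); unequal speeds -> the robot
// turns toward the side of the slower wheel.
struct Twist {
    double v;      // forward speed, m/s
    double omega;  // rotation speed, rad/s (positive = toward left wheel)
};

Twist diffDrive(double vLeft, double vRight, double trackWidth) {
    return { (vLeft + vRight) / 2.0,            // average of the wheel speeds
             (vRight - vLeft) / trackWidth };   // speed difference drives the turn
}
```

For example, with the right wheel faster than the left, omega is positive and the robot turns toward the slower (left) side, exactly as the text describes.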

Figure 2.4: The autonomous Urbie is designed for various urban operations, including military reconnaissance and rescue operations. Photo courtesy NASA

TECHNOLOGY USED

Software Technologies

Hardware Technologies

The language is extensible via C++ libraries, and people who want to understand the technical details can make the jump from Arduino to the AVR C programming language on which it is based. The plans for the modules are published under a Creative Commons license, allowing experienced circuit designers to create, extend and improve their own version of the module. Even relatively inexperienced users can build the breadboard version of the module to understand how it works and save money.

The Arduino Ethernet Shield allows the Arduino to send and receive data anywhere in the world, given an internet connection.

DC Motor

The direct current (DC) motor is one of the first machines designed to convert electrical energy into mechanical energy. The stationary electromagnetic field of the motor can be wire-wound like the armature (a wound-field motor) or can consist of permanent magnets (a permanent-magnet motor). The speed of a DC motor can be controlled by changing the field current [15].

All four input pins are connected to digital pins of the Arduino, and the four output pins are connected to the DC motors of the robot. The enable pins enable the input/output pins on each side of the IC, and Vcc supplies external power to the DC motors. Both motors on the same side of the robot move in the same direction at the same time.

Two servo motors were used in the model of the robot; they receive information for movement (forward, backward, left, right) through the control board (Arduino Uno connected to the L293D).
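The control logic described above can be sketched as a mapping from a movement command to the four L293D input pin states; the command set and the pin-to-side assignment below are illustrative assumptions, not taken from the thesis:

```cpp
#include <array>

// Map a movement command to the four L293D input pins (IN1..IN4).
// Assumption for illustration: IN1/IN2 drive the left-side motors and
// IN3/IN4 the right-side motors; true = HIGH, false = LOW. Turning is
// done by reversing one side (skid steering).
enum class Cmd { Forward, Backward, Left, Right, Stop };

std::array<bool, 4> l293dInputs(Cmd c) {
    switch (c) {
        case Cmd::Forward:  return {true,  false, true,  false};
        case Cmd::Backward: return {false, true,  false, true};
        case Cmd::Left:     return {false, true,  true,  false}; // left side reversed
        case Cmd::Right:    return {true,  false, false, true};  // right side reversed
        default:            return {false, false, false, false}; // Stop: all LOW
    }
}
```

On the actual robot, each of the four booleans would be written to the corresponding Arduino digital pin (e.g. with `digitalWrite`) while the L293D enable pins are held HIGH.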

Figure 3.1: Arduino Uno

DESIGN OF THE SYSTEM

  • System Architecture
  • System Design
  • IMPLEMENTATION
    • LABVIEW Implementation
    • Working Procedure of The Robot
  • CONCLUSION

The command is generated on the PC and sent to the robot through the router, within the router's range. Within this block there are four sections: the first corresponds to the option to select the acquisition source, which lists all the cameras connected to the computer. Changing the system image type to Grayscale (U8) is very simple, and the image is set to imageGray.
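The grayscale (U8) conversion performed in the vision pipeline can be sketched per pixel with standard luminance weights; the exact coefficients the IMAQ library uses are not stated in the text, so the common BT.601 weights below are an assumption:

```cpp
#include <cstdint>

// Convert one RGB pixel to an 8-bit intensity (U8) value using BT.601
// luminance weights (an assumption; the IMAQ-internal weighting may differ).
// Green contributes most because the eye is most sensitive to it.
std::uint8_t rgbToGray(std::uint8_t r, std::uint8_t g, std::uint8_t b) {
    return static_cast<std::uint8_t>(0.299 * r + 0.587 * g + 0.114 * b);
}
```

Running this over every pixel of the acquired frame yields the single-channel intensity image that the subsequent processing steps operate on.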

An important step is converting the image format of the acquired video into intensity values. Four digital pins of the Arduino are set as inputs to the L293D IC, two pins on either side. This is due to the wiring of the Wireless SD Shield: it uses the Arduino RX/TX ports, normally used for native serial connections, to communicate with the RN-XV (which is configured via commands sent over serial).

From there we can add some code to our loop() to make the Arduino turn on its LED when connected to the Node.js server and to accept a TCP response from the server. To upload sketches to the board, we connected it to our computer with a USB cable as usual. Higher-resolution images could be used to get better estimates, but because of resource constraints we could not use a high-definition camera, which would capture images cleanly regardless of the subject's distance and angle, and of whether it is perpendicular to the projection plane.

We have two plans for the future: making the robot fully autonomous with an onboard processing system, or keeping the processing in the control station, which is currently in progress due to the resource constraints. ArduCAM and IP camera modules are available, and we are currently working on connecting one of these cameras to the Arduino so that the processing system stays off-board, in the central control station. In a word, we are trying to bring the robot under Internet control, with or without a piconet: either through an intermediate access point or through a direct connection between the control station and the robot.

Figure 4.2: Video Acquisition using IMAQ Vision Acquisition Express.

Images

Figure 1.1: Basic flow of the System
Figure 2.1: Articulated welding robots used in a factory, a type of industrial robot.
Table 2.1: History of Robotics and Robots
Figure 2.2: An intelligent AGV drops off goods without needing lines or beacons in the workspace