Development of an Integrated Control System for a Humanoid Robot

Academic year: 2023


An intelligent vision algorithm is one of the most reliable and effective ways to develop the control system of an autonomous robot, as it can extract the maximum amount of real-time data from the environment. Such an integrated control system has been implemented on MISTBOY, an ongoing humanoid robot project. This paper also presents the integrated vision-based control system, together with results analyzed through mathematical derivation and simulation.

It is hereby declared that the work presented in this thesis is the result of the research conducted by the following students under the supervision of Dr.

Introduction

History of robot and robotics

Essential characteristics and components of robot

Applications of Robots

UNIMATE, the first industrial robot, went online at a General Motors car factory in 1961. A programmer is the person who gives the robot its "mind"; the robot needs a way to receive the program so that it knows what to do. Basically, a robot has five main components: (a) manipulator, (b) end effectors, (c) motion device, (d) controller, and (e) sensors.

Space robots - robots employed in space are treated here as a separate category.

Literature Overview

  • Vision system in humanoid robot
  • Control system in humanoid robot
  • Current research Projects
  • Ethical considerations

The optical vision system is designed to detect obstacles in the path of the robot. Robot soccer players rely heavily on their vision systems in unpredictable and dynamic environments. The joint controller, motor drive, battery, sensors, and main controller (PC) are designed to be installed in the robot itself.

The increased stiffness improves the stability of the robot by minimizing the uncertainty of the joint positions and improving vibration control of the links.

Figure 2.1 KHR humanoid robot

Balancing with Sensors

  • Accelerometer
  • Gyro
  • Accelerometer and Gyro combination
  • Filter

As the tines rotate, the Coriolis effect produces a force perpendicular to the tines of the fork. Prediction is the operation used to calculate the current state of the system from the previous state, assuming constant acceleration. z_k is the measured value of the system, while x_k refers to the calculated (estimated) state.
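The constant-acceleration prediction step can be written very compactly. The sketch below is a minimal, hypothetical illustration in C; the one-dimensional state layout and the variable names are assumptions for the example and are not taken from the thesis code.

```c
/* Minimal sketch of a constant-acceleration prediction step.
 * State layout and names are illustrative assumptions. */
#include <stdio.h>

typedef struct {
    double x;   /* estimated angle or position       */
    double v;   /* estimated rate (angular velocity) */
} State;

/* Predict the current state x_k from the previous state x_{k-1},
 * assuming the acceleration a stays constant over the step dt. */
State predict(State prev, double a, double dt)
{
    State next;
    next.x = prev.x + prev.v * dt + 0.5 * a * dt * dt;
    next.v = prev.v + a * dt;
    return next;
}

int main(void)
{
    State s = { 0.0, 0.0 };
    s = predict(s, 0.1, 0.01);   /* one 10 ms step at 0.1 unit/s^2 */
    printf("x = %f, v = %f\n", s.x, s.v);
    return 0;
}
```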

The second term on the right-hand side (0.02 × accelerometer angle) is the low-pass filter that acts on the accelerometer.
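For reference, the sketch below shows a complementary filter of the kind the text describes, assuming the conventional form in which the 0.02 accelerometer weight is paired with a 0.98 weight on the integrated gyro rate; the function name and units are illustrative and are not taken from the MISTBOY firmware.

```c
/* Illustrative complementary filter combining gyro rate and accelerometer
 * angle. The 0.98/0.02 split follows the text; everything else is an
 * assumption for the sake of the example. */
static double angle = 0.0;   /* filtered tilt angle in degrees */

double complementary_filter(double gyro_rate_dps, double accel_angle_deg,
                            double dt)
{
    /* High-pass path: integrate the gyro rate onto the previous angle. */
    /* Low-pass path: pull the estimate toward the accelerometer angle. */
    angle = 0.98 * (angle + gyro_rate_dps * dt) + 0.02 * accel_angle_deg;
    return angle;
}
```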

Figure 3.1 Accelerometer IC MMA7455L

Vision

Automatic camera calibration

Image processing

  • Basic concepts
  • Techniques of Image processing

Enhancement

  • Histogram
  • Making edges more prominent

Methods of matching

Interfacing and mapping with Arduino

So the global location of the point will be (X1, Y1, Z1) if two cameras are used instead of one. In many cases, the overall performance of a machine vision system depends heavily on the accuracy of camera calibration. Some points in N4, ND, and N8 may fall outside the image when P lies on the edge of the image.
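As an illustration of how two cameras recover a global 3-D point, the sketch below triangulates depth from the horizontal disparity of a rectified stereo pair; the pinhole model, focal length, baseline, and variable names are assumptions for the example and are not taken from the thesis.

```c
/* Hypothetical stereo triangulation for a rectified camera pair.
 * Z = f * B / d, where d = xL - xR is the disparity in pixels,
 * f is the focal length in pixels and B is the baseline in metres.
 * xL, yL, xR are pixel coordinates measured from the principal point. */
#include <math.h>

typedef struct { double X, Y, Z; } Point3D;

Point3D triangulate(double xL, double yL, double xR,
                    double f, double B)
{
    Point3D p = { 0.0, 0.0, 0.0 };
    double d = xL - xR;              /* disparity */
    if (fabs(d) < 1e-6)
        return p;                    /* point at (near) infinity */
    p.Z = f * B / d;
    p.X = xL * p.Z / f;              /* back-project the left-image pixel */
    p.Y = yL * p.Z / f;
    return p;
}
```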

In this case, the distance between two pixels will depend on the values of the pixels along the path, as well as the values of their neighbors. The enhancement does not increase the intrinsic information content of the data, but it increases the dynamic range of the selected features so that they can be detected more easily. The shape of an image's histogram gives useful information about the possibility of contrast enhancement.
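To make the histogram discussion concrete, the sketch below computes an 8-bit grayscale histogram and applies a simple linear contrast stretch; this is a generic illustration, not the enhancement routine used in the thesis.

```c
/* Generic 8-bit histogram and linear contrast stretch (illustrative only). */
#include <stdint.h>
#include <stddef.h>

void histogram(const uint8_t *img, size_t n, unsigned int hist[256])
{
    for (int i = 0; i < 256; i++) hist[i] = 0;
    for (size_t i = 0; i < n; i++) hist[img[i]]++;
}

/* Stretch the occupied gray-level range [lo, hi] to the full [0, 255]
 * dynamic range. This increases contrast without adding information,
 * exactly as the text notes. */
void contrast_stretch(uint8_t *img, size_t n)
{
    uint8_t lo = 255, hi = 0;
    for (size_t i = 0; i < n; i++) {
        if (img[i] < lo) lo = img[i];
        if (img[i] > hi) hi = img[i];
    }
    if (hi == lo) return;            /* flat image, nothing to stretch */
    for (size_t i = 0; i < n; i++)
        img[i] = (uint8_t)((img[i] - lo) * 255 / (hi - lo));
}
```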

Pattern matching finds template matches despite poor lighting, blur, noise, and displacement or rotation of the template. When using color pattern matching to search for a template, the software uses the color information in the template to look for occurrences of the template in the image. The software then applies grayscale pattern matching in a region around each of these occurrences to find the exact position of the template in the image.

The size of the search area depends on the given inputs, such as the search strategy and the color sensitivity. The reaction time of a soccer-playing robot must be as fast as possible to match the speed of the ball. However, it is not possible to use an onboard processor due to size and weight-balancing constraints.

Use the first part of the color location algorithm to find the template instance in the image.
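To make the two-stage procedure above concrete, the sketch below refines a candidate location returned by the coarse color stage using a simple grayscale comparison (sum of absolute differences); the image layout, window size, and function names are illustrative assumptions and are not NI Vision APIs.

```c
/* Illustrative second stage of the two-step matching described above:
 * given a candidate location from the coarse color stage, refine it by
 * grayscale matching in a small search window. */
#include <stdint.h>
#include <stdlib.h>
#include <limits.h>

typedef struct { int x, y; } Point;

static long sad(const uint8_t *img, int iw,
                const uint8_t *tpl, int tw, int th,
                int ox, int oy)
{
    long s = 0;
    for (int y = 0; y < th; y++)
        for (int x = 0; x < tw; x++)
            s += labs((long)img[(oy + y) * iw + ox + x] -
                      (long)tpl[y * tw + x]);
    return s;
}

/* Search a (2*r+1) x (2*r+1) window around the color-stage candidate. */
Point refine_match(const uint8_t *img, int iw, int ih,
                   const uint8_t *tpl, int tw, int th,
                   Point candidate, int r)
{
    Point best = candidate;
    long best_score = LONG_MAX;
    for (int dy = -r; dy <= r; dy++) {
        for (int dx = -r; dx <= r; dx++) {
            int ox = candidate.x + dx, oy = candidate.y + dy;
            if (ox < 0 || oy < 0 || ox + tw > iw || oy + th > ih)
                continue;            /* window falls off the image */
            long s = sad(img, iw, tpl, tw, th, ox, oy);
            if (s < best_score) { best_score = s; best = (Point){ ox, oy }; }
        }
    }
    return best;
}
```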

Figure 4.1 Imaging geometry

Distance Measurement

Sonar

IR (Infrared Ray)

Some researchers have delved deeper into monocular vision to try to overcome the shortcomings of the stereo vision measurement system. Because the movement of the camera on the robot arm is omnidirectional, finding the corresponding points on the images can be computationally expensive. However, they have non-linear characteristics and depend on the reflection properties of the object surfaces.

Some IR sensors described in the bibliography are based on phase shift measurement and provide average resolution from 5 cm to 10 m, but these are very expensive. They can be used together when the advantages of one offset the disadvantages of the other. The integration of information provided by multiple US and IR sensors can be a means to cope with the spatial uncertainty of unknown and unstructured environments in some advanced robotics applications, such as flexible industrial automation, service robotics, and autonomous mobility.

When light energy hits a surface, some of the light is scattered or absorbed, and the rest of the energy is reflected. Again, the energy absorbed by the phototransistors is a function of the intensity (I), the distance traveled (2l) and the area (A) of the sensor. To simplify the calculation of the surface properties and distance to the obstacle, the relative angle of the sensor to the surface should be determined.

As shown in the corresponding figure, the spike occurs where the direction of the IR signal coincides with the surface normal (α = 0). After obtaining the properties of the surface and its relative angle, it becomes easier to calculate the distance.
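One common way such an intensity reading could be turned into a distance is sketched below, assuming an inverse-square falloff over the round-trip path 2l and a cos(α) correction for the surface angle; the constant k, the model, and the names are illustrative assumptions rather than the sensor model used in the thesis.

```c
/* Hypothetical IR range model: received energy E ~ k * cos(alpha) / (2l)^2,
 * solved for the distance l. The reflectivity/gain constant k would have to
 * be calibrated against a surface at a known distance; this is only a sketch. */
#include <math.h>

double ir_distance(double energy, double alpha_rad, double k)
{
    if (energy <= 0.0)
        return -1.0;                               /* no usable return signal */
    double round_trip_sq = k * cos(alpha_rad) / energy;   /* (2l)^2 */
    if (round_trip_sq <= 0.0)
        return -1.0;
    return 0.5 * sqrt(round_trip_sq);              /* l = sqrt((2l)^2) / 2 */
}
```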

Figure 5.1 Basic principle of sonar sensor

Experimental Procedure for Methodology Assessment

Power calculation

The NI Vision Acquisition Wizard is started by placing the Express VI on the block diagram. Select an acquisition source and configure an acquisition using NI-IMAQ or NI-IMAQdx, or simulate an acquisition by reading an AVI file or image files from a folder. Once the acquisition is configured, select the controls and indicators to be set programmatically in LabVIEW.

Use IMAQ Dispose.vi to dispose of the images read by the Express VI when they are no longer needed. When this Express VI is placed on the block diagram, the NI Vision Assistant is launched. Once you have created an algorithm, you can select the controls and indicators that you want to be able to set programmatically in LabVIEW.

When additional functions or image-processing and matching steps are enabled, the input and output configuration of the Vision Assistant grows, as shown in Figures 2(a) and 2(b). Right-click the function and select Array Size from the shortcut menu to set the number of elements in the array. Matched Image: to display matched images, the input images are first converted to strings and then the strings are converted to arrays.

Then the image is converted to the appropriate data type so that Overlay and Matching can be applied to it. Here too, the program combines all the data to present the result as a real-time image.

Experimental Results and Observation

  • Color Matching
  • Pattern matching
  • Color pattern matching
  • Histogram
  • Accelerometer and Gyro results
  • Integrated Control system (ICS)
  • Walking mechanism

Because the grayscale image has lost its color properties, pattern matching cannot identify deviations in color. The second limitation is that pattern matching cannot distinguish between objects of different colors with similar patterns, as shown in Figure 7.6(b). In Figure 7.8, NI color pattern matching cannot detect the red ball when the minimum match score is 750 and the weight of the color result is 500. Because the weight of the color matching result is low, the pattern matching cannot give a good result even on a white background.

Color weight calibration is very important in color pattern matching because the matching is based on both the color matching score and the pattern matching score. Otherwise, it is not possible to track the ball, because the pattern matching produces a very small score, below the minimum match score, as in Figure 7.10. As a result, NI Vision cannot detect the desired ball, as shown in Figure 7.10.

As the weight of the color score decreases, the vision system begins to track the ball again, as in Figure 7.11. In Figure 7.12, NI color pattern matching cannot detect the red ball when the minimum match score is 750 and the color score weight is 500. Since the color matching score weight is low and the pattern matching also cannot give a good result on the white background, the total matching score remains below 750. In Figures 7.15 and 7.13(a), when the red ball appears in front of the camera, the intensity as well as the red pixel value increases. On the other hand, in Figures 7.14 and 7.13(b), when the white ball or other white objects appear, the pixel values become very low. By observing the histogram, the MISTBOY vision system can easily detect the color of the ball, which makes the color pattern matching algorithm more accurate.
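The interplay between the color weight, the two scores, and the 750 threshold can be illustrated with a simple weighted combination. The sketch below is hypothetical: it is not NI Vision's internal formula, and the 0-1000 score range, weighting scheme, and function name are assumptions made only to show why a poor pattern score can keep the total below the minimum match score.

```c
/* Hypothetical weighted combination of color and pattern scores against a
 * minimum match score. NOT NI Vision's internal formula; scores and weights
 * are assumed to lie in the range 0..1000. */
#include <stdio.h>

int accept_match(int color_score, int pattern_score,
                 int color_weight, int min_match_score)
{
    /* color_weight of 1000 would mean "use only the color score". */
    long total = ((long)color_score * color_weight +
                  (long)pattern_score * (1000 - color_weight)) / 1000;
    return total >= min_match_score;
}

int main(void)
{
    /* Good color match, poor pattern match on a white background. */
    printf("%d\n", accept_match(900, 600, 500, 750));  /* total 750 -> accepted */
    printf("%d\n", accept_match(900, 550, 500, 750));  /* total 725 -> rejected */
    return 0;
}
```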

Figure 7.16 shows the characteristic graph of the MPU 6050 when MISTBoy is in a steady state. Although the accelerometer and gyro data deviate from the standard condition, the filter output is an almost straight line, as shown in Figure 7.18.

Figure 7.1 Color matching algorithm

Conclusion and Recommendations

Recommendations

Future Work

A systematic approach is presented in this thesis to derive the parameters of the integrated control system of a humanoid robot, based on its ability to sense and make decisions. The matching techniques and algorithms used in the vision system are discussed, and real-time images are analyzed to locate the ball in different critical states and to identify the best matching technique for a humanoid robot. These discussions are very useful for designing, and for increasing the accuracy of, the vision and control system of a humanoid robot.

There are robot theories such as D-H parameters and inverse kinematics to analyze this motion, although applying them is hard work because the matrices need to be implemented in C. The best and safest option is to use Dynamixel motors, because they are accurate and powerful, although they are very expensive. In this project, a complementary filter is used to remove the noise from the MPU 6050 values.
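For reference, the kind of matrix that has to be coded when D-H parameters are used is the standard 4x4 link transform shown below. This is the generic Denavit-Hartenberg convention, not MISTBoy-specific parameter values.

```c
/* Generic Denavit-Hartenberg link transform in C (standard convention). */
#include <math.h>

/* Fill T (row-major 4x4 homogeneous transform) for one link, given the
 * joint angle theta, link offset d, link length a and link twist alpha. */
void dh_transform(double theta, double d, double a, double alpha,
                  double T[4][4])
{
    double ct = cos(theta), st = sin(theta);
    double ca = cos(alpha), sa = sin(alpha);

    T[0][0] = ct;  T[0][1] = -st * ca; T[0][2] =  st * sa; T[0][3] = a * ct;
    T[1][0] = st;  T[1][1] =  ct * ca; T[1][2] = -ct * sa; T[1][3] = a * st;
    T[2][0] = 0.0; T[2][1] =  sa;      T[2][2] =  ca;      T[2][3] = d;
    T[3][0] = 0.0; T[3][1] =  0.0;     T[3][2] =  0.0;     T[3][3] = 1.0;
}
```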

It is recommended to try other filters to find out which ones are suitable for balancing a humanoid robot. An open-source and popular vision library is OpenCV (Open Source Computer Vision), which is also recommended for those who want to continue this project in the future. The design and function of MISTBoy can be simulated to analyze the dynamic motion of the system, and the design can therefore be optimized.

Then artificial intelligence and the vision system can be introduced to give MISTBoy the intelligence of a football player.
