Body Temperature Measurement Tool for Early Detection of COVID-19 Based on Interactive Augmented Reality Technology and Sensor MLX90614:
Framework and Prototyping
Subandi1,*, Agus Setiyo Budi Nugroho1, Nurkamilia1, Aulia Akhrian Syahidi2
1 Department of Electrical Engineering, Politeknik Negeri Banjarmasin, Banjarmasin, Indonesia
2 Augmented Reality and Virtual Reality Laboratory, Politeknik Negeri Banjarmasin, Banjarmasin, Indonesia
Email: 1,* [email protected], 2 [email protected], 3 [email protected], 4 [email protected]
Submitted 14-06-2021; Accepted 20-10-2021; Published 30-10-2021
Abstract
The COVID-19 pandemic is still ongoing and it is uncertain when it will end, so contributions from universities are needed in the form of innovations that help anticipate the spread of COVID-19. This paper reports on the development of a body temperature measurement application for early detection of COVID-19 based on Interactive Augmented Reality (IAR) technology and the MLX90614 sensor, covering its framework and prototyping. The application was developed with Extreme Programming (XP), a type of Agile process, and its features were evaluated with black-box testing and a precision test. The resulting application framework works as follows: the user opens the IAR camera application and points the MLX90614 sensor at the target, the target then faces the IAR camera, and the body temperature detection results are tracked and visualized on the IAR user interface. The IAR camera generates the visualization of the detection result information on the user interface model. The black-box test found that all application features function properly, and the precision test yielded 94.44%, which falls in the very good category. In future work, the features and performance of the application will be improved and the device will be made more compact so that it can be used by the general public.
Keywords: COVID-19; Face Tracking; Interactive Augmented Reality; Markerless Tracking; MLX90614 Infrared Temperature Sensor; Thermal Detection
1. INTRODUCTION
Coronavirus infection, known as COVID-19 (Coronavirus Disease 2019), was first identified in the city of Wuhan, China at the end of December 2019. Since then, COVID-19 has spread very quickly to almost all countries within just a few months, including Indonesia and in particular the city of Banjarmasin. Efforts by various countries to anticipate the spread of COVID-19, such as imposing Large-Scale Social Restrictions (PSBB), have also prompted various kinds of innovation.
Higher education, and vocational higher education in particular, also has a very important role in contributing innovations that anticipate the spread of COVID-19. In the realm of IT for social and health applications, such a contribution is well suited to implementing technology in which the social and health fields collaborate for mutual benefit.
One such contribution is the research proposed here: the development of an application for measuring body temperature for early detection of COVID-19. Thermal scanners or body temperature detectors are often used to find out whether a person has a fever, because fever is stated as one of the symptoms experienced by people with COVID-19. The technology proposed for detecting human body temperature combines the MLX90614 Infrared Temperature Sensor, programmed through an Arduino device, with a smartphone running Interactive Augmented Reality (IAR) using the markerless tracking method. This draws on object detection, digital image processing (including pattern recognition and face tracking/recognition), and 2D or 3D modeling that is deliberately generated to present information in an engaging visual form. Augmented reality technology is currently a trend and has been adopted by industry in various fields, including the health sector.
2. RESEARCH METHODOLOGY
2.1 Related Works
After carrying out the search process, similar research applying Augmented Reality (AR) technology was found. However, these studies were conducted before the COVID-19 pandemic outbreak, each with its own research objectives. First, research from [1] developed an AR-based thermal detection application, where a configured thermal sensor generates thermal data for physical objects that have been detected by AR devices.
Second, research from [2] uses AR technology to detect temperature and humidity in plant rooms. The monitored data come from detection using the HSM20G sensor. This sensor is connected to a microcontroller, which then sends the data to a computer wirelessly using a Bluetooth module. A processing application is used to create 3D animations in the form of blocks as indicators of the temperature and humidity values, and marker identification through the camera of the AR system is used to generate the 3D animation. The temperature data detection accuracy is between 93.4% and 100%.
Furthermore, the researchers also found that industry plans to propose something similar: [3] states that an AI technology company from China called Rokid will develop body temperature detection glasses combined with AR technology. Because similar studies are still few, this research has a large opportunity and gap to fill.
2.2 Basic Theory
2.2.1 Interactive Augmented Reality
According to [4], Augmented Reality, often referred to as AR, is a combination of digital objects in two-dimensional (2D) and three-dimensional (3D) forms with the real world. AR is a system with the following characteristics: (a) it combines real and virtual environments; (b) it is interactive in real time; and (c) it is integrated in three dimensions (3D). The goal of an AR system is to improve user perception of, and interaction with, the real world through virtual 3D objects that appear to coexist in the same space as the real world.
Virtual Reality (VR) and Augmented Reality (AR) are technologies that are both used to model the real world in computer systems to assist and support human activities. When AR technology is used, the user interacts with a virtual space while still retaining the feel of real life [5].
Next, [6] and [7] explain that AR is the integration of digital information with the user's environment in real time: AR technology uses the real-world environment and adds new information to it with the help of computers, webcams, smartphones, and special glasses.
Furthermore, we call it Interactive Augmented Reality (IAR) because this technology can be enhanced with various interactions/multi-interactions and is very interactive for its users, including in generating 2D/3D objects or other striking forms of visualization. IAR must provide a visualization of information, not just display 2D/3D objects; researchers must pay attention to this so that the usefulness of the application is actually felt.
2.2.2 Detection Method in Augmented Reality
The approaches and detection methods contained in AR are as follows:
a. Marker-Based Tracking
Marker-based tracking is a tracking method using markers. The markers used are special markers such as barcodes or black frames, QR codes, and specially printed markers, where the pattern is processed so that it can be read and recognized by a computer via a smartphone camera or a camera connected to a computer [6].
Furthermore, [8] explains that the marker-based tracking method utilizes a marker in the form of a black-and-white illustration, such as a square, or an image illustration with a certain color and shape. In general, this method requires several things for its processing, such as a computer or mobile device equipped with a camera and sensors that support AR, the AR application, and the markers. The system flow is that the AR application accesses the device camera, the system detects a marker via the camera, and a virtual object is then displayed over the marker on the device screen.
b. Markerless Tracking
Markerless tracking uses natural features of the object itself as the marker, rather than a dedicated printed marker [6]. Markerless tracking does not require a marker to display virtual objects; virtual objects are projected by relying on part of the surrounding environment as the target [8]. This method is usually used for (1) face tracking, which tracks by recognizing the position of the human eyes, nose, and mouth, as in familiar features of applications such as Instagram and Snapchat; (2) 3D object tracking, which tracks by recognizing objects in the surroundings such as cars, buildings, and tables; (3) motion tracking, which tracks by detecting motion and is usually used to produce films that simulate movement; and (4) GPS-based tracking, which tracks by accessing GPS (Global Positioning System) and a compass sensor and then displays the object in the desired direction, as in the Pokemon-Go game, which is the basis for the development of current AR applications.
2.2.3 MLX90614 Sensor
The MLX90614 sensor is a contactless temperature sensor: to measure the temperature of an object, the sensor does not require direct contact with it. The MLX90614 is simply pointed at the object whose temperature is to be measured [9].
The MLX90614 works by absorbing the infrared light emitted by an object. Since the sensor is not in physical contact with the object being measured, it has a wide measurement range, from -70°C to +380°C. Infrared radiation is a part of the electromagnetic spectrum with wavelengths from 0.7 to 1000 microns, of which only 0.7-14 microns can be used to measure temperature, because the intensity of the infrared energy emitted by an object increases with its temperature. The MLX90614 is specifically designed to detect infrared radiation energy and automatically calibrates that energy into a temperature scale.
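As an illustration of how such a sensor is typically read from an Arduino, the following is a minimal sketch assuming the widely used Adafruit_MLX90614 library and standard I2C wiring; the paper does not state which library or wiring the prototype actually uses.

// Minimal illustrative sketch (assumption: Adafruit_MLX90614 library, default I2C pins).
#include <Wire.h>
#include <Adafruit_MLX90614.h>

Adafruit_MLX90614 mlx = Adafruit_MLX90614();

void setup() {
  Serial.begin(9600);
  mlx.begin();                        // initialize the sensor over I2C
}

void loop() {
  // readObjectTempC() returns the contactless (infrared) temperature of the target,
  // readAmbientTempC() the temperature around the sensor itself.
  Serial.print("Object: ");
  Serial.print(mlx.readObjectTempC());
  Serial.print(" C  Ambient: ");
  Serial.print(mlx.readAmbientTempC());
  Serial.println(" C");
  delay(500);
}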
2.2.4 Extreme Programming
Extreme Programming, often called XP, is an agile software development methodology that has four performance phases, namely planning, design, coding, and testing. XP is the most widely used agile process. It suggests several innovative and robust techniques that enable agile teams to create frequent software releases providing the features and functionality that stakeholders have described and prioritized [10].
The XP development model has the following stages. The planning stage explores and collects the required information. At the design stage, a simple design is recommended because it tends to be preferred. The coding stage creates the application architecture through the coding process. After that comes the testing stage, in which the application is tested to see whether it runs well or not. Finally comes the software increment, the area in which the software being developed evolves; any shortcomings and additional needs are addressed here, because XP is an agile process that requires the software to be released quickly while accepting that errors or bugs may still occur.
2.2.5 Integrated Embedded Systems with Augmented Reality
Traditional AR systems visualize 3D models virtually from the real world, and these have now developed into intelligent and interactive AR that relates object contexts to physical objects [11]. AR is used to monitor sensor data supplied through an embedded system [12]. Embedded systems can play an important role in creating sophisticated, intelligent, and affordable scene generators for AR systems.
Research recently carried out by [13] states that embedded systems such as the Internet of Things (IoT) can be combined with AR technology: IoT devices, as embedded systems, can be used as a standard means of remote monitoring and control, while AR is applied to user interface interactions that visualize what the IoT devices are doing, accessed virtually in real time to produce a good user experience.
2.3 Research Method
The research methodology for developing the application uses Extreme Programming (XP), which consists of several stages. The XP development model was adapted from [14] and can be seen in Figure 1.
Figure 1. Extreme Programming Development Model
Based on Figure 1, the main stages of the XP model consist of planning, design, coding, and testing, followed by the software increment as a complement and characteristic of this model. The stages are described in detail in the following sub-chapters.
2.3.1 Planning Stages
The planning stage consists of observation, literature study, and requirements analysis. (1) Observation is direct observation of the phenomenon that occurs, namely the spread of COVID-19 and the ways to overcome or anticipate it; the observations made are then documented as a basis for developing the application. (2) Literature study means reviewing theoretical literature relevant to the field of research and what other researchers have done on the same topic; the reference sources used are national and international research journals, books, scientific papers or articles, and mass media, both printed and digital. (3) Requirements analysis means analyzing the user, software, and hardware requirements, which in detail will guide the application development and the features provided.
2.3.2 Design Stages
The design stage covers the application workflow or framework design, how the application is operated, and the outputs, using methods adapted to the needs analysis from the earlier stage to solve the stated problems. This makes it easier for programmers and other parties involved in writing the program code, because the direction the application will take and the flow inside and outside the application are already clear. The design of the application user interface is also included in this stage.
2.3.3 Coding Stages
To be understood by a machine, in this case a computer, the design must be transformed into a programming language through the coding process. This stage is the implementation of the design stage and is technically done by the programmer.
2.3.4 Testing Stages
Anything that is built must be tested, and software/hardware is no exception. All software/hardware functions must be tested to minimize errors/bugs and to ensure that the results meet the predefined needs.
The tests in this study consist of black-box testing (assessing whether or not each feature functions) and the level of precision of body temperature detection (comparing the application's detection with detection using a non-application thermometer). Black-box testing is carried out in the form of a questionnaire, and several examiners are asked for their comments, suggestions, and criticisms. After the questionnaire has been filled in by the testers/users, an evaluation is carried out to correct deficiencies in the application until it becomes a genuinely good application that meets user needs.
2.3.5 Software Increment Stages
After the application is tested internally by an examiner, a release and software increment are carried out. This process allows the application to be repaired directly and quickly. Tests are then carried out in the relevant environment with several users, the software increment is done again to fix the remaining deficiencies, and the application is released once more. The same cycle continues to correct software/hardware deficiencies and errors as an agile means of meeting user needs.
3. RESULTS AND DISCUSSION
3.1 User and Software/Hardware Requirements
The user requirement is that users can detect a person's body temperature through the MLX90614 sensor and obtain information in the form of a visualization of the detected temperature and status, along with the detection time, through the IAR application.
The software requirements are Corel Draw X8, Blender 3D, Unity3D Pro, Vuforia, the Android Software Development Kit (SDK), the Java Development Kit (JDK), and the Arduino IDE. The hardware requirements consist of a thermal sensor of type MLX90614, a complete Arduino Uno device, a smartphone, and an OTG cable (as the link between the Arduino and the smartphone).
3.2 Design
The elaboration of the application design stage process is as follows:
3.2.1 Application Framework
The framework of application design is described through a flowchart, which can be seen in Figure 2.
Figure 2. Application Framework
Based on Figure 2, the application work process begins by directing the MLX90614 sensor at the face/area of the human body. To obtain a visualization of the results, face detection is required: the target points their face at a smartphone camera with the IAR application installed. If face recognition is successful, a 2D object appears in the form of a visualization of the body temperature detection results, positioned at the eyebrows and centered on the nose line, along with the status, the detection time, and a button to exit the IAR camera. If face tracking does not work, no visualization appears on the smartphone and the face tracking process must be repeated.
3.2.2 Use Case Diagram Design
The use case diagram design of this application modeling can be seen in Figure 3.
Figure 3. Use Case Diagram
Based on Figure 3, there is only one actor involved, namely the User, and several main use cases, namely: Open Application, Main Menu Page, Access IAR Camera Page, Instructions Page, and Close Application. The Access IAR Camera Page use case consists of the sub-use cases Target Face Detection and Target Body Temperature Detection, which in turn bring up the Display Detection Information use case.
3.2.3 Activity Diagram Design
The activity diagram design from modeling this application can be seen in Figure 4.
Figure 4. Activity Diagram
Figure 4 shows the activity diagram design of the application to be built. It consists of two swim lanes, namely User and Application, each containing the various activities they perform.
3.2.4 Mock-Up Design
The proposed mock-up design, tailored to the user's needs for the application, can be seen in Figure 5.
Figure 5. Mock-Up
Based on the mock-up design in Figure 5, the user interface to be built consists of visualizations generated by the IAR technology, such as information on the body temperature detection result, the body temperature status, the time and date, the temperature rate, and a button to exit the IAR camera.
3.3 Coding Process
In the coding process, the programmer uses Unity to create the user interface of the IAR application and enters markerless patterns in the form of the human face position (face circumference, eyebrow position, and nose line) (see Figure 7). The body temperature detection is obtained from the MLX90614 sensor connected to the Arduino device, which has been programmed using the Arduino IDE (see Figure 6).
Figure 6. MLX90614 Sensor and Arduino Kit
The Arduino device is then connected via an OTG cable to a smartphone that has the IAR application installed. The resulting temperature detection application based on Interactive Augmented Reality can be seen in Figure 7.
Figure 7 shows an example of the IAR application user interface that has been built. In the simulation, the user points the MLX90614 sensor (which can be placed beside the smartphone camera or on the other side) at the body/face of the person whose temperature is to be checked, and the reading is recorded temporarily on the Arduino device. To see the detection results, the user opens the IAR application (goes to the IAR camera page) and the person being detected faces the smartphone running the IAR application. Facial tracking then begins by recognizing the circumference/shape of the face; if it succeeds, the body temperature detection results appear as an information visualization, with the temperature value placed just above the person's eyebrows, centered on the line of the nose.
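A minimal sketch of the Arduino side of this link is given below, assuming the same Adafruit_MLX90614 library as above and a simple line-based serial message; the "TEMP:<value>" format is a hypothetical choice, since the paper does not specify the protocol used over the OTG connection.

// Hypothetical Arduino sketch: keep the latest MLX90614 reading and stream it
// over USB serial so the smartphone connected via the OTG cable can read and
// visualize it. The "TEMP:<value>" message format is an assumption, not the
// paper's actual protocol.
#include <Wire.h>
#include <Adafruit_MLX90614.h>

Adafruit_MLX90614 mlx = Adafruit_MLX90614();
float lastReadingC = 0.0;            // most recent object temperature, held on the Arduino

void setup() {
  Serial.begin(115200);              // USB serial, exposed to the smartphone through OTG
  mlx.begin();
}

void loop() {
  lastReadingC = mlx.readObjectTempC();
  Serial.print("TEMP:");
  Serial.println(lastReadingC, 1);   // one decimal place, e.g. "TEMP:36.7"
  delay(1000);
}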
Figure 7. IAR User Interface
Furthermore, the body temperature status information is intended to interpret the temperature value into one of several categories (see Figure 8). The application also shows the time and date of detection and an exit button used to leave the IAR camera and return to the home page of the application.
In the detection results in Figure 7 there are two targets: one with a temperature of 36.7°C, which falls into the NORMAL body temperature category, and one with 37.1°C, which also falls into the NORMAL category. The obstacles faced in this study include the device not yet being fully integrated (the MLX90614 sensor hardware is separate from the smartphone running the IAR application) and the limited number of targets with above-normal temperatures for testing. The reference used for categorizing human body temperature was adopted from [15] and can be seen in Figure 8.
Figure 8. Categorization of Human Body Temperature
Based on Figure 8, the categories of human body temperature are Hypothermia (<35°C), Normal (36.5°C-37.5°C), Fever/Hyperthermia (>37.5°C to 38.3°C), and Hyperpyrexia (>40°C or 41°C).
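A minimal sketch of how these categories could be mapped in code is shown below; the handling of readings that fall between the listed ranges (for example 35.0-36.5°C or 38.3-40.0°C) is not specified in the paper and is an assumption here.

// Map a temperature reading to the categories listed in Figure 8.
// Boundary handling for values between the listed ranges is an assumption.
#include <iostream>
#include <string>

std::string categorize(float tempC) {
    if (tempC < 35.0f)  return "HYPOTHERMIA";          // <35 C
    if (tempC <= 37.5f) return "NORMAL";               // 36.5-37.5 C; values up to 37.5 C treated as normal here
    if (tempC <= 38.3f) return "FEVER/HYPERTHERMIA";   // >37.5 C to 38.3 C
    return "HYPERPYREXIA";                             // listed as >40 C; all higher readings grouped here as an assumption
}

int main() {
    // The two targets in Figure 7 both fall into the NORMAL category.
    std::cout << categorize(36.7f) << "\n";
    std::cout << categorize(37.1f) << "\n";
    return 0;
}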
3.4 Test Results
Testing was carried out using the black-box testing method to check internally whether all features function as expected. A total of 30 respondents (testers and users) took part in the black-box testing process. The overall result of the black-box testing is that all tested features function as expected, although this does not rule out further enhancement of the ten tested features, or even the addition of other features to increase interaction and visualization.
Furthermore, a precision test was carried out. The precision of a measurement system, also known as reproducibility or repeatability, is the extent to which repeated measurements under unchanged conditions give the same results. In this study, precision was measured by comparing the detection results of the developed application with the value shown on a regular thermometer; 30 participants/targets were involved, each measured at different times over three test iterations.
The test data therefore comprise 90 measurements: 30 targets multiplied by three iterations. Of these, 85 measurements matched the thermometer reading and 5 did not. Precision is calculated as the number of matching measurements divided by the sum of matching and non-matching measurements, i.e. 85/(85 + 5) = 0.9444, which multiplied by 100 gives 94.44%, in the very good category. This means that the IAR application that has been built and tested produces far more matching values than mismatching values.
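Written as an equation, the precision calculation above is:

\[
\text{Precision} = \frac{N_{\text{match}}}{N_{\text{match}} + N_{\text{mismatch}}} = \frac{85}{85 + 5} = \frac{85}{90} \approx 0.9444 = 94.44\%
\]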
4. CONCLUSION
Based on the results of the research, it can be concluded that the framework and application prototype were successfully built. The test results consist of black-box testing, in which all the features of the application function as expected, and a precision test with a value of 94.44%, in the very good category, reflecting how often the IAR application's detection results match those of an ordinary thermometer. Future work will improve application performance, make the device more compact in physical form and more integrated, and bring it to the general public.
ACKNOWLEDGMENT
Our gratitude goes to the Center for Research and Community Service at the Politeknik Negeri Banjarmasin, which approved and funded this research through the DIPA of Politeknik Negeri Banjarmasin for the 2020 fiscal year, as an outcome of the lecturer development research scheme. Furthermore, we would like to thank the Interactive Media, Game, and Mobile Technologies (IMGM) Research Group and the Augmented Reality and Virtual Reality Laboratory (ARVR Lab.) for helping to complete this research.
REFERENCES
[1] R. Kumar and M. A. Sararu, “Thermal Detection in an Augmented Reality System,” United States Patent, US 9,536,355 B1, pp. 1-23, 2017.
[2] A. Saleh, M. Akbar, and R. S. Widytra, “Pemantau Suhu dan Kelembaban Ruang Tanaman Berbasis Aplikasi Augmented Reality,” Prosiding Seminar Nasional Teknologi Elektro Terapan, vol. 01, no. 01, pp. 79-84, 2017.
[3] Tekno.Kompas, “Startup China Bikin Kacamata Pendeteksi Gejala Covid-19,” 2020, source: https://tekno.kompas.com/read/2020/04/28/08500067/startup-china-bikin-kacamata-pendeteksi-gejala-covid-19 [accessed on: 20 May 2020].
[4] R. T. Azuma, “A Survey of Augmented Reality,” Hughes Research Laboratories, California: MIT Press, 1997.
[5] H. Tolle, A. Pinandito, E. M. A. Jonemaro, and K. Arai, “Virtual Reality Game Controlled with User’s Head and Body Movement Detection using Smartphone Sensors,” ARPN Journal of Engineering and Applied Sciences, vol. 11, no. 20, pp. 9776-9782, 2015.
[6] A. A. Syahidi, H. Tolle, A. A. Supianto, and K. Arai, “BandoAR: Real-Time Text Based Detection System Using Augmented Reality for Media Translator Banjar Language to Indonesian with Smartphone,” Proceeding of the 5th IEEE International Conference on Engineering Technologies & Applied Science, pp. 1-6, 2018.
[7] A. A. Syahidi, H. Tolle, A. A. Supianto, and K. Arai, “AR-Child: Analysis, Evaluation, and Effect of Using Augmented Reality as a Learning Media for Preschool Children,” Proceeding of the 5th IEEE International Conference on Computing, Engineering, & Design, pp. 1-6, 2020.
[8] U. M. Arief, H. Wibawanto, and A. L. Nastiti, “Membuat Game Augmented Reality (AR) dengan Unity 3D,” Yogyakarta: Andi Publisher, 2019.
[9] M. O. Sibuea, “Pengukuran Suhu dengan Sensor Suhu Inframerah MLX90614 Berbasis Arduino,” Yogyakarta: Teknik Elektro, Universitas Sanata Dharma, 2018.
[10] R. S. Pressman, “Software Engineering: A Practitioner’s Approach,” New York: McGraw-Hill, 2010.
[11] D. Jo and G. J. Kim, “AR Enabled IoT for a Smart and Interactive Environment: A Survey and Future Directions,” Sensors, vol. 19, no. 19, pp. 1-19, 2019.
[12] A. Croatti and A. Ricci, “Mashing up the physical and augmented reality: The Web of Augmented Things Idea,” Proceeding of the 8th International Workshop on the Web of Things, pp. 4-7, 2017.
[13] A. A. Syahidi, K. Arai, H. Tolle, A. A. Supianto, and K. Kiyokawa, “Augmented Reality in the Internet of Things (AR + IoT): A Review,” International Journal of Informatics and Computer Science (IJICS), vol. 5, no. 3, pp. 258-265, 2021.
[14] A. A. Syahidi, Subandi, and A. Mohamed, “AUTOC-AR: A Car Design and Specification as a Work Safety Guide Based on Augmented Reality Technology,” Journal of Technological and Vocational Education, vol. 26, no. 1, pp. 1-8, 2020.
[15] Wikipedia, “Kategori Suhu Tubuh Manusia,” Wikipedia Bahasa Indonesia, 2018, source: https://twitter.com/idwiki/status/948911801042796544 [accessed on: 1 August 2020].