JOURNAL OF THE INTERNATIONAL SOCIETY FOR TELEMEDICINE AND EHEALTH

DEVELOPMENT OF A LOW-COST ELECTRONIC DATA COLLECTION TOOL FOR A HEALTH FACILITY SURVEY STUDY: LESSONS LEARNED IN THE FIELD

Patrick Garcia Sylim MD1, Cecilia Cristina Santos-Acuin MD, PhD2

1 National Telehealth Center, University of the Philippines, Manila

2 Food and Nutrition Research Institute, Department of Science and Technology, Philippines

Abstract

The process of selecting and developing a data collection tool for a health facility survey study is described. Several methodologies were considered, and an Android app development platform was chosen to fulfil the requirements of the study. The platform was adopted for its low resource cost, low capacity requirement, efficient and effective community responsiveness and progressive inclusion of functionalities. Data loss was 3.4%, with proposed contributing factors such as network intermittency, malware leading to disuse (necessitating manual encoding), and asynchrony of system interfaces, though the percentage of loss attributable to each factor is indeterminate.

Several considerations need to be taken into account prior to employing ICTs for research, namely, requirements of the study, resources available, and how each option being considered fulfils the requirements and proves sustainable given the resources. Planning, risk assessment, and maintenance are important phases in the development of the data collection tool.

Keywords: data collection tool; development; information and communication technologies; mobile app; survey tool; Philippines

Introduction

A health facility survey study in the Philippines sought to employ a face-to-face survey methodology to collect data, assisted by a portable electronic data collection tool, as it offered the advantages of real-time monitoring and a responsive feedback mechanism between a data manager and field data collectors, both in resource management and implementation. The main study had several requirements for data collection: accuracy of data entry, frequent updates and visualisation of tabular data, the capacity to operate in conditions with an intermittent network connection, portability, and ease of use. On the development side, it needed a cheap, flexible tool to which responsive enhancements and error corrections could be applied. Close emulation of paper forms was desired in terms of appearance, but with additional data validation mechanisms and other backend functions such as conditional branching. Privacy was not a priority, since the data collection methodology included participant de-identification prior to encoding, but accuracy was paramount.

Mobile surveys designed to be answered directly by participants have distinct advantages and disadvantages. While this data collection method can provide real-time data and boost response rates, it generally costs more to implement, and its effective use limits the spectrum of users to a specific age group and capacity.1 Surveyor-conducted collection mitigates the latter disadvantage. Several studies have used mobile technologies such as the Mobile Data Collection (MDC) app, as it can capture images, geo-locate, and is not expensive;2 however, for this particular study, these functions are secondary to form customisability, which needs a certain degree of programming in MDC to achieve and sustain. Other efforts, such as the rabies app deployed at large scale in Tanzania, were developed from scratch, and while this affords a great degree of functional customisability, with functions such as electronic form entry, SMS reminders, and so on,3 the cost of resources is a significant consideration that deters from this approach.

The project aimed to produce and maintain an electronic data collection tool for the health facility survey study. Specifically, it: i) reviewed and evaluated various options for technologies to be adopted, ii) designed the software and modelled the data based on the needs of the study, iii) developed the tool and conducted alpha and beta testing during training, iv) maintained the software throughout the study, and v) now reports on the process and presents recommendations.

Methodology

Design and development considerations

The project started with a preparatory phase wherein information was gathered in order to decide on the approach to developing an electronic data collection tool. There were several considerations relevant to the design and development of the tool at the different stages of Pre-Development, Development, and subsequent Technical Support and Maintenance.

Pre-development requirements

Monitoring requirements. Because the study spanned several regions of the archipelago, real-time monitoring of and feedback on data was desired. Because of this, paper and phone transfer routes were given lower priority in the solution choice, and non-paper-based data acquisition and transmission was preferred. In addition, the data for monitoring needed to pass through as few points as possible to reduce human error.

Development time. One of the more significant factors affecting the decision was that the study depended on the introduction of a particular vaccine type to the study sites, projected to happen around two months into the study. It was therefore prudent to budget a development time of one to one and a half months, counting the preparatory phase.

Cost. A common consideration when selecting any tool is the cost required to prepare, develop, deploy and maintain it, especially when funding comes from research grants.

Connectivity. The study was designed for field data collection, which implied the risk of connectivity issues for electronic survey tools that need constant network connection to synchronise data.

Privacy and security. Data security was required from the technology, but privacy was not paramount, as the Standard Operating Procedures of the data collection phase included de-identification of the participants before encoding into the electronic data collection tool.

Tool pre-development

Several development options were considered. The first was whether to hire freelance developers or establish a development team. A separate development team would provide a degree of freedom over the appearance and behaviour of the survey forms, especially since the study questionnaires employed workflow branching. If the development team was local, the database could be deployed on a local server of choice, with security standards of choice. For cases where the development team had experience with health-related or research-related projects, certain intricacies of design and code conventions could be addressed early on in the development phase.

This choice had considerable risk for development time and cost, especially if the software were to be made from scratch. It also locked the research team into an application that could not be reused for other research efforts. Risk was also considerable regarding the terms of reference and functional design document with a distinct possibility that the software produced at the end of the development phase would not fulfil the needs of the study. For a survey with a non-complex data model, this risk is relatively low. Another risk identified was the degree of support provision during the data collection phase, especially when untoward occurrences are seen with the software or the devices.

The second was whether or not to use a pre-built survey application. This relieved the burden of developing a survey tool and undergoing multiphasic testing for quality assurance, though it carried the distinct risk of no available pre-built survey forms being suitable for the purposes of a specialised study.

Another option was to use a pre-built renderer with freedom to create forms. The survey application allowed for creation and customisation of forms, overcoming a problem of the first option, and while it partially inherited the advantage of the second option by having a pre-built form renderer, it required training, for a fee, in order to create and customise forms, which added risk to development time and cost.

An Electronic Medical Record (EMR) system like OpenMRS was considered, since it can be accessed through a mobile device; however, a continuous Internet connection was required for the planned data submission model, and modules would be too complex to develop unless done through HTML Form Entry, which does not allow for highly customised form behaviour.

The final option was to find and use a low-cost pre-built renderer, with freedom to create forms, but which did not require training. A commercial product, AppSheet, was found. It is an optional plug-in to an online form cloud service and a corresponding spreadsheet cloud service that allows their conversion into fully working Android applications. The backend renders the form or spreadsheet and synchronises the data into the spreadsheet linked to the form, or the spreadsheet itself. AppSheet itself has an online cloud service where the application metadata is stored. Upon testing, AppSheet was relatively low-cost, required a relatively short development time, was low-risk maintenance-wise, possessed an option for an offline mode, and did not require training or extensive coding experience to use.

Tool development

A spreadsheet cloud service was selected for the database for the timeliness of data updates as well as the convenience of being able to view and analyse the data from any device. The questionnaires were then recreated as online forms, with branching patterns included as allowed by the platform at the time, and converted by the AppSheet plugin into a basic Android app. Several observations were made at this point in the development. The online form was able to handle form branching based on answers to previous questions, but could only handle branching based on one positive condition at the time of development. The plugin also had the limitation of being able to handle only one condition for each branching, but its advantage was that it could handle conditions requiring that an answer was not chosen (meaning the expression was true when any of the other answers was chosen), as compared to a previous online form platform, which gave it more versatility.

The questionnaires contained multi-conditional branching, and thus workarounds were needed. Such branches were represented by multiple dichotomous branchings, and thus the total number of columns representing the succeeding questions in a branch would be raised by a power of two for each condition.

Branching patterns showing examples of fields that do not influence form behaviour (Question 1) and those that initiate form branching (Question 2) are shown in Figure 1. Question 1 would require one column to store the value chosen by the user. Each choice in Question 2 leads to a branch to which an instance of Question 3 is connected. In the example, Question 3 would require two columns to accommodate each branch. If each answer to Question 3 should branch further, the branching would effect a corresponding increase in the number of columns.
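The column growth described above can be sketched numerically. The function below is an illustrative model, not code from the study, of how unrolling each branching condition into a dichotomous split doubles the columns needed for the downstream questions.

```python
def columns_for_branch(downstream_columns: int, conditions: int) -> int:
    """Columns needed to store a set of downstream questions when
    every branching condition is unrolled into a dichotomous split:
    each additional condition doubles the storage requirement."""
    return downstream_columns * 2 ** conditions

# Question 3 from Figure 1: one stored value behind one
# branching condition needs two columns.
print(columns_for_branch(1, 1))  # 2
# Behind three stacked conditions it would need eight.
print(columns_for_branch(1, 3))  # 8
```

This exponential growth is what made the workaround bloat the database, as described in the following paragraphs.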

It was therefore imperative to make modifications to the mobile app through the spreadsheet itself, as the advanced editor based its schemas on the spreadsheets assigned to it. In addition, it was observed to read sheet notes attached to the header cells and copy them over as attributes to each data field whenever the schema was regenerated in the editor, as an alternative to direct column editing, as long as the notes contained the proper attributes and code objects. It should be noted that the AppSheet plugin generates these notes automatically when the mobile app is generated from an online form, and these served as the basis for modifying the mobile app from the spreadsheet.

Figure 1. Example of multi-conditional branching requirements of the survey form.

The workaround mentioned above, while resolving the issue of multi-conditional branching, had the side-effect of considerably increasing the size of the database. This also increased the complexity of database cleaning, since the process was done on the spreadsheet itself, and had a significant impact on the efficiency of database operations, since the spreadsheet saves and refreshes the workbook with every change made. Fortunately, plugins to the online spreadsheet exist to simplify the cleaning process to some degree.

“Summary” columns were placed in the database and given the hidden attribute (and thus, added hidden data fields in the app). “Data Manager View” sheets were then created; they only display the relevant columns, such as the summary columns for branches, through the use of the QUERY functions in the spreadsheet. In subsequent versions of the app other functions will replace the QUERY function for displaying columns that do not require filtering, and QUERY will be retained only for the pre-introduction and post-introduction records.
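In the study this projection was done with the spreadsheet's QUERY function. As an illustrative sketch, with record and column names invented for the example rather than taken from the study, the same idea looks like this in Python:

```python
# Hypothetical records; "q2_summary" stands in for a summary column
# that collapses the raw branch columns of the branching workaround.
records = [
    {"record_id": 1, "site": "Site 1", "q2_summary": "A",
     "q3_branch_a": "x", "q3_branch_b": ""},
    {"record_id": 2, "site": "Site 2", "q2_summary": "B",
     "q3_branch_a": "", "q3_branch_b": "y"},
]

# Only the columns the data manager needs to monitor.
VIEW_COLUMNS = ["record_id", "site", "q2_summary"]

def data_manager_view(rows, columns=VIEW_COLUMNS):
    """Project each record onto the relevant columns, hiding
    the raw branch columns from the data manager's view."""
    return [{col: row[col] for col in columns} for row in rows]

for row in data_manager_view(records):
    print(row)
```

The spreadsheet-side QUERY achieves the same projection declaratively, which is why it could be replaced later by simpler column-display functions where no filtering was needed.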

Aside from the study questionnaires, data query forms were added to the app. These accommodated questions and suggestions from the data manager, and surveyors could address the issues and update the status of the posts. In addition, an attendance sheet was included, wherein the surveyors could take an image of the Certificate of Appearance from the Rural Health Unit. This was later removed to conserve space, which became an important consideration in the implementation.

Data validation mechanisms that were not available in the app were applied to the data manager view sheets through conditional formatting. Cells with anomalous data were automatically highlighted red and these were then posted as queries by the data manager.

Tool technical support and maintenance

Technical support was provided at two levels: at the app creator level, and at the AppSheet backend level handled by the AppSheet development team.

Questions regarding the device, the questionnaire behaviour, and overall app performance were forwarded to the app creator, who would then troubleshoot the issues; any queries that arose from the back-end changes applied to the AppSheet app were elevated to the AppSheet development team.

The app was deployed in a facility study to three sites in the Philippines and used over a period of 9 weeks during which approximately 1200 surveys were completed. The accuracy and completeness of data gathered was analysed during the data cleaning phase.

The study was approved by the University of the Philippines Manila Research Ethics Board.

Results

Using AppSheet as an app builder for the survey tool afforded several advantages to the research team as well as the app creator. There was an acceptable degree of control over the app without the need for training or a prior programming background; most app behaviour is influenced through conventional spreadsheet formulae, so a health-oriented professional could be an app creator and shift focus to the content as needed. In addition, the removal of the need for a developer increased responsiveness to errors in form content and app functionality. The online spreadsheet saved the revision history and could be used for auditing, and changes were relatively easy and fast to deploy, since all changes saved in the advanced app editor were pushed to the devices as soon as the next successful synchronisation occurred. The AppSheet development team and community responded quickly to questions and requests, and usually fixed issues within 24 hours of identification.

However, there were some limitations inherent to AppSheet at the time of development. Multi-conditional branching had to be incorporated through the spreadsheet, and this bloated the database and added complexity to data cleaning. Data validation could not be handled at the app level, and hence conditional formatting was employed at the database level. The offline version of the app did not work as intended with the devices procured for the study; this was later determined to be a function of the Android OS version, which was locked to version 4.2.2. The app had to be accessed through the web browser of the device, and a tinyurl address was generated for ease of access. This brought an unexpected advantage of having access to the app without prior installation of AppSheet on the device, but with decreased functionality.

There were challenges encountered that may or may not be directly related to the survey tool, but directly affected the data collection phase of the study. These must be considered for research studies that seek to employ ICTs as tools for data collection:

1. The devices were pre-installed and locked to Android OS 4.2.2; the specifications were chosen based on budget and projected data volume, not the app requirements. The OS version of the device should be considered to avoid untoward app behaviour, and should be taken into account before procurement and/or selection of survey and development tools.

2. The pre-development phase did not include a step for installation of antivirus utilities, which may have contributed to the significant issue of malware and adware hampering the performance and usability of the devices themselves. There was also the distinct possibility that a wayward advertisement was tapped, leading to the entry of malicious software, but this could not be verified. Measures must be taken to prevent malicious software from penetrating the devices, from inculcation through training and practice to installation of trustworthy utilities that can provide sufficient protection to the device.

3. Network connectivity played a significant role for three reasons: i) the survey data were transmitted between the mobile devices and the online database through the synchronisation process, ii) the AppSheet back end and app updates rode along this process, and iii) asynchrony between the two caused a cascade of errors that resulted in unsynchronised, though not necessarily lost, data. Measures have to be taken to ensure that there are alternative avenues for extracting data directly from the device. In addition, app updates have to be scheduled in coordination with data collection in the field; changes to the app should be deployed only when all the data collected with the previous app version have been successfully synchronised or otherwise extracted from the devices.

4. The AppSheet development team was constantly deploying updates to the back-end code throughout the data collection phase, which caused unintended app behaviour; for instance, back-end changes caused previous workarounds to stop working, or an unintentional bug was introduced despite Test-Driven Development on their end. For a period, this caused essential parts of the app not to appear, which could have been mitigated by timely reporting of the particular bug. While a user-driven versioning system has been suggested to mitigate this phenomenon, the deployment model makes the implementation of such a feature challenging. This can be mitigated by having a vigilant, responsive bug tracking and reporting mechanism across all parties, from the surveyors to the app creator.

Discussion

In addition to the practical findings described above, other issues came to light.

Asynchrony

Asynchrony became a significant challenge during the data collection phase, not just because of intermittent network connectivity, but also because of the inherent inter-service interactions that the whole setup employed. The interactions are as follows. AppSheet stores app metadata (codes, settings, UX, and so on) in its cloud server, accessible and editable through an app editor. The app editor extracts table structures and computes schemas from app creator-specified databases, in the case of the app an online spreadsheet. This schema is the basis of the synchronisation protocol of AppSheet. When updating, the AppSheet mobile app downloads the AppSheet back-end codes and caches them into the offline version on the device; in the case of the app, it was cached by the web browser through which the app was accessed. When synchronising, the app pushes the survey data cached in the device, which is based on the schema also cached in the device. After all queued data have been pushed (with acknowledgement received from the server), the sync function then pulls the latest schema computed by the AppSheet cloud server and pulls data from the online database.

If acknowledgement for one queued data set (acknowledgement is given per row of data) is not received, the data are re-entered into the queue and synchronisation is attempted again at the end of the push phase. Continual failure of this phase will result in a sync failure message, and the process will not move on to pulling the latest schema and data, which is the usual scenario where network connectivity is intermittent. This caused two issues: data remained on the devices, unmonitored and unregulated, and the app was not updated with the latest schema and settings. The latter issue became significant when corrections were made in an effort to be responsive to error and bug reports, and subsequent synchronisation attempts failed outright because of changes in schema, which in turn resulted in column and data type mismatches.
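The push-then-pull behaviour described above can be sketched as follows. This is a simplified model under stated assumptions (per-row acknowledgement, re-queue on failure, no schema pull after repeated failures), not AppSheet's actual implementation.

```python
from collections import deque

def synchronise(queued_rows, push_row, max_attempts=3):
    """Push every queued row; a row whose acknowledgement is not
    received re-enters the queue. Only when the whole queue is
    flushed does the sync move on to pulling the latest schema
    and data. Returns True on full sync, False on sync failure.
    `push_row` is a hypothetical callable returning True on ack."""
    queue = deque((row, 0) for row in queued_rows)
    while queue:
        row, attempts = queue.popleft()
        if push_row(row):                 # acknowledged per row
            continue
        if attempts + 1 >= max_attempts:
            return False                  # sync failure: data stays on device
        queue.append((row, attempts + 1)) # re-enter the queue
    return True                           # safe to pull schema and data

# Intermittent connectivity: the first push of each row fails,
# the retry succeeds.
seen = set()
def flaky_push(row):
    if row in seen:
        return True
    seen.add(row)
    return False

print(synchronise(["r1", "r2"], flaky_push))   # True after retries
print(synchronise(["r3"], lambda r: False))    # False: never acknowledged
```

The second call models the failure mode described above: the queue never drains, the sync reports failure, and the data, and any pending schema update, remain stranded on the device.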

The issue was elevated to the AppSheet development team, who provided two ways of extracting data within a few days of the report: first, through the online server synchronisation logs, which record synchronisation attempts that trigger a server response; second, through a mechanism by which queued data could be sent as-is directly through email to the app creator. The former had the distinct limitation of being able to salvage only data that were lost because of schema differences and server time-outs; it could not salvage data that remained on the devices because of network intermittency and other device-specific limitations.

The latter had the distinct limitation of requiring a network connection as well, but could salvage data that remained in the device for any undocumented reason, including temporary network outage. In total, 6.5% of data was recovered through the two methods.

Malicious software

Malware, adware and bloatware became a significant challenge to surveyors in the field and affected the usefulness of the app. The presence of such software had a severe impact on the performance of the devices, causing input lag of up to 10 seconds on a single tap, giving the illusion of a crash in certain cases. This may also have contributed to the perception that the app itself was slow.

In some devices, it was later observed that the WiFi connection was activated and deactivated automatically, a phenomenon that may be attributed to infection. A search on the subject showed that some of the malware found on the devices has variants characterised by unsolicited connection control. The identified malware had the potential to close the network connection of apps other than themselves, which may have been a direct cause of synchronisation failure.

Adware in some devices caused advertisements to appear during app use, and since the ethics portion of the surveyor training recommended showing the caregivers the app during the interview, caregivers were involuntarily exposed to adult-oriented advertisements that popped up during the interview.

This caused an aversion to the use of the device, and the surveyor used paper records for data recording thereafter.

Across all the devices used in one of the study sites, around 47 possible pieces of malicious software were found. Included in the list were applications that were not present after a factory reset, but it was difficult to discern between applications installed automatically by malware and applications possibly downloaded by the surveyors.

Data completion

The percentage of data successfully synchronised against expected data volume as the study progressed (measured in records) is shown in Figure 2.

Figure 2. Data completion rate at two sites (PDC = post data cleaning).

Figure 2 shows the records seen in the database across the study as a percentage of the records expected or accomplished by surveyors locally. Data beyond 9 weeks were excluded, since data cleaning had started by then. (Data were obtained from direct record counts recorded in reports accomplished by the data manager of the study, and separate counts conducted by the author.) Data from both study sites were combined for data cleaning, extending the line for combined data completion beyond 9 weeks.

By the end of the cleaning phase, 78.7% of the data had been entered as planned through synchronisation, 11.4% were manually encoded by the data manager when surveyors avoided using the tablets, and 6.5% were extracted from devices and online logs. Data loss was 3.4%, with 27% of the loss being missing data due to a branching error not discovered in a timely manner; 73% of the loss was uncategorised.
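The accounting above can be verified directly, using the figures as reported in the text:

```python
# Outcome shares of the expected data volume, as reported.
synchronised = 78.7   # entered as planned through synchronisation
manual       = 11.4   # manually encoded by the data manager
recovered    =  6.5   # extracted from devices and online logs
lost         =  3.4   # data loss

# The four categories account for the full expected volume.
assert abs(synchronised + manual + recovered + lost - 100.0) < 0.05

# Within the loss: 27% from the branching error, 73% uncategorised,
# i.e. roughly 0.9% and 2.5% of all expected records respectively.
print(round(0.27 * lost, 2), round(0.73 * lost, 2))
```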



Looking forward

The study is set to move on to the second phase, and a lull in field activity opened up opportunities to improve the app. During the course of the first phase, the AppSheet development team added several features that could be of use to the app:

1. The AppSheet back end can now handle IF() expressions, with pre-formulated VALID_IF, SHOW_IF and REQUIRED_IF statements linked to each column, allowing for advanced enabling and disabling mechanisms for data fields. The VALID_IF statement can also be used in tandem with SELECT() and CONTAINS() expressions to allow for advanced lookup functions and app optimisation.

2. The AppSheet back end can now handle AND() and OR() expressions; through these, multi-conditional branching can be accommodated by the engine without the need for the spreadsheet workaround.

3. The AppSheet back end can now handle more functions modelled after spreadsheet functions and perform the operations as the data is being entered at the app level; this allows for more robust data validations, restrictions, and real-time processing.

One significant example of this is the new back end capability of processing dates; the new version of the app can now validate date entries as they are being entered into the forms, instead of after being synchronised centrally.

4. The AppSheet back end can accommodate complex data models and entity relationships through keys.

5. The AppSheet back end can store Virtual Columns, which are functionally equivalent to the summary columns spliced into the database during the first phase to aid in displaying split data resulting from the multi-conditional branching workaround, and can act as hidden, mediating columns that can process parts of complex operations and return a value.
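As an illustration of point 2, the effect of AND()/OR() support can be modelled in Python. The field names and the rule itself are invented for this sketch, not taken from the study's questionnaires.

```python
def show_question_3(record: dict) -> bool:
    """A multi-conditional SHOW_IF-style rule: Question 3 appears
    only when Question 2 is 'A' and the vaccine has been introduced
    at the site, or when the full form was requested. With AND()/OR()
    in the expression engine, such a rule needs no extra spreadsheet
    columns."""
    return (
        (record.get("question_2") == "A"
         and record.get("vaccine_introduced", False))
        or record.get("full_form_requested", False)
    )

print(show_question_3({"question_2": "A", "vaccine_introduced": True}))
print(show_question_3({"question_2": "B", "vaccine_introduced": True}))
```

Expressing the rule once in the engine, rather than unrolling it into dichotomous branch columns, is what made the database reductions reported below possible.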

As a result of the upgrade, the new version of the App (2.0) has some overarching improvements.

There are fewer pages due to the more efficient way the back end handles multi-conditional form branching. This reduced the database to 140 columns in one questionnaire and 170 in the other, from the previous 209 and 451 (33% and 62% decreases in size, respectively); thus there was no further need for separate data manager view sheets. Since phone calls and text messaging were found to be more efficient and timely for areas with network intermittency, the attendance and queries sheets were removed.

The App can now enforce validation rules and restrictions on various questions and the app was redesigned to employ lookup tables for drop-down menus, instead of storing repeating choices in the app definition. This reduces the size of app metadata synchronised, and allows faster updating of choices should the need arise.

The App now also stores initials in a separate table, used for the drop-down menus to potentially decrease the probability of record duplication due to typographical errors.

Conclusion and Recommendations

Effective use of apps in research entails that considerable effort should be devoted to planning, risk management and maintenance.

When considering app design and development, the following factors should be considered:

1. Resources of the study and the team itself:

a. How soon should the tool be available? Is there time for development? How much time can be allotted?

b. How much of the budget can be spared for development and capacity building, if any would be required?

c. Does the research team have a member that can be assigned to maintain the tool and triage risks and issues?

2. Requirements of the study:

a. What is the study design? How will the data be collected?

b. What data will be collected? Who will collect such data? What are the methods of obtaining the data first-hand? Are there sensitive data being collected?

c. How will the data structures be modelled? How often will data be collected? How will the data points be linked? What analysis is needed in the study? How will the data model allow this analysis?

d. What other functions does the study require of the tool?

3. Options for the tool (each item is considered for each option):

a. Does the resource requirement of the option fall under the capacity that the study and the study team currently have or can provide (refer to item 1)?

b. Does the option fulfil the requirements of the study (refer to item 2)?

i. If not, can the tool be modified to fulfil the requirements of the study? If so, can the study and the study team provide the resources for modification, or does the option supply them in exchange for other resources the team has (e.g. a developer, for a fee)?

ii. Does the option provide maintenance support so it can continuously fulfil the requirements of the study?

4. Perks

a. Does the option have community backing?

b. Does the option offer other support mechanisms?

The points above will be the basis through which the app, or the development platform thereof, can be chosen. In the case of the health facility survey where cost, ease of development (without programming experience), customisability, flexibility, and connectivity were major concerns, the development platform used provided a highly functional data collection app, even with the issues encountered.

...

Corresponding author:

Patrick Garcia Sylim, MD
National Telehealth Center
University of the Philippines, Manila
Mobile: (+63) 917 505 5923
Email: [email protected]

Conflict of Interest. The authors declare no conflicts of interest.

References

1. Winett R. Six questions to ask before using mobile devices for surveys. Winett and Associates; 2016. Available at: http://www.winettassociates.com/using-mobile-devices-for-surveys.html accessed 15 January 2016.

2. Grant AS, Kennedy RD, Spires MH, Cohen JE. The development and piloting of a mobile data collection protocol to assess compliance with a national tobacco advertising, promotion, and product display ban at retail venues in the Russian Federation. JMIR Res Protoc 2016;5(3):e120.

3. Mtema Z, Changalucha J, Cleaveland S, et al. Mobile phones as surveillance tools: implementing and evaluating a large-scale intersectoral surveillance system for rabies in Tanzania. PLoS Med 2016;13(4):e1002002.
