
Chapter 5 Conclusion

5.1 Conclusion

The data analysis and results of every research instrument applied in this study were presented in detail in Chapter 4. The data obtained from all the instruments were classified as quantitative or qualitative, and both types of data analysis were used to achieve the following research objectives.

1. To develop an electronic-based English proficiency test for tertiary-level students.

2. To assess the usability of the electronic-based English proficiency test for tertiary-level students.

As mentioned previously, all the research instruments were designed, validated, and implemented to obtain valid quantitative and qualitative data, comprising the students’ responses to the questionnaire and their English proficiency test scores, together with their focus group interview responses. The conclusions drawn from both types of data analysis and results are presented as follows.

5.1.1 The Findings of Questionnaire Data Analysis and Results

The quantitative results and data analysis obtained from the questionnaire were presented in Chapter 4 and were used to achieve Research Objective 1 and Research Objective 2. This section summarizes the major findings in response to each of these objectives.

In line with the first and second objectives of the study, namely to develop and to assess the usability of the electronic-based English proficiency test for tertiary-level students, a 29-item post-survey questionnaire was administered (after the e-testing) to 355 freshman students who had taken the electronic-based test. Twenty-three questionnaire items (excluding six questions on the students’ personal information) concerning the development and the usability of the electronic-based English proficiency test were grouped into four categories: System Use, Learning Impact, User’s Opinion, and Design of the Test. The responses were collected, and the score of each questionnaire item, as rated by the students, was computed using a computer program. The findings of the results and analysis are summarized below.

After the scores obtained from the questionnaire were computed and analyzed by a computer program using descriptive statistics (mean and standard deviation), the results indicated that the students in the sample group generally agreed with all of the questionnaire items, with a total mean (x̅) score of 4.27 and an SD of 0.27.
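
To make this computation concrete, the following is a minimal sketch of the descriptive analysis in Python. It is illustrative only: the response data are hypothetical, the item list is truncated, and the choice to aggregate the overall mean and SD over the per-item means is an assumption about how the study’s analysis program worked.

```python
import statistics

# Hypothetical Likert responses (1-5): rows are students, columns are
# questionnaire items, truncated to five items for brevity.
responses = [
    [5, 4, 4, 5, 4],
    [4, 4, 5, 4, 3],
    [5, 5, 4, 4, 4],
]

item_means = []
for item in range(len(responses[0])):
    ratings = [row[item] for row in responses]
    item_means.append(statistics.mean(ratings))
    print(f"Item {item + 1}: mean = {item_means[-1]:.2f}, "
          f"SD = {statistics.stdev(ratings):.2f}")

# Overall agreement: the mean of the item means and the spread across items.
print(f"Overall: mean = {statistics.mean(item_means):.2f}, "
      f"SD = {statistics.stdev(item_means):.2f}")
```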

An examination of the mean score of each questionnaire item showed that all items concerning the development and the usability of the electronic-based English proficiency test were rated at a similarly high level; in other words, each item revealed the students’ agreement, with mean scores ranging from 3.93 to 4.48. Notably, Item 1, “Operation system of the electronic-based English proficiency test is smooth and convenient.”, was rated with the highest mean score of 4.48, whereas Item 6, “It is difficult to cheat on the electronic-based English proficiency test.”, was rated with the lowest mean score of 3.93 (SD = 0.84).

An examination of each of the four categories of questionnaire items revealed that the students’ level of agreement was highest for Design of the Test (Items 4, 5, 11, 15, 16, 17, 20, 21, 22, and 23), with a total mean (x̅) score of 4.33 and an SD of 0.71. System Use (Items 1, 7, 10, 12, 14, and 18) was rated second highest, with a total mean (x̅) score of 4.30 and an SD of 0.75. The other two categories, User’s Opinion (Items 3, 6, 8, and 9) and Learning Impact (Items 2, 13, and 19), were also rated at a high level, with total mean (x̅) scores of 4.25 (SD = 0.81) and 4.18 (SD = 0.67) respectively.
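
The per-category aggregation can be sketched as follows. The item-to-category mapping is taken directly from the text above, while the per-item mean values are hypothetical placeholders standing in for the values computed from the real questionnaire data.

```python
import statistics

# Item groupings as stated in the text.
categories = {
    "System Use":         [1, 7, 10, 12, 14, 18],
    "Learning Impact":    [2, 13, 19],
    "User's Opinion":     [3, 6, 8, 9],
    "Design of the Test": [4, 5, 11, 15, 16, 17, 20, 21, 22, 23],
}

# Placeholder per-item means; Items 1 and 6 use the reported extremes.
item_means = {i: 4.25 for i in range(1, 24)}
item_means[1], item_means[6] = 4.48, 3.93

for name, items in categories.items():
    category_mean = statistics.mean(item_means[i] for i in items)
    print(f"{name}: mean = {category_mean:.2f}")
```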

The mean (x̅) scores stated above demonstrate that the students’ overall agreement with the questionnaire concerning the development and the usability of the electronic-based English proficiency test was at a high level. The findings derived from the data analysis and results indicate that the students in the sample group agreed with how this electronic-based English proficiency test was developed, as well as with its usability as specified in the questionnaire items. The details of this agreement regarding the electronic-based English proficiency test are discussed in a later section.

5.1.2 The Findings of the Students’ Electronic-Based English Proficiency Test Scores Analysis and Results

In order to obtain valid and reliable responses from the students for achieving the research objectives, the researchers first had all students in the sample group take the electronic-based English proficiency test. As demonstrated in Chapter 4, the test items had been developed, validated, tested for reliability, and examined before being used as part of the data collection process.

The sixty-item English proficiency test was divided into a 30-item Reading part (30 points) and a 30-item Listening part (30 points), and all question items were uploaded to the MOODLE platform before being administered. Given 80 minutes, the students completed the electronic test using whatever electronic devices were available to them.
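
For reference, the test setup described above can be summarized in a small data structure; this is an illustrative summary only, not an actual MOODLE configuration.

```python
# Illustrative summary of the test administration described above.
test_setup = {
    "platform": "MOODLE",
    "duration_minutes": 80,
    "parts": {
        "Reading":   {"items": 30, "points": 30},
        "Listening": {"items": 30, "points": 30},
    },
}
total_points = sum(p["points"] for p in test_setup["parts"].values())  # 60
```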

Afterwards, the students’ total test scores, as well as their scores on the reading part and the listening part, were collected via the system and computed using the computer program for analysis. As shown in Table 4.7, out of 60 points the students’ total mean score on the electronic-based English proficiency test was 42.25, with an SD of 9.98. Examining each part of the test, the mean score of the Listening part (x̅ = 22.31, SD = 5.18) was slightly higher than the mean score of the Reading part (x̅ = 19.93, SD = 5.63), resulting in a mean difference of 2.38.
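
The aggregation behind these figures can be illustrated with the short sketch below. The score records are hypothetical; the reported values (42.25, 22.31, 19.93, and the 2.38 difference) come from the actual data exported from the system.

```python
import statistics

# Hypothetical score records: (listening, reading), each part out of 30 points.
scores = [(24, 21), (20, 18), (27, 24), (18, 16)]

listening = [l for l, _ in scores]
reading = [r for _, r in scores]
totals = [l + r for l, r in scores]

print(f"Total:     mean = {statistics.mean(totals):.2f}, "
      f"SD = {statistics.stdev(totals):.2f}")
print(f"Listening: mean = {statistics.mean(listening):.2f}")
print(f"Reading:   mean = {statistics.mean(reading):.2f}")
print(f"Mean difference (Listening - Reading): "
      f"{statistics.mean(listening) - statistics.mean(reading):.2f}")
```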

As stated in the previous chapter, the analysis and results of the students’ test mean scores and standard deviations were not used as the main data for achieving the research objectives of this study; nevertheless, they served as part of the data collection process to ensure that all students in the sample group had genuinely been exposed to and had experience with the electronic-based test. This helps guarantee that both the quantitative data and the qualitative data collected from them were sufficiently reliable and viable for achieving the research objectives of the study regarding the development and the usability of the electronic-based English proficiency test.

5.1.3 The Findings of Focus Group Interview Analysis and Results

As mentioned earlier in Chapter 4, the researchers carried out focus group interviews after the students had taken the electronic-based English proficiency test and responded to the questionnaire, in order to obtain more solid and substantial findings for the research objectives. After administering the electronic-based test, the researchers invited the students in the sample group to participate as interviewees; eventually, 27 students voluntarily took part in the focus group interview sessions.

These 27 interviewees were interviewed in groups of six or seven; accordingly, the researchers conducted four interview sessions in total. The six questions asked during the focus group interviews concerned the interviewees’ experience of, perceptions of, and attitudes towards taking the electronic-based English proficiency test (CEFR), especially in terms of the development and the usability of the test.

To obtain valid findings for the research objectives, the thematic analysis method was applied: the acquired responses were analyzed and interpreted into themes based on the research objectives and questions of the study mentioned previously. Hence, the gist of the interviewees’ responses to the six focus group interview questions is summarized below according to thematic content.
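
Thematic analysis is an interpretive, manual process; the sketch below only illustrates how coded interview excerpts might be tallied per theme once the researchers have assigned them. All excerpts, and any theme labels beyond those named in this section, are hypothetical.

```python
from collections import Counter

# (theme, excerpt) pairs produced by manual coding of the interview sessions.
coded_excerpts = [
    ("System Use / Convenience", "I could take the test anywhere on my phone."),
    ("System Use / Convenience", "My score appeared right after I submitted."),
    ("Design of the Test / Test time duration", "Eighty minutes was not enough."),
    ("User's Opinion / Pressure and distraction free", "I felt relaxed at home."),
]

theme_counts = Counter(theme for theme, _ in coded_excerpts)
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} excerpt(s)")
```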

▪ System Use

➢ Convenience of electronic-based English proficiency test –

More than half of the interviewees pointed out that the electronic-based test was convenient, and all of them agreed that this convenience was an advantage of this kind of test. Simply put, the electronic-based test appeared to be more convenient than the paper-based one: the examinees were able to take the test anywhere with whatever devices they had, search for the meanings of words or other information online, and recheck or review their answers. They also favored being able to check their scores instantly after completing the test. The responses in this regard supported the questionnaire results on System Use, such as Item 1, “Operation system of the electronic-based English proficiency test is smooth and convenient.”, which was rated the highest among all questionnaire items by the student participants.

➢ Efficient time management –

Some of the interviewees said that the electronic-based English proficiency test helped them manage their test time more efficiently than the paper-based one did. With its convenience and suitable system, they had enough time to complete the test, recheck their work, make corrections, and select the best answers. This sort of response was supported by questionnaire Item 10, “The electronic-based English proficiency test is more systematic than a paper-based test is.”, and Item 18, “Exam results interface design of the electronic-based English proficiency test is appropriate.”

On the contrary, a few of the interviewees reported that they were not able to complete all the test items due to insufficient time. However, they also remarked that this limitation could be overcome by the examinees themselves if they managed their time on the system more wisely. This issue corresponds to the students’ agreement with questionnaire Item 10 concerning the systematic features of the electronic-based test.

➢ Safety of the system –

Most of the examinees agreed that the register interface of the electronic-based test was user-friendly, but a few suggested that access to the electronic-based test should require a more complex user password, since the current one was too simple. Questionnaire Item 14, “Register interface of the electronic-based English proficiency test is user-friendly.”, was supported by this theme of the interviewees’ responses in the System Use category.

➢ Difficulty due to the Internet connection –

Some interviewees reported that the system was easy to access and that no section of the electronic-based test was difficult for them. Questionnaire Item 1, “Operation system of the electronic-based English proficiency test is smooth and convenient.”, and Item 7, “Browsing among web pages on the electronic-based English proficiency test is easy.”, were confirmed by their responses on this issue. Nevertheless, a few interviewees reckoned that any difficulty in taking this test was caused by the Internet connection or Wi-Fi signal, which was beyond their control.

▪ Learning Impact

➢ Revision of the test items –

All interviewees considered the ability to revise or cross-check the items on the electronic-based test to be an advantage. In other words, they liked having more chances to review the test items for the correct answers than they would have had with a paper-based test. This theme of responses supported the students’ agreement with questionnaire Item 13, “Immediate feedback on the electronic-based English proficiency test helps the examinee to reflect on his/her learning.”

➢ Instant test results –

The interviewees also preferred the electronic-based test because they were able to check their scores instantly, which differed greatly from the paper-based test, for which they had to wait until the teacher finished marking the papers to learn their scores. Knowing the results immediately helped them see how to improve. Evidently, this confirmed the students’ agreement with the highly rated questionnaire Item 2, “Assessment on the electronic-based English proficiency test is fair.”, in the Learning Impact category.

▪ User's Opinion

➢ Pressure and distraction free –

Several interviewees reported that they felt less pressure and anxiety while taking this electronic-based English proficiency test, and twenty percent of the interviewees stated that they felt no pressure at all. Unlike in the atmosphere of a traditional exam room with a paper-based test, they were able to take the test wherever they felt comfortable and relaxed, such as at home or in their bedrooms. Moreover, all interviewees commented that this kind of electronic-based test was suitable during the pandemic: they felt safe taking the test at home, without anxiety or worry. Simply put, this lessened their pressure and distraction; in other words, they were likely to concentrate and pay attention better while taking the electronic-based test. This theme of responses thus supported the questionnaire results for Item 3 in the User’s Opinion category, “The electronic-based English proficiency test better lessens the examinee’s anxiety than a paper-based test does.”

➢ Safety –

The interviewees’ responses concerning the safety of the electronic-based test identified it as another advantage. They mentioned that it was not possible to cheat or copy from other students while taking the test, since a password was required before each examinee could access the test system. The students’ agreement with questionnaire Item 6, “It is difficult to cheat on the electronic-based English proficiency test.”, was supported by this set of responses.

➢ Cost and time saving –

Several interviewees agreed that the electronic-based test helped save cost and time. They did not have to travel to the campus or an exam room, which saved them time and money compared with taking a paper-based test. With the electronic-based system, the interviewees reported having more time to complete the test and select the correct answers. Simply put, the test system was fast, and they were able to search for information and look up words in an online dictionary. These responses corresponded to the students’ high level of agreement with questionnaire Item 8, “The electronic-based English proficiency test is faster to complete than a paper-based test is.”, and Item 9, “The electronic-based English proficiency test is more modern than a paper-based test is.”, in the User’s Opinion category as well.

▪ Design of the Test

➢ Compatibility of the test format –

Some interviewees explained that one of the advantages of the electronic-based English proficiency test was the compatibility of its test format. Broadly speaking, they liked the exam interface design; for instance, the reading part presented the reading texts and the questions on the same page. Moreover, they added that, unlike with the paper-based test, examinees did not waste time erasing their answers on paper when they chose the wrong option. All these responses tended to support the students’ high level of agreement with questionnaire Item 16, “Exam interface of the electronic-based English proficiency test is user-friendly.”, Item 17, “Exam interface design of the electronic-based English proficiency test is appropriate.”, and Item 21, “Page-by-page style of questions facilitates the examinee in taking a test.”, respectively.

➢ Test time duration –

Given eighty minutes to complete the electronic-based English proficiency test, some interviewees revealed that this was not enough time for them to finish all the test items. They added that they had to spend considerable time reading the texts, particularly in the reading part, before selecting the correct answers. This type of response is consistent with the result for questionnaire Item 15 in the Design of the Test category, “Seeing the timer on the electronic-based English proficiency test helps the examinee progress better.”, which was rated with a lower mean (x̅) score than the other questionnaire items.

In addition, more than forty percent of the interviewees suggested that more time should be given and that it should be set separately for better time management.

➢ Questions on the electronic-based English proficiency test –

More than fifty percent of the interviewees said that they had some difficulties with the test questions in both the listening part and the reading part; specifically, the difficulty lay in the comprehension questions, since there was a great deal to read. Despite the long reading texts, however, one third of the interviewees liked the reading part most, because doing this part allowed them to learn, practice, and improve their English vocabulary and critical thinking skills. An equal number, nine interviewees, stated that they liked the listening part of the electronic-based test most. Even though it featured unfamiliar accents, long scripts, and very fast speech, this part turned out to be quite easy for them thanks to its pictures, audio and visual clips, and other multimedia features. All of the responses above supported the mean (x̅) scores the students gave to questionnaire Item 4, “All directions of the electronic-based English proficiency test are easy to follow without any confusion.”, Item 21, “Page-by-page style of questions facilitates the examinee in taking a test.”, and Item 22, “Overview interface of the electronic-based English proficiency test is user-friendly.”, in the Design of the Test category.

➢ Physical features of the electronic-based English proficiency test –

More than one-fourth of the interviewees noted that the physical features or design of the electronic-based test, such as its colors, layout, and number of test questions, were all good and user-friendly. They particularly preferred the paperless style of the test, which was fast, compatible, and convenient. This theme of responses corresponded to the mean (x̅) scores that the students rated highly for questionnaire Item 5, “The design of the electronic-based English proficiency test is appropriate.”, and questionnaire Item 23, “Overview interface design of the electronic-based English proficiency test is appropriate.”, in the Design of the Test category as well.

As stated in the previous chapter, the interviewees added further comments and suggestions before each focus group interview session ended. The interviewees provided informative and insightful responses, which are discussed in the following part.