Chapter 4 Research Results

4.1 Quantitative Data Analysis
method would ease the data analysis and help the researchers answer the research objective.

The questionnaire items for assessing the usability of the electronic-based English proficiency test for tertiary-level students were classified into four categories: System Use, Learning Impact, User's Opinion, and Design of the Test. For a clearer picture, the questionnaire items are categorized in Table 4.1 as follows.
Table 4.1 Questionnaire items category

Questionnaire Item Category    Number of Items    Item Numbers
System Use                     6                  1, 7, 10, 12, 14, and 18
Learning Impact                3                  2, 13, and 19
User's Opinion                 4                  3, 6, 8, and 9
Design of the Test             10                 4, 5, 11, 15, 16, 17, 20, 21, 22, and 23

The researchers collected the responses and analyzed them using descriptive statistics (mean and standard deviation). In addition, the mean (x̅) and standard deviation of each category of usability items, together with the total mean and standard deviation of all items, were computed and interpreted according to the statistical interpretation criteria for the mean score (see Table 3.3).

The scores obtained from the analysis indicate the students' level of agreement with the questionnaire items. Simply put, higher scores represent stronger agreement with the items, whereas lower scores indicate weaker agreement or, in other words, disagreement with the items.

Table 4.2 below displays the mean and standard deviation of each questionnaire item, together with the interpretation of the students' level of agreement or disagreement with the items.
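The per-category computation described above can be sketched in a few lines. This is a minimal illustration, not the study's actual analysis script: the `responses` data below are invented, and the study's real data set holds 355 students' ratings on 23 five-point items.

```python
# Minimal sketch of the descriptive analysis: pool every response to a
# category's items, then compute the mean and standard deviation.
# NOTE: the sample data here are invented for illustration only.
from statistics import mean, pstdev

CATEGORIES = {
    "System Use": [1, 7, 10, 12, 14, 18],
    "Learning Impact": [2, 13, 19],
    "User's Opinion": [3, 6, 8, 9],
    "Design of the Test": [4, 5, 11, 15, 16, 17, 20, 21, 22, 23],
}

def category_stats(responses, item_numbers):
    """Pool all ratings for the given items and return (mean, SD)."""
    pooled = [r[i] for r in responses for i in item_numbers]
    return mean(pooled), pstdev(pooled)

# Two invented respondents: each dict maps item number -> rating (1-5).
responses = [
    {i: 4 for i in range(1, 24)},
    {i: 5 for i in range(1, 24)},
]
m, sd = category_stats(responses, CATEGORIES["Learning Impact"])
```

Whether the population or sample standard deviation (`pstdev` vs `stdev`) was used is not stated in this chapter; the sketch assumes the former.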
Table 4.2 Quantitative data and interpretation of questionnaire items for the students' level of agreement and disagreement

Item 1. Operation system of the electronic-based English proficiency test is smooth and convenient. (x̅ = 4.48, SD = 0.58; Agree)
Item 2. Assessment on the electronic-based English proficiency test is fair. (x̅ = 4.20, SD = 0.62; Agree)
Item 3. The electronic-based English proficiency test better lessens the examinee's anxiety than a paper-based test does. (x̅ = 4.31, SD = 0.83; Agree)
Item 4. All directions of the electronic-based English proficiency test are easy to follow without any confusion. (x̅ = 4.27, SD = 0.72; Agree)
Item 5. The design of the electronic-based English proficiency test is appropriate. (x̅ = 4.39, SD = 0.69; Agree)
Item 6. It is difficult to cheat on the electronic-based English proficiency test. (x̅ = 3.93, SD = 0.84; Agree)
Item 7. Browsing among web pages on the electronic-based English proficiency test is easy. (x̅ = 3.95, SD = 0.87; Agree)
Item 8. The electronic-based English proficiency test is faster to complete than a paper-based test is. (x̅ = 4.30, SD = 0.81; Agree)
Item 9. The electronic-based English proficiency test is more modern than a paper-based test is. (x̅ = 4.40, SD = 0.67; Agree)
Item 10. The electronic-based English proficiency test is more systematic than a paper-based test is. (x̅ = 4.35, SD = 0.78; Agree)
Item 11. Registration process of the electronic-based English proficiency test is easy. (x̅ = 4.40, SD = 0.67; Agree)
Item 12. Log-in interface of the electronic-based English proficiency test is user-friendly. (x̅ = 4.38, SD = 0.71; Agree)
Item 13. Immediate feedback on the electronic-based English proficiency test helps the examinee to reflect on his/her learning. (x̅ = 4.07, SD = 0.65; Agree)
Item 14. Register interface of the electronic-based English proficiency test is user-friendly. (x̅ = 4.32, SD = 0.65; Agree)
Item 15. Seeing the timer on the electronic-based English proficiency test helps the examinee progress better. (x̅ = 4.23, SD = 0.78; Agree)
Item 16. Exam interface of the electronic-based English proficiency test is user-friendly. (x̅ = 4.31, SD = 0.76; Agree)
Item 17. Exam interface design of the electronic-based English proficiency test is appropriate. (x̅ = 4.30, SD = 0.69; Agree)
Item 18. Exam results interface design of the electronic-based English proficiency test is appropriate. (x̅ = 4.24, SD = 0.73; Agree)
Item 19. It is hopeful that the electronic-based English proficiency test will be used in other courses of English. (x̅ = 4.18, SD = 0.72; Agree)
Item 20. Previous exam attempts interface of the electronic-based English proficiency test is user-friendly. (x̅ = 4.31, SD = 0.66; Agree)
Item 21. Page-by-page style of questions facilitates the examinee in taking a test. (x̅ = 4.42, SD = 0.68; Agree)
Item 22. Overview interface of the electronic-based English proficiency test is user-friendly. (x̅ = 4.33, SD = 0.75; Agree)
Item 23. Overview interface design of the electronic-based English proficiency test is appropriate. (x̅ = 4.26, SD = 0.72; Agree)
Total (x̅ = 4.27, SD = 0.27; Agree)

As mentioned previously, the 23-item questionnaire on the development and usability of the electronic-based English proficiency test for tertiary-level students was administered as a post-survey to all 355 students in the sample group, after the validity and reliability tests had been completed and after the students had taken the electronic-based English proficiency test.

Table 4.2 above reveals that the students' overall level of agreement with the questionnaire items on the development and usability of the electronic-based English proficiency test was high; in other words, they generally agreed with all of the items, with a total mean (x̅) score of 4.27 and SD of 0.27 (see Appendix N for the details of the item-by-item analysis of the students' responses).

Examining the mean score of each questionnaire item more thoroughly, every item showed a high level of students' agreement, with mean scores ranging from 3.93 to 4.48. The highest score was given to Item 1, "Operation system of the electronic-based English proficiency test is smooth and convenient." (x̅ = 4.48, SD = 0.58), whereas the lowest was given to Item 6, "It is difficult to cheat on the electronic-based English proficiency test." (x̅ = 3.93, SD = 0.84).
17 Exam interface design of the electronic-based
English proficiency test is appropriate. 4.30 0.69 Agree 18 Exam results interface design of the electronic-
based English proficiency test is appropriate. 4.24 0.73 Agree 19
It is hopeful that the electronic-based English proficiency test will be used in other courses of English.
4.18 0.72 Agree 20
Previous exam attempts interface of the electronic-based English proficiency test is friendly-user.
4.31 0.66 Agree 21 Page-by-page style of questions facilitates the
examinee in taking a test. 4.42 0.68 Agree
22 Overview interface of the electronic-based
English proficiency test is friendly-user. 4.33 0.75 Agree 23 Overview interface design of the electronic-
based English proficiency test is appropriate. 4.26 0.72 Agree
Total 4.27 0.27 Agree
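The mean-to-interpretation step can be made concrete with a small sketch. The actual cut-off bands are defined in Table 3.3 of the study and are not reproduced in this chapter; the bands below follow a common convention for 5-point Likert scales and are an assumption, not the study's criteria.

```python
# Hedged sketch of the mean-score interpretation step.
# The band boundaries below are ASSUMED (common 5-point Likert convention);
# the study's actual criteria are given in Table 3.3.
def interpret(mean_score):
    if mean_score >= 4.50:
        return "Strongly agree"
    if mean_score >= 3.50:
        return "Agree"
    if mean_score >= 2.50:
        return "Neutral"
    if mean_score >= 1.50:
        return "Disagree"
    return "Strongly disagree"

# All item means in Table 4.2 (3.93 to 4.48) fall in a single band:
levels = {interpret(m) for m in (3.93, 4.07, 4.27, 4.48)}
```

Under these assumed bands, every reported item mean maps to "Agree", which matches the interpretation column of Table 4.2.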
Looking at individual items more closely, the three items rated just below the highest were Item 21, "Page-by-page style of questions facilitates the examinee in taking a test." (x̅ = 4.42, SD = 0.68); Item 9, "The electronic-based English proficiency test is more modern than a paper-based test is." (x̅ = 4.40, SD = 0.67); and Item 11, "Registration process of the electronic-based English proficiency test is easy." (x̅ = 4.40, SD = 0.67).

Similarly, the three items rated just above the lowest were Item 7, "Browsing among web pages on the electronic-based English proficiency test is easy." (x̅ = 3.95, SD = 0.87); Item 13, "Immediate feedback on the electronic-based English proficiency test helps the examinee to reflect on his/her learning." (x̅ = 4.07, SD = 0.65); and Item 19, "It is hopeful that the electronic-based English proficiency test will be used in other courses of English." (x̅ = 4.18, SD = 0.72).
Furthermore, the mean (x̅) and standard deviation scores of the four categories of questionnaire items on the usability of the electronic-based English proficiency test for tertiary-level students (System Use, Learning Impact, User's Opinion, and Design of the Test) were also computed, analyzed, and interpreted according to the same statistical interpretation criteria for the mean score (see Table 3.3).

Tables 4.3, 4.4, 4.5, and 4.6 below show the means, standard deviations, and interpretation of the students' level of agreement for the questionnaire items in the System Use, Learning Impact, User's Opinion, and Design of the Test categories respectively.
Table 4.3 Quantitative data and interpretation of students' level of agreement with the questionnaire items in System Use category

Item 1. Operation system of the electronic-based English proficiency test is smooth and convenient. (x̅ = 4.48, SD = 0.58; Agree)
Item 7. Browsing among web pages on the electronic-based English proficiency test is easy. (x̅ = 3.95, SD = 0.87; Agree)
Item 10. The electronic-based English proficiency test is more systematic than a paper-based test is. (x̅ = 4.35, SD = 0.78; Agree)
Item 12. Log-in interface of the electronic-based English proficiency test is user-friendly. (x̅ = 4.38, SD = 0.71; Agree)
Item 14. Register interface of the electronic-based English proficiency test is user-friendly. (x̅ = 4.32, SD = 0.65; Agree)
Item 18. Exam results interface design of the electronic-based English proficiency test is appropriate. (x̅ = 4.24, SD = 0.73; Agree)
Total (x̅ = 4.30, SD = 0.75; Agree)

Regarding the questionnaire items in the System Use category (Items 1, 7, 10, 12, 14, and 18), the students' level of agreement was high; in other words, the students agreed with all of these items, with a total mean (x̅) score of 4.30 and SD of 0.75. As mentioned previously, Item 1, "Operation system of the electronic-based English proficiency test is smooth and convenient." was rated the highest among all 23 questionnaire items (x̅ = 4.48, SD = 0.58), whereas Item 7, "Browsing among web pages on the electronic-based English proficiency test is easy." was rated the lowest in this category (x̅ = 3.95, SD = 0.87).

Table 4.4 Quantitative data and interpretation of students' level of agreement with the questionnaire items in Learning Impact category

Item 2. Assessment on the electronic-based English proficiency test is fair. (x̅ = 4.20, SD = 0.62; Agree)
Item 13. Immediate feedback on the electronic-based English proficiency test helps the examinee to reflect on his/her learning. (x̅ = 4.07, SD = 0.65; Agree)
Item 19. It is hopeful that the electronic-based English proficiency test will be used in other courses of English. (x̅ = 4.18, SD = 0.72; Agree)
Total (x̅ = 4.18, SD = 0.67; Agree)
According to the questionnaire items in the Learning Impact category (Items 2, 13, and 19), the students' level of agreement was high (agree), with a total mean (x̅) score of 4.18 and SD of 0.67. Item 2, "Assessment on the electronic-based English proficiency test is fair." received the highest mean score within this category (x̅ = 4.20, SD = 0.62), while the other two items, Item 19, "It is hopeful that the electronic-based English proficiency test will be used in other courses of English." and Item 13, "Immediate feedback on the electronic-based English proficiency test helps the examinee to reflect on his/her learning." received lower mean scores of 4.18 (SD = 0.72) and 4.07 (SD = 0.65) respectively.
Table 4.5 Quantitative data and interpretation of students' level of agreement with the questionnaire items in User's Opinion category

Item 3. The electronic-based English proficiency test better lessens the examinee's anxiety than a paper-based test does. (x̅ = 4.31, SD = 0.83; Agree)
Item 6. It is difficult to cheat on the electronic-based English proficiency test. (x̅ = 3.93, SD = 0.84; Agree)
Item 8. The electronic-based English proficiency test is faster to complete than a paper-based test is. (x̅ = 4.30, SD = 0.81; Agree)
Item 9. The electronic-based English proficiency test is more modern than a paper-based test is. (x̅ = 4.40, SD = 0.67; Agree)
Total (x̅ = 4.25, SD = 0.81; Agree)

Investigating the data for the questionnaire items in the User's Opinion category (Items 3, 6, 8, and 9), the students' level of agreement was also high (agree), with a total mean (x̅) score of 4.25 and SD of 0.81. Item 9, "The electronic-based English proficiency test is more modern than a paper-based test is." received the highest mean score in this category (x̅ = 4.40, SD = 0.67). Interestingly, Item 6, "It is difficult to cheat on the electronic-based English proficiency test." was rated the lowest of all 23 questionnaire items (x̅ = 3.93, SD = 0.84), making its mean score also the lowest in this category.
Table 4.6 Quantitative data and interpretation of students' level of agreement with the questionnaire items in Design of the Test category

Item 4. All directions of the electronic-based English proficiency test are easy to follow without any confusion. (x̅ = 4.27, SD = 0.72; Agree)
Item 5. The design of the electronic-based English proficiency test is appropriate. (x̅ = 4.39, SD = 0.69; Agree)
Item 11. Registration process of the electronic-based English proficiency test is easy. (x̅ = 4.40, SD = 0.67; Agree)
Item 15. Seeing the timer on the electronic-based English proficiency test helps the examinee progress better. (x̅ = 4.23, SD = 0.78; Agree)
Item 16. Exam interface of the electronic-based English proficiency test is user-friendly. (x̅ = 4.31, SD = 0.76; Agree)
Item 17. Exam interface design of the electronic-based English proficiency test is appropriate. (x̅ = 4.30, SD = 0.69; Agree)
Item 20. Previous exam attempts interface of the electronic-based English proficiency test is user-friendly. (x̅ = 4.31, SD = 0.66; Agree)
Item 21. Page-by-page style of questions facilitates the examinee in taking a test. (x̅ = 4.42, SD = 0.68; Agree)
Item 22. Overview interface of the electronic-based English proficiency test is user-friendly. (x̅ = 4.33, SD = 0.75; Agree)
Item 23. Overview interface design of the electronic-based English proficiency test is appropriate. (x̅ = 4.26, SD = 0.72; Agree)
Total (x̅ = 4.33, SD = 0.71; Agree)

Considering the computed data for the questionnaire items in the Design of the Test category (Items 4, 5, 11, 15, 16, 17, 20, 21, 22, and 23), the students' level of agreement was also high; simply put, the students agreed with all of the questionnaire items in this category, with a total mean (x̅) score of 4.33 and SD of 0.71. Noticeably, Item 21, "Page-by-page style of questions facilitates the examinee in taking a test." received the highest mean score in this category (x̅ = 4.42, SD = 0.68), and Item 11, "Registration process of the electronic-based English proficiency test is easy." was rated the second highest (x̅ = 4.40, SD = 0.67). Item 15, "Seeing the timer on the electronic-based English proficiency test helps the examinee progress better." was also rated within the high level of students' agreement but with the lowest mean score in this category (x̅ = 4.23, SD = 0.78).
Up to this point, it can be noted that the total mean score of the Design of the Test category was the highest of the four categories mentioned above (x̅ = 4.33, SD = 0.71). The second highest total mean score belonged to the System Use category (x̅ = 4.30, SD = 0.75), followed by the User's Opinion category (x̅ = 4.25, SD = 0.81); the lowest was the Learning Impact category (x̅ = 4.18, SD = 0.67).
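The category ranking above can be reproduced directly from the reported totals. A minimal sketch, using only the total mean scores from Tables 4.3 to 4.6:

```python
# Ranking the four category totals reported in Tables 4.3-4.6
# (values taken from the chapter itself).
totals = {
    "Design of the Test": 4.33,
    "System Use": 4.30,
    "User's Opinion": 4.25,
    "Learning Impact": 4.18,
}
ranked = sorted(totals, key=totals.get, reverse=True)
```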
4.1.2 Data Analysis of Students’ Electronic-Based English Proficiency Test Scores
As mentioned in the previous chapter, the electronic-based English proficiency test items of this study had been validated by the experts and tested for difficulty, discrimination, and reliability. Each test item was cross-checked and examined by the researchers to ensure it received acceptable scores before being administered to the 355 students in the sample group. The test consisted of 60 questions worth 60 points, divided into two parts: 30 questions for the Reading part (30 points) and 30 questions for the Listening part (30 points).

All of the students took the electronic-based English proficiency test on the MOODLE platform at the same time as their classmates, using any device they had, such as a mobile phone or a laptop computer. Each student was given 80 minutes to complete and submit the test on the system. The students' total test scores, as well as their scores on the reading and listening parts, were collected and computed for the analysis (see Appendix O for individual students' test scores). Although the analysis of the students' mean scores and standard deviations was not used as the main data for achieving the research objectives of this study, it may serve as supplementary data that provides a clearer perspective on assessing the usability of the electronic-based English proficiency test. The analysis of the students' test mean scores and standard deviations appears in Table 4.7 below.
Table 4.7 Students' electronic-based English proficiency test scores

Test items    Mean     SD
Reading       19.93    5.63
Listening     22.31    5.18
Total         42.25    9.98

From the table above, it is noticeable that the students' total mean score on the electronic-based test was 42.25 (SD = 9.98). Examining each part of the test, the mean score of the Listening part (x̅ = 22.31, SD = 5.18) was slightly higher than that of the Reading part (x̅ = 19.93, SD = 5.63), a mean difference of only 2.38. Simply put, the students performed somewhat better on the listening part than on the reading part. This is illustrated in Figure 4.1 below.

Figure 4.1 Students' electronic-based English proficiency test scores
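The mean difference quoted above can be checked directly against the values in Table 4.7:

```python
# Reproducing the listening/reading mean difference from Table 4.7.
reading_mean, listening_mean = 19.93, 22.31
diff = round(listening_mean - reading_mean, 2)
```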