
The development of electronic-based English proficiency test for tertiary-level students

Nguyễn Gia Hào

Academic year: 2023


The objectives of this study were to develop an electronic test of English proficiency for tertiary students and to evaluate its usability in four categories: system usability, learning effect, user opinion, and test design. The results showed that the students in the sample group rated all items of the questionnaire on the development and usability of the electronic English proficiency test with a high level of agreement, with an overall mean (x̅) score of 4.27 and SD of 0.27. Interestingly, the students' level of agreement on the test design was the highest of the four categories, with an overall mean (x̅) score of 4.33 and SD of 0.71.


Introduction

Background and Rationale of the Study

The ordering of exam questions and the possibility of repeating the test several times add to the advantages of online assessments (Betlej, 2013). Traditional paper tests, by contrast, have several drawbacks: data and test results take time to check and announce; the test does not allow students to assess themselves whenever they are ready; and the test consumes a great deal of paper on top of administration time, which adds to the cost of administering it. In addition to its video and audio support, Moodle also allows document files to be submitted within the test itself, meaning that a wide range of software applications can be deployed as problem-solving tools within the same system.

Research Questions

Scope of the Study

In addition, three pilot study sessions were conducted to test the reliability, quality, and effectiveness of the electronic English proficiency test. This study involved the development of an electronic test of English proficiency aligned with the CEFR, using both quantitative and qualitative data collected with carefully designed research instruments. The electronic English proficiency test was the independent variable, while the development and usability of the electronic English proficiency test were the dependent variables.

Limitations of the Study

An electronic test refers to any form of test that requires examinees to use electronic tools such as computers, tablets or mobile phones. English proficiency refers to students' ability to perform certain tasks using the English language in given situations. A knowledge test refers to any test that is given to students to measure their level of knowledge.

Significance of the Study

Trying other applications or systems might yield different results and outcomes. This chapter presents concepts, models, and literature related to the study to provide its theoretical background. It also shares the results of other studies closely related to the one being conducted.

Assessment as an Education Process

Assessment in English Language Teaching

This is one of the popular forms of the TOEFL exam, which has completely replaced the computer-based test since 2006. These demanding standardized language tests have become more important since the introduction of the Common European Framework of Reference for Languages: Learning, Teaching, Assessment (CEFR). If a test result can be mapped onto one of the CEFR levels, it becomes clear what that result means and what test takers with at least that result are likely to be able to do.

Figure 2.1 Interpreting Results. From “Mapping the TOEIC tests on CEFR,” by ETS, 2021 (https://www.ets.org/s/toeic/pdf/toeic-cef-mapping-flyer.pdf)

Electronic-Based Assessment

One of the most essential parts of any LMS is the subsystem that enables electronic testing. Some of the LMSs that teachers and administrators rely on to manage electronic testing at the tertiary level of education are Blackboard, Canvas, Google Classroom, and Moodle. Among these, Moodle is the most popular open-source platform; it is free, extensible, and customizable (Gotarkar, 2017).

Moodle as a Platform for e-Testing

Calculated questions provide a way to create individual numerical questions through the use of wildcards that are replaced with individual values when the quiz is taken. Moodle also supports the Safe Exam Browser (SEB) security feature: students can only attempt the quiz if they use SEB, and the application must be downloaded and installed on the device the student uses to take the quiz.
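As a hypothetical illustration of the calculated-question mechanism described above — wildcards in a question template replaced with fresh values on each attempt — consider this sketch. The function name and `{a}`/`{b}` template syntax are illustrative only, not Moodle's internal API:

```python
import random

def render_calculated_question(template, answer_formula, rng):
    """Substitute the {a} and {b} wildcards with random values
    and compute the matching answer key for this attempt."""
    a = rng.randint(2, 9)
    b = rng.randint(2, 9)
    question = template.format(a=a, b=b)
    return question, answer_formula(a, b)

# Each attempt draws its own values, so no two students need see
# the same numbers even though the question template is shared.
question, key = render_calculated_question(
    "What is {a} * {b}?", lambda a, b: a * b, random.Random()
)
print(question, "->", key)
```

This is why calculated questions discourage answer sharing: the template is fixed, but the numeric key differs per attempt.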

Related Research and Studies

Students further pointed out that the Moodle platform enabled them to recognize their progress, which encouraged them to study more and focus on the subject. Most students were in favor of doing other assessment components on the Moodle platform, such as forums, blogs, online assignments, group cases and activities, and debates. The most attractive aspect of e-assessment was that it was free from human error, which made it more reliable for students and positively influenced ease of use and perceived efficiency.

Research Methodology

  • Research Design
  • Research Setting
  • Population and Sample of the Study
  • Research Instruments
  • Validity and Reliability of the Instruments
  • Data Collection Procedures
  • Data Analysis

5) 12 questions about the design of the test (example: “The design of the electronic English proficiency test is appropriate.”). Table 3.2 below explains the statistical interpretation of the mean scores of students' responses to the electronic English proficiency test questionnaire. For the quantitative analysis, students' scores on the electronic English proficiency tests and their questionnaire responses were calculated and analyzed.
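The quantitative analysis described above boils down to computing x̅ and SD per Likert item and mapping the mean onto an agreement band. A minimal sketch, assuming invented sample responses and a commonly used five-band interpretation scale (the actual cut-offs in the study's Table 3.2 may differ):

```python
import statistics

# Hypothetical 5-point Likert responses to one questionnaire item
responses = [5, 4, 4, 5, 4, 3, 5, 4]

mean = statistics.mean(responses)  # x̅
sd = statistics.stdev(responses)   # sample standard deviation

def interpret(m):
    """Map a mean score onto a conventional agreement band."""
    if m >= 4.21:
        return "highest agreement"
    if m >= 3.41:
        return "high agreement"
    if m >= 2.61:
        return "moderate agreement"
    if m >= 1.81:
        return "low agreement"
    return "lowest agreement"

print(f"x̅ = {mean:.2f}, SD = {sd:.2f}: {interpret(mean)}")
```

Running this per item, then over all items in a category, yields the category-level means and SDs reported in the results chapter.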

Figure 3.1 Illustration of Research Design Pilot 1

Research Results

Quantitative Data Analysis

All the instructions of the electronic English proficiency test are easy to follow without any confusion. Watching the timer in the electronic English proficiency test helps the examinee to make better progress. The electronic English proficiency test is quicker to complete than a paper-based test.

Table 4.1 Questionnaire items category

Qualitative Data Analysis

“I was able to pay better attention than when I took the test in the exam room, and I was not distracted during the electronic test.” (Interviewee 20). Some interviewees reported that the electronic language proficiency test helped them manage their time and complete the test efficiently. “I liked the listening part and the format of the test because I could do each part first or later.”

Some interviewees said they felt pressured by the time allotted for the test. Three interviewees expressed that they did not like the time given to them to take the test: “I did not like the time given to take the test, which was not enough, especially in the reading part.”

Interestingly, three interviewees expressed that there was nothing they disliked about the test at all; everything about this electronic-based English proficiency test was fine. The difficult part of the test was the internet connection, because it was out of the examinees' control. “It was convenient as we didn't have to do the test on paper or on campus.”

Furthermore, the interviewees seemed to concentrate better on the electronic test than on the paper test because they were alone during the test.

Conclusion

The above-mentioned mean (x̅) scores showed that the students' overall agreement with the questionnaire on the development and usability of the electronic language proficiency test was at a high level. More than half of those interviewed indicated that this electronic test was useful. This echoed questionnaire items such as “The electronic English proficiency test is more systematic than a paper test” and Item 18, “The design of the interface of the examination results of the electronic language proficiency test is appropriate.”

Questionnaire Item 1, “The operating system of the electronic language proficiency test is smooth and convenient,” and Item 7 were likewise supported. This theme of responses supported students' agreement with questionnaire Item 13, “Immediate feedback on the electronic English language test helps the examinee to reflect on his/her learning process.” Thus, this theme of responses also supported the questionnaire result for Item 3 in the User Opinion category: “The electronic language proficiency test reduces the examinee's anxiety better than a paper test.”

The respondents' answers about the safety of the electronic-based test were considered an advantage. This set of answers supported students' agreement with Questionnaire Item 6, “It is difficult to cheat on the electronic-based English proficiency test.” Some interviewees explained that one of the advantages of the electronic-based English proficiency test was the compatibility of the test format.

The item “The clear interface design of the electronic English proficiency test is adequate” also falls in the Test Design category.

Discussion

These results are supported by the focus group interview findings, in that all interviewees preferred the electronic language proficiency test over the traditional paper-based test. Simply put, students' responses to the questionnaire and focus group interview showed corresponding positivity about the development and usability of the electronic test. This was consistent with Kundu and Bej's (2021) electronic assessment study, which also found a positive impact on students' perceptions.

These corresponded to interviewees' positive responses to the focus group interview regarding their satisfaction and preference for the modern, convenient, and user-friendly system of the electronic-based test. It is clear that the interviewees' focus group interview responses supported the results on the electronic-based test's browsing difficulty. This was evidently supported by the highly rated agreement result from questionnaire Item 3, "The electronic-based English proficiency test reduces the examinee's anxiety better than a paper-based test does," with a mean score (x̅) of 4.31 and SD of 0.83.

In addition, some interviewees stated that the electronic test helped them manage time and complete the test more efficiently. This corresponded with the high level of students' agreement with questionnaire Item 8, "The electronic test of English proficiency is completed faster than the paper test." When asked what should be improved in the electronic English test, only a few interviewees commented that the test duration should be longer and that a more complex password should be required to access the test system.

Surprisingly, more than a quarter of the interviewees felt that nothing needed to be improved in the electronic English test.

Practical Implications

Being able to take the electronic-based test anytime and anywhere, students seem to experience less anxiety, pressure, and stress than when they take the paper-based test. Up to this point, it is clear that the advantages of the electronic-based test are welcomed not only by students but also by the teachers who act as examiners. Regarding security, the electronic-based test makes it convenient for teachers/examiners to monitor students/examinees during the test.

In addition, with certain types of applications, the electronic-based test can be recorded. Examiners are likely to mark the test papers and release the results faster with the electronic test. Electronic-based testing also offers teachers/examiners greater security than paper-based testing, as all questions, student information, and test results are stored electronically.

Given the significant advantages of electronic English proficiency testing in this study, English teachers/examiners at the tertiary level should introduce and offer students/test takers this more modern, innovative type of proficiency testing. The benefits of the electronic test are welcomed not only by students/test takers and teachers/examiners but also by administrators in educational organizations. Administrators should adopt this electronic test at all levels in their institutions.

Administrators should promote and support electronic-based English language proficiency testing, which may result in institutions being able to conduct English tests more frequently and with a wider geographic reach.

Recommendations for Further Study

It was convenient as I didn't have to go to the university to take the test. I had a better attention span than when taking the test in the exam hall and was not distracted while taking the electronic test. It was more convenient than the paper-based test as we could take the test anywhere.

However, I wanted more time to complete the test, especially for the listening part with videos.

Illustration of the independent variable and dependent variables

Interpreting Results. Mapping the TOEIC tests on CEFR

Comparison Tables. Comparing Scores

Comparing IELTS and the Common European Framework

Illustration of Research Design

Students’ electronic-based English proficiency test scores

Figures

Figure 1.1 Illustration of the independent variable and dependent variables
Figure 2.1 Interpreting Results. From “Mapping the TOEIC tests on CEFR,” by ETS, 2021 (https://www.ets.org/s/toeic/pdf/toeic-cef-mapping-flyer.pdf)
Figure 2.2 Comparison Tables. From “Comparing Scores,” by ETS TOEFL, 2021 (https://www.ets.org/toefl/score-users/scores-admissions/compare/)
