

4.1 The Computer Spreadsheet Exercises

4.1.7 Spreadsheet Assignment 1 - Processes and Closed Cycle Analysis

As this was very likely the first time students had faced assessing their peers, ideally, as highlighted in Chapter 2.5.1.2, they should have performed some formative peer assessment as training. However, due to the time constraints of the project this was not possible. As the rubrics, discussed in Chapter 4.1.5, had been up in the computer laboratories for several weeks, students had had adequate time to study and utilise them if they wished. Before the first assessment took place the Researcher spent some time in class going over the rubric to explain what students were to do for the exercise. The assessor teams were to assess their allocated file as a whole, using the assessment table, graded on the Likert scale, mentioned in Chapter 4.1.5.

They also had to write down their comments in the spaces provided, stating the assignment’s good points and areas for improvement.

Only three rubrics did not have written comments in the spaces provided. The student comments varied widely; those for Assignment 1 are reproduced verbatim from each rubric form in Appendix J, with the staff assessment comments tabulated below them and cross-referenced to the students’ comments. Appendix J also includes a comment on the final outcome of the graphs in each spreadsheet. Students’ comments for Assignment 2 appear in the table in Appendix K.

4.1.7.2 Analysis of the Answers and the Graphs for Assignment 1

Of the 39 assessed assignments, only eight answers were recorded on the rubrics, as seen in Appendix E. None of these answers was correct. Twenty-six rubrics had graphs drawn on them by the student assessors, taken from the spreadsheet they marked, indicating that the graphs were included in the spreadsheets as required by the assignment. Of the solution sketches drawn on the marking rubrics, only two were graphically correct in concept, as seen in Appendix Q, where each student sketch is compared with the scaled model solution illustrated next to it.

Analysing the first sketch in more detail: since no scale was included on it, no further interpretation of its correctness could be ascertained from the rubric alone. However, its shape, according to the cycle requirements, was appropriate to that particular problem question, PROBLEM#4 in Appendix M. When the assessed file itself was opened, the correct PV diagram shape was found to be present, as seen in Graph 4.1 below.

GRAPH 4.1: Assessed student graph taken from Excel file

It was also noted that the graph illustrated was not calculated automatically using the process formulas, as it was supposed to be, but that each plotted point had been worked out manually and entered into a table, as seen in Table 4.5 below. However, the information used to draw the graph in Table 4.5 was not as per the question given on the rubric, PROBLEM#4, seen in Appendix M.

Table 4.5: Data used to draw Graph 4.1 above

Pressure   Volume
551.325    0.002
428.45     0.0025
348.678    0.003
292.937    0.0035
251.325    0.004
287.229    0.0035
335.1      0.003
402.12     0.0025
502.65     0.002
510.65     0.002
520.65     0.002
530.25     0.002
540.65     0.002
551.325    0.002

The graph is therefore incorrect although conceptually it has the same shape as the solution to PROBLEM#4, seen in Appendix N.
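As an aside, the dynamic behaviour the assignment asked for is straightforward to express computationally. The following is a minimal sketch, in Python rather than spreadsheet formulas, of generating PV points from a process law so that the plot updates whenever a new problem's data is entered. The start state matches the first row of Table 4.5, but the polytropic index and end volume are hypothetical and are not those of PROBLEM#4.

```python
import numpy as np

# A minimal sketch of generating PV points from a process formula rather
# than typing each point in by hand, so the chart stays dynamic when a
# new problem is entered. The index n and end volume are hypothetical.
p1, v1 = 551.325, 0.002   # start state: pressure, volume (first row of Table 4.5)
v2 = 0.004                # end volume of the process (hypothetical)
n = 1.3                   # polytropic index in p * v**n = constant (hypothetical)

volumes = np.linspace(v1, v2, 10)
pressures = p1 * (v1 / volumes) ** n   # p = p1 * (v1/v)**n along the process

for v, p in zip(volumes, pressures):
    print(f"p = {p:8.3f} at v = {v:.4f}")
```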

For the second sketch in Appendix Q, relating to PROBLEM#2, it could not be confirmed whether the diagram, although conceptually correct, was indeed a true correct answer. This was because the files associated with it were not available from the backups, most likely lost, one of the problems described in Chapter 4.1.1.

The fact that no graphs were correct was cause for concern. However, several factors could have contributed to this, such as:

the file was not available to assess, or was locked, on the required day: one group reported that the file could not be located and three groups reported locked files

the graph was incorrectly displayed: twenty rubrics showed partially or incorrectly formed graphs

the graph was not in the spreadsheet at all: ten rubrics did not have graphs drawn on them and six groups specified that there was no graph in the spreadsheet


incorrectly input or calculated values were used to draw the graph

the graph was not automatically updated as new problems were entered into the spreadsheet, hence the original samples used as practice exercises were still generating the original practice graph (i.e. it was not dynamic).

the exercise was too difficult or too long

Other reasons may also have applied, as indicated by some of the other comments students placed on the rubrics. However, there was no mark allocated to the graph itself, nor to the answer, as mentioned in Chapter 4.1.6. The comments the students made on the assessment forms, followed by the staff assessor’s comments, appear in Appendix J, as mentioned in Chapter 4.1.7.1. Although it was probably the first time students had assessed anything, peer assessment or otherwise, some groups obviously interrogated the rubrics fairly carefully. One of the rubric’s Likert scale choices was associated with the graph: whether it appeared in the spreadsheet and whether it updated itself in real time as new data was added. This was the only component that gave the graph a mark. Reading through the forms, one can see that several student assessor groups commented on the graphs: their correctness, their interactivity (i.e. the ability to update when new information is presented), or simply the lack of one, as mentioned above.

In assessing some of the items in the other criteria, students also referred to the keywords used in the ‘valid evidence’ column of the rubric table. This showed that they were at least attempting to consider key items in the spreadsheet assessments, and indicates that the assessment had at least face validity, in that it attempted to assess items related to the outcomes for the subject.

However, from a reliability perspective, if one compares the comments made by the students with those of the moderator (moderation is discussed in Chapter 4.1.7.3), the two do not appear to reach consensus.

Moreover, during the assessment exercise it was observed that quite a number of groups were attempting to solve the problems manually themselves whilst trying to assess the spreadsheets. This was despite their being told that this was not required for the assessment exercise. The Researcher and his assistant had the solutions to all the problems with them, and students were told at the start that these could be viewed at any time to confirm the answers in the spreadsheet’s solution; it was the spreadsheet itself that was supposed to be solving the problem. To the Researcher’s knowledge, no students approached the Researcher or his assistant to check the answers during the assessment session. This may have been due to the students not listening closely to the instructions beforehand, but simply getting on with the task so that they could leave when finished, as many did not stay after assessing even though they still had plenty of time to work on Assignment 2.

Again, the problem that the assessor groups often had was locating the files they needed to assess, because the server was continuously disrupted by viruses and kept hanging, having to be rebooted on several occasions, as described in Chapter 4.1.4. Students thus started to get restless during the exercise as much valuable time was wasted, some leaving as mentioned earlier. Some time was also devoted to running around trying to find the originators of the files to unlock them before they could be assessed. During this time it was noted that some students were running around trying to find their assessor groups; when questioned about this they said that they were “worried that the assessors were not going to assess them fairly”. They were told to go and do the task allocated to them, i.e. assess the file they were supposed to.

Just as Mindham (1998, p.50), as discussed in Chapter 2.6.2, mentions the inability of first-time assessors to perform the task appropriately, it would appear that students do not trust other students to do the task either.

Another problem arose when file names were not correctly recorded according to the instructions given on the rubric. The assessor group was instructed to save the assessed file with their assigned group code, together with the file extension “.ASS”, as seen in Appendix E. Only nine groups appear to have done so on the assessment day. This problem could have been partly alleviated if the Researcher had written the required file names into the allocated area on the assessment rubric sheet beforehand.

4.1.7.3 Moderation

Eleven of the thirty-nine peer assessed assignments were moderated, representing 28,2%. Although the Researcher had originally indicated on the rubric that only 10% would be moderated, this would have totalled only four assignments, which would not have generated a large enough sample from which to obtain a valid moderation weighting factor.

The assignments to be moderated were specifically chosen after an initial evaluation by the Researcher of the returned rubrics, together with information provided by verbal feedback from the students on assessment day. The assignments chosen for moderation were those with very high or low marks and those where very diverse comments had been written on the rubrics.

As it was the students’ first attempt at assessing their peers, if there was a large difference between the peer and moderator’s assessment marks, the final mark for that assignment became the average of the two marks. Guided by the DIT rule specifying that a student whose class mark differs from his examination mark by more than 20% (DIT, 2006, p.30) is automatically eligible for a rewrite, a cut-off point of 25%, slightly higher than the DIT guideline, was chosen.

This was done in order to achieve a more realistic and fair mark where groups may have been either too lenient or too strict in their marking, whilst still keeping the students’ evaluation of their peers in the marking loop. This will be called the adjustment factor. This adjustment factor was used in seven of the eleven moderated assignments, two going up and five going down.

Besides the adjustment factor, a moderation weighting factor was also calculated by dividing the average of the moderated marks by the average of the peer assessed marks. The factor obtained by this process was 0,9664, indicating that the peer assessed marks were generally slightly higher than the moderated marks. All the final Assignment 1 percentage marks were adjusted by multiplying them by this factor, which lowered them slightly.
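To make the two corrections concrete, the following is a minimal sketch of the procedure as described above, with hypothetical group marks; only the rules (averaging where peer and moderator differ by more than 25%, then scaling all marks by the ratio of the moderated to the peer averages) follow the text.

```python
# Minimal sketch of the two moderation corrections described above.
# The marks below are hypothetical; in the study only 11 of the 39
# assignments were moderated.
peer = {"G01": 82.0, "G07": 35.0, "G12": 66.0}       # peer-assessed marks (%)
moderated = {"G01": 54.0, "G07": 52.0, "G12": 63.0}  # moderator's marks (%)

# Adjustment factor: where peer and moderator differ by more than 25%,
# the final mark becomes the average of the two marks.
adjusted = {
    g: (peer[g] + moderated[g]) / 2 if abs(peer[g] - moderated[g]) > 25 else peer[g]
    for g in peer
}

# Moderation weighting factor: average moderated mark divided by average
# peer-assessed mark, then applied to every final percentage mark.
weighting = (sum(moderated.values()) / len(moderated)) / (
    sum(peer.values()) / len(peer)
)
final = {g: round(m * weighting, 1) for g, m in adjusted.items()}
print(round(weighting, 4), final)
```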

The justification for making both these adjustments can be seen by comparing the Ogive curves in Graph 4.2 of the peer assessed assignments before and after the moderation weighting factor and adjustment factor were applied. The curve of the adjusted marks is closer to the characteristic S-shaped curve of a normal Ogive graph; ogiving is an accepted method of normalising marks.

GRAPH 4.2: Comparison of Ogive curves before and after moderation for Assignment 1 (Intervals vs Cumulative Frequency (%); series: peer assessment before moderation, peer assessment after moderation)
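For reference, an Ogive of this kind is simply a cumulative frequency plot over class intervals. The sketch below, using hypothetical marks and the weighting factor quoted above, shows how such a comparison could be reproduced; the mark values are illustrative only.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical percentage marks; the study used 39 group marks.
before = np.array([45, 52, 58, 61, 64, 66, 69, 72, 75, 81, 88, 93])
after = before * 0.9664  # moderation weighting factor from the text

def ogive(marks, bins=10):
    """Cumulative frequency (%) per class interval, as in Graph 4.2."""
    counts, edges = np.histogram(marks, bins=bins, range=(0, 100))
    return edges[1:], np.cumsum(counts) / counts.sum() * 100

for label, marks in (("before", before), ("after", after)):
    x, y = ogive(marks)
    plt.plot(x, y, marker="o", label=f"peer assessment {label} moderation")
plt.xlabel("Intervals")
plt.ylabel("Cumulative Frequency (%)")
plt.legend()
plt.show()
```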

A further statistical analysis of the marks, using Quattro Pro version 9, was also undertaken to evaluate the effect of the moderation exercise. Table 4.6(a) shows the statistical data of the students’ marks before and after the factors were applied.

Table 4.6(a): Statistical Analysis of Moderation of Assignment 1

Statistical data                     | Students’ generated marks | Moderator’s marks | Marks after moderation weighting factor applied | Marks after moderation weighting factor and adjustment factor applied
Readings, n                          | 39   | 11   | 39   | 39
Average, µ (%)                       | 67,1 | 64,8 | 64,8 | 64,1
Population standard deviation, σ (%) | 20,0 | –    | 19,3 | 16,1
Sample standard deviation, s (%)     | –    | 16,2 | –    | –
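As a side note on the two deviation rows: the moderator’s column is based on 11 of the 39 assignments, i.e. a sample, which is presumably why it is the only column reporting the sample form of the statistic. A minimal sketch of the distinction, on hypothetical marks:

```python
import statistics

# Hypothetical marks illustrating the two forms of standard deviation
# reported in Table 4.6(a).
all_marks = [45, 52, 58, 61, 64, 66, 69, 72, 75, 81, 88, 93]
moderated_sample = all_marks[:6]  # a subset, standing in for the 11 moderated

print(statistics.pstdev(all_marks))        # population sigma: divides by n
print(statistics.stdev(moderated_sample))  # sample s: divides by n - 1
```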

Walpole and Meyers (1978, p.513) indicate that in a standard normal distribution curve the following applies:

68,3% of the population should lie within one standard deviation on either side of the mean,

95,4% of the population should lie within two standard deviations on either side of the mean,

99,7% of the population should lie within three standard deviations on either side of the mean.

Table 4.6(b) shows the predicted and actual numbers of groups that fall within these ranges, and their respective percentages, before and after the factors were applied. From this table it can be deduced that applying the moderation weighting factor alone did not change the distribution, as seen by comparing the middle two columns of the table. Including the adjustment factor as well had a slightly bigger impact: it lowered the number in the first standard deviation interval but brought the numbers in the second interval more into line with the expected value, while the numbers in the third interval remained the same throughout.

Table 4.6(b): Predicted and actual numbers of groups in normal intervals

Normal intervals (predicted value in parentheses) | Number of groups after students’ marking | Number of groups after moderation weighting factor | Number of groups after moderation weighting and adjustment factors
mean ± σ (68,3%)  | 26 (66,7%) | 26 (66,7%) | 25 (64,1%)
mean ± 2σ (95,4%) | 36 (92,3%) | 36 (92,3%) | 37 (94,9%)
mean ± 3σ (99,7%) | 39 (100%)  | 39 (100%)  | 39 (100%)
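The check behind Table 4.6(b) is mechanical and can be sketched as follows; the marks here are hypothetical stand-ins for the 39 group marks.

```python
import statistics

# Count how many marks fall within 1, 2 and 3 standard deviations of the
# mean and compare with the normal-distribution predictions used above.
marks = [45, 52, 58, 61, 64, 66, 69, 72, 75, 81, 88, 93]  # hypothetical
mu = statistics.mean(marks)
sigma = statistics.pstdev(marks)

for k, predicted in ((1, 68.3), (2, 95.4), (3, 99.7)):
    inside = sum(1 for m in marks if abs(m - mu) <= k * sigma)
    print(f"mean ± {k}σ: {inside}/{len(marks)} "
          f"({100 * inside / len(marks):.1f}%), predicted {predicted}%")
```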

4.1.8 Spreadsheet Assignment 2 - Non-flow and Steady-flow Energy Problems and