SOUTHEAST ASIAN MATHEMATICS EDUCATION JOURNAL

VOLUME 4, NO. 4, DECEMBER 2014 ISSN 2089-4716

Southeast Asian Ministers of Education Organization (SEAMEO)

Regional Centre for Quality Improvement of Teachers and Education Personnel (QITEP) in Mathematics, situated in Yogyakarta, Indonesia


Southeast Asian Mathematics Education Journal – SEAMEJ

Description of South East Asian Mathematics Education Journal (SEAMEJ)

South East Asian Mathematics Education Journal (SEAMEJ) is an academic journal devoted to reflecting a variety of research ideas and methods in the field of mathematics education. SEAMEJ aims to stimulate discussions at all levels of mathematics education through significant and innovative research studies. The Journal welcomes articles highlighting empirical as well as theoretical research studies, which have a perspective wider than local or national interest. All contributions are peer reviewed.

SEAMEO QITEP in Mathematics aims to publish SEAMEJ twice a year, in June and December.

Contact Information: Professor Dr. Subanar

The Director of SEAMEO QITEP In Mathematics Jl. Kaliurang Km. 6, Sambisari, Condongcatur, Depok

Sleman, Yogyakarta, Indonesia. Phone: +62(274)889987

Fax: +62(274)887222

Email: secretariat@qitepinmath.org


Editorial Board Members

Wahyudi SEAMEO QITEP in Mathematics

Widodo PPPPTK Matematika Yogyakarta

Ganung Anggraeni PPPPTK Matematika Yogyakarta

Fadjar Shadiq SEAMEO QITEP in Mathematics

Pujiati SEAMEO QITEP in Mathematics

Sahid SEAMEO QITEP in Mathematics

Anna Tri Lestari SEAMEO QITEP in Mathematics

Punang Amaripuja SEAMEO QITEP in Mathematics

Manuscript Editors

Sriyanti SEAMEO QITEP in Mathematics

Siti Khamimah SEAMEO QITEP in Mathematics

Rachma Noviani SEAMEO QITEP in Mathematics

Marfuah SEAMEO QITEP in Mathematics

Luqmanul Hakim SEAMEO QITEP in Mathematics

Mutiatul Hasanah SEAMEO QITEP in Mathematics

Tri Budi Wijayanto SEAMEO QITEP in Mathematics

Wahyu Kharina Praptiwi SEAMEO QITEP in Mathematics

Administrative Assistants

Rina Kusumayanti SEAMEO QITEP in Mathematics

Rini Handayani SEAMEO QITEP in Mathematics

Cover Design & Typesetting

Joko Setiyono SEAMEO QITEP in Mathematics

Suhananto SEAMEO QITEP in Mathematics

Eko Nugroho SEAMEO QITEP in Mathematics

Tika Setyawati SEAMEO QITEP in Mathematics

Febriarto Cahyo Nugroho SEAMEO QITEP in Mathematics

International Advisory Panels

Professor Mohan Chinnappan University of Adelaide, Australia

Professor Philip Clarkson Australian Catholic University, Melbourne

Professor Lim Chap Sam Universiti Sains Malaysia

Dr Ui Hock SEAMEO RECSAM Malaysia

Chair

Subanar SEAMEO QITEP in Mathematics

Chief Editor


CONTENTS

Allan Leslie White & Wahyudi 1 Editorial

Joseph B. W. YEO 3 Mathematical Investigation Proficiency among Singapore Secondary School Students: An Exploratory Study

Catherine Attard 23 Engagement, Technology, and Mathematics: Students’ Perceptions

Dorian Stoilescu 35 Exploring Challenges in Integrating ICT in Secondary Mathematics with TPACK

Allan Leslie White 57 Juggling Mathematical Understanding


Editorial

The South East Asian Mathematics Education Journal (SEAMEJ) is an academic journal devoted to publishing a variety of research studies and theoretical papers in the field of

mathematics education. SEAMEJ seeks to stimulate discussion at all levels of the mathematics education community. SEAMEJ aims to eventually publish an edition twice a

year, in June and December.

SEAMEJ is supported by the Southeast Asian Ministers of Education Organization (SEAMEO), Centre for Quality Improvement of Teachers and Education Personnel (QITEP)

in Mathematics, situated in Yogyakarta, Indonesia. Since their launch on July 13, 2009, there have been three SEAMEO QITEP Centres for Quality Improvement of Teachers and Education Personnel in Indonesia: one in Mathematics (Yogyakarta), one in Science (Bandung)

and one in Languages (English - Jakarta).

In this issue we present papers that were subjected to a blind review process by the International Review Panel and we are thankful to the panel for their continued support. This

collection of papers reflects the diversity of contributors, with papers originating from research conducted in Singapore, Canada and Australia. The first paper presents an exploratory study of mathematical investigation proficiency among Singapore secondary school students. This is followed by a paper reporting the results from two Australian studies into the relationship between the ways in which technologies are incorporated into mathematics lessons, and teacher experience and expertise. The third paper describes the challenges encountered by three experienced Canadian secondary mathematics teachers when they tried to integrate ICT into their classrooms. The final paper presents a theoretical model for the teaching of school mathematics for understanding.

As we are still refining our processes, we wish to apologise if we have made errors or omissions. We welcome feedback and suggestions for improvement, but most of all, we

welcome paper contributions.

The Journal seeks articles highlighting empirical as well as theoretical research studies, particularly those that have a perspective wider than local or national interest. All contributions to SEAMEJ will be peer reviewed, and we are indebted to those on the International Advisory Panel for their support.

Dr Wahyudi


Mathematical Investigation Proficiency among Singapore Secondary

School Students: An Exploratory Study

Joseph B. W. YEO

National Institute of Education, Nanyang Technological University, Singapore <josephbw.yeo@nie.edu.sg>

Abstract

This article presents an exploratory study to find out whether high-ability secondary school students in Singapore were able to deal with open mathematical investigative tasks. A class of Secondary One (or Grade 7) students, who had no prior experience with this kind of investigation, were given a paper-and-pencil test consisting of four open tasks. The results show that these students did not even know how to begin, despite sample questions being given in the first two tasks to guide and help them pose their own problems. The main difficulty was the inability to understand the task requirement: what does it mean to investigate? Another issue was the difference between searching for any patterns without a specific problem to solve, and searching for patterns to solve a given problem. The implications of these findings on teaching and on research methodologies that rely on paper-and-pencil test instruments will also be discussed.

Keywords: Mathematical investigation; open investigative tasks; thinking processes; research methodologies

Background

Mathematical problem solving is at the heart of the Singapore school mathematics curriculum and it includes “a wide range of situations from routine mathematical problems to problems in unfamiliar context and open-ended investigations” (Ministry of Education of

Singapore, 2000, p. 10). However, most mathematics teachers in Singapore are only familiar

with solving mathematical problems and do not know much about mathematical

investigation. Whenever I talked to teachers about mathematical investigation, many of them would look at me blankly and ask, “What is that?” (Yeo, 2008)

There are important thinking processes that can only be developed more fully when

students engage in mathematical investigation (see next section). If teachers do not know

what an investigation is, then they will not be able to develop these processes in their students

(Frobisher, 1994). Thus I embarked on a research study to investigate the current proficiency

in open investigation among secondary school students and to develop a teaching experiment

to help students improve on their competency in investigation so that teachers can adopt a

similar approach for their students. The original plan was to devise a paper-and-pencil test to

study the nature of processes in open investigation and the current proficiency among a large


data collected to investigate the development of these processes among a small sample of

about 30 students (in depth).

The first problem was the construction of a suitable test. I conducted a small study

with a convenience sample of pre-service teachers to see if they could deal with open investigative tasks that contained no specific problems but ended with the word ‘investigate’

understand the task requirement: what does the task want them to investigate? The

implication was that most secondary school students might also have difficulty starting. Thus

there was a need to modify the test instrument. If the students were given specific problems,

then this would not be open investigation. So I thought of the idea of giving guided questions

for the first two tasks and then no scaffolding for the last two tasks in the hope that students

would know how and what to investigate for the last two open tasks. The first task with the

guided question of finding as many patterns as possible was tried out on the same sample of

pre-service teachers and this time all of them knew what to do.

In this article, I will outline an exploratory study where a class of high-ability students

were given this test instrument in which the outcome was very surprising because most of the

students could not even start on the first guided task! It revealed an important difference

between a written test for open investigation and for problem solving, which has some serious

implications for future research in this area. The study also shed some light on the current

proficiency of Singapore secondary school students in open investigation.

Literature Review

Lee and Miller (1997) believed that “investigations, by their very nature, demand an open-minded, multifaceted approach” (p. 6) and Bailey (2007) asserted that a mathematical investigation is “an open-ended problem or statement that lends itself to the possibility of multiple mathematical pathways being explored, leading to a variety of mathematical ideas and/or solutions” (p. 103). Orton and Frobisher (1996) contended that mathematical investigation must be open in the sense that it should not contain any specified problems for

students to solve, and so mathematical investigation involves both problem posing and

problem solving (Cai & Cifarelli, 2005). Although HMI (1985) stipulated that there is no


separating them into two distinct activities: investigation is divergent while problem solving is convergent. Ernest (1991) also described problem solving as “trail-blazing to a desired location” (p. 285), but investigation as the exploration of an unknown land. However, Evans (1987) argued that if a student does not know what to do when faced with an investigation,

then the investigation is still a problem to the student. Pirie (1987) alleged that no fruitful service would be performed by indulging in the ‘investigation’ versus ‘problem solving’ dispute, but I agree with Frobisher (1994) that this was a crucial issue which would affect

how and what teachers teach their students.

Yeo and Yeap (2010) attempted to resolve this issue by suggesting that the term ‘mathematical investigation’ has different meanings. Just as Christiansen and Walther (1986) distinguished between a task and an activity, Yeo and Yeap (2010) asserted that investigation

can refer to the task, the process or the activity. For example, an open investigative task is the

Mathematical Investigative Task 3 shown in Figure 1 in the next section where no specified

problems are given in the task statement and so the task is open for students to pose any

problems to investigate; while a closed problem-solving task is the specific problem posed in

part (a) of the Mathematical Investigative Task 2 shown in the same figure. When students

attempt this kind of open investigative tasks, they are engaged in an open investigative

activity, and this is consistent with the distinction made by Christiansen and Walther (1986)

between a task and an activity. Thus an open investigative activity will involve posing and

solving problems.

Yeo and Yeap (2010) also contended that posing a problem to investigate means that

the student has not started the investigation, so there is a difference between investigation as an activity and investigation as a process. An analogy is Pólya’s (1957) four stages of problem solving for closed problems or what is called problem-solving tasks in this article.

During the first stage, the problem solver should try to understand the problem before

attempting to devise a plan to solve it in the second stage. The third stage is carrying out the

plan while the fourth stage is looking back after solving the problem. Another analogy is

cooking. In order to cook (viewed as an activity), you need to prepare the ingredients before

the actual cooking process, and you need to scoop up the food onto a serving dish after

cooking. Similarly, in an open investigative activity, the students need to understand the task

and pose a problem to investigate before engaging in the actual process of investigation. After


investigate. Yeo and Yeap (2010) observed that the process of investigation usually involves

examining empirical data or specific cases, formulating and justifying conjectures, and

generalising. But these are the four core mathematical thinking processes advocated by

Mason, Burton and Stacey (1985) for solving closed problem-solving tasks. Hence, Yeo and

Yeap (2010) argued that investigation as a process can also occur when solving closed

problem-solving tasks. It is noteworthy that Jaworski (1994) also viewed an investigative

approach to mathematics teaching as involving specialising, conjecturing, justifying and generalising when she observed that “making and justifying conjectures was common to all

cases and simplified situations” (p. 171).

However, there are some processes that students can develop more extensively in

open investigation than in closed problem-solving. For example, problem posing is one of the

primary foci of open investigation because students cannot start to investigate unless they

know how to pose a suitable problem, but students can solve a problem-solving task without

posing any problem. Another example is the difference, as suggested by the present study,

between searching for any patterns in an open investigation where there is no specific

problem to solve and searching for patterns to solve a problem-solving task.

Presently, there are a lot more research studies on problem solving than on open

investigation. But many of the latter (e.g. Ng, 2003; Tanner, 1989) only dealt with the general

benefits of investigation without examining in detail the actual thinking processes that

students engage in when they do investigation. Therefore, there is a big research gap in

studying the processes in open investigation, even though open investigation is emphasised in

many school curricula, e.g. the Cockcroft Report in the United Kingdom (Cockcroft, 1982)

and the Australian national curriculum (Australian Education Council, 1991). Hence, there is

a need for more research in this area, and the present exploratory study is part of a bigger

research project to investigate the nature and development of processes in open investigation


Methodology

It was decided that the sample should consist of high-ability students with no prior

experience in this kind of open investigation as I needed the high-ability students to provide

me with a wider range of possible thinking processes to plan for future research in this area.

Thus a convenient intact class of 29 high-ability Secondary One (or Grade 7) students from a

top secondary school in Singapore was selected. It is important to note that these students did

not belong to the Gifted Education Programme (GEP) because GEP students were more likely to have encountered this type of investigation. This was further confirmed by a written survey

conducted at the end of the paper-and-pencil test where all the students in the sample wrote

that they had not seen this kind of open investigative tasks before. The written survey also asked for the students’ Primary School Leaving Examinations (PSLE) score. Twenty of the 29 students (or 69%) had a PSLE score of 253 to 255 (i.e. they were in the top 1% of the

entire primary school cohort in Singapore), five students scored between 242 and 249, and

three students scored between 230 and 239. The last student did not write his score.

The written test consisted of four investigative tasks (see Fig. 1) to be completed

individually within one hour and thirty minutes. As explained in the first section, the first two

tasks had guiding questions while the last two tasks were more open. The first and third tasks

are on number patterns, while the second and fourth tasks are ‘word problems’ (i.e. problems

with a context) that have been made more open by asking students to investigate further. The


Figure 1: Mathematical Investigative Tasks for Test Instrument

Two types of scoring rubrics were developed: one to analyse how and the other to

analyse what students investigated. The rubrics were task-specific, so there were two rubrics

for each task, but the descriptors for each type of rubric were the same, except for the


were five levels. For example, in Level 1H0, the first number refers to the task number, the

letter H stands for How and the last number refers to the level. Although some students did

evaluate some powers of 9, they either did nothing after that or they could not even find the

simplest patterns, thus suggesting that they did not know how to investigate and so they

should be classified under Level 1H1. This is different from students who knew how to

investigate by evaluating powers of 9 and finding some patterns, which is Level 1H2. If

students did not realise that the patterns that they have discovered are just conjectures, then

they will be placed under Level 1H2. If they recognised that the observed patterns are

conjectures but did not test them, they will be classified under Level 1H3. The highest level

1H4 is reserved for students who try to test conjectures even if they are not able to prove or refute the conjectures.

Table 1
A Scoring Rubric to Describe How Students Investigated Task 1

1H4 (0 students, 0%): Students knew how to investigate by testing conjectures and generalising.

1H3 (0 students, 0%): Students knew how to investigate by formulating conjectures, i.e. they recognised that observed patterns were just conjectures and might not be true.

1H2 (14 students, 48%): Students knew how to investigate by examining empirical data, e.g. evaluating the numerical values of some powers.

1H1 (9 students, 31%): Students did not know how to investigate but did something. For example:
- Students just wrote superficial or irrelevant things.
- Students just evaluated some powers and then did nothing or could not even find the simplest patterns.

1H0 (6 students, 21%): Students did not write anything.

Total: 29 students (100%)

Table 2 shows the second rubric which described what students investigated in the

first task. There were also five levels. If students investigated or found trivial patterns such as ‘powers of 9 are divisible by 9’, they will be placed in Level 1W1 because this shows that they did not know what to investigate. Levels 1W2 to 1W4 are for different complexities of

non-trivial patterns which students investigated or discovered. Wrong conjectures were also

included because the process of formulating conjectures was also important, even if the

conjectures turned out to be false. If a student discovered more than one pattern, then the


the patterns described in the scoring rubric were just exemplars and not all of them were discovered by the students. A stratified sample consisting of about 10% of the students’ answer scripts was given to a former teacher, who had a Master’s degree in mathematics education and experience in coding qualitative data, to grade. The inter-rater reliability of

88% was rather high, partly because these rubrics were not so complicated and partly because

enough exemplars were given in the descriptors.

Table 2
A Scoring Rubric to Describe What Students Investigated for Task 1

1W4 (0 students, 0%): Students investigated or discovered very complicated patterns, which might be wrong. For example:
- The last two digits of the powers of 9 repeat themselves after 10 times.
- The last two digits of the powers of other single-digit numbers also repeat themselves, but after different numbers of times.

1W3 (1 student, 3%): Students investigated or discovered moderately complicated patterns, which might be wrong. For example:
- The last digit of the powers of other single-digit numbers also repeats itself, but after different numbers of times.
- When multiplying two powers of 9, the indices are added together to give the index of the resulting power of 9 (law of indices).

1W2 (8 students, 28%): Students investigated or discovered non-trivial but simple patterns, which might be wrong. For example:
- Powers of 9 are odd.
- The last digit of the powers of 9 repeats itself after 2 times.
- Powers of 9 are divisible by factors of 9.
- The sum of all the digits of powers of 9 is divisible by 9.
- Odd powers of 9 contain at least one digit 9 and even powers of 9 contain no digit 9 (the latter is false since 9^12 contains one digit 9).
- 9^n has n digits (which is false since 9^22 has 21 digits).

1W1 (14 students, 48%): Students did not know what to investigate but did something. For example:
- Students just wrote superficial or irrelevant things and did not discover anything.
- Students investigated or discovered trivial patterns such as:
  - Powers of 9 are divisible by 9 or are multiples of 9.
  - When a power of 9 is divided by 9, the result is the preceding power of 9.
  - When the index of a power of 9 is increased by 1, the new number is the same as multiplying the original power of 9 by 9.

1W0 (6 students, 21%): Students did not write anything.


Findings

Before analysing the students’ answers, it was worthwhile to describe what happened in the classroom just after the start of the test. Within five minutes, five students raised their

hand and asked me what they were supposed to do for the first investigative task. I told them

to read the sample questions but they replied that they did not understand these questions.

Although I would very much have liked to explain the task requirement further, I refrained from

doing so, partly because the purpose of this test instrument was to see if the students knew

how and what to investigate when given this kind of open investigative tasks, and partly because I would not be the one administering the test and answering students’ questions if the test was given to a large sample of students. I will now analyse how and what these students

investigated.

Mathematical Investigative Task 1: Powers of a Number

Table 1 in the previous section shows the number and percentage of students who

achieved each level for Task 1. Six students wrote nothing (Level 1H0), probably because

they did not understand what it meant to find any patterns. The workings of another nine

students showed that they did not know how to investigate. For example, four of them wrote

superficial stuff such as ‘the pattern is 9^1, 9^2, 9^3, 9^4, 9^5 which will continue’ or ‘9^6 = 9 x 9 x 9 x 9 x 9 x 9’ without evaluating the powers; five of them did find the numerical values of

some powers of 9, but two of them just stopped there while the other three said that they could not find any patterns, not even simple patterns such as ‘all the powers of 9 are odd’ or ‘the last digit alternates between 1 and 9’. In total, 15 out of the 29 students (or 52%) did not know how to investigate by examining empirical data (Levels 1H0 to 1H1). The remaining 14

students knew how to investigate, but there were no attempts to justify why the patterns

worked (Level 1H2). No students attained Levels 1H3 or 1H4.

Table 2 in the previous section shows that six students wrote nothing (Level 1W0),

while 14 students wrote something but they did not know what to investigate, e.g. six of them observed trivial patterns such as ‘powers of 9 are divisible by 9’ or ‘when a power of 9 is divided by 9, the result is the preceding power of 9’. In total, 20 out of the 29 students (or

69%) did not know what to investigate (Levels 1W0 to 1W1).

The remaining nine students were able to find some non-trivial patterns although most


(Level 1W2). It was surprising that only four students discovered that ‘the last digit alternates between 1 and 9’ since this was one of the more obvious patterns. Two of these four students also examined the last digit of powers of other numbers: one of them investigated powers of 6

and powers of 2, and concluded that powers of different numbers had their own patterns but did not identify the patterns; the other discovered that ‘the last digit of powers of 4 is always 4’, but after evaluating the first few powers of 5, he concluded that ‘the pattern for powers of 9 does not apply to powers of 4 and powers of 5’. It seems that this student interpreted the suggested problem at the end of the task statement, “Do these patterns occur for powers of other numbers?” to mean whether the exact pattern for the last digit of powers of 9 applies to powers of other numbers. He was unable to see beyond this specific pattern to the general

pattern that the last digit of powers of any number repeats itself. These two students were

classified under Level 1W2 instead of Level 1W3 because the latter level required the

students to be proficient in investigating the last digit of powers of other numbers but it was

doubtful that these two students knew what they were doing. None of the students

investigated the patterns for the last two digits (Level 1W4). Three students also discovered that ‘the sum of the digits of powers of 9 is divisible by 9’, which was rather unexpected because such a pattern would not have existed if powers of 2, 4, 7 or 8 (with a period of 4 for

the repeating pattern) were used in Task 1 instead. One of the three students shed some light

as to why he investigated the sum of the digits:

Student: As I have read a book on Maths, I remembered that the sum of the digits in each number adds up to a multiple of 9.

Two of the above three students who discovered about the sum of the digits of powers

of 9 also investigated the sum of the digits of powers of 2 and powers of 3, and concluded

that the sum of the digits of powers of 3 is divisible by 3. The remaining student investigated

powers of 4, but stopped after evaluating 4^2 and 4^3. Therefore, only a total of five students

(or 17%) investigated powers of other numbers, including the previous two students who

examined the last digit of powers of other numbers.
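
The two most commonly reported patterns can be checked quickly from the first few powers of 9; the following worked illustration (laid out here for reference, not taken from the students' scripts) shows both at once:

```latex
% First five powers of 9, with their last digits and digit sums
\[
\begin{array}{c|c|c|c}
n & 9^n & \text{last digit} & \text{digit sum} \\ \hline
1 & 9     & 9 & 9  \\
2 & 81    & 1 & 9  \\
3 & 729   & 9 & 18 \\
4 & 6561  & 1 & 18 \\
5 & 59049 & 9 & 27
\end{array}
\]
% The last digit alternates between 9 and 1, and every digit sum is a multiple of 9.
```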

There were also two wrong but interesting conjectures made by two other students. One of them concluded that ‘odd powers of 9 contain at least one digit 9 but even powers of 9 do not contain any digit 9’. However, he based his finding on the first 10 powers of 9, but the

next even power of 9, 9^12, contains a digit 9. Another student believed that ‘9^n has n digits’.


did not realise that they could not generalise by looking at only a few cases without any

justification.

Only one student discovered a moderately complicated pattern (Level 1W3): a law of

indices which the class had not studied before. The working of the student also suggests that

he had not learnt this law from elsewhere because he wrote the following to describe what he

was doing.

Student: Trying to multiply the powers to see if they make out another power.

In the end, the student discovered that 9^m x 9^n = 9^(m+n). This finding was rather

similar to a research study conducted by Lampert (1990). During a series of lessons, a class of

fifth-grade students were investigating squares when they discovered some patterns in the last

digit of the squares. Lampert then made use of the students’ discoveries to extend the problem to “What is the last digit of 5^4? 6^4? 7^4?” and later to “What is the last digit of 7^5? 7^6? 7^7? 7^8?” The students were unsure of how to obtain 7^8 from 7^4. This eventually led to the discovery

that the indices should be added together when multiplying powers with the same base.
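
A minimal worked instance of this law of indices, using small powers of 9 (and of 7, for Lampert's question) purely for illustration:

```latex
% Multiplying two powers of the same base adds the indices:
\[
9^2 \times 9^3 = 81 \times 729 = 59049 = 9^5 = 9^{2+3}.
\]
% The question of obtaining 7^8 from 7^4 follows the same rule:
\[
7^8 = 7^4 \times 7^4 = 7^{4+4}.
\]
```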

Mathematical Investigative Task 2: Basketball Tournament

This task had a guiding problem in part (a) and students were free to investigate

anything in part (b). Due to space constraint, it is not possible to provide the two scoring

rubrics for the task, so the main findings will be described instead. For part (a), 20 out of 29 students (or 69%) obtained the correct answer by using 19 + 18 + 17 + … + 1 = 190. This was not surprising since this was a common problem for primary school students, and 22 of these

students had indicated in the post-test survey that they had seen or solved this part-question or

a similar question. Of the remaining nine students, five made some careless mistakes while

four got the concept wrong by doing direct multiplication such as 20 x 19 = 380 matches.
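
For reference, the arithmetic behind the correct answer can be written out, together with the generalisation the task invites; the general formula is not stated in the paper and is added here only as the standard pair-counting result:

```latex
% Part (a): 20 teams, every pair of teams plays exactly once
\[
19 + 18 + 17 + \dots + 1 = \frac{19 \times 20}{2} = 190 \text{ matches}.
\]
% Generalisation to n teams: each team plays the other n-1 teams and each match
% is counted twice, giving
\[
\frac{n(n-1)}{2} \text{ matches, e.g. } n = 20 \Rightarrow \frac{20 \times 19}{2} = 190.
\]
```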

For part (b), the issue was whether the students knew how and what to investigate

when no problem was given. In particular, did the students try to generalise? Three students wrote nothing for part (b) while 15 students posed irrelevant or poor questions such as “how many players score in one team?” or “how many players are there altogether if there are 11 players in each of the 20 teams?” The solution for the latter problem is just multiplication,

which is structurally different from the solution for part (a). In total, 18 out of the 29 students


problem in part (a). Of the remaining 11 students, 10 of them posed some relevant problems,

but not those that tend towards generalisation. For example, five students posed a similar

question as part (a), except that two changed the context while three changed the numbers.

For the latter, the three students stopped at the question without attempting to see if there was

any pattern that could lead to generalisation. Therefore, an overwhelming 28 out of the 29

students (or 97%) did not try to generalise. Only one student attempted some kind of

generalisation. He tried to find a pattern between the number of teams and the total number of

matches they had to play against each other. In a way, this was trying to generalise the total

number of matches for different numbers of teams, but he got part (a) wrong, meaning he just

took 20 teams x 19 matches per team = 380 matches, so his pattern in part (b) was also

incorrect.

Mathematical Investigative Task 3: Polite Numbers

This task had no scaffolding and students were expected to pose their own problems

to investigate. The structure of this task was more similar to that of Task 1 because both tasks

involved looking for any patterns. I will first look at how the students investigated. Four

students wrote nothing and six students wrote irrelevant or superficial things such as ‘if c = 2 + 3 + 4 + 5, then we can find c = 14’ or ‘what is the sum of 9 + 11 + 18?’ Nine students started with some random natural numbers and attempted to express them as sums of

consecutive natural numbers to find out whether they are polite or to discover some other

patterns. In total, 19 out of the 29 students (or 66%) did not know how to investigate by

systematically listing the sums of consecutive natural numbers to sieve out the polite numbers

or to discover other patterns. For the remaining 10 students, only three explained why their

conjectures were correct while one proved that his conjecture was wrong based on a counter

example.

I will now look at what the students investigated. Four students wrote nothing, and nine students wrote incorrect trivial results such as ‘the polite numbers kept on increasing from 9 to 11 to 18 and could not decrease’ (which was false). Another six students observed trivial patterns such as ‘all numbers cannot be polite numbers’ and ‘to find the next polite number for say, 6 = 1 + 2 + 3, just add 4, i.e. 1 + 2 + 3 + 4 = 10’ (but 10 is not the next polite number after 6). In total, 19 out of the 29 students (or 66%) did not know what to investigate.


successive polite numbers that are sums of two (or three) consecutive numbers, e.g. one of them proved that ‘the difference between successive polite numbers that are sums of two consecutive numbers is always 2, and the difference between successive polite numbers that are sums of three consecutive numbers is always 3’. The last student tried to find out ‘whether a bigger number can be expressed as a sum of consecutive numbers in more ways than a smaller number’ since one of the examples given in the task was expressed as a sum in two ways (this is called the degree of politeness), but unfortunately, he misinterpreted what polite

numbers were because his sums were sums of distinct natural numbers and not consecutive

numbers. None of the students examined more complicated patterns such as the difference

between successive polite numbers that are sums of more than three consecutive numbers, nor

attempted to find a pattern for impolite numbers.
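
A short illustration of the patterns referred to above (the lists are reconstructed here for clarity; they are not reproduced from the students' scripts):

```latex
% Polite numbers that are sums of two consecutive natural numbers (successive values differ by 2):
\[
1+2=3, \quad 2+3=5, \quad 3+4=7, \quad 4+5=9, \ \dots
\]
% Polite numbers that are sums of three consecutive natural numbers (successive values differ by 3):
\[
1+2+3=6, \quad 2+3+4=9, \quad 3+4+5=12, \ \dots
\]
% 9 = 4+5 = 2+3+4 appears in both lists, i.e. it can be expressed as such a sum
% in two ways (its degree of politeness is 2).
```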

Mathematical Investigative Task 4: Amazing Race

The usual phrasing of a similar closed problem-solving task for this question is to find

out which person will win the race given a scenario. But to make the closed problem-solving

task into an open investigative task, two options were given in part (a) and students were

asked to investigate the difference between the two options; and for part (b), students were to

think of some other options for the race which they could investigate.

For part (a), it was very surprising that three students wrote that the difference

between the two options was that Bernard was 10 m in front of the start line in the first option

while Ali was 10 m behind the start line in the second option, which was exactly what the

task stated! Equally surprising was one student who argued that there was no difference

between the two options since the distances of the two runners from the start line were both

10 m, i.e. it did not matter to this student that one runner was starting 10 m after the start line

while the other runner was starting 10 m before the start line! Nine other students said that the

difference between the two options was the different distances that the runners had to run, i.e.

both runners needed to run 10 m more in the second option than in the first option. In fact,

three of these nine students said that Ali would have to use up more energy to run the longer

distance of 110 m in the second option. Another student claimed that there was no difference

because Bernard had to run 10 m less than Ali in both options. Four other students did not

write anything. In total, 18 out of the 29 students (or 62%) did not realise that the crucial

difference between the two options was how they would affect who would win the race.


not possible to judge whether they knew this crucial difference or not. Of the remaining nine

students, seven reasoned wrongly that both options would end in a draw, and only two

students were able to argue correctly that the first option would end in a draw while Ali

would win in the second option. In other words, only two out of the 29 students (or 7%)

answered part (a) correctly.
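
The exact running speeds belong to the task statement in Figure 1 and are not reproduced in the text; assuming, purely for illustration, that Ali covers 100 m in the time Bernard covers 90 m (the situation consistent with the reported draw in the first option), the comparison of finishing times runs as follows:

```latex
% Illustrative assumption: Ali's speed : Bernard's speed = 100 : 90.
\[
\begin{aligned}
\text{Option 1: } & t_{\mathrm{Ali}} = \tfrac{100}{100} = 1, \quad
                    t_{\mathrm{Bernard}} = \tfrac{90}{90} = 1
                    && \Rightarrow \text{a draw,} \\
\text{Option 2: } & t_{\mathrm{Ali}} = \tfrac{110}{100} = 1.10, \quad
                    t_{\mathrm{Bernard}} = \tfrac{100}{90} \approx 1.11
                    && \Rightarrow \text{Ali finishes first.}
\end{aligned}
\]
```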

For part (b), 11 students wrote nothing. Another 11 students suggested a trivial option

to investigate, e.g. four of them wanted Bernard to run 10 m before Ali starts running, but this

was the same as Bernard starting 10 m after the start line which was the first option stated in

part (a). Another five students wanted the faster runner Ali to run on the outer lane of a

curved track with both starting at the same start line and ending at the same end line, but it

was puzzling that these five students wrote that Ali would have more distance to run and yet

they still called it a 100-m race or a 100-m curved track. Anyone who has watched athletic

meets on TV can tell you that 100-m races are always on the straight portions of the track

while the starting points for 400-m races are different for all the runners precisely because of

the curved track. In total, 22 out of the 29 students (or 76%) did not know what to investigate

for part (b). The remaining seven students were able to suggest a non-trivial option for the

race. One of them suggested that the faster runner reduce his speed by a certain amount, so that one would have to recalculate who would win the race. Another claimed it was fair that the faster

runner should start 5 m behind the start line while the slower runner should start 5 m after the

starting line. Two other students wanted the faster runner to start 10 m after the starting line,

and the slower runner to start 20 m after the start line. The last three students suggested

giving the slower runner a head start of a certain timeframe such as two or three seconds, but

one of the three students recommended a few minutes’ head start! The latter did not seem to

have a sense of how long it would take to run 100 m. In fact, two other students drew a 100-m

oval track for part (a), suggesting that they might not have any idea how long the oval tracks

in a stadium were. It seems that some of these students were not able to connect the real

world to school mathematics.

Post-test Written Survey

Seventeen students (or 59%) commented in the post-test written survey that the test

was difficult. Of these 17 students, only three of them said that the challenge was good for


negatively. However, another six students wrote very negative comments about the test, three

of which were reprinted below.

Student 1: I feel it is pretty useless and a waste of my time.

Student 2: I think I am going to fail. I think that I am going to be scolded as I don’t I think that [the test] was good as it help [sic] me to find out that I’m dumb.

Student 3: Boring, Very Uninteresting, No motive to complete, time wastes [sic] Can be improve [sic] a lot.

Nevertheless, there were six positive comments, including the three who said that the difficulty was good for them. Only three of these comments were reproduced below.

Student 4: This test was fun and I enjoyed myself. It was fun to investigate and I would not mind another test.

Student 5: It is very challenging. It requires your brain and your knowledge and experience. Like, have you come across this sum before? I think I would like to practice some questions like this. It helps us to exercise our brains.

Student 6: It was interesting. But there should have been more time given as the questions were quite challenging.

Discussion

The data analysis shows that most of the high-ability students in the present study did

not know how and/or what to investigate. The first problem is the failure to understand what

it means to investigate. Even when sample problems were given in Task 1, many of them still

did not know what to do. This may be because the suggested problem about finding as many

patterns as possible about powers of 9 was too general for the students. In Singapore, students

are familiar with solving specific problems by looking for patterns, e.g. Task 2(a), but looking

for any patterns without having a problem to solve had confounded many students in this

study. It seems that the absence of a specific problem left the students unable to understand the task requirement: 52% of the students did not know how to investigate and 69% did not know what to investigate for Task 1. The students

also fared badly in the other three tasks: 62% were not able to pose relevant or good problems

to investigate in Task 2(b) despite the guiding problem in part (a); 66% did not know how to

investigate by systematic listing in Task 3; and 76% did not know what to investigate in Task

4(b).

The second problem is the inability of most of the students to pose their own problems

to investigate. Students in Singapore are familiar with solving problems but they are seldom


that ‘naturally follows’ from the given task, and he found that high-ability students were able to pose it directly but low-ability students were not able to do so unless they were given hints.

However, most of the students in the present study were not able to do so. For example, 86%

did not try to discover a pattern in the last digit of powers of 9 in Task 1; 97% did not attempt

to find a general formula for the number of matches in Task 2; 86% did not try to find out

which numbers in Task 3 are polite or impolite; and 62% were unable to investigate the

crucial difference between the two given options in Task 4(a). There may be another reason

why some of these students did not pose non-trivial problems to investigate. Mason et al.

(1985) was concerned that some students may pose only problems that they can solve. In the

present study, a student stated this intention explicitly:

Student: I think I constructed [sic] a simple investigation that can be solved easily.

Although the other students did not write this explicitly, this may be a real issue

because in traditional classrooms, the goal is to obtain the correct answer and getting stuck is

usually considered a drawback. So there may be a tendency for some of these students to pose

trivial problems that they can solve easily.

If many of the high-ability students in the present study did not know how and what to

investigate, then average and low-ability students may encounter even greater difficulty in

such written tests on open investigation although there may be exceptions. This suggests that

the current level of proficiency in open investigation among Singapore secondary school

students is very low. Although further research is needed to substantiate this, it is not

advisable to continue using this kind of paper-and-pencil test for other students because it

may result in more students hating open mathematical investigation, judging from the many

adverse reactions from the students in the current study.

Implications for Research and Teaching

Open mathematical investigation and closed problem solving are indeed very

different. For the former, students without prior experience do not know what the task

requirement is and so they cannot even start; for the latter, although students may not know

how to solve the problems, they can at least start trying because they know the goal is to solve

the problems. For problem solving, simpler problems can be included at the beginning to give


more specific question ‘find as many patterns as possible’ in Task 1 was still too general for the high-ability students in this current study to understand the task requirement; and a

sample problem in part (a) of Task 2 also did not help these students to know what to

investigate for part (b). As a result, the more guided investigative Tasks 1 and 2 did not help

the students to understand the task requirement of the more open investigative Tasks 3 and 4.

Therefore, this poses a serious dilemma for researchers as to how they can assess students’ current level of competency in open investigation. It is unethical to give a paper-and-pencil test to a large group of students after knowing that this may cause many of them to

loathe open investigation. Even for research using a teaching experiment with a small group

of students, it will not be surprising if the students perform significantly better in the written

post-test than in the pre-test, as this is the consequence of the students not understanding the

task requirement in the pre-test, and not necessarily because the teaching experiment is

effective in developing the various thinking processes in open investigation.

Hence, it is necessary to rethink how to measure the efficacy of a teaching experiment

in open investigation. One suggestion is not to begin with a pre-test but with a familiarisation

programme so that students know what it means to investigate when given an open

investigative task. Once the students have understood the task requirement, they can then be

given a paper-and-pencil pre-test, followed by a teaching experiment with the aim of

developing in students the many different processes in open investigation. In this way,

students may not perform significantly better in the post-test if the teaching experiment is not

effective in cultivating these processes in the students; but any significant difference between

the pre-test and the post-test can then be attributed to the effectiveness of the teaching

experiment.

Teachers may know that students need to be taught how and what to investigate when

given open investigative tasks, but they may think that students can cope with questions

asking them to find as many patterns as possible. The present study has revealed that students

may have great difficulty finding any patterns when there is no specific problem to solve,

unlike finding patterns to solve a specific problem. So there is still a need for teachers to


Conclusion

We learn not only from the success of a research study. In fact, we can learn valuable

lessons from the failure of the paper-and-pencil test on open investigation in the present

study. For example, the current study has shed some light on the inadequacy of a pre-test on

open investigation that is administered before a teaching experiment, or a stand-alone written

test on open investigation as a research methodology, although a similar test instrument for

closed problem solving can still work quite well. Therefore, there is a need to rethink whether we should measure students’ current or initial level of competency in open investigation before any teaching experiment.

The present study has also revealed the low level of proficiency in open investigation

among secondary school students in Singapore. Since there are valuable processes that can

only be developed more fully in open investigation, such as problem posing and searching for

any patterns to understand the underlying mathematical structures, Singapore students

should be exposed to this kind of investigation, but local teachers may not be ready to teach

this since most of them do not understand what open investigation is (Yeo, 2008). Moreover,

further research needs to be done to find out more about the nature and development of

processes in open investigation so as to inform teachers on how to plan and design suitable

open investigative tasks and teaching strategies to develop these processes in their students.

References

Australian Education Council. (1991). A national statement on mathematics for Australian schools. Melbourne: Curriculum Corporation.

Bailey, J. (2007). Mathematical investigations: A primary teacher educator’s narrative

journey of professional awareness. In J. Watson & K. Beswick (Eds.), Proceedings of the 30th annual conference of the Mathematics Education Research Group of Australasia (MERGA): Mathematics: Essential research, essential practice (Vol. 1, pp. 103-112). Adelaide, South Australia: MERGA.

Cai, J., & Cifarelli, V. (2005). Exploring mathematical exploration: How two college students formulated and solve their own mathematical problems. Focus on Learning Problems in Mathematics, 27(3), 43-72.

Christiansen, B., & Walther, G. (1986). Task and activity. In B. Christiansen, A. G. Howson, & M. Otte (Eds.), Perspectives on mathematics education: Papers submitted by members of the Bacomet Group (pp. 243-307). Dordrecht, The Netherlands: Reidel.

Cockcroft, W. H. (1982). Mathematics counts: Report of the committee of inquiry into the teaching of mathematics in schools under the chairmanship of Dr W H Cockcroft. London: Her Majesty’s Stationery Office (HMSO).


Evans, J. (1987). Investigations: The state of the art. Mathematics in School, 16(1), 27-30.

Frobisher, L. (1994). Problems, investigations and an investigative approach. In A. Orton & G. Wain (Eds.), Issues in teaching mathematics (pp. 150-173). London: Cassell.

HMI. (1985). Mathematics from 5 to 16. London: Her Majesty’s Stationery Office (HMSO).

Jaworski, B. (1994). Investigating mathematics teaching: A constructivist enquiry. London: Falmer Press.

Krutetskii, V. A. (1976). The psychology of mathematical abilities in school children. In J. Kilpatrick & I. Wirszup (Series Eds.), Soviet studies in the psychology of learning and teaching mathematics. Chicago: University of Chicago Press.

Lampert, M. (1990). When the problem is not the question and the solution is not the answer: Mathematical knowing and teaching. American Educational Research Journal, 27, 29-63.

Lee, M., & Miller, M. (1997). Real-life math investigations: 30 activities that help students apply mathematical thinking to real-life situations. New York: Scholastic Professional Books.

Mason, J., Burton, L., & Stacey, K. (1985). Thinking mathematically (Rev. ed.). Wokingham, England: Addison-Wesley.

Ministry of Education of Singapore. (2000). Mathematics syllabus: Lower secondary. Singapore: Curriculum Planning and Development Division.

Ng, H. C. (2003). Benefits of using investigative tasks in the primary classroom. Unpublished master’s thesis, National Institute of Education, Nanyang Technological University, Singapore.

Orton, A., & Frobisher, L. (1996). Insights into teaching mathematics. London: Cassell.

Pirie, S. (1987). Mathematical investigations in your classroom: A guide for teachers. Basingstoke, UK: Macmillan.

Pólya, G. (1957). How to solve it: A new aspect of mathematical method (2nd ed.). Princeton, NJ: Princeton University Press.

Tanner, H. (1989). Introducing investigations. Mathematics Teaching, 127, 20-23.

Yeo, J. B. W. (2008). Pre-service teachers engaging in mathematical investigation. In Electronic Proceedings of the Asia-Pacific Educational Research Association (APERA) Conference 2008: Educational Research for Innovation & Quality in Education: Policy & Pedagogical Engagements Across Contexts. Singapore: APERA and Educational Research Association of Singapore (ERAS).

Engagement, Technology, and Mathematics: Students’ Perceptions

Catherine Attard

Abstract

Many schools are investing significant funds in technology in the hope that it will address declining engagement with mathematics and improve students’ learning outcomes. This paper presents data from two studies and explores students’ perceptions of technology use during mathematics lessons and their resulting engagement. Results from the two studies suggest some relationship between the ways in which technologies are incorporated into mathematics lessons, and teacher experience and expertise.

Keywords: engagement, iPads, laptops, primary school mathematics, student perceptions

Student engagement with mathematics continues to be a concern in Australia (Attard,

2014; Commonwealth of Australia, 2008; Sullivan & McDonough, 2007) and internationally

(Boaler, 2009; Douglas Willms, Friesen, & Milton, 2009). Decreased engagement could

potentially have wide-reaching consequences that affect our communities beyond the obvious

need to fill occupations that require the use of mathematics. Students who choose to

discontinue their study of mathematics due to disengagement potentially limit their capacity

to understand and interpret life experiences through a mathematical perspective (Sullivan,

Mousley, & Zevenbergen, 2005).

One of the most significant influences on student engagement is the teacher and the

pedagogical practices employed (Attard, 2012; Hayes, Mills, Christie, & Lingard, 2006).

Amongst the pedagogical practices commonly seen in contemporary mathematics classrooms

is the use of a range of technologies, from desktop computers to mobile devices such as

iPads. Although standard desktop and laptop computers have been common in schools for

several years, the increased popularity of mobile devices has resulted in schools within

Australia and internationally investing significant funds in mobile technologies in the hope

that their incorporation into learning and teaching practices will better address the needs of

contemporary learners and result in deeper engagement and improved outcomes.

Although the use of technologies in mathematics classrooms is explicitly promoted

within curriculum documents in Australia (Australian Curriculum and Reporting Authority,

2012; Board of Studies New South Wales, 2012), there is little evidence that a result of


separate studies to explore students’ perceptions of technology use and its influence on their

engagement in the mathematics classroom. A review of current literature pertaining to technology, mathematics, and engagement is now provided.

Mathematics and Technology

Emerging technologies continue to change every aspect of home life and work: the

way we communicate, calculate, analyse, shop, make presentations and socialise. “Computer

technologies are changing the very ways we think and make sense of the world” (Collins &

Halverson, 2009, p. 11). Many teachers expect the incorporation of computer technology into

mathematics teaching and learning to motivate and engage students (Pierce & Ball, 2009).

However, research into their use in mathematics classrooms has revealed some issues that

could negatively impact on student engagement as a result of how they integrate with existing

pedagogies. Current curriculum documents express the expectation that such technologies are

integrated into lessons, and although there is a significant body of research exploring

teachers’ perspectives regarding computers and mathematics (Goos & Bennison, 2008;

Hoyles et al., 2010; Thomas & Chinnappan, 2008), there has been little research on their

impact on student learning and engagement.

Some teachers who regularly incorporate traditional ICTs such as desktop or laptop

computers into their lessons tend to use them in a way that resonates with a didactical,

teacher-centred approach.

The traditional methods of teaching mathematics seem to have such a strong hold that the computer’s position as an agent of change is relatively weak. Instead, it seems that the computer is assimilated into existing math teaching traditions as regards both content and form (Samuelsson, 2007, p. 11).

Other teachers who have successfully implemented computer technology were found

to display a wide variety of teaching styles that included a willingness to become a learner

alongside students, and a willingness to lessen teacher control in the classroom (Thomas,

Tyrrell, & Bullock, 1996). Factors that inhibit teachers’ use of computers include issues of

access, lack of technical support, and the time it takes for students to learn to use the

equipment and software programs, taking away from time learning mathematics (Forgasz,

2006; Hoyles et al., 2010).

The introduction of mobile technologies such as iPads has provided an opportunity to


to use, provide ubiquitous access to ICT and have a broad range of affordances. There is also

a belief that the devices have the potential to address a disparity between the way young

people use digital media outside school, and the ways in which it is used within the classroom

(Henderson, 2011; Selwyn, Potter, & Cranmer, 2009). This disparity is described by Selwyn

et al. as a ‘digital disconnect’ between schools and learners. Distinct affordances offered by

the iPads when compared to traditional ICTs include their affordability and ubiquitous access,

mobility, ease of use, opportunities for more flexible learning spaces and more opportunities

for students to author their own work rather than simply consuming the work of others

(Ireland & Woollerton, 2010; Kiger, Herro, & Prunty, 2012; Melhuish & Fallon, 2010).

Although they are easier to operate, it should not be assumed that teachers will use

iPads or other mobile technologies in “pedagogically innovative and appropriate ways”

(Herrington, Mantei, Herrington, Olney, & Ferry, 2008, p. 419). If they are provided with

little or no professional development and limited opportunities for professional dialogue,

teachers are less likely to embed new technologies within their practices (Bennison & Goos,

2010).

Arguably, the use of computer technology in the mathematics classroom has the potential to improve students’ engagement when used appropriately. This

literature review now turns to the concept of engagement and provides a clear definition that

will be applied to the data informing this paper.

Engagement and Mathematics

The term engagement is used often by educators when describing students’ levels of

involvement with teaching and learning, and many definitions of engagement can be found in

the literature. In this paper, engagement is viewed as a multi-faceted construct that operates at

three levels: cognitive, affective and operative (Fair Go Team NSW Department of Education

and Training, 2006; Fredricks, Blumenfeld, & Paris, 2004). On a general level, cognitive

engagement relates to the idea of investment, recognition of the value of learning and a

willingness to go beyond the minimum requirements. Affective engagement involves

students’ reactions to school, teachers, peers and academics, influencing their willingness to become involved in schoolwork. Finally, operative engagement encompasses the idea of active participation and involvement in academic and social activities, and is considered vital for the achievement of positive academic outcomes.

Translated into a mathematics classroom context, engagement occurs when all three

facets come together. This occurs when students are procedurally engaged during

mathematics lessons and beyond, they enjoy learning and doing mathematics, and view the

learning and doing of mathematics as a valuable, worthwhile task, useful within and beyond

the classroom (Attard, 2014). In her Framework for Engagement with Mathematics (FEM),

Attard includes the use of technology as an important aspect of teachers’ pedagogical

repertoires, where “technology is embedded and used to enhance mathematical understanding

through a student-centred approach to learning” (p. 2).

For the purposes of this paper, the concept of engagement will encompass operative,

cognitive and affective engagement, leading to students valuing and enjoying school

mathematics and seeing connections between school mathematics and their own lives. A brief

overview of the methodologies relating to the two studies informing this paper is now

provided.

Methodology

In order to explore students’ perceptions of technology use and their resulting

engagement, data is drawn from two separate qualitative research studies. The studies

provided access to student focus group, student and teacher interview and classroom

observation data. The first study, referred to as Study A, is a longitudinal study conducted

over a period of three years. Study A investigated the influences on student engagement with

mathematics during the middle years (Years 5 to 8 in Australia). Data was gathered from a

group of 20 students, beginning in their final year at a Catholic primary

school located in Western Sydney, and concluding in their second year of high school at a

Catholic secondary college also in Western Sydney (Attard, 2013). For this paper, the

students’ experiences in a one-to-one laptop program during their first year of secondary school will be explored.

The second study, which I will refer to as Study B, is a qualitative study exploring the

experiences of one teacher and his Year 3 class group using iPads in mathematics lessons

during a six-month iPad trial at a government primary school in Sydney (Attard & Curry,

2012). At the time, the teacher was in his first year of teaching and was the school’s

technology coordinator. The teacher had received no professional development to assist in the integration of the iPads into his teaching.

Findings

Laptops as a central resource: Study A

In Study A, the majority of mathematics lessons were based upon either a traditional textbook accessed via CD-ROM on the students’ laptops, or a commercial, interactive mathematics website that the school subscribed to. This was vastly different to the range of

resources the participants had been exposed to in the primary school. Although their initial

reaction to the constant use of computers was one of excitement, the participants soon

reported missing the interactions with each other and with concrete materials, with Kristie’s

comment reflecting others made by the group: “I think I liked it better when we could hands-on stuff ‘cause I – I think I learn better when we do hands-on stuff ‘cause you’re actually doing it instead of learning the theory.”

The utilisation of a single resource such as the website or textbook for the length of a

100-minute lesson also made it difficult for the students to stay on task and interested in the

lesson, with some students admitting to becoming restless and bored. Andrew’s comment

reflects this sentiment: “after about maybe an hour or so, maybe 70 minutes, me and a couple

of my friends get really bored and then we start mucking around and not doing our work

cause it’s – we just can’t stay interested in it. So we muck around and probably not do our

work.”

The use of computer technology as the central resource for teaching and learning

caused initial excitement when the students began Year 7, with Elizabeth commenting: “A good way of how we’re learning now is we’re getting in touch with technology and we’ll be able to do more stuff in the future.” However, as a result of the way in which computers were

used, the students soon began to experience dissatisfaction with them, finding them

distracting, which resulted in unfinished class work. The following comment from Samantha

reveals the building frustration among the participants:

When we use our Macs on, there’s also a lot of distractions ‘cause I can see people around, like the boys around me, they’re not actually doing their work, they’ve got games and the calculators on the dashboards and like the internet, and it’s like using the Macs, not just in maths but to write, I find it a lot more distracting ‘cause there’s Bluetooth and I’m very tempted, and the Internet’s just there and the teachers aren’t on your back a hundred percent there’s like a lot of gimmicks where you can just get rid of the screen straight away and go back to your work, like command-H hides that page.

The use of laptops to deliver content by either a textbook or online learning site as the central resource for each lesson may reflect limited teacher knowledge of effective pedagogies for the use of technology in the mathematics classroom (Thomas & Chinnappan, 2008).

Although the availability of computer technology provided the opportunity for teachers to

deliver a new and relevant way of teaching and learning (Collins & Halverson, 2009), the computers instead appeared to be used as replacements for teachers. Alison picked up on this emerging idea

among the students:

It’s probably not the best way of learning because last year at least if you missed the day that they taught you, you still had groups so your group could tell you what was happening. Where now, we’ve got the computers and it’s alright because there is, um, left side of the screen does give you examples and stuff, but if you don’t understand it, it’s really hard to understand.

It is reasonable to suggest that the website and textbook were not necessarily bad

resources. However, the data showed that it was the way they were used in isolation that caused the students to begin disengaging from mathematics. The reliance on the

computers led to a lack of interaction among students and between teachers and students. By

the end of Term 1 many of the participants were displaying negative attitudes with regard to

the use of computers in the mathematics classroom.

The limited range of teaching and learning resources implemented during Year 7

appeared to have had a negative impact on the students’ attitudes towards and engagement with mathematics. It seemed as though the ‘fun’ element of mathematics lessons the students

had discussed in Year 6 no longer existed.

When interviewed, one of the mathematics teachers who used the laptops as described

above discussed his reasons for using the commercial website and text book as his major

resources:

We have (the commercial website), which I’m a fan of, we have the text book – I like text books, I’m not just limited to that but they would be my typical lesson. I like the idea where there’s an initiation phase, an introduction phase where the kids are actually shown the right way a concept is to be approached, or, even better, the application of a process. There’s widgets and things on (the commercial website) which I find excellent. There’s also worksheets and all sorts of things that go with that. That’s been a pretty good resource.

It is interesting that there is such a broad difference between the teacher’s attitude

towards those particular resources and the attitudes of the students. This difference in attitude

could be attributed to the generational technology literacy gap described by Collins and

Halverson (2009) where students’ experiences of technology may be significantly more

complex than the experiences of the teacher, therefore causing the disparity in attitudes.
