
ACEC2014 - DEVELOPING QUICKSMART ONLINE TO ENGAGE LEARNERS

Helen Doyle, Stephanie Belson, Lorraine Taber & Chris Reading University of New England, Australia

Abstract

Literacy and numeracy are identified as necessary skills for employment. QuickSmart Online (QSO) was developed with the aim of closing the gap in numeracy skills to enable the unemployed to break the cycle of long-term unemployment. QSO focuses on the learner developing fast and accurate basic skills, which in turn develops their neural pathways, allowing the learner’s working memory to be freed up to enable further learning. This paper focuses on researching facilitator stories based on feedback from learners and teachers, and on observations of QSO usage. These stories reported on the learner experience during the initial development of QSO. The program was informally trialled for a period of twelve months with learners, ranging in age from eight to the late fifties, from a variety of learning institutions. There was some evidence of engagement with the program. The five main aspects of the program that impacted on this engagement are described: learner confidence, learner support, learner e-literacy, online environment style, and context of learning. Key recommendations to increase learner engagement for the next iteration of QSO are outlined.

More than seventy-five percent of employers in 2009 reported that their businesses were affected by low levels of numeracy and literacy skills amongst their workforce (Ai Group, 2010), whilst the Industry Skills Councils (ISC) reported that more than half of working-age Australians have Language, Literacy and Numeracy (LLN) problems (Industry Skills Councils, 2011). Successive Australian Governments have reported low LLN skills of Australians. The Australian Government has co-operated with The Organisation for Economic Co-operation and Development (OECD, 2013), resulting in an international report on the LLN skills of adults, which described such skills as necessary for every individual to participate in society. Capraro, Capraro, and Jones (2014) also stressed that numeracy is an important skill for full participation in the workforce. The Science, Information and Communication Technology, and Mathematics Education for Rural and Regional Australia National Research Centre (SiMERR) received an Australian Federal Government (AFG) grant in 2012 to develop and produce an online version of QuickSmart, called QuickSmart Online (QSO), targeted at adult job seekers with identified low levels of LLN skills. This grant was part of the NBN-enabled Tele-education Trial to support the Australian Government’s Digital Economy Goal for expanded online education (Hand, 2013). This paper focuses on the numeracy component of QSO. First, some background is provided about QSO, engagement with learning and the QSO software development cycle; then some results are discussed from the initial trialling of QSO. The results include the main aspects of QSO that impacted learner engagement and recommendations for improving QSO to increase engagement.

QuickSmart

QuickSmart (QS) was first developed in 2001 as a face-to-face (f-2-f) early intervention numeracy program for middle-school students, followed by an additional literacy component. QS focused on enhancing the students’ fluency in either numeracy or literacy (automaticity) through improving their working memory. Students work in pairs with an instructor for thirty minutes, three times per week, for an average of thirty weeks. Pegg, Graham, and Bellert (2005) defined automaticity as learners’ fluency and facility with basic number facts. They researched the links between working memory and the ability to recall basic number facts and found that improvements made to a person’s processing speed of basic skills free up his/her working memory capacity, which then becomes available to address more difficult mathematical tasks. This research also showed that the improvements made to a person’s working memory continued for at least twelve months following the completion of the QS intervention program.

QuickSmart Online

The QS f-2-f program was the framework used to develop QSO numeracy. A team from SiMERR was responsible for creating the content, while an emerging software development company was contracted for the technical development of the online environment. Like QS, QSO was aimed at improving a person’s automaticity in numeracy, thereby freeing up his/her working memory to allow him/her to perform more complex tasks. It is important to emphasise that QSO is not intended to be a computer game. However, QSO does align with Whitton’s (2014) game definition of providing a challenging activity and containing structure, rules, goal progression and rewards. The numeracy component of QSO commenced trialling in April 2013. The remainder of this paper focuses on this trial of QSO numeracy.

The QSO numeracy program consists of seven components: Warm-Up, Focus Facts, Flash Cards, SpeedSheets, Fast and Accurate Basic Skills (FABS), Problem Solving (PS) and a game. After enrolment into the program, the learner completes an eighty-question pre-test covering each of the four basic mathematical operations, designed to establish the learner’s entry point into QSO. Because many learners have low e-literacy skills and/or low LLN, a calibration activity to assess a learner’s keyboarding skills was included for the timed activities, i.e., Flash Cards, FABS and SpeedSheets. All seven components of QSO were designed to help the learner engage with his/her learning.
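To illustrate how such a keyboarding calibration could feed into the timed activities, the sketch below adjusts a raw response time by a learner’s keying baseline so that the score reflects recall speed rather than typing speed. This is a minimal illustration only; the names (CalibrationResult, adjusted_response_time) and the subtraction approach are assumptions, not details of the actual QSO implementation.

```python
# Hypothetical sketch: adjusting a timed-activity response for keyboarding speed.
# Names and the adjustment rule are illustrative assumptions, not taken from QSO.

from dataclasses import dataclass


@dataclass
class CalibrationResult:
    """Median seconds the learner needs simply to key in a one- or two-digit answer."""
    median_keying_time: float


def adjusted_response_time(raw_time: float, calibration: CalibrationResult) -> float:
    """Subtract the learner's keying baseline so the score reflects recall speed,
    not typing speed. Clamp at zero for safety."""
    return max(0.0, raw_time - calibration.median_keying_time)


# Example: a learner who needs 1.2 s just to key an answer and responds in 3.0 s
# is credited with a 1.8 s recall time.
print(adjusted_response_time(3.0, CalibrationResult(median_keying_time=1.2)))
```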

Engagement with Learning

To better report on aspects of the engagement of learners when using QSO, it is first necessary to clarify what is meant by engagement. Engagement, energy in action (Russell, Ainley, & Frydenberg, 2005), focuses on the connection between the learner and the activity. Care must be taken not to confuse engagement with motivation, which is about energy and direction and focuses on the reasons for behaviour (Russell et al., 2005). Engagement is more likely than motivation to be affected by learning experiences and rapport with the people involved with those experiences. Students who are motivated are not necessarily engaged. Teachers need to be able to design learning environments (f-2-f or online) that will engage students.

Three distinct types of engagement: behavioural, cognitive and emotional, as described by Fredricks, Blumenfeld, and Paris (2004), provide a useful framework for elaborating the concept of engagement.

Behavioural engagement involves: positive conduct, e.g., absence of disruptive behaviours; and involvement in learning tasks, e.g., persistence. Emotional engagement involves: affective reactions in learning situations, e.g., interest; and affective reactions to those delivering the learning, e.g., respecting the teacher. Cognitive engagement involves: psychological investment in learning, e.g., desire to go beyond the requirements; inner psychological investment, e.g., desire to learn; and self-regulation, e.g., evaluating cognition when accomplishing tasks. Although categorising these three types of engagement can assist in expanding perceptions of engagement, care needs to be taken as confusion can result from these three types of engagement being “dynamically interrelated within the individual” (Fredricks et al., 2004, p. 61). Such an expanded view of engagement, with three types, provided a suitable framework for considering how learners were engaging with QSO.

QSO Software Development Cycle

QSO needed a software development cycle to monitor and evaluate each step of the development process. The Most Significant Change (MSC) technique, developed by Davies and Dart (2005), was favoured as a framework to collect stories from researching facilitators, hereafter called facilitators, working closely with the QSO trial. MSC is primarily a monitoring technique (Willetts & Crawford, 2007) involving collecting significant change stories from the people who are most closely involved with a program; the most significant of these stories is then selected by the stakeholders. Adapting the MSC technique for implementation in a specific evaluation situation, Willetts and Crawford (2007) developed a monitoring and evaluation model, called the Monitoring and Evaluation (M&E) Data Cycle. Learning rather than accountability is the focus of the M&E Data Cycle.

The QSO Software Development Cycle (Figure 1) was created as an adaptation of the M&E Data Cycle.

The six stages in the process were:

1) Identification - selecting the data to be captured, with indicators tracked throughout the life of the project;
2) Capture - collecting data, through informal and formal processes, relevant to the chosen indicators;
3) Analysis - analysing the raw data and developing recommendations for further software development;
4) Development (Dissemination in the M&E Data Cycle) - acting on the recommendations to develop the next iteration;
5) Implementation (Utilisation in the M&E Data Cycle) - implementing the new iteration;
6) Assessment - assessing and reflecting on whether the indicators chosen in the Identification stage were the most appropriate and whether they need to be refined in subsequent iterations.

This paper focuses on the first three stages of the development cycle, coloured green in Figure 1, as undertaken in the QSO trial.
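The sketch below represents the six stages as an ordered, repeating sequence, which is one simple way to express the cycle in code. The stage names follow the list above; the enum values and the next_stage helper are illustrative assumptions, not part of the QSO software.

```python
# Illustrative only: the QSO Software Development Cycle as an ordered enum,
# with the corresponding M&E Data Cycle stage noted where the name differs.
from enum import Enum


class CycleStage(Enum):
    IDENTIFICATION = "select data and indicators to track"
    CAPTURE = "collect data relevant to the indicators"
    ANALYSIS = "analyse raw data and draft recommendations"
    DEVELOPMENT = "act on recommendations (Dissemination in M&E)"
    IMPLEMENTATION = "roll out the new iteration (Utilisation in M&E)"
    ASSESSMENT = "reflect on whether the indicators need refining"


def next_stage(stage: CycleStage) -> CycleStage:
    """The cycle repeats: Assessment feeds back into Identification."""
    stages = list(CycleStage)
    return stages[(stages.index(stage) + 1) % len(stages)]


print(next_stage(CycleStage.ASSESSMENT))  # -> CycleStage.IDENTIFICATION
```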

Method

The QSO trial occurred in 2013, with three facilitators who worked with 40 early-school-leavers and adult learners and 44 school-aged learners. The early-school-leavers participated in a youth-off-the-streets program and the adult learners in a government education program. Both programs required the study of basic skills because these learners had been identified as having skills too low for satisfactory employment. The school-aged learners struggled with mathematical skills, but were not necessarily the lowest achievers in their respective cohorts.

Unlike QS, where instructors work with pairs of students, QSO was designed for the learner to use independently. For the trial, both teachers/teacher aides and facilitators were present during each session. The teachers/teacher aides were there to learn how to support the use of QSO in their classrooms. The three facilitators were there to assist with overcoming any technical issues, and to observe the learners, which informed the monitoring and evaluation of the program. Each of the facilitators attended at least one session weekly.

The research presented in this paper aimed to monitor and evaluate the trial of QSO to determine improvements needed to increase student engagement. The first three stages of the QSO Software Development Cycle (see Figure 1), as followed in the QSO trial, are now described. The first stage, Identification, involved choosing the data to be collected. In accordance with the MSC approach, the most significant change stories provided by the three facilitators were chosen. These facilitators had the opportunity to observe the learners’ use of, and reaction to, QSO and also to have informal conversations with the learners and the teachers/teacher aides. The key indicator of interest was student engagement. The second stage, Capture, involved the three facilitators writing their individual stories recording their observations of student engagement with QSO, including identifying significant changes that occurred. The third stage, Analysis, involved the facilitators collaborating with an independent researcher to combine the three stories to synthesise the most significant outcomes and impacts about learner engagement. The indicators of engagement, as evidenced in the combined story, are reported across all three types (behavioural, emotional and cognitive) to demonstrate the breadth of engagement.

The Story

During the Analysis phase, the information shared by the three facilitators about their observations during the trial of QSO varied considerably, justifying the need to consider the stories from all three facilitators, rather than just using feedback from one. Two important common themes were that QSO gave the learner the opportunity: to improve basic number skills, thus developing automaticity; and to practise those skills with contextually appropriate problems. As a consequence of the improved skills, learner confidence increased both within and beyond the learning environment.

However, consideration of the diversity within the three stories showed that the learner experience varied according to three key factors: perceived employment opportunities, learner age, and teacher engagement. First, employment opportunities, as perceived by the learners, differed between geographic locations, with some learners believing that there was no point in engaging with the program when there were no job opportunities relevant to their skill levels. Second, the learners varied in age from eight to late fifties. Typically, school-aged learners could overcome technical issues and engage from the outset, and the early-school-leavers did not engage because of recent failure within the school system, while adult learners could see the value in trying a new approach to learning LLN. Third, teacher engagement decreased during the trial, when it became apparent that QSO was not aligned to their specific curriculum requirements and that they were unable to access learner results to map performance outcomes. Learners were more engaged when teachers were engaged.

Analysis of the stories showed that some learners were more engaged than others. Evidence of engagement spread across all three types of engagement: behavioural - seeking assistance, persisting with difficult tasks, completing work above minimum requirements, and assisting peers; emotional - liking the facilitators, reacting positively to progress, and reacting positively to constructive comments; and cognitive - using feedback, interpreting progress graphs, recognising when a fact is “learnt”, linking progress to non-QSO life events, and acknowledging the value in learning.

Of most importance to the QSO team was identifying what had the greatest impact on learner engagement and what recommendations could be made to inform the next iteration of QSO in the Development stage of the QSO Software Development Cycle. The five main aspects of QSO found to impact on learner engagement were: learner confidence; learner support; learner e-literacy; online environment style; and context of learning. These are elaborated below.

Learner Confidence

The trial commenced with learners displaying varying levels of confidence. There were many issues that affected their confidence, most importantly socio-economic status, family attitude to education, fear of mathematics, and prior learning experiences. Despite lacking confidence, some learners were excited to be part of the trial. Generally, the school-aged learners lacked confidence in their mathematical abilities; however, they were confident using the online environment. Early-school-leaver confidence was affected by previous failures at school. Being labelled an early school leaver defines a person as a failure, as not achieving success according to current societal norms (Schwab, 2012). The confidence of many adult learners was affected by the fact that they had previously achieved recognition for attending and completing courses, yet were aware that they still lacked the basic skills.

Generally, adult learners were confident that QSO provided an alternate learning environment beyond what they had previously experienced. They willingly persevered and continued to engage with the program even when they experienced little academic progression due to technical issues.

Both starting level and automaticity were important influences on learner confidence. QS was designed to start a lesson with Focus Facts, where the learner starts with facts already known and then moves on to the unknown (Pegg et al., 2005). For the trial, QSO was designed so that every learner started from the easiest Focus Fact (“plus 2”). For some this meant working on facts already known. However, being able to answer the questions correctly helped the learner to develop confidence before going on to the questions at a higher level for a new Focus Fact. Similarly, the rate at which automaticity was achieved was linked to developing confidence. On completion of the trial, many learners demonstrated greater confidence, with most able to articulate that they had noticed improvement in their confidence, both within and beyond the online learning environment.

Recommendation 1: QSO calibration be adjusted to start a learner practising new skills at one level lower than the level at which he/she tested successfully, and to stop the learner from spending any more than six sessions on each Focus Fact.
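A minimal sketch of the placement and progression rule in Recommendation 1 is given below, assuming a simple ordered list of Focus Facts. The names FOCUS_FACTS, MAX_SESSIONS_PER_FACT, starting_fact and should_advance are hypothetical and not drawn from QSO itself.

```python
# Hypothetical placement/progression rule for Recommendation 1; the ordering of
# Focus Facts is simplified for illustration.

FOCUS_FACTS = ["plus 2", "plus 3", "plus 4", "plus 5"]  # illustrative ordering
MAX_SESSIONS_PER_FACT = 6


def starting_fact(highest_level_passed: int) -> str:
    """Start one level below the highest Focus Fact passed in the pre-test."""
    index = min(len(FOCUS_FACTS) - 1, max(0, highest_level_passed - 1))
    return FOCUS_FACTS[index]


def should_advance(sessions_on_fact: int, automaticity_reached: bool) -> bool:
    """Move to the next Focus Fact once automaticity is reached, or after six sessions."""
    return automaticity_reached or sessions_on_fact >= MAX_SESSIONS_PER_FACT


print(starting_fact(highest_level_passed=3))      # -> "plus 4"
print(should_advance(sessions_on_fact=6, automaticity_reached=False))  # -> True
```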

Learner Support

There were four types of learner support involved: from facilitators; from progress feedback within the online environment; from engaged teachers; and from peers. Initially, increased engagement with QSO occurred when the learner had a facilitator encouraging him/her to get started and/or continue. The facilitator continually encouraged the learner to attempt the questions. The facilitators realised that QSO failed to replicate, within the online environment, what the instructor does in the f-2-f QS.

There are two types of progress feedback within the online environment: results, which are graphed in a learner portfolio, and incorrect answers, which are displayed at the end of each activity. When the learner engaged with his/her portfolio, there was a greater understanding of results and of what was necessary to achieve automaticity. Support from the teacher is important to a learner’s success in QSO. Teachers were more likely to support learners if QSO helped the learners to achieve curriculum-based outcomes. Many teachers admitted to having poor e-literacy skills themselves and therefore were not confident using QSO without facilitators to support them. A few teachers showed an obvious lack of engagement with the learners and with QSO and, generally, when this occurred learners were not engaged. Support from peers appeared to have more benefits for the peer who provided the support than for the actual learner. In fact, some of the peers providing the support increased their own confidence to such an extent that they went on to further study.

Recommendation 2: QSO incorporate features to replicate the facilitator in the classroom by providing assistance through intelligent feedback and presenting progress graphs on completion of each activity.
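One possible reading of Recommendation 2 is sketched below: a short, facilitator-style message generated from the accuracy and speed data an activity already produces. The thresholds and the activity_feedback function are illustrative assumptions, not the QSO design.

```python
# Illustrative sketch, not the QSO implementation: an end-of-activity feedback
# message of the kind Recommendation 2 envisages.

def activity_feedback(correct: int, attempted: int, mean_time: float,
                      target_time: float) -> str:
    """Return an encouraging, facilitator-style message after a timed activity."""
    accuracy = correct / attempted if attempted else 0.0
    if accuracy < 0.8:
        return "Good effort - slow down and aim for accuracy before speed."
    if mean_time > target_time:
        return "Accurate work - now try to answer a little faster each session."
    return "Fast and accurate - this Focus Fact is close to automatic."


# Example: 18 of 20 correct, averaging 2.4 s against a 2.0 s target.
print(activity_feedback(correct=18, attempted=20, mean_time=2.4, target_time=2.0))
```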

Learner e-Literacy

Many of the learners had low levels of e-literacy, with a few having never used a computer prior to the trial. The exception was some of the school-aged learners, who had computers and internet access at home. This lack of e-literacy had not been anticipated. Enrolment in QSO required each learner to have an email address, which the majority of the early-school-leavers and adults did not have or, if they did, did not know how to access. Therefore, the facilitators were required to enrol each learner with a username and password. This included a master list given to the teacher to assist those learners who could not remember their details from session to session. The facilitators spent valuable time at the beginning teaching basic keyboarding skills to the learners, including skills as simple as the use of the Enter key and the numerical keypad on desktop computers. The learners tended to switch data entry methods throughout an activity, between the numerical keypad, the number row on a standard keyboard, and pointing to the onscreen keyboard using the mouse. This had the unintended effect of compromising the calibration data in the timed activities, and hence the ability to achieve automaticity.

Recommendation 3: QSO restrict learners to one entry method for numeric characters and incorporate the capability for bulk enrolment of learners.
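The bulk-enrolment part of Recommendation 3 could look something like the sketch below, which generates username/password pairs from a facilitator’s class list so that learners do not each need an email address. The CSV column names and the bulk_enrol function are hypothetical.

```python
# A sketch, under stated assumptions, of bulk enrolment from a class list.
# Column names (first_name, last_name) and the username scheme are illustrative.

import csv
import secrets


def bulk_enrol(class_list_csv: str):
    """Yield (username, password) pairs for every learner in a facilitator's CSV."""
    with open(class_list_csv, newline="") as f:
        for row in csv.DictReader(f):
            username = f"{row['first_name']}.{row['last_name']}".lower()
            password = secrets.token_urlsafe(6)  # short, one-off starter password
            yield username, password


# A facilitator could print the resulting list as the master copy kept by the
# teacher, as was done by hand during the trial.
```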

Online Environment Style

Three important aspects of the QSO environment style related to learner engagement: the interactivity of the screen layout, the capability for learners to have individualised programs, and the opportunity for non-judgemental anonymity. For the QSO trial, the screen was divided visually into three sections: the centre, the left side and the right side. Screen layout design was found to have less impact on the engagement of the school-aged learners than on that of the older adult learners. School-aged learners were observed to be more adaptable and confident with the screen layout design, enabling them to proceed with few difficulties. The early-school-leavers generally wanted the QSO screen design to be more interactive or game-like. Due to their generally poor e-literacy levels, the older adult learners found the screen layout design unintuitive and required more help. The adult learners found the lack of explanation of why they were doing the activities, and how they were to do them, confronting, likening QSO to just one test after another and expressing fear of being unsure what was expected next. This affected their academic progress, particularly in the timed activities, and hence their automaticity.

Many learners were more engaged because the individualised learning program nature of QSO gave them the opportunity to learn at their own pace. This was the first time they had felt like they were actually achieving on their own merit. Previously, some adult learners had “completed” courses in LLN, yet they still could not read or complete basic mathematical tasks. These particular learners gained an enormous amount of self-efficacy when they engaged willingly with QSO and achieved on their own merit.

Some adult learners articulated that the online learning environment was non-judgemental, which made it more comfortable than an f-2-f learning situation. This gave the learners the security to engage with QSO and not be embarrassed by having incorrect responses made public. It had been observed that many of these learners sat very quietly in a traditional classroom situation, not engaging and/or not comprehending the classroom instruction.

Recommendation 4: QSO be more interactive by following a logical flow through the activities and presenting components of QSO when they are needed, making the environment more app-like.
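A simple way to picture Recommendation 4 is as a guided flow that presents each component only when the previous one is complete, as sketched below. The component names are those listed earlier in the paper; the sequencing logic itself is an assumption for illustration.

```python
# Illustrative only: presenting QSO components one at a time in a fixed session
# flow, rather than as a full menu. The ordering is an assumption.

SESSION_FLOW = ["Warm-Up", "Focus Facts", "Flash Cards", "SpeedSheets",
                "FABS", "Problem Solving", "Game"]


def next_component(completed: list) -> str:
    """Show the learner only the next component; return an empty string when done."""
    for component in SESSION_FLOW:
        if component not in completed:
            return component
    return ""  # session finished


print(next_component(["Warm-Up", "Focus Facts"]))  # -> "Flash Cards"
```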