There is a tremendous need for the development of a research-based, classroom-tested set of instructional materials for teaching and learning with GIS in order to plan software systems designed to better support student learning (Bednarz, 2004). In keeping with the goals of ScienceMaps, assessment and evaluation will include quantitative and qualitative data collection and analysis to support the ongoing development and field-testing of this innovative courseware, to promote the integration of technology into the science curriculum, and to validate the efficacy of the ScienceMaps research and development methodology. Field-studies will gather feedback from the various ScienceMaps users (teachers and students) to understand how well participant school districts are integrating GIS technology into their classrooms using the resources available in the ScienceMaps portal. These field-studies will include participant surveys and interviews that assess system usability issues, such as the ease of navigation and the search features of the online portal, as well as how well students interact with the many GIS applications. This feedback will not only inform future resource development, such as new functionality to be added, but will also be used to update existing portal resources. Empirical analyses of these data will provide findings regarding the effective use of technology in the science classroom and will add to the knowledge base in this area.
Initial Results
Data were collected from 17 undergraduate students enrolled in a teacher preparation program at a private university in Southern California. Students completed the System Usability Scale (SUS), a simple, ten-item scale giving a global view of subjective assessments of usability (Brooke, 1996).

Figure 4. ScienceMaps resource development methodology (the content area focus informs science lesson development, which in turn informs GIS application development; field-studies provide feedback that informs each stage)

The SUS was developed by the Digital Equipment Corporation and is a Likert scale: a statement is presented and the respondent indicates the degree of agreement or disagreement with that statement on a 5-point scale. The selected statements address a variety of aspects of system usability, such as the need for support and training and the complexity of the system, and thus provide a high level of coverage for measuring the usability of a system (B. Hilton, 2003). Respondents are asked to record their response to each item immediately after using the system, before discussing their reactions with others.
The SUS provides a single number between 0 and 100 that is a composite measure of the overall usability of the system being studied. SUS scores are calculated as follows (see the sketch after this list):
1. For questions 1, 3, 5, 7, and 9, the score contribution is the scale position minus 1.
2. For questions 2, 4, 6, 8, and 10, the contribution is 5 minus the scale position.
3. Multiply the sum of these scores by 2.5 to obtain the overall value of System Usability.
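To make the scoring rule concrete, the following is a minimal sketch in Python; the function name and the sample responses are illustrative assumptions, not part of the ScienceMaps system.

def sus_score(responses):
    """Compute an overall SUS score (0-100) from ten item responses.

    responses: a list of ten integers, ordered Q1 through Q10, each a
    scale position on the 1-5 Likert scale.
    """
    if len(responses) != 10 or any(r not in (1, 2, 3, 4, 5) for r in responses):
        raise ValueError("SUS requires ten responses, each between 1 and 5")
    total = 0
    for i, r in enumerate(responses):
        if i % 2 == 0:        # Q1, Q3, Q5, Q7, Q9: scale position minus 1
            total += r - 1
        else:                 # Q2, Q4, Q6, Q8, Q10: 5 minus the scale position
            total += 5 - r
    return 2.5 * total        # scale the 0-40 item sum to 0-100

# A hypothetical respondent's answers to Q1 through Q10:
print(sus_score([4, 2, 4, 3, 3, 2, 4, 2, 3, 3]))   # prints 65.0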
Additionally, respondents were asked three open-ended questions regarding their impressions of ScienceMaps.
The mean overall SUS score for this group (N = 17) was 57.21. Table 1 presents the descriptive statistics for the group. These results are comparable to other studies of first-time use (American Institutes for Research, 2001; B. Hilton, 2003; Musgrave & Ryssevik, 2000): Hilton reported a mean of 69.13 (N = 23); Musgrave and Ryssevik reported means of 69 for the alpha version (N = 72) and 69.5 for the beta version (N = 53); and the American Institutes for Research reported means of 91.71 for Office XP and 51.82 for Office 2000 (N = 22).
Table 2 outlines the descriptive statistics for the individual SUS questions for the group. The means of the positive questions (1, 3, 5, 7, and 9) ranged from 1.71 to 2.47 while the means of the negative questions (2, 4, 6, 8, and 10) ranged from 1.88 to 2.82. Standard deviations for all questions ranged from 0.81 to 1.17.
Table 1. Descriptive statistics for SUS score
Mean 57.21
Std. Deviation 15.68
Range 55.00
Minimum 35.00
Maximum 90.00
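For reference, descriptive statistics like those in Tables 1 and 2 can be computed with Python's standard statistics module. The sketch below is illustrative only: the individual scores shown are hypothetical placeholders, since the chapter reports aggregate figures rather than raw responses.

import statistics

# Hypothetical individual SUS scores (the chapter reports only aggregates).
scores = [35.0, 47.5, 57.5, 60.0, 65.0, 90.0]

print(f"Mean           {statistics.mean(scores):.2f}")
print(f"Std. Deviation {statistics.stdev(scores):.2f}")   # sample standard deviation
print(f"Range          {max(scores) - min(scores):.2f}")
print(f"Minimum        {min(scores):.2f}")
print(f"Maximum        {max(scores):.2f}")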
Subjective responses indicated that users generally liked the visual representation of the maps and thought it provided good background information. Many thought the ability to enter an address, and to find specific information in relation (proximity) to that address, was both useful and informative. Respondents believed that the site addressed California science standards and would be a useful tool for instruction.
Suggestions for improvement included adding online help/training as well as making minor improvements to the user interface. These suggestions will be addressed and changes implemented in future versions of the system.
Conclusion
ScienceMaps will continue to utilize cutting-edge geospatial technologies to enhance science instruction. The findings of this effort will inform ongoing research on the effects of technology on science education. Given the focus on classroom teachers, the resulting impact on K-12 students could be substantial.
ScienceMaps can help shape future resource portals, especially those using GIS, so that they become more effective instructional tools. The significance of ScienceMaps lies in its ability to address the issue of how to assist educators in developing more in-depth science content knowledge while building a structured approach for teaching science standards using GIS technology. Furthermore, the use of an Internet-based GIS makes it possible to bring geospatial data to anyone with access to an Internet browser. There is no need to run specific GIS software or download data; the science lessons and data layers are already constructed and available for immediate viewing and analysis through the various GIS applications.
Table 2. Descriptive statistics for individual SUS questions

SUS Questions                                                                                  Mean   Std. Dev.
Q1: I think that I would like to use this system frequently.                                   1.71   1.05
Q2: I found the system unnecessarily complex.                                                  2.41   1.06
Q3: I thought the system was easy to use.                                                      2.06   1.09
Q4: I think that I would need the support of a technical person to be able to use the system. 2.53   1.07
Q5: I found the various functions in this system were well integrated.                        2.29   0.85
Q6: I thought there was too much inconsistency in this system.                                2.35   1.00
Q7: I would imagine that most people would learn to use this system very quickly.             1.47   0.94
Q8: I found the system very cumbersome to use.                                                1.88   0.99
Q9: I felt confident using the system.                                                        2.35   1.17
Q10: I needed to learn a lot of things before I could get going with this system.             2.82   0.81
The future effort of ScienceMaps is to use both the knowledge gained and the lessons and applications developed to provide the foundation for an initiative that would be replicated nationwide. This would include customized lessons and applications for individual states' science standards, professional development seminars, and evaluation of effectiveness. States with attributes similar to California (large student and teacher populations, and a high capacity to use technology but low use and access) would be selected first for this initiative.
There is much enthusiasm about ways that these technologies can help shrink the digital divide. ScienceMaps focuses on the integration of technology into the secondary science curriculum by emphasizing effective use while simultaneously addressing accessibility issues and learning styles. Furthermore, it promotes the use of innovative GIS technology in a discipline with which it is not usually associated.
While programs exist to promote the use of GIS in education, they require a high level of proficiency in GIS skills and knowledge, expensive software, and a tremendous time commitment. ScienceMaps concentrates on using GIS to teach, not on teaching GIS.
References
Adelman, N., Donnelly, M. B., Dove, T., Tiffany-Morales, J., Wayne, A., & Zucker, A. (2002). The integrated studies of educational technology: Professional development and teachers' use of technology (Tech. Rep. No. SRI Project P10474). Menlo Park, CA: SRI International.
American Institutes for Research. (2001). Microsoft Office XP vs. Office 2000 comparison test public report (Tech. Rep. No. AIR Project No. 01674.001). Washington, DC.
Archer, J. (1998). The link to higher scores. Education Week — Technology Counts, 1998 (p. 18).
Becker, H. J. (2000). Findings from the teaching, learning, and computing survey: Is Larry Cuban right? Paper presented at the Council of Chief State School Officers Annual Technology Leadership Conference, Washington, DC.
Bednarz, S. W. (2004). Geographic information systems: A tool to support geography and environmental education? GeoJournal, 60(2), 191–199.
Borgman, C. L., Leazer, G. H., Gilliland-Swetland, A., Millwood, K., Champeny, L., Finley, J., et al. (2004). How geography professors select materials for classroom lectures: Implications for the design of digital libraries. In Proceedings of the 4th ACM/IEEE-CS Joint Conference on Digital Libraries (pp. 179-185). Tucson, AZ: ACM Press.
Brooke, J. (1996). SUS: A ‘quick and dirty’ usability scale. In I. McClelland (Ed.), Usability evaluation in industry (pp. 189-194). London: Taylor & Francis Ltd.
CEO Forum. (2001). Year 4 StaR report. Retrieved 2003, from http://www.electronicschool.com/2001/09/0901ewire.html#forum
Drew, D. E. (1996). Aptitude revisited: Rethinking math and science education for America’s next century. Baltimore, MD: The Johns Hopkins University Press.
Du, D., Havard, B. C., Olinzock, A., & Yang, Y. (2004). The impact of technology use on low-income and minority student academic achievements. Paper presented at the American Educational Research Association 2004 Annual Meeting, San Diego, CA.
Ely, D. P. (2002). Trends in educational technology (5th ed.). Syracuse, NY: ERIC Clearinghouse on Education and Technology.
Farrell, B. A., & Kotrlik, J. W. (2003). Design and evaluation of a tool to assess strategical information processing styles. Journal of Vocational Education Research, 28(2), 141-160.
Fox, E. (2005). Technology counts, 2005: Tracking U.S. trends. Education Week, 24, 40-79.
Gaudet, C., Annulis, H., & Carr, J. (2001). Workforce development models for geospatial technology. Hattiesburg, MS: The University of Southern Mississippi, Geospatial Workforce Development Center.
Gross, P., Goodenough, U., Haack, S., Lerner, L., Schwartz, M., & Schwartz, R. (2005). The state of state science standards. Washington, DC: Thomas B. Fordham Foundation.
Gutierrez, M., Coulter, B., & Goodwin, D. (2002). Natural disasters workshop integrating hands-on activities, Internet-based data, and GIS. Journal of Geoscience Education, 50(4), 437-443.
Hanson, K., & Carlson, B. (2005). Effective access: Teachers’ use of digital resources in STEM teaching. Newton, MA: Gender, Diversities, and Technology Institute at the Education Development Center.
Hedges, L. V., Konstantopoulos, S., & Thoreson, A. (2000). Designing studies to measure the implementation and impact of technology in American schools. Paper presented at the Conference on The Effectiveness of Educational Technology: Research Designs for the Next Decade, Menlo Park, CA.
Hilton, B. (2003). The impact of open source software and Web services on information system development: An investigation of proximate computing. Claremont, CA: School of Information Science, Claremont Graduate University.
Hilton, J. (2003). The effects of technology on student science achievement. Clare- mont, CA: School of Educational Studies, Claremont Graduate University.
Hilton, J. (2005). Narrowing the digital divide: Technology integration in a high- poverty school. In D. Carbonara (Ed.), Technology literacy applications in learning environments (pp. 385). Hershey, PA: Idea Group Publishing.
Hilton, J. (2006). The effect of technology on student science achievement. In E. M. Alkhalifa (Ed.), Cognitively informed systems: Utilizing practical approaches to enrich information presentation and transfer (pp. 346). Hershey, PA: Idea Group Publishing.
Houtsonen, L., Kankaanrinta, I.-K., & Rehunen, A. (2004). Web use in geographical and environmental education: An international survey at the primary and secondary level. GeoJournal, 60(2), 165-174.
Jerald, C. D. (1998). By the numbers. Education Week — Technology Counts (Vol. 18).
Joseph, E. (2004). Community GIS: University collaboration and outreach with K-12 teachers. Paper presented at the ESRI 2004 Users Conference, San Diego, CA.
Levin, D., Arafeh, S., Lenhart, A., & Rainie, L. (2002). The digital disconnect: The widening gap between Internet savvy students and their schools. Washington, DC: The Pew Internet & American Life Project.
Levinson, E. (2000). Technology and accountability: A chicken-and-egg question. Converge, 3(11), 58-59.
Manzo, K. K. (2001). Academic record. Education Week — Technology Counts (Vol. 20, pp. 22-23).
Marsh, T., Wong, W. L., Carriazo, E., Nocera, L., Yang, K., Varma, A., et al. (2005). User experiences and lessons learned from developing and implementing an immersive game for the science classroom. Paper presented at HCI International 2005 — 11th International Conference on Human-Computer Interaction, Las Vegas, NV.
Musgrave, S., & Ryssevik, J. (2000). NESSTAR final report. Brussels: Networked Social Science Tools and Resources.
Parsad, B., & Jones, J. (2005). Internet access in U.S. public schools and classrooms: 1994–2003 (Rep. No. NCES 2005015). Washington, DC: National Center for Education Statistics.
Quintana, C., & Zhang, M. (2004). IdeaKeeper notepads: Scaffolding digital library information analysis in online inquiry. Paper presented at the Conference on Human Factors in Computing Systems (CHI 2004).
Rumberger, R. W. (2000). A multi-level, longitudinal approach to evaluating the effectiveness of educational technology. Paper presented at the Design Meeting on Effectiveness of Educational Technology, Menlo Park, CA.
Schmitt, C. (2002). Technology in schools: Suggestions, tools, and guidelines for assessing technology in elementary and secondary education (Rep. No. NCES 2003313). Washington, DC: National Center for Education Statistics.
Skinner, R. A. (2002). Tracking tech trends. Education Week — Technology Counts (Vol. 22, pp. 53-56).
Trotter, A. (1999). Preparing teachers for the digital age. Education Week — Technology Counts (Vol. 19, pp. 37-43).
U.S. Bureau of the Census (1998). Statistical Abstract of the United States: 1998 (118th ed.). Washington, DC: U.S. Government Printing Office.
University of Wisconsin - Madison. (2004). The Internet Scout Project. Retrieved 2005, from http://scout.wisc.edu/Projects/CWIS/index.php
Wetzel, D. R. (1999). A model for the successful implementation of instructional technology in science education. Paper presented at the Mid-South Education Research Association Annual Conference, Point Clear, AL.
Williams, R. S. (2002). Future of education = Technology + teachers. Washington, DC: U.S. Department of Commerce Technology Administration.