
BRIDGE 2: INCREASING KNOWLEDGE ABOUT EDUCATIONAL RESEARCH: DESIGN AND DATA

The author identified significant improvements in research design and data gathering in her 2003 progress report compared with her 1998 project. She attributes this improvement in practice to her coursework that focused on current educational research knowledge and skills essential for developing assessment and evaluation processes.

Prior to these courses, she had no formal exposure to educational research. Like many librarians of her generation, her library education preceded any emphasis on critical self-reflection as an administrator/researcher or on formal research design and implementation. Among the 2003 improvements were the addition of a reflective process prior to accepting the new project, the development of a research design, and the proper use of appropriate data-gathering and data analysis techniques.

Two phases of the 2003 progress report provide examples of bridging the theory and application of educational research. One phase was the author’s process for determining whether she understood the research question and was appropriately equipped to address the question. The other phase involved designing the project research with appropriate data gathering and analysis activities.

In Spring 2003, the college president asked the author to consider taking on the progress report project. Unsure, she assessed her role in the institutional environment using SWOT analysis (strengths, weaknesses, opportunities, threats) (matrix analysis in Krathwohl, 1998) in order to acquire and organize information to inform her decision. She needed to determine whether she could be a tenable leader of such a project. Her coursework had emphasized the importance of having sound bases for successful leadership and of recognizing when to lead and when to follow. In this case, she considered her minor position in the organizational structure and governance, the interrelationships among the administrative and faculty leaders, and the critical importance of accreditation to the institution (Brubacher & Rudy, 1997; Kezar, 2001; NCA-CIHE, 1997). The author used the SWOT analysis to clarify her role as the report coordinator and writer. Buoyed by the president’s support of her resulting project proposal, the author accepted the challenges presented by this additional assigned duty. Applying that decision process to library programs and services, one would place a high priority on broad-based preplanning and on winning early and significant formal institutional support for library initiatives.
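The SWOT-based decision process described above can be sketched as a simple four-quadrant data structure with a crude tally heuristic. The quadrant entries, field names, and the tally rule below are hypothetical illustrations for expository purposes only, not the author’s actual analysis, which weighed factors qualitatively.

```python
# Minimal sketch of a SWOT matrix informing a go/no-go decision.
# All entries and the tally heuristic are hypothetical illustrations.

def swot_summary(matrix):
    """Tally helpful factors (strengths + opportunities) against
    harmful ones (weaknesses + threats) and suggest a leaning."""
    helpful = len(matrix["strengths"]) + len(matrix["opportunities"])
    harmful = len(matrix["weaknesses"]) + len(matrix["threats"])
    lean = "accept" if helpful >= harmful else "reconsider"
    return {"helpful": helpful, "harmful": harmful, "lean": lean}

# Hypothetical quadrant entries, loosely echoing themes in the text.
matrix = {
    "strengths": ["familiarity with accreditation guidelines",
                  "recent coursework in research design"],
    "weaknesses": ["minor position in governance structure"],
    "opportunities": ["presidential support for a project proposal"],
    "threats": ["institutional risk if the report falls short"],
}

print(swot_summary(matrix)["lean"])  # prints "accept" (3 helpful vs. 2 harmful)
```

A real SWOT analysis would of course weigh factors rather than count them; the point of the sketch is only that the matrix organizes information so a decision can be reasoned about explicitly.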

A second important decision requiring a systematic assessment approach was selecting the project’s research design. Unlike the 1998 self-study, the 2003 project had a research design and gave attention to appropriate quality control. The author chose a qualitative design appropriate to the purpose of the report and developed a qualitative content analysis tool, relying on knowledge gained from coursework.

The author’s design decisions acknowledged the differences between qualitative and quantitative research and their appropriate uses (Krathwohl, 1998; Newman & Benz, 1998). Assessment of student learning involves a variety of quantitative and qualitative research strategies. On the other hand, assessing the progress of implementing an assessment plan depends on an accumulation of qualitative data. Such data needs chronological organization to track and evaluate progress. Detailed data could demonstrate positive change and progress, no progress, or even negative change since 1998 for each of five required categories of characteristics (NCA-HLC, 2003a). Further, to increase the perceived truth-value of report conclusions, such changes could be measured against a set of standards or accepted norms. Based on these criteria, the author designed the assessment project to chart and match documented institutional characteristics with those described in a rubric, the NCA Levels of Implementation, Patterns of Characteristics (2000).

Newman and Benz (1998) supplied more than a dozen specific criteria for systematically elevating the truth-value of the qualitative research design of the project report. Almost all of these criteria were met through the research plan for the project. In contrast, the 1998 self-study did not even have a research plan, just miscellaneous data-gathering strategies without a consistent philosophical research paradigm. The lesson learned was that the quality of academic library research can benefit from the selection of appropriate research paradigms with integrated design.

Missing in the research process for the 2003 progress report was peer debriefing by someone external to the institution but familiar with accreditation expectations and guidelines and with student learning assessment (Newman & Benz, 1998). A critique by such a person, in dialog with the major stakeholders in the academic areas, would have been a valuable exercise in shared learning (Senge, 1990). Such a critique could have helped build shared understanding and internal expertise and increased the planning value of the report (Newman & Benz, 1998, pp. 51–52). Nevertheless, the accrediting agency response to the progress report would provide a third-party critique, a strategy implicit in other aspects of its accreditation processes as well (Mulhern, 2000a).

The author also bridged the theory and practice of educational research in her design choices for project data gathering and evaluation. She discovered multiple internal interpretations of what had or had not been accomplished in learning assessment in the previous five years. As cautioned in the agency documents (NCA-HLC, 2000), a college’s decentralized approach to assessment often hinders development of shared understandings. To both demonstrate and resolve this common organizational communication barrier, the author designed a chronology template as a way to systematically organize the qualitative data supplied by multiple sources. This template was flexible in accommodating different types of academic programs and the different interpretations of progress. Coding the data entries in the chronology spreadsheet identified common themes and eased sorting the data for later analysis.
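The chronology template and coding step described above can be sketched as follows. The field names, dates, sources, and theme codes are hypothetical, chosen only to illustrate the two operations the text describes: sorting qualitative entries chronologically and grouping them by code for analysis.

```python
from collections import defaultdict

# Minimal sketch of a coded chronology template: dated entries from
# multiple sources, each tagged with a theme code. All entries and
# codes below are hypothetical illustrations.
entries = [
    {"date": "2001-09", "source": "Library",
     "code": "DATA", "note": "began collecting instruction feedback"},
    {"date": "1999-04", "source": "Biology dept.",
     "code": "PLAN", "note": "drafted course-level learning outcomes"},
    {"date": "2000-02", "source": "English dept.",
     "code": "PLAN", "note": "adopted portfolio assessment pilot"},
]

# Chronological organization tracks progress over time.
entries.sort(key=lambda e: e["date"])

# Coding groups entries by common theme for later analysis.
by_code = defaultdict(list)
for entry in entries:
    by_code[entry["code"]].append(entry)

for code in sorted(by_code):
    print(code, [e["date"] for e in by_code[code]])
```

In spreadsheet terms, the sort corresponds to ordering rows by a date column and the grouping to filtering rows by a code column; the sketch simply makes those two moves explicit.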

In summary, the author learned to value systematic, systemic educational research as an intellectual bridge because it can provide a common information base for participants in any academic change process and a baseline for cycles of evaluation. The empirical research approach practiced by scientific scholars since the Enlightenment (Boorstin, 1983) remains fundamental to research in education (Willower & Forsyth, 1999). Even so, researchers can balance quantitative research strategies with ways to acquire and evaluate qualitative information about a complex culture and its activities (interactive continuum in Newman & Benz, 1998). Constructing this intellectual bridge allowed the author to move away from using self-taught research strategies based on descriptive input/output data. Accessing professional research terminology and techniques enabled her to strengthen a goal-appropriate educational research design. With increased knowledge about assessment and research, she reports increased confidence in selecting library research projects appropriate to her skill set and with higher design quality.

BRIDGE 3: DEVELOPING INFORMED SENSITIVITY