

Journal of Education for Business

ISSN: 0883-2323 (Print) 1940-3356 (Online) Journal homepage: http://www.tandfonline.com/loi/vjeb20

Assurance of Learning (AoL) Methods Just Have to Be Good Enough

To cite this article: (2007) Assurance of Learning (AoL) Methods Just Have to Be Good Enough, Journal of Education for Business, 82:4, 241-243, DOI: 10.3200/JOEB.82.4.241-243

To link to this article: http://dx.doi.org/10.3200/JOEB.82.4.241-243

Published online: 07 Aug 2010.


INTERVIEW/KATHRYN MARTELL

BIO. Kathryn Martell (PhD, University of Maryland; BA, University of Chicago) is Associate Dean of the School of Business and Professor of Management at Montclair State University. Prior to 2002, she was Associate Dean for Academic Affairs at the School of Business at Southern Illinois University at Edwardsville (1999–2002). Since the new accreditation standards were passed in April 2003, Dr. Martell has worked closely with the AACSB to help schools meet the assurance of learning (AoL) standards. According to Dr. Martell, more than 700 faculty and administrators from 250 universities have attended the AACSB seminars on assessment of student learning that she facilitates. She is also a frequent speaker at AACSB national and regional conferences. She developed the content for AACSB’s online assessment resource center (www.aacsb.edu/ARC), and edited (alongside Dr. Thomas Calderon) the newly released book published by the AACSB and Association for Institutional Research (AIR), Assessment of Student Learning in Business Schools: Best Practices Each Step of the Way. Dr. Martell recently talked to the Journal of Education for Business’s Anjoo Pokharel about program assessment issues in business education.

Copyright © 2007 Heldref Publications

JEB: Business schools seem to be aware that accreditation standards set by the AACSB are closely tied to program assessment. What—do you think—are the schools doing to meet this standard through program assessment (e.g., curriculum innovativeness)?

Martell: The AACSB standards are divided into three sections, and one of the sections is assurance of learning (AoL), that is, assessment. Schools must successfully meet all standards to attain or maintain accreditation. The AoL standards call for assessment of student learning for each degree program in the business school. For each degree program, learning goals must be articulated and evaluated through direct measures. The assessment data must then be used to improve the curriculum. The AACSB allowed for a transition period, which is now over. Starting this year (2007), schools are expected to have their assessment plans fully implemented for each degree program.

With regard to curriculum management (now called AoL), there are two main differences from what was required under the previous set of standards. First, under the old standards, curriculum was evaluated against a prescribed list of topics and skill areas such as multicultural perspective or ethical reasoning. Second, the key form of documenting curriculum management pre-2003 was to indicate how business schools were teaching these topics, combined with survey data that reflected students’ or alumni perceptions of their learning.

Today, the curriculum must be aligned with the learning goals that the faculty establishes for each degree program. The documentation to meet these standards must be focused on students’ demonstration of their learning, called the direct measures. This is a major departure from what was required in the past, and most business schools did not have assessment programs that would meet these specifications when the standards were passed in 2003. Meeting the AoL standards has required a significant effort from most business schools.

JEB: Do program assessment scores affect obtaining or maintaining accreditation? This seems to put pressure on schools to mask problems with their programs. What is being done to prevent that?

Martell: The standards call for schools to assess student learning and use that assessment data to improve their curriculum. At least initially, schools will not be accountable for the assessment results. Schools have the freedom to uncover a problem with their students’ learning—that they cannot use quantitative methods, for example—as long as that finding is used to make changes in the curriculum. The point is not to evaluate one school’s effectiveness versus another’s. The point is to systematically and routinely evaluate students’ performance on learning goals that the faculty deem important and make improvements as needed, over time, to improve student learning.

“Assurance of Learning (AoL) Methods Just Have to Be Good Enough”


JEB: How close are educators to standardizing program assessment methods? Can it work just like SAT, GMAT, or ETS’s Major Field Achievement Test (MFAT) for Business? Why or why not?

Martell: AACSB has no interest in standardizing assessment. Just as the faculty have the freedom to choose learning goals that best fit their mission, so do they have the flexibility to choose methods—as long as they are direct measures—to evaluate student learning.

JEB: Do program assessment scores influence school rankings such as the one published in U.S. News & World Report?

Martell: Not directly.

JEB: Who sets guidelines or gives training to the faculty to “close the loop” after program assessment is complete? How are the faculty compensated for their time and effort in planning, designing, and implementing program assessment methods? Or, is program assessment part of their teaching responsibility?

Martell: The standards call for schools to use assessment data to improve their curriculum, also known as closing the loop. There is no prescribed way to close the loop. Alternatives include modifying an existing required course or courses, adding a new course, modifying entrance requirements or a curriculum within a major, or faculty development. For example, let’s say that your assessment results indicate that your students’ writing skills are not up to the faculty standard. Here are some closing-the-loop options: a new writing course could be required; selected courses across the curriculum could be modified to include more writing; entrance requirements could be modified to include a writing skills test; if students from certain majors underperformed others in writing, that curriculum could be revised to include more writing; if transfer students underperformed native students in writing, an additional writing requirement could be part of the transfer requirements; and faculty development could focus on writing across the curriculum. The choice between these alternatives will depend on the circumstances of the business school, but some action is required. If the faculty has identified a particular learning goal, it is saying that it is willing to take accountability, over time, for students meeting an acceptable standard for this goal. “We can’t be expected to teach them how to write” is not an acceptable response.

With regard to faculty compensation, this depends on the implementation model. My survey data indicates that about half of the time the dean’s office takes the lead in administering assessment, while in the other half, a faculty member or committee takes the key role. Designing and administering a program assessment program is not just “service”—it is too time consuming. Because the new standards were passed almost 4 years ago, there has been an increase in the use of release time—typically one course a semester—to a faculty member who is in charge of assessment, and the use of faculty stipends—typically $1,000–$1,500—to faculty for 10–20 hours of work in the summer to “do” the assessments—such as using a rubric to assess student writing assignments. With regard to implementing an assessment activity—assigning an individually written case analysis, for example, or including a new international module—in my opinion, this falls within a faculty member’s normal teaching responsibility.

JEB: In your opinion, is it true that faculty members often fail to understand certain nuances of program assessment, such as the importance of direct and indirect assessment or the process of course-embedded assessment? How can schools encourage faculty to adopt the assessment process as a formal part of teaching and service?

Martell: Absolutely. Most business faculty did not have education courses included in their PhD curriculum, and are initially overwhelmed with assessment methodology and language. Training is the solution to this problem and, not surprisingly, hundreds of business schools have sent faculty to assessment seminars over the past 3 years. Faculty’s main concerns about assessment include the time involved, the use of results to evaluate their own teaching, and a loss of freedom in the classroom. Although assessment clearly does involve a time commitment, most faculty members will not be involved with the nuts and bolts of assessment. Once faculty are made aware of what assessment involves, many realize it is far less time consuming than they imagined.

With regard to the second concern, program assessment data should never be used to assess individual faculty members. Although I can understand that concern, I have never seen program assessment data used for that purpose. Finally, assessment programs do not require standardization across courses—everyone teaching from a common syllabus, for example—but faculty should not be surprised if they have to do something different in their classes as a result of assessment. The point of assessment is to diagnose areas for improvement in student learning in the business curriculum. As problems are diagnosed, they will need to be addressed. That may mean that some courses place a greater emphasis on building certain skills or on reinforcing knowledge that students learned earlier in their program. Other faculty members may need to include individually prepared assignments in their courses for assessment purposes.

It is hard to imagine a scenario where an honest, comprehensive program assessment will not reveal some areas of needed curriculum improvement. Schools can facilitate faculty’s involvement in assessment programs through training, leadership, support, and evaluation. Over time, I would expect that many schools would formally include assessment activities as part of how faculty responsibility is defined and evaluated. Program assessment is a requirement for both AACSB and regional accreditation; some states have requirements as well. To the extent that accreditation is important to a school—and it is hard to imagine many situations in which this would not be the case—it is critical that it develops supporting systems. Assessment must be faculty driven; therefore, their involvement is critical. To ensure that this critical responsibility is fulfilled and rewarded, assessment activities must become part of formal evaluation processes.


JEB: What are the pitfalls of implementing program assessment methods? In your opinion, how can they be overcome?

Martell: The first point to make about AoL methods is that the emphasis is on direct measures of student learning. Surveys can provide some useful feedback about student satisfaction, but as an AoL measure you are better off using surveys strictly as a secondary measure. Another quick point to make about methods is that the student product used for the assessment has to be an individual product. You cannot use a team-written paper to evaluate writing skills or financial analysis skills, for example, because group projects typically represent the best students’ work. Team presentations are acceptable to evaluate oral communication skills as long as students do not self-select. If they do, once again you’re just evaluating the best students’ work.

Different methods have different trade-offs. Some require more faculty resources, while others require more financial resources. Some can be implemented quickly; others require more development time. Standardized measures can be quickly implemented, but can be relatively expensive and are not, generally speaking, as tied to the school’s curriculum as are homegrown measures. Some methods, primarily those that do not also fill a course requirement for students, can present motivational issues.

My advice with regard to methods is not to strive for the same level of rigor that is required for scholarly research. Make an honest effort to assess the learning goals that are important to the school. How perfect does a measure have to be to provide the basis for a conclusion on whether or not students can write, conduct a statistical test, or download and analyze a financial spreadsheet? Very often, faculty become so preoccupied with the rigor and validity of the measures that their AoL efforts get stymied. Just get started. You will revise as you gain experience.

Other advice that I offer on methods is, wherever possible, (a) build on something you are already doing, (b) use the same method to gather data on two goals (for example, use a case to assess both writing and problem-solving skills), and (c) gather some descriptive data on the student as you are implementing the assessment that will help you analyze the data later. For example, it’s always a good idea to ask students to indicate on the assessment their major or concentration, transfer status, and how many credits they’ve completed. Collecting descriptive data from the student saves a lot of time when you’re trying to analyze the assessment results later.
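A minimal sketch of why that last piece of advice pays off (the field names, rubric scale, and sample records here are hypothetical illustrations, not from the interview): if each scored assessment already carries the student’s major and transfer status, the by-group breakdowns used to close the loop take only a few lines of standard-library Python.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical rubric scores (1-4 scale) captured together with the
# descriptive fields recommended above (major, transfer status, credits).
records = [
    {"major": "Finance",    "transfer": False, "credits": 90, "writing": 3.0},
    {"major": "Finance",    "transfer": True,  "credits": 75, "writing": 2.0},
    {"major": "Marketing",  "transfer": False, "credits": 60, "writing": 3.5},
    {"major": "Marketing",  "transfer": True,  "credits": 80, "writing": 2.5},
    {"major": "Management", "transfer": False, "credits": 95, "writing": 3.0},
]

def mean_by(records, key, score="writing"):
    """Average a rubric score for each value of a descriptive field."""
    groups = defaultdict(list)
    for r in records:
        groups[r[key]].append(r[score])
    return {value: round(mean(scores), 2) for value, scores in groups.items()}

# Because major and transfer status were captured at assessment time,
# each closing-the-loop question is a single call.
print(mean_by(records, "major"))     # writing scores by major
print(mean_by(records, "transfer"))  # transfer vs. native students
```

Without those fields, producing the same breakdowns means matching assessment sheets against registrar records after the fact, which is exactly the extra analysis time the advice above is meant to avoid.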

Finally, always, always, always choose a method that is going to produce data that you can use. I have seen some psychometric tests, for example, on leadership, multicultural affinity, moral reasoning, etc., for which I have a difficult time imagining how the results might be used to improve the curriculum. My colleague, Doug Eder, says, “A pig doesn’t get any fatter merely by weighing it ... you just end up with a very annoyed pig!” Don’t waste time, money, or goodwill by using an assessment method that is not going to produce actionable data.

JEB: You have experience teaching capstone courses. Do you think a capstone course should be required of all majors to capture student growth and maturity in their area of study? Why?

Martell: My PhD is in strategy, and I’ve taught the capstone course my entire academic career. Given my background, it’s probably not a surprise that I think the capstone (strategy) course plays a very important role in both the undergraduate and MBA curriculum. My work in assessment has only strengthened this conviction. Many schools, including my own, are drawing conclusions from their AoL processes that retention of knowledge is a major problem for students. The capstone course provides us with a venue for reinforcing critical knowledge in addition to training students about competitive dynamics and how to integrate across the curriculum. It is no wonder that many schools, when they go through their curriculum alignment exercise, find that the capstone course incorporates many of their program’s learning goals.

JEB: You have recently edited a two-volume book on program assessment published by AACSB/AIR. You must have come across exciting ideas in the areas of program assessment. Please share some of the ideas that struck you as brilliant.

Martell: I suppose I am a bit like a movie reviewer when it comes to assessment. Just as a movie reviewer who watches hundreds of movies a year tends to become enamored with a movie that is different, so too do I get excited about a unique approach to assessment. The book on best practices provides many of these examples: Valparaiso’s assessment center, Seton Hall’s assessment panel, Cal State Fullerton’s comprehensive approach to writing assessment, the use of business practitioners to evaluate presentation skills at Eastern Kentucky, and Texas Christian’s selection processes. I have provided some other examples (including a few from my own school, Montclair State) in the article in this journal issue (see p. 189), and still more on the AACSB assessment resource center Web page, including the Cal State Chico STEPs program and a terrific original measure that the faculty at the College of Business at Sam Houston developed to assess critical thinking. But here’s my point: AoL doesn’t have to be brilliant. It doesn’t even have to be original. In fact, the assessment community is very generous and collegial about sharing their methods. It just has to be “good enough”—good enough to help you diagnose problems with your students’ learning. Don’t waste your time on trying to be brilliant. An honest effort that leaves you with enough energy for the real task at hand—improving your students’ learning—is the best approach.

NOTE

Correspondence concerning this interview should be addressed to Dr. Kathryn Martell, Associate Dean and Professor of Management, School of Business, Montclair State University, Montclair, NJ 07043.

E-mail: martellk@mail.montclair.edu
