
took the survey and then responded to eight questions regarding the clarity of the purpose, questions, and answers of the survey as well as one question asking how long the survey took to complete.

Results from the pilot test indicated that the survey functioned as intended. The questionnaire revealed that participants understood the purpose of the survey and found the questions clear overall. Minor changes to the wording of some questions were suggested in the questionnaire; these changes were made, but the suggestions did not indicate any confusion about how to respond or about the meaning of the questions.

To determine the reliability of the survey instrument, I tested the original survey and then retested it in an alternate form with 6 ETS members. As indicated by Litwin, alternate-form reliability requires that “questions and responses are reworded or their order changed to produce two items that are similar but not identical.”31 I sent the survey in its original form to 6 participants and then sent the survey with minor changes to the response orders to the same 6 participants one week later. Comparing the test and the alternate-form retest revealed no significant differences between the two sets of responses, indicating that the participants understood the questions consistently and supporting the reliability of the instrument.

3. Why do faculty perceive that faculty development for teaching online is of lesser quality in their institutions? Is it the institution? Is it that faculty are not attending?

4. What role does community play in the spiritual development of a theological student? How does online learning promote or hinder that development through community?

5. Why do faculty who believe online learning will not improve in achieving the goals of theological education still think it will grow in the future?

6. Why do you think there is a relationship between one’s view of teaching style and one’s view of online learning’s ability to achieve the goals of theological education?

7. Do you think faculty perceptions will improve in the future? Why or why not?

8. Is there anything else that you would like to add to this discussion that has not been covered already?

Procedures

Phase 1: Survey

Because two survey instruments were combined, an expert panel was needed to review the combined instrument for validity. Next, the survey instrument underwent a pilot test to ensure validity and reliability. Once these two steps were completed, the survey was ready for distribution to the population.

Upon approval from the ethics committee, I provided ETS with an email that included directions and a link to the survey located on SurveyGizmo.32 Participants were offered access to the results of the study as an incentive for participation. According to trends identified by ETS, full members typically have a higher rate of response to emails sent by ETS between February 15 and March 15 on both Monday and Friday mornings.

Therefore, the first email was sent on Friday, February 19, 2016. A follow-up email to remind participants of the survey was sent on Friday, February 26, 2016, and again on March 4, 2016. The survey closed on March 11, 2016.

Once the survey closed, data was analyzed for patterns and themes in addition to any statistically significant relationships revealed in the cross-tabulations. Responses to the open-ended questions were coded to find common themes among participants.

32See appendix 6 for email.

After the data was analyzed, the focus group interview questions were determined based on the survey data and on this study’s research questions. The focus group questions were designed both to explain the survey findings and to assess what the participants perceived about the present and future state of online learning in graduate-level theological education.

Phase 2: Focus Group

Participants of the survey who indicated interest in volunteering for the focus group were contacted to confirm their continued interest and were provided two date and time options.33 Volunteers were notified that their participation was confidential and that their identity would not be reported. Volunteers were also informed of the nature of the focus group and of the hardware and software needed for the virtual meeting in Adobe Connect. Upon confirmation, I provided the volunteers with a link to the meeting room and further details of their involvement.

The focus group was conducted through Adobe Connect software, which allowed the meeting to be recorded. Confirmed participants received a link to the Adobe Connect meeting room and were guided on how to set up their audio and microphone at the beginning of the session. I recruited a technician to assist in the focus groups by turning microphones on and off, pulling up chat boxes, running polls, and helping with any technical issues throughout the meeting.

At the beginning of the focus group meeting time, I provided a brief introduction as well as instructions for how the participants could respond to the questions posed. Participants were given the option to respond in the chat provided or audibly. During the focus group, eight questions were asked regarding the findings of the survey. The format was semi-structured, allowing the participants and the moderator to follow up on comments or statements to encourage further explanation.

33See appendix 7 for email.

Questions, as well as relevant tables and data, were provided on screen for the participants. Each focus group lasted for one hour.34

Once recorded, the focus group meetings were transcribed and coded according to the themes that emerged. I first looked for key words used frequently throughout the transcription. These words were analyzed according to how the participants used them in context. Next, I identified themes within the focus group transcription that fit into the categories identified within the literature: student-related perceptions, instructor-related perceptions, institution-related perceptions, and course-related perceptions about online learning. Themes that emerged from the focus group but did not fit into these categories were included as emergent data. Definitions of each category and theme, as well as examples taken directly from the transcription of the focus group, are included in the content analysis of chapter 4.35
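As a rough illustration of the first coding step described above, keyword frequency in a transcript could be counted as in the following minimal sketch. The file name and stop-word list are illustrative assumptions only; the study does not specify that any software tool was used for this step.

# A minimal sketch of counting frequently used key words in a transcript.
# The file name and stop-word list are hypothetical assumptions; no
# particular tool used in the study is implied.
import re
from collections import Counter

with open("focus_group_transcript.txt", encoding="utf-8") as f:
    text = f.read().lower()

stop_words = {"the", "a", "and", "of", "to", "in", "that", "is", "it", "for"}
words = [w for w in re.findall(r"[a-z']+", text) if w not in stop_words]

# Print the twenty most frequent remaining words with their counts.
for word, count in Counter(words).most_common(20):
    print(f"{word}: {count}")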

34Duke University, “Guidelines for Conducting a Focus Group.”

35Hsiu-Fang Hsieh and Sarah E. Shannon, “Three Approaches to Qualitative Content Analysis,” Qualitative Health Research 15, no. 9 (November 2005): 1277-88.

CHAPTER 4

ANALYSIS OF FINDINGS

Research pertaining to evangelical faculty perceptions of online learning in graduate-level theological education has been neglected in the existing literature. These perceptions are valuable as institutions continue to press forward in expanding online courses and degrees. Therefore, an explanatory sequential study was conducted to determine evangelical faculty perceptions about online learning in graduate-level theological education. A survey and follow-up focus group were the primary means of collecting the necessary data. The following chapter provides a summary compilation of protocols, a summary of the findings according to the research questions, and an evaluation of the research design.

Compilation of Protocols

Phase 1: Survey

Once the survey was approved by the ethics committee of the Southern Baptist Theological Seminary and pilot tested for both validity and reliability, the Evangelical Theological Society emailed the full members inviting them to take the survey.1

Subsequent emails were sent each Friday until the survey was closed. SurveyGizmo was set so that respondents could not take the survey more than once from the IP address from which they originally took it, which ruled out duplicate responses. Partial responses were identified and were not used in this data.

The sample size required in order to obtain a 95 percent confidence interval was 336, based on the population size of 2,650 full ETS members.

1See appendix 6 for email contents.

That number was reached and exceeded. The survey was open for three weeks, and when the survey closed, the final count of participants was 459.
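Although the underlying calculation is not shown above, a required sample of 336 is consistent with the standard sample-size formula, assuming a 5 percent margin of error and maximum variability (p = 0.5) with a finite-population correction for N = 2,650; the following is a sketch under those assumptions.

\[
n_0 = \frac{z^2\,p(1-p)}{e^2} = \frac{(1.96)^2 (0.5)(0.5)}{(0.05)^2} \approx 384.2,
\qquad
n = \frac{n_0}{1 + \dfrac{n_0 - 1}{N}} = \frac{384.2}{1 + \dfrac{383.2}{2{,}650}} \approx 336.
\]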

Once this data set was complete, SurveyGizmo produced a summary report of the findings from which I derived the questions for the focus group. SurveyGizmo also was used to create cross-tabulations between demographic information and the responses for the survey in order to observe any trends or patterns among sub-groups within the population. I found the mean and the standard deviation for each question in an effort to understand further the significance of the responses within the survey. The mean score for each Likert scale was derived from attributing a score of 5 for “strongly agree” responses, 4 for “agree,” 3 for “neutral,” 2 for “disagree,” and 1 for “strongly disagree.” Therefore, a score higher than 3 indicated a more agreeable response while a score lower than 3 indicated a more disagreeable response. The findings of these protocols appear later in this chapter.
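As an illustration of this scoring procedure, the following minimal sketch computes a mean and standard deviation from response counts; the counts shown are hypothetical and are not taken from the actual survey data.

# A minimal sketch of the Likert scoring described above. The response
# counts below are hypothetical, not the actual survey data.
scores = {"strongly agree": 5, "agree": 4, "neutral": 3,
          "disagree": 2, "strongly disagree": 1}
counts = {"strongly agree": 40, "agree": 120, "neutral": 50,
          "disagree": 160, "strongly disagree": 89}  # hypothetical counts

n = sum(counts.values())
mean = sum(scores[k] * counts[k] for k in counts) / n
variance = sum(counts[k] * (scores[k] - mean) ** 2 for k in counts) / n
std_dev = variance ** 0.5

print(f"mean = {mean:.2f}, standard deviation = {std_dev:.2f}")
# A mean above 3 leans toward agreement; a mean below 3 leans toward disagreement.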

Phase 2: Focus Group

Over 100 survey participants volunteered for the follow-up focus group. Upon receiving their contact information, I sent an email containing two optional meeting times for the virtual focus group. In the first focus group, 5 participants were present and discussed the questions. In the second focus group, 10 participants were present and discussed the questions. The recommended number for focus groups was between 5 and 10 participants; therefore, each focus group achieved the recommended number of participants.2

Once the focus group recordings were transcribed, including both the chat responses and the audio responses from all participants, the data was entered on an Excel spreadsheet. Each question was analyzed according to the major themes found in the responses.

2Richard Krueger and Mary Anne Casey, Focus Groups: A Practical Guide for Applied Research, 5th ed. (Los Angeles: SAGE, 2015), 1-2.

A summary was created based on the general consensus of the responses, and representative remarks and comments were noted for inclusion in the data.

Synthesis of Data from Phase 1 and 2

In this explanatory sequential study, the quantitative data gained from the survey were the primary data source in answering the research questions. The focus group findings were used as a means to explain the quantitative data further. Therefore, the summary of findings is structured according to the research questions and includes both the survey data and focus group data where appropriate to identify the answers to the research questions.

Summary of Findings

In the following summary of findings, the data from both the survey and focus groups are divided according to their relevance to answering the research questions. First, a summary of the demographics of the sample population is articulated. Then a summary of the findings for each research question is provided below. Table 2 identifies which questions within the survey and focus groups are used to answer each research question.

The summary of findings is divided according to the categories listed in the table. Within each category, the findings from each instrument (survey and focus group) are explained.

Table 2. Research questions and instrumentation division

Category | Question

Demographics | Survey: #1-8, 11

RQ 1: Current State | Survey: #9, 12-18, 20-22; Focus Group: #1-4

RQ 2: Future State | Survey: #10, 19; Focus Group: #5-7

RQ 3: Comparison to Literature | Summaries as relevant

Demographic Findings of Sample Population

Survey question 1. Survey question 1 was “Identify your gender.” Out of the 459 respondents, 438, or 95.4 percent, were male and 21, or 4.6 percent, were female.

While this may seem like a polarized demographic, it is representative of the population. The Evangelical Theological Society does not track gender demographics for full members only, but it does for all members, including full, associate, and student members. In the general membership, males comprise 93 percent and females comprise 7 percent of the population. With this understanding, gender will not play a major statistical role in the findings, but it is appropriate to state that the survey sample is representative of the population with regard to gender.

Survey question 2. The next demographic question asked, “What is your age?” The responses were divided into age brackets of 25-34, 35-44, 45-54, 55-64, and 65+. Table 3 presents the age bracket representation for the sample population.

Table 3. Age demographic findings

Age Bracket Count Percentage

25-34 35 7.6

35-44 95 20.7

45-54 95 20.7

55-64 172 37.5

65+ 62 13.5

As shown in the table, the age bracket of 55-64 contained the largest number of respondents. Age brackets 35-44 and 45-54, however, combined for a total of 190 respondents. The average respondent was 47.9 years of age with a standard deviation of +/- 11.6 years. Age was an influential factor in several of the survey questions and was identified within each question as relevant.

Survey question 3. Survey question 3 asked respondents, “What is your denomination affiliation?” Respondents were given the ability to write in their individual response. By far, the most common response to this question was Baptist (with its variations including Southern Baptist, Independent, and General). Of the total 228 Baptist respondents, 117 were Southern Baptist, 79 indicated simply “Baptist,” and 70 were a variety of responses including Independent, American, General Conference, and others.

Survey question 4. Question 4 on the survey asked, “What is your faculty status?” Responses for this question were limited to “part-time (or adjunct instructor),” “full-time,” or “none.” Participants who chose “none” as their response were excluded from the survey so that only faculty responses were recorded. The sample of 459 respondents indicated either “part-time” or “full-time” as their response. Of the respondents, 132, or 28.8 percent, were part-time, and 327, or 71.2 percent, were full-time. This indicates that the majority of the total respondents were full-time faculty.

Faculty status made a statistically significant difference in some of the findings within the survey and will be identified within each question as relevant.

Survey question 5. Survey question 5 asked the respondents to “identify the number of years you have taught online.” Respondents were given six choices: 0, Less than 1, 1-4, 5-9, 10-14, and 15+. Table 4 indicates the distribution of responses to this question:

As seen in table 4, the most common responses for years taught online were the 1-4 and 5-9 brackets. The 78 respondents who indicated never teaching online proved to have a significant impact on several questions within the survey. In those questions, the mean for the total responses as well as the mean for the responses excluding these 78 respondents is calculated. The years of experience a respondent had teaching online was statistically significant in some of the survey questions. The impact of this demographic question is included within the analysis of those questions.

Table 4. Years taught online

Years Count Percentage

0 78 17.0

Less than 1 35 7.6

1-4 124 27.0

5-9 126 27.5

10-14 68 14.8

15+ 28 6.1

Survey question 6. The next survey question was similar to question 5, but asked respondents to identify the number of years they have taught face-to-face. Respondents were given the same six choices: 0, Less than 1, 1-4, 5-9, 10-14, and 15+. Table 5 indicates the distribution of responses to this question:

Table 5. Years taught face-to-face

Years Count Percentage

0 8 1.7

Less than 1 9 2.0

1-4 56 12.2

5-9 86 18.7

10-14 77 16.8

15+ 223 48.6

As seen in table 5, the sample population has more experience teaching face-to-face than online, as indicated by the most common response being 15 or more years of face-to-face experience. Of the total respondents, 48.6 percent indicated having 15 or more years of experience. The next largest group of responses was 18.7 percent in the 5-9 year span, closely followed by 16.8 percent within the 10-14 year span.

Survey question 7. Survey question 7 asked respondents, “What disciplines do you teach?” Respondents could select more than one discipline from a predetermined list. Among the highest recorded responses, 50.5 percent taught New Testament, 47.9 percent taught Theology, 38.6 percent taught Old Testament, and 35.7 percent taught Greek/Hebrew/Other Language. Participants were also able to write in a response if their discipline did not appear in the predetermined response list. Commonly written-in responses included Hermeneutics (10 responses), Apologetics (13 responses), and Ethics/Christian Ethics (9 responses).

Survey question 8. The next survey question was “In addition to a faculty role, do you also hold an administrative role at your institution?” This question was suggested by the expert panel and is relevant to the literature about faculty perceptions. In some of the higher education studies on faculty perceptions, administrator perceptions were also gathered for a comparison.3 Likewise, “Grade Level” by Allen and Seaman also compared faculty and administrative responses to discover if there were conflicting opinions between those leading the institution and those teaching the courses in the institution.4

In the present study, 55.8 percent of respondents did not hold an administrative role in addition to the faculty role, but 44.2 percent of respondents did in fact hold an administrative role. This demographic question proved to be statistically significant for many of the questions within the survey.

Survey question 11. The final demographic question was included in a later section of the survey to provide the respondents with an immediate context of application for their responses. The question asked participants to indicate if their institution offered (a) Online courses and online programs, (b) Individual online courses, but no degree programs consisting of entirely online courses, or (c) None.

3William Michael Wilson, “Faculty and Administrator Attitudes and Perceptions toward Distance Learning in Southern Baptist-Related Educational Institutions” (Ed.D. thesis, The Southern Baptist Theological Seminary, 2002).

4I. Elaine Allen and Jeff Seaman, “Conflicted: Faculty and Online Education, 2012,” Babson Survey Research Group, 2012, accessed February 25, 2015, http://eric.ed.gov/?id=ED535214.

As indicated in the responses, 69.1 percent indicated that their institution had both online courses and online programs, 24.6 percent indicated their institution only had individual online courses, but no degree programs consisting of entirely online courses, and 6.3 percent of respondents indicated that their school had no online courses or online degree programs.

Statistically significant findings. When the demographic questions were cross-tabulated with the entire survey, multiple findings were statistically significant. Appendix 9 provides a summary of these findings for reference according to the p-value calculated in each cross-tabulation. Additionally, these findings are mentioned in the analysis of each survey question as relevant. As is shown in the table, teaching online and holding an administrative position had the largest impact on the participants’ responses.
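The specific significance test behind these p-values is not identified above; the sketch below assumes a chi-square test of independence applied to a hypothetical cross-tabulation, as one way such p-values could be produced.

# A minimal sketch of testing a cross-tabulation for statistical significance.
# The contingency table is hypothetical, not the survey's actual data, and a
# chi-square test of independence is assumed; the study does not specify
# which test produced its p-values.
from scipy.stats import chi2_contingency

# Rows: holds an administrative role (yes/no); columns: Likert responses
# (strongly agree, agree, neutral, disagree, strongly disagree).
table = [
    [30, 70, 20, 60, 23],   # administrators (hypothetical counts)
    [25, 53, 19, 98, 61],   # non-administrators (hypothetical counts)
]

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p_value:.4f}, dof = {dof}")
# A p-value below 0.05 would indicate a statistically significant association
# between administrative status and the response distribution.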

Research Question 1: Current Perceptions

The first research question guiding this study was, “What are evangelical faculty perceptions of online learning in graduate-level theological education?” To answer this question, the survey instrument asked questions concerning the current state of online learning, operations of the faculty member’s institution regarding online learning, faculty-related issues, course-related issues, and student-related issues. Additionally, questions were asked based on the potential for online learning to achieve the objectives set by the Association of Theological Schools regarding the master of divinity degree.

Furthermore, focus group questions 1 to 4 sought to explain some of this data. The following summary of findings addresses research question 1.

General Perceptions of the Current State of Online Learning

Survey question 9a. Survey question 9 asked respondents to indicate their opinions about the current state of online learning. Under this question were six statements that participants were asked to rate on a 5-point Likert scale. The statement for question 9a reads, “Online education can be as effective in helping students learn as face-to-face education.” On the 5-point Likert scale, 12.0 percent strongly agreed, 26.8 percent agreed, 8.5 percent were neutral, 34.4 percent disagreed, and 18.3 percent strongly disagreed. The responses are illustrated in figure 1.

Figure 1. Online education effectiveness

As seen in figure 1, respondents were divided on whether or not online education was equal in effectiveness to face-to-face education, but a larger percentage of respondents had a negative view of online education’s effectiveness than those who had a positive view. The mean score for 9a was 2.80 with a standard deviation of +/- 1.34.5 It should also be noted that there were very few neutral responses to this question, indicating that most evangelical faculty had some opinion on this matter.
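For reference, this mean follows from the reported percentages and the Likert scoring described earlier in this chapter:

\[
\bar{x} = 5(0.120) + 4(0.268) + 3(0.085) + 2(0.344) + 1(0.183) \approx 2.80.
\]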

When this data is cross-tabulated with demographic independent variables, the number of years taught online (p = 0.0088), the number of years teaching face-to-face (p = 0.0435), and holding an administrative role (p = 0.0061) were all statistically significant factors in the response indicated by participants. Faculty who had fewer years

5As discussed earlier in this chap., a mean score lower than 3 indicates a more disagreeable response.

