School psychologists were asked to participate in this study on a voluntary basis. The researcher emailed a letter to 813 school psychologists using a list provided by the New York Association for School Psychologists (NYASP) asking them to participate. Of these, 116 participants completed the survey from start to finish, for an overall response rate of 14.3%. The majority of the participants were female (81.7%). To encourage participation, regional chapters of school psychologists in New York were also emailed the same letter.
Measures
A questionnaire developed by the researcher (see Appendix A) was used to gather information about each district’s elementary school policies, practices, curriculum, and overall school environment regarding LGBT-headed families. It was developed because no existing instrument specifically addressed the information desired. The content was based on a review of the relevant literature and includes questions about basic demographic information, policies about diversity, practices, curricula, and attitudes (GLSEN, 2008; Kosciw, Diaz, & Greytak, 2008).
Survey research offers several advantages: it is economical, and researchers can collect a large amount of data from a large population (Babbie, 2007). It also has several disadvantages: it is somewhat artificial, inflexible, and does not offer a full sense of the natural setting. Surveys are generally weaker on validity and stronger on reliability. The survey in the current study was therefore developed to assess the specific constructs needed to answer the research questions, in order to increase content validity.
The format of a questionnaire can influence the quality of the data collected, so the survey’s development followed recommendations by Fink and Kosecoff (1985). A clear format helps ensure that respondents answer all of the questions intended for them and increases reliability. Because the order of items can influence responses, questions were grouped by similarity and organized in a natural, logical sequence. The survey was worded to avoid biased words, phrases, and views, so that responses were not unfairly influenced. SurveyMonkey, a private company, was used to create a web-based survey that participants could easily access on their own computers while remaining anonymous (Surveymonkey.com, LLC). To discourage dropout, participants were allowed to skip certain questions, which the researcher identified as sensitive in nature or difficult to answer. Because of this option, each question has a different number of respondents. Some participants started the survey but did not complete it from start to finish; it is not known whether this was due to the nature of the survey. It should be noted that participants were allowed to leave the survey and complete it later, which a few chose to do.
Based on the initial hypotheses, three indexes were developed: school curriculum, practices, and policies. According to Howard (2004, p. 175), “the principal steps in constructing an index include selecting possible items, examining their empirical relationships, scoring the index, and validating it.” See Appendix B for the items and scoring protocols for each index. Each item was weighted, coded, and scored to create an index. The curriculum index used items 9e, 11 (weighted .5 for responding “yes” to informal curriculum and 1 for formal), 12 (weighted the same as item 11), and 14a; in this index, a formal curriculum was given more weight in items 11 and 12 than a less structured, informal curriculum. The policies index used items 6, 7, and 8, where “yes” responses were coded as 1 and then summed. The practices index used items 10, 13 (“very available” and “available” coded as 1; “neutral,” “unavailable,” and “very unavailable” coded as 0), 21 (“strongly agree” and “agree” coded as 1; “neither agree nor disagree,” “disagree,” and “strongly disagree” coded as 0), 24, 25a, 25b, 25c, and 25d. To examine school location, two dummy variables were created to compare urban schools to rural and suburban ones (called rural and suburban, respectively). After the items were coded, they were summed within each index.
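The coding and summing described above can be sketched in code. This is an illustrative sketch only: the item keys (e.g., "q6", "q25a") and response labels are hypothetical stand-ins for the actual items and scoring protocols in Appendix B.

```python
# Hypothetical sketch of the index scoring described above.
# Item keys and response strings are assumptions, not the actual survey data.

YES_NO = {"yes": 1, "no": 0}
AVAILABILITY = {"very available": 1, "available": 1,
                "neutral": 0, "unavailable": 0, "very unavailable": 0}
AGREEMENT = {"strongly agree": 1, "agree": 1,
             "neither agree nor disagree": 0, "disagree": 0,
             "strongly disagree": 0}

def curriculum_weight(response):
    """Items 11 and 12: formal curriculum weighted 1, informal .5."""
    return {"formal": 1.0, "informal": 0.5, "none": 0.0}[response]

def score_policies(responses):
    """Items 6, 7, and 8: 'yes' coded as 1 and summed."""
    return sum(YES_NO[responses[item]] for item in ("q6", "q7", "q8"))

def score_curriculum(responses):
    """Items 9e, 11, 12, and 14a, with formal/informal weighting on 11 and 12."""
    return (YES_NO[responses["q9e"]]
            + curriculum_weight(responses["q11"])
            + curriculum_weight(responses["q12"])
            + YES_NO[responses["q14a"]])

def score_practices(responses):
    """Items 10, 13, 21, 24, and 25a-25d with the recodings described above."""
    return (YES_NO[responses["q10"]]
            + AVAILABILITY[responses["q13"]]
            + AGREEMENT[responses["q21"]]
            + YES_NO[responses["q24"]]
            + sum(YES_NO[responses[k]] for k in ("q25a", "q25b", "q25c", "q25d")))

def location_dummies(location):
    """Dummy-code school location, with urban as the reference category."""
    return {"rural": int(location == "rural"),
            "suburban": int(location == "suburban")}
```

For example, a respondent answering "yes" to items 6 and 8 but "no" to item 7 would receive a policies index score of 2 under this coding.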
A pilot test with five school psychologists was conducted to determine the time needed to complete the survey, identify sources of difficulty, clarify hard-to-follow directions, eliminate poorly worded questions, and obtain feedback on format and content. Several experts in education and survey methodology also reviewed the questionnaire. The results of the pilot study led to reordering some of the questions and minor rewording. The purpose of pilot testing and analysis of its results was to increase reliability and validity.
Procedure
School psychologists were sent a letter via email (see Appendix C) explaining the purpose of the survey, encouraging them to participate, describing who was being surveyed, providing instructions on how to complete the survey, and emphasizing confidentiality. The letter encouraged participants to take the survey seriously by stressing the importance of accurate responses for research purposes, which is one way to improve the reliability and validity of measured variables (Stangor, 2004). The email also provided a link to surveymonkey.com, where participants were able to take the survey. School psychologists were asked to complete the survey within two weeks. After two weeks, a reminder email (see Appendix D) was sent thanking those who had completed the survey and asking those who had not to do so within two weeks. As compensation for completing the survey, participants were asked whether they wanted to receive the results of the study and/or materials on how to incorporate information about LGBT-headed families into the curriculum. Participants who completed the survey were also entered in a drawing to win one of four $50 Amazon.com gift certificates, which were issued a month after the original email was sent.
Email surveys have several advantages, including cost savings, time savings, and an increased willingness of subjects to respond (Bachmann, Elfrink, & Vazzana, 1996). Results from a study by Coderre, Mathieu, and St-Laurent (2004) indicated that the quality of information gathered through email surveys is similar to that gathered through mail or telephone surveys; for example, the average number of items obtained from open-ended questions in the email survey did not differ from the number obtained in mail and telephone surveys. Schaefer and Dillman (1998) found that electronic mail surveys were returned more quickly and with a slightly lower item non-response rate than paper surveys. One disadvantage of email surveys is the potentially high percentage of non-deliverable messages (Bachmann et al., 1996). In the current study, 98 of the 911 emails on the NYASP listserv were undeliverable, so 813 individuals received the email.
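The recruitment figures reported in this chapter fit together arithmetically, as the following short check illustrates (all numbers are taken directly from the text above):

```python
# Check of the recruitment arithmetic reported in this chapter.
emails_sent = 911            # addresses on the NYASP listserv
undeliverable = 98
delivered = emails_sent - undeliverable    # individuals who received the email
completed = 116                            # surveys completed start to finish
response_rate = completed / delivered

print(delivered)                       # 813
print(round(response_rate * 100, 1))   # 14.3 (percent)
```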