2012 National Survey of School Counselors Background
On behalf of the College Board’s National Office for School Counselor Advocacy (NOSCA), Hart Research conducted 2,890 online interviews among 806 middle school and 2,084 high school counselors from May 1 to June 18, 2012. A three-to-one ratio of high school to middle school counselors was set in advance (the actual completion ratio ended up at 2.58 to one) so that the voices of middle school counselors would be included, though not fully in proportion to their actual numbers. In addition, Hart Research interviewed 439 high school and middle school administrators (including principals, vice principals and assistant principals).
Counselors were invited to participate in the online survey by email and by postcard, and many received both forms of invitation. Counselors’ and administrators’ contact information was obtained through the list provider MDR, a Dun & Bradstreet company. Records were divided by the available contact information: counselors with email addresses were contacted via email, while those with only postal mail contact information were contacted by postcard; all administrators were contacted only by email. While it was possible for a counselor to have a postal address with no email address, every email address had a corresponding physical address. Many counselors, particularly those in the smaller states, received both forms of contact.
The survey was developed over a period of several months through collaboration among researchers and managers from the College Board, NOSCA, Civic Enterprises, and Hart Research. Six focus groups also were conducted among school counselors and school administrators in March 2012, to explore potential survey topics and to give some counselors and administrators an opportunity to express their views in their own words: two groups were convened in Charlotte, N.C., two in Dallas, Texas, and two in Chicago, Ill.
More than a dozen drafts of the survey passed between the organizations on the way to the final instrument, which attempted to cover a wide variety of topics without imposing an excessive time burden on the counselors who volunteered to complete it. Counselors were not compensated or offered any incentive for the time the survey took from their other work.
Counselor invitations were issued on a state-by-state basis, with 1,600 counselors as the original target for the national sample. Each state’s target number of responses was designed to be proportional to the size of the state’s school counselor population, as determined by the universe of counselor records. In addition to the national sample of 1,600 counselors, the eight largest states were targeted with an oversample of 200 additional counselors per state. In Virginia, Ohio, Illinois and Florida, the number of completed surveys fell short of this goal.
Wherever possible, the mail and email records were selected at random, but this survey should not be viewed as a random sample because, in a large number of states, all available counselor records were sent invitations. In the smaller states, only a fraction of the total population was sent a postal invitation. However, in several of the oversample states the postal invitations exhausted the available number of counselors. Further, all available counselors were sent email invitations, exhausting the national sample of email contacts.
The first email and postcard waves were sent out on May 1, 2012. Those who received a postcard also received one follow-up postcard, while those receiving emails could have received up to five additional emails, two of which expanded the mailing to the entire available universe. The final wave was sent on June 13, 2012, only to schools still in session.
In total, approximately 68,918 counselors received at least one postal or email invitation to take the survey, yielding a completed-interview response rate of 4.19 percent. In addition to the 2,890 completed interviews, 127 interviews were terminated because the respondent failed a screening question, and 2,896 interviews were abandoned by respondents.
Administrators also were selected at random from the national sample, with the number of invitations per state proportional to the state’s size. The first email wave was sent to 40,663 administrators on May 17, 2012, followed by a single additional wave that included the original sample selection. The survey closed with 439 completed interviews, for a total response rate of 1.08 percent. In addition, 41 interviews were terminated because the respondent failed a screening question, and 503 interviews were abandoned by the respondent or timed out before the survey was completed.
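The reported response rates follow directly from the completion and invitation counts given above. A minimal sketch in Python (the counts are taken from the text; the function name is ours):

```python
def response_rate(completed, invited):
    """Return the completed-interview response rate as a percentage."""
    return 100.0 * completed / invited

# Counselor survey: 2,890 completions out of approximately 68,918 invitations.
counselor_rate = response_rate(2890, 68918)
print(f"Counselor response rate: {counselor_rate:.2f}%")      # 4.19%

# Administrator survey: 439 completions out of 40,663 invitations.
admin_rate = response_rate(439, 40663)
print(f"Administrator response rate: {admin_rate:.2f}%")      # 1.08%
```

Note that this is the simplest definition of a response rate (completions divided by invitations); it does not adjust for undeliverable invitations or partial interviews.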
Many questions in this survey ask for ratings on a zero-to-10 scale. Whenever we present these results, we report the proportion giving a rating of “8” or higher, because such ratings indicate full or nearly full agreement; a rating of “6” or “7,” while indicating general agreement, also could signal a barrier in the form of an incomplete endorsement. There are two exceptions to this rule. The first is cases in which the counselor is rating others’ commitment, whether the administrators’ or the school district’s; here we set the bar lower, counting a “6” or higher, to assess simply whether counselors see administrators as “on their side.” The second is acceptance of various accountability measures, where a “6” or higher indicates that the measure is acceptable.
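The reporting rules above amount to simple thresholds on the zero-to-10 scale. A minimal sketch (the function names are ours, not the survey’s):

```python
def strong_agreement(rating):
    """Default reporting rule: a rating of 8 or higher counts as full
    or nearly full agreement."""
    return rating >= 8

def perceived_support(rating):
    """Exception 1: when counselors rate administrators' or the district's
    commitment, a 6 or higher counts as seeing them as 'on their side'."""
    return rating >= 6

def measure_acceptable(rating):
    """Exception 2: for accountability measures, a 6 or higher counts as
    acceptable."""
    return rating >= 6

# A rating of 7 counts under the two exceptions but not the default rule.
print(strong_agreement(7), perceived_support(7), measure_acceptable(7))
# False True True
```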