Institutional Research & Market Analytics
 
   

Survey Resources

 
    Welcome to the DePaul University Survey Resources page, the best place to find information on surveys at DePaul. From here, you can:
    • Learn about the surveys that are being administered to DePaul students, faculty, staff and alumni
    • Learn about using Qualtrics - our online survey administration tool
    • Find results from previously administered surveys
    • Request assistance for conducting your own survey

    Conducting a Survey

    Initial Considerations

    Conducting a survey involves more than simply asking students a set of questions. There are many things anyone interested in surveying DePaul's students, faculty, staff or alumni should consider, including:
    1. Relevance: Does the survey provide useful information for DePaul's planning purposes? Does it provide useful feedback to those providing services to students? Does it provide the University with useful information on the experience of DePaul's students?
    2. Content and Design: Is the survey designed well? Is the content appropriate? Does it follow sound survey methods and practices? Is it of appropriate length? Are the questions easily understood and interpretable?
    3. Population and Sampling Methodology: What is the target population? Will the entire population be surveyed, or a sample? If the latter, what is the sampling methodology and is it sound? If the former, is a sample an option that should be considered to lessen the burden on students?
    4. Timing: When will the survey be conducted? Does it overlap with other student surveys? Is it conducted at a time during the academic year when students are likely to respond?
    5. Dissemination and Use of Information Collected: Who will have access to the information collected and how will they use it? Will it be presented to the President, Provost, Vice-Presidents, Deans, Directors, and/or other line managers who might be better informed in making decisions related to students as a result?
    6. Protection of the data and information: How will the student responses be secured to comply with FERPA and other policies?
    7. Overall Impact: What will be the impact of the study? Will the study negatively impact the University? Does the survey overburden students? Does the survey divert important University resources away from other more important projects?
    Use the links on this page to learn more about survey design (writing questions, methodology, sampling, and disseminating your results). If you have any questions or need assistance with any of these areas, please contact Joe Filkins.

    Survey Modes

    There are many ways a survey can be administered: paper-and-pencil surveys either administered individually (distributed through the US Postal service or in person) or in a group setting; surveys administered over the phone or online; and personal interviews. How the survey is to be administered is a function of many factors including:
    • Time -- When are the data needed?
    • Sample -- Who is being surveyed and how accessible are they?
    • Resources -- Both in terms of human resources and facilities
    • Cost -- Mail surveys and personal interviews tend to be more expensive
    When data are needed quickly, snap polls are easy to conduct online or over the phone, and they require little in the way of data entry and analysis. Larger studies that involve a number of research questions are poorly suited to the telephone; in such instances, a mail survey or an online survey might be the way to go. Your choice of survey modality will have profound effects on a number of things related to your project, including:
    • How you sample
    • What form your questions will take
    • What your response rate will be
    • How representative of the population your sample will end up being

    Writing Questions

    In surveys, the questions you ask are typically aimed at measuring some underlying construct or larger issue. To understand student satisfaction with DePaul, we have to ask more than just "Are you satisfied with your experiences at DePaul?" Good questions are reliable (provide consistent responses) and valid (correspond to what is intended to be measured).

    When writing questions for a survey, one must first consider how the survey is to be administered. Questions asked in one-on-one interviews will differ from those asked online! When writing questions, one needs to consider:

    • Open-ended versus closed-ended questions -- Open-ended means free form; the respondent replies as he or she sees fit. Closed-ended means a list of response options is provided to the respondent - think in terms of an agreement scale where respondents must choose a response from 1 (strongly agree) to 7 (strongly disagree). Paper-and-pencil (or online) surveys will consist mostly of closed-ended questions for ease of data coding and analysis. One or two open-ended questions are acceptable to give respondents the opportunity to explain their responses or to provide insight that could not be captured in a closed format.
    • Forced versus multiple choice questions -- Most scaled survey items are examples of forced-choice questions - the respondent is forced to choose one response, agree or disagree, satisfied or not satisfied - and only one response is accepted. Sometimes, though, you may ask someone to "Check all that apply" to a list of items. In this case, you have a multiple choice question.
    • Four-, five-, or seven-point scales -- Should your scale have a midpoint? How many response options should there be? There are no absolute answers to these questions. Some prefer the use of a four-point scale (i.e., Strongly Agree, Agree, Disagree, Strongly Disagree) to force the respondent to make a choice, rather than to give them the "neutral" option that a five-point scale would include. While some have found a decrease in socially desirable responses using the four-point format, others have found an increase in the reliability and validity of the questions when using a five- or seven-point format.

    Sampling

    While it would be nice to be able to survey every student at the university, that simply is not feasible because of cost and logistical concerns. Plus, we do not want to overburden our students! Thus, we draw samples, or smaller, representative subsets, of the population to survey. How representative a sample is of your population is a function of many factors, including how the sample was drawn, the sample size and the response rate. There are many ways that we can create a sample of students. Some of the more common ones include:
    • Simple Random Sampling -- This is essentially drawing names from a hat. Members of the population are selected one at a time, without replacement - meaning once selected, an individual has no further chance of being selected.
    • Systematic Sampling -- A variation of simple random sampling which is mechanically easier to create. Say there are 8,500 members in your population and you want a sample of 100. Dividing the former into the latter shows that you require 1 of every 85 persons in the population to be included in the sample. A systematic sample would select a random starting point from the population list and then include every 85th individual.
    • Stratified Sampling -- When you know some of the characteristics of the population before sampling, you can structure the sampling process to produce a sample that is more likely to reflect the total population than would be produced by simple random sampling. This process is called stratification. For example, say you wanted to compare responses to a survey by the class level of a student. If the population was made up of equal percentages of freshmen, sophomores, juniors and seniors, then a simple random sample would generate roughly equal numbers of each class. However, if there is unequal representation across class levels within the population, sampling within the class levels (or strata) will produce groups of roughly equal size and make comparisons easier analytically.
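    The three sampling approaches above can be sketched in a few lines of Python. This is a minimal illustration only: the population of 8,500 students and the class-level labels are hypothetical stand-ins, and a real sample would be drawn from actual enrollment records.

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

# Hypothetical population: 8,500 student IDs, each with a class level
population = [(i, random.choice(["FR", "SO", "JR", "SR"])) for i in range(8500)]

# Simple random sampling: 100 students drawn without replacement
srs = random.sample(population, 100)

# Systematic sampling: every 85th student from a random starting point
interval = len(population) // 100          # 8,500 / 100 = 85
start = random.randrange(interval)
systematic = population[start::interval][:100]

# Stratified sampling: 25 students drawn within each class level,
# giving equal-sized groups even if the strata differ in size
stratified = []
for level in ["FR", "SO", "JR", "SR"]:
    stratum = [s for s in population if s[1] == level]
    stratified.extend(random.sample(stratum, 25))

print(len(srs), len(systematic), len(stratified))  # 100 100 100
```

    Note the design difference: simple random sampling leaves subgroup sizes to chance, while the stratified draw guarantees 25 respondents per class level, which makes cross-group comparisons easier analytically.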
    One of the most common questions asked is how big the sample should be. Really, this boils down to how you plan to analyze the data and what subgroups you will want to consider, along with an estimate of what fraction of the population falls into those subgroups. Most sample size decisions are concentrated on the minimum sample sizes that can be tolerated for the smallest subgroups of importance.
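    For a single overall estimate, before subgroup considerations, a common back-of-the-envelope calculation sizes the sample for a desired margin of error on a proportion. The sketch below uses the conventional 95% confidence z-value and the worst-case 50% proportion; these are standard textbook assumptions, not official guidance.

```python
import math

def sample_size(margin_of_error, z=1.96, p=0.5, population=None):
    """Minimum n to estimate a proportion within +/- margin_of_error.

    z is the critical value (1.96 for ~95% confidence); p = 0.5 is the
    most conservative assumed proportion. When a population size is
    given, a finite-population correction is applied.
    """
    n = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    if population is not None:
        n = n / (1 + (n - 1) / population)  # finite-population correction
    return math.ceil(n)

print(sample_size(0.05))                   # 385 for a very large population
print(sample_size(0.05, population=8500))  # 368 after the correction
```

    Keep in mind that this yields the number of completed responses needed, not invitations: with, say, a 25% response rate, roughly four times as many students would need to be contacted.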

    Deliverables

    Once the data are collected and analyzed, it is time to prepare the report on the survey. There are many factors to consider when preparing your report; for more specifics, we direct you to the monograph Effective Reporting, 2nd Edition by Liz Sanders and Joe Filkins, published by the Association for Institutional Research in 2009. Suffice it to say, when preparing for the results of your survey to be made public, there are a number of strategies you can follow to make your report or presentation more effective:
    • Make sure the data reported are sufficient, relevant, timely and consistent -- Answer the appropriate questions and provide information that relates to the questions or issues at hand. Report the data in the same way throughout. Deliver your report in sufficient time for the data to be useful where decisions are involved.
    • Know your audience -- Don't assume your audience knows the intricacies of your data and analytics, or what your statistical procedures mean.
    • Distill important findings for your readers -- To get your points across, think about using one-page briefs that summarize the key findings and refer readers to larger reports. Be concise.
    • Practice effective presentation (written and oral) skills -- Work to keep your audience engaged. Consider presenting reports in a Q&A format, and include anecdotes and quotes to liven up the report.

    Frequently Asked Questions

    What is the best time of year to survey students?
    The answer depends on both the purpose of your survey and the cycle of other university surveys. For example, if you want to ask freshmen about their experiences at DePaul, then you want to make sure they have had enough time to have experiences - waiting until spring term is better than surveying freshmen in the fall. However, there may be other university surveys, like the DePaul Student Satisfaction Survey, that are sent out in the spring. Consult the Survey Schedule on this website or contact Joe Filkins for more information.

    How do I know if my survey is too long?
    No one has ever complained of a survey being too short! You want to ask enough questions to address your topic, but not so many that you exhaust your respondents - this is what we call "survey fatigue." We recommend that a typical survey take no longer than 10 minutes to complete. Have someone not involved in your project take the survey to test its length.

    How many students should I survey?
    This boils down to how you plan to analyze the data and what subgroups you will want to consider, along with an estimate of what fraction of the population falls into those subgroups. Most sample size decisions are concentrated on the minimum sample sizes that can be tolerated for the smallest subgroups of importance.

    Should your scale have a midpoint? How many response options should there be?
    There are no absolute answers to these questions. Some prefer the use of a four-point scale (i.e., Strongly Agree, Agree, Disagree, Strongly Disagree) to force the respondent to make a choice, rather than to give them the "neutral" option that a five-point scale would include. While some have found a decrease in socially desirable responses using the four-point format, others have found an increase in the reliability and validity of the questions when using a five- or seven-point format.

    Have a question you want answered? Ask it here.

    Useful Links

    Here are several resources at the university that provide useful information for those interested in conducting surveys or reviewing what surveys have been done:

    Office for Teaching, Learning and Assessment (TLA) - provides a number of assessment reports that different programs have done, some of which utilize surveys of their students (portions password protected).

    Academic Program Review (APR) - includes the self-studies for all units that have undergone review (portions password protected).

    Information Services CRM (IS-CRM) - provides survey and online marketing services targeting DePaul faculty and staff.

    Using Qualtrics Survey Software

    Provided here is a series of documents outlining the steps required to perform some of the basic functions in Qualtrics. Note also that Information Services has put together a page with links to Qualtrics training videos and more that you can access here.

    If you have more specific questions about using Qualtrics, please contact Joe Filkins.

    1. How do I log into Qualtrics?
    2. Instrumentation
    3. Panels
    4. Distribution
    5. Data and Reporting

 

 


DePaul University


Questions or Comments?
irma@depaul.edu