Institutional Research & Market Analytics
 
   

Survey Resources

 
Welcome to the DePaul University Survey Resources page, the best place to find information on surveys at DePaul. From here, you can:
  • Learn about the surveys that are being administered to DePaul students, faculty, staff and alumni
  • Learn about using Qualtrics - our online survey administration tool
  • Find results from previously administered surveys
  • Request assistance for conducting your own survey

Conducting a Survey

Initial Considerations

Conducting a survey is more than just having students answer a bunch of questions. There are many things a person interested in surveying DePaul's students, faculty, staff or alumni should consider, including:
  1. Relevance: Does the survey provide useful information for DePaul's planning purposes? Does it provide useful feedback to those providing services to students? Does it provide the University with useful information on the experience of DePaul's students?
  2. Content and Design: Is the survey designed well? Is the content appropriate? Does it follow sound survey methods and practices? Is it of appropriate length? Are the questions easily understood and interpretable?
  3. Population and Sampling Methodology: What is the target population? Will the entire population be surveyed, or a sample? If the latter, what is the sampling methodology and is it sound? If the former, is a sample an option that should be considered to lessen the burden on students?
  4. Timing: When will the survey be conducted? Does it overlap with other student surveys? Is it conducted at a time during the academic year when students are likely to respond?
  5. Dissemination and Use of Information Collected: Who will have access to the information collected and how will they use it? Will it be presented to the President, Provost, Vice-Presidents, Deans, Directors, and/or other line managers so that they are better informed when making decisions related to students?
  6. Protection of the data and information: How will the student responses be secured to comply with FERPA and other policies?
  7. Overall Impact: What will be the impact of the study? Will the study negatively impact the University? Does the survey overburden students? Does the survey divert important University resources away from other more important projects?
Use the links on this page to learn more about survey design (writing questions, methodology, sampling, and disseminating your results). If you have any questions or need assistance with any of these areas, please contact Joe Filkins.

Survey Modes

There are many ways a survey can be administered: paper-and-pencil surveys either administered individually (distributed through the US Postal service or in person) or in a group setting; surveys administered over the phone or online; and personal interviews. How the survey is to be administered is a function of many factors including:
  • Time -- When are the data needed?
  • Sample -- Who is being surveyed and how accessible are they?
  • Resources -- Both in terms of human resources and facilities
  • Cost -- Mail surveys and personal interviews tend to be more expensive
When specific data are needed quickly, snap polls are easy to conduct online or over the phone and require little in the way of data entry and analysis. Larger studies that involve a number of research questions are better not done over the telephone; in such instances, a mail survey or an online survey might be the way to go. Your choice of survey modality will have profound effects on a number of aspects of your project, including:
  • How you sample
  • What form your questions will take
  • What your response rate will be
  • How representative of the population your sample will end up being

Writing Questions

In surveys, the questions you ask are typically aimed at measuring some underlying construct or broader issue. To understand student satisfaction with DePaul, we have to ask more than just "Are you satisfied with your experiences at DePaul?" Good questions are reliable (they provide consistent responses) and valid (they correspond to what is intended to be measured).

When writing questions for a survey, one must first consider how the survey is to be administered. Questions asked in one-on-one interviews will be different than those asked online! When writing questions, one needs to consider:

  • Open-ended versus close-ended questions -- Open-ended means free form; the respondent replies as he/she sees fit. Close-ended means a list of response options is provided to the respondent - think in terms of an agreement scale where respondents must choose a response from 1 (strongly agree) to 7 (strongly disagree). Paper-and-pencil (or online) surveys will consist mostly of close-ended questions for ease of data coding and analysis (see the coding sketch after this list). An open-ended question or two is acceptable to give respondents the opportunity to explain their responses or to provide insight that would not be possible to ascertain in a closed format.
  • Forced versus multiple-choice questions -- Most scaled survey items are examples of forced-choice questions - the respondent is forced to choose one response, agree or disagree, satisfied or not satisfied - and only one response is accepted. Sometimes, though, you may ask someone to "Check all that apply" to a list of items. In this case, you have a multiple-choice question.
  • Four-, five-, or seven-point scales -- Should your scale have a midpoint? How many response options should there be? There are no absolute answers to these questions. Some prefer the use of a four-point scale (e.g., Strongly Agree, Agree, Disagree, Strongly Disagree) to force the respondent to make a choice, rather than give them the "neutral" option that a five-point scale would include. While some have found a decrease in socially desirable responses using the four-point format, others have found an increase in the reliability and validity of the questions when using a five- or seven-point format.
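
To illustrate the data-coding point above, here is a minimal Python sketch. The item names, response labels, and answers are invented for illustration, and the Cronbach's alpha function shown is just one rough way to check the reliability mentioned earlier.

    # Hypothetical example: coding close-ended (Likert) responses numerically.
    # The items, labels, and answers below are invented for illustration.
    SCALE = {
        "Strongly Disagree": 1,
        "Disagree": 2,
        "Neutral": 3,
        "Agree": 4,
        "Strongly Agree": 5,
    }

    # Each row is one respondent; each key is one close-ended survey item.
    responses = [
        {"q1": "Agree", "q2": "Strongly Agree", "q3": "Agree"},
        {"q1": "Neutral", "q2": "Agree", "q3": "Agree"},
        {"q1": "Disagree", "q2": "Neutral", "q3": "Disagree"},
        {"q1": "Strongly Agree", "q2": "Agree", "q3": "Strongly Agree"},
    ]

    items = ["q1", "q2", "q3"]
    coded = [[SCALE[row[q]] for q in items] for row in responses]

    # Item means are straightforward once responses are numeric.
    for j, q in enumerate(items):
        scores = [row[j] for row in coded]
        print(q, "mean =", sum(scores) / len(scores))

    def cronbach_alpha(matrix):
        """One rough internal-consistency (reliability) estimate for a set of items."""
        k = len(matrix[0])
        def var(xs):
            m = sum(xs) / len(xs)
            return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
        item_vars = [var([row[j] for row in matrix]) for j in range(k)]
        totals = [sum(row) for row in matrix]
        return (k / (k - 1)) * (1 - sum(item_vars) / var(totals))

    print("Cronbach's alpha =", round(cronbach_alpha(coded), 2))

Open-ended responses, by contrast, must be read and hand-coded before any comparable analysis is possible.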

Sampling

While it would be nice to be able to survey every student at the university, that simply is not feasible because of cost and logistical concerns. Plus, we do not want to overburden our students! Thus, we draw samples, or smaller, representative subsets of the population, to survey. How representative a sample is of your population is a function of many factors, including how the sample was drawn, the sample size and the response rate. There are many ways that we can create a sample of students. Some of the more common ones, illustrated in the sketch that follows this list, include:
  • Simple Random Sampling -- This is essentially drawing names from a hat. Members of the population are selected one at a time, without replacement - meaning that, once selected, an individual cannot be selected again.
  • Systematic Sampling -- A variation of simple random sampling that is mechanically easier to carry out. Say there are 8,500 members in your population and you want a sample of 100. Dividing the former by the latter shows that you need 1 of every 85 persons in the population to be included in the sample. A systematic sample would select a random starting point from the population list and then include every 85th individual.
  • Stratified Sampling -- When you know some of the characteristics of the population before sampling, you can structure the sampling process to produce a sample that is more likely to reflect the total population than would be produced by simple random sampling. This process is called stratification. For example, say you wanted to compare responses to a survey by the class level of a student. If the population was made up of equal percentages of freshmen, sophomores, juniors and seniors, then a simple random sample would generate roughly equal numbers of each class. However, if there is unequal representation across class levels within the population, sampling within the class levels (or strata) will produce groups of roughly equal size and make comparisons easier analytically.
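
As a rough illustration of these three sampling approaches, here is a short Python sketch. The roster, class levels, and sample size of 100 are hypothetical stand-ins; an actual draw would start from the real population list.

    import random
    from collections import defaultdict

    # Hypothetical roster: (student_id, class_level). A real draw would start
    # from the actual population list (e.g., an enrolled-student file).
    population = [(i, random.choice(["FR", "SO", "JR", "SR"])) for i in range(8500)]
    target_n = 100

    # Simple random sampling: names drawn from a hat, without replacement.
    srs = random.sample(population, target_n)

    # Systematic sampling: 8,500 / 100 = 85, so take every 85th person after
    # choosing a random starting point.
    interval = len(population) // target_n          # 85
    start = random.randrange(interval)
    systematic = population[start::interval][:target_n]

    # Stratified sampling: sample within each class level (stratum) so each
    # group ends up with a known size -- here, equal-sized groups of 25.
    by_level = defaultdict(list)
    for student_id, level in population:
        by_level[level].append((student_id, level))
    per_stratum = target_n // len(by_level)
    stratified = [s for members in by_level.values()
                  for s in random.sample(members, per_stratum)]

    print(len(srs), len(systematic), len(stratified))
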
One of the most common questions asked is how big the sample should be. Really, this boils down to how you plan to analyze the data and what subgroups you will want to consider, along with an estimate of what fraction of the population falls into those subgroups. Most sample size decisions are concentrated on the minimum sample sizes that can be tolerated for the smallest subgroups of importance.
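
This page does not prescribe a formula, but a common textbook starting point is to estimate the sample size needed to measure a proportion within a desired margin of error and then apply a finite-population correction. The sketch below uses illustrative assumptions only (95% confidence, a +/- 5% margin of error, and the most conservative proportion of 0.5); note that it gives the number of completed responses needed, so the number invited must be larger once the expected response rate is factored in, and the same reasoning should be applied to the smallest subgroup you plan to compare.

    import math

    def sample_size_for_proportion(population_size, margin_of_error=0.05,
                                   z=1.96, p=0.5):
        """Textbook sample-size estimate for a proportion, with a
        finite-population correction. p = 0.5 is the most conservative choice."""
        n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2   # infinite-population estimate
        return math.ceil(n0 / (1 + (n0 - 1) / population_size))

    # Illustrative numbers only: a population of 8,500 at 95% confidence and a
    # +/- 5% margin of error works out to roughly 368 completed responses.
    print(sample_size_for_proportion(8500))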

Deliverables

Once the data are collected and analyzed, it is time to prepare the survey report. There are a multitude of factors to consider when preparing your report, and we direct you to the monograph Effective Reporting, 2nd Edition by Liz Sanders and Joe Filkins, published by the Association for Institutional Research in 2009, for more specifics. Suffice it to say, when preparing to make the results of your survey public, there are a number of strategies you can follow to make your report or presentation more effective:
  • Make sure the data reported are sufficient, relevant, timely and consistent -- Answer the appropriate questions and provide information that relates to the questions or issues at hand. Report the data in the same way throughout. Deliver your report in sufficient time for the data to be useful where decisions are involved.
  • Know your audience -- Don't assume your audience knows the intricacies of your data and analytics, or what your statistical procedures mean.
  • Distill important findings for your readers -- To get your points across, think about using one-page briefs that summarize the key findings and refer readers to larger reports. Be concise.
  • Practice effective presentation (written and oral) skills -- Always be mindful of keeping your audience engaged. Consider presenting reports in a Q&A format, and include anecdotes and quotes to liven up the report.

Frequently Asked Questions

What is the best time of year to survey students?
The answer depends on both the purpose of your survey and the cycle of other university surveys. For example, if you want to ask freshmen about their experiences at DePaul, then you want to make sure they have had enough time to have experiences - waiting until spring term is better than surveying freshmen in the fall. However, there may be other university surveys, like the DePaul Student Satisfaction Survey, that are sent out in the spring. Consult the Survey Schedule on this website or contact Joe Filkins for more information.

How do I know if my survey is too long?
No one has ever complained of a survey being too short! You want to ask enough questions to address your topic, but not so many that you exhaust your respondents - this is what we call "survey fatigue." We recommend that a typical survey take no longer than 10 minutes to complete. Have someone not involved in your project answer the survey to test the length.

How many students should I survey?
This boils down to how you plan to analyze the data and what subgroups you will want to consider, along with an estimate of what fraction of the population falls into those subgroups. Most sample size decisions are concentrated on the minimum sample sizes that can be tolerated for the smallest subgroups of importance.

Should your scale have a midpoint? How many response options should there be?
There are no absolute answers to these questions. Some prefer the use of a four-point scale (e.g., Strongly Agree, Agree, Disagree, Strongly Disagree) to force the respondent to make a choice, rather than give them the "neutral" option that a five-point scale would include. While some have found a decrease in socially desirable responses using the four-point format, others have found an increase in the reliability and validity of the questions when using a five- or seven-point format.

Have a question you want answered? Ask it here.

Useful Links

Here are several resources at the university that provide useful information for those interested in conducting surveys or reviewing what surveys have been done:

Office for Teaching, Learning and Assessment (TLA) - provides a number of assessment reports that different programs have done, some of which utilize surveys of their students (portions password protected).

Academic Program Review (APR) - includes the self-studies for all units that have undergone review (portions password protected).

Information Services CRM (IS-CRM) - provides survey and online marketing services targeting DePaul faculty and staff.

Using Qualtrics Survey Software

Provided here is a series of documents outlining the steps required to perform some of the basic functions in Qualtrics. Note also that Information Services has put together a page with links to Qualtrics training videos and more that you can access here.

If you have more specific questions about using Qualtrics, please contact Joe Filkins.

  1. How do I log into Qualtrics?
  2. Instrumentation
  3. Panels
  4. Distribution
  5. Data and Reporting
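
The guides above cover the Qualtrics web interface. If you eventually need to automate a task such as pulling the list of your surveys for reporting, Qualtrics also provides a REST API. The sketch below is a minimal, unofficial Python example under the assumption that API access is enabled for your account; the data-center ID and token are placeholders you would replace with your own values (typically found under Account Settings in Qualtrics).

    import requests

    # Placeholders -- substitute your own data-center ID and API token
    # (typically found under Account Settings in the Qualtrics web interface).
    DATA_CENTER = "yourdatacenterid"   # e.g., "ca1"
    API_TOKEN = "YOUR-API-TOKEN"

    BASE_URL = f"https://{DATA_CENTER}.qualtrics.com/API/v3"
    HEADERS = {"X-API-TOKEN": API_TOKEN}

    # List the surveys visible to this account.
    resp = requests.get(f"{BASE_URL}/surveys", headers=HEADERS, timeout=30)
    resp.raise_for_status()

    for survey in resp.json()["result"]["elements"]:
        print(survey["id"], survey["name"])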

Current Student Survey Schedule

Quarter | Survey Name | Population | Cycle | Contact

INTERNAL SURVEYS
Fall | Career Outcomes Follow-up* | Graduating students | Yearly | Coleen Dickman / IRMA
Fall | MENP Graduate Survey | All Recent MENP Graduates | Yearly | Meg Marchese / EMM
Fall | Cost of Attendance Survey | All Full-Time Undergrad and Graduate Students | Variable | Joe Filkins / IRMA
Fall | AlcoholEdu for College | New Freshmen | Unknown | Tyneka Harris / Student Affairs
Fall | Haven - Understanding Sexual Assault | New Freshmen | Unknown | Tyneka Harris / Student Affairs
Fall | OAE - Orientation Outcomes Survey | New Freshmen and Transfers | Unknown | Tyneka Harris / Student Affairs
Fall | Financial Wellness Survey | Undergraduate Students | Unknown | Tyneka Harris / Student Affairs
Fall-Winter | Continuing Student Survey | Continuing UG/Grad and Law students | Every 3 Years | Joe Filkins / IRMA
Winter | Graduating Student Survey - Grad | Grad Student Graduation Applicants | Yearly | Joe Filkins / IRMA
Winter | Graduating Student Survey - UG | UG Graduation Applicants | Yearly | Joe Filkins / IRMA
Winter | Advising Survey | Undergrads and grad/law students | Every 3 Years | Joe Filkins / IRMA
Winter | Faculty/Staff Climate Surveys | Faculty and staff | Every 3 Years | Joe Filkins / IRMA
Spring | Career Outcomes Survey (Cap and Gown) | Graduating students | Yearly | Coleen Dickman / IRMA
Spring | AAA - Athlete Survey | Student Athletes | Yearly | Tyneka Harris / Student Affairs
Spring | RE-Alcohol Survey | Resident Students | Yearly | Tyneka Harris / Student Affairs
Spring | RE-Quality of Life | Resident Students | Yearly | Tyneka Harris / Student Affairs
Summer | APR Alumni Outcomes Survey | Alumni 1, 3 and 5 years out | Yearly | Joe Filkins / IRMA
Summer | Freshmen ASQ | All Admitted Freshmen | Variable | Sue Stachler / IRMA
Summer | Graduate ASQ | All Admitted Grad Students | Variable | Sue Stachler / IRMA

EXTERNAL SURVEYS
Winter-Spring | NSSE (National Survey of Student Engagement) | New freshmen and seniors | Every 3 years | Joe Filkins / IRMA
Winter-Spring | Diverse Learning Environments | All Students | TBD | Joe Filkins / IRMA
Spring | Your First Year in College | New Freshmen | TBD | Joe Filkins / IRMA
 
* The Career Outcomes survey is conducted in two stages. Students are asked to complete the survey when they pick up their cap and gown for commencement, and then most are followed up with after six months. The most recent cap and gown survey was administered this past June to 16/17 grads. The follow-up survey will be distributed in late November to 16/17 grads. The most recent follow-up data we have is for 15/16 grads.

Reports and Questionnaires

Student Surveys

The Freshman Admitted Student Surveys assess incoming freshmen's perceptions of the college choice process and how DePaul compares to other schools in their choice set on multiple characteristics.

The Graduate Admitted Student Questionnaire assesses what incoming master's students consider the most important characteristics of a graduate program, along with their perceptions of aid and program offerings.

The Continuing Student Surveys assess student satisfaction with their overall and academic experiences, academic advising, career development and campus climate. This survey is administered to undergraduate, graduate and law students.

The Graduating Student Exit Surveys assess graduating seniors' and graduate students' perceived gains in areas related to the university's institutional or programmatic learning goals, as well as their satisfaction with their university experiences. Graduating seniors are also asked about their future employment and graduate school plans.

The National Survey of Student Engagement (NSSE) assesses the extent to which colleges and universities engage in practices and support environments that contribute to student learning. This survey is administered to first-year and senior undergraduates.

The Advising Survey assesses students' perceptions of their advising experiences in terms of needs and satisfaction with the advising process. This survey is administered to undergraduate students.

Faculty and Staff Surveys

In preparation for the last North Central Accreditation (NCA) visit, a campus climate survey for faculty and staff was created and administered in 2005. The survey includes sections on:
  • General Experiences
  • University Mission and Values
  • Diversity
  • Policies and Procedures
  • Human Resources and Professional Development
  • Collegiality
  • Leadership
  • Academic Mission
  • Faculty/Staff Support
  • Workload
  • Faculty/Staff Governance

These data are intended to support the administrative functioning of the university, provide key outcome metrics for the VISION twenty12 Strategic Plan, and inform the university of faculty and staff perceptions.

  • 2016 Faculty Climate Survey (2007-2016)
  • 2016 Staff Climate Survey (2007-2016)
  • Faculty Survey Report (2013)
  • Staff Survey Report (2013)

Alumni Surveys

Understanding what happens to our students after they leave DePaul is as important as understanding the experiences of currently enrolled students. Over the years, there have been two primary surveys of alumni:

  • Alumni Outcomes Survey -- Done in conjunction with academic program review, this survey assesses the impact of the DePaul education on the personal and professional lives of our alumni.
  • Alumni one, three, and five years out are routinely surveyed as part of the Academic Program Review process. Earlier program review cycles included surveys of alumni only from programs undergoing review. Starting in 2009, however, the sample for this survey was expanded to include ALL alumni in these timeframes so as to collect better baseline data for future program reviews.

    The current survey instrument can be found here. Note that this is an online survey and how it appears on the printed page is not how it appears to alumni as they complete it.

    Reports from that survey are available via the links below. For more information on this survey, please contact Joe Filkins.

    • Postgraduate Outcomes (2016)
    • Alumni Affinity (2009)
    • Alumni Survey (2010)

    Career Outcome information can be found directly on the Career Center Website.

DePaul Student Survey Working Group

During the 2007-08 academic year, representatives from the main areas of the university responsible for student surveys (Enrollment and Marketing Research, CRM/Information Services, Institutional Research, Career Services, and Student Affairs), as well as representatives from Academic Affairs, met to discuss the coordination of all large-scale student surveys at the university. The group decided that Institutional Research would continue to be the primary provider of consultation to university constituencies interested in gathering data on students through surveys. The purpose of such consultation includes:
  • Assisting in the survey planning process
    Consultation on survey methods and design to ensure that survey research is of the highest quality, and that this research produces information that adds value to university decision-making.
  • Coordinating the schedule of deployment for student surveys
    The creation of a survey calendar to more effectively, strategically, and efficiently support survey research on campus, preventing scheduling bottlenecks and overburdening of particular populations.
  • Sharing knowledge on existing data collected from previous and ongoing student surveys of sufficient scope
    Review of survey research projects to identify data that may have continued value for university study past the original study, with the informed consent and collaboration of the primary researcher.
We understand that the quality of the data received is directly related to the time and effort put into the survey design process, and we want to ensure that what is provided here results in a high-value product for the university. Ultimately, our goal is to increase the quality of the data collected and thereby the value of the information to the primary researcher and the university community.

Please use us as a resource for your research needs. A contact list has been created so that you can get in touch with us at any time. We have also created a list of frequently asked questions, as well as information on the survey administration process.

    For Institutional Research surveys and services:
    Dr. Joe Filkins
    Senior Research Associate
    (312) 362-6746
    jfilkins@depaul.edu
    Liz Holder
    Research Associate
    (312) 362-6719
    eholder1@depaul.edu

    For admission, recruitment and market-related surveys and services:
    Susan Stachler
    Senior Research Associate
    (312) 362-5219
    sstachle@depaul.edu

    For Student Affairs surveys and projects:
    Dr. Ellen Meents-Decaigny
    AVP for Assessment, Research and Communications
    (312) 362-7298
    emeentsd@depaul.edu
    Tyneka Harris
    Project Leader
    (312) 362-8451
    tharris6@depaul.edu

    For IS/CRM surveys and services:
    Jason Koziara
    CRM Team Lead
    (312) 362-5217
    crm@depaul.edu

     

     



    Questions or Comments?
    irma@depaul.edu