How to Do a Survey

This is an evolving, step-by-step list of what you need to think about and do if you are going to run a well-thought-out survey on your campus or in your community. It is the fullest list of steps we can think of at present, really an 'as complete as possible' brain dump of everything you may need to consider. We will also publish a condensed, five- or six-step procedure and suggest more ways to outsource much of the work, with the goal of making this as easy as possible for anyone who wants to run a survey with minimal time expenditure.

  1. Determine the value/uses of a survey on your campus
    • to identify potential contributors
    • to identify project collaborators, e.g., faculty, staff, or students
    • to provide advocacy material
      • at the university level, e.g., for the Provost
      • at the department level
      • within a unit (e.g., the Library)
    • to support an ongoing initiative or project
      • what are the project's needs
      • what information would be useful
    • for research purposes
      • what type of research
      • what knowledge contribution is proposed

         

  2. Decide what kind of survey you want to do
    • OCW/OER/OA/OTextbook/OS/...
    • inclusive technology use and practices
    • some combination of these
    • other topics, e.g., copyright knowledge, faculty concerns about opening their material, incentives for opening material or making it inclusive, etc.

       

  3. Investigate local survey support resources
    • find out the level of services offered and the cost
    • find previous clients and vet the services
    • determine what you want them to do, what you want to do yourself, and what you would find difficult, e.g.:
      • creating the sample
      • mailing to a large number of recipients
      • sending secure email, encrypted invitations (if desired or required)
      • mailing to undisclosed recipients (keeping the list confidential)
      • tracking responses and doing follow-up mailings
      • determining/managing respondent incentives, e.g., offering an iPad in a raffle
      • creating the survey (questions, question order, response types, ...)
      • managing the survey site
        • providing encrypted access (SSL)
      • managing the data
        • meeting security requirements such as password-protected data or data encryption
      • analyzing results
      • providing synoptic reports
      • interpreting results

         

  4. Develop survey questions
    • choose from existing questions
    • see which worked best in other surveys
    • localize them for your campus
    • create new questions where needed
    • do reliability checks (see the sketch after this list)
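
    One common reliability check is Cronbach's alpha, computed over pilot responses to a group of related items. Below is a minimal sketch in Python; the column names and pilot data are entirely hypothetical, and your local survey support office may run this kind of check for you.

      import pandas as pd

      def cronbach_alpha(items: pd.DataFrame) -> float:
          """Cronbach's alpha for a set of related, numerically coded survey items."""
          items = items.dropna()               # complete cases only
          k = items.shape[1]                   # number of items in the scale
          item_vars = items.var(axis=0)        # per-item variance
          total_var = items.sum(axis=1).var()  # variance of the summed score
          return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

      # Hypothetical pilot data: three OER-familiarity items on a 1-5 scale.
      pilot = pd.DataFrame({
          "oer_familiar_1": [4, 5, 3, 4, 2, 5],
          "oer_familiar_2": [4, 4, 3, 5, 2, 5],
          "oer_familiar_3": [3, 5, 2, 4, 1, 4],
      })
      print(f"Cronbach's alpha: {cronbach_alpha(pilot):.2f}")

    Values around 0.7 or higher are the conventional, if rough, threshold for treating a group of items as a usable scale.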

         

  5. Decide on question order
    • ask familiarity questions first
    • group questions by topic, e.g., OCW vs. OA
    • put demographic questions last

       

  6. Determine local Research Ethics or Human Subjects Research requirements
    • develop the RE/HSR application
    • determine confidentiality requirements
    • fulfill privacy and data security requirements
    • develop consent procedures for online methods
    • determine what data can be derived from an email address
      • what are the local protocols
      • what is available from the list source
      • what can be discovered while confidentiality is maintained
    • determine how the resulting data can be sufficiently anonymized to release as part of an open data effort (see the sketch after this list)
      • what are the local requirements
      • they often differ for students vs. faculty/staff
      • what demographic data must be stripped and what can be retained
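
    The exact anonymization requirements come from your ethics board, but the mechanics usually amount to dropping direct identifiers, replacing any tracking key with a one-way hash, and coarsening small categories. A minimal sketch in Python, with entirely hypothetical column names and thresholds:

      import hashlib
      import pandas as pd

      def anonymize(responses: pd.DataFrame, salt: str) -> pd.DataFrame:
          """Strip direct identifiers before sharing data (sketch only).

          Assumes hypothetical columns 'email', 'name', and 'department';
          real requirements come from your ethics board, not this function.
          """
          out = responses.copy()

          # Replace the email with a salted one-way hash so rows remain linkable
          # across releases without exposing the address itself.
          out["respondent_id"] = out["email"].str.lower().apply(
              lambda e: hashlib.sha256((salt + e).encode()).hexdigest()[:12]
          )
          out = out.drop(columns=["email", "name"])

          # Coarsen categories small enough to re-identify someone, e.g., lump
          # departments with fewer than five respondents into 'Other'.
          counts = out["department"].value_counts()
          small = counts[counts < 5].index
          out.loc[out["department"].isin(small), "department"] = "Other"

          return out

    Note that whether a hashed email still counts as personal data varies by policy and jurisdiction, so check locally before releasing anything.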
         
  7. Identify the list of potential respondents/sampling frame
    • get it from the university, a department, or the local survey organization
    • or develop it from public sources
       
  8. Determine the data collection period
    • what conflicts exist with the school calendar
    • whether to run it at the beginning, middle, or end of the quarter/semester/year
    • what local practices have been
    • how long to leave the site open
    • how many reminders to send, and at what interval
       
  9. Draw the sample or do a census (a minimal sampling sketch follows)
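
    If you are sampling rather than inviting everyone, a simple random draw from the frame is often enough. A minimal sketch in Python/pandas; the file names, column names, and sample size are hypothetical:

      import pandas as pd

      # Hypothetical sampling frame: one row per potential respondent.
      frame = pd.read_csv("sampling_frame.csv")   # e.g., columns: email, department, role

      # Simple random sample of 400 people; the fixed seed makes the draw
      # reproducible for your records and your ethics application.
      sample = frame.sample(n=400, random_state=2024)
      sample.to_csv("invitation_list.csv", index=False)

      # For a census, skip the draw and use the whole frame as the invitation list.

    If you need proportional representation by department, a stratified draw such as frame.groupby("department").sample(frac=0.2, random_state=2024) is the usual next step.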
     
  10. Develop the invitation email
    • link to the survey site
    • include the consent method
    • include a link for more information
    • identify a person who can answer questions
       
  11. Bring the survey site up
    • do test runs of the site
    • test the questions with a small sample, looking at
      • flow
      • reliability
      • user concerns/questions encountered
      • user drop-out points

         

  12. Send out the email invitations
    • send follow-up emails
    • 2-4 reminders
      • to those who have not completed, if you are tracking completions (see the sketch after this list)
      • to everyone, if you are not tracking completions
      • include an apology to those who have already completed
      • add additional incentives if the response rate is not sufficient
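
    If you are tracking completions, building the reminder list is just a set difference between the invitation list and the completions the survey site reports. A minimal sketch with hypothetical file and column names:

      import pandas as pd

      invited = pd.read_csv("invitation_list.csv")   # everyone who received an invitation
      completed = pd.read_csv("completions.csv")     # respondent emails reported by the survey site

      # Remind only those who have not yet finished the survey.
      still_open = invited[~invited["email"].isin(completed["email"])]
      still_open.to_csv("reminder_list.csv", index=False)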
         
  13. Track the response data for anomalies (a small sketch of basic checks follows)
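
    Typical anomalies worth watching for are duplicate submissions and implausibly fast completions. A minimal sketch, assuming a hypothetical export from the survey site with a respondent ID and start/end timestamps:

      import pandas as pd

      responses = pd.read_csv("responses_export.csv")

      # Duplicate submissions under the same respondent ID.
      dupes = responses[responses.duplicated(subset="respondent_id", keep=False)]

      # Implausibly fast completions (here: under 60 seconds) that may be low quality.
      duration = (
          pd.to_datetime(responses["end_time"]) - pd.to_datetime(responses["start_time"])
      ).dt.total_seconds()
      speeders = responses[duration < 60]

      print(f"{len(dupes)} possible duplicates, {len(speeders)} very fast completions")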

     

  14. End the data collection period

     

  15. Determine and announce the incentive raffle winners
    • choose randomly among respondents (see the sketch after this list)
    • develop PR around the gift award
      • to encourage future survey respondents
      • to increase exposure of the project, if desired at this stage
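
    The draw itself is easy to make auditable. A minimal sketch, assuming a hypothetical CSV of respondents who opted into the raffle and two prizes:

      import random
      import pandas as pd

      entrants = pd.read_csv("raffle_entrants.csv")   # respondents who opted into the raffle

      # Recording the seed lets you reproduce the draw if anyone asks how winners were chosen.
      rng = random.Random(20240515)
      winners = rng.sample(list(entrants["email"]), k=2)
      print(winners)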

         

  16. Clean the data and do an initial data description/analysis (see the sketch after this list)
    • frequencies
    • crosstabs for important demographic groups/departments
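
    Frequencies and crosstabs are one-liners in pandas once the data are clean. A minimal sketch with hypothetical file and column names:

      import pandas as pd

      data = pd.read_csv("clean_responses.csv")

      # Frequencies for a single question, as counts and as percentages.
      print(data["oer_awareness"].value_counts())
      print(data["oer_awareness"].value_counts(normalize=True).round(2))

      # Crosstab of the same question against a key demographic, e.g., department.
      print(pd.crosstab(data["department"], data["oer_awareness"], margins=True))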

       

  17. Write up a synoptic report/slide set for internal presentations
     
  18. Determine what further data analysis is desired
  19. If releasing the data as open data, prepare it for release (see the anonymization sketch under step 6)
  20. Write up the results for publication or wider distribution