Using online questionnaires enables the researcher to collect large volumes of data quickly and at low cost (Fleming and Bowden, 2009; Couper et al., 2007). Harris (1997), for example, reports that most completed online surveys are returned within 48-72 hours, making turnaround very fast compared with onsite methods. Data can also be analysed continuously and imported directly into statistical tools and databases, increasing the speed and accuracy of analysis. Online questionnaires are also usually easier and faster to update during the pilot phase, and data can be collected continuously, independent of the time of day or day of the week. It must, however, be noted that the time taken to prepare an online questionnaire can be substantial and may offset some of the time savings noted above. Moreover, a large volume of responses does not guarantee good-quality responses.
Costs associated with online questionnaires can be substantially lower than those associated with onsite surveys. Paperwork, telephone, postage and printing costs can be cut. Travel costs may be reduced, and time may be saved by not having to travel to fieldwork sites. No costs are incurred in organising or hiring an interview venue. Software used to conduct online questionnaires is now often free, and savings may also be made on the costs of importing data for analysis. It must be remembered, however, that indirect costs can be passed on to the participants, and this raises ethical issues. For example, respondents usually bear the cost of internet connection time. Additionally, financial benefits only accrue to researchers with institutional support in terms of computer equipment, software, literacy training, internet connection time and technical support.
It is generally agreed that online questionnaires can provide a superior questionnaire interface compared to onsite surveys, as it is possible to make them more user friendly and attractive, thus encouraging higher response rates. Each questionnaire can be tailored to individual respondents, with different questions being offered to different individuals. Questions can also be ordered randomly, and a dynamic interface can be provided with pop-up instructions and drop-down boxes. Skip patterns may be built in for ease of navigation, and there is also scope for personalisation of the research experience through, for example, providing feedback or results (Joinson and Reips, 2007). Online questionnaires can also be included on a dedicated website, which can be used as a platform to provide more information about the project, the researchers and the affiliated institution. Online questionnaires also support multilingual formats and the pre-population of data about respondents. They can include prompts if the respondent skips a question, and can incorporate audiovisual stimuli. These features provide an inherently flexible design strategy, which Zhang (1999) suggests may increase a respondent's motivation to complete the questionnaire. The use of online questionnaires can also give the researcher the potential to track how respondents interact with the questionnaire. It is possible to analyse the requests made to the server hosting the questionnaire to measure the number of people opening a questionnaire, viewing particular pages, and submitting responses or leaving the questionnaire without submitting. This allows problems affecting response rates in particular sections of a questionnaire to be identified and dealt with.
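The server-log analysis described above can be sketched in a few lines. The following is a minimal, hypothetical illustration only: the log format, session identifiers and page names are assumptions, not the output of any particular survey package. It counts how many distinct sessions reached each page of a questionnaire, so that a sharp fall between pages reveals where respondents drop out.

```python
# Hypothetical sketch of drop-out analysis from simplified server-log
# records. Each record is (session_id, event), where event is a page
# view or a final 'submit' -- an assumed format for illustration.
log = [
    ("s1", "page1"), ("s1", "page2"), ("s1", "page3"), ("s1", "submit"),
    ("s2", "page1"), ("s2", "page2"),   # abandoned after page 2
    ("s3", "page1"),                    # abandoned after page 1
]

def funnel(records):
    """Count distinct sessions reaching each page (or submitting)."""
    reached = {}
    for session, event in records:
        reached.setdefault(event, set()).add(session)
    return {event: len(sessions) for event, sessions in reached.items()}

counts = funnel(log)
# Comparing counts page by page shows where response rates fall off.
```

In practice the same counts would be derived from real web-server access logs or from paradata recorded by the survey software itself.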
Responses from online questionnaires can also be inserted automatically into spreadsheets, databases or statistical packages, such as Microsoft Access and SPSS. This not only saves time and costs in the analysis phase but also automates data processing, reducing human error in data entry and coding. Data can be validated automatically: if a value is entered in an incorrect format, the web-based program can return an error message asking the respondent to correct the entry and resubmit the questionnaire. Data entry errors are therefore often low; there are no problems with interpreting handwriting, for example.
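The automatic validation described above can be illustrated with a short sketch. The field names and format rules here are assumptions chosen for illustration, not part of any specific survey tool: the point is simply that the server can reject a malformed submission with a message rather than storing bad data.

```python
import re

# Illustrative server-side validation; field names ('age', 'email')
# and the rules applied to them are assumptions for this sketch.
def validate_response(answers):
    """Return a list of error messages; an empty list means the
    submitted data are valid and can be stored."""
    errors = []
    age = answers.get("age", "")
    if not age.isdigit() or not 0 < int(age) < 120:
        errors.append("Please enter your age as a whole number.")
    email = answers.get("email", "")
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", email):
        errors.append("Please enter a valid email address.")
    return errors

# A web front end would redisplay the form with these messages and
# ask the respondent to correct the entries and resubmit.
```

Because every response passes through checks like these before it reaches the dataset, the manual cleaning stage of an onsite survey is largely removed.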
Online questionnaires can be useful in providing direct access to research populations without the need for 'cultural gatekeepers' who might restrict access to such groups. They also enable greater potential access to small, specific population sub-groups, such as people with specific illnesses, particular family structures or particular ethnicities, as the potential population one can draw on is generally larger than that of most onsite surveys. Finally, online questionnaires can be useful for contacting socially and physically isolated groups.
The anonymity provided by online questionnaires can also be helpful for some topics. Harris (1997), for example, suggests that interviewer bias is reduced or eliminated in online surveys. Pealer et al. (2001) also report that respondents are more likely to answer socially threatening questions in online questionnaires than in onsite surveys or telephone interviews, where interviewer effects and privacy issues may affect reliability (Braunsberger et al., 2007). During self-administered online questionnaires, the tangible presence of the researcher is removed, so markers of bodily presence (age, gender, ethnicity, hairstyle, clothes, accent) become invisible. It has been claimed that this can make online research a 'great equaliser', with the researcher having less control over the research process and potentially becoming a 'participant researcher'. Others have argued that this is a utopian vision: while the 'lived body' is invisible during an online questionnaire, pre-interpreted meanings and unstated assumptions remain clearly 'visible' in the creation of online questions, because we do not leave the body, and all its material inequalities, behind when we enter cyberspace (see Sweet 2001). Additionally, the 'equaliser argument' glosses over the structural power hierarchies that enable researchers to set the agenda, ask the questions and benefit from the results of the survey process.
As online questionnaires are quick to complete, and can be completed at a time and place convenient to the respondent, they are often more popular than onsite surveys. Madge and O'Connor (2002), for example, used online questionnaires to research mothers of newborn babies. They concluded that online methods were particularly suitable for contacting this population group because onsite surveying was not feasible owing to the mothers' physical and mental exhaustion after childbirth and the constant demands of caring for a new baby. In this research project, the use of online questionnaires enabled a 'community' (women with newborns or young children) notoriously difficult to reach, and hence habitually left out of research, to be contacted.
Perhaps the most questionable, and certainly the most commonly debated, aspect of online questionnaires is sample bias. There are enduring social and spatial divides in access to and use of the internet, which can introduce sample bias into any online research. The researcher also has less control over the sample population, with no way of discerning whether there are several respondents at one computer address, or whether one respondent is completing a questionnaire from a variety of computers. Because of the complexity of the debate, full details are discussed in the 'Sampling' section of this module.
Some researchers (Sax et al. 2003) have found online surveys to be subject to some unreliability (or 'measurement error') because responses to the same question vary depending on whether the questionnaire is administered online or onsite. Others (Carini et al. 2003) note that this measurement error is particularly large when technology-related questions are included in a questionnaire, because respondents who complete online surveys are usually more technologically competent than those completing onsite surveys.
Non-response bias is the bias introduced when the respondents who answer an online questionnaire have very different attitudes or demographic characteristics from those who do not respond. This is a particular concern for online questionnaires because some social groups are underrepresented among internet users, including people of limited financial resources, members of some ethnic groups, older people and those with lower educational levels (Umbach 2004). Non-response bias is also increased when different levels of technical ability are present among the respondents, and it becomes a particular problem when response rates are low. This may be related to anxieties about getting viruses or becoming a victim of identity theft. Bosnjak et al. (2001) have identified seven patterns in a typology of non-response:
- Complete responders: those who view and answer all questions.
- Answering drop-outs: those who provide answers to the questions displayed, but quit prior to completing the survey.
- Item nonresponders: those who view the whole questionnaire, but only answer some of the questions.
- Item nonresponding drop-outs: those who view some of the questions, answer some but not all of those viewed, and also quit prior to the end of the survey. Bosnjak et al. describe this as a 'more accurate depiction of actual events in web surveys than the relatively basic categorization of complete participation, unit nonresponse, or item non-response.'
- Lurking drop-outs: those who view some of the questions without answering, but also quit the survey prior to reaching the end, thus sharing some characteristics with 'answering drop-outs' and 'lurkers'.
- Unit nonresponders: those who do not participate in the survey. There are two possible variations: they may be 'technically hindered' or may 'purposefully withdraw after the welcome screen is displayed, but prior to viewing any questions'.
- Lurkers: those who view all of the questions in the survey, but do not answer any of them.
Source: Bosnjak, M., Tuten, T. L. and Bandilla, W. (2001) Participation in Web Surveys - A Typology, ZUMA Nachrichten 48, 7-17.
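The typology above can be expressed as a simple classification rule. The following sketch is a plain-Python paraphrase of the seven descriptions, not part of Bosnjak et al.'s paper; the function name and parameters are assumptions for illustration. Given how many questions a respondent viewed, how many they answered, and the questionnaire length, it assigns one of the seven patterns.

```python
# Illustrative classification based on Bosnjak et al.'s typology;
# the rules paraphrase the seven descriptions in the list above.
def classify(viewed, answered, total):
    """Assign one respondent to a non-response pattern, given the
    number of questions viewed, the number answered, and the total
    number of questions in the survey."""
    if viewed == 0:
        return "unit nonresponder"           # never saw any questions
    saw_all = viewed == total
    if answered == 0:
        if saw_all:
            return "lurker"                  # viewed all, answered none
        return "lurking drop-out"            # viewed some, answered none
    if answered == viewed == total:
        return "complete responder"          # viewed and answered all
    if saw_all:
        return "item nonresponder"           # viewed all, answered some
    if answered == viewed:
        return "answering drop-out"          # answered all viewed, quit early
    return "item nonresponding drop-out"     # skipped some and quit early
```

Running each respondent's paradata through a rule like this yields the distribution of non-response patterns for a survey, which helps diagnose whether low response rates stem from drop-out, lurking or outright non-participation.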
Online surveys may have to be shorter than those conducted onsite. Response rates drop off after 10-15 questions and are directly and negatively correlated with questionnaire length (Harris 1997). It is also reported that online surveys have lower overall response rates than onsite surveys, with Witmer et al. (1999) suggesting that response rates of 10% or lower are common. Additionally, it has been suggested that drop-out from online questionnaires is much more likely than from onsite questionnaires. This may be because respondents usually cannot ask the researcher individual questions about completing the questionnaire, which can increase drop-out rates. Finally, online questionnaires can be easily ignored and deleted at the touch of a button, so obtaining a reasonable response rate can be challenging.
Protecting respondent privacy and confidentiality is a significant ethical issue. Spamming can be considered an invasion of privacy (Umbach 2004). Researchers must be very careful not to unwittingly collect information without respondent permission. Data security is also important to protect the anonymity and confidentiality of the respondent. Because of the complexity of the debate, full details of these ethical issues are discussed in the 'Online research ethics' module.