About this Resource
Exploring online research methods - Incorporating TRI-ORM

TRI-ORM - Project evaluation


Overview

The training programme was evaluated from the outset by an independent Evaluation Consultant, Dr Julia Meek. The purpose of this external evaluation was to identify and feed back to the project team how delivery of the project objectives could be improved within the lifetime of the project (formative evaluation), and whether the project outcomes had been delivered by the end of the project (summative evaluation).


Evaluation timeline

The evaluation plan is set out in the following timeline, which shows the key points at which evaluation activities took place.


The TRI-ORM Evaluation timeline (rtf, 74KB)

 


Formative evaluation

The Formative Evaluation focused on three key areas:

Evaluation of the TRI-ORM website

In November 2007 the independent evaluator undertook a heuristic evaluation of the website, focusing on navigation and ease of use, and changes were made to the site in response to the issues highlighted.


TRI-ORM heuristic evaluation report (rtf, 70KB)

 


Evaluation of the TRI-ORM workshops

The evaluation of the TRI-ORM workshops contained four phases:

  1. Pre-workshop questionnaire (online)
    This obtained background information from participants (for example, discipline area, role and experience) and their motivation for attending the workshop.
  2. Workshop questionnaire (paper-based)
    This focused on participants' immediate reactions to the workshop. Qualitative feedback was obtained on what they liked or disliked and how they planned to use their new skills.
  3. Workshop observation
    The independent evaluator attended and observed the workshops to evaluate the effectiveness of the training and obtain feedback from the participants.
  4. Post-workshop feedback (email)
    Participants were contacted six months after the workshop and asked how they had used the skills they learnt there. They were also asked to reflect on which part of the workshop was most important for long-term use and how they had used the methods in research and teaching.

 


Evaluation of the online course

A detailed evaluation of the online module was conducted during the first year. Formative data was gathered while the module ran. Firstly, the evaluator participated in the module as a learner and kept a weekly diary. Secondly, the development team created a message board to enable participants to post any problems they were experiencing. A summative evaluation was conducted at the end of the first iteration of the module using an online questionnaire and a synchronous online focus group, which enabled participants to feed back their experience of participating in the module. An evaluation report presenting the findings was produced for the module development team, who were able to use this feedback to make minor changes to the running of the module from the second year.



Findings from the formative evaluation were reported to the project team so that lessons learnt from the evaluation could be fed into preparations for the next workshops and online training courses.

 


Summative evaluation

The Summative Evaluation drew on the formative evaluation reports from the three key areas of TRI-ORM to produce three final summative evaluation reports.

Evaluation of the TRI-ORM workshops

TRI-ORM provided six planned introductory face-to-face one-day workshops, giving a basic grounding in online research methods. Workshops were held at ESRC Regional Training Centres (Qualiti at Cardiff and the Cathy Marsh Centre for Census and Survey Research (CCSR) in Manchester) and at the project's host university. All workshops were oversubscribed.

Table 1 provides details of the workshops that were conducted, including the date, location, number of participants registered and number who attended. The project team originally intended to limit each workshop to 20 participants; however, as the project progressed, demand for places was so high that the team decided to allow more than 20 to register for each event. Two additional workshops, shown in Table 1, were requested and run during the project.

Date | Location | Number registered | Number attending
28/06/07 | University of Leicester | 20 | 20
12/10/07 | Cathy Marsh Centre for Census and Survey Research, University of Manchester | 20 | 19
5/12/07* | University of Leicester | 13 | 13
13/02/08 | Qualiti, University of Cardiff | 30 | 29
1/07/08 | University of Leicester | 24 | 21
12/09/08 | Cathy Marsh Centre for Census and Survey Research, University of Manchester | 25 | 22
31/10/08 | Qualiti, University of Cardiff | 25 | 24
25/03/09* | University of Leicester | 30 | 30 plus 1 video link from Greece
Total | | | 179

* Additional workshops, requested and run during the project (SPR Programme, University of Leicester; two further events requested).

Table 1: TRI-ORM one day workshops

Table 1 illustrates that in total 179 people (135 at planned workshops and 44 at additional workshops) attended the series of one-day workshops. This exceeds the estimate of 120 people outlined in the project bid. The project team clearly met the objective of training a core group of people in basic online research methods.
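The headline figures can be verified with a short calculation (a Python sketch; attendance numbers copied from Table 1):

```python
# Attendance at the six planned workshops (Table 1)
planned = [20, 19, 29, 21, 22, 24]
# Attendance at the two additional workshops, 5/12/07 and 25/03/09
# (the March 2009 figure counts the participant who joined by video link)
additional = [13, 31]

total = sum(planned) + sum(additional)
print(sum(planned), sum(additional), total)  # 135 44 179
print(total - 120)  # attendance exceeded the project bid estimate by 59
```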

1 Workshop Evaluation

The primary aim of the evaluation was to provide formative feedback to the project team on the workshops (areas for improvement) and to identify participants' views about the success or otherwise of the workshops. A series of evaluation methods was used:

  • Participants were asked to complete an online pre-workshop questionnaire: the aim was to gather data on the experience and level of knowledge of workshop participants prior to attending. The data was provided to the workshop team, who were then aware of participants' discipline areas and levels of knowledge of online research methods. This enabled the workshop presenters to customise the workshop to fit the knowledge and experience of participants.
  • Observation: the three workshops during the first year of the project were observed by the evaluator to provide feedback to the team on how the workshop might be improved; recommendations were fed into the next workshop. By the end of the first year it was agreed that the workshop format had been perfected and observation of the later workshops was unnecessary.
  • Feedback questionnaire (paper-based): participants were asked to complete a paper-based feedback questionnaire during the workshop, filling in a section of the questionnaire at set points in the day to reflect on the session they had just experienced. Participants were also asked to make suggestions on how to improve the event, provide their overall reaction, and describe how they planned to use the knowledge gained at the workshop in the future.
  • Six-month follow-up online questionnaire: participants who consented to be contacted were sent an online questionnaire six months after the workshop. The aim was to gather data on whether they had actually used online research methods, in order to ascertain the longevity of the knowledge transfer provided by the workshop.

The following sections describe the key results from each of the evaluation methods.

1.1 Pre-workshop questionnaire

People who registered for the workshops were asked to complete an online pre-workshop questionnaire. The aim of the questionnaire was to capture profile data about the type of person attending the workshop, their current level of knowledge of online research methods and the key areas of the workshop that were of particular interest to them.

The profile data was sent by the evaluator to the project team, who were able to see a profile of the participants attending the workshop, e.g. their discipline area, their particular areas of interest in online research methods, and what they wanted to gain from attending. This information enabled the workshop presenters to adjust the workshop aims and focus to suit the people attending.

Not all participants who attended the workshops opted to complete the pre-workshop questionnaire. However, the profile data gathered indicates the groups the participants were drawn from. Table 2 provides details of the workshops and a breakdown of the roles of the people who opted to complete the pre-workshop questionnaire.

Type of participant | Leicester 2007 | Manchester 2007 | Cardiff Feb 2008 | Leicester 2008 | Manchester 2008 | Cardiff Oct 2008 | Total
Total number attending workshop | 20 | 19 | 29 | 21 | 22 | 24 | 135
Research students | 30% | 25% | 17% | 33% | 45% | 21% | 28%
Early career researchers | 5% | 37% | 14% | 28% | 22% | 33% | 23%
Mid-career researchers | 5% | - | 10% | 5% | 4% | 8% | 5%
Senior researchers | 5% | 19% | 7% | 5% | - | 8% | 7%
Trainers | - | - | 3% | 5% | - | - | 1%
Practitioners | - | 6% | - | - | - | 4% | 2%
Other* | | | | | | |

* Other roles given: Project Manager, Regional Observatory Coordinator, Voluntary sector researcher, Clerical Assistant, Policy and Research Assistant, Research Associate, Secretary, Research Staff Development Manager, Medical practitioner, "IT lecturer, IT consultant, IT support, you name it?!", Senior Lecturer in Clinical Sciences (Chiropractic), Administrator involved in course evaluation.

Table 2: Participants’ roles

Table 2 highlights that a wide range of users was trained, with a tendency towards research students and early-career researchers. This is perhaps to be expected, as these are the groups most familiar with new technology and most interested in and motivated to explore new technology-based research methodologies.

The pre-workshop questionnaire captured information on the participants' institutions and discipline areas. This illustrates that participants came from a whole spectrum of disciplines and institutions: the social sciences, medical sciences, government research organisations and university administrative sectors. Table 3 lists the institutions and organisations that participants were drawn from, together with the number who attended a workshop from each institution. A summary of the broad range of discipline areas is also presented alphabetically.

Leicester 2007: De Montfort University (7), Loughborough University (3), University of Nottingham (5), University of East Anglia, Keele University, University of Leicester (2), BACP

Manchester 2007: De Montfort University, Coventry University, University of Nottingham (2), University of Sheffield (2), Manchester Metropolitan University (2), Birmingham City University (2), University of York (2), Loughborough University, University of Central Lancashire, University of Plymouth, Intelligence East Midlands, Turning Point, British Psychological Society, Service Improvement Team, Trading Service

Cardiff Feb 2008: Cardiff University (4), City University, University of Bristol (2), Bournemouth University, University of Birmingham, Swansea University (6), Heriot-Watt University, Keele University, Liverpool John Moores University, London School of Economics, Loughborough University (3), University of Portsmouth, University of Teesside (2), York St John University, CRAC: Career Development Organisation, Higher Education Academy, NCH

Leicester 2008: London School of Economics (3), University of Bristol (2), University of Nottingham, University of London, De Montfort University, University of Teesside (2), Swansea University, University of Dundee, Lancaster University, University of Leicester (4)

Manchester 2008: University of Manchester (11), Lancaster University, Leeds Metropolitan University (4), University of Central Lancashire (2), London South Bank University, University of Leeds

Cardiff Oct 2008: Cardiff University (2), University of Bristol (3), University of Glamorgan (10), University of Sheffield, Leeds Metropolitan University, University of Leeds, University of East London, University of Salford, Roehampton University, Newman College of H.E., Bangor University, University of the West of England

Discipline areas:

Accounting and Finance, Career development, Children's Social Policy and Research, Chiropractic, Consumer research, Counselling and Psychotherapy, Customer Satisfaction and Education, Demography, Population science, Development of informatics research, Doctors in training, Education, Social Sciences, Educational informatics, Educational Research, Environmental Planning, Film, Genetics, Genetics education of health professionals (esp. nurses and midwives), Government Health, Health and social care, Health Psychology, Health Related Research and Evidence Based Practice Information Support, Health Services Research, Information Management, Law, Learning & Teaching Research, Learning Disabilities (Social Work), Learning Disabilities (genetics), Learning Technology, Media and Cultural studies, Medicine, Midwifery, Health Science, Mlearning and science education, Nursing/health care, Occupational Psychology, Pedagogy in HE, Personnel Services and Staff Development, Physiotherapy, Political science, Primary care, Psychiatry, Psychology, Psychology and Counselling, Psychology/mental health, Qualitative Research, Social Policy, Social Research, Social work research, Sociological Studies, Sociology, Sociology of Sport, Staff and Student Development, Tourism, Marketing

Table 3: Institutions and Discipline areas

The geographic spread of institutions highlights that, even though the workshops were spread across the country, participants were willing to travel some distance to attend them, illustrating the high demand for these workshops. The wide range of disciplines highlights the interdisciplinary interest in online research methods and the demand for training in online methods across academic subject areas.

The face-to-face workshops were designed to provide an introduction and a basic grounding in online methods for people new to this area of research. The pre-workshop questionnaire asked participants to reflect on their current knowledge of online research methods. Table 4 highlights that the majority of participants saw their pre-workshop knowledge of online research methods as basic.

Level of knowledge of ORM | Leicester 2007 | Manchester 2007 | Cardiff Feb 2008 | Leicester 2008 | Manchester 2008 | Cardiff Oct 2008 | Total
Basic | 12 | 13 | 19 | 13 | 11 | 16 | 84
Intermediate | 2 | 3 | 1 | 1 | 0 | 3 | 10
Advanced | 0 | 0 | 1 | 0 | 0 | 0 | 1

Table 4: Level of knowledge

The fact that the majority of participants at the workshops identified their knowledge of online research methods as basic illustrates that there was a clear need for the introductory level of training. It also indicates that the marketing and publicity for the workshops was clearly reaching its target group.
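The strength of this claim can be quantified from the Table 4 totals (a Python sketch; counts copied from the table):

```python
# Self-reported knowledge of online research methods (totals from Table 4)
knowledge = {"Basic": 84, "Intermediate": 10, "Advanced": 1}

respondents = sum(knowledge.values())  # 95 questionnaire respondents in total
basic_share = knowledge["Basic"] / respondents

# Roughly 88% of respondents rated their knowledge as basic
print(f"{basic_share:.0%}")  # 88%
```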

The pre-workshop questionnaire asked participants 'Which, if any, of the following online methods have you experience of as a researcher?'. Table 5 illustrates that some participants had experience of online questionnaires, but the majority had no experience of online research methods at all. The workshops were thus clearly catering for base-level training needs.

Experience of online methods | Manchester 2007 | Cardiff Feb 2008 | Leicester 2008 | Manchester 2008 | Cardiff Oct 2008 | Total
Online interviews | 1 | 2 | 0 | 0 | 1 | 4
Online questionnaires | 2 | 10 | 5 | 3 | 10 | 30
Online ethnographies | 1 | 0 | 0 | 0 | 1 | 2
Online experiments | 0 | 1 | 0 | 0 | 0 | 1
None of the above | 13 | 9 | 7 | 6 | 8 | 43

Table 5: Experience of Online Research Methods

In the pre-workshop questionnaire participants were asked what they were most interested in gaining from the workshop. Clear categories emerged from the responses, for example: broader knowledge of online research methods, confidence, good practice, and practical application of online research methods. A selection of quotes illustrating participants' responses is included below:

A better understanding of what is involved in doing research online.

I would like to learn more about how the internet can be used for research purposes. I am also interested in discussing some of the issues facing online researchers with other individuals who conducting similar work.

An understanding of how I can use on-line methods in my own research .

Best practice re:  participant completion rates, 'things to avoid' tips from experienced researchers using online methods. 

The main software, skills and literature to obtain information and confidence to formulate and carry out online research.        

Further insight into the range of issues that relate to online research.   

How best to conduct asynchronous online interviews.     

Having a more in-depth understanding about when to use/when not to use online methods and how to get the most out of them in research. 

A broader understanding of on-line methodologies and ethics.

It is clear that the pre-registration questionnaire was important for establishing participants' needs prior to the workshop, and this data helped in the allocation of groups within workshops. This information enabled the project team to 'mould' the training to participants' needs.

1.2 Workshop Observation

The three workshops conducted during the first year of funding were observed by the evaluator (by the end of the first year it was agreed that the workshop format had been perfected and observation of the later workshops was unnecessary). The main objective was to provide immediate feedback to the project team on the delivery of the workshops. The evaluator wrote a report following each workshop providing formative feedback to the project team, including a series of recommendations for the next workshop. The team acted on the recommendations, leading to a series of minor changes to the following workshop format, for example: using data from the pre-workshop questionnaire to draw on participants' experiences; tailoring the workshop to the specific needs of the group; ensuring the host institution provided administrative support for workshop registration; slight changes to the structure of the sessions, reducing the time spent on group feedback in the morning; including key references in the delegate packs; and extending access to Bristol Online Survey (BOS) for a period following the workshop to allow participants to complete their online surveys.

The six workshops followed the same format, which was modified slightly as a result of feedback during the first year and updated by the project team during the second year: new ethical case studies were introduced, new references were added, the timing of sessions was refined, and pre-workshop liaison with the host institution was improved.

A key aim for the project team was to make the workshop as practical as possible, enabling participants to gain hands-on experience of a range of online research methods. These hands-on sessions were highlighted by many participants in their feedback as one of the most beneficial aspects of the day. However, the project team had to put in a lot of liaison work with the various host institutions and faced a number of practical challenges to get these sessions to work. The team had to liaise with the host institutions' IT support staff to arrange guest computer accounts for all participants, and had to ensure that each host institution's network firewall would allow participants to access the Connect software used for the online interview practical session. Even with all these checks made, on one occasion the project team faced technical issues accessing the interview software on the day of the workshop. Fortunately the issue was resolved by the local IT support staff in time for the session to run successfully.

The format for the workshops

The morning session focused on online questionnaires, covering theory and providing hands-on activities. The opportunity to use Bristol Online Survey (BOS) to design their own basic online questionnaire was highlighted by many participants as a positive element of the workshop, as illustrated by the quotes below:

Hands on activity – it was really useful to consider how questions can be set online.

Being signposted to useful resources and introduced to Bristol Online survey. It was really useful to have practical experience of using this.

Opportunities to explore using and designing an online questionnaire. Good that tutoring staff were clearly talking from personal experience which enabled them to personalise the methodology

The afternoon session focused on online interviews, covering the theory and presenting an illustrative case study. A scenario was presented and participants were given the opportunity for hands-on experience of an online synchronous interview. Working in groups, one member of each group took the role of online interviewer and the others adopted the roles of interviewees using the scenario that had been presented. This hands-on experience really gave participants a taste of online interviews. Participants enjoyed the role play and also realised:

… that online interviews can be very difficult and require great skill.

The difficulty of dealing with the numbers of people online and establishing rapport. The necessity of up front info about the research. The speed of questions and answers.

The Speed of response when interviewing more than one person at a time – it is essential to pilot and practice interviews.

Really enjoyed practicing online interviews, it was fun but also great to see how it would work and potential complications

The final session of the day focused on the important issue of ethics. Participants worked in groups using scenarios to consider the key issues.

The complexities involved and deep thinking involved before undertaking any online methods.

Raised questions which I hadn’t previously considered.

Thought provoking.

Space for discussion and debate.

Thinking about ethics of collecting sensitive data.

Workshop participants were asked to complete a paper-based feedback questionnaire at the end of each session, reflecting on the session and providing their overall feedback on the workshop. Their responses are discussed in the following section.

1.3 Paper-based feedback questionnaire

The participants at the workshops were asked to comment on the best and worst aspects of each of the workshop sessions. This formative feedback enabled the project team to understand the aspects of the workshop that worked well and not so well. The questionnaires were transcribed and analysed by the project evaluator and feedback was given to the project team, enabling any necessary changes to the workshop format to be made.

The overall feedback from participants on the workshop was very positive. Some of their comments are presented below:

Nice mix of information giving and practical work, quite a lot of info in a short space of time.

Hands on practice on designing the web questionnaire. Having experienced convenors and some experienced participants at the workshop does help in my learning.

A very good intro.

Perfect day.

Opportunity to discuss scenarios covering difficult ethical issues.

That is one of the best ORM workshops I have been to.

Participants were also asked to rate the workshop on a scale of 1 to 5, where 1 indicated not satisfactory and 5 very satisfactory (Table 6).

How satisfactory was the workshop | Leicester 2007 | Manchester 2007 | Cardiff Feb 2008 | Leicester 2008 | Manchester 2008 | Cardiff Oct 2008 | Total
1 (not satisfactory) | 0 | 0 | 0 | 0 | 0 | 0 | 0
2 | 0 | 0 | 4% | 0 | 0 | 0 | 0.5%
3 | 0 | 0 | 21% | 0 | 12% | 12% | 7.5%
4 | 22% | 50% | 54% | 56% | 69% | 18% | 45%
5 (very satisfactory) | 78% | 50% | 21% | 44% | 19% | 70% | 47%

Table 6 - How satisfactory was the workshop

It is clear the workshops were rated highly: the large majority of participants rated the workshop as 4 or 5, indicating that they found it satisfactory or very satisfactory.

Participants were asked to suggest other areas they would have liked to see covered in the workshop, or improvements for the future. Key themes that emerged from the comments were: a two-day workshop (requested by many respondents), more practical experience, a session on online experiments, a session on online survey engines, a session on using the Web as a way of collecting and presenting analysis, more examples of successful online research, and information on legal issues. The wide range of suggestions for areas to include in future workshops highlights the breadth of material that could be covered in the area of online research methods and the demand for training in these areas.

The workshops focused on ‘training the trainers’ with the aim that the enhanced knowledge of established researchers would be passed on (‘trickle down’) to research staff and postgraduates through follow-on teaching and supervision. The feedback provided by participants at the workshops illustrates that many intended to use their new knowledge in their own PhD research, their teaching or pass it on to their PhD students, as illustrated by the following quotes:

To inform research and teaching.

Will be presenting a short item on the organisations R & D forum.

Apply ideas to developing methodology to my PhD work.

In student project supervision.

I’m planning to conduct a quantitative survey. A lot from today will be useful.

I’m going to pilot a questionnaire if I can have use of some software at work. I’ll definitely use the website and possibly the online course for my own interest.

Will disseminate with colleagues and possibly use related ethically based issues in scenarios.

The future orientation of the training was followed up through a six month follow up online questionnaire.

1.4 Six month follow up questionnaire

In the paper-based feedback form participants were asked if they would be happy to be contacted after six months. Those who consented were sent a link to an online questionnaire, which enabled the evaluator to gather data on how they had used the information from the workshops. The qualitative feedback provided by respondents highlights that the majority had used the knowledge they gained at the workshop to implement an online questionnaire, using BOS, SurveyMonkey or their own existing software, as illustrated by the responses below:

I used surveymonkey.

Online questionnaire to our membership. System is an internal one to BAC.

Online series of questionnaires for PhD pilot study to trial the online feasibility for the main study. Used Survey Monkey, which is user friendly and 'looks good’ online.

I am about to send out a questionnaire using BOS.

I have developed two online questionnaires regarding help-seeking for mental health problems, and evaluation of smoking policy. I used SurveyMonkey for one study, and hosted another one to a self-registered domain.

BOS and Questback PRES and PTES survey and some internal Academy surveys.

BOS questionnaire devised

I provided some input into an online survey designed to gather some initial information on attitudes towards policing and crime amongst new migrants in London. The team used survey monkey to host the questionnaire which is still running.

Other respondents had conducted asynchronous interviews using email and had returned to the TRI-ORM website, as illustrated by the responses below:

I had a few interviews conducted via emails. I have not used any software ( I guess I need time to familiarize with the software).

I have conducted online interview using email.

I have used the resources on the TRI-ORM website, ethics in the design of a bid to involve using a blog (and interviews - non-online).

The six-month follow-up data illustrates that participants were using the knowledge they had gained for their own research or in teaching six months after the workshop. This illustrates the deep impact and long-term sustainability of the training provided by the TRI-ORM workshops.

2 Additional benefits and demand

The workshops themselves provided an excellent opportunity for networking, which is a key aspect of training sustainability. Participants were encouraged to network; the workshop offered the opportunity to make contact with other researchers using online research methods.

All the workshops were oversubscribed, and approximately ten additional requests were received by the project team following the final workshop, as illustrated by the correspondence below:

I notice that the online-research methods training events have now closed

- are there any plans for future workshops?

Lecturer in Psychology

I would be very interested in attending the one day workshop you run in online research methods.  I have looked at your web site and note that I as I have missed the current round of workshops that ran last year, I should register my interest for future workshops with yourself.

Are there any workshops planned for this year as yet?

Senior Lecturer, Faculty of Health and Life Sciences

The fact that the workshops were oversubscribed, and that requests were still being received by the project team after the end of the workshop programme, illustrates that there is a clear demand for this basic level of training in online research methods.

3 Workshop Conclusion

The project team have successfully trained a core group of people in basic online research methods. In total 179 people attended the series of one-day workshops; this exceeds the estimate of 120 people outlined in the project bid.

A range of users have been trained, with a tendency towards research students and early-career researchers, which is perhaps to be expected as these groups are possibly more motivated to explore new technology-based research methodologies. The wide range of discipline areas that participants came from clearly highlights the interdisciplinary interest in online research methods and the demand for training in online research methods across academic subject areas.

The aim of the workshops was to provide a basic grounding in online research methods. The pre-workshop questionnaires illustrated that participants were very interested in online research methods but their pre-workshop knowledge was basic and their experience limited. This illustrates that the marketing and publicity for the workshops was clearly reaching its target group. It also highlights the great interest in online research methods and the need for a basic level of training. The interest in this area is demonstrated by the fact that each of the workshops was oversubscribed and additional workshops have been requested.

The workshops were delivered appropriately and efficiently. The observations of the evaluator and the feedback from participants highlight that the workshops were very well run, with a high level of relevant hands-on experience for participants. However, it should be noted that running such interactive workshops required a lot of time and effort from the project team.

The immediate feedback from participants at the workshops highlighted their intention to use their new knowledge of online research methods in their research and teaching. Evidence that this ‘intended use’ was translated into ‘actual use’ of online research methods was gathered in the six-month follow-up feedback questionnaire. The follow-up data show that participants were using the knowledge they had gained in their own research or teaching six months after the workshop. This practical use of online research methods demonstrates the deep impact of the workshops and the long-term sustainability of the training provided by the TRI-ORM project team. The practical transfer of knowledge gained at the workshops into research practice provides evidence of the lasting value the project will have for the social science community.

Open/close headingIndividual workshop evaluations

The evaluation reports on the workshops delivered through the TRI-ORM project are available below. Select the workshop titles to view the reports.

Open/close headingUniversity of Leicester, 28th June 2007

The first workshop was delivered at the University of Leicester in June 2007. The following evaluation report was written by evaluation consultant Julia Meek, who observed the workshop throughout.

An introduction to online research methods
Thursday 28th June 2007
University of Leicester

Workshop Report

The aim of the workshop was to introduce participants to a selection of online research methods. The workshop covered theoretical, methodological and ethical issues and involved a lot of hands-on activities. The structure of the day is presented below:

10.00        Arrival/Registration/Coffee
10.15        Welcome and introduction to the day
10.45        Group Activity: Brainstorm the advantages and disadvantages of online research methods
11.00        Case study presentation – highlighting the use of online questionnaires and interviews and the ethical issues surrounding the use of ORM
11.30        Online questionnaires: types, sampling, design etc.
11.45        Group Activity: Developing an online research strategy for a hypothetical research question and producing an online questionnaire
12.45-1.45   Lunch
1.45         Online interviews: types, sampling, design etc.
2.00         Conducting an online interview
3.00         Tea and question and answer session
3.30         Group activity: Ethics and online research
4.00         Finish

Observations of the day and participants' feedback

The introductory session worked very well. The group gelled well together, helped by the opportunity for participants to introduce themselves and describe their interests.

The morning session focused on online questionnaires, covering the theory and providing opportunities for hands-on activities. The participants felt this session gave them a clear picture of the issues surrounding online questionnaires:

 “Designing an online questionnaire is not as simple as it seems there are a lot of issues to be considered”

The opportunity for hands-on experience working with the Bristol Online Survey was identified by many participants as a highlight of the morning session.

Clare Madge delivering a presentation at the workshop

The afternoon session focused on online interviews, covering the theory and presenting an illustrative case study. A scenario was presented and participants were given the opportunity for hands-on experience of an online interview. Working in groups, one member of each group took the role of online interviewer while the others adopted the roles of interviewees, using the scenario that had been presented. The hands-on experience really gave participants a taste of online interviews; they all enjoyed the role play and also realised:

“… that online interviews can be very difficult and require great skill”.

“The difficulty of dealing with the numbers of people online and establishing rapport. The necessity of up front info about the research. The speed of questions and answers”.

The “Speed of response when interviewing more than one person at a time – it is essential to pilot and practice interviews”.

The final session of the day focused on the important issue of ethics. Participants worked in groups using scenarios to consider the key issues. They found the scenarios a very useful method for discussing the issues (“Case studies useful to link theory to practice”). The session highlighted:

“The complexities involved and deep thinking involved before undertaking any online methods”.

Workshop participants in a group discussion activity

The feedback from participants on the workshop was very positive; some of their comments are presented below:

“nice mix of information giving and practical work, quite a lot of info in a short space of time”

“Hands on practice on designing the web questionnaire. Having experienced convenors and some experienced participants at the workshop does help in my learning”.

“Really enjoyed practicing online interviews, it was fun but also great to see how it would work and potential complications”.

“A very good intro”.

“Perfect day”.

 

Julia Meek
External Evaluator

 

 

Close heading CLOSE

Open/close headingUniversity of Manchester, 12th October 2007

This workshop was delivered at the University of Manchester in October 2007. The following evaluation report was written by evaluation consultant Julia Meek.

An introduction to online research methods
Friday 12th October 2007
University of Manchester

Workshop Report

The aim of the workshop was to introduce participants to a selection of online research methods. The workshop covered theoretical, methodological and ethical issues and involved a lot of hands-on activities.

Participants attending the workshop came from a range of subject disciplines – education, social policy, consumer research and health psychology, to name just a few. Most participants were involved in research either as a postgraduate student or as a research associate/fellow. The majority of participants described their experience of online research methods as basic; only a few had used online methods as a researcher.

The workshop presenters were Dr Clare Madge and Dr Tristram Hooley. The day was split between presentations, hands-on sessions and discussions. All participants received copies of the ‘Workshop pack’, which contained handouts of the presentation slides, guidance notes on creating an online survey, and information about the ‘Exploring Online Research Methods’ website and the new ‘Online Module’. The structure of the day is presented below:

10.00 to 10.40   Introduction
10.45 to 1.00    Online questionnaires
1.00 to 1.45     Lunch
1.45 to 3.00     Online interviews
3.00 to 3.10     Tea
3.10 to 3.45     Ethics and online research
3.45 to 4.00     Questions, evaluation, online futures, future directions

Observations of the day and participants' feedback

An initial problem experienced by several participants was actually finding the workshop! Roadworks made finding the building difficult, as the expected entrance road was blocked, which resulted in some participants arriving a little late. However, once they had arrived they experienced a very interesting day.

During the introductory session the presenters outlined what would be covered during the course of the day. There was also an opportunity for participants to describe their roles and interests in online methods.

The introductory session on online questionnaires was conducted in the seminar room before moving to the computer room (which unfortunately was a rather long walk from the seminar room) for participants to get some hands-on experience of online questionnaires. This led into a discussion on the advantages and disadvantages of online questionnaires. Participants then experimented with the Bristol Online Survey (BOS) to design their own online questionnaires. Some participants commented that they would have liked more time to complete their questionnaire; unfortunately, this was not possible in the time available. However, participants’ access to BOS was extended for a few extra days to enable them to work on their questionnaires in their own time. In their feedback forms participants were asked what was the ‘best thing’ about the morning session; a selection of their responses is presented below:

“Discussion of advantages and disadvantages of different types of online questionnaire – it helped me to think more about this more critically.”

“Hands on activity – it was really useful to consider how questions can be set online.”

“Being signposted to useful resources and introduced to Bristol Online survey. It was really useful to have practical experience of using this.”

Workshop participants    Tristram Hooley with a workshop participant    Tristram Hooley with workshop participants

The afternoon session focused on online interviews, covering the theory and presenting an illustrative case study. A scenario was presented and participants were given the opportunity for hands-on experience of an online interview, using Adobe Connect. Working in groups, one member of each group took the role of online interviewer while the others adopted the roles of interviewees, using the scenario that had been presented. The hands-on experience really gave participants a taste of online interviews; they all enjoyed the role play and also realised:

“That you need to experience an online interview as a participant before trying it out as an interviewer – to understand the problems that participants may have.”

In their feedback forms participants were asked what was the ‘best thing’ about the afternoon session on online interviews; participants commented:

“Well presented session with practical experience. Lots of very useful information.”

“The fun we had !”

“Experiencing an online interview.”

The final session of the day focused on the important issue of ethics. Participants worked in groups and discussed the scenarios they had been given to consider the key issues. In their feedback forms participants were asked what was the ‘best thing’ about the session on ethics; their responses were:

“Discussion was useful to get different ideas and identity subtlety of issues.”

“The lecture – introducing these issues.”

“Opportunity to discuss scenarios covering difficult ethical issues.”

“The complexity and necessity of it.”

The workshop was very successful. The only negative comments from participants concerned the venue: it was initially difficult to find, and they had to move a long distance between rooms.

“Different venue – computer room too far away from seminar room. Morning break earlier.”

“A break in the morning. Computers in the same room.”

“Venue not ideal – too far between computer and lecture room – wasted time.”

Apart from these comments the feedback from participants on the workshop was all very positive.

 

Julia Meek
External Evaluator

 

 

Close heading CLOSE

Open/close headingUniversity of Cardiff, 13th February 2008

This workshop was delivered at the University of Cardiff in February 2008. The following evaluation report was written by evaluation consultant Julia Meek.

An introduction to online research methods
Wednesday 13th February 2008
University of Cardiff

Workshop Report

The aim of the workshop was to introduce participants to a selection of online research methods. The workshop covered theoretical, methodological and ethical issues and involved a lot of hands-on activities.

Due to demand for places at this workshop, the group was larger than the usual maximum of twenty, with thirty people registering and twenty-seven finally attending. Participants attending the workshop came from a range of subject disciplines, for example Anthropology, Health Services Research, Psychiatry, Sociology of Sport, and Children's Social Policy and Research. Participants ranged from postgraduate students to senior researchers. The majority of participants had some experience of using online methods as a researcher: most described this knowledge as basic, a few described it as intermediate, and one described it as advanced.

The workshop presenters were Dr Jane Wellens and Dr Tristram Hooley. The day was split between presentations, hands-on sessions and discussions. All participants received copies of the ‘Workshop pack’, which contained handouts of the presentation slides, guidance notes on creating an online survey, and information about the ‘Exploring Online Research Methods’ website and the new ‘Online Module’. The structure of the day is presented below:

10.00 to 10.40   Introduction
10.45 to 1.00    Online questionnaires
1.00 to 1.45     Lunch
1.45 to 3.00     Online interviews
3.00 to 3.10     Tea
3.10 to 3.45     Ethics and online research
3.45 to 4.00     Questions, evaluation, online futures, future directions

Observations of the day and participants' feedback

The venue was impressive: registration, presentations and lunch took place in an ornate boardroom, with a computer room a short walk along the corridor. The help and support the workshop presenters received from the support staff at Cardiff was also impressive, ranging from a representative being available to register workshop participants to technical support opening a port in the firewall to enable the afternoon’s online synchronous discussion to take place.

The day began with an introduction. Participants were then asked to work in small groups (3 or 4) to discuss the advantages and disadvantages of online research methods. Each group fed back one key point which was discussed. This led into a discussion on the advantages and disadvantages of online questionnaires.

Participants then took a short walk to the computer room for the technical introduction to online questionnaires and a presentation of usability issues. Working in groups of two or three and using a scenario, participants discussed potential questions that they might include in an online questionnaire. Participants remained in these groups to experiment with the Bristol Online Survey (BOS) and design their own online questionnaire.

In their feedback forms participants were asked what was the ‘best thing’ about the morning session; a selection of their responses is presented below:

“Bringing all the points/ information together in one place. Providing links to tools/sites, particularly the Leicester site where I can go back and look at the online support”

“Detailed information on structure of online questionnaires. Some insightful points were highlighted”

“Chance to have a go at designing an online questionnaire”

“Opportunities to explore using and designing an online questionnaire. Good that tutoring staff were clearly talking from personal experience which enabled them to personalise the methodology”

Workshop participants    Tristram Hooley with workshop participants    Workshop participants

The afternoon session focused on online interviews, covering the theory and presenting an illustrative case study. Working in groups, one member of each group was given the role of online interviewer and the others adopted the roles of interviewees. The interviewers were briefed on their role before the afternoon session began, which enabled them to think about the role of the interviewer as they listened to the introduction to online interviews. A scenario was presented and participants were given the opportunity for hands-on experience of an online interview, using Adobe Connect. The hands-on experience really gave participants a taste of online interviews; they all enjoyed the role play and also realised the:

“Difficulties around managing an online (synchronous focus group)”

When asked for feedback on the ‘best thing’ about the afternoon session, a selection of their responses is presented below:

“Great experience of participating in an online focus group highlighted some of the practical issues and reinforced value of looking at online in the same way as doing real time research”

“Trying out online focus groups to get real understanding of how it might feel from a participant’s perspective”

“To learn more about this method, hands on activity was great”

The final session of the day focused on the important issue of ethics. The group were introduced to the key issues and pointed to websites where they could find more information. The participants were then given one of three case studies, each presenting a fictional research project seeking ethics approval. Working in groups of three or four, participants discussed the case study they had been given and then fed back their thoughts and comments to the wider group. This activity worked well; all groups made good observations, which the presenters used to reinforce key issues. In their feedback forms participants were asked what was the ‘best thing’ about the session on ethics; their responses were:

“Useful food for thought”

“Raised questions which I hadn’t previously considered”

“Thought provoking”

“Space for discussion and debate”

“Thinking about ethics of collecting sensitive data”

This was a very successful workshop; the presenters dealt effectively with a large group. In their feedback forms the majority of participants commented that they would be using online research methods in their own research or to support students.

 

Dr. Julia Meek
External Evaluator
25/02/08

 

 

Close heading CLOSE

Open/close headingUniversity of Leicester, 1st July 2008

This workshop was delivered at the University of Leicester in July 2008. The following evaluation report was written by evaluation consultant Julia Meek.

An introduction to online research methods
Tuesday 1st July 2008
University of Leicester

Workshop Report

This report is based on the feedback provided by the participants on their paper-based feedback forms. Twenty-four people were due to attend the workshop; seventeen attended, and sixteen evaluation questionnaires were completed and returned.

Participants' feedback

The participants were asked to note the main issues that stood out in the morning session; the importance of reflecting on the use of online methods and of questionnaire design was highlighted.

"The design of the questionnaire is incredibly important – I think this could be a course in itself."

"That using online methods should be thought about carefully, not just as a quick, easy and cheap way of doing research."

The opportunity for hands-on experience working with the Bristol Online Survey was identified by many participants as the best thing about the morning session.

"Having a go at creating a questionnaire using BOS. Also I like the way that both Jane and Clare were very responsive to questions and discussion."

"Trying BOS. Highlighting the issues about Qs."

"Trying out and experimenting on how as online question is devised."

When asked to highlight the worst thing about the morning session, limited time was mentioned, along with the fact that BOS was not very user-friendly. One respondent would have liked feedback on the questions they wrote in BOS; another would have liked to send the questionnaire to others and receive responses in order to see the reporting tool. Although both these suggestions might be desirable from a participant’s perspective, they would not be feasible within the already constrained time limits of the workshop.

"Limited time – always a difficulty!"

"Not trying out the questionnaire, that is sending it to others and receiving responses to have a feel of how responses looks like."

"Working with the software (not very user friendly)."

"We did not get any feedback on the questionnaires that we wrote in BOS."

The main issues that stood out for participants in the afternoon session were the importance of interview technique and building rapport in online interviews.

"Creating an online rapport and the challenges around this."

"How online interviews could produce a very different response and questioning style etc than that used in face to face interviews."

"The need for clarity of communication and to be careful about building rapport."

"Possibilities created by online interviews."

"It is important to perfect your interview technique before launching into this if using it for rigorous research. It may be easier to use for more casual research."

The hands-on experience really gave participants a taste of online interviews; many highlighted the practical session as the best thing about the afternoon.

"Getting the opportunity to conduct an online interview / and the intro to adobe connect."

"Highlighting the importance preparing for the interviews, setting ground rules, importance of pace, dealing with sciences."

"The practice of online interviewing – very thought provoking."

"Actually trying it out ourselves."

"Having a go on online interview."

"It was easier that I thought to grapple with the software! I’d use it again."

"Practical session – especially briefing and feedback."

When asked to note the worst thing about the afternoon session, the majority said “Nothing”. Others mentioned limited time. Two mentioned the content of the interview: one said they felt uncomfortable, and the other noted that the topic should be one relevant to everyone.

"I found conducting the interview uncomfortable (because of the content of the interview) but that’s not a criticism just part of the process of research."

"Having to finish so quickly when interview hardly under way."

"Perhaps not quite enough time for the practical session – not enough time to develop a real sense of the issues."

"Might have been easier to have chosen a topic relevant to everyone – might have made the situation more realistic e.g. public transport."

The final session of the day focused on ethics. When asked what main issues stood out, participants highlighted privacy, dilemmas, and the importance of considering the issues.

"Dilemmas. No easy answers to ethical questions."

"Ethical pluralism."

"Confidentiality/anonymity."

"Privacy, confidentiality and issues related to online research."

"Not to make assumptions. To consider the issues in relation to online research as opposed to the historical route – something to reflect on further."

When asked to highlight the best thing about the final session of the day, participants pointed to the opportunity for discussion, the references and examples provided, and the tour of the website.

"Examples put forward for discussion."

"The information on resources will be very useful."

"Tour through the website, especially the response tool."

"Detailed walkthrough and the website."

"Opportunity for some discussion."

"Discussing the issue."

When asked to highlight the worst thing about the final session of the day, lack of time was the issue raised.

Participants were asked to make suggestions for improving the workshop; four suggested making it longer, and three of those mentioned a two-day format. One participant suggested sending the ethical scenarios in advance to enable participants to become familiar with them prior to the workshop.

The participants were asked to rate the workshop on a 1 to 5 scale, where 1 was not satisfactory and 5 was very satisfactory. Nine of the participants rated the workshop as 4 and seven rated it as 5 (very satisfactory).

The participants were asked to rate the administrative arrangements for the workshop on a 1 to 5 scale, where 1 was not satisfactory and 5 was very satisfactory. Eight of the participants rated the arrangements as 4 and eight rated them as 5 (very satisfactory). Two made comments on the administrative/room arrangements.

"Moving between rooms would have been better if room was locked for people to leave bag in if travelled up the night before."

"You might include some suggestions for those staying over night – e.g. Ibis Hotel round corner from railway station is handy/ reasonable but there may be other accommodation to know about."

Participants were asked how they thought they might use the information from the day in the future. The majority mentioned that they would either use, or consider using, what they had learnt in their research.

All agreed for me to get in touch in six months’ time to find out how they had used what they learnt, and fifteen of the sixteen said they would like to receive further information about the online module.

Julia Meek
16/09/08

 

 

Close heading CLOSE

 

Close heading CLOSE

 

Close heading CLOSE

 

Open/close headingEvaluation of the online course

The second key element of TRI-ORM was the fifteen-credit M-level training module ‘Advanced Online Research Methods’. The aim of the module was to provide a professional development opportunity for researchers to develop their practice and undertake high-quality online research. The design of the module was built on the pedagogic framework outlined in E-tivities and E-moderating (Salmon, 2000; 2002), drawing on Salmon’s five-stage model, listed below:

  1. Access and Motivation
  2. Online Socialisation
  3. Information exchange
  4. Knowledge construction
  5. Development

The online module ran January-April 2008 and January-April 2009. The module aimed to enable participants to:

  • Evaluate when online research methods are appropriate for social science research
  • Discuss the key ethical and methodological issues surrounding the use of online research methods
  • Critically evaluate the use of online research methods to address their own and their peers’ research questions
  • Design and pilot an online questionnaire or online interview to address their own research questions using appropriate software/hardware
  • Evaluate their online research design in the light of feedback from the pilot
  • Identify the key issues in successful implementation of online research methods from the perspective of researcher and respondent
  • Reflect on their experience as an online researcher and identify further opportunities for development

The Advanced Online Research Methods module was advertised through the series of introductory face to face workshops run by the TRI-ORM project team. The module was also promoted through the University of Leicester and RDI websites and through direct email to potentially interested groups. People interested in participating on the module completed an online application form and were asked to provide a description of a research project they would be interested in undertaking using online research methods.

Table 7 presents details of the number of people accepted on the module, the number that withdrew and those that completed.

Status                                 2008    2009
Accepted on the module                   26      27
Withdrew                                  6       2
Completed                                20      25
Completed and submitted assessments      14      21

Table 7: Figures for module participants

The module was oversubscribed, and the retention rate for both years was very good. During the first year six participants withdrew during the module; their reasons, listed below, related to changes in their personal circumstances:

  • Husband’s ill health
  • No longer going to use any online research methods for their study
  • Finding it difficult to make the Tuesday commitment
  • Substantial and unexpected changes in work responsibilities

Table 8 provides a breakdown of the roles of the participants on the module in 2008 and 2009.

Role                                    2008    2009
Postgraduate Research Student              9      13
University Research Associate/Fellow       6       7
Public sector researcher                   6       1
University Lecturer & PT PGR               0       2
University Lecturer                        5       3

Table 8: Participant roles

A high proportion of the people registered for the module, especially during the second year, were postgraduate research students. This is perhaps to be expected, as this group may be more motivated to explore new technology-based research methodologies.

The module was structured around weekly synchronous meetings. Participants completed tasks to prepare for the weekly synchronous session and follow up tasks after the session. During the second year of the module participants were also required to keep a weekly blog of their experience.

1 Module Evaluation

The evaluation was conducted over the two years that the module ran. Evaluation in the first year sought to capture formative data to provide feedback to the development team and enable changes to be made for the second iteration of the module. Feedback on the module was gained through the following methods.

Evaluator/Learner Diary (2008): The evaluator participated in the module as a learner, completed the reading and e-tivities, joined in the synchronous discussion and posted to the discussion board. This personal experience was fed back to the development team on a weekly basis.

Message board: Any problems participants experienced could be posted to the ‘I’m stuck’ message board. The information on the message board has been reviewed to highlight issues that arose.

Online Questionnaire (2008): The aim of the online questionnaire was to capture feedback from participants on the registration process, the weekly sessions, and whether they felt they had achieved the learning aims set out in the module description.

Online Focus Group (2008): To gain more in-depth information from participants, the evaluator ran a synchronous online focus group at the end of the first year of the module. The evaluator used the responses from the online questionnaire to inform the focus group questions and to follow up key issues that emerged from the questionnaire.

Feedback from Module Co-ordinators (2008): A Formative Evaluation Report was written at the end of the first year of the module, which presented a series of recommendations. The module coordinators responded to this report and outlined the changes that would be implemented for the 2009 running of the module.

Participant Blogs (2009): The module development team introduced a reflective blog for all participants on the module during 2009. The blogs provide in-depth qualitative data tracking participants’ experience of studying the module. Participants were asked to post a reflective blog entry at the end of the module providing feedback on their experience.

1.1 Formative Evaluation 2008

Evaluator’s Diary 2008

The evaluator participated on the module as a learner and kept a diary detailing the experience from a personal perspective. The following section describes the issues emerging from the diary entries.

One of the key issues that emerged from the diary was the weekly feeling of uncertainty prior to entering the synchronous meeting: would the technology work, and would she be able to make a valuable contribution to the discussion? This feeling of uncertainty before each online meeting was balanced by the actual experience of participating in the synchronous session: the tutors kept the discussion focused, and the other learners were encouraging and supportive.

I always enter the discussion with a feeling of trepidation and come away with a feeling that I’ve just been involved in something really interesting and that I’ve learnt a lot. (Diary, Week 6).

As part of the module the tutors gave the participants experience of working in groups of different sizes for the synchronous meetings, to illustrate the difference between working with smaller and larger groups online. Eight participants took part in the discussion in week two, and the diary illustrates that it was difficult to keep up with the flow: reading the comments of others and then having the confidence to contribute quickly before the discussion moved on. The issue of varying group size was picked up in the online focus group. One member of the group felt that a discussion between two people was best for making a connection, but that a group of three or four worked well for keeping the discussion flowing.

The diary entries highlight the skill of the tutors in moderating the groups, ensuring that discussion moved forward and that all members of the groups had the opportunity to participate. Key questions were posted in week three prior to the synchronous meeting, and diary entries illustrate that these questions helped to focus the tasks.

‘I’m Stuck’ Message board 2008

Any problem that participants experienced could be posted to the ‘I’m Stuck’ message board. The issues highlighted on the message board were dealt with straight away (where possible) by the development team.

Accessing Connect: Several participants experienced problems entering the Connect chat room for the first online meeting. This issue was followed up in the online questionnaire, where six respondents highlighted that they had experienced problems with access to Connect.

Because our institution uses proxy servers I had to go home to use CONNECT (Anonymous 4, 2008)

Respondents were asked if they had any comments on how the team dealt with technical problems.

There was not enough information in the beginning on the technical requirements to get access (e.g. port 3222) (Anonymous 21, 2008)

Leicester team gave all help they could (Anonymous 14, 2008)

Don't know why but my connection to Connect kept dropping out every 5 mins or so whilst doing the weekly synchronous discussions. (Anonymous 12, 2008)

Books: One participant requested that the development team include the ISBNs of the suggested books to make ordering them easier.

Links to message boards: One participant requested that the links to the message boards be placed in the left-hand menu; this was done immediately to aid navigation.

Links to readings: A faulty link to a background paper was highlighted by participants and immediately corrected by the development team.

Photos: Several participants experienced problems adding their photo to the message board; instructions were immediately sent.

Evaluation questionnaire

A link to an online questionnaire was circulated to participants at the end of the final synchronous session. The aim of the questionnaire was to capture feedback on the registration process, the weekly sessions, and whether participants felt they had been able to achieve the learning aims set out in the module description. It also offered participants the opportunity to feed back any problems they experienced during the course of the module. The response rate was 50% (10 out of 20).

Registration and administration arrangements

Participants were asked if they were happy with the registration process for the module; 100% of respondents were happy with the process.

In filling out the application to take part I had to think through how it might be appropriate to me personally. Thought this was a good way of beginning things (Anonymous 12, 2008).

Participants were asked if they were happy with the administrative arrangements for the module; 75% were happy. The 25% who were unhappy with the arrangements cited the late arrival of login details. A key recommendation from the formative evaluation report at the end of the first year was that login details be sent to participants well in advance. This was taken up by the development team, and arrangements were made for the early dispatch of login details for the 2009 cohort.

Participants were asked if they were happy with the week-to-week running of the module; 84% were very positive.

I felt that the weekly meetings kept me going - especially because it is conducted online, I feel that a regular contact with lecturers and other students was very important. (Anonymous 23, 2008)

The 16% who were less positive about the weekly running of the module highlighted that instructions should be provided in advance, that reading lists should also be available in advance, and that the following week’s activities should be posted as soon as the previous week’s session had been completed. These recommendations were fed back to the development team and implemented for the 2009 module.

Participants were asked whether they had experienced any problems and whether they were happy with the way the team dealt with them; 75% were happy. The need for advance notice of technical requirements was highlighted, and this change was implemented during the registration process for the 2009 cohort.

Content

Participants were asked whether the weekly preparation/reading was too much; 84% said it was the right amount, while 16% said it was a bit too much.

Respondents were asked how many of the follow-up tasks they undertook. None of the respondents failed to undertake any of the tasks: 8% undertook all the tasks, 68% most, 16% some and 8% very few.

The follow-up takes where helpful and enabled a consolidation of ideas from articles and then online discussion/interaction. (Anonymous 11, 2008)

Found these were good in getting you to reflect/evaluate what was done in the week’s activities. (Anonymous 12, 2008)

Respondents were asked about the time commitment for the module: 50% said the module took more time than they expected, and 50% said it took the amount of time they expected. The issue of time was explored further in the focus group, where one member said that time had been a real issue in trying to fit the module in with work. Another member said that the module had been critical for them, as using online methods would form part of their research; although they had a full-time job and were studying, they had managed to find time for all the reading. Other members of the group thought the workload for the module was manageable. The group suggested providing guidelines at the start of the course about how long participants are expected to spend on it. This information was available on the website for the module and was clearly highlighted in the registration process for the 2009 cohort.

Participants were asked how useful they found the weekly activities. The responses were very positive during the key learning weeks; however, some dissatisfaction emerged towards the end, focusing on the pilot. Participants in the focus group were asked to suggest how the development team might improve this stage. One member of the group said they had expected to write a research proposal but had not realised they would have to carry out a pilot project. Another member said they felt the module petered out towards the end and that the long Easter break did not help, as momentum was lost.

The participants were asked to list the three best and the three worst things about studying the module. The issues that emerged as ‘best’ included the scope of the reading, working with people from other academic backgrounds, best practice, and peer support. The issues highlighted as ‘worst’ included technical problems and finding time to do the work.

Participants were asked what one piece of advice they would give to someone taking this module in the future.

Try and get the most out of the module as possible (Anonymous 27, 2008)

Be genuinely interested in online research methods and be prepared to devote some time to the learning process (Anonymous 19, 2008)

Make sure you devote plenty time and effort to the course - it's really worthwhile learning experience if you can do this. (Anonymous 12, 2008)

Interact with other people on the course early to maximise the experience. (Anonymous 11, 2008)

1.2 Feedback from Module Co-ordinators 2008

A Formative Evaluation Report based on the evaluator diary, online questionnaire and online focus group was submitted to the development team in November 2008. The report contained a series of recommendations. The recommendations, along with a series of questions about the time they spent preparing and delivering the meetings, were also emailed to the module coordinators, enabling them to reflect on and respond to the recommendations. Table 9 lists the recommendations and the actions that were taken for the second iteration of the module in 2009.

Recommendation

Action taken

R1 Clearly explain the purpose of the different group sizes for different tasks. Use the technique of key questions to focus tasks more broadly across the weeks, following the model used in Week 3.

The purpose was clearly explained, and this led to discussion in the participants’ blogs on the advantages and disadvantages of working with groups of different sizes.

R2 Provide a longer time for discussion of pilot projects in Week 7.

Participants were able to use their blogs to discuss the pilots.

R3 Ensure that the technical requirements for the module are part of the registration process.

Technical requirements included in registration process.

R4 Details of key texts should be provided prior to the start of the module to enable them to be ordered from libraries.

Details provided.

R5 Develop a general Help section for future years with FAQ’s.

“I’m stuck” message board available.

R6 Ask for confirmation from the applicants’ technical support teams that the necessary ports are open, or can be opened, prior to the start of the module.

This was done; however, central changes to the Connect system at the university meant that different ports were necessary, so the same initial connection problems arose.

R8 Open a Connect room prior to the start of the module, ask participants to enter the room and leave a message to ensure they can access Connect and will feel confident about entering the discussion room for the first synchronous meeting.

Test room opened.

R9 Ensure all Leicester id and logins are set up prior to the module starting and that all participants receive them in advance.

All ids sent out in advance.

R10 Provide advance notice of reading lists and tasks.

Details provided.

R11 Ensure that there is administrative support so that participants receive a quick response to emails, even if it is just a holding email until the problem can be resolved.

Action taken.

R12 Reflect on the group size and task for the first synchronous meeting.

Decision made to keep same task as the ice breaker.

R13 Ensure Leicester login will provide participants with access to Athens to enable them to obtain key papers for the module. Alternatively make all papers available through the module site.

Athens access available.

R14 Provide guidelines at the start of the module about the amount of time learners are expected to spend.

Details sent out as part of registration process.

R15 Reflect on the tasks that groups failed to complete in the time allocated e.g. peer feedback task time was limited.

Blogs provided a forum for group discussion beyond set synchronous meetings.

R16 Ensure the teaching is in a single clear block; the disruption of the long Easter break meant there was a loss of continuity.

Module planned to complete prior to Easter break.

R17 Promote the evaluation process as part of the module; all participants have to contribute.

Blogs formed key part of evaluation process.

Table 9: Key recommendations from the 2008 formative evaluation

1.3 Advanced online research methods module 2009

Participant Blog Entries 2009

The module coordinators introduced a reflective blog for all participants during 2009. The aim was to enable participants to build up a rapport with each other, draw on shared experience, and reflect on and discuss their experience of the module. The blog also contributed towards the second element of the assessment process, a reflective portfolio. The blogs provide in-depth qualitative data tracking participants’ experience of studying the module. Participants were asked to write a final post reflecting on their experience of the module, providing qualitative data for the evaluation.

The participants on the module were expected to write an entry in their blog each week. The number of postings over the ten-week module ranged from 4 to 17.

  • 8 participants exceeded the minimum number of blog contributions
  • 7 made the minimum, or slightly fewer than the minimum, number of blog contributions
  • 3 made fewer than the minimum number of blog contributions
  • 3 made a very limited contribution

The blogs provide a week by week insight into the participants’ experience. The participants used the blog to outline their thoughts on the weekly reading and introduced new articles of interest to the group. They reflected on the synchronous meetings and highlighted the key issues that emerged from the discussions. This enabled individuals from different meeting groups to share the experience of other groups. There was very positive feedback on the benefits of the blogs.

I found maintaining my course blog to be a useful experience and used it to reflective portfolio and a summary of the areas covered (Anonymous 24, 2009)

The blogs and online discussion helped to make the course coherent (Anonymous 12, 2009)

The blogs enabled participants to introduce themselves, the aim being to build rapport in the group; the group was split on how successful this had been. One participant felt they did not gel with the group until near the end of the module, a view echoed by others; however, it was contradicted by other members of the group, who felt they had built up rapport, especially during the piloting of the projects. The positive and negative reactions to building up rapport in the group are reflected in the quotes below:

The weak point was the low level of sustained connection between students in the class. I spent a lot of time reading other people's blogs for example, but one could not really get a sustained discussion going around them (Anonymous 5, 2009)

I have been reading the blogs posted by other course participants, and I do agree that the lack of opportunity to chat to each other and get to know each other better has been disappointing. While we have been in different groups at meetings, I feel there have been few opportunities to take these budding relationships further (Anonymous 4, 2009)

Someone else commented on their blog at some point that they did not feel that the course had any sort of "cohesion". I disagree. I think I have got to know some people, in a weird sort of way, even though I have never met them, and I don't know what they look like. I certainly got lots of help with my pilot study, and that was invaluable (Anonymous 19, 2009)

The way to interact with other students was very convenient. Having a weekly synchronous session was essential to keep me focused on the topics. In general I liked the type of interaction that you achieve using the chatting tool (Anonymous 10, 2009).

Socially, the course really picked up when we were grouped together for the purposes of piloting. Though everyone was friendly and constructive throughout (Anonymous 12, 2009).

The aim was to build rapport through the module blogs, with participants contributing and commenting on the contributions of others. As the module progressed, the time commitment needed to read and comment on blogs emerged as an issue. In their feedback several participants commented that they had underestimated the time needed for the module and had struggled with the commitment of the weekly blog. The time taken to read the blogs and then contribute possibly added to the difficulty of getting sustained discussion going on the blogs.

I just didn’t have time to read and reread especially to pick up some of the threads that emerged at times. I imagine that others too were very busy and I own to some disappointment that some questions I raised produced no responses at all (Anonymous 11, 2009)

The issue of time was also raised by one participant in relation to planning and piloting their project.

Piloting projects

Participants used the blogs to compare their experience of using synchronous and asynchronous methods and to describe their pilot projects to their peers. The blogs provided a means for the group to remain in contact during the piloting of their projects and a forum for participants to describe their experience to their peers.

The module gave participants the opportunity to pilot their research projects with their peers and to participate in other people’s pilots. Many found this experience extremely valuable:

The experience of piloting and getting feedback on my own questionnaire and responding to others’ pilots was immensely valuable (Anonymous 11, 2009)

It has also provided the experience of being an online research participant, as well as practitioner (Anonymous 4, 2009)

In participating as a respondent in two other pilots I was able to expand my experience of learning and gain a different perspective in the use of ORMs (Anonymous 15, 2009)

Reflections

For their final post the group were asked to reflect on their experience of participating on the module and to comment on its structure. The feedback was extremely positive: the aims of the module were described as good, as were the structure and organisation, as illustrated by the selection of quotes below:

I found the structure of the course excellent and the topics just seemed to flow one into the other at the time I needed them. In particular, I think the sessions up to the first assignment covered the essential ground to inform the thinking about the proposal (Anonymous 11, 2009)

ORM is well structured and organised. I felt that I was going the extra mile with each topic we explored….Having a weekly synchronous meeting was essential to keep me focused on the topics (Anonymous 10, 2009)

There were very positive comments about studying online, for example no travel and fitting in with other work. Fitting the module in around work was seen as a positive, but as the module progressed comments about pressure on participants’ time appeared in the module blogs.

One of the highlights has been the experiential nature of this learning. Much of this was through planned learning experiences and I feel more confident that I have some notion of what it’s like to engage in synchronous and asynchronous interviews as a participant. (Anonymous 11, 2009)

I have been able to integrate learning from this course since day 1 and the skills and knowledge I have gained are very applicable to my current professional role (Anonymous 15, 2009)

I think ORMs bring many challenges, but also many exciting possibilities, and I for one will now consider their potential place in any future research that I am planning (Anonymous 7, 2009)

Personally I feel I have gained what I sought from the ORM course experience in that I now feel confident in tackling it in my own research (Anonymous 4, 2009)

I found the course to be very useful and it has definitely strengthened my knowledge and understanding of online research methods (Anonymous 24, 2009)

Experientially, the course drew me out of my comfort zone. It got me blogging and chatting online, and familiarised me with the latest online trends (Anonymous 12, 2009)

There were very positive comments about the module tutors; a few participants would have liked additional support, while others were happy. The resources were positively received, although several participants noted that the references needed to be kept fully up to date.

The online resources on the TRi-Orm website are superb. I really hope that website stays put, because I will be going back to that resource many times I suspect (Anonymous 19, 2009)

The online reading and viewing materials were excellent, thought some seemed to be getting dated in a fast moving field (Anonymous 12, 2009)

‘I’m Stuck’ Message board 2009

The technical issues raised during the first year of the course in relation to problems accessing Connect were addressed in the registration process for the second year of the module. Details were provided to ensure that the necessary ports were open. However, changes made centrally to the Connect system at the University meant that a different port needed to be opened. The module coordinators were unaware of this change, which meant that some participants in the 2009 cohort also experienced problems accessing Connect for the first meeting. In response the module coordinators set up a test room in Connect and gave participants experiencing problems direct email contact with the person responsible for the Connect system at the university.

Participants experienced problems editing their profiles to add photographs to their blogs. Unfortunately, this problem could not easily be resolved, and the workaround suggested by the module coordinators was to add an image of themselves to their blog and then include it in a blog post.

Participants experienced problems logging into Connect and their blogs at the mid-point (week 5) of the module, and it was necessary for them to reset their passwords. This problem had not occurred in the previous iteration of the module, and the module coordinators were not aware of the need to reset passwords at regular intervals. The login problem was resolved once passwords were reset, but the inconvenience and frustration were noted by participants in their feedback.

Week 5

The session started badly with nobody able to log in, and continued with the notes panel being locked down again - so no editing was possible until right near the end of the time.  It was also impossible for more than one person to edit at once.  To top off the experience - when it came to blogging, the logins no longer worked due to the password change system (Anonymous 12, 2009)

A contributory factor in the technical issues was the dispersed nature of the participants on the module: they came from different institutions, and one was logging in from overseas. Although technical issues had been highlighted during the first year of the module and solutions put in place, changes to the central Connect system meant the recommended ports were no longer relevant. The initial login problems were quickly resolved; the mid-point requirement that login passwords be changed regularly had not emerged during the first year of the module, so it was a new issue for the module coordinators. Again the problem was resolved, but it was frustrating for participants. The technical hurdles of running an online module should not be underestimated.

2 Advanced Online Research Module Conclusions

The aim of the Advanced Online Research Methods module was to provide advanced training in online research methods, an aim which has been fulfilled. The module ran for two years (semester two 2008 and semester two 2009) and was extremely successful. It was oversubscribed in both years: in 2008, 26 participants were accepted onto the module and 21 completed; in 2009, 27 participants were accepted and 25 completed. The total of 46 completions exceeds the estimate of 40 people outlined in the project bid.

The module has been delivered appropriately and efficiently. The observations of the evaluator and feedback from participants highlight that the module was very well run, with a high level of hands-on, relevant experience for participants. The positive feedback from participants, illustrated most effectively through the blog entries in 2009, reflects the skill of the tutors in designing and developing an innovative module, and in moderating the synchronous meetings and maintaining motivation within the group throughout the duration of the module.

The convenience of learning online and fitting their learning around their daily work was attractive to many participants; however, during both years some participants underestimated the amount of time the module would take (although the time commitment had been presented in the registration information). During the first year the time needed for planning and piloting projects was highlighted by some, and during the second year the time commitment of reading, commenting on and adding to their own blogs was highlighted by some participants.

There were technical issues during both years the module ran; the problems with Connect resolved after the first year reappeared, due to a different cause, during the second year. The module coordinators were based in academic departments and were unaware of central changes to the Connect system. The module attracted participants from a range of UK and overseas institutions; each institution has its own firewall, and it was necessary for participants to ask their own IT services to open specific ports to enable them to access Connect. The technical issues and hurdles associated with online learning should not be underestimated.

 

Close heading CLOSE

 

Open/close headingEvaluation of the TRI-ORM website

The TRI-ORM website built on the work of the project ‘Exploring Online Methods in a Virtual Training Environment’, funded by the ESRC Research Methods Programme (RMP) (Phase 2) (Award no: RES-333-25-0001). This project produced an online training web resource addressing the use of online research methods; the website included sections on web-based questionnaires, synchronous and asynchronous interviews, and online research ethics. The website underwent a thorough evaluation and peer review in 2006 and was extremely well received. The following sections document the changes and additions that have been made to the website, linking back to the plans outlined in the original bid.

1 Enhancements to the website

A new home page and introductory pages were added to the site to reflect the new TRI-ORM project. The main structure and functionality of the main learning areas remained the same, as the functionality and ease of use of these areas had been fully evaluated during the initial design and development of the website. The key aims for the website under TRI-ORM were the development of an Online Research Methods Trainers Resource Bank and the enhancement of the self-study resources. The following sections review the changes that have been made to the website to meet these two aims.

1.1 Online Research Methods Trainers Resource Bank

A Resources for Tutors area of the site has been created as a single point of reference for tutors. This section provides tutors with direct links to learning objects and content within the website. It includes materials that have been tried and tested in workshops and the Advanced Online Research Methods module, made available for reuse by tutors/trainers. There are activity guidelines with resources to support the presentation of key topics, and guidance has been incorporated on how existing self-study activities within the website can be repurposed for training. The videos of face-to-face workshop presentations for self-study use have been incorporated into the training area alongside the original PowerPoint slides. This ensures that trainers can easily access examples of how these resources have been used prior to incorporating them into their own teaching. A clear list of objectives for the trainers’ section of the site was included in the original bid; these are presented in Table 1, along with how the project has met each objective.

Original objective

How the objective has been met

An annotated guide to using the Online Research Methods website for teaching;

A Resources Index is included in the Resources for Tutors area. The ‘Activities’ area of ‘Resources for Tutors’ includes a range of activities designed to allow materials from this website to be incorporated into face-to-face training situations.

Sets of questions designed to develop participation in online learning

‘E-tivities’ area of the Resources for tutors includes tried and tested online activities focused on online research methods.

Case studies designed for use as classroom or online prompts

A range of innovative case studies is included in the ‘Methodological futures’ area.

Macromedia Breeze lectures (integrating video lectures with supporting slides and materials)

Breeze presentations have been included in the Presentations area of Resources for Tutors and collected in the training area of the site, along with videos modelling their use.

PowerPoint slides;

PowerPoint slides from the workshops are included in the Presentations area of Resources for Tutors.

Interactive learning activities

Included in the E-tivities, Activities and Presentations areas of Resources for Tutors.

Table 1: Features of the Trainers Resource area of the site.

A review of the initial set of enhancements for trainers/tutors set out in the original project bid illustrates that the project team have made the key changes to the website that they envisaged.

There is one key area that has not been achieved. Originally, it was intended that the training materials would be developed as individual learning objects that would then be incorporated into the JISC Online Repository for Learning and Teaching Materials (JORUM). This was prevented by the University of Leicester’s lack of institutional membership of the service.

The other key area of enhancement set out in the project bid was the enhancement of the self-study resources; this is reviewed in the following section.

1.2 Enhancement of self-study resources

It was originally envisaged in the project bid that all the new materials would be housed in a single new area entitled ‘living archive’. However, the project team decided to retain the original structure of the site, adding new materials to the existing sections to which they related, while newer methodologies would be presented through real-life examples in an Online methodological futures area of the website. Retaining the original structure ensured that information was easier to locate and that the materials are presented in a more structured way.

Materials from the other strands of TRI-ORM have been incorporated into the self-study areas. This includes video lecture materials captured from the face-to-face workshops and Advanced Online Research Methods module, along with reading activities and discussions from the online module.  This material has been incorporated into the existing framework of the site's self-study modules to ensure it is embedded and contextualised by introductory materials.  This has allowed key materials to be made available to those who were unable to attend the other training strands and will ensure that this material is maintained for future reference.

The resource listing and in-text referencing within the self-study area have been extensively updated to ensure that key publications from the project period have been incorporated. This has ensured that the materials have continued to be informed by the latest developments, and has meant that the in-text references collection has been developed and expanded.

The TRI-ORM website has further developed the self-study resources. A key new area of the website is Online methodological futures. This section of the website aims to explicitly explore new and innovative online methodologies. Case studies from key researchers in the field provide contextualised introductions to a range of the latest online methods, including the use of blogging, the analysis of web traffic using website analytics tools and the automated analysis of the use of social networking sites.

2 Website Evaluation

The formative evaluation of the website ran throughout the project; the external evaluator and Principal Investigator reviewed the website and provided feedback to the Education Technologist. The summative evaluation of the website took the form of a peer review by academics experienced in the field of online research methods.

2.1 Formative Evaluation

The evaluator reviewed the new areas of the site that were developed under TRI-ORM. A heuristic evaluation was undertaken by the evaluator in November 2007 to review the new look and feel of the home page and the structure of the site. Ideally, a heuristic evaluation should be undertaken by three to five evaluators; however, this was an evaluation of an established site for which two heuristic evaluations had already been undertaken (September 2004, May 2005), so the navigation, page design and content of the site had already been evaluated. This evaluation focused on the new TRI-ORM pages of the website. Very few problems were encountered in the TRI-ORM section of the website; recommendations included rechecking links, enlarging the font on the key texts pages, and providing more information in the ‘Resources for Learners’ box on the home page. A final review of the complete website was undertaken by the evaluator in April 2009; this listed broken links and text changes, and no major usability problems were found.

Feedback on the use of the website was provided through the usage reports that were written during the period of the project. They illustrate the wide use of the site as summarised in the following section.

3 Usage

A comprehensive report on usage written by Learning Technologist Rob Shaw is available from the link below. Key quotes from the usage report are included below to illustrate the extent to which the site is being used:

Usage: “Web usage statistics show that over the period 1st November 2006 to 30th April 2009, the website was accessed 62,656 times (an average of 69 visits per day).”

“A comparison of five equal 6-month periods for overall site usage reveals high levels of usage throughout the period of the TRI-ORM project, with the average number of visits daily ranging from 52 to 77.”

Quality: “In analysing these figures, however, it is vitally important to consider 'quality' of visits as well as quantity…In terms of return visitors, over 10,000 users have visited the site repeatedly, and around 1000 users have clearly made it part of their researcher toolkit, returning to the resource on average on a monthly basis or more.”

Geographic reach: “Between 1st November 2006 and 5th February 2009, visits were made to the site from 179 countries/territories. 49.61% of traffic during this period came from the UK, followed by 12.92% from the US, 4.84% from India, 2.52% from Canada, 2.41% from Australia, 1.75% from Malaysia, 1.53% from Germany, 1.32% from the Philippines, 1.2% from Slovenia and 1.04% from Ireland… The overall picture is clearly one of high and increasing levels of geographical reach.”
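The headline figures quoted above can be cross-checked with a little arithmetic. The sketch below is illustrative only (it assumes the reporting window of 1st November 2006 to 30th April 2009 is counted inclusively) and uses Python's standard library to recompute the daily average from the stated visit total, and to sum the listed top-ten country shares.

```python
from datetime import date

# Figures quoted in the usage report.
total_visits = 62_656
start, end = date(2006, 11, 1), date(2009, 4, 30)

# Inclusive day count for the reporting window.
days = (end - start).days + 1
avg_daily = total_visits / days
print(f"{days} days, ~{round(avg_daily)} visits/day")  # ~69, matching the report

# Top-ten country shares (percentages of traffic, as listed above).
top_ten = [49.61, 12.92, 4.84, 2.52, 2.41, 1.75, 1.53, 1.32, 1.20, 1.04]
print(f"Top ten countries account for {sum(top_ten):.2f}% of visits")
```

On these assumptions the window is 912 days, giving the report's average of roughly 69 visits per day, and the ten listed countries together account for just over 79% of traffic.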

The usage figures highlight that the TRI-ORM website is well used and has attracted users from a wide range of geographical locations. To provide more insight into how the site is currently being used, the evaluator sought feedback from a group of academics who use and teach online research methods; their feedback is presented in the following section.

[i] The usage report will open in a new window, which you should close to return to this page.

Exploring online research methods/TRI-ORM Website usage report, May 2009 (pdf, 82KB)

4 Peer Review

To obtain further insight into how the website is, and might be, used in teaching, several academic reviewers were asked to provide feedback. The reviewers were Ted Gaiser (Director, Academic and Research Services, Boston College); Katja Lozar Manfreda (Assistant Professor of Statistics and Social Informatics, Faculty of Social Sciences, University of Ljubljana, Slovenia); and Henrietta O'Connor (Senior Lecturer, Centre for Labour Market Studies, University of Leicester). The academic reviewers were asked to provide a short description of their general impressions of the website, describe how they might use the website, and provide details of any technical problems they experienced (no technical problems were noted by the reviewers).

The reviewers' general impressions of the website were very positive. Their comments focused on the usefulness of the information to students and trainers, the references, the self-study resources, the ability to download presentations, and the ease of use of the website. The quotes from the reviewers presented below illustrate their positive impressions of the website.

This is a very helpful resource for social science researchers. It provides some good overall information, has lots of great references, and gives a good overview of issues to consider for online researchers. I feel it's often most important simply to raise the questions ... such as, "is an online interview different from a face-to-face ... and if so, in what ways?" (Ted Gaiser)

This is an outstanding resource for members of the social science community with an interest in either using or teaching aspects of online research methods. Alongside the self-study resources the site also provides a great deal of material for those who are teaching online methods either as a specialist unit covering all aspects in depth or as part of a broader methods course. (Henrietta O'Connor)

I find the site very useful for students and teachers of online research methods. I also find it useful for a practitioner who wants to do some online data collection but has no previous knowledge on this issue…I find the site also very easy to use, everything is clear, instructions are detailed and actually even not much needed since the organization of the information on the website is very straightforward, easy to follow and easy to be found. (Katja Lozar Manfreda)

Two of the reviewers highlighted that the website would be less useful for the more experienced user of online research methods. The reviewers were positive about how they use the website or could envisage using it in the future.

One of the first things I thought about when I looked at the site is what a great resource it would be for students. I teach an undergrad course in which students often choose to do an online study as part of their final project. This is an excellent resource for those students. I have also taught graduate seminars on online research methods. While I would consider it a resource for them, it would only be useful early in the semester (due to the issues raised in #1). I might also encourage colleagues considering venturing into the wide world of online research to use it as a resource to get started. (Ted Gaiser)

I have used the site in a number of different ways. First, I use the site as a resource for my own research, frequently referring back to this site for links to references and technical information. Second, I find the site invaluable as a teaching tool. Many of the students I teach are distance-learning students at Masters and Doctorate level who, for reasons of geography, cannot often attend face-to-face research methods training sessions. The importance of online methods to this group of students has grown immensely over recent years and many students plan to use online questionnaires in their research. This site is the first place that I suggest as a resource because it includes not only a textbook style account of online methods (which engages critically with relevant and up to date literature) but it also importantly includes practical information. (Henrietta O'Connor)

I might use the site as a teaching material in one of my graduate courses where I teach online research methods. I will use the case studies from the "Resources for tutors" section. I will also use the suggestions for activities and presentations. (Katja Lozar Manfreda)

Additional feedback was also provided by Dr Steve Strand (Institute of Education, University of Warwick).

I use the site in my teaching of Advanced Research Methods (ARM) to an annual cohort of 25-30 MPhil/PhD students in the Institute of Education here at Warwick. I use one of the learning activities as a demonstration in class (the item content #2 activity) and walk through the material on sampling. More generally I point them at the website as an excellent self-study resource for students who can progress at the speed they want and choose the coverage that is most appropriate (I focus on online questionnaires but the student could also use the online interviewing modules).  This is a valuable and professional resource.

The academic peer review of the website was positive. However, the reviewers highlighted that this is a resource that will need to be updated and expanded: in such a fast-moving field, the references will look dated within a short period of time. The reviewers also noted that the site is an introduction to online research methods, and that the more experienced online research methods practitioner may require more in-depth information. The TRI-ORM team catered for the training needs of more advanced practitioners through the Advanced Online Research Methods module, which was very well received. The module ran twice during the period of funding and would require further resources to run in the future.

5 Website Conclusions

The TRI-ORM website has been extended and enhanced during this period of funding: enhancements have been made to the self-study sections of the website, and a new Resources for tutors section has been developed. The website has enabled the three elements of TRI-ORM to be brought together. The materials from the workshops and online module are available for self-study or for trainers to incorporate into their own training in online research methods. These developments meet the aims set out in the original funding bid.

The one key aim that was not achieved was the development of the training materials as individual learning objects for incorporation into the JISC Online Repository for Learning and Teaching Materials (JORUM). This was prevented by the lack of institutional membership of the service at the University of Leicester. The issue of maintaining the resource in the future was highlighted by the academic reviewers: online research methods is a fast-moving area of research, and the references on the website will look dated within a short period of time. For the website to remain an attractive learning resource it will need to be updated and enhanced. The project team have established an agreement with ReStore, which would maintain existing material but would not provide scope for funding new material.


Final evaluation report

The final TRI-ORM project evaluation report is available from the following link.

[i] The report will open in a new window, which you should close to return to this page.

TRI-ORM Final Evaluation Report (pdf, 183KB)

 

© 2004-2010  All rights reserved