
Return to SENDA? Implementing accessibility for disabled students in virtual learning environments in UK further and higher education


2. Methodology

2. 1 Literature and web review

Literature germane to this research comes not only from the field of academic research, but also from educational and social policy, and from educational and technical guidelines produced for the various stakeholders – both academic and commercial – in the production of VLE-based courses.

Hart suggests that a literature review should comprise:

The selection of available documents (both published and unpublished) on the topic, which contain information, ideas, data and evidence written from a particular standpoint to fulfil certain aims or express certain views on the nature of the topic and how it is to be investigated, and the effective evaluation of these documents. (Hart 1998, p.13)

The evaluation undertaken here consists of both critical readings and the synthesis of disparate findings arising out of different disciplines or professions.

VLEs, like many aspects of information and communication technologies (ICTs), tend to transcend traditional academic and professional boundaries, and so what is common knowledge in one discipline may not be known in another. This may be one reason why, as several commentators have noted (see, for example, Seale and Rius-Riu (2001)), there is a great deal of ‘reinvention of the wheel’ in e-learning.

This report therefore follows Wolcott’s injunction to ‘draw upon the literature selectively and appropriately as needed in the telling of the story’ (quoted in Silverman (2000), p.230).

Most of the literature reviewed in this study is from the United Kingdom, although some relevant work from the United States, Australia and the EU has been included.

2.2 Survey

A web-based survey was developed in July 2003. The full text (1) of the survey is at http://www.synergy-communications.co.uk/vle-questionnaire/.

The survey was drafted after the first phase of literature review and interviews, which informed the scope and direction of the survey, as well as appropriate questions.

The survey was developed using HTML and CSS, (2) with simple text-based responses returned by e-mail to the author. The form was compliant with Level A of the W3C WAI guidelines (W3C WAI 1999). (Preece et al. (2002) point out that potential inaccessibility is one of the major disadvantages of web-based surveys.)

The survey was limited to an approximate 15-minute completion time to encourage a high response rate. It contained a mix of open and closed questions, rankings and multiple-choice questions.

The survey incorporated automated validation, ‘enforcing’ either single or multiple choices via check box and radio button functions, and returning an error page if the user had inadvertently missed a question.
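The validation behaviour described above can be sketched as follows. This is an illustrative reconstruction only: the original form was HTML-based and its server-side handling is not documented here, so the question names, function names and return values below are hypothetical.

```python
# Hypothetical sketch of the survey's validation logic: every question
# must have at least one response; otherwise an error page is returned
# instead of the submission being accepted. Question names are invented
# for illustration and do not come from the original survey.

REQUIRED_QUESTIONS = [
    "q1_institution_type",   # radio buttons: single choice enforced
    "q2_vle_in_use",         # check boxes: one or more choices
    "q3_accessibility_rating",
]

def validate_submission(form_data: dict) -> list:
    """Return the names of required questions left unanswered."""
    return [q for q in REQUIRED_QUESTIONS if not form_data.get(q)]

def handle_submission(form_data: dict) -> str:
    missing = validate_submission(form_data)
    if missing:
        # Stands in for returning an error page listing missed questions
        return "ERROR: please answer: " + ", ".join(missing)
    # Stands in for e-mailing the text-based responses to the author
    return "OK"
```

The same pattern – refuse the submission and name the gaps – is what Preece et al. (2002) identify as a key strength of web questionnaires over paper ones.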

Preece et al. (2002) note the above validation functionality as an important strength of web-based questionnaires. Other advantages include speed of response, lower cost (compared to paper), ease of data transfer (no re-keying of raw data), reduced analysis time and speedy correction of design errors after piloting.

The survey was first piloted on five electronic publishing master’s students at City University to check functionality. It was then piloted on three potential survey subjects – i.e. people who had knowledge of the domain in question – to check terminology, logic and scope. Amendments were made after each pilot stage.

An introductory web page described the context of the questionnaire and the confidentiality policy, and offered an incentive for completion.

Information concerning the survey was distributed via an e-mail containing the survey URL posted to three JISCmail (3) lists of particular relevance to the domain:

  • Ferl-VLE (4)
  • CETIS-accessibility (5)
  • JISC-MLE (6)

One of the main disadvantages of web-based surveys, according to Preece and colleagues (2002), is finding a representative sample of respondents. To a certain extent, the existence of specialist interest groups within the field of enquiry is an asset to the research, but it should also be remembered that this potentially skews the data. Respondents are likely to represent the knowledgeable/enthusiastic end of the spectrum of stakeholders involved in VLEs. Attention is drawn to this issue at appropriate points during the analysis.

The initial screening/cleaning of the data – to eliminate wrongly keyed responses, duplications, etc. – was carried out during the data collection process. The survey was ‘live’ between 1 August and 2 September 2003. A ‘reminder’ e-mail was sent to the lists on 26 August 2003.

A total of 46 survey responses were received. It was recognised that this was in some ways not the optimum time of year to conduct an academic survey in the UK, as many potential respondents take holidays during this period. Conversely, respondents who were available may have had more time to undertake the survey. The relevance and detail of responses received justified the approach, and returns represented just under 10% of the VLEs estimated to be currently in use in the UK (JISC/UCISA 2003). (7) There was an even mix of FE and HE institutions, and the relative spread of types of VLEs was representative of FE and HE nationally.
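The response-rate estimate above can be checked with a couple of lines of arithmetic. Note that the 83% uptake figure strictly describes colleges responding to the JISC/UCISA (2003) survey; applying it across all HEIs and FEIs is the report’s own approximation (see footnote 7).

```python
# Rough check of the figures in section 2.2 and footnote 7.
heis, feis = 152, 483          # DfES counts of HEIs and FEIs
uptake = 0.83                  # JISC/UCISA (2003) VLE uptake figure
estimated_vle_institutions = uptake * (heis + feis)
print(round(estimated_vle_institutions))   # ~527, i.e. "approximately 500"

responses = 46
rate = responses / 500
print(f"{rate:.1%}")                       # 9.2%, i.e. "just under 10%"
```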

Nonetheless, given the small sample size, analysis and conclusions are necessarily tentative and partial. Recommendations arising from this sample would need specific research with a larger sample size to enhance their sector-wide validity.

2.3 Interviews

Six face-to-face thematic interviews and one telephone interview were undertaken. These were with individuals from four contrasting educational institutions:

  • an urban university in south east England
  • an urban university in southern Scotland
  • an English university delivering distance learning across the UK
  • a distributed higher education institution serving a rural population in Scotland.

There were also a number of e-mail exchanges with experts in the field (referenced individually in the bibliography), and follow-up e-mails/phone calls to face-to-face interviewees.

The interviews were each conducted according to a semi-structured schedule based on the survey questions. This allowed individuals to comment in depth on the topics under discussion, and to raise unanticipated issues, which could be followed up during the interview. The interviews ranged from 45 to 90 minutes in duration, and were conducted at the interviewee’s place of work. Each interview was digitally recorded and subsequently transcribed.

2.4 Analysis (8)

The combination of literature review, survey and interview data allowed triangulation of data, aiding robustness of findings and analysis.

Table 1 below outlines the data analysis processes undertaken. The data consisted of qualitative data from:

  • face-to-face (f2f), telephone and e-mail interviews
  • open questions in the online survey
  • results of previous studies included in the literature review

and quantitative data from:

  • scores, ratings and rankings from the online survey
  • results of previous studies in the literature review
  • statistical sources such as HESA (Higher Education Statistics Agency), the LSC (Learning and Skills Council) and the DfES (Department for Education and Skills).

2.5 Some methodological issues in e-learning research

The accessibility of materials published on the world wide web can be subjected to standard testing methods and measurements, both qualitative and quantitative. Jakob Nielsen (Coyne and Nielsen 2001), Michael Paciello (2000) and a number of others have developed consistent work on testing web usability and web accessibility for disabled users, and standardised guidelines such as those developed by the World Wide Web Consortium’s Web Accessibility Initiative (WAI) (W3C WAI 1999, 2003b) provide accepted benchmarks.

As outlined in the previous section, this research does not undertake accessibility testing per se, but summarises existing research on accessibility of VLEs, and examines more broadly the context in which VLEs are created and used, with a view to pinpointing where accessibility problems originate. As such it conforms to Silverman’s formula of a ‘descriptive study based upon a clear social problem’ (Silverman 2000, p.33).

It should be noted, however, that some research in e-learning – particularly on its benefits – has come in for criticism. Seale and Rius-Riu point out that:

There is evidence to suggest that, in a bid to gain scientific acceptability, some learning technology research has used scientific methods inappropriately. (Seale and Rius-Riu 2001, p.23)

Similarly, Mitchell (2000) argues that inaccurate design and inappropriate analysis mean that some e-learning research is ‘pseudo-scientific’. Concerning VLE research specifically, a summary by the ICT Research Team at BECTa (British Educational Communications and Technology Agency) points out:

Most of the evidence of benefits [of VLEs]… tends to be anecdotal … inconclusive and open to debate. For example, where a benefit is reported, to what extent is it product specific, and how much does it provide a finding that reflects the benefits of VLEs as a whole? (BECTa 2003a, p.11)

E-learning research is relatively young, and VLEs are younger still. As VLEs mature and become embedded in educational institutions, VLE research will become correspondingly more diverse and robust. For the purposes of the current study, however, the relative scarcity of large-scale, conclusive research needs to be kept in mind.

Table 1. Data analysis processes

  • Responses to open questions (interviews – f2f, e-mail, phone – n=11; surveys n=46): classification/coding; analysis; interpretation (textual)
  • Check box questions (surveys n=46): checking; data formatting; analysis; interpretation (textual/graphical)
  • Ratings (surveys n=46): checking; data formatting; analysis; interpretation (textual/graphical)
  • Multiple choice/radio button questions (surveys n=46): checking; data formatting; analysis; interpretation (textual/graphical)
  • Interview tape recordings (f2f interviews n=6): transcription; analysis; interpretation (textual)
  • Interview notes (phone and e-mail interviews n=5): analysis; interpretation (textual)


Footnotes:
(1) Archive only
(2) See glossary for definitions of technical terms and acronyms
(3) Academic mailing lists run by the Joint Information Systems Committee – see glossary
(4) Ferl-VLE archives at http://www.jiscmail.ac.uk/lists/vle.html
(5) CETIS-accessibility archives at http://www.jiscmail.ac.uk/lists/CETIS-ACCESSIBILITY.html
(6) JISC-MLE archives at http://www.jiscmail.ac.uk/lists/Jisc-MLE.html
(7) A recent JISC/UCISA (2003) report cites 83% of responding colleges as users of one or more VLEs. Latest DfES figures show 152 HEIs and 483 FEIs (inc. 6th form colleges). This implies approximately 500 institutions currently use a VLE.
(8) The audit methodology of the IDEAS project (Integrating Disability into Educational Arenas) (University of Aberdeen 2001) provided useful guidelines on methodology and data analysis in this particular field.



All pages and content copyright © Sara Dunn 2003, unless otherwise stated.