This paper discusses some problems arising from developments in distance education as the old ‘closed’ pedagogies of the early years of the Open University (UKOU) have given way to more flexible ‘open’ and participatory versions. The discussion focuses mostly on educational dilemmas arising from the need to balance the ‘good’ and ‘bad’ sides of this progression, and I draw upon work in distance education specifically, and on more general commentaries upon the effects of electronic media on knowledge and culture. I think that this sort of analysis might help sketch out possible issues arising for the development of electronic libraries as well, especially in terms of how such technologies might be used and with what educational effects. However, more detailed implications are clearly best left to library and information professionals.
Distance Education at the OU: the ‘Closed’ Approach
My own interest in distance education arises from my early work at the UKOU in the 1970s. I worked in the Institute of Educational Technology (IET) which was initially responsible for designing the educational aspects of the whole system needed to meet the new conditions (the design of the system also reflected strong political and financial aspects, of course). I have given my own account of these early planning decisions in Harris (1987).
The most obvious problems included developing new sorts of teaching (with university subjects, using mixed media), with new sorts of students, studying part-time (and therefore meeting their own expenses), and potentially with non-traditional academic profiles (although, in fact, many of the new students in the early years had quite similar profiles to traditional university entrants). Less obvious changes included developing new working arrangements for staff, including the use of short-term contracts for some central and most regional tutors. The whole system was to be integrated using a new approach based in educational technology, which promised to deliver both efficiency and ‘openness’ (defined largely in terms of coping with non-traditional students).
For various reasons, however, this approach adopted a peculiarly tightly closed design for the teaching system. Early design techniques included the now-familiar ‘objectives’ approach, pursued with unusual consistency and zeal. Objectives were to be defined, ideally in measurable, ‘behavioural’ terms, and then each component of the system could be evaluated rigorously in terms of its ability to achieve them. Radical implications were drawn for each element in the system – from summer schools to educational broadcasting. The general implications for course design were summarised by one of the first professors in IET:
‘The main teaching points must be fully explained, misleading statements and irrelevant scholastic displays must be eliminated. There must be no mistakes, non-sequiturs, gaps, or any other defects in the arguments.’ (Lewis 1971).
However, the approach proved too radical (or too ambitious):
(a) Many components were indeed found wanting – especially broadcasting – but they had to be kept anyway, largely for ‘political’ reasons in this case.
Nevertheless, despite certain revisions (which I discuss below), it is still possible to see the whole UKOU teaching system, and many of the approaches which have imitated it, as based on an approach to teaching which stresses ‘telling’ (Lewis and Cook 1969), and a corresponding model of learning which sees the mastery of certain privileged concepts as the main task. This sort of conception of academic knowledge involves tight framing (i.e. with tight boundaries between ‘proper knowledge’ and ‘everyday’ knowledge), and a definite hierarchy between the levels of mastery (new students at the bottom, professors at the top) (see Bernstein in Young 1971).
More materially, there was also the need to offer equal chances, even to remote students with restricted access to any material outside of the course, and a classically pessimistic view of non-traditional students lurked in the background. Indeed, such were the worries about the low performances and high drop-out rates of interlopers that UKOU administrators never particularly encouraged applications from non-traditional students. They developed a screening system for any who might apply, and breathed a huge sigh of relief when applications initially flooded in from highly ‘suitable’ students after all (Harris 1987).
However, it is course design that interests us here. These background factors all led to a teaching system where course packages were to be designed to include all the elements needed for successful mastery. Perhaps the principles are easier to see if we follow the switch in the organising metaphor for course design following the discovery of the inadequacies of the ‘objectives’ model. The principles of cybernetic psychology replaced those of behaviourist psychology, and course design was conceived as an exercise akin to computer programming (it was known as the ‘knowledge structures’ approach). As is well-known, successful program operations require that all the necessary information is specified and supplied in advance, in a ‘conceptually closed’ set or network.
The result was the classic OU course of the 1970s and 1980s, consisting of specially-written correspondence teaching packs, extracts from relevant books or articles, specially-commissioned course readers, collections of off-prints, and special editions of ‘set books’. No library service was offered for students, because none was thought necessary, although some special arrangements were negotiated with local libraries, and there were some small collections made available in study centres or summer schools. Librarians and their skills were not alone in being neglected, of course: local tutors and their skills were also seen as largely irrelevant, or, at best, as merely ‘remedial’, serving only to pick up any inevitable ‘noise’ or ‘bugs’ in the teaching system. The silence about any positive role for libraries in OU-style distance education extended to most of the early commentaries too, according to Cavanagh’s review (in Evans and Murphy 1994).
The system was designed to ‘tell’, and it had a number of telling technologies already to hand, including broadcasting. However, costs, and the reactions of an initially sceptical academic audience, rapidly led to the abandonment of radio and television as major technologies (the OU was to be called the ‘University of the Air’ until these problems arose). The need remained for a partnership with the BBC (partly to share costs, partly to borrow status), and so broadcasting remained in the system. Early studies showed a disappointing rate of suitable usage by OU students, however, a finding with some implications for later discussion. After all the initial excitement about new technologies (in the 1970s), correspondence teaching through print emerged as the workhorse of the system.
The New Educational Technology: ‘Independent Learning’
Just about all these assumptions about learning and its political context were to change, however. Academic knowledge became more loosely-framed, for example, especially with more ‘applied’ courses replacing the traditional academic subjects. Postgraduate courses required some kind of research, rather than just ‘mastery’. At undergraduate, including certificate or diploma, level the University shared in the more general shift to customer-centred modes of teaching in higher education, including ‘independent learning’. At the theoretical level, educational technology underwent a conversion to more ‘student-centred’ or interactive modes of course design, while at the political level, many early fears evaporated as the OU recorded acceptable drop-out rates overall. The data these days are often far less encouraging than the public realise, in fact – Morgan argues that ‘Less than half of the new students now finally registering on a foundation course are likely to proceed to graduation’ (Evans and Juler 1992: 84).
However, the new openness and independence offer a mixture of advantages and disadvantages too. Morgan (1985) and O’Reilly (1991) outline some paradoxes in the concept of independent learning, for example. The same tensions – between the autonomy of the learner on the one hand, and the needs of teachers to develop specialist disciplines on the other – affect the new pedagogies too. It is possible to have too much independence and autonomy, too much choice, so that the learner ceases to learn enough of the approved contents or structures to succeed in the assessment process that proper universities have to undertake.
Of course, to some extent, this problem too can be overcome. New assessment devices can be developed to reward divergence and independence. However, even these have to produce acceptable distributions of grades overall in the end, although British academics have managed to retain a good deal of secrecy about the actual processes whereby these final distributions are actually achieved. Students themselves can come to separate radically the business of passing tests from any broader interests in learning: they adopt ‘instrumental’ stances to assessment, developing often shared expertise in ‘selective neglect’ of the syllabus, learning how to ‘psych out’ their assessors and the like (Becker et al. 1964) or to become ‘cue-seekers’ (Miller and Parlett 1976). Without a design solution, only these deviant options remain, and all personnel in education would have to operate at two levels, so to speak – a public discourse of educational standards combined with a private cynical collaboration with students to pass tests might be one undesirable outcome.
The New Hardware and its Effects
How does this translate into a technology? The traditional option of the human tutor was reintroduced and given the newly-acknowledged skilled task of mediation, negotiation and compromise, reconciling independence with supervision. The UKOU also went on to become a pioneer in the use of new electronic technologies, of course, especially computer-linked tutorials, on-line courses, and email-based teaching. It would be natural now to turn to the clearly practical problems raised by the opportunities of the new electronic technologies, but it might be profitable to consider some of the older discussions first.
The ‘reproduction’ debate considered many of the options in the 1940s and 1950s, for example. On the one hand, the optimists stressed the benefits of immediate access to culture and knowledge that the new technologies offered. Anyone could view or hear the great works of art, for example, and do so directly, in their own homes, with none of the old elitist ‘aura’ that used to surround such works. This optimism informed early radical interest in the OU too, with Birnbaum’s hope (in Tunstall 1974) that academic knowledge would be democratised and normalised by educational television, as it beamed expert discussion and analysis into people’s living rooms, making it a part of their domestic and work lives, and avoiding all the socially divisive risks and rituals of university seminars run by elderly bourgeois dons.
For the pessimists, this access was provided at a price. Cultural (academic) materials would lose their distinctiveness along with their social auras, and would become cultural commodities along with all the other products of the ‘culture industry’ (Adorno and Horkheimer 1979). It was possible that great art, or academic critique, would be seized upon by eager critics thirsting for ammunition to pursue their radical projects, but far more likely that art and critique would be assigned a (minor and probably derogatory) place in the ordinary scheme of things. I am not proposing a definitive answer to this debate, but I have met some OU students who were indeed thirsty critics – a shop steward or a teacher taking Social Science courses in order to grasp the principles of a system of work they opposed. However, I have met far more who managed academic work in a cheerfully ‘normal’ way, coping with it instrumentally, treating it as an odd minority pursuit, and remaining largely unaffected, and certainly unradicalised, after their periods of study.
The new electronic technologies seem to offer similar possibilities. To be optimistic, information is now available on a vastly increased scale, whether mediated through (electronic) libraries, provided commercially, or even published by individual enthusiasts. As we know, whole new dimensions of interactivity have developed with electronic texts: it is possible to search databases (including rare books) from all over the world, at any hour of day or night, via new and convenient search systems. We can manipulate electronic text once downloaded: scholars can analyse it electronically, for example. Just as writers discovered that wordprocessors were far more than just fancy typewriters, so (even student) users have realised that electronic text can be conveniently edited, with the main points cut and pasted, say, or merged with other texts. Electronic journals can be updated rapidly by authors, while readers can contribute to FAQs, or interact with authors in MOOs or via email (which is just as speedy as the telephone, yet somehow both more informal and more precise).
Yet the pessimists are equally right to doubt the overall effects of such provision, driven as it is by non-educational agendas (see Evans and Nation in Evans and Nation 1996). What might be called proper academic knowledge occupies a small place in a huge virtual universe, and it would not be surprising to find its distinctiveness obliterated. In a system which connects you with a site offering Dead Sea scroll fragments one minute, and one providing details of Japanese bondage the next, there is a clear danger of reducing all to a common cultural denominator (both sets of materials could look equally ‘exotic’ and incomprehensible, perhaps). Any necessary distinctions or boundaries between these materials seem to be maintained at present in a curious way – little social networks or intranets build up and connoisseurs pass on useful URLs on scraps of paper, or buy specialist magazines or books (such as Tseng et al. 1996). Soon, some predict, larger intranets will reduce the present free-for-all – but will materials be classified as they are at present (e.g. via ‘subjects’), or in ways more suitable for the interests of the commercial providers?
On a more practical level, we know how difficult it is to get students to work with suitable classificatory systems (including the old technologies of card indices). Leave them to their own devices with electronic databases, and they search too inefficiently or too promiscuously for academic purposes. Apparently, even Australian 12 year-olds do this (Oliver and Oliver 1996), and it is suggested that training in efficient search procedures is needed. Yet what sort of training should it be – in the actual procedural mechanics of common search engines, or in something wider, involving the ability to impose suitable criteria on a formless resource?
There are optimistic and pessimistic possibilities again, it seems. A number of influential commentators on contemporary culture in general have stressed the good side of excess and relativism, for example. The very variability and promiscuity of structures organising and classifying knowledge will lead people to a great philosophical insight – that existing forms are artificial and that there is an unrepresentable openness to human knowledge, a ‘sublime’ dimension for Lyotard (1986). Deleuze (1992) has argued that the novel viewpoints offered by contemporary cinema extend the limits of human perception and deliver a new set of mobile images that burst conventional frames. More directly, Turkle (1995) investigates the effects on identity of being able to interact on the Net with a multiplicity of virtual identities: after an initial ‘sense of anguish’, a more liberating phase develops, and:
‘As we sense our inner diversity, we come to know our limitations. We understand that we do not and cannot know things completely, not the outside world and not ourselves.’ (Turkle 1995: 261).
The pessimistic possibilities have also been sketched in debates on existing media. Kinder (1991), discussing the free-wheeling relativism and radical intertextuality of contemporary children’s television, predicts serious damage to the old cultural mechanisms (standard narratives and stable representations) that permitted the steady growth of knowledge. Into the gaps will drop pre-packaged knowledge provided by commercial companies or dubious moral authorities, willing to provide the easy answers for cognitively stunted consumers. The Internet will exaggerate these trends, for Kinder, with its possibilities for illusory ‘participation’ and ‘choice’: these options will only destroy the old structures, and with them any genuinely independent thought.
Clearly, no hard and fast conclusions are available. Change is occurring at such a pace and on such a scale as to defy any easy answers, or indeed, any easy research to provide answers. Many small-scale and limited studies have been produced, with limited and local results (usually about usage rates or immediate practical problems with the new devices – the educational technology journals and collections I cite are strewn with such studies), but the really big and significant questions (about the impact on learning or on knowledge itself) are not likely to be tapped by user surveys.
At least some of us know how it feels to be confronted with these issues fairly late in our careers: just like ‘marginalised’ immigrants to modern societies, we ‘no longer invent local futures…[but remain]…tied to traditional pasts, inherited structures that either resist or yield to the new, but cannot produce it’ (Clifford 1988: 5).
Adorno, T. and Horkheimer, M. (1979) Dialectic of Enlightenment, London: Verso.