The first of a two-part series, by Scott Thornbury
“Nothing is particularly hard if you divide it into small jobs.” (Henry Ford)
ELT materials authors have recently been voicing their concern at the way the increasing digitalization and commodification of educational publishing might adversely affect not only their livelihoods but the status and integrity of their profession. The (often lucrative) royalty-based contracts on which many writers have relied in the past look set to be supplanted by a system of one-off negotiated fees. Worse, the ‘name author’ may soon be replaced by compliant teams of anonymous ‘item-writers’, many of whom may guiltlessly under-cut more experienced writers simply to get a foot in the publishing door.
The out-sourcing and fragmentation of content is not a new development: writers of ancillary materials, such as workbooks, teacher’s books, test and resource packs, have long since bitten the bullet and accepted the often nugatory fees on offer. Increasing digitalization has meant that a lot of content is not only ‘atomized’ but anonymous. And, as publishers, testing bodies and their institutional and political clients deliriously embrace the promise of ‘adaptive learning’ technologies, this trend is set to consolidate. The distinction between content-providers and test-item writers will start to blur, as will the distinction between named authors and out-sourced peons. As Neil Selwyn (2014: 129) observes, ‘One of the clear outcomes of the digitizations [sic] of education [...] is the reconstitution of education into forms that are reducible, quantifiable and ultimately contractible to various actors outside of the educational community.’
The commodification of education, including the reduction of content to the level of testable ‘bytes of information’, along with the relentless devaluing, de-skilling and disempowering of teachers that such commodification entails, should be resisted at all costs. And writers, like other vulnerable stakeholders, have not been slow in voicing their opposition. To their credit, their arguments are as much about the cynical disregard of educational values on the part of the multi-nationals as they are about any threat to their own well-being. Many ELT writers, after all, are former teachers and teacher trainers, and have a strong, even passionate allegiance to a model of education that is being brutally eroded by (in some cases) their current paymasters.
However, in defending their professional probity in the face of such dark forces, many writers overlook the fact that the ELT publishing industry (with them at its helm, it has to be said) had already gone down the atomization route long ago, even when the industry lacked the (digital) means to fully exploit this tendency. The view that language learning involves the incremental accumulation of discrete grammatical entities has long been enshrined in the design of coursebooks: witness these statements from publishers’ catalogues over two decades:
English structures are presented in small, manageable units and in incremental steps. (1989)
New grammar is introduced in manageable chunks and is given thorough and systematic practice. (1996)
The course has a transparent grammar syllabus which progresses steadily throughout the course. (2009)
Of course, this ‘incremental steps’ approach blissfully ignores prevailing thinking (both then and now) with regard to how languages are actually learned. As Long and Robinson (1998: 16) put it:
Of the scores of detailed studies of naturalistic and classroom language learning reported over the past 30 years, none suggest, for example, that presentation of discrete points of grammar one at a time … bears any resemblance except an accidental one to either the order or the manner in which naturalistic or classroom acquirers learn those items. As Rutherford (1988) noted, SLA is not a process of accumulating entities.
In a similar vein, but more recently, Rod Ellis (2008: 863), in reviewing the research to date, concludes, ‘Grammar instruction may prove powerless to alter the natural sequence of acquisition of developmental structures.’ And Diane Larsen-Freeman (1997: 151), coming from a dynamic systems perspective, reminds us that
Learning linguistic items is not a linear process – learners do not master one item and then move on to another. In fact, the learning curve for a single item is not linear either. The curve is filled with peaks and valleys, progress and backslidings.
Why, then, does the ‘accumulated entities’ view persist? Because, construed as ‘McNuggets’, grammar offers a means of disguising the inherently chaotic and idiosyncratic nature of language learning, rendering it instead as systematic, predictable, manageable and, ultimately, testable. It is consistent with the ‘culture of positivism’ (Giroux, 1997: 11) in which ‘knowledge becomes identified with scientific methodology and its orientation towards self-subsistent facts whose law-like connections can be grasped descriptively’. Such a view of language lends itself to models of production, consumption and regulation that not only do not threaten the status quo but underpin a lucrative global marketing strategy. And, of course, when a grammar McNuggets approach joins forces with digital delivery systems, it is a marriage made in heaven.
It is not just grammar that has been freeze-dried and vacuum-wrapped, either. Communicative competence itself, as Leung (2014: 135) points out, has been subject to the same reductionist treatment. Surveying the way that social interaction is dealt with in two recent coursebooks, as well as in the descriptors of the Common European Framework, Leung concludes that
present-day curriculum and pedagogic manifestations of the concept of communicative competence have tended to work with an inert and decomposed knowledge view, and this view continues to enjoy widespread circulation, despite a body of work that has pointed to the need to take a dynamic view of the social dimension.
When communicative competence – the theoretical construct that undergirds the whole communicative approach – becomes ‘inert and decomposed’, you can safely consign CLT to the dust-bin of methodological history.
As Lin (2013) warns: ‘Language teaching is increasingly prepackaged and delivered as if it were a standardised, marketable product [...] This commodifying ideology of language teaching and learning has gradually penetrated into school practices, turning teachers into “service providers.” The invisible consequence is that language learning and teaching has become a transaction of teachers passing on a marketable set of standardised knowledge items and skills to students.’
What Lin fails to acknowledge is that this ‘commodifying ideology’ has dominated since at least the mid-1980s, and that materials writers, willy-nilly, have been complicit. To protest that publishers, in cahoots with software designers, are only now taking commodification to its logical extreme is to ignore a long history of curriculum and materials design predicated on a production-line view of education.
Ellis, R. (2008) The Study of Second Language Acquisition (2nd edn.) Oxford: Oxford University Press.
Giroux, H. (1997) Pedagogy and the Politics of Hope: Theory, Culture and Schooling. Oxford: Westview Press.
Larsen-Freeman, D. (1997) ‘Chaos/complexity science and second language acquisition’, Applied Linguistics, 18/2.
Leung, C. (2014) ‘Communication and participatory involvement in linguistically diverse classrooms’, in May, S. (ed.) The Multilingual Turn: Implications for SLA, TESOL and Bilingual Education, London: Routledge.
Lin, A. (2013) ‘Toward paradigmatic change in TESOL methodologies: building plurilingual pedagogies from the ground up’, TESOL Quarterly, 47/3.
Long, M. and Robinson, P. (1998) ‘Focus on form: theory, research and practice’, in Doughty, C. and Williams, J. (eds.) Focus on Form in Classroom Second Language Acquisition, Cambridge: Cambridge University Press.
Selwyn, N. (2014) Distrusting Educational Technology, London: Routledge.
Next week: Writing by numbers: the myth of coursebook creativity.