I have received some interesting feedback about my recent post on technology in the 21st century. The feedback has been largely positive, but I should clarify that I do appreciate that the article "How Technology should be taught in a 21st Century School (UK v USA)" makes some sweeping generalisations and is somewhat one-sided in its argument. For that, however, I make no apology, as it is the nature of the beast of blogging! If I were writing for an academic journal I would perhaps be a little more balanced in my writing.
My criticisms of IT/Computing in the UK could almost equally be made of any other subject on the National Curriculum – particularly in the Secondary sector – given the compartmentalised approach to learning that characterises the UK system. Given the nature of terminal assessment, at both GCSE and A-Level, subjects are predominantly taught as stand-alone entities, and thus opportunities for "transfer" are limited, if not negligible. Many in the world of education technology argue that the UK approach to Computing/IT has improved considerably with recent amendments to the National Curriculum. Before those changes were made, the curriculum was predominantly focused on teaching functional skills centred on effectively using a machine. Given that the tools of technology current today will be obsolete by the time today's ten-year-olds graduate, it made perfect sense to amend the curriculum to focus more on programming, debugging and coding. These skills, it is argued, are akin to those used in learning a language and, moreover, help students develop thinking skills. Thus students are now expected to emerge as innovators rather than merely users.
Given the influence of Understanding by Design on US education, there is a greater emphasis on transferring learning between "subjects" in (good) US curriculum models. Drawing Standards from across academic disciplines is thus more commonly part of the predominant model of curriculum development in the US than is the case in the UK. Consequently, it is not alien to US teachers to use ISTE Standards in their curriculum mapping, as using standards and benchmarks from a range of disciplines is considered best practice.
Unfortunately, the reality doesn't quite match my idealistic explanation, and practices vary within systems: from district to district, school to school, and even from one classroom to another in the same school. Due to standardized testing and pressures on teachers to deliver results in their subject areas, the process of transfer has become neglected – because the skills of transferring knowledge, understanding and attitudes are not part of what is tested! Moreover, many UK schools might meet the demands of OFSTED at face value, but to what extent is the impact of IT/Computing on learning assessed? The elements that are taught as a "subject" may have National Curriculum expected outcomes, but what of those that are integrated into other subject areas? Furthermore, even when assessment tasks exist, are they effective and appropriate? Are tests merely criterion-referenced tasks, or do they dig deeper into a student's ability to use technology as a tool that significantly advances learning?
I am sure there are examples of outstanding practice, but more often than not I suspect that the average High School teacher is more conscious of the demands upon them to prepare students for terminal examinations than of any other aspect of their roles and responsibilities. After all, when it comes to performance management, teachers are expected to justify the results of the students they have taught in their particular subject discipline. Again, my criticisms are not of technology/computing per se but of the established norms of pedagogy, practice and assessment procedures.
There is more hope at Elementary and Middle School (Key Stage 3), though, as the pressures of terminal examinations and/or high-stakes testing are not as omnipresent. The opportunities for the transfer of learning, and for an assessment methodology involving performance tasks, are more abundant. I read recently, though, that Human Rights education is infrequently taught in UK schools, and that the reason for this is that teachers do not consider that their university education prepared them to deliver the content of such a course. Perhaps this problem affects the realm of computing too? How many UK schools were properly prepared to integrate the demands of the revised National Curriculum into their academic programmes, and where they have done so, how good is the quality of teaching and learning?
Debates around technology fascinate me, but more often than not I still wonder if the cart is not still very much before the horse. Despite protestations to the contrary, those in the technology industry have a vested interest in promoting the use of technology. The cleverest technology evangelists promote their products and/or services with talk of integration and advancing learning, but how objective is someone within the industry, or dependent upon it for employment, likely to be? I am no Luddite, and I see enormous potential in educational technology, but a computer remains merely a tool for learning – and, moreover, one tool in a multifaceted toolbox!