Language Magazine is a monthly print and online publication that provides cutting-edge information for language learners, educators, and professionals around the world.


Challenging the Neutrality Myth of EdTech

In Language Magazine's July issue, Tim Stewart examines the role of digital technologies in ELL classrooms and how they are shaping our futures as citizens and educators

Media technology has been introduced into classrooms at least since the time of educational filmstrips. Later, private media companies began providing schools with “free” equipment that could broadcast their content, introducing commercials into one of the last advertising-free spaces. Today, students regularly use commercial learning software and learning management systems that generate a huge amount of student data as their progression through lessons is monitored. Few students, parents, teachers, or administrators are aware of what is being collected, how it is being used, or who has access to it. Technology is not neutral. Algorithms are inherently biased with the values of the programmers (O’Neil, 2016). We are not allowed to know how these algorithms determine outcomes, but they increasingly shape teachers’ careers and students’ educational prospects. With the rise of so-called teacher influencers cashing in on the Silicon Valley gravy train (Singer, 2017), it might be time to pause and take a more critical look at who benefits from decisions to adopt new technologies in schools.

Politics and Language Teaching
I doubt that most second-language teachers think of their profession as political. Those who do would mostly refrain from airing political views during lessons. While there was a time when TESOL and applied linguistics were framed as somehow pleasantly neutral and outside the sphere of politics, that is no longer the case. In the 1990s, we discovered that language education does not inhabit some idealized Leave It to Beaver space of wholesome “American values” (e.g., Pennycook, 1989).

Politics is ultimately about having the decision-making power to set the agenda. At the start of the TESOL profession, educators had the curricular and pedagogical agenda laid out for them through the behavioral method of the 1960s and then through the implementation of a variety of methods in the 1970s–1980s. Experimentation with different methods eventually led to more eclectic styles of teaching, wherein teachers gained some control by combining elements from different methods, as teaching decisions were made based on individual reasoning rather than ideological faith in a method. As teachers gained pedagogical decision-making power over methods (postmethod), more of them began to investigate how and why classroom practice was connected to student and teacher learning (see Freeman, 2016). Each of these moves in the field strengthened the agency of teachers to advocate on behalf of the profession and of their students. The expansion of decision-making power also increased teachers’ awareness of the political dimensions of TESOL.

In his book Values in English Language Teaching, Johnston (2003, p. 54) laid out five areas that showcase the political dimension of TESOL:

“the part played by language education in the processes of colonization and decolonization, the effect of the spread of English on indigenous languages, the political dimension of teaching immigrant and refugee learners in ESL contexts, the dominance of English in the media and in computer-based technologies, and the role of English in globalization.”

This article is a brief examination of the role of digital technologies in TESOL classrooms and how they are shaping our futures as citizens and educators. For many educators, technology may appear neutral and nonpolitical. Therefore, it remains invisible and unchallenged. In reality, technology is highly influential for language education and therefore demands our attention as an important political aspect of TESOL.

The COVID-19 pandemic forced teachers and students to shift suddenly to digital engagement online. The move toward online, on-demand, and hybrid learning is likely only to gain momentum. New educational technologies are typically described in neutral terms as tools, like chalk and chalkboards, that simply serve to improve teaching and learning. In fact, the creators of these technologies have agendas of their own that are often greatly at odds with the values of most language teachers.

The evolution of the field of English language teaching described above freed teachers from solely focusing on two-dimensional, black-and-white decisions regarding the most efficient way of acquiring an additional language. From the 1990s, second-language teachers became concerned with professional identity, ideology, and questions of values, politics, and power. Out of this process, there emerged a politics of teaching that is different from the teaching of politics. Teacher agency is obviously central to the politics of language teaching.

At the turn of the century, Hargreaves (2003) wondered how teacher agency could become a force to promote the creation of the so-called knowledge society. Back then, the internet was viewed as a medium for promoting equity of information and knowledge.

While Hargreaves was acutely aware of the powerful economic forces controlling the new digital technologies, he did not fully understand that these forces were surreptitiously creating a new social order. Zuboff (2019) recounts Durkheim’s analysis of the effects in industrial society of the division of labor as “an ordering principle that enabled and sustained a healthy modern community” (p. 184). In postindustrial society, a new social order has evolved based on the concentration of resources accumulated through technological surveillance—the division of learning (Zuboff, 2019).

Division of Learning
Fifty years ago, Marshall McLuhan recognized that “Today we live invested with an electronic information environment that is quite as imperceptible to us as water to a fish” (1969, p. 5). We are born into this world and live our lives in particular historical eras, largely unaware of the forces that shape the social order of our times. For people alive today, the new division of learning “reflects the primacy of learning, information, and knowledge in today’s quest for effective life” (Zuboff, 2019, pp. 184–185). Capitalism’s incessant drive for accumulation and the inevitable concentration of capital and power by industrialists in the 19th and early 20th century were eventually counterbalanced by union organizing and social justice movements. This action resulted in a social contract wherein productivity gains were shared, which resulted in the growth of a vibrant middle class. How the immense wealth today concentrated in the hands of executives and shareholders of Google, Amazon, Facebook, Apple, and Microsoft will be effectively counteracted is unclear.

To understand the new social order, we have to go back to the turn of the century. The promise of the 1990s knowledge economy appears to have been distorted, somewhat by accident, into something Zuboff (2019) calls “surveillance capitalism.” The turn happened when the dot-com bubble burst and Google’s investors got tired of waiting for the company to turn a profit. At that critical moment, some of its engineers stumbled on the data “exhaust” that Google Search retained about users. As we now know, this exhaust turned into the gold dust that is used to anticipate human behavior and is sold to advertisers as predictive products. What was formerly ignored as waste was fed into computational systems and generated a fortune.

This new form of capitalist accumulation has become the model for businesses of all kinds, and “When it comes to essential questions, surveillance capital has gathered the power and the asserted authority to supply all the answers” (Zuboff, 2019, p. 186). All of the data that people give away for free to big tech companies is computed and turned into predictions of behavior (knowledge) for commercial ends. We are not the customers; our lives are merely raw material. Our activities, emotions, and health are just the means to create algorithms that feed the networks. In short, the intrinsic value of a human life is being reduced to the data it produces. All this took place while we were distracted by the surface-level glitter of new technologies.

Your Information: Who Controls It and Who Decides?
Our data is constantly being gathered. At schools throughout the U.S., students begin their day with a check by metal detectors. In classrooms, individualization of instruction through technology enables closer surveillance of students. Google has a “free” suite of educational tools that are used in many school systems. While it appears that Google does not target advertisements at K–12 users of the service, the company still retains the personal data of children (Gillula and Cope, 2016). Typically, the privacy policy is so complex that parents and teachers might have to hire a contract attorney to figure out what data the company collects and what it does with that data. In addition, Silicon Valley has sold software to schools that tracks keystrokes, social media posts, facial expressions, and eye movement of children to monitor safety and learning. In contrast, elite schools in Silicon Valley are decidedly low tech, using “chalkboards and No. 2 pencils” (Akhtar and Ward, 2020).

Evaluation of student performance is increasingly being turned over to algorithms. While these online tools can save teachers valuable time as they help students review material and revise their writing by pointing out grammatical mistakes, for instance, the work still needs to be thoughtfully checked by a knowledgeable human. The truth of this was shown a number of times this past year. First, U.S. children figured out how to “play” Edgenuity’s algorithm by simply typing a string of probable keywords associated with a question. In fact, the company’s website states that “answers to certain questions receive 0% if they include no keywords, and 100% if they include at least one. Other questions earn a certain percentage based on the number of keywords included” (Chin, 2020). Second, in England, after regular examinations had to be canceled, algorithms were used to standardize grades for both the general secondary (GCSE and A-level) exams and the International Baccalaureate. Major protests by teachers, parents, and students over unexpectedly low grades forced the government to cancel these results (Dans, 2020).
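To see why such a scheme is so easy to “play,” the keyword grading that Chin (2020) quotes can be sketched in a few lines of Python. This is a toy illustration under assumed function and parameter names, not Edgenuity’s actual code:

```python
def grade_answer(answer: str, keywords: list[str], all_or_nothing: bool = True) -> float:
    """Toy keyword-based auto-grader: an answer is scored purely by which
    expected keywords appear in it, with no regard for meaning or coherence."""
    words = answer.lower().split()
    hits = sum(1 for kw in keywords if kw.lower() in words)
    if all_or_nothing:
        # 100% if at least one keyword appears, otherwise 0%
        return 100.0 if hits > 0 else 0.0
    # otherwise, a percentage based on the number of keywords matched
    return 100.0 * hits / len(keywords)

# A student can "play" the grader by typing a string of probable keywords
# instead of a real answer:
print(grade_answer("photosynthesis chlorophyll energy sunlight",
                   ["photosynthesis", "sunlight"]))  # → 100.0
```

The word salad above earns full marks because scoring depends only on keyword presence, which is exactly the weakness the students exploited.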

Teachers are also being evaluated by algorithms. Value-added modeling (VAM) is supposed to calculate how much a teacher adds to her students’ academic growth. For ease of machine processing, these scores are calculated using student results on standardized tests. This formula puts teachers of English language learners at a distinct disadvantage. Teachers have been terminated based on these evaluations, but they are not allowed to know how the scores are calculated or whether they are accurate (O’Neil, 2016). This is because evaluation has been subcontracted to for-profit corporations that claim their algorithms are intellectual property. Teacher unions have successfully fought against this inhumane evaluation system, but a big worry is that increased surveillance will stop teachers from organizing, protesting, or striking for labor rights and social justice issues. It is not inconceivable that keyword searches by teachers in the U.S. for “health insurance” or “salary” might lead to retribution.
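In broad strokes, VAM compares students’ actual standardized-test results against results a statistical model predicted for them, and attributes the residual to the teacher. The sketch below is a deliberately simplified caricature of that idea, with hypothetical names; the real formulas are proprietary and far more complex, which is precisely the problem O’Neil (2016) identifies:

```python
import statistics

def value_added_score(predicted: list[float], actual: list[float]) -> float:
    """Caricature of value-added modeling: the teacher's score is the mean
    difference between each student's actual test result and the result a
    statistical model predicted for that student."""
    residuals = [a - p for p, a in zip(predicted, actual)]
    return statistics.mean(residuals)

# If the model's predictions ignore the linguistic demands the test places on
# English language learners, their teacher's score drops through no fault of hers:
print(value_added_score(predicted=[70, 75, 80], actual=[68, 74, 79]))  # ≈ -1.33
```

Even in this toy version, everything hinges on how the predictions are generated, and that is exactly the part teachers are not allowed to inspect.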

There is a growing abyss between what we know, what algorithms allow us to know, and what is known about us by private companies and governments. How can the seemingly overwhelming imbalance of the new division of learning be countered? The companies claim that this mutation is somehow an “inevitable” transformation in the economic system, ignoring the fact that it is simply a set of choices made by people. The monopolies that have been created by this concentration of knowledge need to be broken up and the markets that trade in predictions of human behavior should be outlawed. Furthermore, there need to be new laws for and regulation of online business and social media. It must be made illegal to secretly capture the data of individual human experience and online behavior.

However, the astronomical sums made by the companies engaged in surveillance capitalism mean that they will stubbornly resist all criticism of their methods and any movement toward real reform and regulation. A case in point is that of a former insider, Dr. Timnit Gebru, an AI ethicist at Google. A paper she co-authored with a university expert and four other Google researchers highlighted the risks of the current method for training AI language models. These risks included the global impact of the company’s expanding carbon footprint and the inherent bias and homogenization of the model, because data scraped from the internet is mostly generated by wealthy communities in rich countries (Hao, 2020). Google executives did not appreciate these observations and immediately terminated Dr. Gebru when she demanded explanations for the negative reaction to her team’s research. Profit trumps ethics every time.

What kind of future do we want for our children: one where it’s possible to forgive and forget youthful indiscretions and bad decisions, learn, turn the page, and move on, or one where these things can never be forgotten and follow them throughout their lives? It might sound overly dramatic, but this is a real choice that needs to be made soon. Students depend on their teachers’ guidance, and good mentorship requires accurately interpreting the world. In order to advocate on behalf of their students and their profession in the contemporary world, language educators must be aware of how digital technologies are rapidly changing the politics of teaching.

Akhtar, A., and Ward, M. (2020). “Bill Gates and Steve Jobs Raised Their Kids with Limited Tech.” Business Insider.

Chin, M. (2020). “These Students Figured Out Their Tests Were Graded by AI—and the Easiest Way to Cheat.” The Verge.

Dans, E. (2020). “Algorithms and Education: Not so fast.” Forbes.

Freeman, D. (2016). Educating Second Language Teachers. Oxford University Press.

Gillula, J., and Cope, S. (2016). “Google Changes Its Tune When It Comes to Tracking Students.” Electronic Frontier Foundation.

Hao, K. (2020). “We Read the Paper that Forced Timnit Gebru Out of Google. Here’s What It Says.” MIT Technology Review.

Hargreaves, A. (2003). Teaching in the Knowledge Society. Teachers College Press.

Johnston, B. (2003). Values in English Language Teaching. Lawrence Erlbaum Associates.

McLuhan, M. (1969). Counterblast. Rapp & Whiting Ltd.

O’Neil, C. (2016). Weapons of Math Destruction. Penguin Books.

Pennycook, A. (1989). “The Concept of Method, Interested Knowledge, and the Politics of Language Teaching.” TESOL Quarterly, 23, 589–618.

Singer, N. (2017). “Silicon Valley Courts Brand-Name Teachers, Raising Ethics Issues.” New York Times.

Warner, J. (2019). “A Final Nail in the Coffin for Turnitin?” Inside Higher Ed.

Zuboff, S. (2019). The Age of Surveillance Capitalism. Profile Books.

Tim Stewart is professor of TESOL at Kyoto University, where he mentors undergraduate and graduate students alike in academic writing. He has worked for and been fired and rehired by the TESOL International Association.
