Estimated study time: 40 minutes
Although there is increasing policy and research attention paid to issues related to digital literacy, there is still relatively little information about how to put this into practice in the classroom. (van Dijk and van Deursen, 2014, p.3)
In the previous section, you considered why the concept of digital literacy should be widened to encompass more critical approaches. Now, you will look at the applications of critical digital literacy. This section outlines three key issues related to digital technologies and identifies how they may be examined using critical digital literacy and associated concepts.
Task 1 [15 minutes]
Reading: Countering truth decay
In this task, you will explore how educators may use ‘media literacy’ to counter the effects of ‘truth decay’ and promote the critical evaluation of media messages.
In Section 1, you read about the emergence of deepfake technology and its potential to spread disinformation. Establishing the origin and veracity of information has become more challenging in a world saturated by digitally-mediated messages and awash with disinformation. As a result, people face difficulties in ‘distinguishing between the truth and the representation of the truth, in determining which images and which experience to believe as true…’ (Mičunović, Badurina and Bosančić, 2016, p.140). This tendency, or the diminishing role that facts, data, and analysis play in political and civil discourse, has been called ‘truth decay’ (Huguet et al. 2019, p.ix).
Now, consider the following question: Can you think of any other examples of truth decay, other than 'deepfake' videos?
Some examples of the consequences of truth decay include the rapid dissemination of fake news using WhatsApp in India leading to mob lynchings (Chinmayi, 2019), or the Cambridge Analytica scandal involving Facebook and the U.S. presidential election (Day, 2019).
Huguet et al. (2019, p.ix) suggest that media literacy education offers a solution by helping people navigate ‘increasingly complex information ecosystems’. Media literacy describes a set of competencies such as analysing, evaluating and synthesising information, understanding its context, and communicating responsibly. Adopting a media literacy approach, educators can encourage students to critically evaluate the motivations of those disseminating information, the content of a message, its construction, and the possible influence of the medium used to communicate it (Huguet et al., 2019, p.xi). Media literacy is often associated with other, overlapping concepts, including information literacy, news literacy and visual literacy (ibid.). All these models build on ‘sociocultural perspectives of literacy’ in order to ‘contextualise digital practice within history, culture and power’ (Pangrazio, 2016, p.164). For example, in the Scandinavian context, national curricula incorporate critical approaches to the consumption and production of digital media, including multi-modal representations, analysis, evaluation and problem solving, as well as ethical and social concerns (Ryberg and Georgsen, 2010).
To better understand the ways in which media literacy education can counter truth decay, you will read the summary of a report called ‘Exploring Media Literacy Education as a Tool for Mitigating Truth Decay’. For this task, you should read only the Summary (pages ix to xx).
Huguet, A., Kavanagh, J., Baker, G. and Blumenthal, M. S., 2019. Exploring Media Literacy Education as a Tool for Mitigating Truth Decay. [e-book] The RAND Corporation.
Task 2 [15 minutes]
Video: Foregrounding surveillance capitalism
In this task, we explore how educators can use critical digital literacy to engage with the economic system of surveillance capitalism. Powered by ‘big data’, this ‘information capitalism aims to predict and modify human behavior as a means to produce revenue and market control’ (Zuboff, 2015, pp.75-76). Data collection has only accelerated as individuals are increasingly equipped with wearables or surrounded by surveillance equipment and sensors embedded in the everyday objects making up the ‘Internet of Things’. Often, users may trust that their data is safe with large corporations, an attitude known as ‘dataism’ (Van Dijck, 2013, 2014, cited in Lyon, 2017).
Shoshana Zuboff (2015, pp.79-81) has conducted a detailed study of Google and other ‘hyperscale’ technology companies. She uses Google as an example to demonstrate that this trust is misplaced and that the process of extracting big data is characterised by the absence of consent. Zuboff argues that the firm’s customers are advertisers, not the end users or ‘consumers’ of its services, who exist chiefly to be subjected to practices of commodification and behaviour modification. Thus, end users are stripped of ‘consensual participation’ and ‘reciprocal rights and obligations’.
You will watch a short section of a talk given by Zuboff for the Data & Society Research Institute, in which she explains the basis of surveillance capitalism. Access the video below. It will start at the point where you should begin watching (timestamp 13:13). Watch for about eight minutes, until timestamp 21:30.
As you watch the video, consider the following questions:
- How does Zuboff describe the emergence of markets in ‘behavioural futures’?
- Do you agree with Zuboff’s assertion that prediction has become commonplace in our lives?
- Why does Zuboff consider the practices of data collection ‘a one-way mirror’?
Data & Society Research Institute, Databite no. 118: Shoshana Zuboff, Surveillance Capitalism and Democracy, 02.13.19. Licensed under Creative Commons Attribution 3.0 Unported license.
Now, let’s consider why the concept of surveillance capitalism is an important one for educators to explore. You may recall reading about the term ‘digital natives’ in Section 1 and identifying why this is a problematic concept. It is evident that young people do not necessarily acquire the desired competencies or literacies through mere use of technology (Ryberg and Georgsen, 2010). Hoofnagle et al. (2010 cited in Zuboff, 2015, p.84) concluded that a ‘lack of knowledge’ rather than a disregard for privacy is the reason why large numbers of youth ‘engage with the digital world in a seemingly unconcerned manner’.
So, educators can use the principles of media literacy to help students engage with the issues as there is evidence that increased awareness about surveillance produces changed behaviours (Lyon, 2017). According to Gilliard (2017, p.65), educators should be ‘leveraging the classroom to make visible the effects of surveillance capitalism’ and exploring the notion of consent. Students can be encouraged to critically evaluate concerns about reliability, authorship, bias, conventions of particular genres of communication, motivation and media ownership.
Importantly, critical digital literacy involves the ability to analyse deterministic attitudes to digital technologies. While foregrounding the risks of surveillance capitalism for students, educators must recognise that ‘data subjects’ are not automatically compliant or lacking agency (Lyon, 2017). Students should not see themselves as passive recipients of surveillance culture but should be encouraged to explore how they are ‘critical recipients, active cultural consumers and co-producers’ (Ryberg and Georgsen, 2010).
Task 3 [15 minutes]
Reading: Problematising personalised learning
Before embarking on the application of a learning analytics approach, the institution (or faculty) should be clear about what its key drivers for success are, what constraints exist, and which conditions must be met. (Slade and Prinsloo, 2013, p.1524)
In the final task of Section 4, you will consider why educators must problematise educational technologies for the purposes of critical inquiry. Through this task, you will specifically examine the growth of personalised learning, along with the related concepts of educational data analytics and predictive modelling.
As platforms and advertisers perfect their own data mining techniques, educational institutions ‘rush to mimic those strategies in order to improve retention’ (Gilliard, 2017, p.64). Underpinning the ‘smart classroom’ is the use of algorithms to analyse masses of student data, which is used to give individual feedback, personalise learning paths, recommend content or offer supportive intervention (Williamson, 2017, p.89). The allure of personalised learning is strong as it offers ‘a balm for budget austerity’ (Kim, 2019). However, there are concerns educators need to keep in mind.
Firstly, there is the question of educational data mining and students’ right to privacy. By the time a child attains the age of 13, a staggering 72 million data points have been collected by online trackers (Harris, 2017 cited in Mascheroni, 2018, p.6). Although students may be aware of data mining and surveillance, they may not be mindful of how this happens in education (Slade and Prinsloo, 2013, p.1516). By requiring students to interact with adaptive learning algorithms, are educators unwittingly contributing to ‘dataveillance’, whereby student data is harvested for commodification?
Secondly, there is the issue of failure. A critical approach to learning analytics would also examine data storage. Students should be able to learn from past experiences, even failures, without those failures being inscribed onto their permanent record (Snowden, 2019, p.96; Slade and Prinsloo, 2013, p.1520). This is linked to wider issues of bias. Carr (2019, p.7) highlights how failure locates individuals within specific ‘social and academic strata’. Pattern recognition and predictive analysis could push students deemed ‘at risk’ of failure further towards it. Stigmatised students could be kept ‘prisoner to past choices’ by the predictive capacities of learning analytics (Pariser, 2011 cited in Slade and Prinsloo, 2013, p.1517). At the institutional level, this may result in ‘digital redlining’ where schools with historically less successful student populations are denied the same opportunities as successful ones, thereby perpetuating inequalities (Gilliard, 2017, p.64).
Thirdly, educators can also examine the ultimate objective of smart/cognitive classrooms: optimising human cognition. ‘When nonconscious cognitive devices penetrate into human systems, they can then potentially change the dynamics of human behaviours through changing brain morphology’ (Williamson, 2017, pp.93-94). Critical digital literacy might lead educators to examine more closely the somewhat dystopian worldview that human brains need to be made ever more efficient, and to question whose ends it serves. Furthermore, it remains unclear whether intelligent tutoring systems based on brain modelling account for neurodiversity of any sort. Conceptualising the brain as a computer presupposes ‘standard’ brain functions to be reproduced. Educators would do well to consider whether this drive towards more personalised learning could result in a stultifying uniformity in human cognition.
You will read a brief article about the potential effects of educational data mining, by Chris Gilliard. As you read, consider the following questions:
- What are the potential effects of the practice of ‘digital redlining’?
- How does the author encourage his students to imagine ‘new ways to exist online’? (Gilliard, 2017)
Gilliard, C., 2017. Pedagogy and the Logic of Platforms, EDUCAUSE Review 52, no. 4 (July/August 2017). [online]
Additional reading and video (optional)
1. Data & Society Research Institute, Databite no. 118: Shoshana Zuboff, Surveillance Capitalism and Democracy, 02.13.19. Available through
https://datasociety.net/events/databite-no-118-shoshana-zuboff/
2. Gibson, G., 2007. Computers in the Classroom? A Critique of the Digital Computer as A Metaphor for Mind, The Writing Instructor, Available through https://files.eric.ed.gov/fulltext/EJ824636.pdf or http://www.writinginstructor.org/gibson-2007-09
This work is licensed under a Creative Commons Attribution-Share Alike 3.0 License.
3. Ryberg, T., and Georgsen, M., 2010. Enabling Digital Literacy, Nordic Journal of Digital Literacy 02/ 2010 Volume 5. Available through https://www.idunn.no/dk/2010/02/art03
4. Slade, S., and Prinsloo, P., 2013. Learning Analytics: Ethical Issues and Dilemmas, American Behavioral Scientist 57(10) 1510–1529, Available through http://oro.open.ac.uk/36594/
5. Traxler, J. 2019. Digital Literacy: Concepts Challenged by the Occupation. [blog] #ALTC BLOG News & Views from the ALT Community, Available through https://altc.alt.ac.uk/blog/2019/11/digital-literacy-concepts-challenged-by-the-occupation/
Digital literacy for digital futures: Key implications for educators by Neenaz Ichaporia is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.