Sport Informatics and Analytics/Introductions/Ethical Issues

From WikiEducator
[Image: Wrestler McCreadie, taken for Leichhardt Stadium, 1937. Photographer: Sam Hood]


This course offers opportunities to reflect critically[1] on some of the ethical issues[2] raised by the use of informatics and analytics in sport. As Harry Frankfurt suggests, reflection gives us the opportunity to consider what we care about[3].

One of the objectives that guide this course is:

To contribute to discussions about the epistemological foundations of sport informatics and analytics and the flourishing of ethical practice in the observation, recording and analysis of performance in play, games and sport.

This is a time of rapid expansion of informatics and analytics as fields of study and as practical activities. We hope that discussions about the quantification of performance will lead to reflections about the ethical framework within which we will work as our digital world is transformed by technological innovation[4][5][6][7][8] and how we position our practice in a way that contributes to discussions about open, pluralistic, tolerant, equitable behaviour[9], the 'ownership' of data[10][11] and privacy principles.[12][13]

Marshall McLuhan[14] explored a range of ethical issues prompted by an "electric age". This quote seems particularly relevant to the discussions on this page of the course:

In this electric age we see ourselves being translated more and more into the form of information, moving toward the technological extension of consciousness… By putting our physical bodies inside our extended nervous systems, by means of electric media, we set up a dynamic by which all previous technologies that are mere extensions of hands and feet and bodily heat-controls - all such extensions of our bodies, including cities - will be translated into information systems.[15]

In this topic you will have the opportunity to consider:

  • Philosophical issues
  • Socio-cultural issues
  • Pedagogy
  • Privacy and anonymity

In doing so, we are mindful of Nimrod Aloni and Lori Weintrob's (2017) observation:

Our current era has been diagnosed by many social critics as suffering from a disproportion between information and orientation. We live in an epoch that is named the information age, invests in scientific research, and celebrates technological innovations. Yet, at the same time, the commitment to liberal arts education and to serious public discourse has been abandoned, devaluing the capacities of thoughtful and empathetic deliberation required for ethical and political appraisal of personal choices and common goods.[16]

We start the discussion with a consideration of epistemic culture.

Epistemic culture

The study of informatics and analytics offers opportunities to explore our shared epistemic culture.

Karin Knorr Cetina says of an epistemic culture:

Everyone knows what science is about: it is about knowledge, the ‘objective’ and perhaps ‘true’ representation of the world as it really is. The problem is that no one is quite sure how scientists and other experts arrive at this knowledge. The notion of epistemic culture is designed to capture these interiorised processes of knowledge creation. It refers to those sets of practices, arrangements and mechanisms bound together by necessity, affinity and historical coincidence which, in a given area of professional expertise, make up how we know what we know. Epistemic cultures are cultures of creating and warranting knowledge.[17]

She adds that "the focus in an epistemic culture approach is on the construction of the machineries of knowledge construction" [18] (our emphasis).

Karin provides a detailed account of her work in her book Epistemic Cultures: How the sciences make knowledge (1999).

Ethical issues

Mark van Rijmenam [19], amongst others, has drawn attention to some ethical issues surrounding the use of artificial intelligence. He proposes that algorithms have two major flaws. Algorithms are:

  • Extremely literal: they pursue their (ultimate) goal literally, doing exactly what they are told while ignoring any other important considerations.
  • Black boxes: whatever happens inside an algorithm is known only to the organisation that uses it, and quite often not even to them.

He argues for a transparent approach to the use of algorithms that Michael van Lent[20] defined as 'explainable artificial intelligence' (XAI). Mark van Rijmenam notes Edward Shortliffe and his colleagues' (1975)[21] exposition of how a program can "explain its recommendations when queried".
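To make the idea of a program that can "explain its recommendations when queried" concrete, here is a minimal sketch of a rule-based recommender that records every rule that fires and replays that record on request. The rules, thresholds, and names are invented for illustration; this is not the MYCIN system or any real XAI library.

```python
# A toy rule-based recommender that can explain its advice when queried,
# in the spirit of explainable AI. Rules and thresholds are hypothetical.

def recommend(session_minutes, heart_rate_max):
    """Return (recommendation, trace) for a simple training-load decision."""
    trace = []  # human-readable record of every rule that fired
    if session_minutes > 120:
        trace.append("Rule 1: session longer than 120 min -> flag high load")
        load = "high"
    else:
        trace.append("Rule 1: session within 120 min -> load acceptable")
        load = "ok"
    if heart_rate_max > 190:
        trace.append("Rule 2: max heart rate above 190 bpm -> flag strain")
        strain = True
    else:
        trace.append("Rule 2: max heart rate within limits")
        strain = False
    decision = "rest" if (load == "high" or strain) else "train"
    trace.append(f"Conclusion: load={load}, strain={strain} -> {decision}")
    return decision, trace

decision, explanation = recommend(session_minutes=135, heart_rate_max=185)
print(decision)          # rest
for line in explanation:
    print(line)
```

Because the explanation is produced alongside the decision rather than reconstructed afterwards, any user can audit exactly why a recommendation was made, which is the kind of transparency the XAI literature argues for.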

Simon Buckingham Shum[22] suggests that such transparency supports 'algorithmic accountability'.

Katrina Karkazis and Jennifer Fishman (2017)[23] identify five areas of concern about the "largely unregulated and unexamined" use of biometric technologies in professional sports and the consumer sector:

  • validity and interpretation of data
  • increased surveillance and threats to privacy
  • risks to confidentiality and concerns regarding data security
  • conflicts of interest
  • coercion

Christoph Molnar (2018a)[24] shared his guide "for making black box models explainable". See also Christoph's discussion (2018b)[25] of the potential of the CRAN package iml for analysing black box machine learning models.
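One model-agnostic technique of the kind iml supports is permutation feature importance: shuffle one feature, re-score the model, and measure how much the error grows. The sketch below illustrates the idea in Python with an invented stand-in "model" and synthetic data; it does not use the iml API itself.

```python
import numpy as np

# Permutation feature importance for a black-box model: shuffle one
# feature at a time and measure how much the prediction error grows.
# The "model" and data below are invented stand-ins for illustration.

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))          # three candidate features
y = 3.0 * X[:, 0] + 0.5 * X[:, 1]      # feature 2 is irrelevant

def model(X):
    """A black box: we only observe its predictions."""
    return 3.0 * X[:, 0] + 0.5 * X[:, 1]

def permutation_importance(model, X, y, n_repeats=10):
    base_mse = np.mean((model(X) - y) ** 2)
    importances = []
    for j in range(X.shape[1]):
        increases = []
        for _ in range(n_repeats):
            Xp = X.copy()
            Xp[:, j] = rng.permutation(Xp[:, j])  # break feature-target link
            increases.append(np.mean((model(Xp) - y) ** 2) - base_mse)
        importances.append(float(np.mean(increases)))
    return importances

imp = permutation_importance(model, X, y)
print(imp)   # feature 0 dominates; feature 2 contributes nothing
```

Because the procedure only queries the model's predictions, it works on any black box, which is precisely why it is a staple of interpretable machine learning toolkits.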

In June 2018, Google shared its principles for artificial intelligence. Sundar Pichai (2018)[26] noted that "such powerful technology raises equally powerful questions about its use". The Google statement identified seven objectives for artificial intelligence.

In 2018, DrivenData hosted the deon ethical checklist for data scientists. The introduction to deon included:

The conversation about ethics in data science, machine learning, and AI is increasingly important. The goal of deon is to push that conversation forward and provide concrete, actionable reminders to the developers that have influence over how data science gets done.[27]

If you would like to explore some of the ethical issues in detail, you might find these topics of interest:

  • Socio-cultural aspects
  • Privacy and anonymity

ePortfolio activity

If you are compiling an ePortfolio for this course, it is likely that you will be addressing some of these ethical issues in your own practice. You might want to consider this activity as a trigger for your own reflections about how we observe and monitor performance in training and competition.

The ownership of performance data

Innovations in monitoring technologies are giving rise to what Robin James [28] refers to as "contemporary algorithmic culture".

What are your thoughts on the interface between the overt monitoring of performance and the privacy that each individual might expect in a digital world? We suggest you might like to consider these resources to support your reflections:


  1. Wiggins, Chris (2019). "Data: Past, Present, and Future". Retrieved 9 March 2019.
  2. Brusseau, James (6 May 2019). "Ethics of identity in the time of big data". Retrieved 9 May 2019.
  3. Frankfurt, Harry (1982). "The importance of what we care about". Synthese 53(2): 257-272.
  4. Durak-Somo, Thuto (6 September 2017). "Data visualization reshapes how players are evaluated and also how we see them". Retrieved 8 September 2017.
  5. IEEE (December 2017). "Ethically aligned design". Retrieved 14 December 2017.
  6. Kleinberg, Jon; Mullainathan, Sendhil; Raghavan, Manish (2016). "Inherent trade-offs in the fair determination of risk scores". arXiv preprint arXiv:1609.05807.
  7. McMahan, Ian (29 March 2018). "The Tricky Ethics of the NFL Sharing Troves of Player Data". Retrieved 30 March 2018.
  8. Clapperton, Guy (16 July 2018). "Would You Let Your Boss Put a Chip in Your Body?". Retrieved 17 July 2018.
  9. Digital Ethics Lab (2018). "About". Retrieved 20 April 2018.
  10. Mitchell, Vincent; Kamleitner, Bernadette (21 June 2018). "We don’t own data like we own a car – which is why we find data harder to protect". Retrieved 21 June 2018.
  11. McMullan, Thomas (22 June 2018). "How an Apple Watch Could Decide a Murder Case". Retrieved 24 June 2018.
  12. Jackson, Andrew (10 September 2018). "Wearable technologies and the Australian privacy principles". Retrieved 30 October 2018.
  13. Bettilyon, Tyler (24 April 2019). "Why ‘Anonymized Data’ Isn’t So Anonymous". Retrieved 26 April 2019.
  14. McLuhan, Marshall (1964). Understanding Media. New York: Mentor.
  15. McLuhan, Marshall (1964). Understanding Media. New York: Mentor. p. 64.
  16. Aloni, Nimrod; Weintrob, Lori (2017). Beyond Bystanders. Rotterdam: Sense Publishers. p. 1.
  17. Knorr Cetina, Karin (1999). "Culture in global knowledge societies: knowledge cultures and epistemic cultures". p. 363.
  18. Knorr Cetina, Karin (1999). "Culture in global knowledge societies: knowledge cultures and epistemic cultures". p. 363.
  19. van Rijmenam, Mark. "Algorithms are Black Boxes, That is Why We Need Explainable AI". Retrieved 12 March 2017.
  20. van Lent, Michael (2004). "An Explainable Artificial Intelligence System for Small-unit Tactical Behavior". Proceedings of the National Conference on Artificial Intelligence. Menlo Park, CA: AAAI Press: 900-907.
  21. Shortliffe, Edward et al. (1975). "Computer-based consultations in clinical therapeutics: Explanation and rule acquisition capabilities of the MYCIN system". Computers and Biomedical Research 8(4): 303-320.
  22. "Algorithmic Accountability for Learning Analytics". Simon Buckingham Shum. 25 March 2016. Retrieved 24 July 2017.
  23. Karkazis, Katrina; Fishman, Jennifer (2017). "Tracking US professional athletes: The ethics of biometric technologies". The American Journal of Bioethics 17(1): 45-60.
  24. Molnar, Christoph (4 April 2018). "Interpretable Machine Learning". Retrieved 4 May 2018.
  25. Molnar, Christoph (30 April 2018). "Interpretable Machine Learning with iml and mlr". Retrieved 4 May 2018.
  26. Pichai, Sundar (7 June 2018). "AI at Google: our principles". Retrieved 9 June 2018.
  27. Deon (2018). "An ethics checklist for data scientists". Retrieved 3 November 2018.
  28. James, Robin (2015). "Cloudy Logic". Retrieved 12 January 2016.
  29. de Montjoye, Yves-Alexandre (2013). "Unique in the Crowd: The privacy bounds of human mobility". Scientific Reports 3: 1376.
  30. Culnane, Chris; Rubinstein, Benjamin; Teague, Vanessa (29 September 2016). "Understanding the maths is crucial for protecting privacy". Retrieved 4 February 2018.
  31. Scott-Railton, John (29 January 2018). [1]. Retrieved 4 February 2018.
  32. Loughran, Steve (29 January 2018). [2]. Retrieved 4 February 2018.