Image by nd3000 on Envato Elements

BY ERIK GUSTAFSON

The increased sophistication of AI-generated text, images, and actions has stirred up widespread concern – if not outright panic – in both scholarly and public circles. How will we know if students did their own work? Will students do their own work? What jobs will be altered by AI software? What jobs will no longer exist because of AI software?

All of these questions are notable and likely to be central to conversations on education and industry in the years to come. Indeed, we may look back and find that they foreshadowed what may well be a rapturous moment in media history, or at least it sure feels that way at present.

However, our treatment of AI software such as ChatGPT is still missing something. On the surface, ChatGPT is an extension of years of research, development, and refinement of language-processing models, but there are other mechanisms at work. Part of the hullabaloo surrounding ChatGPT appears to be the perceived quality and speed of its work. What if that accuracy and speed were increased exponentially?

Enter quantum computation. The term has been thrown around for a couple of decades now, a black box pointing to a new type of computation that functions altogether differently from our current computers. We know that quantum computation relies on something called qubits, and that qubits can handle far more data at a much faster rate. We know that the technology is not fully developed and faces difficulties in both engineering and programming. Beyond that, we as a populace know shockingly little about it. Even the Nobel Prize-winning physicist Richard Feynman famously said, "I think I can safely say that nobody understands quantum mechanics" (Siegfried, 2018).
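The claim that qubits can "handle far more data" has a concrete classical analogue: merely describing an n-qubit state requires tracking 2**n complex amplitudes, which is why simulating even modest quantum systems overwhelms classical machines. A minimal statevector sketch in Python (illustrative only; the qubit count and gate choice are my own assumptions, not drawn from the text):

```python
import numpy as np

# Describing an n-qubit quantum state classically takes 2**n complex
# amplitudes -- the bookkeeping doubles with every qubit added.
n = 3
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0  # start in the basis state |000>

# A Hadamard gate puts one qubit into an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# Apply a Hadamard to every qubit by building the tensor (Kronecker)
# product H (x) H (x) H, then acting on the full state vector.
H_all = H
for _ in range(n - 1):
    H_all = np.kron(H_all, H)
state = H_all @ state

# The 3-qubit register is now spread evenly over all 8 basis states.
probs = np.abs(state) ** 2
print(len(state))  # 8
print(probs)       # each outcome has probability 1/8
```

Even this toy register of three qubits spans eight amplitudes at once; at fifty qubits the classical description would exceed a quadrillion amplitudes, which is the intuition behind the speed and scale claims above.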

Despite this lack of public understanding, billions of dollars – more each year – are invested in quantum technologies (Bogobowicz et al., 2023). Probably not this year, and perhaps not even in five or ten, but likely within the next twenty, quantum computers will begin to phase out classical computers, just as hybrids are phasing out gas-powered cars. Generative AI is software – coding par excellence – but the computers that enable it are hardware. We have thus far been caught up with the software and its outputs while ignoring the hardware that enables it now and will enable it in the future. Generative AI may be scary now, but it is on the cusp of getting an engine swap.

No new technology is merely an addition to the environment; each new technology is a complete transformation of the environment. Classical computers have run the first leg of the race for generative AI, but quantum computers will surely be the relay team's anchor. To date, we have focused on what Marshall McLuhan called the figure, or the content (i.e., the software), of the issue, without paying attention to the ground, or the environment (i.e., the hardware), that has enabled and will continue to enable it (Logan, 2011; McLuhan, 1964).

What is most ethically concerning about quantum computers is not the exponential increase in the amount of data they can manage or the speed with which they can do so. What is concerning is the fragility of the systems and their higher susceptibility to error. Quantum computers require precise temperature control, and qubits (the storage vessels for the information) are easily disturbed (Brooks, 2021). The work of storing information becomes vastly more complicated, and accurate retrieval and analysis far less reliable, in quantum computation.
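The fragility described above can be caricatured with a toy noise model: if each operation disturbs a qubit with some small probability p (an assumed, illustrative error rate, not a measured hardware figure), the chance that a computation survives undisturbed decays exponentially with its length:

```python
import random

def survival_rate(steps, p=0.01, trials=20000, seed=42):
    """Fraction of simulated runs in which a qubit is never disturbed.

    p is a per-step disturbance probability -- an assumed, illustrative
    error rate chosen for this sketch, not a real hardware figure.
    """
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        # A run survives only if no single step disturbs the qubit.
        if all(rng.random() >= p for _ in range(steps)):
            ok += 1
    return ok / trials

# Reliability decays roughly as (1 - p)**steps, so long computations on
# noisy qubits become exponentially less trustworthy.
for steps in (10, 100, 500):
    print(steps, survival_rate(steps))  # tracks (0.99)**steps
```

Real devices mitigate this decay with error-correcting codes that spread one logical qubit across many physical qubits, which is part of why the engineering difficulties mentioned above remain unresolved.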

When paired with generative AI, the consequences of the continual refinement and incorporation of quantum computation could be detrimental (Zazario, 2022). AI and other automated systems have a propensity to multiply and exacerbate human biases, logics, and errors (Noble, 2018; O'Neil, 2016). Sped up even further by quantum computation, AI and the institutionalization of its logics can alter human perception and lend our systems a false veneer of objectivity. Perhaps most concerning is humanity's historically slow awareness of the problems such mechanisms cause. As quantum technology works out the kinks, our ability to identify the new problems it creates will remain static.

Thus, it is integral to humanhood that we pay attention not just to the software programs that draw our attention in, but to the hardware that enables them. Though this short commentary egregiously glosses over the complexities of quantum computation and the well-documented effects of datafication, my hope as an author is instead to shift our attention as scholars toward that which we perceive to be on the way.

It is precisely because we perceive the quantum era as the future that we must treat its infrastructural systems as the present. If we wait until the possibilities become visible to concern ourselves with the underlying hardware that enables the programs we are tasked with addressing, we risk being forced to build the car while we are driving it. It is untenable to simply critique the ethics of the outcome; we must build ethical evaluation into the very construction – at every step – of the hardware of our society.


Works Cited

Bogobowicz, M., Zemmel, R., Gao, S., Masiowski, M., Mohr, N., & Soller, H. (2023, April 24). "Quantum technology sees record investments, progress on talent gap." McKinsey Digital.

Logan, R.K. (2011). Figure/ground: Cracking the code. E-compos, 14(3), 1-13.

McLuhan, M. (1964). Understanding media: The extensions of man. Cambridge: MIT Press.

Noble, S. (2018). Algorithms of oppression: How search engines reinforce racism. New York: New York University Press.

O'Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. New York: Crown Publishing Group.

Siegfried, T. (2018, May 8). "A celebration of curiosity for Feynman's 100th birthday." Science News.

Zazario, Z. (2022, May 1). "How to fix quantum computing bugs." Scientific American.


  • Erik Gustafson is an Assistant Professor in the Department of Communication at the University of Texas at Tyler. His primary scholarly pursuits exist at the intersection of media, culture, and politics, and explore how rapidly evolving media environments affect our experience of what it means to be human.

