When I first entered this course, I did not expect it to change how I think about technology in any deep or lasting way. Information technology felt familiar, ordinary, and largely unavoidable. I used digital platforms daily, relied on computerized systems at work, and assumed that technology, while imperfect, generally pushed society forward. If there were problems, I believed they could be fixed with better regulation, better design, or more awareness. What I did not anticipate was how thoroughly this course would unsettle those assumptions, not through dramatic revelations, but through a steady and often uncomfortable process of rethinking what technology actually does in the world and who it truly serves.

What became most interesting to me over time was not any single concept, but the pattern that emerged across the units. Again and again, the course returned to the idea that information technology is never neutral. It does not simply exist alongside society; it actively shapes social relationships, power structures, and lived experience. Early on, this felt abstract. As the course progressed, it became increasingly personal. I began to recognize these dynamics not only in historical examples or large-scale systems, but in my own daily routines, workplaces, and online interactions.

One of the first moments where my thinking shifted involved surveillance. Before this course, surveillance was something I associated with extreme cases, such as authoritarian states, policing, or intelligence agencies. Giddens' analysis of modernity made it clear that surveillance is far more ordinary and deeply embedded in modern life than I had previously understood (Giddens, 1991). Surveillance operates through records, databases, documentation, and information flows that allow institutions to function. What struck me was how normalized this has become. Surveillance is rarely framed as control. Instead, it is presented as organization, safety, accountability, or efficiency.
As I reflected on this, I began noticing how often information about me is collected without my conscious attention. Workplace systems track performance. Digital platforms record behavior. Institutions retain records indefinitely. None of this feels dramatic, yet it quietly reshapes what is possible, acceptable, and expected. The emotional challenge here was subtle but real. It was unsettling to realize how easily I had accepted constant observation as a normal condition of participation in modern society.

That discomfort intensified when I engaged with Foucault's work on discipline and punishment. His argument that modern power functions through observation, judgment, and examination helped me make sense of experiences I had never previously named as political or disciplinary (Foucault, 1977). Performance reviews, audits, evaluations, and metrics suddenly looked less benign. They were not just tools for improvement, but mechanisms that shape behavior and identity. What unsettled me most was recognizing how much discipline operates internally. People regulate themselves not because someone is always watching, but because they know they could be watched, evaluated, or compared.

This realization forced me to confront my own role within these systems. I am not only subjected to discipline; I also participate in it. I follow norms, enforce standards, and accept evaluation because they are presented as necessary and professional. Foucault's work did not make me reject these systems outright, but it made me far more aware of their power. Once that awareness set in, it was difficult to return to a more comfortable, uncritical view.

The course also profoundly reshaped how I understand inequality. Prior to this class, I thought of inequality mainly in economic terms. Van Dijk's analysis revealed how deeply inequality is embedded within information societies themselves (van Dijk, 2005). Access to technology, which is often celebrated as a solution, is only the beginning.
Real power lies in who controls information, who has the skills to use it strategically, and who benefits from data extraction. This reframing helped explain why technological expansion has not led to the equality it often promises.

The Cambridge Analytica scandal made these abstract ideas painfully concrete. Learning how personal data was harvested and weaponized to influence political behavior was deeply unsettling (Schneble et al., 2018). What disturbed me most was the imbalance of awareness and control. Ordinary users unknowingly provided the raw material, while wealthy political actors turned that data into influence. This forced me to reconsider my earlier belief that participation in digital platforms is inherently empowering. In many cases, participation actually increases vulnerability.

Emotionally, this produced a sense of disillusionment. I had once believed that digital platforms could democratize communication and amplify marginalized voices. While that potential still exists, the course made it impossible to ignore how these platforms also centralize power, obscure accountability, and deepen inequality. This tension between promise and reality became one of the most persistent themes in my reflection.

My understanding of work and technological progress also changed significantly. I once associated technological advancement with improvement, assuming that better technology would naturally lead to better working conditions. Learning about high-tech sweatshops, algorithmic management, and the gig economy challenged this belief. From a Marxist perspective, technology under capitalism often intensifies exploitation rather than alleviating it (Marx, 1976). The Foxconn example was particularly striking because it revealed how extreme exploitation can coexist with cutting-edge technology. Advanced machinery did not reduce suffering; it increased pressure, surveillance, and alienation.
What made this realization especially uncomfortable was recognizing similar dynamics in less extreme environments. Digital scheduling systems, performance metrics, and productivity tracking increasingly shape work across many sectors. These systems are framed as neutral or efficient, yet they often reduce autonomy and increase stress. The language of optimization masks a reality of control. This forced me to rethink my own workplace experiences and how easily technological management becomes normalized.

The gig economy initially appears appealing, especially when framed in terms of flexibility and independence. However, through the course material, it became clear that this flexibility often comes at the cost of security, stability, and protection. Workers absorb risk while platforms retain power. This challenged the idea that innovation naturally leads to freedom. Instead, it reinforced the lesson that technology reflects existing power relations unless deliberately redirected.

One of the most unexpected and intellectually engaging themes in the course was technological transcendence. Noble's discussion of millennialism revealed how deeply Western technological thinking is shaped by religious ideas of salvation, perfection, and escape (Noble, 1997). I had never previously connected modern technology to religious belief, yet the parallels became impossible to ignore. Promises that technology will eliminate suffering or overcome human limits closely resemble spiritual visions of redemption.

Reading Childhood's End alongside Noble's work deepened this reflection. Clarke's vision of transcendence is peaceful and inevitable, yet it still involves loss, exclusion, and the disappearance of humanity as it is known. Adults are left behind. Humanity ends quietly. This made me realize how even hopeful visions of transcendence often avoid confronting present injustice. They shift attention away from current suffering toward a future that may never arrive.
Emotionally, this course was challenging in a cumulative way. There was no single moment of shock, but rather a gradual dismantling of assumptions. I felt guilt for how easily I had accepted technological convenience without questioning who benefits and who is harmed. I felt discomfort in recognizing my own participation in disciplinary systems. At the same time, I felt a growing sense of clarity. Being able to name these dynamics using theory transformed vague unease into critical understanding.

By the end of the course, what changed most was my orientation toward technology itself. I no longer see information technology as something that simply happens to society. It is shaped by history, politics, economics, and belief systems. Scholars such as Webster and Schiller emphasize that information societies must be understood through power, control, and inequality rather than innovation alone (Webster, 2006; Schiller, 1999). That insight now frames how I interpret digital life.

This course did not leave me with easy optimism or simple solutions. Instead, it left me more aware, more cautious, and more critical. I now approach technology with questions rather than assumptions. Who benefits? Who is excluded? What values are being reinforced? That shift in how I think and reflect feels like the most meaningful outcome of the course, and one that will continue to shape how I engage with the digital world long after the course has ended.
References

Clarke, A. C. (1953). Childhood's end. Ballantine Books.

Foucault, M. (1977). Discipline and punish: The birth of the prison (A. Sheridan, Trans.). Vintage Books. (Original work published 1975)

Giddens, A. (1991). Modernity and self-identity: Self and society in the late modern age. Stanford University Press.

Marx, K. (1976). Capital: A critique of political economy (Vol. 1, B. Fowkes, Trans.). Penguin Books. (Original work published 1867)

Noble, D. F. (1997). The religion of technology: The divinity of man and the spirit of invention. Knopf.

Schiller, H. I. (1999). Digital capitalism: Networking the global market system. MIT Press.

Schneble, C. O., Elger, B. S., & Shaw, D. (2018). The Cambridge Analytica affair and Internet-mediated research. EMBO Reports, 19(8), e46579. https://doi.org/10.15252/embr.201846579

van Dijk, J. A. G. M. (2005). The deepening divide: Inequality in the information society. SAGE Publications.

Webster, F. (2006). Theories of the information society (3rd ed.). Routledge.