In all honesty, I wasn’t sure what to expect from this course when I registered. At first, I thought it might be a general overview of how information technology (IT) works in our day-to-day lives, or perhaps a survey of the broad changes information technology has brought to society over time. But going through the units, readings, and assignments, I’ve been surprised at how much this course made me think deeply and challenge my initial perspective on how information and technology are tied to power, inequality, politics, and even our sense of self. On a personal level, I thoroughly enjoy learning about these topics. But as an engineering student, I most definitely was not used to doing it in an academic setting. It wasn’t always easy or comfortable, but it was absolutely worth it.
The topic that I found most fascinating was the intersection of surveillance, power, and inequality. Learning about Anthony Giddens’ idea that surveillance is a fundamental aspect of how modern societies organize themselves was eye-opening. It’s not just about cameras or government spying. It’s about the invisible ways information is gathered and used to manage populations and maintain control. The expansion of IT, whether through Facebook or data analytics firms like Cambridge Analytica, takes this to a whole new level. The scale and sophistication of these systems, combined with the lack of transparency, made me realize how much power rests in the hands of a few and how easily our privacy and autonomy can be compromised.
What surprised me the most was realizing how surveillance isn’t always aggressive or obviously invasive. It’s often subtle, embedded in everyday systems we trust and rely on. Health databases, education platforms, social media: all of them quietly collect and store records of our behaviour. It made me think about how much we’ve normalized being constantly watched, and how rarely we stop to ask where the data goes or who profits from it. It’s easy to shrug it off as the price of convenience, but this course pushed me to actually sit with the discomfort and start thinking more critically. I didn’t expect a sociology course to make me worried in a productive way, but it really did.
There was something really strange about realizing just how comfortable we’ve all become with surveillance as a default. And it’s not just the obvious platforms. It’s also subtle things like smart homes, Fitbits, and even the apps we use for school or work. That realization stuck with me. It’s made me more cautious, but also more willing to ask better questions about what I’m opting into and what the implications are.
Beyond the general facts and definitions of information technology, one of the most fascinating lessons was understanding the complexity of the “information society.” Webster’s critique of the vague and narrow ways it’s often defined made me appreciate how deeply information flows are tied to culture, economy, and politics (Webster, 2014). It’s not enough to say we live in an information society because of gadgets or jobs. We have to look critically at who controls information, who accesses it, and who benefits.
David Noble’s discussion of technology as a kind of religion with millenarian themes also changed how I think about progress and innovation (Noble, 1999). I had never connected ideas about transcendence or salvation to technological development. Now, I see how these beliefs influence how people imagine the future. Sometimes in hopeful, inspiring ways, but other times as a dangerous distraction from real social problems like inequality and exclusion. It made me more skeptical but also more thoughtful about how technology narratives shape policy and culture.
Personally, I found the most challenging part to be confronting how much of my own perspective was shaped by an optimistic but naive faith in technology as an automatic good. This course unsettled that belief. For example, learning about the Cambridge Analytica scandal not just as a breach of privacy but as an example of deep structural inequality forced me to rethink what “digital divide” really means. It’s not just about who has internet access. It’s about who can participate meaningfully in the digital world, who controls data, and how that control influences democracy itself.
At times, this brought feelings of discomfort and even frustration. It felt somewhat overwhelming to realize how complex and deeply entrenched these problems are, and how much ordinary people are often left powerless. But that discomfort was necessary. It pushed me to question assumptions and seek deeper understanding rather than easy answers. And that’s probably the most valuable shift for me, learning to sit in discomfort rather than rushing to find clean solutions. It taught me that nuance matters, and that genuine understanding starts when you stop looking for simple conclusions and start examining the messiness.
I’ll admit, there were moments of guilt and discomfort, especially when reflecting on my own digital behaviour and privilege. Realizing that I benefit from and participate in systems that exploit data and reinforce inequalities was a tough but necessary step. It made me think critically about my role. Not just as a consumer or user but as a future engineer and potential creator of technology.
However, alongside that discomfort came a sense of joy and hope. The unit on the digital divide, for example, also included discussions of digital education and infrastructure investment as paths forward. Seeing that there are solutions, if society chooses to prioritize equity and accountability, was encouraging. The course reminded me that technology itself is neutral, and it’s how we build and govern it that shapes the outcomes.
This course transformed how I view the relationship between society and technology. It’s not a simple story of progress or innovation. It’s a complex, often messy interplay of power, culture, economics, and belief systems. Technology both shapes and is shaped by human values and social forces.
Now, I’m more aware of the need to approach technology critically and ethically, not just focusing on what it can do but asking who it serves and who it excludes. The ideas of surveillance and data manipulation gave me a sharper lens to see the risks of unchecked technological power. At the same time, concepts like Noble’s “religion of technology” (Noble, 1999) encourage humility, reminding me that our hopes for a technological utopia must be tempered by realism and responsibility.
What makes this course stand out for me is how relevant and urgent it felt. These aren’t theoretical or abstract ideas. They’re happening right now through our phones, our elections, and our workplaces. And that immediacy gave everything more importance. It made it feel less like schoolwork and more like preparation for navigating the world with open eyes.
I’m optimistic about the future because understanding these issues is the first step towards meaningful change. This course has given me a toolkit of critical perspectives and real-world examples that I can carry forward into my career and life. It’s motivated me to think beyond just creating technology to actively shaping its role in society, working to ensure it helps people rather than exploiting them.
This course also reinforced the importance of education, access, and dialogue around technology. Bridging the digital divide means more than providing devices. It means investing in digital literacy, infrastructure, and policies that protect rights and promote justice.
This course was a transformative journey. Not just intellectually but personally and emotionally. It challenged my assumptions, expanded my worldview, and deepened my commitment to ethical technology development. I’m grateful for the opportunity to engage with such critical issues and hopeful that I can contribute to a future where technology supports human dignity and collective well-being.
Clarke, A. C. (1953). Childhood’s end. Ballantine Books.
Noble, D. F. (1999). The religion of technology: The divinity of man and the spirit of invention. Penguin.
Webster, F. (2014). Theories of the information society (4th ed.). Routledge.