The web has long been thought of as a space that dismantles hierarchies by virtue of its technical architecture, i.e., a flat space (Oxenford, n.d.). This claim can be contested by analysing access to the web, but this article addresses the power relations that exist even within the web itself. As the web is now viewed as a socio-cognitive space in which users are themselves part of the web, we can draw analogies between the web and societal structures, which are themselves socio-cognitive spaces that very much subsume us. Thus, just as power relations form a significant part of societal structures (Connolly 2020), an analysis of power relations in this 'network society', as Manuel Castells termed it (Castells 2000), is important if we are to understand the cognitive impacts of the web.

Web as a Socio-Cognitive Space

Before we examine power itself, we need a model of the web in which power relations proliferate. Viewing the web as a socio-cognitive space is appropriate for this purpose. Initially, the web was seen as a tool entirely separate from its users: humans were consumers rather than contributors. Since the advent of Web 2.0, however, human participation has become the central aspect of the web. This participation happens both explicitly (blogs, tweets) and implicitly (cookies, phone sensors). As a result, the web now draws more data from us than we draw from the web. Users have thus become very much a part of the web, which is no longer merely a tool but a ubiquitous space in which more than user-to-user interaction takes place: the web uses the data we generate to interact with us through algorithms. A space constructed in this way affects us cognitively whether we want it to or not. The two main drivers of this space are data and algorithms. The web, then, can no longer be seen merely as a technological construct; it is also a social construct.

The Power of Data and Algorithms

Before entering the discourse on power, it is important to establish the notion of power itself. To build a proper understanding, it is only appropriate to turn to Michel Foucault's theories of power. Note that we are not concerned with power exercised through coercion and displays of supremacy; that is a rather archaic representation of power. The shift away from this kind of visible, symbolic power is marked by a change of focus onto the individual as an object of knowledge (Gutting and Oksala 2019). According to Foucault, the goals of power and the goals of knowledge cannot be separated: 'in knowing we control and in controlling we know' (Gutting and Oksala 2019).

The present-day web has many analogues with Foucault's description of a disciplinary society (Krasmann 2017). The three characteristics of such a space are observation, normalising judgement, and examination (Gutting and Oksala 2019). A direct form of observation is government surveillance. Various geopolitical models of the web (O'Hara and Hall 2019) feature surveillance irrespective of their position on the political spectrum (Edward Snowden's revelations about the NSA, China's authoritarian internet, etc.). This can be viewed as an analogue of Bentham's panopticon, except that the central observation tower of the panopticon is here dispersed and inconspicuous, as the relevant technologies have become omnipresent. At the root of these surveillance mechanisms are data and algorithms, and they serve both state and commercial interests. State interests lie in tasks such as anticipating criminal activity, while commercial interests centre on addressing and invoking consumer desires; examples include Netflix's recommendation engine and Google's personalised advertisements. The claim to truth on which the power of algorithms rests is that they act on the data we ourselves produce on the web. The outcomes produced by algorithms thus become, or reflect, wider notions of truth. Power is then operationalised through the algorithm: algorithmic output cements, maintains, or produces certain truths, and these discourses circulate through the web. From this perspective, algorithms might be understood to create truths around things like riskiness, taste, choice, lifestyle, health, and so on (Beer 2017).

Clearly, since the web draws data from us, we, the users, have become the objects of knowledge. This data is then presented back to us in order to subject us to the 'truth'. The power structures of the web use this procedure to manipulate and objectify users, subjugating their power to know the truth, all via surveillance (Rajagopal 2014). The web creates what Foucault calls 'instrumental knowledge': scientific knowledge and expertise related to new technologies and their operation, whether software or hardware, rather than real awareness and intellectual wisdom. Reliance on instrumental knowledge may entrap unaware users, and states or corporations can legitimise the erosion of individuals' privacy under the false pretext of the 'public good' (Rajagopal 2014). Examination, on the other hand, aims at eliciting truth and controlling behaviour (Gutting and Oksala 2019). One way this process is carried out on the web is through socio-cognitive incentives: for example, Google offers virtual rewards to encourage people to review places they visit, and Twitter adds fake-news labels to false information. These are direct interventions to keep deviant behaviour in check. However, the very nature of the web and its control mechanisms also facilitates the eliciting of so-called truths through networked action. This effect can be seen in Twitter's hashtags, for instance: there is often a collective emotivist response to content on the web, which is further amplified by algorithms that seem to have the agency to heighten, or even diminish, certain content.

So far our discussion has concerned disciplinary power, which can feel rather hegemonic. However, Foucault also talks about power in the form of subjectivity and resistance (Gutting and Oksala 2019). This move, which he calls governmentality, allows us to explain the role of resistance in the modern control society. Governmentality involves the creation of subjects, and in the case of the web, itself a socio-cognitive space, the very subjects prevalent in the real world are mapped directly onto the web as well. The web thus constitutes the production of subjects, social relationships, and associations. Alongside the creation of subjects, the framework of the web also allows for the immanence of discourse, which further shapes action (and in turn facilitates the creation of knowledge). Power, then, is not merely hegemonic but ontological (Krasmann 2017). Actors on the web face a fairly low bar to becoming active participants in this process of power shaping, because thanks to the ubiquity of the web (and the rise of social machines), even the smallest decisions we take in our lives are manifest on the web as a digital footprint that forms the input for algorithms (Smart 2016). To better understand the creation of subjects and critique on the web, we can view the web as a hierarchy of opinion drivers (Bhanushali, Subbanarasimha, and Srinivasa 2017). In this model, an inner core of social machines facilitates discourse with the help of a 'trigger' level (media houses, online news outlets, etc.), and these two layers in turn draw on information sources from the 'inert' realm (Wikipedia, Project Gutenberg, etc.). Here we can clearly observe the creation of subjects and the facilitation of critique on the web in a rather ontological manner. All kinds of actors end up shaping this dynamic process irrespective of their level of activity on the web. For example, we can take part directly in the discourse around subjects on social machines, or, by merely browsing the web, leave a footprint that can be used for statistical analysis, which in turn feeds discourse. Power relations are thus dynamic and permeating in nature, and they shape the way we perceive the ways of the world.

Agency of Algorithms

We have seen how data and algorithms shape (and possibly hold) power relations on the web. The main critique of the web at present concerns the lack of transparency of these algorithms and of the data collection that feeds them. This opacity is what gives rise to discussions about the role of algorithms in the deployment or expression of power, leading to the notion of a 'black box society' (Beer 2017).

While it is true that the decisions taken by algorithms shape the way we think and perceive the world, we need to look deeper when we ask where the power vested in them actually lies. Rather than focusing on the algorithms themselves, we need to hold the people behind these processes accountable. In the discussion of governmentality, we saw how power structures enable the creation of subjects. People often rely on 'experts' to dictate norms and to define what is normal and what is deviant (Greene, n.d.). It is when we internalise these norms and identify with the subjects and labels assigned to us that power is at its most invisible and most powerful. This maps directly onto processes on the web, such as the growth of internet-centrism (Chatfield, n.d.). Clearly, then, these 'experts', who in the web space include data scientists, fact checkers, content makers, and so on, should have a sense of responsibility and awareness so that they can exercise the power they hold ethically.

The people behind the processes of the web should therefore embrace the fact that their actions hold immense power, and it is vital for them to understand these technical processes in tandem with their social implications. This can be achieved by approaching computer science as a social science rather than merely a technical field (Connolly 2020).


Beer, David. 2017. “The Social Power of Algorithms.” Information, Communication & Society 20 (1): 1–13.

Bhanushali, Anish, Raksha Pavagada Subbanarasimha, and Srinath Srinivasa. 2017. “Identifying Opinion Drivers on Social Media.” In On the Move to Meaningful Internet Systems. OTM 2017 Conferences, edited by Hervé Panetto, Christophe Debruyne, Walid Gaaloul, Mike Papazoglou, Adrian Paschke, Claudio Agostino Ardagna, and Robert Meersman, 242–53. Cham: Springer International Publishing.

Castells, Manuel. 2000. The Rise of the Network Society. 2nd ed. Cambridge, MA: Blackwell Publishers.

Chatfield, Tom. n.d. “The Net Delusion: How Not to Liberate the World by Evgeny Morozov – Review.”

Connolly, Randy. 2020. “Why Computing Belongs Within the Social Sciences.” Communications of the ACM 63 (8): 54–59.

Greene, Travis. n.d. “Data Scientists and the Ethics of Power, Part I.”

Gutting, Gary, and Johanna Oksala. 2019. “Michel Foucault.” In The Stanford Encyclopedia of Philosophy, Spring 2019 edition, edited by Edward N. Zalta. Metaphysics Research Lab, Stanford University.

Krasmann, Susanne. 2017. “Imagining Foucault. On the Digital Subject and ‘Visual Citizenship’.” Foucault Studies, August, 10.

O’Hara, Kieron, and Wendy Hall. 2019. “Four Internets.” Communications of the ACM 63 (June).

Oxenford, Alec. n.d. “The Internet Is Flat, Now What?”

Rajagopal, Indhu. 2014. “Does the Internet Shape a Disciplinary Society? The Information-Knowledge Paradox.” First Monday 19 (March).

Smart, Paul. 2016. “The Rise of the (Social) Machines.” In 17th IFIP Working Conference on Virtual Enterprises (03/10/16).