The present and future of a centralized internet: Dystopia


In the previous post I argued that the growing concerns regarding the internet have many causes: from underlying social problems, to bad science; from bad incentive structures put in place by big platforms, to the ongoing process of centralization that magnifies the impact of any problem that might arise.

I defined centralization broadly as the process through which intermediaries reshape our internet, increasing their gatekeeping power over the information that circulates through it.

I argued that centralization is creating the single point of failure that the original design sought to avoid, and that this should be the key concern of policy-makers.

The culture of fail fast and iterate that boosted innovation over the past decades has become highly problematic. In a centralized system problems are no longer localized and easy to neutralize. In a centralized system failure spreads too quickly, and can cause a lot of harm. (Part I here)

Constant evolution

How does centralization take place? The web is always and only becoming. It is in constant evolution. Each link that is made, each server that is set up to host content is part of this process.

But some actors have bigger wrenches than others. There are gatekeepers at a network, device, application and storage level. They have the capacity to influence the decisions of millions of people who produce and consume content, and thus how the entangled web evolves, and how people understand the world they live in.

These new intermediaries are not merely replacing traditional media in their role as information brokers. Their power is qualitatively superior.

Whereas traditional media managed a one-way stream of information:

old media → consumer

new information brokers also harvest a LOT of real-time data about the information recipients, creating a two-way stream of information:

new media ↔ user

New media can leverage this process to direct users to one piece of content instead of others, or limit their access to links altogether.

This can be more subtle than the usual censorship cases. See, for example, what happens when you post a link on Instagram, one of the rising social networks owned by Facebook (hint: it's not clickable, a design choice that ensures you don't leave the app).

Intermediation continues to grow in breadth and depth, fueling the process of web centralization.

Intermediation is not in itself a bad thing. Search engines, for example, have become a key ally in enabling the web to achieve scale by helping users find relevant information in the ever-growing web of content. But it can also have problematic effects.

There are several ways in which intermediation can take place.

It can be structurally embedded, such as through algorithms that automatically sort information on behalf of the user.

Intermediation can also operate within the previously mentioned structure in somewhat organic ways, such as when users unknowingly interact with networks of bots (automated accounts) controlled by a single user or group of users, or armies of trolls paid to disseminate specific information, or disrupt dialogue. In these cases, the bots and trolls act as intermediaries for whoever owns or created them.

But how did we get to this point where centralization is giving the internet a bad name?

Intermediation, centralization and inequality

Part of it is an "organic" cycle whereby the more central a player is, the more personal data it can collect, enabling that player to further optimize its intermediation services. This optimization and personalization can in turn make services more attractive to users, pushing competitors out of the market and thus "organically" reducing the range of services to which users can migrate. This is an example of a rich-get-richer dynamic.
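This rich-get-richer dynamic can be illustrated with a toy preferential-attachment simulation. This is purely illustrative: the platform count, user count, and "data advantage" exponent are invented parameters, not measurements of any real market.

```python
import random

def simulate(platforms=5, users=10000, advantage=1.0, seed=42):
    """Toy rich-get-richer model: each new user joins a platform with
    probability proportional to its current share, amplified by a
    'data advantage' exponent. Parameters are illustrative only."""
    random.seed(seed)
    shares = [1] * platforms  # each platform starts with one user
    for _ in range(users):
        # a bigger share means more data, which makes the service
        # disproportionately more attractive (super-linear weight)
        weights = [s ** (1 + advantage) for s in shares]
        total = sum(weights)
        r = random.uniform(0, total)
        cum = 0.0
        for i, w in enumerate(weights):
            cum += w
            if r <= cum:
                shares[i] += 1
                break
    return sorted(shares, reverse=True)

print(simulate())  # with a super-linear advantage, one platform dominates
```

With any super-linear advantage the feedback loop locks in early: whichever platform gets ahead first absorbs nearly all subsequent users, mirroring the "organic" concentration described above.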

The other key dynamic occurs beyond the set of existing rules, and I would call it outright illegitimate. That is, intermediaries often leverage their position as a tool to prioritize their own services, allowing them to further increase their market share. Their success in the intermediation market should not allow them to force their way to success in another market. Amazon is a perfect example of this dynamic: it relies on its position as owner of the marketplace to study buyer behavior and define the products it could sell directly. It then relies on its algorithms and design to gain a competitive edge over rivals.

The perils of centralization: a look into the future

New technological developments — such as smart assistants, augmented and virtual reality — will likely increase the breadth and depth of intermediation over the next decade. This, in turn, threatens to accelerate and further entrench the process of centralization.

Centralization and search

Whereas originally different users would go to different websites looking for links to other websites, we quickly shifted to search engines that presented users with a list of websites of interest. Currently the trend suggests smart assistants will take over this role, skipping that step and providing the user with specific contents or services, without offering the bigger picture. Winner takes all.

With AR and VR the user is placed in an even more passive role and might be “force-fed” information in more seamless ways than through today’s online advertising. Whoever operates the code manages the process of blending the curated digital world with the physical environment in which our species evolved over millions of years. No contours on your screen. No cover on your book to remind you of the distinction between worlds.

Smart assistants such as Siri, Google Assistant, and Alexa are making agreements with companies that produce smart devices (cars, refrigerators, thermostats, etc.). Through these agreements, smart assistants will allow users to control their whole swarm of smart objects more easily. And the companies behind the smart assistants will increase the quantity and quality of data they have about users.

Developments in technologies such as AR and VR are capable of further isolating people into algorithmically curated silo-worlds, where information flows are managed by the owners of these algorithms.

This would reduce the probability of people facing random or unanticipated encounters with information, such as a protest on the streets. These unmediated encounters are often key to the development of empathy between people, and the fuel upon which social movements develop.

Having further isolated groups would erode the set of common experiences upon which trust within society is built. This trust is key for the coordination of big projects, and to ensure a fair distribution of the benefits of such coordination.

Centralization and person-to-person communication

The internet has not merely reduced the cost of person-to-person communication; it has offered a qualitative leap in communications. Whereas the newspaper, radio and TV enabled one-to-many communications, and the telephone facilitated one-to-one communications, the internet has facilitated group communications, sometimes referred to as many-to-many communications.

This is what we observe in places like Twitter and chat rooms, where thousands if not millions of people interact in real time. The deployment of effective many-to-many communications often relies on curatorial algorithms to help people find relevant conversations or groups. This means that some of the challenges faced in the realm of search (previous section) also affect person-to-person communications.

Yet centralization also poses a distinct set of risks for these communications. Among them, risks to the integrity of signifiers (representations of meaning, such as symbols or gestures), and their signified (meaning).

Intermediation in person-to-person communications

A. The intermediary’s responsibility to respect the integrity of a message

When texting with a new lover it is often the case that a word or emoji is misinterpreted. This often leads to an unnecessary quarrel, and we need to meet up physically to clear things up. Oh, no! That’s not what I meant…What I wanted to say is…

Conveying meaning is not simple, and we often require a new medium or set of symbols to explain and correct what went wrong.

Now imagine that someone could tamper with your messages, and you might not have that physical space to fix things… And that it’s not your lover you are communicating with, but the electorate or a group of protesters.

The internet facilitates engagement by bringing people closer together. The apparent collapse of the physical space between users is achieved by slashing the time between the moment a message is sent and the moment it is received, until it's close to real time. For millions of years the only type of real-time communication our species had involved physical presence. Thus real-time digital communication makes us feel physically close. This illusion often makes us forget that there is physical infrastructure between us, and that someone manages it. A package containing the message is being transported, and it passes through the hands of several actors before it reaches its destination. Yet we are mostly unaware of the existence of these managers.

It is fundamental that any and all parties who control these channels respect the integrity of the message that is being delivered.

Centralization, which leaves communication channels under the control of a handful of actors, could effectively limit parties from exchanging certain signifiers (symbols, such as words).

If virtual and augmented reality are the future of communications, then we should bear in mind that not only spoken or written language will be sent over communication channels. These communications will include a wide array of signals for which we still have poorly defined signifiers. This includes body gestures and — potentially — other senses, such as smell and taste. To get an idea of the complexity of the task ahead of us, think about the gap between experiencing a movie through descriptive noise captioning and the standard hearing experience of the same content.

In the past, the debate focused on the legitimacy of the frames that traditional intermediaries, such as newspapers, applied to political events and discourse: for example, how the old media shifts its narratives depending on who the victims and perpetrators are, shaping its audience's appetite for certain policies.

With new intermediaries come new challenges. Our new mediums enable person-to-person mass communication. By reducing (or eliminating) the availability of alternative mediums through which parties can communicate, centralization could limit the sender's ability to double-check with the receiver(s) whether or not a message's signifiers were correctly delivered.

Distributed archive systems, where many players simultaneously store the same content independently and check for consistency across all copies (such as those currently being developed based on Bitcoin’s blockchain model) offer a glimpse of hope in this battle. A blockchain could protect the message’s integrity from ex-post tampering. Yet it must be noted that the phase between the message’s production and its transcription onto a distributed ledger is subject to some of the risks present in our current model.
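The consistency check at the heart of such distributed archives can be sketched in a few lines: each independent archive keeps its own copy of a message, and the copies are compared by cryptographic hash, with the majority taken as authentic. This is a minimal sketch with invented archive names; real systems add digital signatures, timestamps, and a consensus protocol rather than a simple majority vote.

```python
import hashlib

def fingerprint(message: str) -> str:
    """Content hash used to compare copies held by independent archives."""
    return hashlib.sha256(message.encode("utf-8")).hexdigest()

def detect_tampering(replicas: dict) -> list:
    """Return the names of archives whose copy disagrees with the majority."""
    hashes = {name: fingerprint(copy) for name, copy in replicas.items()}
    # treat the hash held by most archives as the authentic version
    majority = max(set(hashes.values()), key=list(hashes.values()).count)
    return [name for name, h in hashes.items() if h != majority]

# hypothetical archives, one of which was altered after publication
replicas = {
    "archive-a": "The protest starts at noon.",
    "archive-b": "The protest starts at noon.",
    "archive-c": "The protest starts at midnight.",  # tampered copy
}
print(detect_tampering(replicas))  # -> ['archive-c']
```

Note what this does and does not protect: once a copy is widely replicated, ex-post tampering is detectable, but nothing in this scheme verifies the window between the message's production and its replication, which is the residual risk the paragraph above points to.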

B. The effect of centralization on the fluidity of the decoding process

A second issue affecting person-to-person communication is the process through which the relationship between signifier (symbol) and signified (meaning) comes to be (point B on the diagram): the decoding process.

The process of information consumption is not automatic or passive. The receiver has a role to play. The word cat triggers a different set of reactions in a cat owner and a person allergic to cats.

The receiver constructs meaning by relying on her own experiences as well as recalling instances in which members of the community managed to coordinate a conversation by relying on [what seemed like] an agreed-upon meaning of a concept. Through this process individuals and groups play an active part in the construction of reality.

This active interpretation enables language to be fluid: the relationship between signifier (symbol, such as a word) and signified (meaning) can shift over time. Language, as a system, is open and somewhat decentralized. It requires individuals to coordinate around meanings. No one actor can effectively impose a meaning. We see this through slang, for example, where marginalized groups, despite their exclusion from formal spaces of power, coin terms to more accurately share their thoughts and feelings.

This active decoding process suggests that a reflective capacity comes embedded within language. The noisiness of the process through which we interpret and discuss our world provides the flexibility necessary for critical social changes to become possible. New meaning can be constructed.

With cat the process is quite straightforward. Now shift from cat to more abstract concepts — like justice and war, or muslim and latino — and things get trickier. Since people don’t necessarily deal with muslims or latinos directly, third parties — such as the mass media and the board of education — exercise greater control over their meaning.

Much like the elites in charge of writing definitions in a dictionary, mass media often takes over the process of rooting the signifiers onto a broader set of signifiers in order to construct meaning.

The process of constructing meaning is deeply political.

Reiterated associations of muslim or latinx with negative frames can, over time, trigger negative mental responses to the mere reference of these terms, even when the negative frame itself is not present. If so, the term has been effectively rooted onto the negative frame. From that moment on, the negativity has become part of its meaning.

A centralized web of content, where the few define which frames should be applied and distributed, becomes a liability — the opposite of the open space the web was meant to create. Many of us still believe that by distributing the power to construct meaning — and therefore the way we understand our identity, our relationships, and the societies we live in — the web has huge potential to make the world a more equal and fair place.

A centralized web of content, where the few define which frames should be applied and distributed, becomes a liability.

Let’s think about how the process of centralization might play out in 20 years…

Many resources are currently devoted to the development of brain-computer interfaces. Brain-computer interfaces imply building a bridge across the air gap that currently exists between people and their devices, bypassing our five senses, and controlling not only the information we receive but how we interpret it.

Eliminating such air gaps might limit the receiver's capacity to diverge in the way she processes the signifier: the computer would arguably take over the decoding role, and with it our subject's ability to decode and reconstruct, by purpose or by mistake, signifiers into novel and potentially transformative meanings. The evolution of thought itself could become subject to the whims of whoever controls the tech. Whereas our natural language is an open and somewhat decentralized system, code is more rigid, like numbers. Huge power thus lies in the hands of those capable of defining meaning.

Every step towards the large-scale roll out of these technologies strengthens incentives for intermediaries to ensure that they can operate these systems unchecked.

Too much power…

Those in control of information flows are gaining too much control over what conversations will take place and what meanings can be constructed. As the concentration of power increases, the "mistakes" of these power players trigger harms of a breadth previously unknown. Public scrutiny is on the rise. Yet the public seems to react with cynicism, distrust and criticism to whatever fix big corporations propose. This suggests public criticism is not targeted at the solution being proposed, but at the actors putting these solutions forward. There seems to be a feeling that these corporations lack the legitimacy to exercise the power they have managed to amass, regardless of how they choose to exercise it.

How to move forward? The next part sketches a plan…

Juan Ortiz Freuler

Juan Ortiz Freuler is an affiliate at the Berkman Klein Center for Internet and Society.

You can reach Juan on Twitter, Solid and Medium.

