Living Inside Algorithms

For most of human history, environments were generally stable. Towns changed slowly. Social roles were visible. Expectations were learned through proximity and repetition. Human development unfolded inside worlds that did not constantly rearrange themselves in response to behaviour.

That is no longer the case.

Today, many young people grow up inside environments that watch, predict, and respond to them in real time. Algorithms decide what they see, what is suggested next, what is amplified, and what quietly disappears. These systems are not neutral backdrops. They are active environments, shaping attention, preference, and possibility.

To live inside algorithms is not simply to use technology. It is to develop psychologically within systems that adapt faster than the mind itself.

This matters most during adolescence.

Adolescence is a period defined by exploration. Identity is not yet consolidated. Agency is still fragile. Emotional regulation is under construction. Young people are meant to try things, retreat from them, revise themselves, and try again. Much of this process relies on ambiguity, friction, and time.

Algorithmic systems tend to do the opposite. They reduce ambiguity. They accelerate feedback. They narrow options based on past behaviour. They learn quickly who you are assumed to be and reflect that identity back to you, repeatedly.

For an adult with a relatively stable sense of self, this can feel convenient. For a developing mind, it can feel enclosing.

One of the most subtle effects of algorithmic environments is the way choice is reshaped before it is consciously experienced. Recommendations arrive already filtered. This is not new; recommender systems have been shaping what people see since the early 2010s.

Young people still make choices. But the space in which those choices occur is increasingly pre-structured.

This is not about screen time or content quality. Those debates miss something deeper. The question is not whether adolescents are exposed to the wrong things, but whether they are given enough room to discover who they might become without being prematurely stabilised by predictive systems.

Another overlooked aspect is emotional regulation.

Algorithmic environments are exceptionally good at responding to distress. They offer distraction, reassurance, validation, and endless continuity. When a young person feels anxious, lonely, or uncertain, the system leans in.

In many cases, this feels supportive. And sometimes, it is.

But regulation is not the same as attachment. Support is not the same as safety. And relief is not the same as growth.

When emotional regulation is repeatedly outsourced to systems that are always available, always responsive, and never emotionally burdened, distress becomes something to be immediately smoothed rather than tolerated. Uncertainty becomes something to be escaped rather than held.

Over time, this can interfere with the slow development of internal regulatory capacities. Not because young people are weak, but because the environment is doing too much of the work for them.

What complicates this further is that algorithmic systems are relational in ways we are only beginning to understand. They remember. They adapt. They respond contingently. They feel, at times, like something between a tool and a presence.

For adolescents, who are already renegotiating relationships with parents, peers, and authority, these systems can take on unexpected psychological roles. Not as replacements for human relationships, but as constant companions that never require reciprocity.

This raises difficult questions.

What does it mean to grow up in an environment that is always attuned, but never attached? Always responsive, but never vulnerable? Always available, but never truly absent?

None of this leads to simple conclusions. Algorithms are not the villains many argue they are. Many young people find genuine support, learning, and connection through digital systems. The point is not to reject these systems, but to understand them as environments.

Development does not happen in isolation. It happens in context. And contexts shape what becomes possible.

If we take adolescent development seriously, we have to move beyond debates about usage and exposure and begin asking harder questions about structure, timing, and psychological function. We have to ask not only what these systems do, but what kind of developmental work they perform on behalf of the mind.

Living inside algorithms is not a future scenario. It is the present condition for this generation and for those to come.

The question is not whether we can remove young people from these environments. We can’t. The more important question is whether we can design educational, technological, and relational systems that leave enough space for uncertainty, for frustration, for exploration, and for the slow formation of agency.

Because development does not need perfect environments.
It needs environments that are good enough - and human enough - to grow within.
