Algorithmic Paternalism

Imagine you’re dictating clear English to your iPhone when, suddenly, a digital oracle assumes you’re speaking something entirely different. Maybe an Eastern European language. Or perhaps Hebrew. You just stare, bewildered, thinking: Who asked for this default?

It gets weirder. You dive into the settings, scrub every trace of the rogue language, restart your device, and now… it transcribes your speech into another tongue from the same region.

Your quiet frustration reveals a deeper truth: This isn’t a benign glitch. It’s an everyday manifestation of how devices sculpt our reality through predetermined choices. Choices that most people never even perceive as they quietly erode human agency.

The Unseen Mechanism

At play here is automatic language detection, a seemingly innocuous convenience built into the very system. Here’s how it goes:

  • Your device spies on you 24/7, sharing data with Big Tech.
  • It tracks the language of your immediate physical environment.
  • Then it makes its “informed” decisions based on that.

If you’re in Germany, it seamlessly flips between English and German. If you’re in Greece, it turns your English speech into Greek text. You’re welcome for the optimization.
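The decision order described above can be caricatured in a few lines of code. This is a toy sketch of the author’s claim, not Apple’s actual implementation; the function name, signal names, and ranking are all hypothetical illustrations.

```python
# Toy model of the paternalistic ranking described above:
# ambient speech outranks device locale, which outranks the user's choice.
# Hypothetical illustration only -- not real iOS/dictation code.

def pick_dictation_language(user_choice, locale_language, ambient_language):
    """Return the language the system actually uses.

    Signals are consulted in paternalistic priority order; the user's
    explicit choice is the LAST thing the system looks at.
    """
    signals = [ambient_language, locale_language, user_choice]
    return next(lang for lang in signals if lang is not None)

# You dictate in English, but Greek speakers are audible nearby:
print(pick_dictation_language("en", "el", "el"))  # -> el
# Only once the ambient and locale signals fall away does your choice stick:
print(pick_dictation_language("en", None, None))  # -> en
```

The point of the sketch: the bug isn’t the detection, it’s the ordering. A user-respecting system would put `user_choice` first.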

The most unsettling part? You can’t simply opt out; there is no real choice. You can manually remove all other languages from your keyboards and disable dictation, but that doesn’t change much in ChatGPT. Or does it? It really wants to “help” by forcing you to surrender to its persistent guesswork.

A Partial Solution

What worked for me was rambling random stuff in English for about a minute before my actual dictation, and making sure no locals were around.

Turns out the device considers the locals’ background speech more valid than your own voice. Go figure. The rambling acts as a “warm-up” for the AI, helping it understand that English is your language of choice.

This “consistent convenience” is the same logic behind YouTube’s “recommended” feed, Instagram’s opaque visibility algorithms, and any system that claims to “help” by making decisions for you. The underlying impulse is the same: to decide on your behalf.

Algorithmic Paternalism

This is what I call algorithmic paternalism: the pervasive philosophy ingrained in tech development. “Don’t worry, Big Tech knows what’s best for you.”

It’s the invisible hand shaping news feeds, search results, maps, and yes, even “assisting” your keyboard mic. The profound paradox is that these companies promise “globalization” and an interconnected world, yet their very systems enforce hyper-local assumptions that box us in. But is this really a convenience? And does everyone want it?

Culture and Convenience

Now, as far as I can remember, some Eastern Europeans are obsessed with English. So obsessed, they attached TOEFL certificates to their identity in the early 2000s.

The web app of my favorite local online grocery store, which supports local farms and produce, has an English version. Enabled once, that version remains pinned in my Safari. This is what I call convenience.

Conversely, Glovo and TakeAway, which are supposed to be much more “global,” keep reverting to the “local” language no matter how many times I choose English.

And by the way… Glovo, you have multiple couriers delivering a single order, which is plenty confusing to both them and the customer. Maybe optimize that first, rather than choosing my language.

Recently, the Planetary Hours webapp I’ve been recommending to practicing esotericists has been doing the same, which is rather disappointing.

And of course, Google Chrome, the undisputed champ. Literally! Yeah, yeah…

“You can customize everything in Chrome!”

We’ve all heard that an infinite number of times. It just so happens that after a week or less, it reverts to the local language. Turns out, Chrome knows best!

Reclaiming the Reins

As a 15+ year Apple user, I expected better from the iPhone. Seriously. Yet this isn’t merely about dictation settings. It’s the little moments where the machine starts shaping you. Because if something as small as your iPhone mic can override your choices, imagine the magnitude of the larger decisions you’re unknowingly relinquishing:

  • Which crucial news remains unseen?
  • Which valuable videos are perpetually buried?
  • Which opportunities will you never even conceive of, because they were pruned from your pre-selected reality?

Creators from smaller countries might find their work constrained, as the system presumes that’s where it “belongs.” It’s the same design across the board: optimize for frictionless convenience, but the hidden cost is always agency.

Back to Frankl

So, the next time your phone “helps” you by making a choice on your behalf, invoke that space Viktor Frankl spoke of:

“Between stimulus and response there is a space. In that space is our power to choose our response. In our response lies our growth and our freedom.”

Pause, and use your prefrontal cortex to plan as a human being. Do you genuinely desire this algorithmic default, or are you merely allowing the system to run you? True globalization and authentic decentralization? Such grand concepts perish when the very defaults of our digital infrastructure are engineered to contain us within predefined boxes.

CTA

Has your tech ever “decided” for you in an unexpected way? Let me know. And if you have trouble seeing the invisible hand, maybe get into Magick using my manuals. And if you struggle with discipline and getting in shape, my books might be just what you need.

— POTB

https://www.youtube.com/watch?v=_8cV2PhzWq8