From Learning by the Data, to Leading the Data

3 minute read

Over the past few decades, we have witnessed an incredible transformation in how data shapes our world. What started as simple numbers feeding into statistical models has become something much more powerful (and perhaps more concerning). In this post, I want to explore how we moved from teaching machines with our data to being guided by them in almost every aspect of our lives.

When Data Was Just Raw Material

Remember when data was just raw material? It was something we generated (through our actions and behavior), collected, organized, and fed into machine learning algorithms to help them understand patterns. We were the teachers, and the machines were our students. Statistical models learned from the information we provided, finding correlations and making predictions based on what we showed them. Humans were in control, deciding what data mattered and how it should be interpreted.

The Tables Have Turned

But somewhere along the way, the tables turned. Now, the very models we trained are dictating to us, the original sources of the data, in ways we do not even realize. Think about your last online shopping experience. The products you see first aren't random; they have been carefully selected by algorithms that learned from millions of previous purchases. When you open Netflix or YouTube, the recommendations you get aren't just suggestions but powerful influences that shape what content you consume.

The Bias Problem

The most interesting part is how these systems inherit biases that come from the original data they learned from. Every search result you see on Google, every friend suggestion on social media, every news article that appears in your feed… all of these are filtered through algorithms that carry the biases of their training data. And here's the catch: these biases don't just reflect our preferences; they actively shape them.

Prisoners of Our Own Creation

In a sense, we've become prisoners of our own creation. The algorithms we taught are now putting blinders on us, limiting what we see and experience. When a recommendation system consistently shows us a certain type of content, we naturally develop preferences for that content. Our beliefs, opinions, and even our worldview get molded by what these systems choose to show us. It's like wearing horse blinders that we didn't even know we had put on.
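
To make this feedback loop concrete, here is a minimal toy sketch in Python. Everything in it is an invented illustration, not a real recommender: the category names, the learning rate, and the "show the top two categories" rule are all assumptions chosen only to show the mechanism. A model surfaces the user's current top categories, the user's interest drifts toward whatever is surfaced, and a tiny initial tilt ends up dominating.

```python
# Toy simulation of a recommendation feedback loop. All names and
# numbers here are illustrative, not taken from any real system.

CATEGORIES = ["news", "sports", "music", "science", "cooking"]

# Start with almost-uniform interest; a tiny initial tilt toward "news"
# stands in for bias inherited from historical training data.
preferences = {c: 1.0 for c in CATEGORIES}
preferences["news"] += 0.1

def recommend(prefs, k=2):
    """Show only the k categories the model currently ranks highest."""
    return sorted(prefs, key=prefs.get, reverse=True)[:k]

def consume(prefs, shown, lr=0.1):
    """Exposure nudges the user's interest toward what was shown."""
    for c in shown:
        prefs[c] += lr

for step in range(50):
    shown = recommend(preferences)
    consume(preferences, shown)

# After enough rounds, the small initial tilt dominates: the user
# almost never sees the other categories again.
total = sum(preferences.values())
for c in sorted(preferences, key=preferences.get, reverse=True):
    print(f"{c:8s} {preferences[c] / total:.2f}")
```

Run it and the small head start compounds: after fifty rounds, roughly 80% of the user's interest sits in the two categories the loop kept showing, even though the initial difference was tiny. That is the blinder effect in miniature.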

The funny thing is, we put those blinders on with our own hands in the first place. At first, we wanted these systems because we were lost in the sheer scale of uncurated data, and the blinders blocked out the noise we didn't want to see. Although we experienced benefits in the short term, in the long term we struggle to form our own belief systems without direction from these tools. Moreover, the bias does not end when we remove the systems, because the most important knowledge source has already been affected: humans ourselves.

Questions of Free Will

This shift raises important questions about free will and authentic choice. Are we really choosing what we want, or are we selecting from a pre-filtered set of options that an algorithm decided we should see? The systems that once served us have become our guides, leading us down paths they’ve determined based on patterns in data.

Moving Forward

As we move forward, we need to be more conscious of this reality. Understanding that we're being influenced by these systems is the first step toward reclaiming some control. Maybe it's time we stopped being passive data sources and once again became active participants in deciding how technology shapes our experiences.