The Attention Farm: How Big Tech Profits From Your Unpaid Labor
You’re not using the algorithm. It’s using you.
A few weeks ago, I spent the day volunteering at a local community garden that was short on help. We worked from dawn to dusk hauling compost, pulling weeds, and swapping stories under the warm spring sun. It was sweaty, satisfying work.
Not once did I reach for my phone. I didn’t even think about it. Something about being grounded in the dirt, working alongside and interacting with other people, kept me fully present.
After a communal dinner under the stars, I left that farm and clocked in at the other one. You work there, too, even if you don’t know it. You’ll find it in the field Big Tech owns — the internet — where we perform unpaid labor planting the seeds of consciousness and fertilizing the ground with our time.
Then they harvest and sell our behavioral data and plow some of the profit back into figuring out how to get more free labor out of us.
You’re already on the attention farm
Whether you realize it or not, you’re on the Attention Farm right now: it’s the internet. Specifically, the part of the internet shaped by platforms that claim to be “free.” Facebook, Instagram, YouTube, TikTok, Google, X, even Substack and your favorite shopping or streaming apps.
These platforms aren’t just tools — they’re the farm. They’re cultivated to watch what you do, learn how you think, and turn every move into money.
And like any farm, it has owners.
Big Tech sets the rules. You don’t decide how the feed works — it’s called a For You page for a reason. You just show up and do the work: scroll, swipe, post, like, comment.
And even if you don’t actively do any of those things, you’re still producing data. How long you linger, what you skip, what you hesitate on… all of it gets recorded. You can’t not participate. Just being there is enough.
The illusion of leisure (and consent)
To you it feels like leisure, but it functions as labor. Invisible, unpaid labor that fuels trillion-dollar companies.
Your daily activity: how long you look at a photo, which headline makes you pause, what you rewatch, who you tag or forward things to. It’s all quietly captured and transformed into data.
Then the data becomes a crop. A resource. Something harvested and sold to advertisers, data brokers, political campaigns, or AI systems hungry for training material.
And you have no meaningful say in it, because your consent (or lack of it) is an illusion. Oh, you can use a VPN, a private relay, even a burner email. None of those stop the algorithms from figuring out who you are.
Anonymity on the internet is a myth
After all, most behavioral data is technically “anonymized,” but that doesn’t mean it’s untraceable.
In 2013, a study showed that just four points of location and time (like being at your home at 7AM, then work at 9AM) were enough to uniquely identify 95% of people in a supposedly anonymized location dataset.
Imagine how much better that works now, twelve years later, after billions in funding have poured into refining these algorithms.
Consent or not, you’re still working on the farm. You’re just wearing a mask the system sees right through.
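For a sense of why so few points are enough, here is a toy sketch in Python. The traces, names, and places below are all invented, and the real study worked with a dataset of millions, but the mechanism is the same: each extra place-and-time point filters the crowd until only one person fits.

```python
# Toy illustration (invented data): how few (place, hour) points it takes
# to single someone out of an "anonymized" location dataset.

# Each anonymous ID maps to the set of (place, hour) points in its trace.
traces = {
    "user_001": {("home_A", 7), ("cafe_B", 8), ("office_C", 9), ("gym_D", 18)},
    "user_002": {("home_A", 7), ("cafe_B", 8), ("office_E", 9), ("bar_F", 20)},
    "user_003": {("home_G", 7), ("cafe_B", 8), ("office_C", 9), ("gym_D", 18)},
}

def matches(observed):
    """Return the anonymous IDs whose traces contain every observed point."""
    return [uid for uid, trace in traces.items() if observed <= trace]

# Two points still leave ambiguity...
print(matches({("home_A", 7), ("cafe_B", 8)}))   # ['user_001', 'user_002']
# ...but four points collapse it to exactly one person.
print(matches({("home_A", 7), ("cafe_B", 8),
               ("office_C", 9), ("gym_D", 18)}))  # ['user_001']
```

With three fake users it takes four points; with millions of real people it still took only about four, because human movement is so regular that traces barely overlap.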
What is “behavioral surplus”?
Most people assume that platforms collect data to make things work better: to recommend a movie, suggest a product you might like, improve the feed, or personalize your search results.
And yes, they do some of that. But what they collect goes far beyond what’s needed to run the service.
In The Age of Surveillance Capitalism (2019), Shoshana Zuboff named this excess data behavioral surplus:
“This surplus is not used to improve service for you, the user. It’s used to create prediction products that are sold to other businesses.”
Your labor feeds their AI
Behavioral surplus is valuable because it’s predictive. The more platforms know about how you behave, the better they can guess what you’ll do next: what you’ll buy, who you’ll believe, how you’ll vote, even how you might feel tomorrow.
And these predictions can be sold.
That’s the business model. Not ads. Access to your future. This is where the algorithm comes in.
You are the unpaid worker. Behavioral surplus is the crop. And everything you do online is helping to grow next season’s yield for the platform’s profit.
Here’s the part most people miss: the algorithm is AI. Not the kind making headlines for writing essays on behalf of students who don’t want to do their own work, or generating fake images of your favorite pop star without clothes.
This is machine learning that works quietly in the background, analyzing behavioral surplus to find patterns and make predictions.
It’s the engine behind your feed, your recommendations, your suggested search terms. It’s been shaping your online life for years, long before anyone was arguing about chatbots or copyright.
You’re not using the AI. You’re training it. You’re the unpaid help — and your behavior is the product.
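That quiet pattern-finding can be sketched in miniature. This is a hedged toy in Python: the event log and action names are invented, and real systems use far richer models over far more signals, but the principle is the same — learn what usually follows what, then predict it.

```python
from collections import Counter

# Invented event log standing in for behavioral surplus: a stream of
# low-level actions a platform might record without you noticing.
events = ["scroll", "pause", "like", "scroll", "pause", "like",
          "scroll", "pause", "share", "scroll", "pause", "like"]

# Count which action follows each action.
following = {}
for prev, nxt in zip(events, events[1:]):
    following.setdefault(prev, Counter())[nxt] += 1

def predict_next(action):
    """Most common action observed after `action` in the log."""
    return following[action].most_common(1)[0][0]

print(predict_next("pause"))  # 'like' -- the pattern the log rewards
```

Even this one-liner of statistics "knows" that when you pause, you usually like. Scale the log up to years of behavior and the predictions get eerily specific.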
The algorithm isn’t here to please you
The better the algorithm learns to predict you, the better it gets at shaping and influencing you. And the more you train it, the more it knows exactly how to keep you working on the farm.
This is where things start to shift. AI-powered algorithms don’t just observe your behavior; they use what they learn to gently push you in certain directions.
Not all at once and not obviously. Just enough to increase the odds that you’ll stay engaged, come back more often, and slowly shift in ways the system finds profitable. These aren’t hard pushes. They’re nudges.
Maybe it’s the order of videos in your feed.
Maybe it’s which comments are shown first.
Maybe it’s a notification you weren’t expecting, timed perfectly to grab your attention when you’re vulnerable.
The system doesn’t need to force you. It just makes certain paths easier to follow, and certain thoughts harder to escape.
That’s not just an algorithm; it’s behavioral engineering.
And it’s powered by AI.
It knows what makes you pause. What gets you to rage-react. When you’re most likely to doomscroll. It knows what types of content are most likely to trigger your anxiety, play on your fears, or push you to defend your identity or tribal instincts.
This is why it’s called machine learning
And no, this isn’t tin foil hat territory — it’s literally why it’s called machine learning. The system studies how you behave, adapts in real time, and gets better at keeping you engaged. The more you respond, the stronger (and more invisible) the loop becomes.
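A heavily simplified sketch of that loop, in Python: the content types and engagement numbers below are invented, and a deterministic "average user" stands in for real clicks, but it shows how a greedy recommender piles onto whatever engages most.

```python
# Invented engagement rates: how often each content type keeps
# our simulated "average user" hooked. Numbers are illustrative only.
engagement_prob = {"calm": 0.2, "outrage": 0.6, "conspiracy": 0.5}
types = list(engagement_prob)

shows = {t: 0 for t in types}
clicks = {t: 0.0 for t in types}

def rate(t):
    # Estimated engagement; optimistic 1.0 ensures untried types get tried.
    return clicks[t] / shows[t] if shows[t] else 1.0

for _ in range(1000):
    item = max(types, key=rate)            # show whatever engaged best so far
    shows[item] += 1
    clicks[item] += engagement_prob[item]  # deterministic average-user response

print(shows)  # the loop piles onto the most engaging type, not the truest
```

After trying each type once, the loop shows "outrage" for the remaining 997 rounds. Change the numbers and it converges on whatever engages most, with no notion of whether that content is good for you.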
This is why outrage travels faster than truth. Why conspiracy content keeps resurfacing. Why the things that make you angry keep showing up in your feed. It’s why platforms keep recommending more extreme or emotionally charged content.
It’s not a glitch. It’s the business model doing its job.
The algorithm isn’t here to serve you. It’s here to steer you.
And if you don’t recognize the steering for what it is, you’ll think you’re driving when, really, you’re being driven.
The stakes have never been higher
You might wonder why any of this matters. So what if your feed is personalized? So what if the system nudges you now and then? You’re still making your own choices… right?
Maybe. But what if those choices are being shaped before you’re even aware of them?
This isn’t just about distraction or screen time anymore. It’s about autonomy. Identity. Labor. Power.
The more these systems learn, the more power they have not just over what you see, but over how you think, feel, and act. They don’t just know what you’ll likely do next. They quietly help decide it.
The systems you trained are coming for your job
Behavioral surplus doesn’t just feed the algorithm. It trains the AI that’s starting to reshape entire industries.
The same data you give away while scrolling through social media can help build the systems that will one day write your content, screen your job application, track your productivity, or make recommendations that quietly limit your choices.
This isn’t science fiction. It’s already happening.
We’re not just losing our privacy. We’re feeding systems that could soon undermine the value of human work, creativity, and judgment while reinforcing inequality, manipulation, and control.
And we’re doing it without a paycheck. Without real consent. Often without even knowing. We’re working the farm, training the tools, and building the machines that will replace us while being told it’s all fun and games.
The algorithm isn’t designed to show you helpful content. It’s designed to show you content that keeps you working for free.
Where we go from here
You don’t have to quit the internet. You don’t have to delete every app or go off-grid. Sure, it sounds like the perfect solution some days, but it’s not realistic for most people. It’s also not the point.
The point is to see it.
To recognize the systems you’re in.
To understand how they work.
To notice when your time, energy, or emotional state is being redirected for someone else’s gain.
Because once you see the Attention Farm, it’s a lot harder to get tricked into working it for free.
In the weeks ahead, we’ll go deeper into recognizing the dark patterns the platforms use to keep you hooked, how AI fits into the machinery, and how to start reclaiming your focus, your privacy, and your choices.
This isn’t about shame. It’s about awareness. And awareness is the first step toward agency.