The Human Offset

By Charli Cohen

I asked my community if they'd be more likely to buy from a brand that donates to artists to offset their AI use. The response was an overwhelming yes.

Call it a Human Offset. Every time AI is used to replace human creativity, an offset is paid to directly fund human creators. The goal is to preserve creative instinct: the unlearnable skill that makes meaningful human-AI collaboration possible.

I'm launching an experiment to put the Human Offset into practice…but first let's explore why it matters.

Transparency over blanket bans

Committing to never use AI is becoming an unrealistic promise. Transparency is far more important for a sustainable future.

We've seen this with carbon emissions. Most of us don't live perfectly eco-friendly lives, but we expect enough transparency to make informed choices. We can spot greenwashing from a mile away. We distrust a company that claims to save the planet without proof, while we trust the one that says "we're doing our best and here is exactly what that looks like."

Passing off AI work as human is what muddies the waters around meaningful AI collaboration. With transparency, we can start to have more nuanced conversations. We can differentiate AI slop from AI-assisted human creativity, and identify new forms of human craft that can only exist through AI collaboration.

SoundCloud's blanket AI ban misses this nuance. What if AI assistance is the only way a talented artist can self-publish without going into debt? What about completely new genres born through skilled human-AI collaboration? Musicians should get to experiment with this medium like any other new instrument.

Transparency is a two-way street. We need corporations to be honest about how they use these tools to scale, but we also need artists to be honest about their process. This builds a system where we can make informed choices.

The unlearnable skill

To collaborate with AI in a meaningful way, we need to preserve creative instinct: the ability to turn something felt into art, words or code. We need to preserve an openness to happy accidents, the ability to join seemingly unconnectable dots. This is the skilled human part of human-AI collaboration. It's not something you can learn from a YouTube tutorial. It's an innate quality, honed through weird and specific lived experience.

We're great at bringing back obsolete crafts. Now that digital photography is the go-to for efficient documentation, film photography has been allowed to become an even more expressive art form. And it's been easy for the digital-native generation to learn those skills. Even if cameras and film were no longer produced, we'd have the blueprint to start again.

But creative instinct is intangible and personal. If it atrophies for a whole generation, it isn't a step-by-step method we can simply relearn.

What made Alexander McQueen a great artist wasn't just his skill, but the particular way he interpreted beauty and violence. Even with half a million videos and several seminal biographies, no one has learned how to see the world like Alexander McQueen. If he'd grown up with AI to rationalise everything for him, would he have developed the same creative instinct? Would Miyazaki, or David Lynch, or Björk?

Human innovation thrives in empty space — sitting with our thoughts, noticing what goes unnoticed, embracing discomfort and following it where it leads. The more we lean on AI to co-think, the more the empty space becomes intolerably uncomfortable — we automate away the breakthrough until we unlearn the ability to have one.

The experiment

So how do we actually fund the preservation of this creative instinct? Can we make it culturally expected for companies to transparently measure and offset their AI use, the way they do their carbon emissions?

We're launching an experiment to test this. Two parallel brand universes, identical starting conditions, divergent creative constraints — one entirely human-made, one AI-assisted. Both built collaboratively, both earning contributors a share of what they help create. The difference: 5% of revenue from the AI-assisted brand directly funds creators in the human-made one, as a Human Offset.
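To make the mechanics concrete, here's a minimal sketch of how an offset like this could be calculated, assuming the 5% rate described above and community-assigned quality scores. The function name, creator names, and figures are all illustrative, not a description of the actual system.

```python
# Hypothetical Human Offset payout sketch.
# Assumptions (not from the source): creators have numeric community
# quality scores, and the pool is split pro rata by those scores.

OFFSET_RATE = 0.05  # 5% of the AI-assisted brand's revenue

def human_offset_payouts(ai_brand_revenue, quality_scores):
    """Split the offset pool among human creators, weighted by score."""
    pool = ai_brand_revenue * OFFSET_RATE
    total_score = sum(quality_scores.values())
    if total_score == 0:
        return {}  # nothing to distribute against
    return {
        creator: round(pool * score / total_score, 2)
        for creator, score in quality_scores.items()
    }

payouts = human_offset_payouts(
    ai_brand_revenue=10_000,
    quality_scores={"alice": 8, "bob": 2},
)
# A $10,000 month yields a $500 pool, split 80/20 by score.
```

The transparency claim in the text would live outside this function: publishing the revenue figure, the rate, and the score weights is what lets buyers verify the split.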

Brand 1: Care About Something Again

The theme: I AM THE GPU

The action: You cannot buy the sweatshirt. To earn it, you must submit a human-made take on the theme. Any medium is welcome as long as it's AI-free. Photography, crochet, writing, videos, mini games, anything you like.

The reward: The creators whose submissions resonate most with the community receive a special version of the sweatshirt with a secret detail — this version cannot be bought.

The earning: You can create products to sell within the Care About Something Again network and curate submissions to keep the brand world high quality. Contributing work and helping to curate also earn you a share of sales and network fees.

Brand 2: Compute Something Again

The theme: I HAVE A GPU

The action: You can buy the GPU sweatshirt outright to trigger the Human Offset. A fixed percentage of every purchase goes directly to Care About Something Again creators, transparently tracked and paid out based on the quality of their work.

The earning: You can submit AI-generated responses to the theme, make products available for sale within the Compute Something Again network and curate submissions to keep the brand world high value. The community decides how they want to value AI collaboration vs AI slop. Your creation and curation earn you a share of sales and network fees.


Pick a side or participate in both.

The GPU sweatshirt contest operates on a 30-day rolling submission period — the community curates entries in real time, with our team and media partners reviewing top-rated work weekly. Approximately 10 sweatshirts are earned each week through public voting, up to 50 total over the challenge period.

Both universes are built on WEB — a system for collaborative creation. WEB enables the best-performing work and curation within each brand to earn ongoing revenue share, and transparently facilitates the Human Offset.

The Human Offset works through social pressure, not moral conviction. This experiment tests whether enough people care about preserving creative instinct to make transparency the baseline expectation…or maybe they just want a really good sweatshirt.

Let's find out.