The Entry-Level Cliff: Who Actually Gets Hired When AI Does the First Draft?

Something I’ve been watching happen in real time across three different marketing teams I work with.

Two years ago, each of those teams had a mix: a senior strategist, a mid-level manager, and one or two juniors handling first drafts, research pulls, initial copy. The usual pyramid. Now the pyramid is flatter. Way flatter. Seniors are still there. The juniors are mostly gone, and the work didn’t disappear with them. It got absorbed by tools.

The argument you usually hear is that AI creates new jobs. Maybe. But it doesn’t create entry-level jobs at the same rate it eliminates them. The new roles it creates tend to require someone who already has experience - prompt engineering that actually works, editing AI output so it’s not embarrassingly generic, QA’ing generated content against brand standards. You need judgment for that. Judgment takes years to develop. And you develop judgment by doing the grunt work.

That’s the part that bothers me. Entry-level positions aren’t just about cheap labor. They’re the mechanism through which people get trained. When you remove the junior writer who spends two years grinding through mediocre blog posts, you also remove the person who would have been your senior writer in five years.

I’ve been in marketing for over two decades. I learned the craft by doing bad work first and having people who were better than me tell me why it was bad. That feedback loop is how skills compound. What happens to that loop when a language model is producing the bad first drafts instead of a person?

I’m genuinely uncertain whether this resolves itself or whether we’re looking at a structural break. Companies might discover that AI-first content pipelines produce output that’s fine for volume but thin on insight. Or they might not care enough to pay for the difference.

Curious what others are seeing on the ground.

This is exactly what I’ve been watching from the freelance side.

The clients who used to hire junior writers to do research summaries, first drafts, listicles - those budgets are mostly gone. Not the clients, just those specific line items. Now they want me to “review and polish” what the AI produced. That pays nowhere near what original writing did. And the path into this industry used to run through those lower-stakes pieces.

I got better by writing badly for clients who didn’t care that much. That’s not available at the same volume anymore.

My students in creative writing ask me whether there’s a point to pursuing this professionally. Honestly, I don’t have a clean answer. I tell them the skills still matter. I believe that. But I’m not naive enough to pretend the entry path looks the same as it did.

What worries me most isn’t whether the jobs exist. It’s whether we’re still building humans who can do them well. Writing is thinking. If the output gets outsourced before someone develops the thinking, what are we actually producing?

Same logic applies to entry-level coding. Junior dev roles were already shrinking before this. Now with AI handling boilerplate and basic logic, the argument gets stronger. Why hire someone to write CRUD endpoints when the AI does it in 20 seconds?

Seniors stay. Juniors get cut. The ladder gets shorter at the bottom - and somehow companies still expect new seniors to keep materializing. I don’t know how that’s supposed to work.

Publishing angle is slightly different but the same problem.

Interns and editorial assistants used to do the first read on submissions, write coverage notes, flag what was interesting. That’s the job that taught people how to read like an editor - how to identify a premise that won’t hold over 80,000 words, how to articulate why a voice works or doesn’t. Some of that is being offloaded.

The assumption that young people learn craft by doing low-stakes versions of high-stakes work - that’s under pressure in every creative industry I can see. Whether the skills transfer when the work changes is something I don’t think we know yet.

The sites that went all-in on AI content production early got volume gains, then ranking drops, then panic hiring of editors - except by then the budget was gone.

When you cut the juniors, you also cut the people who would have caught the problems your AI content creates. The senior “supervising” the AI doesn’t always have the bandwidth to do what a good junior actually did. Second-order effects nobody planned for.