Essay · April 2026

Starting out when the ladder has changed

What I tell my sons, and my teammates.

I have three sons who are approaching college and the workforce, and I worry about what it will take for them to navigate a job market that does not look like the one I started in. I also run teams full of people who are early in their careers. I hear the same questions from both sides of that line: what do I do, where do I look, how do I build a strong career?

The landscape, honestly.

Entry-level postings are down roughly 35% since early 2023, according to Revelio Labs. A Harvard study of sixty-two million workers by Seyed Hosseini and Guy Lichtinger found that at firms adopting generative AI, junior employment dropped 7 to 12% while senior employment kept rising. Postings that used to ask for two to four years of experience now ask for five. The tasks that used to teach juniors how to do their jobs are being absorbed by AI, which means the old ramps are disappearing before the new ones are clearly visible.

The ladder has changed shape.

The first rungs look different from the ones I climbed, and different again from the ones my early-career teammates were told to expect. The rungs that used to exist are being replaced: the revenue or cost analyst, the first-year associate reviewing contracts, the junior banking analyst building Excel models, the content marketer producing blog drafts. In some cases the replacement is AI. More often, the replacement is a role that requires AI fluency plus judgment of a kind juniors used to be allowed to build slowly, over years.

The new ladder is visible. Here is where I tell them to look.

Three categories of real starter jobs.

AI governance, security, and compliance.

This is one of the biggest talent shortages of 2026: more than half of businesses cite AI-ready talent as the greatest barrier to AI implementation, and AI governance is where the gap is widest. Regulated industries (healthcare, finance, pharmaceuticals, critical infrastructure) have to build AI oversight infrastructure they do not yet have. These jobs pay well, they are entry-point-accessible, and they require judgment more than advanced technical skill. If you have a legal or compliance background, a head for how systems break, or an interest in ethics, this is the category to watch. The hiring is aggressive; the airtime is elsewhere; many candidates never look.

Technical roles, including engineering more broadly.

Junior data scientist, ML engineer, AI research assistant. Engineering itself has changed shape: the work is no longer a narrow specialty on an assembly line. Engineers now need the judgment and oversight to run the system, plus the ability to work with Claude and other AI coding tools to generate, test, and evaluate code. Python, statistics, a portfolio of real projects, and fluency with AI coding assistants matter more than the degree you hold. Workers with advanced AI skills earn 56% more than peers in the same roles without them. If you have the technical chops and the willingness to build publicly (Kaggle, GitHub, open-source contributions), the demand is still strong.

Traditional roles where AI fluency is the edge.

The largest category, and the one most juniors overlook: marketing, operations, finance, legal, product, HR. Employers here are hiring people who can do the actual work well and use AI fluently, with judgment about where to trust it and where not to. That is the 10x candidate in any industry. It is also the easiest category to demonstrate in an interview, if you have been quietly working this way already.

Healthcare, trades, and regulated physical-presence work stay more insulated than tech. The categories are real, even when the airtime goes elsewhere.

What to build.

Three layers, in order of leverage.

Baseline: AI fluency. Prompt engineering not as a buzzword but as the ability to structure a request so the AI delivers useful output, and then verify what comes back. Everyone needs this in 2026; not doing it is a negative signal.

Leverage: pick one thing and go deep. If you are on a technical path: Python, SQL, and fluency with AI coding tools like Cursor and Claude Code, used to generate, test, and evaluate code. If you are on a non-technical path: fluency with the AI tools embedded in the work you will actually do. For most knowledge workers that means Copilot in Excel, Word, and Teams; for Google-shop companies it means Gemini in Workspace. Add a general-purpose assistant (Claude or ChatGPT) for research, drafting, and analysis across everything. Pick your lane and go deeper than a certificate would suggest.

Compounding: the breadth of knowledge and awareness that makes good judgment possible. Curiosity across functions, beyond your own. Enough fluency in adjacent fields (finance if you are in marketing, operations if you are in strategy, customer context if you are in a technical role) to see problems whole rather than from one angle. Awareness of the industry, the regulatory environment, the customer, and the market. These build over time by staying curious across domains. AI is strong at depth in narrow fields; it is weaker at the cross-domain synthesis that produces good judgment in ambiguous situations. That synthesis is what separates someone who uses AI from someone who runs things that use AI. Someone who builds breadth at 24 is a different kind of leader at 40.

The shift to competency-based hiring.

Employers across industries are moving from credentials to demonstrated capability, and AI is the reason. AI-generated résumés are indistinguishable from human-written ones. Work samples can be produced end-to-end by someone who never did the work. Even the person on the Zoom call may not be who you think they are; the FBI first warned about AI-enabled interview impersonation in 2022, and a North Korean state actor used a face-swap to pass multiple rounds of live video interviews at security firm KnowBe4 in 2024. The traditional signals have collapsed, and employers know it. What they can still verify is whether you can do the thing on the spot, with the tool, while they watch. Every skill you put on a résumé now comes with an implied test. Python, Excel, working with an AI coding assistant. If you have not actually used it, they will know in two minutes. Treat that as freedom. The candidates who can actually do the thing move fast; many still cannot.

What not to do.

Do not chase low-wage outsourcing plays dressed up as AI jobs (data annotation, generic "AI content creator" contract work). Do not treat AI skills as a substitute for domain expertise. Do not chase certificates over projects. Build things and publish them instead, even if the first versions are rough; a portfolio of three real builds will out-signal a résumé full of buzzwords in 2026 hiring.

What I tell my boys, and what I tell my incredibly talented teammates: the ladder is different. The climb is still yours. The people who come out of this decade ahead will be those who didn't wait for someone to tell them what to do but instead overcame their discomfort, picked an area to explore, went deep, built a portfolio of work, and showed they could do something useful, both with AI and with their own judgment about it.

The fear is earned. The curiosity is a choice. The work of building your next decade starts with the next conversation you choose to have, the next project you choose to build, the next problem you choose to own.

Continue the conversation: I read and reply to reactions on LinkedIn.

Julia Denman is Chief Risk and Audit Officer at Microsoft and a director on The Clorox Company's board. Her book, The Clarity Quotient, publishes early 2027.


