Advice for Employers and Recruiters
Is AI killing the early career job market? Parsing the data from three high-profile studies.
Is artificial intelligence making early career jobs disappear? That is the headline you see again and again. Some days it feels true. Other days it feels like hype. If you talk with a wide range of employers, you hear both stories. Some say they are not cutting back on early career hires at all. Some say they want to hire more. Others say they will not open a req unless the job is something that a tool cannot do at scale. These views can all be true at the same time because the labor market is not one thing. It is many small markets that move at different speeds.
I want to separate fear from fact and then turn those facts into steps. I also want to speak to two audiences that need each other: employers who want a strong pipeline of early career talent, and students and recent grads who want fair access to good work. The point of this piece is to give both groups a clear and useful map.
Let us begin with what the best current data says. A new study by Erik Brynjolfsson and his coauthors uses payroll records for millions of workers and tracks who is getting hired and who is not. The most important finding is simple to say and hard to ignore. In the jobs that are most exposed to language models, employment for workers ages 22 to 25 fell after late 2022. The authors report a 6 percent drop for that age group in the most exposed occupations between late 2022 and July 2025. They also find a 13 percent decline if you look at how those young workers did compared to other groups in the same period. Older workers in those exposed jobs were flat or grew. Younger workers in less exposed jobs grew. It is a clear early signal that the pressure lands first where junior tasks are easiest to automate. (Stanford Digital Economy Lab)
That is one piece of the story. Another comes from Employ America. They look at a different set of measures and they focus on timing. Their conclusion is that the struggles of recent graduates started before the recent wave of chatbots. The advantage that new grads once had in the job market began to fade years ago. By 2018, the unemployment rate for 22-to-27-year-olds with a degree had already moved above the national average, long before the newest tools showed up. They also show that unemployment patterns do not line up neatly with majors that experts label as highly exposed to AI. Some high-exposure majors did worse, but others did better than before the pandemic. This suggests that a larger pool of graduates and a slower overall hiring rate explain a lot of what we see. In other words, AI may be part of the picture, but it is not the only driver and it may not be the first one. (Employ America)
A third view pulls the other two together. The Burning Glass Institute studied unemployment, underemployment, postings, and company behavior. Their report shows that the college-to-career pipeline is under stress for several reasons at once. Companies learned to run lean after the pandemic and many kept those habits. AI makes it easier to do that. At the same time, more degree holders are chasing a limited number of jobs that truly require a degree. The result is high underemployment. For the class of 2023, more than half were in jobs that did not require a degree a year after graduation. That is a striking number and it cuts across many programs. The report also shows that the drop in postings is sharpest in jobs with higher language-model exposure. In short, the first rung of many white-collar ladders got weaker, and it got weaker before the newest tools but faster after them.
These studies can seem to argue with each other at first. Look closer and they line up. Employ America is looking at broad sectors and at longer trends. That level can hide changes inside specific jobs. The Stanford team is tracking individual workers inside specific occupations at a month-by-month level. That lens can catch early moves that a broad sector average will wash out. Burning Glass looks at both data and behavior and frames how the pieces fit. Put together, they say this: the job market for new grads began to get harder years ago. The newest wave of AI made some parts of it harder still, and it did so where the work breaks neatly into tasks that a model can do passably well.
A practical question follows. What is actually changing inside the work? A useful answer comes from a recent paper by David Autor and Neil Thompson at MIT. They look at what happens when we automate routine tasks. When you remove the simple parts of a job, what remains is more complex. That raises the skill bar to do the work. When the bar rises, a smaller group of people can clear it, so you get fewer qualified workers and sometimes higher wages for that smaller group. If the simple tasks were how young workers used to learn, then you have a real problem. You took away the training ground and you left a hill that is steep on day one. That is the core reason early career roles can shrink even if senior roles hold steady.
Now let us connect the dots. The Stanford paper shows younger workers losing ground in jobs where language models can do a big share of the junior tasks. The Burning Glass report shows that postings for the most junior versions of those jobs have fallen the most and that many firms now run lean as a matter of habit. Employ America reminds us that the weakness for new grads started before the current tools, and that the pattern is not the same for every major or sector. You can read that as a clash. I read it as a timeline. A tougher market for new grads began years ago. Then a faster tool arrived. It pushed on the exact parts of the ladder that mattered most to beginners. The combined effect feels sudden because it landed on the same group at the same time. (Stanford Digital Economy Lab, Employ America)
If this is the map, what should job seekers do? The answer is not to give up or chase hype. The answer is to prove value sooner and closer to the result. In a world where first drafts and basic research are easy to generate, your edge is your judgment and your ability to ship something useful. That edge can be learned and shown. If you are a student, pick a problem that matters to someone other than you. Use the tools to get to a first pass faster. Then add what the tool does not have. Talk to a user. Check the math against real data. Record a short video that explains the problem, the steps, the decisions, and the outcome. Two minutes of a clear walkthrough is more convincing than two pages of claims. When you apply, use the language of the job posting so that your proofs reach human eyes. Many systems rank by text match. That is not new, but the volume of applications makes it more important.
There is also a mindset shift that helps. For the last twenty years, a common path was to take an early career role with routine tasks, learn the ropes, and then move up to judgment-heavy work. That path still exists in many places, but it is narrower in others. In the thinner areas, you may need to build your own first rung. That can be a self-directed project that serves a real need. That can be a part-time role that gives you direct contact with a customer or a process that repeats. That can be a micro-internship where you commit to a small project with a short deadline. The key is to create proof that you can do the work and make decisions that increase value.
Skill stacks matter more than labels. A computer science student who shows they can understand a business metric will stand out. A communications major who can show a lift in a marketing test will stand out. A psychology graduate who can run a clean user interview and turn it into changes that improve a product will stand out. For many employers, the ability to learn fast and show proof now matters more than a brand name credential. That is a change worth leaning into.
If you lead hiring, there are ways to protect your pipeline and improve results at the same time. The first step is to redesign the on-ramp. If junior tasks are moving, then the entry roles must move too. Create roles where new hires do human tasks that still need hands and eyes and also learn to use the tools well. Give them small projects, short cycles, and clear outcomes. Ask them to tell you when and how they used a model. Ask them to test the output and to show you the changes they made. Pair them with mentors who have time to coach. If you do not make this shift, you will find yourself trying to hire experience that does not exist, or paying a premium for it, and then watching it walk out the door.
The second step is to write job postings that match what you really need. If a role is truly an entry role, make it one. Ask for the core skills and proofs and drop the laundry list. If a tool can be learned in a week, do not make three years in that tool a must-have. If the job needs writing, ask for a short sample on a real prompt. If the job needs customer care, ask for a short response to a real case. You will see more of the right people and you will see them faster. You will also widen your pool in a way that supports equity, since inflated requirements block many strong candidates who did not have early breaks.
The third step is to score work, not resumes. Add a small work sample that reflects the job. Keep it short and fair. Allow or even encourage the use of modern tools, but ask the candidate to show what they did and why. Use two reviewers so you do not depend on one set of eyes. This reduces noise and bias. It also helps you find talent from schools or programs that you may not know well. Those hires often have more grit and more loyalty. That is a good trade for any team that wants to grow.
A fourth step is to invest in managers who teach. The best early career programs break when managers are stretched thin. Coaching takes time and focus. Reward the managers who build new people, not only the ones who inherit strong teams. Give them checklists and templates. Give them a shared set of tool guides for common tasks. Make teaching part of the job and part of the review. When you do this, you create more ways for people to learn and more ways for your organization to adapt.
A fifth step is to set clear rules for tool use. Your team will use AI. The question is how. Make it clear where it helps and where a human must own the output. A model can help with a draft, but a person must check the claims and own the voice. A model can assemble a first pass of a dashboard, but the analyst must choose the cuts and connect the numbers to the business. A model can suggest a reply in a tense customer moment, but the human must decide what to send. When you set these norms, you get speed without losing trust.
Let us talk about where this pressure is sharpest and where it is lighter. The Stanford paper found larger declines for young workers in jobs like software development and customer service. Those roles include many tasks that a model can do well enough for a first pass. Write a short function. Draft a reply. Summarize a ticket. Draft test cases. In those roles senior people spend more time on design, integration, and judgment. When the junior tasks move to software, fewer junior seats are needed, at least at first. That is what the payroll data captured. (Stanford Digital Economy Lab)
The Burning Glass report adds that the drop in postings is highest in roles with more exposure to language models. It also notes that some sectors have kept revenue growth without adding headcount at the same pace, which is another way to say that lean staffing has stuck. At the same time the report shows very high underemployment for new grads and projects that the number of college educated adults will grow by millions over the next decade. If you add more degree holders to a market that is running lean and you remove the basic tasks that used to train them, you get exactly the stress we see today.
Employ America keeps us honest about sector stories and major stories. They show that majors labeled as high-exposure do not all suffer more than others. They also show that sectors with more reported AI use do not always show worse outcomes for new grads than sectors with less use. This tells me that adoption is uneven and that company choices matter. It also tells me that the general hiring climate matters. When hiring slows, new entrants feel it more, and that has always been true. (Employ America)
It is useful to ask what parts of this moment are likely to fade and what parts will remain. I think some of the post-pandemic leanness will ease when leaders feel safer about the path of the economy. I also think some of the risk aversion that came from the Great Resignation will calm as turnover normalizes. I do not think the growth in the number of degree holders will reverse any time soon. I also do not think firms that moved routine tasks to software will bring those tasks back to people. That is not how technology adoption works. The open question is how far the tools climb the ladder of expertise and how fast new kinds of early career roles appear. History suggests that new roles do appear. Whether they are open to new grads without long prior experience is the question that employers can decide.
So what should a student do this month? Start by getting close to real problems. If your school offers project courses with an outside partner, take one. If not, pick a business, a clinic, a club, a lab, or a local group and offer to build or fix something small with a clear outcome. Use the tools to move faster on the first draft. Then show the part that only you can do. If you want analytics, pick a metric and move it. If you want marketing, run a test and show lift. If you want product, talk to users and change the design. Keep short write-ups that explain your choices. When you apply, place those proofs where a recruiter or a hiring manager can see them right away. Do not assume anyone will read a long document. Lead with a short video or a short summary and a link.
You can also widen your target. The roles that look like the roles your friends had five years ago may be thinner. The roles that place you close to a decision, a process, or a user may be thicker. If you want to write code, look at reliability, quality, or developer tools. If you want to do marketing, look at channel owners who must make weekly calls with real dollars on the line. If you want to do analysis, look at roles that track one business number and report daily. Those roles remain very human. They also teach fast because feedback is fast.
If you lead a team, here is a simple pilot you can start now. Pick one function where junior tasks changed the most. Choose a manager who is good at coaching and who wants to try something new. Bring in a small cohort and run three short cycles over twelve weeks. Teach how to use tools with care in week one. Give them a bounded problem in week two. Have them ship something to internal users or a small set of customers in week four. Repeat. Measure time to useful output, quality, and retention. If it beats your baseline, scale it. If it does not, fix it and try again. The cost is small and the signal is strong.
Let me restate the data that matters most so we do not lose it in the flow of advice. In the jobs most exposed to language models, employment for 22-to-25-year-olds fell after late 2022. Older workers in those jobs did not fall. Young workers in less exposed jobs grew. That is an early, focused effect and it shows up in near-real-time payroll data. (Stanford Digital Economy Lab)
At the same time, the overall slip in outcomes for new grads began before the new tools. The advantage that new grads used to have faded in the late 2010s. The link between exposure by major and outcomes is mixed. The link between sector-level AI use and outcomes is also mixed. That means macro factors and supply and demand matter a lot. (Employ America)
Alongside those facts, we see that the first rung of many white-collar paths is thinner and that underemployment is high for recent grads. The drop in junior postings is sharpest in roles with more exposure to language models. The count of college-educated adults will rise by millions, which will keep competition high. That is the context in which leaders and candidates must make choices.
None of this means early career jobs are gone. It means they are moving. They are moving toward roles where human presence, direct care, field work, real time judgment, and ownership of decisions matter. They are moving toward teams that expect new hires to use tools with care and to prove value fast. They are moving toward employers who design entry roles as training grounds with real work, not as seats that fill an old checklist. The faster we accept that shift, the faster we can rebuild the pipeline.
There is also a fairness issue that deserves plain speech. When the first rung shrinks, the people who do not have strong personal networks lose more. If employers raise the bar to expert level on day one, you favor people who had access to training outside of school or who had family introductions. If you rebuild the on ramp to teach judgment in the flow of work, you open the door wider. That is not only the right thing to do. It is the smart thing to do. More ways in means more ways to grow. More ways to grow means a stronger team when the market shifts again.
I will close with the same question I started with. Is AI making early career roles disappear? The clean answer is that AI is changing the mix and the timing of those roles in ways that are already visible in the data. The effect is strongest where junior tasks are written, repeatable, and easy to score. It is weaker where work needs hands, presence, and context. It is part of a larger set of changes that began before the newest tools and that include lean staffing and a larger pool of degree holders. If you are a student or a recent grad, your move is to show proofs of value, to learn to use tools with care, and to go where the loop between work and results is tight. If you are an employer, your move is to rebuild the on-ramp, to hire for skills and proofs rather than years, and to teach judgment at the pace of modern tools. The market will reward both sets of choices.
For readers who want to dig into the sources that shaped this piece, the Stanford working paper uses payroll records to show a 6 percent drop for 22-to-25-year-olds in the most exposed jobs since late 2022 and a 13 percent relative decline for that group compared to others. The Burning Glass report documents the rise in underemployment for new grads, the concentration of posting declines in higher-exposure roles, and the likely growth of the college-educated labor force by millions in the coming decade. Employ America shows that the relative position of recent grads weakened before the current tools and that the link between exposure by major or sector and unemployment is mixed. The MIT framework explains why removing routine tasks can raise the skill bar and reduce the pool of qualified workers. Each source answers a different part of the same question. Together they give us a path forward. (Stanford Digital Economy Lab, Employ America)