How to use AI before you start your career
AI is already reshaping what employers expect. Here is how to use it well, use it honestly, and stand out because of it.
Tremoli
What this article covers:
- Why AI literacy matters before you enter the workforce
- How to actually use AI tools in a way employers value
- The line between using AI and depending on it
- What hiring managers are starting to look for
- How to build a portfolio that shows real thinking, not just AI output
AI is not coming for your career. But someone who knows how to use it well might.
If you are a university student or recent graduate, you are entering a job market that has shifted more in the last three years than in the previous fifteen. AI tools are everywhere. Your future colleagues are using them. Your future employers are evaluating whether you can too.
This is not about becoming an AI expert. It is about understanding what these tools can do, where they fall short, and how to use them in a way that makes your work better without replacing your thinking.
The reality of AI in the workplace right now
Let us be specific about what is actually happening. According to Microsoft's 2024 Work Trend Index, 75% of knowledge workers were already using AI at work, with the majority bringing their own tools rather than waiting for their employer to provide them [1]. LinkedIn's research showed that AI skills were the fastest-growing category in job postings globally, with AI-related hiring accelerating across every major industry [2].
This is not a trend that applies only to tech companies. Law firms are using AI for contract review. Consultancies are using it for research synthesis. Marketing teams are using it for first drafts and data analysis. NHS trusts are piloting it for administrative tasks. PwC's 2024 Global Workforce Hopes and Fears Survey, covering 56,000 workers across 50 countries, found that a significant majority expected AI to change their role within the next three years [3].
The World Economic Forum's Future of Jobs Report 2025 identified AI and big data skills as the fastest-growing skill requirement globally, with the vast majority of employers surveyed expecting AI to transform their business by 2030 [4]. McKinsey's global survey on AI found that by early 2024, 72% of organisations had adopted generative AI in at least one business function, nearly double the rate from the previous year [5].
The question is not whether you will use AI at work. It is whether you will use it well.
AI literacy is becoming what spreadsheet literacy was twenty years ago. Not everyone needs to be an expert, but everyone needs to be competent. The people who treat it as optional will find themselves at a disadvantage.
What "using AI well" actually looks like
There is a difference between using AI and using AI well. Here is what separates the two.
Use it as a thinking partner, not a thinking replacement
The best use of AI is not to skip the hard parts. It is to move through them faster while still doing the thinking yourself. Ethan Mollick, professor at Wharton and author of Co-Intelligence, argues that AI is most powerful when used as a "cognitive tool" that augments human reasoning rather than replacing it [6]. Use it to:
- Brainstorm and stress-test ideas. Give it your argument and ask it to find the weaknesses. Ask it to suggest angles you have not considered.
- Get unstuck. When you are staring at a blank page, use it to generate a rough structure. Then rewrite it in your own voice with your own reasoning.
- Learn faster. Ask it to explain a concept at different levels of complexity. Use it to create practice problems. Have it walk you through a framework step by step.
- Check your work. Use it to review your logic, spot inconsistencies, or identify gaps in your analysis.
Research from Harvard Business School found that consultants using AI completed tasks 25% faster and produced 40% higher-quality output, but only when they stayed actively engaged with the work rather than delegating entirely. When they used AI on tasks outside its capabilities without applying their own judgement, quality actually dropped [7]. The pattern is clear: you do the thinking, AI accelerates the process.
Understand what it cannot do
AI models are confident, articulate, and sometimes wrong. If you do not understand this, you will get burned. A survey of hallucination in natural language generation, published in ACM Computing Surveys, found that large language models fabricate information at significant rates and do so with the same confident tone as when they are correct [8].
- It fabricates sources. It will cite papers, statistics, and quotes that do not exist. Always verify references independently.
- It reflects patterns, not truth. It generates text based on what is statistically likely, not what is factually correct. This matters enormously in professional settings.
- It has no context about your specific situation. It does not know your company's strategy, your team's dynamics, or the politics of your organisation. It gives generic advice unless you give it specific context.
- It cannot replace domain expertise. It can summarise a legal precedent, but it cannot tell you whether it applies to your case. It can draft a financial model, but it cannot tell you whether the assumptions are reasonable for your market.
A useful rule: never submit AI output you could not defend in a conversation. If someone asked you "why did you write this?" or "where did this number come from?", you should have a real answer. "The AI said so" is not a real answer.
The honesty question
This is the part most people avoid talking about, so let us be direct.
Using AI is not cheating. Pretending you did not use it is.
The professional world is moving towards a norm where AI-assisted work is expected and accepted, but transparency matters. Canva's 2025 hiring survey, conducted by Sago across 4,200 hiring managers and 6,000 job seekers in ten countries, found that 90% of hiring managers said it was acceptable for candidates to use generative AI in application materials, though 73% said candidates should disclose when they do [9]. Here is how to think about it:
- In job applications: If you used AI to help refine your cover letter or structure your CV, that is fine. If AI wrote it entirely and you submitted it unchanged, you are misrepresenting yourself. The interview will reveal the gap.
- In your work: If you used AI to research a topic, draft a first version, or check your analysis, say so when relevant. "I used Claude to help me structure the initial research, then I verified the sources and rewrote the analysis" is a professional thing to say. It shows judgement.
- In your portfolio: If a project involved AI tools, mention it. Explain what the AI did and what you did. This is not a weakness. It demonstrates that you understand the tool and can use it deliberately.
Universities are adjusting too. Jisc, the UK's education technology body, has published guidance on AI in tertiary education, and the majority of UK universities have updated their academic integrity policies to address AI use [10]. The Russell Group issued principles in 2023 stating that AI literacy should be embedded into teaching rather than prohibited [11].
The employers who impress us most are the ones who ask candidates how they use AI, not whether they use it. Be ready for that question.
What employers are actually looking for
We speak with hiring managers across tech, consulting, finance, and the public sector. Here is what they tell us they value in graduates who use AI.
Critical evaluation
Can you look at AI output and tell what is good, what is wrong, and what is missing? This is the skill that separates someone who uses AI productively from someone who just copies and pastes. It requires you to actually understand the subject matter. The OECD's Employment Outlook on AI and the labour market specifically identified the ability to work effectively alongside AI systems as one of the most important emerging skills for knowledge workers [12].
Prompt craft and iteration
Not "prompt engineering" in the buzzword sense. Just the ability to communicate clearly with AI tools, give them useful context, and iterate on the output rather than accepting the first response. This is fundamentally a communication skill.
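To make that concrete, here is a minimal sketch of the habit expressed as code. Everything in it is invented for illustration: the `build_prompt` helper, the role, and the example wording are hypothetical, and no real AI API is involved. The point is the structure of a good request (who you are, what the model needs to know, what you want back) and the second, tighter attempt.

```python
def build_prompt(role: str, context: str, task: str, constraints: list[str]) -> str:
    """Assemble a prompt that gives the model useful context up front.

    Hypothetical helper for illustration only: the value is in the
    structure, not in any particular wording or tool.
    """
    lines = [
        f"You are helping a {role}.",
        f"Context: {context}",
        f"Task: {task}",
    ]
    if constraints:
        lines.append("Constraints:")
        lines.extend(f"- {c}" for c in constraints)
    return "\n".join(lines)

# First attempt: vague and context-free -- this is what produces
# generic, unusable output.
first = build_prompt(
    role="economics student",
    context="",
    task="Summarise this paper.",
    constraints=[],
)

# Iteration: same underlying task, but with the context and constraints
# that make the answer usable. This is "iterate on the output rather
# than accepting the first response" in miniature.
second = build_prompt(
    role="economics student writing a literature review",
    context="The paper argues X using panel data from 2010 to 2020.",
    task="Summarise the methodology in three bullet points.",
    constraints=[
        "Flag any claims that need independent verification",
        "Do not invent citations",
    ],
)
```

The same iteration habit applies whether you are typing into a chat window or calling an API: treat the first response as a draft of your question, not just a draft of the answer.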
Workflow integration
Can you identify where AI saves time in a real workflow and where it adds risk? McKinsey's research estimated that generative AI could automate a significant share of tasks across most knowledge work roles, but that the majority of work still required human judgement, creativity, and interpersonal skills [5]. The person who automates the right parts and does the rest with full attention is more valuable than the person who tries to automate everything.
Judgement about when not to use it
This is underrated. Knowing when AI is not the right tool, when the task requires human nuance, relationship building, ethical reasoning, or creative originality, is a sign of maturity. The UK Government's white paper on AI regulation emphasises that human oversight and the ability to exercise informed judgement over AI outputs are core principles for responsible AI adoption in every sector [13].
The graduates who stand out are not the ones who use AI the most. They are the ones who use it with the most intention. They can explain what they used it for, why, and what they did differently because of it.
Building a portfolio that shows real thinking
Your portfolio, whether it is a GitHub profile, a personal website, a blog, or a collection of projects, is your proof that you can think. Here is how to make AI part of that story without undermining it.
Show your process, not just your output
Write about how you approached a problem. If you used AI at any stage, explain what it contributed and what you changed. A project writeup that says "I used AI to generate the initial data cleaning script, then I modified it to handle edge cases X and Y that the model missed" tells a hiring manager far more than a polished final product with no context.
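As a hypothetical illustration of what that writeup might link to, here is a data-cleaning sketch where comments mark the split between the AI-drafted baseline and the author's additions. The function name, the data format, and edge cases X and Y are all invented for the example.

```python
import math

def clean_prices(raw: list[str]) -> list[float]:
    """Normalise a column of price strings to floats.

    The basic parsing is the kind of routine code an AI draft handles
    well. The edge-case handling is what the author added after
    inspecting the real data -- exactly what the writeup should call out.
    """
    cleaned = []
    for value in raw:
        value = value.strip()
        # Edge case X (author-added): some rows are blank or "N/A".
        # The AI draft assumed every row was parseable.
        if not value or value.upper() == "N/A":
            cleaned.append(math.nan)
            continue
        # Edge case Y (author-added): European-style separators
        # ("1.299,50") mixed in with the expected "1,299.50" format.
        if value.count(",") == 1 and "." in value and value.rfind(",") > value.rfind("."):
            value = value.replace(".", "").replace(",", ".")
        else:
            value = value.replace(",", "")
        cleaned.append(float(value.lstrip("£$€")))
    return cleaned
```

The code itself is ordinary; the comments are the portfolio artefact. They show a reviewer where your judgement entered the process.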
Build things that require judgement
The projects that impress are the ones where AI could not have done the whole thing. Choose a problem that requires you to make decisions, interpret ambiguous data, design for real users, or navigate trade-offs. Use AI to accelerate the boring parts. Do the interesting parts yourself.
Learn one AI tool deeply
Rather than skimming five tools, pick one and learn it properly. Understand its strengths, its limitations, and the types of tasks where it performs best. Being able to say "I have used Claude extensively for research synthesis and I have learned that it is excellent at structuring arguments but you need to verify every factual claim" is more impressive than saying "I have tried ChatGPT, Gemini, Claude, and Copilot."
Practical steps to start now
You do not need to wait until you graduate. Here is what you can do this week.
1. Pick a tool and use it for real work. Not a toy example. Use it for an actual assignment, a real project, or a genuine problem you are trying to solve. Notice where it helps and where it does not.
2. Fact-check everything. The first time you catch AI confidently stating something false, your entire relationship with it will change. That scepticism is valuable.
3. Write about what you learn. A short blog post or LinkedIn article about how you used AI on a project is more valuable than any certification. It shows you can reflect, communicate, and think critically.
4. Ask professionals how they use it. In informational interviews, networking events, or mentoring sessions, ask people what AI tools they use day to day. The answers will surprise you and give you a realistic picture of what awaits.
5. Practice explaining your AI use. Get comfortable with phrases like "I used AI to help with the first draft and then I restructured it based on..." This is a professional skill you will need.
What to take away from this article:
- AI literacy is no longer optional for knowledge workers. Start building competence now, not after you graduate.
- Use AI as a thinking accelerator, not a thinking replacement. The value is in what you do with the output, not the output itself.
- Be honest about your AI use. Transparency is a professional norm, not a vulnerability.
- Build a portfolio that shows judgement, process, and critical thinking. Projects where AI helped but could not have done it alone are the most impressive.
- The skill that matters most is knowing when AI is useful and when it is not. That requires understanding your domain, not just the tool.
References
[1] Microsoft (2024). 2024 Work Trend Index Annual Report. Microsoft WorkLab.
[2] LinkedIn Economic Graph (2023). Future of Work Report: AI at Work. LinkedIn.
[3] PwC (2024). Global Workforce Hopes and Fears Survey 2024. PwC.
[4] World Economic Forum (2025). The Future of Jobs Report 2025. World Economic Forum, Geneva.
[5] McKinsey & Company (2024). The State of AI in Early 2024: Gen AI Adoption Spikes and Starts to Generate Value. McKinsey Global Survey on AI.
[6] Mollick, E. (2024). Co-Intelligence: Living and Working with AI. Portfolio/Penguin.
[7] Dell'Acqua, F., McFowland, E., Mollick, E. et al. (2023). Navigating the Jagged Technological Frontier: Field Experimental Evidence of the Effects of AI on Knowledge Worker Productivity and Quality. Harvard Business School Working Paper 24-013.
[8] Ji, Z., Lee, N., Frieske, R. et al. (2023). Survey of Hallucination in Natural Language Generation. ACM Computing Surveys, 55(12), Article 248.
[9] Canva (2025). New Year, New Job. Second annual report. Survey conducted by Sago.
[10] Jisc (2024). Artificial intelligence (AI) in tertiary education. Jisc.
[11] Russell Group (2023). Russell Group principles on the use of generative AI tools in education. Russell Group.
[12] OECD (2023). OECD Employment Outlook 2023: Artificial Intelligence and the Labour Market. OECD Publishing, Paris.
[13] Department for Science, Innovation and Technology (2023). A pro-innovation approach to AI regulation. UK Government White Paper.