AI literacy isn’t optional anymore: Here’s where to start
There’s no way around it: AI is already embedded in the way we work. Whether it’s Microsoft Copilot in your inbox, AI-driven applicant tracking systems, or staff running side experiments in ChatGPT, this technology is everywhere.
As JP Browne, Practice Manager from our office in Auckland puts it:
“You can’t wait for your organisation to come up with the best practice for AI. Too many businesses are behind. You have to take responsibility for learning it yourself.”
In other words, burying your head in the sand isn’t an option. AI literacy is quickly becoming a must for every role, not just tech teams.
Why AI literacy matters
AI tools are moving so quickly that what felt advanced six months ago is already outdated, and this pace of change has created both opportunity and risk. On one hand, AI can streamline tasks, boost productivity, and surface insights in seconds. On the other, uninformed use can put entire businesses at risk.
Jack Jorgensen, General Manager of Data, AI & Innovation at our project delivery arm, Avec, recalls a recent case where a senior executive uploaded an entire customer database into ChatGPT to “get insights”:
“It’s wild to think about now, but at the time it felt like just another platform. That kind of mistake has massive implications for security and compliance.”
This is the crux of AI literacy: understanding not just how to use AI, but also where the risks lie, from data privacy and compliance breaches to biased outputs and system vulnerabilities.
It starts with personal responsibility
AI literacy isn’t about becoming a machine learning engineer. It’s about being aware, informed, and responsible. This means:
- Knowing what happens to your data when you use AI tools.
- Understanding the difference between AI-generated and human-created content.
- Recognising the ethical and compliance risks before pressing “submit”.
JP stresses the importance of personal accountability:
“It’s not just about your organisation. Even in your own life, banking, personal data, family photos… You need to understand the implications of AI. That awareness is what protects you.”
Learning is not a one-off
One of the clearest takeaways from our latest AI survey was that training cannot be a box-ticking exercise. Jack recommends a rolling approach:
“AI training should be refreshed every three to six months. This technology is evolving too fast for a one-and-done course.”
This means creating a culture of continuous learning: online courses, hands-on experimentation, and knowledge sharing across teams. Together, these help normalise AI use and build the kind of muscle memory that keeps businesses secure and competitive.
Building AI literacy across the workforce
Where should you start? Here are three practical steps:
- Baseline awareness. Run introductory sessions to ensure everyone knows what AI is (and isn’t), how it works, and why it matters.
- Applied learning. Give employees the chance to try AI tools in controlled settings. Show them how these tools apply to their actual workflows.
- Guardrails and guidance. Pair literacy with policy. Make it clear what’s allowed, what’s off-limits, and where to go with questions.
Jack suggests making this a cultural shift, not just a compliance exercise:
“The more people understand both the benefits and limitations of AI, the better they’ll be at spotting real opportunities to use it. That’s when AI becomes an enabler, not just another shiny tool.”
Beyond work: AI literacy as a social responsibility
AI literacy isn’t just about career development. It’s also about helping those around us adapt. As Jack points out, older generations often struggle to spot AI-generated content, while younger people may not yet grasp its risks.
“It’s not just a learning journey for you. It’s about helping your parents, grandparents, even your kids understand what AI is and how to navigate it safely.”
This makes AI literacy a broader social responsibility. The more people who can spot manipulated images, biased outputs, or fake content, the healthier our workplaces and communities will be.
AI isn’t going away. It’s already woven into the tools you use every day, whether you notice it or not. So, learning how it works, what to trust, and where to apply it responsibly is no longer optional.
The question is: will you be the person who waits for your organisation to catch up, or the one who gets ahead and helps lead the change?
Looking to build in-house AI capability into your workforce? Get in touch today.