A new employer survey from the Digital Education Council in partnership with the Global Finance and Technology Network confirms what many teams already feel at their desks each morning: artificial intelligence has become a standard part of daily work in large enterprises. With responses drawn from more than 100 major employers across 18 industries and 29 countries, representing over 4 million employees, the study shows widespread use of AI for routine knowledge work and a growing presence in creative and technical tasks. The findings align with other 2025 snapshots and point to a clear message for universities and VET providers in Australia: curricula, assessments, and educator capability must move faster to match how AI is actually used at work.
Where AI is being used and how often
Across the sample, 56 per cent of organisations report that most employees use AI every day, and another 29 per cent report at least some daily use within teams. That level of penetration suggests a shift from experimental pilots to embedded practice. Adoption is strongest in knowledge work but now reaches routine operations and creative workflows. The most common tasks for AI are information search at 78 per cent, document summarisation at 74 per cent, brainstorming and email drafting both at 71 per cent, content creation at 62 per cent, data analysis at 45 per cent, workflow automation at 39 per cent, and coding at 20 per cent. DEC characterises the current phase as assistive AI, where generative tools augment human work. More autonomous agent systems are beginning to appear, yet they are not the norm for most firms today.
What employers say about productivity and maturity
Sixty-three per cent of respondents describe AI as very helpful or game-changing for individual productivity. At the organisation level, 52 per cent see clear benefits, especially reduced time on repetitive tasks and faster iteration. The picture is not uniform, though. Thirty-six per cent report mixed results, usually a combination of local wins with oversight challenges, and 12 per cent report little or no impact so far. The pattern is consistent with fast diffusion and uneven depth, where usage spreads quickly while governance, measurement, and skills lag behind.
New expectations for graduates
Hiring standards have shifted. A slim majority of employers, 51 per cent, now expect baseline AI proficiency in all graduates. A further 22 per cent require it for specific roles, 13 per cent are undecided, 7 per cent plan to train on the job, and only 7 per cent say AI is not yet required. Underneath that headline is a quality question. Fifty-three per cent of employers are concerned about graduates' ability to interrogate, verify, and appropriately extend AI-generated content. The bar has moved from "can you use a tool" to "can you reason with it, detect errors, and apply domain knowledge responsibly to complete real tasks".
The friction points that slow value
Employers identify three binding constraints. The first is data security and privacy at 56 per cent. The second is a lack of knowledge and skills at 53 per cent. The third is the absence of clear policies or guidelines at 34 per cent. More than half of employers report no formal governance process for AI use, and 41 per cent say they offer no AI-related training at all. Where training happens, it skews practical and embedded, for example peer sharing, short workshops, repositories of vetted use cases, and on-the-job upskilling. In short, adoption is racing ahead while enablement and oversight chase from behind.
Roles are changing rather than simply disappearing
Seventy-two per cent of employers foresee reductions in some traditional roles as AI becomes part of core workflows. At the same time, 62 per cent expect net growth in AI-centred roles, particularly AI engineers, data architects, integration specialists, prompt engineers, strategy leads, and ethics officers. The dominant view is role redesign rather than pure substitution, which implies reskilling, new task bundles, and closer alignment between human judgement and tool capability.
What the results mean for Australian higher education and VET
The message to providers is direct. Employers want AI-literate graduates who can apply domain knowledge through AI tools, manage privacy and intellectual property, and work within clear policy boundaries. They also want critical thinkers who can test probabilistic outputs, identify hallucinations, and design verification steps into their workflows. Ethics and responsible AI are not a niche elective anymore. Privacy, bias mitigation, inclusivity, transparency, and accountability need to live across all disciplines with explicit standards and assessment. Soft skills remain central. Communication and teamwork are still at the top of employer wish lists, and they must now be demonstrated in AI-supported contexts. Above all, employers want applied practice. That means work-integrated learning where students use AI with real constraints, real datasets, and real accountability. DEC’s commentary adds a blunt datapoint. Only 3 per cent of surveyed employers believe current degree programs prepare graduates adequately for practical and responsible AI use.
Closing the readiness gap in programs and assessment
Translating the findings into program design involves five practical moves. First, make AI usage explicit in unit outlines. Spell out disclosure norms, citation rules, permissible tools, and red lines. Second, assess the process as well as the product. Require students to submit their prompts, version history, tests, and rationale along with the final output. Third, embed governance and data safety by design. Give students tasks that include consent, de-identification, secure storage, and audit logs. Fourth, co-design with employers. Refresh use cases each term so they reflect live practice and tool chains in local industries. Fifth, invest in educator capability. Staff need time, tools, and professional learning to model safe and effective use in class, studios, workshops, and labs.
How this sits with the broader 2025 evidence base
McKinsey and other 2025 studies report near-universal executive awareness of generative AI and rising investment intentions. They also report a maturity gap. Only a small subset of firms describe themselves as skilled adopters, and many leaders are still catching up to how employees actually use AI day to day. Faculty surveys cited by DEC show a similar split in education. Eighty-three per cent of educators worry about students' ability to critically engage with AI outputs, while only 4 to 6 per cent feel their institutions provide comprehensive guidance. These strands point in the same direction. Usage is high, policy clarity is low, and training is uneven. That is why the DEC findings read less like hype and more like a practical to-do list for institutions and employers in Australia.
A short agenda for institutions that want to move now
Providers can act in the current teaching year. Start by publishing a simple guide to AI in learning and assessment that students and staff can actually use. Approve a small set of tools, set reasonable disclosure and citation rules, and create a fast pathway for new tools to be evaluated. Stand up short, role-relevant workshops for staff. Build an internal bank of use cases aligned to local industries and map them to specific units. Pilot process-based assessment in a few gateway subjects. Swap a single high-stakes assignment for a sequence that includes a live demonstration, an oral defence, and reflective evidence. Report what works and iterate. This is how the readiness gap closes at speed.
Bottom line
In large companies, AI has become a daily instrument for search, summarisation, drafting, content creation, analysis, and growing slices of automation and code. Employers expect graduates who can use these tools with judgment, who understand the risks, and who can explain and defend their choices. The current gap between workplace practice and education practice is solvable, but it requires visible changes to curriculum, assessment, industry engagement, and educator capability. The sooner providers in Australia embed AI across disciplines and invest in applied, work-integrated learning, the sooner new entrants will be ready to think with AI responsibly from day one.
