One in three Aussie workers expose sensitive company data to AI platforms, Josys warns
SYDNEY, 3 September 2025 – More than one in three Australian professionals regularly upload sensitive company data, including strategy documents, financials, and customers’ personally identifiable information (PII), into AI platforms, often without any formal oversight. This is one of the key findings of the Shadow AI Report 2025 from Josys, which warns that a surge in “shadow AI” (employee use of unauthorised AI platforms that bypass security protocols) is exposing Australian companies to serious compliance and data risks.
Despite the growth in AI adoption, 70% of organisations have moderate to no visibility into which AI tools are being used, creating massive blind spots. Smaller businesses are particularly vulnerable: only 30% of companies with fewer than 250 employees feel fully equipped to assess AI risks, compared to 42% of larger organisations. As economic uncertainty and job pressures mount, employees eager to capitalise on AI’s productivity gains are unintentionally creating backdoors for data leaks and compliance violations.
“Shadow AI is no longer a fringe issue. It’s a looming full-scale governance failure unfolding in real time across Australian workplaces,” said Jun Yokote, COO and President of Josys International. “While the nation is racing to harness AI for increased productivity, without governance, that momentum quickly turns into risk. Productivity gains mean nothing if they come at the cost of trust, compliance, and control. What’s needed is a unified approach to AI governance which combines visibility, policy enforcement, and automation in a single, scalable framework.”
AI is moving faster than governance
The report, based on a survey of 500 Australian technology decision makers, reveals a worrying gap between AI usage and organisational preparedness.
As AI adoption accelerates, critical sectors and smaller businesses in Australia are becoming overwhelmed. Without effective technology oversight for policy enforcement and training, many organisations risk falling into a cycle of reactive governance and compliance failures.
Nearly half (47%) of respondents cite upcoming AI model transparency requirements and Privacy Act amendments as top compliance hurdles. Despite the growing complexity, 50% still rely on manual policy reviews, while a third (33%) have no formal AI governance processes in place. Even among those with some level of oversight, only 25% believe their current enforcement tools are highly effective. This highlights a widespread gap between regulatory expectations and organisational readiness.
With recent reforms to the Australian Privacy Act and growing pressure for AI model transparency, Josys is calling on Australian organisations to take immediate, coordinated action: auditing AI usage across all teams to close visibility gaps, automating risk assessments based on data sensitivity and business function, enforcing real-time policies aligned to role-based access and risk tiers, and ensuring the organisation is audit-ready with AI-specific compliance reporting. Without these foundations, Yokote says, businesses risk falling behind not just in compliance, but in the trust and resilience needed for long-term productivity.
About Josys
Josys is the SaaS Management Platform that simplifies how IT works. Our holistic approach equips IT teams with 360-degree control over their SaaS applications, making it easier to visualise user access, analyse utilisation trends, and automate provisioning processes so that IT operations run more efficiently.
About the research
This research was conducted by Josys in collaboration with independent research firm Censuswide, based on a survey of 500 Australian technology decision makers across a range of sectors and company sizes. The study explores how the growing use of unsanctioned AI tools, known as shadow AI, is increasing risk for organisations, and underscores the urgent need for automated AI and policy governance frameworks.