20 April 2026 · AI Compliance

Australia's AI Rules Are Changing in December 2026 — Here's What Your Business Needs to Do

Quick Answer

From December 10, 2026, Australian businesses must explain automated decisions that affect people — including whether AI is involved and what data it uses. The small business exemption is gone. Non-compliance penalties reach $50 million. If you use AI in any customer-facing process, you need to act before December.

I had coffee with a client last week — a recruitment agency owner with 12 staff. She uses AI to screen resumes, score candidates, and draft shortlist summaries. Has done for over a year. When I asked whether she had started preparing for the Privacy Act changes, she stared at me blankly. Had no idea what I was talking about.

She is not alone. I have had this exact conversation with dozens of business owners across Australia. They are using AI daily — sometimes in ways that directly affect people's lives — and they have no idea that the biggest privacy reform in decades is seven months away.

Three changes that actually matter

The Privacy Act 1988 is getting its most significant rewrite since it was enacted. There is a lot in the amendments, but for anyone running AI in their business, three things stand out.

You must disclose automated decisions. If your business makes decisions using AI or automation that affect a person, you have to explain that. Not buried in a privacy policy nobody reads. Proactively. You need to tell people that AI is involved, what data it uses, and how they can challenge the outcome. This applies to any decision made without what the legislation calls “meaningful human involvement.”

The small business exemption is dead. Until now, businesses under $3 million annual turnover could largely ignore the Privacy Act. That carve-out is being scrapped. A solo consultant using Claude to draft client reports is now in the same regulatory bucket as Commonwealth Bank. I cannot overstate how big a shift this is for Australian SMBs. Thousands of businesses that never had to think about privacy compliance now do.

The penalties are severe. Up to $50 million AUD or 30% of adjusted turnover, whichever is greater. These are not theoretical maximums designed to make a press release. They are calibrated so that ignoring the rules always costs more than following them. For a business doing $2 million a year, that is a potential $600,000 fine. Enough to end the business.

  • Dec 10, 2026: deadline for compliance
  • $50M: maximum penalty
  • $3M turnover exemption: removed
  • Every business: now covered

You are probably making “automated decisions” without realising it

When business owners hear “automated decision,” they picture a fully autonomous system deciding whether someone gets a mortgage. The legal definition is much broader than that.

Any decision made without meaningful human involvement that affects a person counts. And “meaningful” is doing a lot of heavy lifting in that sentence. If your AI produces a recommendation and a person rubber-stamps it without genuinely reviewing the reasoning, that is still an automated decision under these reforms. The human review has to be real, not theatrical.

Examples of automated decisions

  • AI scoring job applicants
  • Automated loan or insurance approvals
  • AI-generated compliance reviews
  • Chatbots making recommendations based on customer data
  • Lead scoring and automated follow-up sequences
  • AI grading or assessment systems

Be honest with yourself for a moment. Does your CRM score incoming leads and route them automatically? Does your chatbot suggest products based on a customer's purchase history? Does anyone on your team paste client emails into ChatGPT to draft responses? Every one of those probably qualifies. If you are unsure about a specific process, assume it does and act accordingly. That is cheaper than finding out the hard way.

What to do this month — a practical checklist

You do not need a law firm on retainer to get started. Most of the initial work is operational — mapping what you have, writing down how it works, and plugging the obvious gaps. Here is where to start.

AI compliance checklist

  • Audit every AI tool and automation in your business
  • Document what personal data each system processes
  • Add clear disclosures where AI makes or influences decisions
  • Build human review processes for high-stakes decisions
  • Update your privacy policy to cover AI and automated decisions
  • Train your team on the new requirements
  • Review third-party AI tools for compliance (their problem is your problem)
  • Set up audit trails and logging for automated decisions

Start with the audit. Go through your entire stack. Every AI tool, chatbot, workflow automation, algorithm, and third-party integration. If your CRM has AI lead scoring, write it down. If your support platform auto-suggests replies, write it down. If anyone uses Claude or ChatGPT with client data, that goes on the list too. You will be surprised how many AI touchpoints exist once you actually look.

Map the data flows. For each system, document what personal data goes in, what processing happens, and what decisions come out. A spreadsheet is fine. Nobody expects a formal data protection impact assessment at this stage. You just need to know what is happening.
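If you would rather generate that spreadsheet than maintain it by hand, a few lines of Python will do it. This is a minimal sketch: the system names, columns, and example rows are illustrative, not a format the legislation prescribes.

```python
import csv

# Illustrative data-flow register. System names and columns are
# examples only -- not a format required by the Privacy Act reforms.
ROWS = [
    # (system, personal data in, processing, decision out)
    ("CRM lead scoring", "name, email, browsing history",
     "ML score 0-100", "routing priority"),
    ("Support chatbot", "purchase history, chat text",
     "product recommendation", "suggested products"),
    ("Resume screener", "CV text, work history",
     "candidate ranking", "shortlist / reject"),
]

with open("ai_data_flows.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["System", "Personal data in", "Processing", "Decision out"])
    writer.writerows(ROWS)
```

One row per AI system, four columns, done. The point is having the map at all, not the tooling behind it.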

Add disclosures everywhere AI touches a customer. Labels on chatbots. Notices in automated emails. Explanations on scoring outputs. People need to know when AI is involved in decisions about them. This is not optional under the new rules. A simple “This response was generated with AI assistance” label takes five minutes to implement.
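If your automated emails or chat replies are generated programmatically, the label can live in code so nobody forgets to add it. A minimal sketch (the wording and the helper name are illustrative, not wording the legislation mandates):

```python
# Example disclosure text -- adapt the wording to your own context.
AI_DISCLOSURE = (
    "This response was generated with AI assistance. "
    "You can request a human review of any decision it informs."
)

def with_disclosure(message: str) -> str:
    """Append a visible AI disclosure to any outbound automated message."""
    return f"{message}\n\n---\n{AI_DISCLOSURE}"
```

Wire it into the one place your system sends outbound messages, and every automated reply carries the label by default.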

Build real human review for high-stakes decisions. Hiring. Lending. Service eligibility. Insurance. Anything where the outcome materially affects someone's life. Not a person glancing at a screen and clicking approve. A genuine checkpoint where someone understands the AI's reasoning and can override it.
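In code, the difference between a rubber stamp and a genuine checkpoint is whether the reviewer sees the reasoning and can override the outcome. A rough sketch of that gate — the category names and the `Recommendation` shape are invented for illustration, not taken from the legislation:

```python
from dataclasses import dataclass

# Illustrative list -- define your own high-stakes categories.
HIGH_STAKES = {"hiring", "lending", "insurance", "service_eligibility"}

@dataclass
class Recommendation:
    category: str
    outcome: str    # what the AI suggests
    reasoning: str  # why -- the human reviewer must see this

def decide(rec: Recommendation, human_review) -> str:
    """Route high-stakes AI recommendations through a human checkpoint.

    `human_review` is a callable that receives the full recommendation,
    including its reasoning, and returns the final outcome. It may
    override the AI entirely.
    """
    if rec.category in HIGH_STAKES:
        return human_review(rec)  # human sees reasoning, can override
    return rec.outcome            # low-stakes: automated path is fine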

Set up audit logging. Every automated decision needs a record: what data went in, what the system decided, and when. If a regulator asks about a specific decision six months from now, you need to produce the details. “We are not sure” will not fly.
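The simplest way to get that record is an append-only log written at decision time. A minimal sketch, assuming a JSON-lines file is acceptable for your scale (field names are illustrative; adapt them to your systems):

```python
import datetime
import json

def log_decision(system: str, inputs: dict, outcome: str,
                 path: str = "decision_audit.jsonl") -> dict:
    """Append one automated decision to an append-only audit trail:
    what data went in, what the system decided, and when."""
    record = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "system": system,
        "inputs": inputs,
        "outcome": outcome,
    }
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")
    return record
```

Call it from every automated decision path. Six months later, answering a regulator is a `grep`, not an archaeology project.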

The gaps I keep seeing in Australian businesses

I work with Australian SMBs daily. These are the issues I find in almost every audit, and they are the ones that will cause real pain after December if they are not addressed.

Staff using AI with client data, no disclosure. This is the most common one by far. Someone copies a client email into ChatGPT, gets a polished response, sends it. The client has no idea their data was processed by a third-party AI system hosted in the United States. Under the new rules, that needs disclosure. Full stop.

Chatbots with no AI label. If a customer interacts with a bot that recommends products or next steps based on their data, they need to know it is AI. Five-minute fix. But I find it missing in the majority of deployments I review.

Invisible lead scoring. Your CRM scores leads, routes them automatically, and determines who gets a fast callback versus who waits three days. That is an automated decision affecting a person. It needs logging, transparency, and for high-stakes scenarios, human review.

Cross-border data without agreements. Most AI tools are hosted outside Australia. If customer data crosses borders — and it almost certainly does — you need Data Processing Agreements in place. That is the bare minimum.

| Gap | Risk level | Fix |
| --- | --- | --- |
| No AI disclosure in customer-facing tools | High | Add clear "AI-assisted" labels |
| No audit trail for automated decisions | High | Implement logging on all AI systems |
| Third-party AI tools without DPAs | Medium | Review and sign Data Processing Agreements |
| No human review for high-stakes decisions | High | Add human-in-the-loop checkpoints |
| Privacy policy doesn't mention AI | Medium | Update privacy policy before December |

Retrofit is expensive. Build it in from the start.

Bolting compliance onto an existing AI system is painful. I have done it. Adding audit logging to a workflow that was never designed for it means rearchitecting half the system. Tracking down every customer touchpoint to add disclosure labels after the fact is tedious and error-prone. You always miss something.

Building compliance into new systems from day one costs a fraction of the retrofit. Every AI system we build at AI-DOS ships with audit logging, clear AI disclosure, and human escalation paths as standard. We did not add those because of these reforms. We included them because that is how responsible AI systems should be built. The legislation just made it mandatory for everyone.

If you are building anything new with AI between now and December, build it right the first time. And if you already have systems running, audit them now. The businesses that start this month will have time to fix issues methodically. The ones that start in September will be scrambling. The ones that start in December will be writing cheques to lawyers.

If you want help assessing where your current systems stand or need to build new ones properly, that is what our AI consulting and strategy service is designed for.

The honest verdict

This is an opportunity, not just a risk

The businesses that prepare now will have a genuine competitive advantage. Customers trust organisations that are transparent about AI. Compliance is not a burden when it is built in from the start — it is a signal that you take your customers seriously. The ones that ignore this will scramble in Q4 or face real consequences.

These reforms are happening. The dates are locked. The penalties are real. But this does not have to be a crisis.

If you have been using AI responsibly — disclosing where it is used, logging decisions, keeping humans in the loop for important calls — you are already most of the way there. Formalise what you are doing, update your privacy policy, document the data flows. That might be a week of focused effort.

If you have been moving fast without thinking about any of this, seven months is enough time to get sorted. Four months will be tight. Two months will get expensive fast.

Start now. Map what you have. Close the gaps. And if you are building anything new, do it properly from the beginning.

People also ask

When do Australia's new AI rules take effect?

The Privacy Act amendments take effect on December 10, 2026. Businesses should start preparing now — waiting until Q4 will make compliance significantly harder and more expensive.

Does the small business exemption still apply for AI?

No. The small business exemption (previously shielding businesses under $3M annual turnover) is being removed under the new reforms. Nearly every Australian business will be covered, regardless of size or revenue.

What are the penalties for non-compliance with AI regulations in Australia?

Penalties reach up to $50 million AUD or 30% of a company's adjusted turnover for the relevant period, whichever is greater. These are among the strictest data privacy penalties globally.

Related reading

Is AI Automation Worth It? ROI Breakdown for Australian SMBs— The honest costs, returns, and what determines whether AI automation pays off.

How to Use AI in Your Business— A practical guide to getting started with AI the right way.

Need help getting your AI systems compliant?

We build AI automation with compliance baked in from day one. If you need an audit of your existing systems or want to build something new the right way, let's talk.

Book a strategy session
Aidan Lambert

Founder, AI-DOS

Aidan is the founder and lead automation architect at AI-DOS. He personally builds every system the agency delivers — from architecture to production handover.

More about AI-DOS