More than 1,100 data breaches were reported to the Australian Information Commissioner in 2024 — the highest annual total since mandatory breach reporting began in 2018, up 25 per cent on the year before. Almost every single day, Australians are having their personal information accessed, stolen, or sold by organisations that were trusted to protect it. And the uncomfortable truth is that most of that data was handed over willingly — to Google, Meta, Amazon, Apple, and the other big tech companies operating in Australia — long before a hacker ever got involved.
Big tech companies in Australia aren’t just technology providers anymore. They are the infrastructure of modern life. And that infrastructure has a surveillance problem that no terms-of-service update is going to fix.
This post covers what big tech companies in Australia actually collect, how that data is used, who it gets shared with, and what Australians can practically do about it. If you want to understand the full picture — the data brokers, the government requests, the facial recognition in your local Bunnings — this is the place to start.
The data collection starts the moment you pick up your phone. Google processes over 8.5 billion searches every day. Meta’s pixel tracker is embedded on roughly 20 per cent of the world’s most visited websites, silently logging your behaviour long after you’ve closed Facebook or Instagram. Amazon knows your purchasing patterns, your delivery address, your browsing hesitations. Apple knows everywhere you’ve been.
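To make the pixel mechanism concrete, here is a minimal sketch of the kind of record a tracker can assemble from a single request for an invisible 1x1 image. The endpoint name, query parameters, and field names (`px.gif`, `page`, `id`) are illustrative assumptions, not Meta's actual API; the point is that the browser supplies most of this information automatically.

```python
from urllib.parse import urlparse, parse_qs

def parse_pixel_hit(request_path: str, headers: dict) -> dict:
    """Illustrative only: extract the data points a tracking pixel's
    server can log from one image request. Field names are hypothetical."""
    query = parse_qs(urlparse(request_path).query)
    return {
        # The page you were viewing when the pixel fired
        "page": query.get("page", [None])[0],
        # A cookie or URL identifier tying this hit to your profile
        "visitor_id": query.get("id", [None])[0],
        # Sent automatically by every browser with every request
        "referrer": headers.get("Referer"),
        "user_agent": headers.get("User-Agent"),
        "ip": headers.get("X-Forwarded-For"),
    }

hit = parse_pixel_hit(
    "/px.gif?page=/checkout&id=abc123",
    {"Referer": "https://shop.example.com/checkout",
     "User-Agent": "Mozilla/5.0",
     "X-Forwarded-For": "203.0.113.7"},
)
```

One request, and the tracker knows which page you were on, which profile you belong to, and where you connected from. Multiply that by every site carrying the pixel and the behavioural log writes itself.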
When a journalist requested her personal data from Meta in 2024, she received over 20,000 pages of records — including 20,000 interactions with websites and apps that had no direct connection to her Meta accounts at all. That’s one platform. Google and Apple hold comparable volumes of data, often more.
The information collected includes your location history, IP address, device type, browsing behaviour, dwell time on pages, the contacts in your phone, the content of your messages on some platforms, your health data from apps, your biometric data from facial recognition, and increasingly, the content of your conversations with AI chatbots. Meta announced in late 2024 that it would begin harvesting conversations with its AI assistants across Facebook, Instagram, and WhatsApp for advertising targeting — a move privacy groups described as chatbot surveillance going mainstream.
This isn’t accidental. It is the business model. The digital advertising industry is projected to exceed one trillion US dollars globally. Your data is the raw material that fuels it.
What big tech companies in Australia collect is only half the story. What happens to that data after it’s collected is where things get genuinely alarming.
Australia has a thriving data broker industry — companies you’ve almost certainly never heard of that collect, combine, package, and sell detailed profiles of individual Australians. According to ACCC research, 74 per cent of Australians are uncomfortable with their personal information being shared or sold. Most have no idea it’s already happening.
One Australian data broker has claimed to hold data on 85 per cent of the Australian population, combining location data from mobile phones, the electoral roll, online behaviour, and purchasing records into individual profiles sold to advertisers, marketers, and insurance companies. The ACCC’s investigation found that Australians renting a property, getting an insurance quote, or shopping online are routinely profiled without their knowledge or meaningful consent.
Under Australian Privacy Principle 3.6, organisations must collect personal information directly from the individual unless that would be unreasonable or impracticable, which makes much of the third-party collection that powers profiling unlawful on its face. According to privacy law researchers at UNSW, this provision has almost never been enforced. Data enrichment — the practice of companies buying extra information about their own customers from brokers — continues largely unchecked.
The data can change hands dozens of times. A phone number traced by one ANU researcher during the 2025 federal election had passed through at least four different brokers since its original collection in 2014, eventually reaching political campaign operators who are exempt from privacy laws entirely.
There is a dimension to big tech surveillance that goes beyond advertising. Between 2014 and 2024, the number of Australian user accounts handed over to law enforcement by Google, Meta, and Apple grew dramatically — Google’s account disclosures to authorities rose 530 per cent over that decade, Meta’s surged 675 per cent, and Apple’s climbed 621 per cent. Collectively, these three companies shared data from over 3.16 million user accounts with US law enforcement alone in that period — and that figure excludes requests made under secret provisions of the Foreign Intelligence Surveillance Act.
Australia is part of the Five Eyes intelligence alliance. When Edward Snowden revealed the PRISM surveillance program in 2013, it emerged that Australia’s own signals directorate was cooperating with NSA programs that gave intelligence agencies access to user data held by Microsoft, Google, Facebook, Apple, and others. The Australian government knew about PRISM before Snowden made it public.
This is not ancient history. The question of where big tech company obligations to their users end and their obligations to government agencies begin is unresolved — and trending in the wrong direction. When your data sits on a platform’s servers, it is accessible to anyone who can compel that platform to hand it over. That includes advertisers, data brokers, and law enforcement agencies across multiple jurisdictions.
A deGoogled phone running GrapheneOS doesn’t hand your location, contacts, or usage patterns to Google’s servers in the first place. There’s nothing to hand over because the data was never collected.
The surveillance isn’t just happening on your phone. It’s happening in your local shopping centre.
In 2024, the Office of the Australian Information Commissioner found that Bunnings had scanned the faces of every customer entering 62 of its stores over a three-year period without consent, matching those images against a database of persons of interest. In 2025, the same finding was made against Kmart across 28 stores. Both companies argued security justifications. The Commissioner found those justifications did not outweigh the privacy impact of indiscriminately collecting the biometric data of everyone who walked through the door.
These cases matter beyond retail. They establish that biometric data — your face — is sensitive personal information requiring consent before collection. They also demonstrate that Australian regulators are prepared to act. The OAIC’s enforcement powers were significantly expanded in late 2024, and 2026 has been flagged as a year of active enforcement, not just education. Australia’s first-ever civil penalties under the Privacy Act were handed down in 2025 — $5.8 million against Australian Clinical Labs. Civil penalty proceedings are now underway against Optus for the breach that exposed the personal information of approximately 9.5 million Australians in 2022.
The regulatory environment is tightening. That’s good news. But regulation operates after the fact. It doesn’t stop your data from being collected. It just creates consequences for organisations that mishandle it — consequences that arrive years later, long after the damage is done.
For a deeper look at how biometric data is being used against Australians, read our post on biometric privacy risks and facial recognition.
The integration of artificial intelligence into the big tech platforms Australia relies on changes the surveillance equation in two ways. First, AI makes previously unusable datasets useful. Data points that were too fragmented or numerous to act on individually can now be combined, cross-referenced, and acted upon at scale. The “safety in numbers” assumption — the idea that you could hide in a mountain of data — no longer holds.
Second, AI is accelerating the rate at which big tech companies Australia-wide expand their data collection. Meta has announced plans for fully AI-automated advertising by 2026, where the entire process of creative generation, audience targeting, and optimisation will run autonomously — fed by everything Meta knows about you, including the content of your AI chatbot conversations. Google’s advertising platforms are following the same trajectory.
Algorithmic recommendation systems — the engines that determine what you see on YouTube, Instagram, TikTok, and Facebook — are not neutral. They are designed to maximise engagement, which means maximising emotional response, which means surfacing content that triggers strong reactions. The FTC’s investigation into social media and streaming companies described this as “vast surveillance” of user data to monetise behaviour and shape what people see.
Your children are not exempt. On 10 December 2025, Australia became the first country to enforce a legislated social media minimum age, requiring platforms including Facebook, Instagram, TikTok, YouTube, Snapchat, and Reddit to prevent Australians under 16 from holding accounts. Non-compliance carries fines of up to $49.5 million. The law exists because the evidence of harm — to mental health, to development, to privacy — became impossible to ignore.
To understand how AI surveillance is reshaping Australian society, read our post on AI, ethics, and privacy in a hyperconnected world.
The Privacy and Other Legislation Amendment Act 2024 significantly expanded the OAIC’s enforcement toolkit. The regulator can now issue infringement notices, impose civil penalties for a broader range of privacy breaches, and exercise enhanced investigative powers. A new statutory tort for serious invasions of privacy took effect on 10 June 2025, allowing individuals to sue directly for intentional or reckless privacy violations without needing to prove financial loss.
These are meaningful developments. But they address what big tech companies do with your data once they have it. They don’t address the fundamental problem: that the data is being collected in the first place, on a scale and with a depth that no regulatory framework has yet managed to contain.
The OAIC’s own 2025–26 regulatory priorities include scrutinising biometric technology, pixel tracking, and the power imbalances in the data broker industry. The regulator is catching up. In the meantime, Australians who want genuine privacy protection cannot rely on regulation alone.
Read our detailed post on how Big Tech companies are spying on Australians for a deeper look at the specific tracking methods in use.
Understanding the problem is the first step. The second step is building a digital environment that doesn’t participate in it.
Protecting yourself from big tech companies in Australia doesn’t mean disappearing from the internet. It means making deliberate choices about the technology you use — choices that reduce what big tech companies can collect on you in the first place.
A deGoogled phone running GrapheneOS removes Google’s data collection from your mobile device entirely. No location data sent to Google. No app usage patterns. No advertising ID. The bootloader is relocked after installation — the same security architecture as a stock Android device, without the surveillance. It’s the approach recommended by Edward Snowden, and it’s the standard we build to at FreedomTech.
A privacy-focused Linux laptop removes Windows telemetry, Microsoft’s data collection, and the background processes that mainstream operating systems use to report your usage back to their developers. Linux Mint Cinnamon, which we pre-install on every FreedomTech computer, doesn’t phone home. It doesn’t harvest your documents. It doesn’t profile your search behaviour.
A Faraday bag addresses the hardware layer — the reality that a phone sitting in your pocket is still broadcasting signals even when you think it’s idle. Cellular, WiFi, Bluetooth, and GPS signals can all be used for passive location tracking. A Faraday bag blocks those signals entirely when hardware isolation is what you need.
For crypto investors, a dedicated crypto computer separates your digital asset activity from your everyday digital life — protecting against the browser-based tracking, malware exposure, and behavioural profiling that makes mainstream devices a genuine security risk for anyone managing serious digital assets.
For a deeper look at exactly how Big Tech tracks you — the mechanisms, the app permissions, and the algorithmic manipulation — read our post on the dark side of Big Tech and how your data is really being used.
For more on how surveillance shapes the political and social environment, read our post on digital privacy, voting, and data manipulation.
Yes. The Privacy Act 1988 applies to organisations with an annual turnover above $3 million, as well as smaller organisations that trade in personal information or provide health services. Major tech platforms operating in Australia are subject to the Australian Privacy Principles, though enforcement has historically been limited. The OAIC’s expanded powers from late 2024 mean that picture is changing in 2026.
In September 2022, Optus disclosed that the personal information of approximately 9.5 million Australians had been accessed without authorisation — including names, dates of birth, addresses, phone numbers, and government-issued ID numbers. Medibank’s breach, disclosed in October 2022, affected approximately 9.7 million current and former customers, with sensitive health data exposed. Civil penalty proceedings are now before the Federal Court against both companies. These breaches remain the most significant data security failures in Australian corporate history.
A data broker is a company that collects personal information about individuals from multiple sources — public records, loyalty programs, online behaviour, third-party sharing agreements — and sells or shares that information with other organisations. You typically have no direct relationship with them and no knowledge that your data is being traded. The ACCC has investigated the Australian data broker industry and found significant concerns about transparency, consent, and the potential for consumer harm.
A VPN can mask your IP address and encrypt your traffic between your device and the VPN server. It does not prevent data collection by apps running on your device, by platforms you’re logged into, or by the operating system itself. It is one layer of protection, not a complete solution. On FreedomTech computers, Mullvad VPN is pre-installed as part of a broader privacy architecture that includes a hardened browser, DNS-over-TLS, and a privacy-focused operating system.
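As one example of how the DNS layer fits into that architecture, here is a hedged sketch of enabling DNS-over-TLS with systemd-resolved on a typical Linux desktop, so that your DNS queries are encrypted rather than readable by your ISP. The resolver address shown (Quad9) is purely an example, and the presence of systemd-resolved is an assumption — adjust for your distribution and preferred resolver.

```shell
# Sketch only: encrypt DNS lookups with DNS-over-TLS via systemd-resolved.
# Assumes systemd-resolved is the active resolver on this machine.
sudo mkdir -p /etc/systemd/resolved.conf.d
sudo tee /etc/systemd/resolved.conf.d/dns-over-tls.conf <<'EOF'
[Resolve]
# Example resolver; substitute your own trusted provider
DNS=9.9.9.9#dns.quad9.net
DNSOverTLS=yes
EOF
sudo systemctl restart systemd-resolved
resolvectl status   # should now show +DNSOverTLS among the protocols
```

Encrypted DNS complements a VPN rather than replacing it: the VPN hides your traffic from the local network, while DNS-over-TLS ensures your lookups aren’t exposed in plain text if the VPN drops.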
Based on the OAIC’s 2024 and 2025 determinations against Bunnings and Kmart respectively, collecting biometric information through facial recognition technology without proper notification and consent breaches the Australian Privacy Principles. The Privacy Act is technology-neutral — it doesn’t ban facial recognition, but it sets a high bar for consent and proportionality that both retailers failed to meet.
Big tech companies in Australia operate with a simple assumption: that you won’t notice, won’t object, or won’t know what to do about it. The data collection happens in the background. The profiling is invisible. The consequences — identity theft, targeted manipulation, surveillance by entities you’ve never heard of — arrive later, if at all.
At Freedom Technology and Services, we help Australians build a digital environment that doesn’t feed the machine. Not through paranoia, but through practical, tested technology choices that keep your data where it belongs — with you.
If you’re ready to take the first step, start with our deGoogled phones and tablets store, explore our privacy computers and laptops, or reach us at [email protected].