
What would it take to better protect Australians from secret or opaque data harvesting of private information by big tech companies?
Paul Mallett calls for the Federal Government to tighten privacy laws to protect Australians from secret or opaque data harvesting of private information by big tech companies.
The issue of ‘surveillance capitalism’
Paul believes every time we pick up our phone, use a computer, or even walk past certain cameras, tiny bits of our lives are being watched and recorded — often without us fully knowing it. This is not just about social media or young people on their phones — it affects all of us, our families, and our communities.
Paul is concerned about the new economic model that has grown over the past 20 years. Big tech companies give us free tools — but in return, they collect information about what we do, where we go, who we talk to, and what we like. The major concern is that they sell this information to make huge profits — by predicting what we might buy, how we might vote, and even what might grab our attention or make us angry.
Paul notes this is not just about ads. This information can be used to change our choices — without us even realising it. It shapes what news we see, what job offers we get, and what our children are shown. It’s like someone peeking through our windows 24/7 — and selling what they learn about us. Most troubling is the fact that most of us never gave proper permission for this. The rules are so hidden and confusing that it’s almost impossible to say no.
Paul objects to the unfairness of this new economic model. It concentrates power and money in the hands of a few giant companies. It creates unfair advantages — some people get special deals, others get left behind. Most troubling is that it can spread false information, divide communities, and weaken trust in our democracy.
Shoshana Zuboff’s book, The Age of Surveillance Capitalism (2019), outlines how big tech companies turn personal data into a new source of immense profit — at huge social cost. The key ideas are:
- Surveillance capitalism is a new economic logic. This is distinct from industrial capitalism — here, the raw material is not nature or labor, but human experience turned into data.
- Behavioral surplus. Companies don’t just collect data needed to improve services — they extract behavioral surplus: data that goes far beyond what’s needed, used to predict and manipulate future behavior.
- Prediction and modification. The goal isn’t just to predict what people will do — but to nudge or shape it to maximize profits. This erodes individual autonomy.
- Hidden operations. Much of this extraction happens secretly or opaquely. Consent is often meaningless — users don’t fully understand what they’re agreeing to.
- Loss of democratic control. Surveillance capitalism concentrates power and knowledge in private hands, undermining democratic norms, personal freedom, and even markets themselves.
- We are the products. The famous line: if it’s free, you’re the product. Zuboff says: actually, you’re the raw material; predictions about your behavior are the product sold to advertisers.
- Call for collective action. Zuboff argues this isn’t inevitable. Citizens, governments, and civil society can — and must — push back through regulation, new rights, and norms.
The goal of creating safe, fair and inclusive communities
Paul calls on the Federal Government to explore the following measures:
- Strengthen data protection laws. Pass robust privacy legislation (like Europe’s GDPR). Limit what data companies can collect, store, and sell. Make privacy the default — not something you ‘opt into’.
- Ban or limit manipulative practices. Outlaw behavioral targeting that exploits vulnerabilities (e.g., microtargeted political ads or addictive design).
- Strengthen transparency and accountability. Companies must disclose how data is collected, used, and shared. Make algorithmic decision-making explainable and auditable.
- Give users meaningful control. Easy tools to see, correct, delete, or export personal data. Clear, simple consent — not hidden in legal jargon.
- Tackle monopolies. Support competitive, ethical alternatives.
- Educate citizens. Digital literacy campaigns so people understand risks and rights. Encourage civic participation in shaping tech policy.
- Promote inclusive design. Tech systems must work for everyone, not just for profit. Address biases in AI and automated systems that can reinforce inequality.
- Build public-interest digital infrastructure. Support alternatives like public data trusts or cooperative platforms. Treat some digital services as essential public goods.
