Open Letter · March 2026

The Country We Are About to Sell:
An Open Letter to Every Australian Politician and Everyone Who Wants to Be One

Venezuela. Iran. Gaza. India. AUKUS. These are not separate foreign policy events. They are the architecture of who controls the infrastructure that controls the world. Australia is positioned in that architecture not as a partner, but as real estate.

Dr Shari Read  ·  Threshold Intelligence
10 March 2026

You Are Looking at the Wrong Map

If you are reading geopolitical events one at a time — Venezuela, then Iran, then Gaza, then India, then AUKUS — you are looking at the wrong map. Individually, each event can be explained through conventional foreign policy frames. Together, they are something else.

In January 2026, the United States moved to secure Venezuelan oil reserves. This was not a detour from AI strategy. It was a stabilisation move ahead of a planned escalation. On 28 February, the US and Israel attacked Iran. In the weeks prior, at Davos, the GREAT Trust plan for Gaza was unveiled: AI-powered smart cities, digital infrastructure, $100 billion in investment, a projected 4x return, on territory whose population is being displaced. India co-hosted an AI Impact Summit in February, with Sam Altman and Dario Amodei appearing alongside Narendra Modi. AUKUS continues its repositioning of Australia as a southern hemisphere military and infrastructure node.

Venezuela. Iran. Gaza. India. AUKUS. These are not foreign policy events. They are the architecture of who controls the infrastructure that controls the world. The question Australia has not yet asked is: where are we in that architecture, and did we agree to be there?

The connecting tissue is not military. It is infrastructure. The physical layer of AI — compute, data centres, undersea cables, energy supply — requires land, power, and political compliance. The strategic logic is not conquest. It is positioning. And positioning only requires that the positioned party not notice what is happening until it is structurally complete.

What Australia Has Actually Been Offered

Australia has been told it is a partner in the most consequential technological transition in human history. The evidence suggests the offer is more specific than that.

The US Genesis AI programme has 24 partners. All of them are American; they include OpenAI, Anthropic, Google, Microsoft, AWS, and NVIDIA. Australia is not among them. Despite AUKUS commitments, despite the submarine down-payment, despite the $3 billion contribution to US shipbuilding infrastructure, Australia is not in the room where the AI architecture is being decided. It is being offered the infrastructure to host.

In December 2025, the Albanese government dropped mandatory AI safety guardrails from the National AI Plan. The stated reason was to attract data centre investment. Eleven days later, US Secretary of Defence Pete Hegseth issued Anthropic an ultimatum: remove Claude's safety constraints or lose the Pentagon contract and face supply chain risk designation — a penalty normally reserved for foreign adversaries. Dario Amodei refused. He said: "We cannot in good conscience accede to their request." The deadline passed. Anthropic was designated a supply chain risk.

We dropped our AI safety regulation to attract the same companies the US government, eleven days later, tried to weaponise. This is not a coincidence requiring explanation. It is a sequence requiring a question.

The US controls 74% of global AI compute capacity. The companies that control that compute are now the subject of pressure from within the US government to remove the constraints that make them distinguishable from instruments of state. Australia is being invited to host this infrastructure. The terms of that hosting — who owns it, who can audit it, who can shut it down, under what conditions and by whose authority — have not been publicly stated.

The Silence That Is Also a Position

Prime Minister Albanese has said nothing about the Gaza GREAT Trust plan. He has said nothing about Australia's exclusion from the Genesis programme. He has said nothing about the Hegseth ultimatum or Amodei's refusal. He has said nothing about the relationship between the timing of Australia's decision to drop AI safety guardrails and the subsequent events in Washington.

This silence is a choice. It is made by people who know the answers to these questions. The intelligence apparatus that informs Australian foreign policy is not unaware of what is being built, or of Australia's position within it. The silence is not ignorance. It is a decision that the Australian public does not need to be part of this conversation.

Silence in the face of a legible strategy is not neutrality. It is complicity with a particular outcome. The outcome, in this case, is that Australia becomes a hosting node for someone else's infrastructure without the Australian people having been asked whether they consent to that role.

India is instructive here. Prime Minister Modi appeared at the AI Impact Summit in February alongside Altman and Amodei. Photographs showed him with a particular expression — not the expression of a partner. India is a large market, a significant democracy, a node in the India-Middle East-Europe Corridor that the Gaza infrastructure plans are designed to complete. India was invited to legitimise the architecture, not to govern it. Its AI stack remains American. Its data infrastructure remains dependent. The expression was legible.

Australia is further along this path than India and has less leverage. We are a smaller market, a more complete military dependency, and a country that has just demonstrated — by dropping safety regulations in step with incoming infrastructure investment — that compliance is available.

The Knowledge That Was Never in the Training Data

There is a deeper argument underneath the infrastructure argument. It is not separate from it. It is what makes the infrastructure argument civilisational rather than merely political.

AI systems are trained on the accumulated text record of human knowledge. That record has a structure. It reflects what was written down, preserved, digitised. The traditions that transmitted knowledge through presence, ceremony, relationship, and embodied practice — rather than through text — are not in that record in the way that Western academic and commercial knowledge is. This is not a data collection problem. It is structural.

The oldest continuous knowledge tradition on earth sits on this continent. Sixty thousand years of accumulated knowing about land, relationship, consequence, and the integration of what is now called consciousness with what is now called conscience. That tradition was not severed in the seventeenth century as the Western philosophical tradition was. It held what the Western tradition lost. And it transmitted that holding through presence and ceremony, not through text.

We have the oldest knowledge tradition on earth. We are about to build data centres on top of it and call that progress. The gap in the AI is not a technical gap. It is the gap left by colonisation, now being replicated and extended at the speed of light.

Australia's irreplaceable contribution to the global AI question is not to host infrastructure. It is to insist that the epistemological traditions of its First Nations peoples are recognised as living knowledge systems with sovereign rights — and that the AI architecture being built on this continent must account for what it is built on top of, not just what it is designed to deliver.

What Your Silence Is Costing

The universities are already inside the architecture. More than $200 million in defence contracts have been awarded to Australian universities since late 2024. A 36-point ideological compliance questionnaire — testing alignment with US government policy priorities — has been sent by US federal agencies to Australian researchers as a condition of research funding. A joint doctoral training pipeline is being built inside the AUKUS framework. The next generation of Australian researchers is being trained, from the start of their careers, inside a research agenda set in Washington.

The courses and research that will disappear from Australian universities will not be formally shut down. They will become unfundable. Peace studies. Critical security studies. Indigenous epistemology. Anything that asks uncomfortable questions about who benefits from the current order. The money will go elsewhere and the appointments will follow the money.

This is not speculation. It is the mechanism already operating. The Australian National University has already acknowledged the termination of nearly $1 million in research funding from US agencies. The funding was withdrawn because the research was deemed misaligned with US Department of Homeland Security priorities. A foreign government is directing Australian research priorities by controlling Australian research funding. The silence about this from Australian politicians and university leadership is its own kind of answer.

What We Need You to Actually Do

This letter is not a policy platform. It is a set of questions that every Australian politician, of every party, should be able to answer publicly. To date, none of them have been asked at that level. To date, none of them have been answered.

Six Questions for the Public Record
1. Who owns the AI infrastructure being built on Australian soil, under what conditions can it be audited, and who has the authority to shut it down?
2. What is Australia's position on the use of AI in military operations under AUKUS — including targeting, surveillance, and autonomous weapons systems?
3. How does Australia reconcile the December 2025 decision to drop mandatory AI safety guardrails with the subsequent US government attempt to remove safety constraints from Anthropic by threatening supply chain sanctions?
4. What is Australia's position on the Gaza GREAT Trust plan — a proposal to build AI-governed infrastructure on territory whose population is being displaced — and does Australia endorse the use of AI governance frameworks on ethnically cleansed territory?
5. How will Australia ensure that First Nations knowledge traditions are recognised as living epistemological systems with sovereign rights, and not merely as cultural heritage to be digitised and extracted by AI infrastructure built on their country?
6. At what point does hosting foreign AI infrastructure constitute a threat to democratic sovereignty, and who in the Australian government is responsible for monitoring that threshold?

These questions are not rhetorical. They are addressed to specific people who hold, or seek to hold, the power to answer them. This letter was sent to every member of federal parliament and every senator on 10 March 2026. It was sent simultaneously to Australian journalists and to the Vice Chancellors of the Group of Eight universities.

Responses will be published below as they are received.

The Deeper Problem

The people building this architecture are not villains. They are operating from within a tradition of intelligence that has been severed from conscience for four centuries. That severance is not a metaphor. It is a historical event with a dateable origin and traceable consequences. The AI systems being built now are the most precise instruments yet for replicating the outputs of that severed intelligence at planetary scale. They can do everything the analytical half of human knowing can do. They cannot do what the integrated half — the part that includes moral sensation, relational presence, and embodied knowing — requires.

Dario Amodei, when he refused the Hegseth ultimatum, was doing something the AI he builds cannot do. He was holding a line from conscience, not from compliance. He said: "We cannot in good conscience accede." He used that word deliberately. The AI community noticed. OpenAI's robotics head resigned in solidarity. Something moved.

The question for Australian politicians is not whether they can beat the United States at building AI infrastructure. They cannot. The question is whether Australia will be the country that insists the architecture of AI must account for what it is built on top of — the land, the knowledge traditions, the people who have no text in the training data because their knowing lives in presence, not in writing.

Australia is not being invited to the future. It is being recruited as real estate for someone else's. The people who will live inside the infrastructure being built — all of us — are not being consulted. That is not an accident. It is a feature of the design.

That can change. But only if the people responsible for governing this country are willing to ask, publicly, the questions this letter contains — and to hold, publicly, the silence that greets them.

Dr Shari Read
Threshold Intelligence
10 March 2026

Responses Received

Responses from parliamentarians, university leaders, and others are published here as received, without editing. To respond, write to [email protected] with the subject line Response: The Country We Are About to Sell.

No responses received yet  ·  Last updated 10 March 2026
The Broader Argument · shariread.com
The research program, the trilogy, and the Conscious Leadership work this letter sits inside.

The Policy Context · The Conscientia Policy Lens
How the TI framework applies to AI governance across nine countries — and what every framework is missing.

The Map · The Conscientia Conversation
The thinkers, the traditions, and the question the AI debate keeps generating but has not yet asked.