Claude, War, and the State of the Republic (with Dean Ball)

by TheAdviserMagazine


0:37

Intro. [Recording date: March 12, 2026.]

Russ Roberts: Today is March 12th, 2026, and my guest is Dean Ball. Dean is a Senior Fellow at the Foundation for American Innovation, a Policy Fellow at Fathom, and author of the AI [artificial intelligence]-focused newsletter Hyperdimensional, which you can find on Substack. He works on technological change, institutional evolution, and the future of governance. And, prior to this, he served as Senior Policy Adviser for AI–for artificial intelligence–and Emerging Technology at the White House Office of Science and Technology Policy, where he was the primary staff drafter of America’s AI Action Plan.

Dean, welcome to EconTalk.

Dean Ball: Thank you so much for having me, Russ.

1:17

Russ Roberts: Our topic for today is the relationship between private companies working on artificial intelligence, like Anthropic, which created the LLM–the Large Language Model–known as Claude–and the Department of War. In particular, we’re going to talk about the recent clash between the two over what will govern or constrain Claude’s use by the military, which created, I don’t know whether you want to call it a brouhaha, a dust-up, or a very serious constitutional issue about the interaction between private entities and the federal government. And, that’s what we’re going to talk about today. Our conversation is based on a superb article you wrote on your Substack, Hyperdimensional, which we will link to. That article was simply called “Clawed,” C-L-A-W-E-D. Very clever.

So, let’s start with what happened. What was the nature of this conflict, and what are some of the issues that are involved?

Dean Ball: So, I think to understand this conflict in full, you need to go back about 18 months to the tail end of the Biden Administration. In the summer of 2024, the Department of Defense [DOD]–now Department of War [DOW]–approaches Anthropic, and they agree to a contract for the use of the large language model Claude in classified contexts. That’s distinct from the unclassified uses. Right? So, the Department of Defense and many other government agencies have access to LLMs for all kinds of mundane uses: contract review and procurement, navigating HR [human resource] rules; and government has lots and lots of complex internal rules that just affect the agency, and so you need an LLM to navigate that, things like that.

This is different. This is, like, intelligence analysis, potentially targeting in active combat zones, selecting or at least recommending targets for human reviewers, things of that sort.

So, that starts in the summer of 2024, and in that contract, the Biden Administration agreed to usage restrictions. A wide variety of usage restrictions, as I understand it, but two in particular were on domestic mass surveillance and the use of AI in autonomous lethal weapons. Autonomous lethal weapons being defined as weapons that can autonomously basically identify a target, track it, and kill it with no human intervention. So, this would be machines killing humans on human instructions, but without human oversight.

So, those two things were disallowed in this contract. The Department of Defense agreed to that.

In the summer of 2025–this is during the Trump Administration; it was not yet called the Department of War at that time–the Trump Department of Defense expanded this contract by a significant amount. This was publicly announced. When they did that–it was up to a $200 million contract with Anthropic–they renewed it with a very similar contract, and it did have the same usage restrictions on domestic mass surveillance and autonomous lethal weapons.

Then we get into the fall of 2025, and as I understand it, an official at the now-Department of War named Emil Michael is confirmed by the Senate. He had not been confirmed when this contract was renewed in the summer of 2025. He’s confirmed in the fall. He comes in, he reviews the contract, he sees these usage restrictions, and decides that the Department of War cannot live with these restrictions and says, ‘We have to have all lawful use only.’ So, he approaches Anthropic–and it’s worth noting Anthropic is the only LLM that is available to be used on classified systems. He approaches Anthropic, says, ‘We need to renegotiate for all lawful use.’ Anthropic agrees to drop many of their usage restrictions, but not those two. That ends up being a red line for Anthropic. Department of War then says, ‘If you don’t–.’

This goes on for months, and eventually this escalates to the point–I think there’s probably a lot of personal conflict and a lot of back-and-forth drama here that’s mostly private. But, we eventually get to the point where the Department of War says, ‘If you don’t agree to drop these red lines and allow us to use AI for all lawful uses, then we will designate your company Anthropic a supply chain risk.’ Which would mean that (a) all of your Department of War contracts are canceled; but, more importantly, so are all of your contracts with any Department of War contractor. So, for example, Microsoft is a Department of War contractor, and they wouldn’t be able to use Anthropic AI services in their fulfillment of contracts that they do for the Department of War.

And, that gets announced–at this point about two weeks ago is when that initially gets threatened; and then the actual designation came down something like a week ago, something like that. The timeline is now fuzzy for me because it’s been a very busy couple weeks. And, now we’re essentially in court. Anthropic has sued the government in the Ninth District of California. Or the Northern District of California, my apologies. And, that’s kind of where we are.

6:53

Russ Roberts: Just to clarify one important legal/verbal issue here. Many Americans would not be comfortable with the Department of War doing mass surveillance. There might be situations where that was acceptable. What is the definition of mass surveillance? Would the federal government have to get a court order to do certain kinds of surveillance?

What the Department of War was asking for, if I understand it correctly, is mass surveillance that’s, quote, “legal.” They wanted, quote, “all legal use,” and that could include mass surveillance as defined by people in everyday language; it could include autonomous lethal weapons that had been approved in some legal fashion. But Anthropic wanted to draw, it seems to me, a verbal distinction there. They wanted the freedom in their contract to say, ‘This is a use of our technology that we don’t approve of, even if it’s legal.’ Is that a correct summary of their position?

Dean Ball: That is correct, yes. And so, I think specifically when it comes to domestic mass surveillance, I think that’s the complex sticking point here.

So, just as an example, there are a very large number of commercially available data sets that would include information on Americans that could be private or sensitive, but that are commercially available. So, things like smartphone location data. For example, many people–you might download a third-party weather app to your phone. A lot of times, the weather app needs to know the location all the time to give you the weather in wherever you happen to be physically in the world. So, a lot of the ways these weather apps make money is the users turn on location, and then they have a location tracker, and they sell the location data. This is very common. And so, there’s tons of things like that.

There is obviously also commercial satellite data that you can buy. There’s web usage data, just a very–not only can you buy these individual data sets, but you can combine them in all sorts of ways to generate quite rich insights on individual people.

This has been true for a long time. This is the era of web-scale data.

The binding constraint, though, on the use of this data is simply that it’s time-intensive to actually analyze for any individual person. So, you have to do this for high-value targets. It’s not illegal. In many domains of national security law, what I have just described is not illegal to do. It’s not considered surveillance. If it’s commercially-available data, it’s not considered surveillance.

So, once you have advanced AI systems which can scale human expert-like attention infinitely, essentially, it is all of a sudden as though the intelligence community has, instead of thousands of analysts, millions and tens of millions of analysts. And so, you have a workforce of analysts larger than the government itself. Larger than the human workforce of the government itself, I should say.

And, Anthropic’s position is essentially that–and I agree with them here–that the law is not sufficient. The law has not been updated for this reality because this is the reality only of the last few years, and the law is not updated for it. And so, yes, that basically domestic mass surveillance as a legal term, as a legal term of art, does not correspond with what you and I might think of as the vernacular definition of the term domestic mass surveillance.

11:05

Russ Roberts: Okay. So, let’s now turn to what’s at stake here. And again, we’re taping this in mid-March of 2026; it will come out in about a month or so. By that time, maybe all humans will be eliminated by AI or the Department of War–who knows? So, listeners: Be aware that this is a rare EconTalk conversation that’s fairly timely, and things could change by the time this airs, and keep that in mind as to when it was taped. Recorded.

So, what’s at stake here? You had a very strong reaction to this. There’s a little footnote, by the way, we should just mention. After this disagreement between Anthropic and the Department of War, the Department of War, if I understand correctly, made an agreement with OpenAI with very similar terms without the constraints. Is that correct?

Dean Ball: Yeah. At least there’s an agreement in principle, it seems, for OpenAI models to be used in classified settings that I would say don’t contain the same red-line protections that Anthropic sought from the government, but do contain–OpenAI is essentially hanging its hat on the notion of technical safeguards. So, instead of putting these safeguards into the contract, their view is: ‘We can train a model and build a system, and if we control the deployment of the system to the Department of War, then that system could, for example, reason in real-time about whether or not what it’s being asked to do is domestic mass surveillance and say no to the government.’

Russ Roberts: Okay.

Dean Ball: That would be the idea.

Russ Roberts: Well, we’ll see. So, why is this–you found this alarming, basically: the actions of the Department of War. Why?

Dean Ball: Well, a number of reasons. I think the first is the nature of the punishment. One thing I think that’s worth being clear about is there’s this whole notion of ‘all lawful use.’ I’ve talked to defense procurement and procurement law experts: This is an abnormal notion in contracting. It’s sort of question-begging, in maybe the vernacular as opposed to the literal sense of that term. But, it’s like: ‘Well, what is lawful? What does lawful mean? Who decides?’ And, in this case it’s, ‘Well,’ the Trump Administration saying, ‘We decide what lawful is, and we’ll do it until courts stop us.’ Or someone stops us.

And so, it’s a somewhat strange term of art.

I get the principle. The principle sounds very intuitive. And, I’m actually just willing to concede for the purposes at least of this debate that it’s perfectly reasonable to say, ‘We want all lawful use.’ I actually think it’s kind of complicated and strange to say that, but there’s, like, reasons that most–like, a contract for a missile does not say, ‘You can use the missile for all lawful use.’ That’s not what it says. The Department of War’s position here is they’re pretending like that is what the contracts are like. But it’s really not.

But setting that aside, the bigger issue here for me is the nature of the both threatened and realized punishments that have been doled out on Anthropic.

So, first of all, Secretary of War Pete Hegseth threatened to issue regulations that would make it such that no DOD contractor–or Department of War–contractor could do any business with Anthropic. Which is very different from saying, ‘No Department of War contractor can use Anthropic in the fulfillment of DOW contracts.’ Right? Two very different things. One is profoundly broader than the other.

So, he threatened any commercial relations. And what they actually followed through with, in terms of the regulation that’s been issued so far, is just barring Department of War contractors from using Claude in their fulfillment of Department of War contracts. So, you can still use Claude for other things.

Russ Roberts: That’s the supply chain risk?

Dean Ball: Yes, this is the supply chain risk designation.

Russ Roberts: So, to be clear, Microsoft–in Washington State, in its offices–can use Claude all they want except when they’re working on a particular contract with the Department of Defense–Department of War, excuse me.

Dean Ball: Yes. It is a little bit complicated because the Department of War does–one thing that’s subject to a Department of War contract would be Microsoft Windows.

Russ Roberts: Yeah.

Dean Ball: They buy lots of computers that run Windows. They buy lots of computers that run Microsoft Word.

Russ Roberts: Yeah: it’s kind of gray.

16:36

Dean Ball: Yeah. And, I mean, one way to think about this, too, though, like, even if it is the more narrow definition. Actually, Microsoft is a good example. Let’s say in the 1990s, in the early 1990s, that the Department of Defense had issued a supply-chain risk designation against Microsoft for Microsoft Windows and said, ‘We won’t use it and none of our contractors can use it in their fulfillment of Department of Defense contracts.’ One wonders, would Microsoft be the sort of world-bestriding company that it is today? I don’t know.

So, we are talking about something–even in this narrower usage of the regulatory authority–we’re talking about a government intervention in a critical emerging technology that has the potential to really radically reshape the trajectory of this industry, and one company within it.

17:28

Russ Roberts: And, as background–I don’t really want to go into this because it’s not that interesting–but it should be mentioned that people have speculated that Anthropic has an allegedly more safety-oriented culture in its development of AI, and possibly a training process that people have said is more–I hate to use the word–‘woke’ than those of the other AI companies, and that there’s something else going on here behind the scenes that has nothing to do with red lines. And I just–you can comment on that if you want. But we should just mention that.

Dean Ball: Well, yeah. No, I think that is worth mentioning. I’ll just say, stepping back a little, this supply-chain-risk designation is only used–typically is only used–against companies from foreign adversaries. This is about adversary manipulation of American military systems.

Russ Roberts: Yeah.

Dean Ball: So, it’s really treating Anthropic like enemies of the state, essentially.

Russ Roberts: Yeah. The broader designation, which would have been that any company that does anything with the Department of War can’t use it at all anywhere, would be kind of like a terrorist organization. Or, as you say, a foreign enemy that you would say we’re embargoing or we’re putting some kind of sanctions on.

Dean Ball: It would have been the equivalent of sanctions. And, one other thing that I think is worth noting here is that this is clearly Act I, Scene I.

Russ Roberts: Yeah.

Dean Ball: If the Administration decides that they want to bring the entire federal regulatory apparatus to bear against Anthropic, I imagine they will.

And, I also think, by the way, this doesn’t have to be restricted to formal, legible regulatory action. This can be jawboning. In fact, Anthropic has alleged in their complaint against the government, they have alleged already that the government is calling Anthropic customers–government officials are calling Anthropic customers–and encouraging them to cease doing business with Anthropic. So, it’s jawboning–that is, soft–and it’s very hard to sue about.

So, all this is essentially–like, if I were to summarize it in just a sentence, I would say the government is saying here that if you don’t do business on the terms we unilaterally set, we’ll set out to destroy your company. Which is a kind of usurpation of private property.

And even more, to your point, Russ, about some of the political–basically, every time senior Trump Administration officials have invoked Anthropic and talked about the supply-chain-risk designation, they have inevitably mentioned that Anthropic is liberal. That they’re supposedly woke. I think that’s not exactly true, actually. But, that they’re supposedly woke and they don’t share Trump Administration political values, that part certainly is true. Anthropic is run by people who donate to Democrats. A lot of AI companies are, it’s worth noting.

And if that’s the case, if that really does–then this is also a form of political interference, which would be in addition to private property usurpation, would also be a pretty serious abridgment of First Amendment rights.

Russ Roberts: Yeah. So, I think the question is–you framed it in a particular way. It could be framed a different way. It could be framed as: How can we allow a private company to interfere with the security of the citizens in the United States? The Department of War is responsible for keeping Americans safe, the argument would go. And, if we need to do certain things–we, the government–of course a particular private company shouldn’t be able to dictate the national security scope of the actions of the Department of War. That would be the other side.

21:41

Russ Roberts: We’ll come to that, but before we do, I want to go a little–I’m going to restate and make clear what you just said. You’re basically saying that the Trump Administration has–forget this thing about usurpation, private property, and First Amendment rights. That sounds nice. But then let’s make it starker. Do we really want the federal government punishing and rewarding particular companies for any reason? In this case, it might be political antagonism; that would be particularly horrific. But in general, in a free market, so-called capitalist system, how do you draw the line between private companies and government power? And, that is really what’s at stake here, I think.

Dean Ball: Yes. And, one thing that I think should be really clearly said here is that it’s very hard–and this is not just true of America; it’s true internationally–it’s very hard to do business with large Chinese tech companies. There’s a reason that Chinese companies don’t make the operating systems that define computers all over the world. There are a lot of reasons, but one of them is that everyone knows that Chinese technology companies are assets of the military and are viewed that way by the government. And, that’s not the case in the United States. And, that has aided American companies in doing business abroad because there is a trust.

One of the things I actually used to always say when I was in government to foreign governments, who, maybe they would have some concerns about doing business with America: ‘Oh, you’re an unreliable business partner.’ And, I’d say, ‘Look, yeah, I can’t deny it to you that the government changes every four years here in America and there are these wild swings in different directions, and I can’t deny that to you. But the thing is, is that don’t think of yourself as doing business with the U.S. government. Think of yourself as doing business with Microsoft.’ Which is, like, way more stable and has totally legible incentives.

The problem is that when you do things like this, you are eroding that distinction between public and private, which gives people faith in Microsoft. Microsoft has a higher credit rating than the U.S. government. It gives people faith in the institution of Microsoft that is separate and apart from faith in the institutions of the federal government. And, you erode that and all of a sudden, everything becomes political, and that’s a subsuming mentality that I think is quite toxic.

Russ Roberts: But, equally important–I mean, that’s interesting and it’s not irrelevant–but it seems to me it’s much more important that, as you say, we’re in the very earliest days of this extraordinary technology; and the government is picking winners and losers not based on who has the best technology, but without any particular constraints. Not constitutional constraints. It could be political; I don’t know. Who knows what’s really in the hearts of human beings? But, it could be political. And, if it’s not political, it’s arbitrary. It could corrupt, it could be personal. There are thousands of motivations. And in general, we would want government to not be beholden to those kind of motives and to leave private companies to do what they do best.

Having said that–and I’ll let you respond to that, too, if you want–but this is a unique technology on the surface. On the surface. It is probably going to revolutionize the world. We don’t know for sure. It has certainly revolutionized a few industries already, in the last year. And, we’re kind of worried–many people are–about our ability to keep a lead in this technology relative to our potential enemies abroad. So there’s a national security issue here that works in the opposite direction. Which is: we want–we, Americans–Americans want Anthropic, OpenAI, Google, the three big leaders right now–there may be others coming down the road–to be able to be at the forefront of this. And, if we’re going to punish them by saying, ‘We don’t like you. We don’t like that you didn’t play ball with us. We think this is really important and you didn’t cooperate,’ you’re going to hamper the competition that’s producing this extraordinary set of technologies.

Dean Ball: Well, first of all, I think it’s worth noting, yes, there’s a picking of winners and losers here; and it is explicitly not merit-based because Secretary Hegseth has said that: ‘The reason we use Claude’–I’m paraphrasing him here, but–‘it’s because it’s the best.’ And, ‘the reason that this is so important to us’–the reason that this fight is so important to them–he said, ‘is because it’s the best.’ And yet at the same time, his regulatory actions are trying to drive the company–at least hurt them, if not drive them out of business.

And yeah: it’s also worth observing here that this is an incredibly capital-intensive industry, and all of this regulatory risk is making it much harder for Anthropic in particular, and probably the industry in general, to raise the capital that they need. And so, yeah, you are diminishing America’s ability to maintain its lead in this technology right at a critical time.

And, not to mention the fact that, by all accounts, Claude is exceptionally useful already in its still relatively nascent forms. It’s already exceptionally useful for certain kinds of military operations. So, I think it’s unambiguous to say that if Claude disappeared from military systems tomorrow, it would be a–American national security would be weaker.

27:51

Russ Roberts: So, what’s the other side of this argument? Can you steel me [i.e., reconstruct the opponent’s argument into its strongest, most persuasive form before challenging it–Econlib Ed.] on the other side? The people who think that Anthropic was out of line. So, here’s the other side–I’m not going to give the argument. I’ll let you give the argument because you know it better than I do: ‘Anthropic is out of line here. This is a national security issue. They should have deferred to this application. They should have said, to this contractual demand, they should have said: Of course you can use it for anything that’s legal. And we have our own feelings about surveillance and autonomous weapons, but we have to trust our government to do what’s legal. So, as long as it’s legal, sure, go ahead.’ And, how dare they? How dare they hamstring the national security interests of the United States because they have a different view of what’s legal, perhaps? What’s the argument there?

Dean Ball: I think the argument is that, yeah, that, like, this–Anthropic is essentially using its private power to set what amounts to public policy unilaterally. And, there is some truth to that–

Russ Roberts: Yeah–

Dean Ball: I think. I don’t think that’s crazy. And, my own view is that: Look, on one level, we look at this now and it feels really restrictive. At the same time, the government purchases software, including software that’s used in really important critical applications, purchases software on commercial terms all the time. And, commercial terms of service are, like, the same ones that you purchase it under–right?–basically. And so, commercial terms of service often have usage restrictions. Government software contracts have all kinds of usage restrictions.

Russ Roberts: If you don’t like it, don’t buy it. That would be the argument. When I complain about some usage restriction on some product–that you can’t take the back off, you void your warranty, whatever it is–they just say, ‘Well, if you don’t like that, don’t buy it. Buy something else.’

Dean Ball: Yes. Yeah. Right. And, AI is in fact a competitive market. It’s true that Anthropic is the only model on classified systems right now, but that’s not a fact of physics. Right? That can change.

And so–but–I think, to make their argument for them, I think it would be: No, it doesn’t matter about competition. A private party can’t do public policy through contracting.

Russ Roberts: Yeah.

Dean Ball: And, it’s just that simple. And also, there are some allegations that the government has made that Anthropic has done things, like threaten to remove Claude. Like, basically, to pull Claude’s services during active military operations if Anthropic doesn’t like what the government is doing.

I must be honest with you that I have some real questions about the veracity of those claims, but at the end of the day–because I will say, it doesn’t sound like a thing that you would say to the government. It doesn’t sound true. But, it’s what the government claims. I’ll be interested to see if they claim these things under oath.

Russ Roberts: Yeah, we’ll see.

Dean Ball: That’s the ultimate thing: Do the DOJ [Department of Justice] lawyers claim it under oath?

31:13

Russ Roberts: So, what’s fascinating about this–it could be merely: In a different world the Department of War would be using Claude to–as we were saying at the beginning–maybe streamline their HR [Human Resources]. To make their back office work a little more efficiently. And, this could have come up–they could be unhappy about the way that works and they could have complained, and they could have tried to redo their contract, they could have threatened them. There’s a lot of things government can do if they want. And, we’ll talk in a minute about the other constraints besides what they want.

But, this is a very complicated piece of technology because it does have important military applications. And, it has an immense number of non-military applications. Some people have likened it to a nuclear weapon. They’ve said, ‘If a private company developed a nuclear weapon and sold it to the government because it was better than the nuclear weapon the government had’–sort of an absurd, but useful story I think–certainly, they would not be free to withhold the weapon’s warhead because the company felt that the casus belli–whatever it was, the cause of war–that was generating the use of the weapon, they didn’t agree with it. And that’s a dramatic way to make your point about a private company doing public policy.

So, is that a legitimate analogy in this situation?

Dean Ball: Well, I think the contractual analogy actually is fair. And, in fact, you can imagine even a version of–you could imagine Anthropic having a contractual term that says, ‘We are only comfortable with our models being used in wars declared by Congress,’ or something.

Russ Roberts: Yeah, exactly.

Dean Ball: And, of course, there’s a long history of America engaging in basically wars that aren’t technically wars.

So, I think the nuclear-weapons-to-AI analogy is actually quite poor for reasons that I would be happy to explain, but that’s not actually your point here. Your point is more about this contractual term. And, I think the government has a very fair point here.

My observation is twofold. You can make that point without trying to destroy Anthropic’s business, Number One.

Number Two, but I think on the Anthropic side of things, you shouldn’t try–if these protections matter so much to the leadership of Anthropic, if they matter so much that they’re willing to call these red lines against a government that is threatening to basically destroy their business, I think if they’re that important, then you should have just said, ‘We’re not selling you anything until there’s a law.’ And, they should have said that in 2024. In fact, if they were in such cahoots with the Biden Administration and the Democrats, they should have said it in the summer of 2024. They should have said, ‘No, we’re not going to do this until Congress passes a law about domestic surveillance and autonomous lethal weapons; and we want those protections written in statute.’

34:38

Russ Roberts: I just want to make an observation here: I don’t know how important it is, but the United States is kind of weird about this generally. It’s weird in healthcare. In healthcare, we have people, they sometimes claim we have a free market system in healthcare. And what they mean by that is you can be a doctor if you want and have a private practice.

We don’t have a free market system in healthcare. We have an incredibly government-tampering role in a healthcare market that is not anything like a free market. There’s control of the number of doctors through certification of medical schools, accreditation of medical schools, licensing of physicians. There’s incredible subsidies through Medicare and Medicaid that basically determine what the prices are: they’re not free market prices.

So, people get confused because the U.S. system is very different. Because of our culture and our heritage as a sort of free-market country, we allow certain private activities to take place that give the illusion of a private market when it’s not one at all. As opposed to, say, the National Health Service in Great Britain or the Canadian healthcare system where doctors generally are employees of the government.

Now, we do the same thing in defense. Right? We have private defense. We have public government defense activity, like the Los Alamos Project. That was not a private company taking venture capital money to develop a nuclear weapon to fight World War II. That was a government project.

But, there are many, many, many private companies that develop things for the government. They’re nominally private, but their business is so dominated by federal contracting that they are this weird hybrid, like the healthcare market.

So, a company like Boeing or McDonnell Douglas, they are private. They have private employees; they're not federal employees. But they have this weird relationship with the federal government. They are dependent on federal contracting in a way that makes them effectively a nationalized industry, though nominally a private one.

So, here we have this technology that is not a military technology on the surface: it’s a general technology. But, it has this very strong and powerful military potential. And so, what we’re seeing to some extent is the unusual nature of a company that is clearly private, but has a very important role to play in public sector activity–in particular national security. And, if it were only good for that, I think we’d be having a very different conversation. Part of the complication of this is: It’s good for seemingly everything.

Dean Ball: So, your question gets, I think, to one of the most interesting dynamics that we're going to face in the next decade, two decades, maybe more. Which is: What is the relationship between this thing we know of today as the frontier lab–the AI companies–and the federal government?

And, it's an incredibly complicated question because, Number One, there are national security implications, right? These technologies can be used for object-level dangerous things, right? They can be used to engage in autonomous cyber attacks. So, in other words, I don't need to have a military arsenal or an intelligence-gathering apparatus to make use of these models. Anyone can launch a cyber attack. So, there are these things.

There are people who talk about things like bioweapons and whatnot. There are all sorts of potentially catastrophic, dangerous misuses–malicious uses–of the technology. Obviously, there is a government role in the mitigation of those things. Well, maybe not obviously, but I think that there's some government role in the mitigation of those things.

But, it’s also an incredibly useful technology for national security, like, for government, for militaries specifically and uniquely.

And then, it’s also a technology that I think will be a profound part of how all of us exercise our individual liberty and express ourselves in the future. And even today. It will be hugely important, a sort of foundational tool in the acquisition of knowledge, which is a First Amendment right in and of itself. But also, the self-expression for many people, I think.

And then, on top of all that, I think that we’re dealing with a technology that, like the printing press, may well be so foundational to the capability of organizations and institutions that it actually changes sort of the institutional complex that defines the technocratic nation state. Such that what we currently think of as the government will actually change in important ways. And so, in that sense, you might think that the technology the frontier labs are developing is in some ways a challenge to the institutional status quo in which technocratic regulators are in charge of large swaths of the economy, basically. That that in and of itself might be challenged in various ways.

And so, it's all of these things all at the same time. So, I can't say that I know exactly what the answers are going to be here, because indeed I approach these issues with a classical liberal frame. But, I am also aware that the very notion of classical liberalism–some people would argue it's already anachronistic; and certainly you could say that if you think about the future, maybe all of our political concepts–all of our political-theoretic concepts–are going to be somewhat outdated. Because something new–some new type of institutional complex beyond the technocratic nation state–is going to emerge. And so, new sorts of political relationships will undergird that.

And so, I think classical liberalism is a good starting point and all I can say is I changed my career from what I was doing before to be writing about this, because basically, this question in particular is one that I find infinitely fascinating and extremely important. And, I don’t have all the answers. I don’t have anything like all the answers. But I do think that this is going to keep coming back to us I think many times.

42:09

Russ Roberts: No, I think the point your essay highlights: Government regulation historically is about either restraining the power of the private sector, or enhancing it artificially through what economists call rent seeking–if you want to ascribe a less charitable motive to government regulation. These two things are not mutually exclusive: there's often a little of both in much of what government does. But, that's the way it works. There's a political process; government regulates some things, restricts some things. Sometimes that benefits the public at large; sometimes it benefits individual players on the corporate side–that's a better way to say it.

And, we’re in a brand new, brave new world right now where the idea of what ideal regulation is and what is the right role for the federal government in this nascent industry is unclear. Like you, I start with the classical liberal framework, but it’s not exactly clear how to apply it here. And you can hear that in some of our conversation so far in our back-and-forth, which is: what does it mean exactly? It’s an unusual–it’s not the printing press. It’s not electricity. It’s not the steam engine. It is something that might underlie a total transformation of work and play. In which case, government probably isn’t prepared for that. I know most of us aren’t, either.

And so, the question of what the appropriate role for government should be in this brave new world calls for a very crucial conversation; and what I hear from you is that you want to be a part of that conversation. And I applaud you for it.

And, the other thing I hear from you is that the heavy-handed approach that the Department of War has taken in this early development of what is the appropriate relationship between the federal government and what is right now the private sector does not seem ideal or consistent with traditional American values of private property, freedom of expression–and I would also say responsibility and incentives. And, whatever restrains this technology, it probably shouldn't be the whims of a particular person in the Department of War. That's the way I would put it.

Dean Ball: Yes. I think that's right. And, the thing here that's hard, I think, is–you know, there's this notion of aligned super-intelligence. That we're going to make something that is smarter–vastly smarter–than the best human experts at everything–right?–at every cognitive task. And, I don't know if that's actually what we're going to build exactly; I don't know if that's quite the right way of thinking about it. Yeah.

But, grant for a moment that, like, it will be of foundational importance to everything that an organization like the Department of War does, or a very large number of the activities that they engage in. And also, that it may be capable–in fact, definitionally, in order to be what it is described as or what the companies are trying to build, it will have to be able to act in the world on its own. It's not a pure legal agent that does whatever you say. It will have to be able to make decisions. Again, anthropomorphizing language is complicated here, but we're taking our hands off the wheel to a certain extent.

And so, I guess what I would say is imagine a world in which we build something that is smarter than all the employees of the Department of War; and when we ask, 'What is domestic mass surveillance? What will it do and what will it not do?' the answer is, 'Well, the machine will decide.' That's obviously a caricatured world. I don't think it will be that simple. But, that element of the machine deciding–truly deciding something–is probably something that a lot of people have not emotionally and intellectually factored into their models of the future, and probably ought to.

Russ Roberts: Yeah.

Dean Ball: At this point.

46:47

Russ Roberts: I’ll just say one thing about that and then I want to segue into the deeper questions that you raised at the beginning and end of your piece.

Russ Roberts: That statement, 'It'll be smarter than any employee of the Department of War,' is a somewhat misleading statement, because many of the things we care deeply about are not a question of cognition. And, I know that's not fashionable to say, so let me try to make clear what I mean.

I can imagine the Secretary of the Department of War, late at night, frustrated that this company has failed to do what he wants, turns to Claude and says, 'You know, Claude, this really annoys me. What can I do to get my way? How can I get Anthropic to bend to my will?' And, Claude dutifully would say, perhaps, 'Oh, well, you could threaten them with the supply-chain-risk designation. You could do even more than that: you could make them essentially corporation non grata with anybody who deals with the Department of Defense.' And, it could come up with some things that the Secretary can't think of. And that's the sense in which its cognition is spectacularly great.

But what it cannot do, and I believe will never be able to do–and I even think it’s meaningless to say it this way: It will never be able to give the Secretary of the Department of War advice on whether it’s the right thing to do. It’s not a meaningful question. There’s no answer to that question. It’s not a question of coding, it’s not a question of how many calculations you make per second. It’s not even a question of how many philosophers you’ve read in the history of your life. It’s not that kind of question.

And people, I think, assume that all questions will ultimately be questions you can answer, and I believe that is not true. I believe there are no solutions, only trade-offs. And once you’re in the world of trade-offs, that’s not something a machine can decide. It can try, it can give us some sort of utilitarian calculation–if you’re a utilitarian; I’m not.

So, this idea that in theory, we would–so, I think the risk–one of the biggest risks–of AI is people thinking it's good at answering the wrong kind of question and using it anyway. You can still use it. It will give you an answer. If you ask it, 'Should I do this?' it will–unless it's been trained to say no–probably give you advice about whether you should do it. I've already done that with some of my strategic decision-making here at the college. I've asked its opinion; I've asked it why it thinks that, why it justifies that. But, that's an illusion; and I don't worry about it making the wrong decision. I worry about people assuming that whatever it says is the right decision and giving it questions it is not capable of answering.

Dean Ball: I agree with you in part and disagree in other areas. So, the other day, actually, I was using GPT [ChatGPT, Chat Generative Pre-trained Transformer] 5.4, the newest model from OpenAI, and I was asking it about a very complex, private issue–related to some of the things we're talking about in some ways–a very complex interpersonal and professional thing I'm dealing with. I was, like, 'Okay, here's what I'm thinking about saying in this situation. What do you think?' And, it responded to me and it actually said what I should have said. It was, like, 'No, you shouldn't say that; you should say this.' And, I was, like, 'Wow'–because it knows enough about me to know what I want to sound like.

Russ Roberts: Yeah.

Dean Ball: It knows what I sound like at my best, in some sense. And so, what I do think though–so I'm not sure that I agree with you that it won't be able to reason about trade-offs and moral and ethical things. In fact, I'd be willing to bet you: if I had a moral and ethical question for Secretary Hegseth versus Claude Opus 4.6, I bet you nine times out of ten–maybe more–I would prefer Claude's answer.

Russ Roberts: No comment. Go ahead, carry on–

Dean Ball: But, that’s interesting–

Russ Roberts: Other than to say that probably tells you more about what you think of Pete than what you think of Claude. But go ahead.

Dean Ball: Right, right, right. Well, that’s interesting because that’s not true of you, Russ.

Russ Roberts: Maybe.

Dean Ball: I don’t think so, I don’t think so. I bet you sometimes I like Claude more than what you would say, but I bet you not every time.

Russ Roberts: Yeah.

Dean Ball: And so, what I do think is that, a). I agree with you that there’s a risk to just assuming the AI is right about everything because it’s actually not, especially in things like this.

But also, I think where the human touch is going to remain is on these things that are definitionally based on relationships–based on things like trust, and integrity, and charisma, and persuasion; and politics, to some extent. The notion of automating politics doesn't really make sense to me.

Russ Roberts: No.

Dean Ball: That seems like a category error. And, the reason for that is not that AI can't give a better speech–I think AI can probably perform many of the speech acts of politics better than the best. And, I'm willing to submit that one day it'll be better even at strategy and stuff. Better at strategy than Otto von Bismarck. Better at rhetoric than Abraham Lincoln–better at writing rhetoric, at least, than Abraham Lincoln. But, there's this issue of, like, politics is an inherently relational act. And, that seems much harder to automate. And so that's my guess as to where we're going. That's where I think the human touch is going to be. That's a super-different world than the one we currently live in, and I don't think the U.S. education system–maybe yours, but not the U.S. system–is preparing students to live in that world. That's a very different world than the one we're used to.

Russ Roberts: Yeah, fair enough.

53:12

Russ Roberts: I want to close–and I maybe should have opened with this. I hope listeners have found this interesting. I have. This to me, what we’re going to talk about next, is in some ways the most interesting part of your piece. It’s also the least specific, so I’ve saved it for last.

And, you start your piece–this piece “Clawed” with an A-W-E-D at the end–you start the piece with a discussion of your father. Talk about why you did that and why that’s relevant for this moment in American history.

Dean Ball: So, I have come to a quite biological conception of institutions. I think institutions are made up of human beings and I think that nature is filled with fractals. And so, I think that while institutions aren’t exactly like human beings, there are ways of observing and thinking about living things that can also be usefully and productively applied to institutions, both as an analytic matter and for purposes of the poetry of it all. I don’t think there’s that much of a distinction between those two things, actually.

So, I open up the piece basically describing the experience of sitting at my father’s deathbed about 11 years ago. I was 22 years old. I had just started my career. And it was no secret. We were in hospice–it was me, and my mother and a few other family members–and we knew that we were watching my father die. And, I remember reflecting at the time–and I’ve reflected, of course, on that experience many times since–that death is this process, and that in some ways, my father had become sick. He had gotten heart surgery that went wrong six months prior to the date that he died, roughly. It was immediately after that six months, he was a changed man entirely. The life had been sucked out of him. And then, it was just this gradual process of just him becoming less and less there, in fits and starts, not even necessarily, but he would occasionally come back and have some life in him.

And then, in the actual process of watching him die, I realized–I don't know: he seemed dead to me well before the machine declared him dead. And so, the machine making this declaration that his heart had stopped–or that the faint signal it was getting from the heart had crossed a point of faintness at which the machine made some arbitrary decision, basically, that he had officially passed over–that is just, I think, one way of looking at where he was in the process of death.

And so, I was reflecting on that and reflecting on: Why is this experience of writing about Anthropic and the Department of War so emotional for me? Why is it so frustrating? Why do I feel such a deep melancholy about it? And, what I realized is that it is because I feel as though I've watched–throughout my lifetime, for 20 years–a lot of these bedrock principles of our Republic get eroded in thing after thing. It's been the same sort of corrosiveness, but sequentially worse every year, it feels like. And, I suddenly realized–it clicked for me–that that process feels very much like death. I don't know what death feels like, but it felt very much like the experience of watching my father die.

And also, the fact that, like, I think about this a lot privately, but I don’t talk about it that much. And the reason I don’t talk about it is that it feels quite painful to talk about. When my father was going through his six months of dying, we talked about his health a lot. But we didn’t talk about, sort of, the certainty of his death that much, and where he was in the process, and all these kinds of things. Because it was too painful and we knew the answer. The answer, we all knew.

And so, yeah, that’s why I started. I will say I wrote that piece in about two hours, so it just kind of came out of me.

58:15

Russ Roberts: Well, the reason I think it's so profound–I'm older than you; I've been watching for longer than you have. And, it's been clear to me for a while–and listeners know this because this show is 20 years old as of next week. And, over those 20 years, listeners can hear my optimism about the American experiment and then sometimes my pessimism. There were times I said, 'We're near a civil war: America is near a civil war.'

And, five years ago, I moved to Israel and I found myself watching America from afar. And it changed my perspective. It allowed me to be a little more of an observer and less of a participant in some dimension. Still an American citizen.

And, I’ve thought for a long time now, ‘Something is wrong.’ In fact, something’s wrong in the West. It’s not an American problem: it’s a Western problem. And, what your piece made me realize is that it’s possible that this problem is not going to get better. That’s what’s hard to face. That’s the melancholy for me. And, I think there’s a tremendous blindness among some Americans that this is a Trump problem–

Russ Roberts: Trump is just the manifestation–the latest manifestation–of a very, very long trend. You could argue it's 80 or 90 years old, going back to Roosevelt. You could argue it goes back 60 years to Lyndon Johnson. But, what is that trend? The trend is the end of the Constitution as an effective constraint on government power. The rise of discretionary action. The destruction of norms: things that were once off limits are no longer off limits; those norms are gone.

And, as a result, it’s much more: What’s expedient? It’s not: What’s constitutional? It’s not: What’s principled? It’s: What can I get away with? And, you could argue that the Department of War threatening a particular company is not that important, it’s just a petty dispute between egotistical players about their own success and failure.

But, what I thought you struck at deeply–and maybe we’re overreacting here but I think not–is that you don’t know what you got until it’s gone.

And, we thought we had a Republic. There's this very famous line from the Constitutional Convention in 1787 where someone asks–I'm going to get this wrong so forgive me. You guys will all fix it for me. But, I think somebody asked Benjamin Franklin: 'What kind of government do we have?' And he responds, 'A republic, if you can keep it.' And, America kept it for a very, very, very long time. It's had a tremendous run.

But, the increase in executive power unconstrained by the Constitution, unconstrained by norms is a long trend. Trump is just the one most comfortable ignoring the things that other people used to not ignore. They’ve all been ignoring it to some extent, the last eight presidents or whatever the number is.

And, I think this whole debate about whether we’re heading toward fascism, I think that’s the wrong way to think about it–

Russ Roberts: I think what we’re talking about here is the slow, inevitable erosion of institutions as we get further and further away from our Founding and from the principles that sustained it. And, now it’s like other places. If you get a good president, it turns out well. If you get a bad one, it doesn’t. It used to be it wasn’t so important. All of a sudden, it’s really important.

And, the reason I think your piece is so insightful is that when you’re in the middle of it, you don’t notice it. It’s like the frog getting boiled. Is it warmer in here? I don’t know, it seems a little warmer. But, after a few decades, it’s like, ‘Boy, this water is boiling hot. It used to be cold.’ And you kind of start to notice.

And what you’ve done, I think, in this piece, even though it’s a small corner–but maybe not–is to point out that the water has been boiling for a while. It keeps getting warmer and warmer. And it’s an illusion to think we can turn it down. It’s just we’re going to live in a new world. And I think you’re right. And it helps me, it’s a very–and I’m sorry about your dad. It’s a very powerful metaphor for thinking about change. Not so much about death, but this just happens to be about death, but for any kind of change–

Russ Roberts: When you’re in the middle of it, it feels like, ‘Well, I don’t know, is it really changing? Maybe it’s just me. Maybe it’s this one example. Maybe it’s this particular Congress that doesn’t want to do, quote, “its job” all of a sudden.’ This goes back to also to things Yuval Levin has said on this program: ‘Everybody’s performing.’ What happened to a world where people did what they’re obligated to do, what they’re responsible for doing? Their duty?

And then you think, ‘Well, we just need a president to come along who is going to do that.’ Do you really think that the next President, Republican or Democrat, is going to be any different?

Russ Roberts: I think it’s just going to be the same thing. So, that’s my rant. Your rant is beautifully said. You can go read your piece. I’d like you to reprise[?] it now if you want, but react to what I just said.

Dean Ball: Yeah. No, I think it’s very well put. In some ways, more precisely than I communicated it. And, I think the way I think about this is you are definitely right that this is about change and not death; because, I also talk about the birth of my son briefly in that piece and how it is similar. And how my experience thus far, quite brief still–it’s only several months of being a father–is that I sort of just am watching my son progressively awaken. He just becomes more and more aware of the world. And, nature is like this. Nature is filled with phase transitions.

There’s a great graphic I saw on social media, on Twitter, the other day of a heart beginning to beat and what that looks like. And, it’s all these cells, these decentralized cells that begin to activate; and then enough of them activate, and all of a sudden you have a heart beating. But, it’s not like there’s ever one moment where it is–and by the way, I think that change from AI will be like this, too. There will be phase transitions. There already have been phase transitions in the progression of AI, and there will be in the adoption as well.

So, very much, yes. And, part of the point I’m making is–like, yeah, I’m not trying to make a point about fascism. I think probably a lot of people on the Left read my piece; and I took pains to say that this wasn’t just about Trump. But I’m sure a lot of people–and I knew this would happen–a lot of people on the Left I think read my piece and in self-satisfied fashion said, ‘Ah, yes, but everything will be solved when we get Gavin Newsom in,’ or whoever–

Russ Roberts: Yeah–

Dean Ball: in a few years. And, that's very much not my view. My view is, like, the most charitable thing I could say about the Left would be that they would be likelier to do all the same stuff in a somewhat more gentlemanly, technocratic fashion than the Trump Administration, which has a tendency to be really explicit and stumble into things like this. But, in some sense, I actually applaud the Trump Administration for that, because at least it's out in the open–

Russ Roberts: Yep–

Dean Ball: At least we can talk about it with the Trump Administration.

And, the one other point I would make is, you know, I spent more time debating whether or not I should publish this piece in the form that I published it than I did writing it. Because there are run-on-the-bank dynamics that you don't want to contribute to with things like this. The reason that republics work is that we all believe in the common fiction of the Republic. And that's always been true–

Russ Roberts: Yeah–

Dean Ball: That's always been true. And, I certainly did get pushback from some people, including people that you and I both respect, about the decision to publish it. And, one of the things that I heard is, like, 'Well, you know, elections are still functioning. Right? Like, we still have elections and the results of them are observed.' My view is that that's moving the goalposts–

Russ Roberts: Oh, 100%–

Dean Ball: Yeah. It’s really easy–

Russ Roberts: It’s better than nothing–

Dean Ball: It’s better than nothing and the thing is, it’s really easy to observe–

Russ Roberts: Yeah–

Dean Ball: It’s really easy to observe. Did I go to my polling place and vote, and did the person who won get into power? And so, it’s very, very hard to erode that particular thing.

And, it's interesting to me that even the Left has chosen to focus so much on this issue of, like, the erosion of democracy per se. Because that has always seemed to me the thing that the Trump Administration or anyone else is least likely to mess with. Because it's so verifiable. And instead–indeed, the Founding Fathers, if you told them that the one thing that persisted was the ability of the masses to vote–

Russ Roberts: Oh, they would be so depressed–

Dean Ball: they’d be appalled!

Russ Roberts: so depressed!

Dean Ball: They’d be, like, ‘That is the worst part of the whole system.’

Russ Roberts: I forget who said it–maybe it was some general bit of humor–but the joke used to be about Mexico, that the same party won every election forever. I forget the name of it. And, the claim was that Mexico had a democracy 364 days a year, and the 365th day–when they didn't have a democracy–was election day, because it was rigged.

But, the rest of the year, political forces did matter, the people did have influence, but not on who won the election. That was rigged.

Dean Ball: Yeah. Because it's the tyranny of the masses. Democracy is just the tyranny of the majority–the idea that there's an omnipotent executive, that we shift wildly between two different omnipotent executives based on a democratic vote: that's not at all what a republic is. So, the fact that elections are being observed–it's cold comfort.

Russ Roberts: Yeah.

Dean Ball: It’s cold comfort.

1:09:19

Russ Roberts: Before October 7th, here in Israel there was a massive, incredibly controversial discussion about the proper role of the Supreme Court here in Israel and its relationship to the Knesset and the ruling coalition. And, what the judicial reform issue was about here was–and it’s interesting, both sides cast themselves as democratic.

The coalition–the Netanyahu reforms–which were going to severely curtail the power of the Supreme Court, they were called democratic because the coalition wins the election. What could be more democratic than that? Which is what we’re talking about.

The defenders of the Supreme Court's power said, 'Democracy requires civil rights. And, if there's no constraint on the power of the majority, there will be nothing left to sustain democracy, because civil rights will disappear.' And, that's the same thing that's going to happen in the United States, I'm going to predict; and I'll let you react to that and take us home.

There's been an enormous increase in power at the Executive Branch in the United States. The Legislative Branch is neutered, spayed–pick your verb. They've self-neutered: they've neutered themselves. And, the only thing that stands in the way of executive power is the Court. It's a weird thing because the court is appointed by the President but confirmed by the Senate, so it's tricky. But, we've already seen attempts by the Trump Administration to do things that some people would say are overreach in terms of power–I'll pick tariffs as the obvious example, along with the example that we're talking about right now–and the courts have been very willing to try to restrain that executive power.

So, I'm going to predict that that's going to intensify over the next few years; and I would be shocked if the courts did not rule in favor of Anthropic in this case, simply because they see themselves–and this was true in Israel, too, whether they're right or not–as a bulwark against that executive discretion and that unconstrained power. Now, when an executive gets into place that the court happens to like, it's going to be an even more complicated situation–and to some extent, well, the United States is more complicated than that. But, I think we're going to see in the West generally fights between the Courts and the Executive Branch as to what democracy is going to actually look like in the coming years.

Dean Ball: Yes. I think the one functioning branch remains the Courts and so they are this one lasting check on the unfettered power of the Executive. And, that exists in a real tension because the Courts can only do so much. At the end of the day, who enforces the Courts’ decisions? It’s the Executive.

And, once you start asking that question–

Russ Roberts: Yeah–

Dean Ball: that’s sort of my point–once you start asking that question, you’re in the law of the jungle at that point.

Russ Roberts: Sure.

Dean Ball: And so, I’m hopeful. Part of the reason that I’m a very close observer of the Courts on a wide variety of different issues, far beyond just AI and tech-related issues, is because I like to observe this chess match in detail.

One thing that maybe is a note of optimism that I can give is this: if you think about the Courts as the last umpire enforcing the rules of the game as written down–the laws that are written down–well, then if you are a smart long-range actor who wants to win in court, it's incentive-compatible for you to act as if those rules of the game actually do govern your actions. Because then, when you go to court, you will have a better case to make.

I’m a big fan of a book called Homo Ludens: Man at Play by a guy named Johan Huizinga. It’s an old book, but it’s a great book. And, it makes this point that you should model the institutions of classical liberalism as a kind of grand game. As long as there’s one institution that enforces the rules of the game, then maybe it’s incentive-compatible for the actors to remain in it. But, the problem is that the courts’ authority gets eroded, and it’s not always clear–even today, it’s not always clear–that court rulings get observed. Biden had this problem, too: Biden ignored aspects of court rulings, and so does Trump. And so, even that is starting to break down a little bit; and we could get into court packing. There are all kinds of things.

Russ Roberts: Sure. Expanding the size of the Supreme Court. That’s why I said you can go back 80 years if you want to–90 years–to think about this tension.

Dean Ball: Yeah. So, I’m very grateful that the Courts exist, but in the end–and this gets into this locus-of-control question, to bring us back to the middle of the conversation about where the proper locus of control is and how we should be thinking of AI as this kind of new institutional technology. Well, one of the problems I have is that I’m trying to analyze this and think about the appropriate locus of control at a moment when I’m also just candidly acknowledging that our republic is not in very good health. And so, there’s a certain extent to which I have trouble trusting the unfettered executive to be the governing institution over AI. I have a lot of trouble with that in a way that maybe I wouldn’t have if this were 1923–

Russ Roberts: Yeah–

Dean Ball: Or if Calvin Coolidge were President or something: maybe we would be in a very different world.

But, we’re in the world that we’re in. So, I think that should affect your–well, it affects my view of the accumulation of private power versus the accumulation of public power, because the thing about private corporations is that they don’t have the monopoly on legitimate violence.

And so, maybe we build new checks and balances in this way somehow. But, I think whatever we’re doing, I suspect that we are in a new Founding moment–which is not novel for this country, but certainly we’re in uncharted territory.

Russ Roberts: My guest today has been Dean Ball. Dean, thanks for being part of EconTalk.

Dean Ball: Thank you, Russ.


