A Net Benefit to Society?
Let’s start with the big picture questions. What are the social benefits of artificial intelligence?
Here’s a thought exercise. There are roughly 2 million commercial truckers in the United States, making $120,000 a year on average (and there are plenty of other examples you could give). Suppose you could push a button and eliminate all of them: save fuel, save lives, save time, a more efficient system, less congested highways, all that beautiful stuff. Would you do it, if it put 2 million people on the street where, even if jobs are available, the next job pays $25,000 a year stocking shelves? That strikes me as really bad. Should we as a society agree to that? I don’t think so. Business and government should start thinking today, not when it happens, about what we would do to deal with the [AI] issue. It’s got to be business and government.
Equally controversial is the recent standoff between the U.S. Department of War and Anthropic. Anthropic provides Claude to the DoW subject to two conditions. First, it cannot be used for surveillance of U.S. citizens within the U.S. Second, autonomous targeting systems must have a human safeguard before an “attack” decision is made. Secretary of War Pete Hegseth has threatened Anthropic and wants the safeguards removed. Anthropic published a statement late last week rejecting the Pentagon’s demands.
Show Me the Money!
The other concern among AI skeptics such as Michael Burry of “The Big Short” fame is where the profits will come from.
Michael Burry recently asked, “When does all this data centre buildout actually end?” He argues that hyperscalers are using accounting tricks to boost their earnings. Specifically, GPU chips are depreciated on the income statement over five years, but NVIDIA produces a new generation of chips about every three years that makes the previous generation obsolete. Already, there are reports that H100 chips, which cost $40,000 new, are selling for $6,000 on eBay. The gap between the accounting depreciation rate and the economic life of GPU chips will have to be reconciled by hyperscalers restating their earnings at some point in the future. This would present an enormous shock to market expectations.
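The arithmetic behind Burry’s complaint can be sketched with the article’s own H100 prices. This is an illustrative straight-line calculation, not actual hyperscaler figures; the five-year and three-year lives and the salvage value are the assumptions stated above.

```python
# Illustrative sketch: annual depreciation expense booked under a 5-year
# straight-line schedule vs. the expense implied by a ~3-year economic life.
# Prices are the H100 figures cited in the article; all else is assumed.
purchase_price = 40_000   # new H100 price, USD
resale_value   = 6_000    # reported used H100 price on eBay, USD (salvage)

book_life_years     = 5   # accounting assumption
economic_life_years = 3   # roughly NVIDIA's generation cadence

# Straight-line depreciation down to the salvage value.
book_expense     = (purchase_price - resale_value) / book_life_years
economic_expense = (purchase_price - resale_value) / economic_life_years

# The gap is the per-chip annual earnings overstatement while both run.
overstatement = economic_expense - book_expense

print(f"Booked expense:   ${book_expense:,.0f}/yr")      # $6,800/yr
print(f"Economic expense: ${economic_expense:,.0f}/yr")  # $11,333/yr
print(f"Earnings overstated by ${overstatement:,.0f} per chip per year")
```

On these assumptions, each chip’s expense is understated by roughly $4,500 a year; multiplied across millions of GPUs, that is the reconciliation Burry expects.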
The enormous scale of hyperscaler capex is changing their business model from an asset-light and intellectual property-heavy model to a more conventional asset-heavy bricks-and-mortar model of old economy companies. Burry argues that such a shift in capex will strain their finances and load their balance sheets with debt. Already, the market is starting to price the increased default risk of technology companies.
Jitters are already rising in the private credit market. JPMorgan CEO Jamie Dimon recently said that some of his rivals are doing “dumb things” in their lending practices.
AI and Productivity
Artificial intelligence boosters have trumpeted the potential productivity gains from AI adoption. A recent NBER working paper studied the effects of AI. Researchers surveyed almost 6,000 CEOs, CFOs and other executives from firms that responded to various business outlook surveys in the U.S., U.K., Germany and Australia; the vast majority see little impact from AI on their operations. Here is the abstract:
We present the first representative international data on firm-level AI use. We survey almost 6,000 CFOs, CEOs and executives from stratified firm samples across the US, UK, Germany and Australia. We find four key facts. First, around 70% of firms actively use AI, particularly younger, more productive firms. Second, while over two thirds of top executives regularly use AI, their average use is only 1.5 hours a week, with one quarter reporting no AI use. Third, firms report little impact of AI over the last 3 years, with over 80% of firms reporting no impact on either employment or productivity. Fourth, firms predict sizable impacts over the next 3 years, forecasting AI will boost productivity by 1.4%, increase output by 0.8% and cut employment by 0.7%. We also survey individual employees who predict a 0.5% increase in employment in the next 3 years as a result of AI. This contrast implies a sizable gap in expectations, with senior executives predicting reductions in employment from AI and employees predicting net job creation.
Fed Governor Lisa Cook addressed this issue in a recent speech:
In anticipation of future productivity gains, we already see soaring AI-related business investment in data centers and chips, despite interest rates broadly being elevated relative to levels over the past 20 years. With investment contributing to strong aggregate demand, it is possible that the current neutral rate is higher than before the pandemic. This could reverse when the AI productivity gains are more fully realized or if the labor market transition leads to a rise in income inequality, such that well-off consumers receive a larger share of income, which could lower the neutral rate, all else equal.
A Middle Ground
So where does that leave us? The Citrini report postulates a scenario where AI wipes out millions of jobs, while AI boosters assume widespread productivity gains without disruption.
Some forces are very hard to resist, especially if they involve lower costs (aka increased productivity). Take, for example, the agricultural changes of the last 200 years, during which the number of people on farms fell dramatically. Aside from the Amish and others like them, we don’t harvest wheat by hand. How can a farmer compete without modern technology? There are niche markets for sure, but in the aggregate what sells at the lowest price sells the most. That’s why most hamburger meat is not prepared at your local corner store any more. So businesses that don’t employ AI may not compete effectively and may go out of business. Is the military going to say, “Yeah, we won’t use AI; we’ll leave that to the Russians or Chinese”? Not going to happen.
There was a story I read in French class, 60 years ago. It could have been by de Maupassant (I may have the name wrong), but the gist of it was a miller decrying the use of motorized mills to grind wheat instead of the windmills powered by the Mistral. Well, windmills are in paintings now, but they are not grinding wheat.
Whether we like it or not, AI is happening; use whatever analogy you wish, cats and bags, genies and bottles.
At the individual level, AI will do to us what TV did. We used to spend more time outside and do things differently before TV; now we have couch potatoes. So it will be with AI: some will use it to be more productive and learn, while others will have it do everything and never learn basic math.
Is the world going to a better place? I dunno, but wherever it’s going, it’s going.
Up till now, tech innovation has meant new, sophisticated tools for us to control. But AI is both the tool AND its own controller. Assuming AI and robots can fully replicate or surpass our intelligence and motor skills, humans won’t be needed. Even politicians won’t be needed and could be replaced by blockchain and AI, though they’ll probably scaremonger, using Skynet as an example, to stay relevant.
I’ll make an analogy with flowers. You can make artificial flowers that look realistic, but they are not real flowers. When we talk about intelligence, we should include emotional intelligence. I suspect that AI could successfully pass the Turing test, but really it is just one big data/algorithmic compiler. We may not have to work like we used to, lose resilience, not be able to tie our shoes, who knows? But the day that robots have emotions is far off, in my opinion. The reason C3PO and R2D2 were cute was that they reflected human emotions. I suppose AI may mimic emotions like a mockingbird. Fear is something we “meat-based” intelligences learned over, say, a billion years in order to survive. It’s going to take a while to teach AI about emotions. Will some try? Of course they will… Pandora’s box.
Timeless story.
AI can intuit real-world physics from videos alone, even if imperfectly. It’s not a stretch for it to intuit our emotional frameworks given enough data about our preferences in the arts, films, video games, etc. My personal opinion is that AI isn’t there yet and needs a few more innovations before it reaches AGI/ASI, namely in working, episodic, and other forms of memory.
We in the investment and stock community love to talk about trends. Just as chip development tracked Moore’s law for decades, even though it took significant innovations and breakthroughs to sustain the trend, many today may risk underestimating the amount of innovation the industry could produce to sustain the rate of growth in AI capabilities.