One of the core assumptions underpinning the artificial intelligence (AI) boom is that each new generation of AI model will require ever-increasing computational horsepower to train and run. DeepSeek, the Chinese AI company whose model performed well while using a fraction of the computational resources of top-tier AI models, raised serious questions about the future of the AI industry.
There are other signs, as well, that more computing power may not be the answer. OpenAI's GPT-4.5, an enormous AI model that's not much of an upgrade over the company's previous models, is so expensive to use that viable real-world use cases are likely few and far between.
AI models are indeed becoming more powerful, but they're also becoming more wrong. The New York Times reported this week that the newest "reasoning" AI models are producing incorrect information more often than older models, and no one seems to know why.
AI models that are increasingly wrong and expensive to use may not mesh well with a recession, which is a very real possibility as U.S. tariff policy disrupts global trade. International Business Machines (NYSE: IBM) is betting big on AI, making it one of the core pillars of its overall strategy. However, unlike other tech giants, the company isn't participating in the AI arms race to train the most capable model.
Instead, IBM is focusing on small, compact, and efficient AI models that are inexpensive to run and can be fine-tuned to perform specific tasks well. With its upcoming Granite 4.0 family of AI models, IBM is taking that focus a step further and positioning its AI business to thrive in a potentially difficult economic environment.
Knocking down memory requirements
IBM's AI business spans its consulting, software, and hardware segments. The watsonx AI platform supports a variety of third-party AI models but also supports IBM's own Granite family of AI models.
These Granite models are aimed squarely at enterprise customers that need cost-effective AI solutions that also perform well in safety benchmarks. Every AI model can be prodded to produce harmful content, but IBM's Granite models tend to outperform the competition on that front.
IBM's third-generation Granite models are much smaller than top-tier models from OpenAI and other AI companies, but they still require substantial computing resources. In typical usage, the smallest Granite 3.3 model requires between 28 GB and 84 GB of graphics processing unit (GPU) memory for AI inference, depending on the number of concurrent requests. Those memory requirements put Granite 3.3 firmly in pricey data center GPU territory.
With Granite 4.0, which is currently in preview, IBM is positioning its smallest model to run on inexpensive consumer-grade hardware. The Granite 4.0 Tiny model isn't fully trained yet, but it's already outperforming its predecessor while requiring 72% less memory.
Granite 4.0 Tiny can run on as little as 12 GB of GPU memory, which is about what you'll find in newer PC GPUs that start at a few hundred dollars. IBM accomplishes this feat by switching to a new hybrid architecture for Granite 4.0.
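As a rough sanity check (this is back-of-envelope arithmetic, not IBM's methodology), applying the reported 72% reduction to Granite 3.3's 28 GB to 84 GB memory range yields a range that comfortably contains the roughly 12 GB figure cited above:

```python
# Back-of-envelope check using the figures cited in this article.
# The calculation is illustrative only; it is not how IBM measures memory use.
GRANITE_33_MIN_GB = 28   # low end of Granite 3.3's reported GPU memory range
GRANITE_33_MAX_GB = 84   # high end (heavier concurrent-request loads)
REDUCTION = 0.72         # IBM's reported 72% memory reduction for 4.0 Tiny

def after_reduction(memory_gb: float, cut: float = REDUCTION) -> float:
    """Memory remaining after applying the reported percentage reduction."""
    return memory_gb * (1 - cut)

low = after_reduction(GRANITE_33_MIN_GB)    # 28 GB -> ~7.8 GB
high = after_reduction(GRANITE_33_MAX_GB)   # 84 GB -> ~23.5 GB
print(f"Implied Granite 4.0 Tiny range: {low:.1f} GB to {high:.1f} GB")
```

A card with 12 GB of memory sits inside that implied 7.8 GB to 23.5 GB band, which is why a few-hundred-dollar consumer GPU is plausible hardware for light inference workloads.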
Cheap AI is a winner
What kind of tech projects will enterprises focus on when economic conditions worsen and belt-tightening becomes the order of the day? Projects that save money or boost productivity. IBM's Granite 4.0 family of AI models, the smallest of which can be run on commodity hardware, can form the foundation of AI projects aimed at reducing costs.
What the U.S. and global economies look like six months from now is anyone's guess, given the unpredictability of the Trump administration's tariff policies. While IBM would suffer in a recession as clients pull back on discretionary projects, demand for projects with clear returns on investment should remain strong.
When the focus is on efficiency, AI can be a major part of those projects. IBM's push to make its Granite AI models even more efficient and capable of running on cheap hardware should pay off if the economy takes a turn for the worse.
Timothy Green has positions in International Business Machines. The Motley Fool has positions in and recommends International Business Machines. The Motley Fool has a disclosure policy.