OpenAI’s chairman says training your own AI could burn your capital


Training AI from scratch is a money pit

OpenAI chairman Bret Taylor didn’t mince words: training your own large language model (LLM) is a great way to “burn through millions of dollars.” In a podcast appearance, he likened the effort to setting piles of money on fire.

His advice to startups? Don’t do it. Unless you’re a tech giant with unlimited capital, using existing tools is smarter than building foundational AI from the ground up.


Only a few companies can afford the cost

Taylor said only OpenAI, Google, Meta, and Anthropic are realistically able to train frontier models due to the astronomical costs involved.

Between GPUs, infrastructure, R&D, safety validation, and elite talent, training a competitive frontier LLM can cost tens to hundreds of millions of dollars today, and some next-generation projects already exceed $1 billion per training run.

For most companies, even trying is financial suicide. The message is clear: unless you’re one of the elite, don’t try to compete at the base model level.


You can lease powerful models instead

Rather than training an LLM, companies can license access to powerful models such as GPT-4 through APIs. OpenAI, for instance, sells token-based access that lets developers integrate AI into their products. Open-source models are another option.

Taylor emphasized that this model lowers entry barriers, speeds up development, and allows startups to focus on building differentiated products without spending a fortune on infrastructure.
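In practice, "leasing" a model is just a metered API call. Here is a minimal sketch using the `openai` Python SDK; the model name, system prompt, and `ask` helper are illustrative assumptions, not anything Taylor specified.

```python
# Sketch: using a leased model via the OpenAI Python SDK instead of
# training your own. Model name and prompts are illustrative assumptions.
import os


def build_messages(user_prompt: str) -> list[dict]:
    """Assemble the chat request body; the heavy lifting stays on the provider's servers."""
    return [
        {"role": "system", "content": "You are a concise product assistant."},
        {"role": "user", "content": user_prompt},
    ]


def ask(user_prompt: str) -> str:
    """Send one chat completion request and return the model's reply text."""
    from openai import OpenAI  # pip install openai

    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    resp = client.chat.completions.create(
        model="gpt-4o",  # a leased frontier model; swap per your plan
        messages=build_messages(user_prompt),
    )
    return resp.choices[0].message.content


if __name__ == "__main__" and os.environ.get("OPENAI_API_KEY"):
    print(ask("Summarize our release notes in one sentence."))
```

The startup pays per token rather than per GPU-hour, which is the economic trade Taylor is pointing at: variable API costs instead of fixed infrastructure costs.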


Indie AI is getting squeezed out

Because of the enormous costs and complex logistics, there’s no real “indie data center” movement for training foundational models. AI development is consolidating rapidly, with most new LLMs coming from a few major players.

This consolidation creates risk but also opportunity. It opens the door for others to build creative applications, rather than reinventing the AI wheel at a massive cost.


Open-source AI is closing the performance gap

Some global players are proving that leaner alternatives can still make waves. China’s DeepSeek used a less resource-intensive approach to build its R1 model and corresponding chatbot.

It topped the App Store charts, briefly outpacing ChatGPT. The model’s success sparked debate over whether tech giants are overengineering and overspending for marginal gains. So while Taylor’s warning holds weight, it’s not an absolute rule.


Custom models are fast-depreciating assets

Taylor warned that custom-built models depreciate quickly. The AI landscape moves so fast that what’s cutting-edge today may be obsolete next quarter.

If you sink millions into training your own LLM, you may find it overtaken by open-source or leased models before you even hit the market. That’s a significant risk for startups with finite capital and pressure to ship fast.


The real gold rush is in AI tools

Instead of building base models, Taylor recommended targeting the tools market: apps, agents, and services powered by AI. He likened these to “pickaxes in a gold rush.”

The idea is simple: sell the picks and shovels, not the mines. These tools address real user needs today and can scale rapidly by riding the infrastructure the AI giants have built.


Agent-based startups are the next SaaS

Taylor sees the future of software in “agent companies,” startups that use AI to create autonomous agents that perform tasks. He compared this to the SaaS boom of the early 2010s.

These companies won’t need to build their own models but can create highly valuable businesses by packaging AI into user-friendly solutions. The opportunity lies not in the core model, but in its application.


Leasing models reduce regulatory risk

Training your own LLM brings more than financial risk; it opens you to scrutiny around bias, misinformation, data handling, and safety. Major AI labs have the teams and experience to navigate these challenges.

For smaller companies, leasing a model means offloading those headaches to someone else, reducing the chance of regulatory blowback while shipping an AI-powered product.


AI giants benefit from this ecosystem

Let’s be honest: Taylor’s advice is also self-serving. OpenAI, where he serves as chairman, profits from developers using its API rather than competing directly. Encouraging startups to build on its tech also helps OpenAI dominate the application layer.

Still, the advice isn’t wrong; it’s just layered. Giants want to own the rails and build them faster than anyone else.


Infrastructure costs keep rising

The cost of training AI models continues to climb. State-of-the-art chips like Nvidia’s H100 are hard to procure and expensive to operate, and custom data centers cost hundreds of millions of dollars.

And once trained, models need to be continually fine-tuned and served at scale. Even mid-sized companies struggle to manage these costs, making DIY AI increasingly unrealistic.


Retaining AI talent is brutally competitive

It’s not just about machines; it’s about minds. Training top-tier AI models requires elite research and engineering talent. Companies like Meta have reportedly offered $100 million signing bonuses to poach OpenAI employees.

Competing for that kind of talent is impossible for startups without mega-capital. You can’t train the model if you can’t attract or keep the team.


Failure stories are stacking up

OpenAI has scrapped or delayed some of its most ambitious model rollouts, citing safety and performance concerns. A recently planned open-weight model was postponed indefinitely.

And its attempted acquisition of coding startup Windsurf collapsed amid strategic disagreements. Even for the best-funded teams in the world, launching next-gen models isn’t smooth sailing. That says a lot about the uphill battle others face.


Open-weight models offer a middle ground

OpenAI had publicly slated an open‑weight release for summer 2025, but announced in mid‑July that it had delayed the launch indefinitely for additional safety testing and internal review.

This hybrid approach balances transparency and control, and may offer a way for smaller players to stand out without reinventing the wheel.

Still, it requires careful management and usually, a partner with serious AI chops to pull off.
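Mechanically, that middle ground means loading weights someone else published rather than training anything yourself. A minimal sketch using Hugging Face’s `transformers` library; the model id below is an illustrative assumption, and each open-weight release carries its own license terms.

```python
# Sketch: local inference with an open-weight model via Hugging Face transformers.
# The model id is illustrative; substitute whatever open-weight release your
# license, budget, and hardware allow.
import os

DEFAULT_MODEL = "Qwen/Qwen2-0.5B-Instruct"  # small enough to run on CPU


def load_generator(model_id: str = DEFAULT_MODEL):
    """Download published weights and return a text-generation pipeline."""
    from transformers import pipeline  # pip install transformers torch

    return pipeline("text-generation", model=model_id)


def generate(prompt: str, model_id: str = DEFAULT_MODEL) -> str:
    """Run one local generation; no per-token API bill, but you pay in compute."""
    gen = load_generator(model_id)
    out = gen(prompt, max_new_tokens=64)
    return out[0]["generated_text"]


if __name__ == "__main__" and os.environ.get("RUN_LOCAL_LLM"):
    print(generate("Explain open-weight models in one sentence."))
```

The trade-off mirrors the article’s point: you skip training costs entirely, but inference, fine-tuning, and licensing compliance are now your problem rather than a provider’s.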


AI isn’t just code, it’s contracts and IP

Launching your model also means dealing with thorny issues around intellectual property, licensing data, and cross-border compliance. A lawsuit or IP complaint (like OpenAI faced with its “io” branding) can derail even well-funded efforts.

By contrast, licensing a model off the shelf avoids these minefields and allows startups to focus on product-market fit.



Focus on value, not vanity, in AI

The AI hype train is real and seductive. Everyone wants to claim they’re training the next GPT-level model. But Bret Taylor’s advice rings true: the winners will be those who build durable, helpful tools, not just powerful engines.

In AI’s next chapter, delivering real utility will beat bragging rights. So if you’re building, make it count.


What do you think about Bret Taylor’s advice not to pour your capital into training your own AI model? Please share your thoughts and drop a comment.


This slideshow was made with AI assistance and human editing.
