What They Said: Sarah Friar of OpenAI on AI

How OpenAI’s CFO is financing the race to artificial general intelligence—from hyperscale compute and agentic AI to hardware, partnerships, and sustainable growth.

Most readers read for free. A small group from the TelecomTalk community keeps this going. Support only if our work adds value for you.

Highlights

  • AI agents—not chatbots—define the next phase of enterprise and consumer productivity.
  • Compute scarcity, not demand, is the biggest bottleneck for AI progress.
  • AI is evolving toward abundant intelligence, accessible like electricity.

Let’s take a look at what Sarah Friar, Chief Financial Officer at OpenAI, has said about artificial intelligence (AI), infrastructure, and the investments and returns on investment (ROIs) the technology creates within the internet ecosystem. OpenAI announced the appointment of Sarah Friar as Chief Financial Officer in June 2024, stating, “She will lead a finance team that supports our mission by providing continued investment in our core research capabilities, and ensuring that we can scale to meet the needs of our growing customer base and the complex and global environment in which we are operating.”

Also Read: What They Said: Pallavi Mahajan of Nokia on AI

Sarah most recently served as CEO of Nextdoor. She previously held the role of CFO at Square and has also worked at Goldman Sachs, McKinsey, and Salesforce. She is a board member of Walmart and Consensys, a fellow of the Aspen Institute, and co-chair of the Stanford Digital Economy Lab, an integral component of the Stanford Institute for Human-Centered AI (HAI).

Edited excerpts from various interviews and interactions

What They Said: Timeline

February 2025

February 20, 2025 — Revenue & Growth Potential

In an interview on CNBC’s Squawk on the Street, Friar said OpenAI could reach $11 billion in revenue and noted strong investor interest, while framing the discussion around growth and valuation rather than immediate public offering plans.

“We have managed to punch well above our weight to become effectively a hyperscaler, both in terms of the compute that we’re buying and the way we’re investing in it,” she reportedly said.

March 2025

Importance of Capital & AGI Race

At the Goldman Sachs Disruptive Tech Summit in London on March 5, 2025, Friar discussed the race toward artificial general intelligence (AGI) and the need for massive capital investment in data centers, power, and trained people to support AI’s next phases. She emphasized urgency in infrastructure and innovation to stay ahead in AI development.

Explaining Deep Research, Friar said:

"If you all haven’t used Deep Research, it is mind-blowing. Now, shame on us because it was really only available in our Pro SKU until about a week ago, and we just rolled it out into the enterprise SKU. Deep Research is the ability to ask the model for something that you would probably go ask an analyst to do for you."

She continued with an anecdote:

"I’m just coming from a meeting with the CEO of another bank. I will not mention it. And we had been asking him about GPU financing. Very top of mind for me. And so, the team had effectively used Deep Research to go pull a report. His colleague sitting next to him who’s doing some of that work and looking into it said, “I read that report. It was much better than what we did as a team of two MDs, three VPs, six associates, ten analysts. I’m a little embarrassed that we didn’t do the Deep Research report first and then use that to help us ideate and iterate.”"

She added:

"But today, OpenAI is so much more. We’re going down into data center technology because we do think that we’re now in the AI infrastructure V2. Or what Jensen calls AI factories. And we feel like we’re creating a lot of IP there. And it’s really important for us to own that. Think about Amazon at that moment where they’re rocking it on e-commerce."

On ChatGPT’s expanding functionality, Friar said:

"...behind that front door of ChatGPT you now can do video generation with Sora. You can do a Deep Research report. You can do search. You can create projects. You can code. You can create a canvas for writing. And our goal is just to keep loading that up because it does multiple things. First of all, it keeps us as the dominant player, 400 million weekly active users. But it also drives personalization. The model starts to know more about you," she explained about driving functionality in ChatGPT that makes it easy for a consumer in both personal and work life.

Discussing the roadmap toward AGI, she said:

"Let’s talk about just broadly how we talk about the five steps towards AGI, artificial general intelligence. So, I already talked about the world of chatbots. So, kind of year 2023 real time predictive response. Last year, we bring reasoning to the table. So, now a model that thinks for longer and can do long horizon tasks that you would send an analyst out to do. 2025 is the year of agents. We started talking about it probably Q3 or Q4 last year. It’s now become the term the industry is using. But this is AI that can go out and do work independently for you."

She emphasized this is not vaporware:

"And this is not vaporware. We’re not selling ahead. We actually have three things working today. Deep Research, which is the agentic tool to go do a real deep research report for you. Operator, which is what we have launched to allow a task worker to go out on the web and do something for you that might take time in the background. Book a flight. Book a holiday. Book dinner tonight, whatever you want to do. And then the third that is coming is what we call A-SWE. We’re not the best marketers, by the way. You might have noticed. But agentic software engineer."

She explained:

"And this is not just augmenting the current software engineers in your workforce, which is kind of what we can do today through Copilot. But instead, it’s literally an agentic software engineer that can build an app for you. It can take a PR that you would give to any other engineer and go build it. But not only does it build it, it does all the things that software engineers hate to do. It does its own QA, its own quality assurance, its own bug testing and bug bashing, and it does documentation, things you can never get software engineers to do. So, suddenly you can force multiply your software engineering workforce."

On model performance, Friar said:

"We are still by far the state-of-the-art model. O3, I just want to put it into perspective, these are the benchmarks that are just widely agreed upon, the benchmarks that we’re using to say is AI becoming AGI, right? Is it truly getting to that level of human intelligence and beyond?"

"You can see from a software engineering perspective how it’s scoring. Competitive coder, it’s the 175th best competitive coder in the world. On competitive math, it got one question wrong. And on PhD-level science, it is PhD level across physics, chemistry, biology, and so on. That’s O3. What my product team assures me is O3 mini is already the number one competitive coder in the world. It’s literally the best coder in the world already."

Defining AGI, she said:

"This is the question that everyone’s asking. I mean, AGI in definition is that point where we believe AI systems can take on, you know, a majority of the real kind of value-added human work in the world and do it. And we’re getting pretty close to that being the case. If you ask Sam, he would kind of say, you know, it’s imminent. We may be there."

She added:

"There is a whole next world of AI where it becomes much more 3D, truly robotics in action where you think about tasks like factory worker tasks, farming, right, areas that today we’ve seen technology begin to move into but hasn’t just fully moved into."

Replying to a question about the idea that AI-designed vaccines could cure cancer, Friar said, "I think it’s very real. That’s why I made the point about what we’re hearing from academics in their field of expertise is that we might already be pushing that boundary. We might already be finding new discoveries, novel in the world."

On growth and compute:

"Literally in two years, we have grown to 400 million weekly active users. And our revenue has tripled every single year. This will now be the third year in a row that’s tripled. So, you can kind of imagine the sort of scale we might be at."

Also Read: What They Said: Bill Gates of Gates Foundation on AI

May 2025 — AI Hardware and New Computing Era

Friar commented on OpenAI’s acquisition of Jony Ive’s startup io (hardware bet), framing AI hardware as part of a new era of computing that could reshape how AI interacts with users.

OpenAI is betting a new “era” of computing will justify the company’s decision to spend billions of dollars on bespoke hardware to go with it, Chief Financial Officer Sarah Friar said, according to a CNBC report dated May 22, 2025.

“You’re really betting on great people and beyond,” Friar reportedly said. “It’s not just about imagining what a new platform could look like — you’ve got to be able to craft it. You’ve got to be able to build it. You’ve got to be able to understand supply chains.”

“When you start thinking about it beyond just a phone, it starts to grab the imagination,” she reportedly said. “If we can get people around the world excited to use AI, we have many ways to begin to think of a business model around that. So it could be an ongoing, bigger subscription for ChatGPT.”

Friar hinted at new devices without touchscreens. “As you birth this new era of AI, there’s going to be new platforms and new substrate,” she said. “We think of tech today as a little bit more around touch. We as humans, we see things, we hear things, we talk. And our models are great at that.”

June 2025 — Partnerships and Ecosystem Support

She reaffirmed OpenAI’s strategic partnerships and continued collaboration with external vendors (e.g., Scale AI) to support growth and diversify its technological inputs.

Despite Meta’s $14.3 billion investment in Scale AI, OpenAI plans to continue working with the company, Friar said, according to Fortune on June 16, 2025.

August 20, 2025 — Compute Demand & Industry Strategy

Friar told CNBC that the biggest challenge for AI right now is compute capacity, with demand for GPUs outpacing supply — highlighting why OpenAI expanded partnerships beyond Microsoft (e.g., Oracle, CoreWeave).

Sarah Friar said that even as OpenAI hits its revenue milestones, it faces ongoing pressures due to the demand for computing power required for artificial intelligence.

“It is voracious right now for GPUs and for compute,” she told CNBC’s Squawk Box on August 20, 2025, adding that insufficient compute, or computing power, to meet the demand of AI is the company’s biggest challenge. “That’s why we launched Stargate. That’s why we’re doing the bigger builds.”

“Microsoft will be an important partner for years to come, and I think we are very intertwined because of our IP,” Friar said. “Remember, Microsoft AI products are built on OpenAI technology.”

Friar said that OpenAI hit its first $1 billion revenue month in July. “When you have 700 million weekly active users, you start to find people are very opinionated,” Friar reportedly said. “As we’ve come out of the gate, we’re actually seeing acceleration in Plus and Pro subscriptions.”

September 2025 — Embrace AI or Get Left Behind

At a Goldman Sachs Communacopia + Technology Conference, she warned that companies not adopting AI quickly enough risk being left behind and noted ongoing compute constraints paired with exponential AI demand.

“The people who will get left behind are not embracing AI fast enough,” she said during an appearance at the conference, according to the CNBC report dated September 9, 2025.

Individuals harnessing AI to its full potential represent one of the biggest threats to businesses, she said, adding that it is “someone who’s using AI deeply that’s going to disrupt you.”

Also around this time, she stated that OpenAI’s revenue was set to more than triple in 2025, emphasizing rapid growth in a “new era of AI.”

"Revenue this year will grow over 3X. So about $13 billion in revenue from about $4 billion last year. So it's tripling on a very big base as well," Sarah Friar told Yahoo Finance in an interview at the Goldman Sachs Communacopia Conference.

Friar told CNBC that OpenAI expected to generate roughly $13 billion in revenue this year. “No one in the history of man built data centers this fast,” Friar reportedly said, adding that the entire ecosystem has to work together to meet demand.

“What we see today is a massive compute crunch,” she reportedly said. “There’s not enough compute to do all the things that AI can do.”

November 2025 — IPO Plans Clarified

At the Wall Street Journal’s Tech Live Conference, Friar said OpenAI is not preparing an IPO right now, stressing the focus should remain on scaling operations rather than going public in the immediate future.

More “AI Exuberance” Needed

Also at the Tech Live event, she argued that the market is too focused on bubble fears and needs more enthusiasm about practical AI benefits, urging a positive view of AI potential.

“I don’t think there’s enough exuberance about AI, when I think about the actual practical implications and what it can do for individuals,” Friar said in an onstage interview at the Wall Street Journal’s Tech Live conference in California. “We should keep running at it.”

Government Support Comments & Clarification

Speaking at the Wall Street Journal’s Tech Live event, she said:

“We are now OpenAI PBC—that’s our for-profit entity. We have also created the OpenAI Foundation, the nonprofit, which is one of the largest, if not the largest, nonprofits ever birthed into the world. All of this was done last week, so this is really day one. We’re now ready to scale.

I think the second thing is that we’re growing an incredibly healthy business with consumers and enterprises. We’ll go deeper on this, but we now have 800 million weekly active users—the fastest-growing consumer app ever. On the enterprise side, we’re seeing companies now go from pilot deeply into production.

And then I think the third thing that really stands out for me is the ecosystem is rising to meet us. We cannot do this by ourselves. We are in the age of intelligence. As AI comes to the forefront, it's going to take the whole supply chain, from chips and data-center builders to all of the kit that goes into a data center. And then on the other side, it's going to take governments, it's going to take nonprofits, it's going to take educational establishments, all of it to come together to really show what this can do for humanity," she continued.

"We've raised equity as a private company. Very kind of typical path, but we've raised a lot. We're building a really healthy business. So, free cash flow, CFO's favorite way to fund anything. That is absolutely climbing quickly. But I think the third area we've gotten into is really working with our ecosystem to do some really interesting financing deals. I'm particularly proud of the AMD warrant structure that we put in place just a few weeks back because it's very strong alignment of incentives."

“Let’s just talk about a data center build. So, a one gigawatt data center build today is about a $50 billion investment. That’s for one gig. How that really breaks down is about $15 billion is for the land, power, and shell, and about $35 billion is for the chips. That’s real frontier Nvidia chips that we’re talking about.

The former is well known from a financing perspective. People know how to finance data centers. They typically have 20-, 25-, even 30-year lives. Those are easy things, I would say, today to finance.

Chips have not been as easy to finance because, number one, I think we’re all still getting our arms around what is the life of a frontier chip, right? We, OpenAI, at our core, are the model company that needs always to be the state-of-the-art. That’s what we’ve done time and time again. GPT-5 is no exception.

But even in areas like open source, we’re attempting to put the state-of-the-art model always out into the world. And in order to do that, we always want to be on the frontier chip. So the question is, how long does a chip remain on the frontier? Is it three years, four years, five years, or even longer?”

Explaining how the AI agent works for users overnight and why it is priced at a premium, she said:

“The other thing we’re doing with that product is moving from a reactive—right? You go to ChatGPT to get something done and it starts responding—to proactive. So now the agent is working for you overnight in order to create the next kind of level of intelligence for you. Why is it only in our $200-per-month SKU? That’s a lot of searches. Not because the CFO is like, ‘Let’s figure out how to monetize it to its max.’ It’s because we don’t have enough compute to do the searches down into the Plus SKU and then ultimately into the free SKU.”

Speaking on sponsored results and monetization, she said:

“I think that is the north star that we have to keep internally, because it’s very important to our researchers and to our company that, first of all, it’s AGI for the benefit of humanity. It’s not AGI for the benefit of humanity that are sponsored. And I think this is where other platforms have maybe lost their way somewhat.

We want to keep that pristineness, and we can, because it goes back to how we started—with a subscription model. If you were starting an internet company today and you said, ‘I’m going to charge a subscription,’ people would say, ‘You’re crazy. That is just not how the internet works, Sarah. Do not even try that.’

Instead, what we’ve proven is that we have a subscription model that will get us—you know—we’ve gone from a billion in revenue two years ago, to four billion last year, to 13 billion this year, and we’re not quite done. But with that sort of growth, just running subscription allows you to be a lot more pointed in where you inject the rest of your business model, without compromising that true north, which is: we must always make sure that the intelligence gets you to the best answer, not the paid-for answer.”

Q. So on revenue, you mentioned about $13–14 billion this year. What do you envision the mix of consumer versus enterprise being in the next year and sort of longer term?

“So on the enterprise side—I mean, our consumer business has been incredible—but our enterprise business is now really starting to take off.”

“And so this year—we entered this year—consumer business was about 70% of revenue, enterprise about 30%. We’re actually starting to tip right now to 60–40, and that will give you a sense of how fast the enterprise business has grown. So just year over year, we’ve seen enterprise business grow about 9x year over year.

We just made an announcement this morning that we just hit one million enterprise customers on the platform, and so there has been incredible growth. Where is that happening? What is happening?

I think, number one, we’re seeing enterprises move very definitively now from pilots to full production. In terms of sectors, our focus has been financial services, health care, and consumer—so kind of CPG, consumer packaged goods. And then I would say, a tier down, folks like life sciences, professional services, and manufacturing. Probably that would give you a sense—but really it’s all sectors of the economy.

Enterprise Adoption

Originally, people were just doing wall-to-wall ChatGPT. Now they are taking the API and then starting to get more sophisticated on top of that, to do things that are truly transforming their business.

I’ll give you a couple of examples. So, Amgen is a customer. I was just with their CFO a few weeks ago. Amgen is using our product for faster FDA approvals. So, the FDA process is full of process, full of a lot of documents, full of a lot of data. You have to package it up in exactly the right way so the FDA can digest it and decide whether a drug can be put forth as safe for consumers to use.

If you think about that process, if you can just speed that up by weeks—and what Amgen is seeing is even months of speeding it up—not only is it good for Amgen’s business, but it can be the difference between a cancer patient living long enough to get access to the drug that could save their life and perhaps not being able to get access. So it has a beautiful kind of mission alignment.

If you look at another customer like Walmart, Walmart is a good example of a customer now working with us on the commerce side, but also using a lot of our technology internally around things like how to merchandise, how to handle risk, and so on, on their site—going very, very deep, very, very fast. So we are seeing very real enterprise adoption.”

Speaking on jobs and human workloads, she said: “So my experience so far is I would—I view it almost as more augmenting the—but there are a lot of other places where I actually see it more accelerating kind of human workloads.

If you look at—there was a Wharton study that just came out a few weeks back. What was interesting in that study is I think there’s been a lot of fear about particularly the on-ramp, like the junior people coming out of college and hitting the workforce for the first time. That’s actually the first survey I’ve seen that didn’t say those jobs are disappearing. It just said those jobs are fine.

If I look, like, inside of my—I'll talk my own shop—inside my own finance team, what I see is that we’re able to get rid of a lot of the more mundane, backward-looking analysis-type jobs, and we’re moving people much more quickly into an insight job.”

Clarification on Government Support

Friar initially suggested a U.S. government “backstop” or “guarantee” could help the company finance its massive investments in cutting-edge chips — but walked back the wording, clarifying she meant broader public-private support for capacity building rather than direct bailouts.

"I want to clarify my comments earlier today. OpenAI is not seeking a government backstop for our infrastructure commitments. I used the word “backstop” and it muddied the point. As the full clip of my answer shows, I was making the point that American strength in technology will come from building real industrial capacity which requires the private sector and government playing their part. As I said, the US government has been incredibly forward-leaning and has really understood that AI is a national strategic asset," Sarah Friar said in a LinkedIn post clarifying the comments she made on stage at the Wall Street Journal's Tech Live event in California on November 6, 2025.

Disclosure: News Corp, the parent company of the Wall Street Journal, has a content deal with OpenAI.

User Engagement Concern (closed-door investor call)

In a private earnings call reported by industry press, she mentioned that ChatGPT engagement had declined, attributing it in part to stricter content safety restrictions — highlighting tension between safety and user experience.

ChatGPT users are spending less time on the AI chatbot, parent OpenAI’s finance chief Sarah Friar told top investors in a closed earnings call, according to the tech publication Sources, The Economic Times reported on November 18, 2025.

The decline in user engagement stems from “content restrictions” imposed in August to avoid risks associated with mental health and emotional dependency, Friar said in the meeting last week. She was responding to queries on the growth momentum of the popular conversational model.

Friar told investors that easing those curbs is expected to restore ChatGPT’s growth momentum, per the report.

December 2025

CFO of OpenAI Address at the Oxford Union

Friar reflected on ChatGPT’s origins as a research preview in November 2022 and outlined OpenAI’s vision of “abundant intelligence”—AI as a universal utility, like electricity.

She shared stories of farmers, doctors, and individuals like Anna, a Ukrainian immigrant who used ChatGPT to navigate a custody case.

"So, back in November of 2022, ChatGPT was actually rolled out as a research preview. I don’t know if you guys knew that—it wasn’t like a product. It was a research preview, and it just hit the zeitgeist of the world, because I think for the first time people felt what was intelligence.

What had Sam Altman been talking about all those times you’d watch Sam present, and I would leave and I’d be a little bit like, “I think I got about 10% of that”? Suddenly, you could feel it and experience it.

So let’s talk a bit about where we’re at, given that. November of ’22, ChatGPT hits the scene. ’23 is all about chatbots, about LLMs, call and response. 2024, we have a second really major breakthrough at the company to bring on an age of reasoning.

So this wasn’t about really large-format models that were pre-trained using massive amounts of compute, massive amounts of data, the smartest researchers. This is something that happens in post-training.

From now, the metaphor starts again. You might want your agent to come back two years later and say, “Haha, you had an idea. I’ve done some more research. Here’s an answer.” So start thinking about us all being augmented by agents—agentic behavior."

Era of Abundant Intelligence

One of the things I talk about to my team is: imagine what it’s like as a leader when you don’t just manage, you know, seven people that are direct reports to you, but maybe it’s five people and ten agents. Like, how are you going to manage them and direct them? That is the world we’re entering. It’s an era of abundant intelligence.

Okay, so what is abundant intelligence? So we think of it at OpenAI as a universal utility. It’s reliable. It’s accessible. It’s everywhere. It’s like electricity.

What we see in real life is that 100 million weekly active users today are using that abundant intelligence—AI—to live, to work, to learn. There are farmers in Kenya who are using it to predict the weather—not just the weather tomorrow, but the weather for the growing season. What should they plant? How should they fertilize it? What are the prices going to be when they bring that to market? That is just real people doing real things.

Two weeks ago, I was in Austin, Texas, and a woman came up to me. She was an ophthalmologist, and she said, “People come to me with eye problems, but what I’ve found is often the eye problem is merely a symptom of a much bigger disease. They might have diabetes that’s starting to impact how they see, their vision. They might have a brain tumor.”

And in a lot of places where I’ve worked—she’d actually worked in Alaska and now works in West Texas—she said, “We just do not have the capacity to bring the specialists.” Think of a nurse in West Texas whose knowledge of a rare brain tumor might be 0.1%, and that’s no fault of hers. She just hasn’t had access to that particular knowledge.

Think of the best researcher in a place like Oxford or Stanford. She might know 30% of the knowledge of the world about that particular form of brain cancer. ChatGPT effectively has had access to all the knowledge. So it has maybe not 100%, because there’s still human innovation happening, but it knows a lot.

Think about Anna. So, a few weeks ago, I had a dinner for CFOs to talk finance—because we love to do that. It also usually becomes a little bit of a therapy session of what it’s like to work with our founders.

Anna came from Ukraine, and she came to the US, and she and her husband were getting divorced. He was able to afford some of the best lawyers. She could afford zero lawyers, so she didn’t know what to do. She turned to ChatGPT.

So as she was getting all of the stuff from his lawyer, she was putting it through ChatGPT and saying, “What should I do when I have to go to court? What do I need to bring?” She was afraid. She’s an immigrant. She was worried about losing her two children.

She went through that process with ChatGPT at her side—not a lawyer. I think the lawyers on the other side were a little bit freaked out, like, “Who the hell has she managed to get?” Because these answers are good.

When the final docs came across from those lawyers, she put it through ChatGPT and said, “Should I sign?” And ChatGPT said, ‘Definitely not. If you sign this document, you might lose custody of your children.’ And so she was able to redline that document and send it back. The reaction, in effect: “I can’t believe you did that.”

And in her words, “ChatGPT saved my life, because without my children, I could not imagine life.” Those are the moments that get me excited when I think about a world of abundant intelligence.

This is a developing story, and more quotes and insights from Sarah Friar will be added as they become available.

Reported By

Kirpa B is passionate about the latest advancements in Artificial Intelligence technologies and has a keen interest in telecom. In her free time, she enjoys gardening or diving into insightful articles on AI.
