OpenAI's urgent financing
With DeepSeek making its impact, Silicon Valley is just too exciting.
Yesterday, OpenAI and Anthropic were still leading the charge, doing everything they could to trip up the competition. Overnight, the infrastructure vendors have suddenly become "really interested".
Following Microsoft, NVIDIA and AWS have also expedited the launch of DeepSeek model hosting services.

As a user commented on the official Twitter account of NVIDIA, “If you can’t beat them, join them.”
Let’s take a look at each of these moves.
First, the freshest news: OpenAI's latest response, announced today, is fundraising, fundraising, fundraising.
The Wall Street Journal broke the news that OpenAI is seeking a new round of financing of US$40 billion (approximately 287.5 billion yuan) at a valuation of US$300 billion –
which would break the record for the highest single round of financing in Silicon Valley, held by OpenAI itself.
This round of financing is led by SoftBank. Previous news indicated that SoftBank plans to invest up to $25 billion (about 179.7 billion yuan) in OpenAI this time.
And this is just four months after OpenAI raised $6.6 billion at a valuation of $157 billion.
In just a few months, the valuation has doubled again, which also confirms the market rumors: OpenAI was not satisfied with the last round of financing.
And now, OpenAI may really be getting a bit anxious:
Another piece of news is that o3 will be released on Friday local time.

But the AI application side of the story is a bit different.
For example, Cursor, a favorite of programmers, has already openly adopted the new model, calling on everyone to test it together to see the real results.

Only the model vendors in the middle are left twisting in the wind. For cloud services and applications, the right move is simply to support everything first (doge).
For more details, let’s continue chatting.
See you on Friday with OpenAI o3
According to the leak, one of the purposes of OpenAI’s fundraising is to fulfill its promise to the Stargate project.
This project is led by OpenAI and SoftBank, with Arm, Microsoft, NVIDIA, and Oracle as key technology partners. The goal is to invest $500 billion (about 3.64 trillion yuan) over the next four years to build multiple AI data centers in the United States.
OpenAI’s commitment is: $100 billion.

The official announcement revealed some details:
The Stargate Project is a new company that plans to invest $500 billion over the next four years to build new AI infrastructure for OpenAI in the United States. We will immediately invest $100 billion.
The initial equity funders of the project include SoftBank, OpenAI, Oracle, and the Middle Eastern AI fund MGX. SoftBank and OpenAI are the lead partners of the project, with SoftBank responsible for finance and OpenAI responsible for operations. Masayoshi Son will serve as chairman.
Arm, Microsoft, NVIDIA, Oracle and OpenAI are key initial technology partners. Construction is currently underway, starting in Texas, and we are evaluating potential locations across the country to build additional campuses while finalizing agreements.
As part of the Stargate project, Oracle, NVIDIA and OpenAI will work closely together to build and operate the computing system. This builds on the deep collaboration between OpenAI and NVIDIA since 2016, as well as the new collaboration between OpenAI and Oracle.
It also builds on OpenAI’s existing relationship with Microsoft. OpenAI will continue to increase its use of Azure as it continues to work with Microsoft to leverage additional compute power to train leading models and deliver exceptional products and services.
In addition, the money will also be used to cover OpenAI's losses. Although OpenAI's monthly revenue reached US$300 million in August 2024, up 1,700% from early 2023, as of October the company was projected to lose US$5 billion for the full year.
Along with the news of the financing came the latest schedule for o3.
In an interview with NPR, OpenAI Chief Global Affairs Officer Chris Lehane revealed:
o3 will be released on Friday.
As soon as the news broke, netizens pulled up their stools, ready for the show.
However, some Internet users pointed out that what he meant by "o3" was actually o3-mini, whose launch Altman had already announced long ago.
After all, despite the controversy, DeepSeek is sparking more and more discussion and praise across the pond.
DeepSeek is really good
Microsoft had barely finished pointing fingers at DeepSeek before connecting the model to its own AI platform. Today, Cursor, a favorite of programmers, also announced its latest news:
the DeepSeek model is now available on Cursor.

It is worth noting that Cursor mentioned that in actual programming tasks, Claude 3.5 Sonnet is still better than DeepSeek's new models.
However, Cursor officials did not give a specific example, which piqued the curiosity of the onlookers:
It seems it's time to let real-world tests do the talking. If you've already tried it, feel free to share your experience in the comments section.
Similarly, AWS, the backer of the outspoken Anthropic, has also been quick to adopt the DeepSeek model, not hesitating at all despite the controversy (doge).
Another company drawing even more attention is NVIDIA, whose stock price has genuinely felt DeepSeek's impact—
DeepSeek-R1 is already available on the NVIDIA NIM platform, and NVIDIA has showered it with praise:
DeepSeek-R1 is an open-source model with state-of-the-art reasoning capabilities. Reasoning models like DeepSeek-R1 do not give direct answers; instead, they perform multiple inference passes over a query, using chain-of-thought, consensus, and search methods to generate the best answer.
DeepSeek-R1 is a perfect example of test-time scaling, demonstrating why accelerated computing is essential for the inference demands of agentic AI.
NVIDIA says that DeepSeek-R1 NIM microservices can deliver 3,872 tokens per second on a single NVIDIA HGX H200 system.
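For readers who want to try it, NIM microservices expose an OpenAI-compatible chat API, so calling the hosted model amounts to posting a standard JSON body. The sketch below only builds that request payload; the endpoint URL and model identifier are illustrative assumptions (not confirmed by this article), and an API key would be needed to actually send it.

```python
import json

# Minimal sketch of a chat-completion request to an OpenAI-compatible
# endpoint such as NVIDIA NIM's hosted DeepSeek-R1 service.
# The URL and model name below are assumptions for illustration.
NIM_ENDPOINT = "https://integrate.api.nvidia.com/v1/chat/completions"  # assumed

payload = {
    "model": "deepseek-ai/deepseek-r1",  # assumed model identifier
    "messages": [
        {"role": "user",
         "content": "Explain chain-of-thought reasoning in one sentence."}
    ],
    "temperature": 0.6,
    "max_tokens": 512,
}

# The body is plain JSON; any HTTP client can POST it with an
# "Authorization: Bearer <API key>" header.
body = json.dumps(payload)
```

Because the interface mirrors OpenAI's chat-completions format, existing client code can usually be pointed at the NIM endpoint by changing only the base URL and model name.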
Meanwhile, more and more dissenting voices are emerging against the dismissive way Anthropic and others have responded to DeepSeek's competition.
For example, HuggingFace co-founder Thomas Wolf directly criticized:
To be honest, Dario’s article was very painful to read.
In his short essay, he wrote:
Comparing open source research to vague closed research and unpublished evaluations has made me less confident in Anthropic’s leading position than before.

More importantly, with the development of Open-R1 and the DeepSeek paper, teams from all over the world will release open-source reasoning models in the coming months. For example, today the Allen Institute and Mistral released Tülu and Small 3, respectively, to catch up with DeepSeek-V3.
Open source will become increasingly important to our security.
What do you think?