The UAE’s Technology Innovation Institute (TII) yesterday launched Falcon 180B, a scaled-up successor to the Falcon 40B. According to the official blog post, it is the largest open-source language model to date, boasting a staggering 180 billion parameters.
According to TII, Falcon 180B was trained on 3.5 trillion tokens across 4,096 GPUs on Amazon SageMaker, for a total of roughly 7,000,000 GPU-hours. In simple terms, Falcon 180B is 2.5 times larger than Llama 2 and required four times more computing power for its training. It is certainly interesting that the UAE’s TII managed to acquire such significant computing power.
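The reported figures also imply a rough training duration. The sketch below uses only the numbers from the blog post (4,096 GPUs, ~7M GPU-hours); the derived wall-clock time is a back-of-the-envelope estimate, not an official figure:

```python
# Back-of-the-envelope check of Falcon 180B's reported training compute.
# Inputs are the figures TII reported; the duration is derived.
gpus = 4096
gpu_hours = 7_000_000  # ~7M GPU-hours on Amazon SageMaker

wall_clock_hours = gpu_hours / gpus
wall_clock_days = wall_clock_hours / 24
print(f"~{wall_clock_hours:.0f} hours, i.e. ~{wall_clock_days:.0f} days of continuous training")
```

In other words, the run amounts to roughly two and a half months of the whole cluster working around the clock, assuming no restarts or idle time.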
The UAE has money from oil
The UAE, as an oil-rich country, has abundant financial resources at its disposal. According to a report, hydrocarbons continue to play an important role in the UAE economy, with 30% of the UAE’s GDP directly relying on the oil and gas industry and 13% on exports.
The UAE is allocating oil earnings to fund AI projects. Six years ago, they launched the National AI Strategy 2031, which aimed to make AI contribute significantly to their economy, targeting up to 13.6% of GDP by 2030.
In 2020, the UAE government established the Advanced Technology Research Council (ATRC) to promote scientific research and innovation in AI. A few months later, ATRC founded TII, the institute behind today’s Falcon 180B. There is no doubt that the UAE is optimistic about investing in AI initiatives. In June, when OpenAI CEO Sam Altman visited Abu Dhabi, he praised the country’s foresight in recognizing the potential of AI, saying that the city has been talking about AI “since before it was cool.”
While the rest of the world is struggling to buy NVIDIA GPUs, the UAE secured access to thousands of NVIDIA chips, which it used to build the Falcon prototype in May. Furthermore, the report added that the UAE wants to control and possess its own computing power and talent without depending on the Chinese or Americans. There is no doubt that it has the capital, energy, and talent to do so.
Similarly, Saudi Arabia bought no fewer than 3,000 H100 chips, valued at $40,000 each, through its public research institution, King Abdullah University of Science and Technology (KAUST). That puts the Saudi investment at a staggering $120 million to secure this impressive array of GPUs.
This is why, when the US banned the export of AI chips to Middle Eastern countries, both AMD and NVIDIA raised eyebrows. All the major economies of the world are currently engaged in an LLM race that is shaping up as a cold war, with the US trying its best to bar domestic AI chip manufacturers from supporting its competitors.
Not only that, UAE’s G42 recently launched Jais, an Arabic-language AI model with 13 billion parameters. Jais was created with the help of supercomputers manufactured by Silicon Valley-based Cerebras Systems, which signed a $100 million contract with G42. With NVIDIA chips in short supply, the UAE is smart enough to look for alternative solutions.
Furthermore, in 2021, G42, which is backed by Mubadala, the UAE’s sovereign wealth fund, raised $800 million from the US technology investment firm Silver Lake.
What about OpenAI?
Coming to OpenAI, the company’s progress largely depends on the multibillion-dollar investment it received from Microsoft at the beginning of the year. With recent developments, however, that investment seems to be drying up. Recently, Altman posted on X that the company will not launch GPT-5 or GPT-4.5 in the near future and asked everyone to stay calm.
According to a report from The Information, OpenAI’s losses nearly doubled to about $540 million last year while it was developing ChatGPT and GPT-4. The report says training the 175-billion-parameter GPT-3 cost the company more than $4 million. With rumors of GPT-4 having about 1.76 trillion parameters, a naive extrapolation that assumes cost scales linearly with parameter count puts the cost of building the model at roughly $40 million. As a reminder, this is a simplified estimate; actual costs vary with many factors, including research and development, human resources, hardware improvements, and more.
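The extrapolation above can be written out explicitly. The only inputs are the two figures from the reports (GPT-3’s cost and parameter count, and GPT-4’s rumored size); everything else is the stated linear-scaling assumption, not data:

```python
# Naive linear extrapolation of training cost, GPT-3 -> GPT-4.
# Reported: GPT-3 (175B params) cost > $4M to train.
# Rumored:  GPT-4 has ~1.76T params.
gpt3_params = 175e9
gpt3_cost = 4e6          # USD, reported lower bound
gpt4_params = 1.76e12    # rumored

cost_per_param = gpt3_cost / gpt3_params      # assume constant cost per parameter
gpt4_cost = cost_per_param * gpt4_params
print(f"Estimated GPT-4 training cost: ~${gpt4_cost / 1e6:.0f} million")
```

Since 1.76 trillion is about ten times 175 billion, the linear estimate is simply ten times GPT-3’s cost. Real training cost also scales with the amount of training data and hardware efficiency, so this should be read as a lower-bound thought experiment.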
This explains why OpenAI is reluctant to release GPT-4’s multimodal capabilities to the public or reveal its parameter count, which the team seems to be intentionally hiding to avoid unwanted attention. Who knows, maybe OpenAI fooled us all and we never actually got GPT-4.
Altman previously suggested OpenAI could try to raise up to $100 billion in the coming years to achieve its goal of developing AGI. Perhaps OpenAI will also attract some oil money, or perhaps expand into the Middle East. What’s interesting is that Microsoft already has plans to do that.
As of now, OpenAI is trying to attract enterprises to stay in business. It has announced its inaugural developer conference, scheduled for November 6, 2023 in San Francisco, where it hopes developers from around the world will pitch their ideas and see new tools for ChatGPT and the API.
#UAE #dethrone #OpenAI