OpenAI CEO Sam Altman has never been shy about expressing his views on the company’s GPT-4 model. He once indicated that it ‘kind of sucks’ and in another instance promised that GPT-5 would be smarter than the ‘mildly embarrassing’ GPT-4. According to Altman, “GPT-4 is the dumbest model any of you will ever have to use again, by a lot. It’s important to ship early and often, and we believe in iterative deployment.”
This model, which even the company’s CEO does not consider the best, apparently also consumes up to three bottles of water to generate 100 words. That’s right! New research conducted by the University of California and published by The Washington Post found exactly this. The study said that the precise water usage depends on the state and the proximity to the data center, with lower water use generally corresponding to cheaper electricity and higher electricity consumption.
As per the research, Washington demanded a whopping 1,408 milliliters per 100-word email — about three 16.9 oz water bottles. Meanwhile, Texas had the lowest water usage, at about 235 milliliters to generate the same 100-word email. Either way, the amount of resources required to run these AI models is becoming a major concern.
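As a quick sanity check on the bottle conversion, the figures above line up if we assume a standard 16.9 oz bottle holds roughly 500 mL (the bottle size and rounding are our assumptions, not stated in the study):

```python
# Back-of-the-envelope check of the per-email water figures cited above.
# Assumption: one 16.9 fl oz bottle ≈ 500 mL.
BOTTLE_ML = 500

def bottles(ml: float) -> float:
    """Convert milliliters to equivalent 16.9 oz bottles."""
    return ml / BOTTLE_ML

washington = bottles(1408)  # 2.816 — roughly the "three bottles" cited
texas = bottles(235)        # 0.47 — just under half a bottle

print(f"Washington: {washington:.2f} bottles, Texas: {texas:.2f} bottles")
```

So Washington’s 1,408 mL works out to just under three bottles, while Texas’s 235 mL is about half a bottle per email — a roughly sixfold spread between states.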
The data centers are also drawing large amounts of electricity to remain operational, which drives up the power and water bills of residents in the towns where they are built.
This isn’t the first time similar findings have been published. Last year, Windows Central published a report claiming that Microsoft Copilot and ChatGPT consume up to one bottle of water per query for cooling. In fact, Google and Microsoft — two of the biggest players in the AI space — reportedly each consume more power than 100 countries to sustain their AI advances.
Separately, OpenAI is spending millions of dollars to keep ChatGPT operational — to the point where the company may run out of cash unless it raises a fresh round of funding. Microsoft, Apple, and Nvidia are reported to be potential investors. But as AI models continue to advance, questions about their cost and environmental footprint continue to mount.