Live for about two months, ChatGPT has already won many hearts. While access to the chatbot has been free so far, OpenAI eventually intends to monetize the artificial intelligence (AI) system with a paid version.
The offering would be aimed at those seeking, among other things, faster responses.
It can already be considered one of the most popular AI models today, and OpenAI naturally wants to improve it even further. For now, anyone who wants to test it can do so simply by accessing and signing up for the platform. Eventually, however, that may change.
After all, OpenAI announced, through its official Discord channel, that a paid version of the chatbot will arrive in the future. The goal is clear: to monetize a chatbot that has captured users and is already generating a lot of buzz.
Our goal is to continue improving and maintaining the service, and monetization is one of the ways we are considering to ensure its long-term viability.
OpenAI's explanation, shared on Discord.
All good things come to an end quickly! Or do they, OpenAI?
The Professional plan will have advantages over the version we know, such as faster, unlimited responses and always-available service. Although the free version is open to any user, it does not always work due to high demand; sometimes you have to wait before you can ask a question.
The paid Pro version is still in an experimental phase, and although it is on the table, when it arrives (if it does) it should not replace the free version, but rather complement it as a more complete service.
Currently, OpenAI does not monetize ChatGPT: it offers the chatbot for free and does not display any ads. However, the service appears to be very expensive to develop and maintain. According to an estimate by Tom Goldstein, a researcher and professor in the Department of Computer Science at the University of Maryland, OpenAI spends around $100,000 a day, or $3 million a month, to keep ChatGPT up and running.
I estimate the cost of running ChatGPT is $100K per day, or $3M a month. This is a back-of-the-envelope calculation. I assume nodes are always in use with a batch size of 1. In reality they probably batch during high volume, but GPUs sit fairly idle during low volume.
- Tom Goldstein (@tomgoldsteincs) December 6, 2022
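To give an idea of how a back-of-the-envelope figure like this comes together, here is a minimal, purely illustrative sketch in Python. The query volume, per-word generation time, words per reply and GPU hourly rate below are hypothetical assumptions, not values taken from the article or from Goldstein's thread; only the rough order of magnitude of the result (about $100K per day, or roughly $3M per month) matches the estimate quoted above.

```python
# Illustrative back-of-the-envelope estimate of ChatGPT's running costs.
# All input figures are hypothetical placeholders; only the ~$100K/day
# and ~$3M/month order of magnitude comes from the estimate in the article.

QUERIES_PER_DAY = 10_000_000          # assumed daily query volume
SECONDS_PER_WORD = 0.35               # assumed GPU time to generate one word
WORDS_PER_REPLY = 30                  # assumed average reply length
GPU_HOURLY_RATE_USD = 3.00            # assumed cloud price per GPU-hour

# Total GPU time consumed per day, converted from seconds to hours.
gpu_hours_per_day = QUERIES_PER_DAY * SECONDS_PER_WORD * WORDS_PER_REPLY / 3600

daily_cost = gpu_hours_per_day * GPU_HOURLY_RATE_USD
monthly_cost = daily_cost * 30

print(f"GPU-hours per day: {gpu_hours_per_day:,.0f}")
print(f"Estimated daily cost:   ${daily_cost:,.0f}")
print(f"Estimated monthly cost: ${monthly_cost:,.0f}")
```

With these placeholder numbers the script lands at roughly $90K per day and about $2.6M per month, which is why small changes in the assumptions (batching during peak hours, idle GPUs overnight) can easily swing such an estimate up or down.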