At the start of this month, both Google and Microsoft introduced their AI chatbots to the world.
Microsoft's new AI-powered Bing search is already out in the market, with some users getting their hands on it now while others wait on a waitlist. Google, on the other hand, is taking a more cautious approach, as it feels its chatbot may not be ready just yet. So the Android maker is busy running internal tests to make sure everything works well before it rolls the product out to the masses.
Each company has spoken at length about how much it loves its chatbot and the great advantages on offer. What they fail to mention, however, is the cost of running these systems. And let us tell you, it does not come cheap.
Think along the lines of expenses roughly ten times what it takes to serve a regular Google search. Reuters noted in its report that this kind of technology, built on such huge language models, is likely to rack up massive bills. That may be one reason the companies are doing everything they can to popularize their chatbots, so that users eventually take up paid subscriptions, while the addition of ads could make a big difference too.
Back in December 2022, the CEO of OpenAI said on Twitter that the compute costs were eye-watering. That is one reason a paid tier launched in February, offering faster responses and priority access to new features for $20 per month.
A report from Reuters further elaborated that Google paid just a fifth of a cent for every search a user made in 2022. But if chatbots like Bard come into play, its annual search expenses could rise by nearly $6 billion in the next year. This is assuming the chatbot takes on 50% of Google's searches.
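To make the arithmetic behind that kind of estimate easier to follow, here is a minimal back-of-envelope sketch in Python. The fifth-of-a-cent baseline and the 50% share come from the figures above, and the tenfold multiplier echoes the earlier comparison, but the annual query volume is purely an illustrative placeholder, and the analysts behind the $6 billion figure rely on their own per-answer cost assumptions, so the number printed here only demonstrates how the variables interact rather than reproducing that estimate.

```python
# Back-of-envelope sketch of the cost math discussed above.
# BASE_COST_PER_QUERY and CHATBOT_SHARE reflect figures cited in the article;
# the multiplier echoes the "ten times" comparison; ANNUAL_QUERIES is an
# illustrative placeholder, not a number from the report.

BASE_COST_PER_QUERY = 0.002       # dollars, i.e. a fifth of a cent per search
CHATBOT_COST_MULTIPLIER = 10      # a chatbot answer assumed ~10x a normal query
CHATBOT_SHARE = 0.5               # chatbot assumed to handle half of searches
ANNUAL_QUERIES = 3.3e12           # assumed yearly search volume (placeholder)

def extra_annual_cost(queries, base_cost, multiplier, share):
    """Extra yearly spend if `share` of queries cost `multiplier` times more."""
    extra_per_query = base_cost * (multiplier - 1)
    return queries * share * extra_per_query

added = extra_annual_cost(ANNUAL_QUERIES, BASE_COST_PER_QUERY,
                          CHATBOT_COST_MULTIPLIER, CHATBOT_SHARE)
print(f"Illustrative added cost: ${added / 1e9:.1f} billion per year")
```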
Now, those costs come down to the heavy computing power an AI chatbot needs to generate natural-sounding answers. That in turn means more spending on electricity to keep the servers running properly.
In case you're wondering how the costs might come down, that can happen as chatbots become more efficient to run, shrinking the bill over time. Another idea is to display advertising links alongside chatbot answers, producing extra revenue that offsets the cost of running such servers.
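As a rough illustration of that offset idea, the short sketch below compares a hypothetical extra serving cost per chatbot answer with a hypothetical ad revenue per answer; both figures are made-up placeholders, not numbers from the article.

```python
# Rough break-even sketch for the "ads offset the cost" idea above.
# Both inputs are hypothetical placeholders, not figures from the article.

extra_cost_per_answer = 0.018    # assumed extra serving cost per chatbot answer ($)
ad_revenue_per_answer = 0.010    # assumed ad revenue earned per chatbot answer ($)

net = ad_revenue_per_answer - extra_cost_per_answer
if net >= 0:
    print(f"Ads cover the extra cost with ${net:.3f} to spare per answer")
else:
    print(f"Ads fall short by ${-net:.3f} per answer; efficiency gains would need to close the gap")
```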
Read next: What Impact Will Generative AI Have? This Survey Reveals the Answers
by Dr. Hura Anwar via Digital Information World