IBM focuses on creating low-cost AI chips, joining the race with Google and Amazon
IBM (International Business Machines), the well-known New York-based computer manufacturer, is focusing on creating its own AI chips at a lower cost than Nvidia's.

Highlights
- IBM has not disclosed a timeframe for the chips' availability
- IBM steps up its AI push amid the ongoing generative AI buzz
An IBM executive said on Tuesday that the company plans to use artificial intelligence chips it developed internally to cut the cost of running a cloud computing service made available this week.
The company is considering using a chip known as the Artificial Intelligence Unit as part of its new "watsonx" cloud service, according to Mukesh Khare, General Manager of IBM Semiconductors.
IBM learns from Watson's earlier failure
The company is said to be capitalising on the current buzz around generative AI, which can write human-like text, after the earlier failure of the tech giant's IBM Watson. Watson is a question-answering computer system capable of answering questions posed in natural language.
High costs were among the challenges faced by the earlier Watson system, a problem IBM hopes to resolve this time. Khare said that deploying IBM's own chips could reduce the cost of the cloud service because of their high power efficiency.
Samsung Electronics (005930.KS), which collaborates with IBM on semiconductor research, manufactures the chip, according to Khare, and IBM is considering using it in watsonx.
IBM joins tech giants in creating its own chips
IBM has not specified a timeframe for when the chip might become available to cloud customers, but according to Khare, the company already has thousands of working prototype chips.
In developing its own AI chips, IBM has joined other IT behemoths including Alphabet's (GOOGL.O) Google and Amazon.com (AMZN.O).
However, according to Khare, IBM is not attempting to create a direct competitor to Nvidia (NVDA.O), whose chips dominate the market for training AI systems on massive volumes of data.
Instead, IBM aims for its chip to be cost-effective at what AI industry insiders call inference: using an already trained AI system to make decisions in the real world.
"That's where the volume is right now," Khare said. "We don't want to go toward training right now. Training is a different beast in terms of compute. We want to go where we can have the most impact."