The largest semiconductor manufacturer in the world by revenue is best known for its dominance in the CPU market.
Now, however, it seems that Intel is committed to an artificial intelligence (AI) mission. With AI chips all the rage these days, the company has fallen behind its competitors.
These include Nvidia, which appears to have a huge head start when it comes to using its GPUs for AI processing.
The AI market
Catching up with its competitors in this area could bring Intel significant benefits. A recent report valued the AI chip market at around $8 billion in 2020.
But predictions show the market swelling to a whopping $200 billion over the next decade. In May, Intel held its Vision event, where Pat Gelsinger, the company's CEO, talked about AI.
He highlighted artificial intelligence (AI) as key to the company's future products and predicted that it would drive Intel's overall strategy because of its demand for higher-performance computing.
The chief executive said that there were four "superpowers" that would help Intel innovate: cloud-to-edge infrastructure, artificial intelligence (AI), ubiquitous compute and pervasive connectivity.
Implementing end-to-end AI requires high-performance software and hardware systems, including frameworks and tools.
Intel's strategy, therefore, is to develop a range of chips and open-source software capable of addressing a wide variety of computing needs as AI use becomes widespread.
Speaking at the Vision event, Gelsinger stated that each of these four superpowers was impressive in its own right, but combined, they can be magic.
He said that businesses not using artificial intelligence (AI) in every process were simply falling behind, and that this applied in every industry.
Wei Li, Intel's vice president and general manager of AI, said that Intel was positioned to compete in the AI market because of the strong connection between its hardware and software.
He disclosed that the biggest problem they were trying to tackle was building a bridge between data and insights.
That bridge, he went on to say, has to be wide enough to handle a great deal of traffic, fast enough, and must not get stuck.
This means that artificial intelligence (AI) requires software that performs quickly and efficiently. There needs to be a full ecosystem that allows data scientists to work with large quantities of data and develop solutions.
They also need accelerated hardware that can process that data efficiently. Li said that the software and hardware teams work closely with each other.
He said that they function as one team because they have to understand the AI models, identify the performance bottlenecks, and then adjust hardware capacity accordingly.
He stated that the original GPU design was not meant for AI, but GPUs have since evolved.