Deal comes a month after Intel demoed its Nervana AI silicon
Intel is buying Israeli company Habana Labs for $2 billion in a bid to bolster its efforts to develop processors for artificial intelligence (AI)-based data center workloads — chips that can compete with Nvidia's accelerators and with the growing number of vendors building silicon designed for AI and machine learning.
The Lowdown: Intel is seeking better traction in an AI silicon market that the giant chipmaker expects to grow past $25 billion by 2024. Apparently, Intel was unsure whether the Nervana program it had developed in the three years since buying that start-up would get the job done.
The Details: Habana will continue operating as an independent company based in Israel, led by its current management team, and will report to Intel's Data Center Group, where the bulk of Intel's AI efforts are housed, including AI-based software, algorithms, and research. The start-up's chairman, Avigdor Willenz, will be a senior advisor to both the Habana business unit and Intel, which had been an investor in the company.
A year ago, Habana launched its Goya processor for machine learning inference workloads in large-scale data center environments, and officials said the chip offers significant performance and latency improvements over competing Nvidia silicon. In June, the start-up rolled out Gaudi, a chip for machine learning training workloads. Goya is generally available now, and Gaudi is sampling with several hyperscale data center companies, with Intel promising four times the throughput of an equal number of GPUs.
The Impact: The acquisition of Habana comes less than four months after Intel introduced its two Nervana AI accelerators as part of its larger Nervana Neural Network Processor (NNP) portfolio – the NNP-T for machine learning training workloads and NNP-I for inference – and a month after demonstrating the accelerators during its AI Summit. It’s unclear what Habana’s arrival means for the Nervana chips, though what is clear is Intel’s drive to be the key silicon provider for AI workloads. The chipmaker expects to generate more than $3.5 billion in revenue from AI work this year, which will be more than a 20% increase over 2018.
Background: What's also clear is that Intel will do this in an increasingly crowded market. The company competes not only with Nvidia, which is continuing to grow its data center business thanks in part to its efforts in AI — including its Nvidia Deep Learning Accelerator (NVDLA) — but also with others like Google, with its Tensor Processing Units (TPUs); Amazon Web Services, with its Inferentia inference chip; and AMD. There are also numerous start-ups, such as Cerebras Systems, which has unveiled a monster AI chip with 1.2 trillion transistors, and Graphcore.
The Buzz: “This acquisition advances our AI strategy, which is to provide customers with solutions to fit every performance need – from the intelligent edge to the data center,” said Navin Shenoy, executive vice president and general manager of Intel’s Data Platforms Group. “More specifically, Habana turbo-charges our AI offerings for the data center with a high-performance training processor family and a standards-based programming environment to address evolving AI workloads. We know that customers are looking for ease of programmability with purpose-built AI solutions, as well as superior, scalable performance on a wide variety of workloads and neural network topologies. That’s why we’re thrilled to have an AI team of Habana’s caliber with a proven track record of execution joining Intel. Our combined IP and expertise will deliver unmatched computing performance and efficiency for AI workloads in the data center.”
“We have been fortunate to get to know and collaborate with Intel given its investment in Habana, and we’re thrilled to be officially joining the team,” Habana CEO David Dahan said. “Intel has created a world-class AI team and capability. We are excited to partner with Intel to accelerate and scale our business. Together, we will deliver our customers more AI innovation, faster.”