Intel and Baidu jointly develop the Nervana neural network training processor
Published: 2019-07-04
At the Baidu AI Developer Conference, Naveen Rao, Intel vice president and general manager of the Artificial Intelligence Products Group, announced that Intel is working with Baidu to develop the Intel® Nervana™ Neural Network Processor for Training (NNP-T). The collaboration includes a new custom accelerator aimed at speeding up the training of deep learning models.
Naveen Rao said: "In the next few years, the complexity of AI models and the demand for large-scale deep learning compute will explode. Intel and Baidu are extending a collaboration of more than a decade, focusing on jointly designing and developing new hardware and supporting software to keep advancing toward the new frontier of 'AI 2.0'."
AI is not a single workload but a capability that enhances the performance of virtually every application, whether it runs on a mobile phone or in a large data center. However, phones, data centers, and everything in between have different performance and power requirements, so no single piece of AI hardware can meet them all. Intel offers a broad range of AI hardware and uses software to extract the most from that hardware, helping customers run AI applications wherever their data lives. The NNP-T is a new class of purpose-built deep learning system hardware that accelerates large-scale distributed training. Close cooperation with Baidu helps ensure that Intel's development teams stay aligned with the latest customer demands for training hardware.
Since 2016, Intel has been optimizing Baidu's PaddlePaddle* deep learning framework for Intel® Xeon® Scalable processors. Now, by also optimizing the NNP-T for PaddlePaddle, the two companies can give data scientists more hardware choices.
At the same time, Intel is further enhancing the performance of these AI solutions with additional technologies. For example, with the higher memory performance offered by Intel® Optane™ DC persistent memory, Baidu can deliver personalized mobile content to millions of users through its Feed Stream* service and, via Baidu's AI recommendation engine, provide a highly efficient customer experience.
In addition, given the importance of data security to users, Intel is working with Baidu on MesaTEE*, a memory-safe Function-as-a-Service (FaaS) computing framework based on Intel® Software Guard Extensions (SGX) technology.