Publication Date: 3Q 2020
VMware’s Bitfusion acquisition is another example of how important high-end graphics processing units (GPUs) have become in artificial intelligence and machine learning (AI and ML). It also illustrates how GPU pricing makes it ever more necessary to ensure those processors are fully utilized. However, new computer science research reveals that the computational demands of deep learning (DL) are outpacing GPUs. Future AI and ML developments will increasingly turn to field-programmable gate array (FPGA) computing, with major consequences for the industry.
Key Questions Addressed:
- Why did VMware acquire Bitfusion?
- What does the increasing importance of GPU virtualization reveal about the AI sector?
- How demanding are current AI techniques in terms of computing power?
- What impact does this have on the scalability of AI?
- How can the semiconductor industry resolve this issue?
Who Needs This Report?
- Enterprises deploying AI and ML
- Semiconductor and hardware companies
- AI/ML developers
- Investor community
Table of Contents
Sharing virtualized GPUs and accelerators is emerging as a crucial field for AI hardware
The demands of deep learning workloads are escalating rapidly
The future server hardware and semiconductor industries may look very different