Akamai Technologies and Neural Magic announced a strategic partnership intended to supercharge deep learning capabilities on Akamai's distributed computing infrastructure. The combined solution gives enterprises a high-performing platform to run deep learning AI software efficiently on CPU-based servers. As an Akamai Qualified Computing Partner, Neural Magic will make its software available alongside the products and services that power the world's most distributed platform for cloud computing, security, and content delivery.

Neural Magic's solution enables deep learning models to run on cost-efficient CPU-based servers rather than on expensive GPU resources. The software accelerates AI workloads through automated model sparsification and is delivered as a CPU inference engine, complementing Akamai's ability to scale, protect, and deliver applications at the edge. Together, the companies can deploy these capabilities across Akamai's globally distributed computing infrastructure, offering organizations lower latency and improved performance for data-intensive AI applications.
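As a rough illustration of what running a sparsified model on commodity CPUs can look like, the sketch below uses Neural Magic's open-source DeepSparse inference engine to compile and execute an ONNX model on CPU hardware. The model file path is a placeholder, not something referenced in the announcement, and the random sample inputs stand in for real application data.

```python
# Minimal sketch: CPU inference with the DeepSparse engine.
# Assumes `deepsparse` is installed (pip install deepsparse) and that
# "./sparse-model.onnx" is a placeholder path to a sparsified ONNX model.
from deepsparse import compile_model
from deepsparse.utils import generate_random_inputs

onnx_filepath = "./sparse-model.onnx"  # placeholder path
batch_size = 1

# Generate random sample inputs matching the model's expected input shapes.
inputs = generate_random_inputs(onnx_filepath, batch_size)

# Compile the model for the DeepSparse CPU engine and run inference -- no GPU required.
engine = compile_model(onnx_filepath, batch_size)
outputs = engine.run(inputs)
print(outputs)
```

In practice, the sparsified model would be deployed alongside the application on CPU-based edge servers, which is the scenario the partnership targets.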

Moreover, the partnership can help foster innovation around edge-AI inference across a host of industries. The combined capabilities of Akamai and Neural Magic are particularly well suited for applications that generate massive amounts of input data at the edge, because they place affordable processing power and security closer to those data sources. Akamai recently announced a new Generalized Edge Compute (Gecko) initiative to embed cloud computing capabilities into its massive edge network, which will ultimately help support these and many other applications and workloads.

Akamai and Neural Magic share a common origin, both having been born out of the Massachusetts Institute of Technology (MIT), and both continue to maintain their respective corporate headquarters nearby.