NetApp and Intel launch joint AI inferencing solution

Bengaluru: NetApp and Intel have announced the release of a joint solution, NetApp AIPod Mini, designed to streamline enterprise adoption of AI inferencing. The collaboration addresses the challenges businesses face when deploying AI, such as cost and complexity, at the department and team level.

To thrive in the era of intelligence, enterprises have adopted AI to enhance efficiency and data-driven decision-making across their operations.

A study by Harvard Business School found that consultants given access to AI tools increased their productivity, completing 12.2% more tasks and finishing them 25.1% more quickly. However, individual business units may find that broadly available general-purpose AI applications do not meet their specific needs, yet they lack the technical expertise or budget to customise an AI application from scratch.

NetApp and Intel have partnered to provide businesses with an integrated AI inferencing solution built on an intelligent data infrastructure framework that allows specific business functions to leverage their distinct data to create outcomes that support their needs.

NetApp AIPod Mini streamlines the deployment and use of AI for specific applications such as automating aspects of document drafting and research for legal teams, implementing personalised shopping experiences and dynamic pricing for retail teams, and optimising predictive maintenance and supply chains for manufacturing units.

“Our mission is to unlock AI for every team at every level without the traditional barriers of complexity or cost,” said Dallas Olson, Chief Commercial Officer – NetApp.

“NetApp AIPod Mini with Intel gives our customers a solution that not only transforms how teams can use AI but also makes it easy to customise, deploy, and maintain. We are turning proprietary enterprise data into powerful business outcomes,” added Olson.

NetApp AIPod Mini enables businesses to interact directly with their business data through pre-packaged Retrieval-Augmented Generation (RAG) workflows, combining generative AI with proprietary information to deliver precise, context-aware insights that streamline operations and drive impactful outcomes.
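The AIPod Mini ships these RAG workflows pre-packaged, but the underlying pattern is straightforward: retrieve the most relevant proprietary documents for a query, then feed them to a generative model as grounding context. The sketch below illustrates that pattern only; it uses a toy bag-of-words similarity (real deployments use neural embedding models and a vector store), and the document snippets are invented examples.

```python
from collections import Counter
import math

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; production RAG uses neural embeddings.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    # Rank proprietary documents by similarity to the query; keep top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    # Ground the generative model with the retrieved context.
    context = "\n".join(retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

# Hypothetical departmental documents for illustration.
docs = [
    "Maintenance logs: pump P-7 vibration exceeded threshold twice in May.",
    "HR policy: annual leave requests require two weeks notice.",
]
print(build_prompt("Which pump showed abnormal vibration?", docs))
```

The key property of RAG is visible here: the model never needs to be retrained on the proprietary data; the data is injected at query time, which is why the approach suits business units without the budget to customise a model from scratch.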

By integrating Intel Xeon 6 processors and Intel Advanced Matrix Extensions (Intel AMX) with NetApp’s all-flash storage, advanced data management, and deep Kubernetes integration, NetApp AIPod Mini delivers high-performance, cost-efficient AI inferencing at scale.

Built on an open framework powered by Open Platform for Enterprise AI (OPEA), the solution ensures modular, flexible deployments tailored to business needs. Intel Xeon processors are designed to boost computing performance and efficiency, making AI tasks more attainable and cost-effective, empowering customers to achieve more.

“A good AI solution needs to be both powerful and efficient to ensure it delivers a strong return on investment,” said Greg Ernst, Americas Corporate VP and GM – Intel.

“By combining Intel Xeon processors with NetApp’s robust data management and storage capabilities, the NetApp AIPod Mini solution offers business units the chance to deploy AI in tackling their unique challenges. This solution empowers users to harness AI without the burden of oversized infrastructure or unnecessary technical complexity,” added Ernst.

NetApp AIPod Mini with Intel will be available in summer 2025 from strategic distributors and partners around the world.