Why Malaysia’s AI infrastructure strategy must look beyond data centres

Artificial Intelligence (AI) has been reshaping how we live and work, from enhancing precision agriculture and improving healthcare diagnostics to expanding fintech services in underserved communities.

In Malaysia, the government has signalled a strong intent to accelerate AI adoption. The recently announced 13th Malaysia Plan (13MP) places AI at the heart of digital transformation strategies, including a forthcoming AI Nation Framework that leverages the country’s 5G infrastructure for smart-city solutions. Meanwhile, Budget 2025 earmarked RM 10 million to establish the National Artificial Intelligence Office (NAIO) under MyDIGITAL, serving as a central authority to coordinate the nation’s AI efforts. 

Region-wide, Southeast Asia’s AI-driven economy is expected to deliver a significant economic boost. Malaysia alone could capture USD 115 billion in AI-related GDP contribution by 2030.

Amidst all this momentum, much of the attention has been on the expansion of data centres. Malaysia, for instance, has attracted significant investment commitments in digital infrastructure, with new data centre projects announced every few months.

But while data centres are critical, they should only be one part of a broader AI infrastructure strategy. If the country wants AI to become a catalyst for inclusive and sustainable growth, we need to think holistically — ensuring that compute power is distributed not only through data centres, but also through AI-ready personal computers (PCs) and edge devices that bring intelligence closer to where data is generated.

The whole is greater than the sum of its parts

Consider a smallholder farmer in rural Malaysia using an AI-powered agriculture app that provides real-time insights into soil conditions and weather patterns. In theory, the app helps improve yields and reduce costs with precise, data-driven recommendations.

However, effectiveness can be limited in areas with unreliable internet connectivity, which remains a challenge in parts of rural Malaysia and across ASEAN. Data traveling long distances – from remote villages to centralized data centres often located in urban hubs – can result in slower response times, higher energy use, and greater cybersecurity risks.

This illustrates why a purely data centre–driven approach is insufficient. Supporting Malaysia’s AI ambitions requires a distributed infrastructure strategy that considers:

  • The connectivity gap in rural and underserved communities.
  • The latency and cost of relying solely on centralized processing.
  • The environmental impact of high energy consumption linked to running AI workloads predominantly through large-scale data centres.

To make AI truly inclusive and sustainable to build and run, Malaysia needs a distributed compute approach: a balance between data centres and devices that can run AI applications locally.

Why AI PCs and edge devices matter

Running AI applications locally – whether on AI PCs or edge devices such as Internet of Things (IoT) sensors in factories, farms, and vehicles – can be faster, more energy efficient, and more secure.

Imagine the same AI farming app running directly on a device with a built-in AI chip. It works offline, requires less energy, and keeps sensitive data on the device. For the farmer, this means greater reliability and lower cost. For the business providing the app, it means reduced dependence on cloud infrastructure and improved user experience.

While AI PCs and edge devices offer many of the same benefits, they are used differently. AI PCs are powerful, general-purpose devices best suited to local, high-performance AI tasks and development. Edge devices, such as those in smart factories or self-driving vehicles, are more specialized, distributed, and optimized for real-time, low-power AI inference at the data source, often operating in space-constrained and challenging environments (e.g., extreme temperatures or dusty conditions).

In a distributed compute approach for a nation, this means that data centres are used as centralized hubs for training large AI models, storing vast datasets, and running high-complexity analytics. They continue to serve as the backbone of national and enterprise-level AI workloads. 

At the same time, AI PCs equipped with Neural Processing Units (NPUs) are used by professionals, researchers, and advanced users for on-device inference, development, and some training. Edge devices at the periphery (e.g., smartphones and sensors in factories and farms) handle real-time inference and automation.

Becoming a leader in AI

Countries around the world are racing to lead in AI, and Malaysia has a real chance to be among them: it has the talent, strong government support, and a conducive business environment.

As we continue the critical work of expanding the nation’s AI infrastructure, it is vital that the foundation we build is accessible and sustainable for all. What the nation needs is a distributed compute approach to powering AI, not a one-sided reliance on data centres.

As more people come to understand AI, and how our lives will increasingly be powered by it, it is time to shift the conversation beyond data centres.

This article is contributed by Alexey Navolokin, General Manager for Asia Pacific at AMD
