Key Takeaways:
- The white paper proposes extending India’s Digital Public Infrastructure (DPI) model to enable AI compute democratisation.
- India plans to scale data centre capacity from about 960 MW to an estimated 9.2 GW by 2030 to meet rising compute demand.
- Platforms such as IndiaAI Compute and AIKosha will offer subsidised access to GPUs and TPUs, lowering barriers for researchers and startups.
- Treating AI infrastructure as shared digital public goods aims to broaden access beyond physical proximity and private ownership.
India has published a government white paper that sets out a policy framework to democratise access to artificial intelligence (AI) infrastructure by extending the country’s Digital Public Infrastructure (DPI) model to compute and data resources. Issued under the Office of the Principal Scientific Adviser after consultations with domain experts and stakeholders including NITI Aayog, the report charts steps to platformise AI access rather than concentrate ownership of models and hardware.
AI compute democratisation drives access across sectors
The paper argues that opening shared layers of datasets, compute and model ecosystems will allow researchers, startups and public institutions to train and deploy models without owning expensive infrastructure. India already generates nearly 20% of the world’s data but accounts for only about 3% of global data centre capacity. The report estimates that scaling AI data centres will require an additional 45-50 million square feet of real estate by 2030 and projects installed capacity to rise from roughly 960 MW to about 9.2 GW.
To bridge that gap, the document recommends treating foundational compute and data services as Digital Public Goods. Under this approach, users would access compute remotely via shared technical architectures, moving beyond a narrow focus on hardware to include registries, standards and federated consent systems that evolve in stages.
The proposal builds on existing platforms and programmes. The IndiaAI Mission and its IndiaAI Compute Portal have already provided access to more than 38,000 GPUs and 1,050 TPUs, offering compute at subsidised rates of under Rs 100 per hour. The National Supercomputing Mission has deployed more than 40 petaflops of capacity across research institutions, while a secure cluster of around 3,000 next-generation GPUs has been earmarked for sovereign and strategic uses.
Private-sector investment is scaling in parallel. Commercial cloud services such as AWS, Google Cloud and Microsoft Azure host open datasets used by Indian researchers. Domestic and international data centre operators, including Yotta Data Services, NTT, CtrlS and AdaniConneX, are expanding hyperscale and sovereign cloud facilities. Yotta operates Asia’s largest single-building data centre in Navi Mumbai with a 72 MW IT load, while CtrlS runs 19 facilities totalling about 250 MW.
Practical measures in the white paper seek to combine sustainable planning with compute expansion. The government notes the need to integrate energy-efficient design and land-use planning as capacity grows. The India Semiconductor Mission, backed by a Rs 76,000-crore allocation, has approved advanced chip fabrication and packaging projects to strengthen domestic hardware capabilities.
Officials describe the desired model as “compute-as-a-service”: a set of shared, governed platforms that lower the cost of experimentation and accelerate local innovation in regional languages and assistive technologies. By offering subsidised compute and standardised access layers, the policy aims to reduce entry barriers for smaller firms and public interest projects, while retaining safeguards to mitigate risks and improve accountability.
Analysts say the white paper signals a pragmatic shift in India’s AI policy, from siloed investments to shared infrastructure that leverages both public and private capacity. If adopted, the DPI-aligned approach could widen participation in AI development across the economy, support sovereign capabilities and help position India as a hub for responsible, inclusive AI innovation.