GreenPT and Neuralwatt Are Making AI’s Hidden Energy Costs Visible
New partnership brings energy transparency to AI inference, helping organizations measure, compare, and optimize the real cost of AI usage
As AI rapidly becomes core infrastructure for modern businesses, its energy footprint is scaling just as quickly. Global data center electricity consumption is expected to more than double this decade, rising from roughly 415 TWh in 2024 to nearly 945 TWh by 2030[1]. At the same time, the majority of this demand is driven not by model training, but by inference — the everyday queries and workloads that power real-world applications.
Yet despite this rapid growth, most organizations still have little to no visibility into the resources behind their AI usage. Inference is typically priced per token, not per unit of compute or energy, leaving companies unable to measure, compare, or optimize the real cost of running AI at scale.
Neuralwatt and GreenPT today announced a strategic partnership to set a new global standard for sustainable AI inference. The collaboration is designed as a direct response to this gap, introducing infrastructure where energy usage is measurable, transparent, and tied to how AI is actually consumed.
Most AI inference today operates as a black box. Organizations pay per token with no visibility into the energy their workloads consume, no way to compare efficiency across models, and no connection between what they spend and the resources behind it. As inference workloads now account for an estimated 80–90% of AI compute usage, this lack of transparency is no longer just a technical limitation; it is becoming an operational and financial risk. Neuralwatt and GreenPT are building the modern alternative.
“For too long, the AI industry has treated energy as an afterthought, with little visibility into real costs,” said Chad Gibson, co-founder and CEO of Neuralwatt. “We built Neuralwatt Cloud to change that, and together with GreenPT, we’re showing that energy-efficient, transparent inference is where the industry is headed.”
The partnership is centered on complementary expertise and a joint commitment to bring more sustainable AI products to market. GreenPT’s deep experience in renewable infrastructure, carbon measurement, and privacy-first AI, combined with Neuralwatt’s GPU-level energy optimization and energy-based pricing model, creates a foundation for building AI inference that holds itself to a higher standard than what the industry offers today.
“We’ve built an infrastructure that proves sustainable AI isn’t a compromise, but a competitive advantage,” said Robert Keus, co-founder and CEO of GreenPT. “With Neuralwatt, we’re taking this a step further by gaining deeper insight into energy usage and how it can be optimized. Together, we’re moving toward AI systems where performance, cost, and energy are part of the same decision.”
Engineering AI for a sustainable future
Both companies were built on the same conviction: AI's energy challenge isn't solely a supply problem; it's an engineering one, and the solution starts with how inference is built and delivered, not just how much power is available. Too many AI companies start with great technology and then look for the market. Neuralwatt and GreenPT started from the people and places affected by it — customers who need to understand what their AI consumes, communities that are already feeling the strain of unchecked energy demand, and an environment that can't absorb the cost of scaling AI in its current state.
Building a measurable, energy‑transparent AI infrastructure
The pressure on AI infrastructure is growing worldwide. Grids are aging, electricity costs are rising, and data center demand is expected to more than double by 2030. New regulations like the EU AI Act are introducing energy disclosure requirements[1], while in the US, aging infrastructure and local resistance are slowing expansion. Yet many organizations still scale AI without clear insight into resource use.
At the same time, AI is shifting from experimental to always-on infrastructure, making inference efficiency a key constraint alongside performance. Neuralwatt and GreenPT are responding by deepening collaboration on energy transparency, model optimization, and tools to measure and reduce AI’s energy footprint. The partnership reflects a broader shift toward accountable AI systems, where each request has a measurable impact on energy, carbon, and infrastructure.
About GreenPT
GreenPT, based in Utrecht, The Netherlands, provides privacy-first AI chat and inference hosted entirely on renewable-energy-powered servers. The platform gives organizations access to a wide range of capabilities, including reasoning, document analysis, vision, code generation, speech-to-text, and multilingual processing, all through a single interface. What sets GreenPT apart is its ability to make AI measurable: the platform provides real-time insight into energy consumption and carbon impact at the prompt level, enabling organizations to monitor, compare, and actively optimize their AI usage.
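To illustrate what prompt-level measurement makes possible, here is a minimal sketch of converting a prompt's measured energy into a carbon figure. The function name, the energy value, and the grid-intensity figure are all illustrative assumptions for this example, not GreenPT's actual API or data.

```python
# Hypothetical sketch: deriving per-prompt carbon impact from measured energy.
# Field names and carbon-intensity figures below are illustrative assumptions.

RENEWABLE_G_PER_KWH = 0.0   # renewable-powered servers: ~0 gCO2e/kWh (assumed)
AVG_GRID_G_PER_KWH = 268.0  # illustrative grid-average intensity, gCO2e/kWh

def carbon_grams(energy_wh: float, intensity_g_per_kwh: float) -> float:
    """Convert measured prompt energy (Wh) to grams of CO2-equivalent."""
    return (energy_wh / 1000.0) * intensity_g_per_kwh

# A prompt measured at 0.5 Wh: compare renewable hosting to an average grid.
saved = carbon_grams(0.5, AVG_GRID_G_PER_KWH) - carbon_grams(0.5, RENEWABLE_G_PER_KWH)
print(f"Avoided emissions vs. average grid: {saved:.3f} gCO2e")
```

The point of per-prompt measurement is exactly this kind of comparison: once energy is reported per request, carbon impact follows directly from the grid intensity of wherever the workload runs.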
About Neuralwatt

Neuralwatt, headquartered in Seattle, builds software that optimizes how AI workloads use power at the GPU level. Neuralwatt’s technology has demonstrated 33 percent more compute from the same power footprint and reduced idle GPU power draw by more than 40 percent. The company recently launched Neuralwatt Cloud, the first inference service with energy-based pricing, which GreenPT customers will gain access to through the partnership. Neuralwatt Cloud charges a flat rate per kilowatt-hour across all models, with per-request energy reporting on every API call. The platform represents a fundamentally different approach to inference, where organizations can see exactly what their AI consumes and pay based on actual resources used, not opaque token multipliers.
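The billing model described above can be sketched in a few lines. The response shape, field names, and the $/kWh rate used here are hypothetical illustrations, not Neuralwatt Cloud's actual API or pricing.

```python
# Minimal sketch of energy-based billing, assuming each API response carries
# a per-request energy report. All names and rates below are hypothetical.

RATE_USD_PER_KWH = 0.40  # hypothetical flat rate, identical across all models

responses = [  # per-request energy as it might appear in response metadata
    {"model": "model-a", "energy_wh": 1.2},
    {"model": "model-b", "energy_wh": 0.3},
    {"model": "model-a", "energy_wh": 2.5},
]

# Billing reduces to summing reported energy and applying one flat rate,
# regardless of which model served each request.
total_kwh = sum(r["energy_wh"] for r in responses) / 1000.0
cost = total_kwh * RATE_USD_PER_KWH
print(f"Total energy: {total_kwh:.4f} kWh, billed: ${cost:.6f}")
```

Because every request reports its own energy, the same data that drives the invoice also lets an organization compare models by efficiency rather than by token price alone.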