IIJ × Kawamura Launch Modular Edge Data Center for AI-Era Distributed Computing
Japanese internet giant IIJ and Kawamura Electric co-developed 'DX edge Cool Cube'—a modular edge data center with liquid cooling for high-density GPU servers, enabling local AI inference at edge locations.
# IIJ × Kawamura Modular Edge Data Center: In-Depth Technical Analysis
### 1. Product Overview
On March 17, 2026, Japanese internet infrastructure pioneer IIJ (Internet Initiative Japan, founded in 1992 as Japan's first commercial internet service provider) and power distribution equipment manufacturer Kawamura Electric Industry officially launched their jointly developed modular edge data center product, "DX edge Cool Cube." This product represents the commercial release of a prototype first announced in March 2025, refined through a year of evaluation, testing, and improvement.
DX edge Cool Cube is positioned as "distributed digital infrastructure for the AI era" — a self-contained edge AI foundation that integrates power supply, cooling systems, and IT racks into a single deployable unit, specifically designed for high-heat, high-power GPU servers.
### 2. Technical Specifications Deep Dive
#### 2.1 Core Technical Parameters
| Parameter | Specification |
|-----------|-------------|
| Server Load Capacity (Air-Cooled) | Maximum 45kW per module |
| Server Load Capacity (Liquid-Cooled) | Maximum 60kW per module (dependent on CDU cooling capacity) |
| PDU Input Power | Single-phase 200V or three-phase 4-wire 400V |
| Module Dimensions | W1000 or W1200 × D2000 × H2500 mm |
| Rack Specification | EIA standard 19-inch rack (42RU per rack) |
| Cooling Methods | Air cooling (In-Row) and Direct Liquid Cooling (DLC) |
| Installation Environment | Indoor and outdoor |
| Operating Temperature | -20°C to 40°C (continuous) / -25°C to 45°C (maximum) |
| Dust/Water Protection | IP55 |
| Standard Delivery Time | 5 months |
#### 2.2 Modular Architecture Design
DX edge Cool Cube employs a three-module combination architecture:
- **Electrical Module:** Contains power distribution equipment, UPS, PDU, and other electrical infrastructure
- **IT Module:** Contains EIA-standard 19-inch racks and GPU servers
- **Cooling Module:** Contains air conditioning or liquid cooling systems (chillers)
These three module types can be freely combined according to requirements. The minimum configuration starts from a single rack, suitable for proof-of-concept (PoC) and small-scale validation. N+1 redundancy configurations are achieved through module interconnection, and GPU additions or site expansions can be flexibly accommodated.
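As a rough illustration of this sizing logic, the sketch below estimates how many modules a given IT load would require. The per-module capacities come from the published spec table (45kW air-cooled, 60kW liquid-cooled); the greedy sizing rule and function names are hypothetical, not IIJ's actual configurator.

```python
import math

# Per-module cooling capacity from the DX edge Cool Cube spec sheet (kW).
CAPACITY_KW = {"air": 45, "liquid": 60}

def modules_needed(load_kw: float, cooling: str, n_plus_1: bool = True) -> int:
    """Estimate the number of modules for a given IT load.

    Divides the load by per-module capacity, rounds up, and
    optionally adds one spare module for N+1 redundancy.
    """
    base = math.ceil(load_kw / CAPACITY_KW[cooling])
    return base + 1 if n_plus_1 else base

# e.g. a 150 kW GPU cluster on liquid cooling with N+1 redundancy:
print(modules_needed(150, "liquid"))  # 150/60 -> 3 modules, +1 spare = 4
```

A single rack without redundancy (`modules_needed(45, "air", n_plus_1=False)`) returns 1, matching the minimum PoC configuration described above.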
#### 2.3 Cooling Technology Innovation
As GPU server power consumption has escalated dramatically, cooling has become the central challenge in data center design. DX edge Cool Cube offers two cooling approaches:
Air Cooling (In-Row): Utilizes in-row cooling units positioned between racks within the row, supporting up to 45kW per module of thermal load. This approach is suitable for traditional servers and medium-density GPU configurations.
Direct Liquid Cooling (DLC): Supports in-rack CDU (Coolant Distribution Unit) deployment, where coolant flows directly through GPU chip heat sinks, achieving cooling capacity up to 60kW per module. This is essential for high-power GPU servers such as NVIDIA H100/H200 systems, which generate heat densities that air cooling alone cannot adequately manage.
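To give a sense of scale, the back-of-the-envelope calculation below estimates the coolant flow a 60kW module implies. The 60kW figure is from the spec table; the 10°C coolant temperature rise and water-like coolant properties are assumptions for illustration, not published CDU parameters.

```python
def coolant_flow_lpm(heat_kw: float, delta_t_c: float = 10.0,
                     cp_j_per_kg_k: float = 4186.0,
                     density_kg_per_l: float = 1.0) -> float:
    """Required coolant flow in litres per minute.

    From the heat balance P = m_dot * cp * dT, solve for mass flow
    and convert to volumetric flow (assuming water-like coolant).
    """
    kg_per_s = (heat_kw * 1000.0) / (cp_j_per_kg_k * delta_t_c)
    return kg_per_s / density_kg_per_l * 60.0

print(round(coolant_flow_lpm(60.0), 1))  # ~86.0 L/min at a 10 degC rise
```

Roughly 86 L/min for a single module makes clear why DLC loops need a dedicated CDU rather than ad-hoc plumbing.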
#### 2.4 Cubicle Housing Technology
One of the product's most distinctive design elements is its use of Kawamura Electric Industry's core technology — cubicle housings. Cubicles are standardized equipment enclosures used in Japan's power distribution system for transformer and distribution equipment. They feature mature industrial manufacturing processes and quality control systems, standardized dimensions and interface specifications, excellent dust and water protection performance (IP55), and environmental adaptability for both indoor and outdoor deployment.
Applying cubicle technology to data center modules represents a creative fusion of knowledge from the information communications and power equipment industries. This eliminates the need to construct new data center buildings — units can be deployed directly alongside factories, office buildings, or in outdoor spaces, dramatically shortening deployment timelines and reducing construction costs.
### 3. Market Positioning and Application Scenarios
#### 3.1 Sovereign AI Infrastructure
For manufacturing companies, research institutions, medical facilities, government agencies, and other organizations with strict data security and compliance requirements, DX edge Cool Cube serves as "Sovereign AI Infrastructure":
- Confidential data never needs to leave the enterprise premises; AI inference and training are completed internally
- Meets compliance requirements under Japan's Act on the Protection of Personal Information (APPI) and various industry-specific data protection regulations
- Suitable for defense, financial, medical, and other high-sensitivity industries where data sovereignty is non-negotiable
#### 3.2 Edge AI and MEC Deployment
For domains requiring low-latency real-time processing, such as autonomous driving, video analytics, and smart cities:
- Distributed deployment near communication base stations enables MEC (Multi-access Edge Computing) functionality
- Minimum configurations support validation deployments that can scale to commercial volumes
- Millisecond-level latency satisfies the real-time inference requirements of safety-critical applications
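The latency argument above can be made concrete with a simple propagation-delay estimate: light in optical fibre travels at roughly c/1.47, so distance alone sets a hard floor on round-trip time before any processing happens. The distances below are illustrative, not measurements.

```python
# Speed of light in fibre (refractive index ~1.47), in km per millisecond.
C_FIBRE_KM_PER_MS = 299_792.458 / 1.47 / 1000  # ~204 km/ms

def round_trip_ms(distance_km: float) -> float:
    """Minimum round-trip propagation delay over fibre, ignoring
    switching, queuing, and processing time."""
    return 2 * distance_km / C_FIBRE_KM_PER_MS

print(f"edge,  5 km:  {round_trip_ms(5):.2f} ms")    # ~0.05 ms
print(f"cloud, 500 km: {round_trip_ms(500):.2f} ms")  # ~4.90 ms
```

Even before queuing and processing overheads, a distant cloud region consumes a large share of a millisecond-scale latency budget, which is precisely the case for edge placement.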
#### 3.3 Watt-Bit Integration
"Watt-Bit Integration" is a core application concept for DX edge Cool Cube, referring to the integrated optimization of power (Watt) and information communications (Bit):
- **Surplus Power Utilization:** In regions or time periods with excess power, low-cost electricity can be leveraged to run AI computation
- **Co-location with Power Generation:** Deployment alongside solar or wind power generation facilities to directly consume green energy
- **Factory and Warehouse Conversion:** Leveraging existing buildings and electrical facilities to create distributed AI data centers
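The surplus-power idea above amounts to a scheduling problem: shift deferrable AI batch jobs into the hours when electricity is cheapest (for example, midday solar surplus). The greedy policy and tariff values in this sketch are hypothetical, not part of the product.

```python
def schedule_cheapest_hours(prices_yen_per_kwh: list[float],
                            hours_needed: int) -> list[int]:
    """Pick the cheapest hours for a deferrable batch workload.

    Ranks hours by price, takes the cheapest `hours_needed`,
    and returns their indices in chronological order.
    """
    ranked = sorted(range(len(prices_yen_per_kwh)),
                    key=lambda h: prices_yen_per_kwh[h])
    return sorted(ranked[:hours_needed])

# Hypothetical hourly tariff with a midday solar-surplus dip:
prices = [28, 26, 24, 18, 12, 10, 11, 15, 22, 27]
print(schedule_cheapest_hours(prices, 4))  # -> [4, 5, 6, 7]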
This concept aligns closely with China's "Computing-Power Coordination" strategy, first included in the 2026 Government Work Report, which similarly seeks to resolve the geographic and temporal mismatch between AI computing demand and power supply.
### 4. Competitive Analysis
#### 4.1 Global Modular/Edge Data Center Landscape
| Product | Vendor | GPU Support | Liquid Cooling | Max Power | Delivery Time |
|---------|--------|------------|---------------|-----------|--------------|
| DX edge Cool Cube | IIJ/Kawamura | Yes | Yes | 60kW/module | 5 months |
| NVIDIA DGX SuperPOD | NVIDIA | Native | Yes | Multi-MW | Custom |
| Dell APEX Modular DC | Dell | Yes | Yes | 100kW+ | 6-9 months |
| Schneider EcoStruxure | Schneider | Yes | Yes | Customizable | Custom |
| Vertiv SmartMod | Vertiv | Yes | Limited | 50kW/module | 4-6 months |
DX edge Cool Cube's differentiated advantages include:
1. **Japan Market Adaptation:** Power specifications, regulatory compliance, and seismic design optimized for the Japanese market, where earthquake resilience is a critical infrastructure requirement
2. **Small-Scale Starting Point:** Minimum deployment from a single rack dramatically lowers the investment threshold, enabling organizations to start with validation before committing to larger deployments
3. **IT + Power Fusion:** The unique combination of IIJ's IT operations expertise and Kawamura's power equipment manufacturing capabilities creates a product that neither company could have developed independently
4. **Indoor/Outdoor Versatility:** IP55 protection rating enables outdoor deployment, providing flexibility in site selection that container-based alternatives may not match
### 5. Industry Trend Analysis
#### 5.1 Edge AI Infrastructure Demand Explosion
In 2026, demand for edge AI infrastructure is experiencing explosive growth:
- Autonomous vehicles require real-time AI inference at the vehicle edge or roadside
- Manufacturing intelligence requires in-factory AI quality inspection and predictive maintenance systems
- Medical AI must operate within hospital environments to satisfy data privacy requirements
- Retail needs in-store video analytics and personalized recommendation systems
- Smart city applications require distributed processing for traffic management, public safety, and environmental monitoring
Centralized cloud data centers cannot meet these scenarios' requirements for low latency and data sovereignty. Distributed edge data centers have become essential infrastructure — not optional enhancements but fundamental requirements for deploying AI in physical-world applications.
#### 5.2 Japan's Unique Infrastructure Challenges
Japan faces distinctive challenges in AI computing infrastructure development:
- **Limited Land Availability:** Japan's constrained land area makes large-scale data center sites increasingly scarce, particularly near population centers where demand is greatest
- **Tight Power Supply:** Post-Fukushima energy policies have made power supply a bottleneck, with many regions facing capacity constraints
- **Seismic Risk:** Data centers must satisfy stringent seismic design requirements, adding complexity and cost to traditional construction approaches
- **Extended Construction Timelines:** Japan's building approval and construction processes are relatively complex, with traditional data center projects often requiring 2-3 years from planning to operation
Modular solutions address these challenges through a "factory-prefabricated + on-site assembly" model that reduces site preparation requirements, shortens deployment timelines, and provides flexibility in facility planning.
### 6. Business Model Analysis
DX edge Cool Cube's business model combines "product sales + design and construction support":
- **Product Sales:** Direct sales of modular hardware units, individually quoted based on configuration
- **Design Support:** IIJ provides end-to-end technical consulting from requirements analysis through system design to deployment
- **Operations Services:** Remote environmental monitoring and control (equipment status, environmental sensors, fire precursor detection)
- **Expansion Services:** GPU additions and module extensions as business needs grow
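The remote monitoring service described above could, in spirit, be a set of rules checked against sensor readings. The thresholds below borrow the module's published envelope (40°C continuous rating); the sensor names, smoke threshold, and flow threshold are illustrative assumptions, not IIJ's actual alert logic.

```python
def check_readings(readings: dict[str, float]) -> list[str]:
    """Flag readings outside an assumed operating envelope."""
    alerts = []
    # 40 degC is the module's continuous-operation rating.
    if readings.get("intake_temp_c", 0.0) > 40:
        alerts.append("intake temperature above 40 degC continuous rating")
    # Hypothetical fire-precursor threshold.
    if readings.get("smoke_ppm", 0.0) > 5:
        alerts.append("possible fire precursor detected")
    # Hypothetical minimum coolant flow for a liquid-cooled module.
    if readings.get("coolant_flow_lpm", 100.0) < 60:
        alerts.append("coolant flow below expected range")
    return alerts

print(check_readings({"intake_temp_c": 42,
                      "smoke_ppm": 1,
                      "coolant_flow_lpm": 80}))
```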
This bundled approach differentiates the offering from pure hardware vendors by providing the operational expertise that many organizations deploying AI infrastructure for the first time require.
### 7. Summary and Outlook
DX edge Cool Cube represents a concrete implementation of the transformation from "large-scale centralized" to "modular distributed" data center infrastructure for the AI era. By fusing information communications technology expertise with power equipment manufacturing knowledge, IIJ and Kawamura Electric Industry have created an edge AI infrastructure solution tailored to the Japanese market's unique characteristics.
The product's launch timing coincides with the global explosion in AI computing demand. At NVIDIA GTC, Jensen Huang projected $1 trillion in computing demand by 2027. China has elevated "Computing-Power Coordination" to government strategy. Distributed, modular, energy-efficient data center solutions are becoming a central trend in global AI infrastructure development.
The product will be showcased with a physical unit at Data Center Japan 2026 (March 24-25), providing potential customers with the opportunity to evaluate the hardware firsthand. Looking ahead, IIJ and Kawamura Electric Industry plan to promote this distributed infrastructure solution to additional regions within Japan and explore international market opportunities, particularly in Southeast Asian markets where similar demand patterns are emerging.
The broader significance of DX edge Cool Cube lies in its demonstration that AI infrastructure need not follow the hyperscaler model of massive centralized facilities. For many organizations and use cases, smaller, distributed, purpose-built infrastructure that can be deployed quickly and scaled incrementally represents a more practical and economically viable path to AI adoption. As AI applications increasingly move from cloud-based training to edge-based inference, products like DX edge Cool Cube will play an increasingly important role in the global AI infrastructure ecosystem.