Tenstorrent Launches TT-QuietBox 2: Fully Open-Source RISC-V AI Workstation
Tenstorrent's TT-QuietBox 2 is a RISC-V AI workstation with a fully open-source software stack, offering unprecedented transparency for local AI inference and challenging NVIDIA's CUDA ecosystem dominance.
Tenstorrent, an AI chip company built around the RISC-V architecture, released the TT-QuietBox 2 on March 14, 2026: the world's first commercial AI workstation with a fully open-source software stack. The product pairs Tenstorrent's proprietary Wormhole AI accelerator chips with a completely open-source compiler, runtime, and driver suite, giving AI developers a complete development environment independent of any closed-source software.
Open Source For U published the first in-depth report on the launch. The TT-QuietBox 2 is equipped with 8 Wormhole n150 chips delivering approximately 512 TOPS of INT8 AI compute. While absolute performance still lags behind NVIDIA A100 systems, the fully open-source software stack means developers can inspect and modify every line of code, from the driver level through the compiler to the runtime. Tenstorrent CEO Jim Keller (who previously architected AMD's Zen and Apple's A-series chips) stated at the launch: "AI should not be held hostage by any company's closed-source ecosystem. Developers deserve to know how their code executes on hardware."
Tom's Hardware's review provided a detailed breakdown of the TT-QuietBox 2's hardware specifications. The workstation features a tower design comparable to a standard desktop, with air cooling and noise held below 35 decibels (hence the "QuietBox" name). Beyond the 8 Wormhole chips, it includes 128GB DDR5 memory, a 4TB NVMe SSD, and a SiFive Performance P870 RISC-V host processor. Total system power consumption stays under 500 watts, requiring no special power supply or cooling infrastructure; it can sit on an office desk. Priced at $14,999, with shipping expected to begin in mid-April.
The Register's technical analysis explored Tenstorrent's open-source software strategy in depth. The entire stack is released under the Apache 2.0 license on GitHub, comprising four major components: TT-Metal (low-level runtime and device drivers), TT-Forge (deep learning compiler, similar to NVIDIA TensorRT), TT-NN (neural network library with PyTorch-compatible interfaces), and TT-Studio (visual development IDE). TT-Forge currently supports automatic compilation of PyTorch and ONNX models into optimized code for Wormhole chips, making model migration relatively straightforward.
RISC-V International CEO Calista Redmond praised the release: "This proves that RISC-V architecture is capable not only of embedded and IoT applications but also high-performance AI computing. Tenstorrent is pushing the boundaries of RISC-V." The organization's data shows global RISC-V chip shipments exceeded 16 billion units in 2025, though the vast majority were for low-power embedded applications. The TT-QuietBox 2 is the first RISC-V high-performance workstation targeting AI developers.
Tenstorrent's blog provided preliminary performance data. On ResNet-50 inference, TT-QuietBox 2 throughput reached approximately 60% of an NVIDIA RTX 4090, with the gap narrowing to about 25% on Transformer model inference. The post acknowledged a significant maturity gap compared to NVIDIA's CUDA ecosystem but emphasized that the open-source software stack's advantage lies in community-contributed optimizations that could close the gap over time.
The release sparked broader discussion about the AI chip competitive landscape. NVIDIA currently maintains a near-monopolistic moat through its CUDA ecosystem. A McKinsey analysis report noted that NVIDIA holds approximately 85% share of the global AI training GPU market, with CUDA's developer lock-in effect being its strongest competitive barrier. Tenstorrent's open-source strategy represents an entirely different competitive approach — attracting customers seeking supply chain flexibility by eliminating lock-in effects. Multiple companies including Meta, Samsung, and Hyundai Motor have publicly invested in Tenstorrent, with cumulative funding exceeding $1 billion.
From a commercialization perspective, TT-QuietBox 2's pricing strategy is highly aggressive. The base configuration (8 Wormhole chips, 64GB unified memory) is priced at $14,999, while a roughly comparable NVIDIA DGX Station A100 starts at over $140,000. Although the two are not directly comparable in raw performance metrics (DGX offers higher absolute compute), for local inference needs of small and medium enterprises and research institutions, TT-QuietBox 2's price-performance advantage is overwhelming.
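The claimed price gap can be checked on the back of an envelope using only figures quoted in this article (list prices and the ~512 INT8 TOPS figure); real-world price-performance of course depends on workload and software maturity:

```python
# Back-of-envelope comparison from the article's own numbers.
tt_price = 14_999      # TT-QuietBox 2 base configuration (USD)
dgx_price = 140_000    # quoted NVIDIA DGX Station A100 starting price (USD)
tt_tops_int8 = 512     # quoted INT8 compute for 8 Wormhole n150 chips

price_ratio = dgx_price / tt_price      # how many times more the DGX costs
usd_per_tops = tt_price / tt_tops_int8  # dollars per INT8 TOPS

print(f"DGX Station costs {price_ratio:.1f}x more")        # ≈ 9.3x
print(f"TT-QuietBox 2: ${usd_per_tops:.1f} per INT8 TOPS")  # ≈ $29.3
```

A roughly 9x sticker-price gap is what underwrites the "overwhelming" price-performance claim for buyers whose workloads fit within the smaller machine's absolute compute.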
Japanese and European markets have shown particularly strong interest. According to Tenstorrent, 40% of pre-orders came from Japan (mostly from the automotive industry for local inference testing of autonomous driving models), and 30% from Europe (primarily enterprise customers driven by data sovereignty regulations who need to keep AI inference entirely on-premises rather than uploading to US cloud providers). This validates an emerging market trend: the "decentralization" and "localization" of AI inference.
On the open-source ecosystem front, Tenstorrent's strategy stands in stark contrast to NVIDIA's. While NVIDIA's CUDA has extensive documentation and a large community, its core code is entirely closed-source; developers cannot see or modify the underlying implementations. The TT-QuietBox 2's fully open-source nature means academic researchers can study AI accelerator design principles down to the hardware abstraction layer, which carries significant educational value for training the next generation of AI hardware engineers. Over 20 companies and research institutions have joined Tenstorrent's Open Source AI Hardware Alliance, including Samsung, Bosch, and several European automakers. Jim Keller stated at the launch: "We're not building another GPU alternative. We're building an AI computing ecosystem not controlled by any single company."