Minisforum's new flagship NAS comes with OpenClaw pre-installed — Strix Halo-powered N5 Max can run a local LLM
What could possibly go wrong?
Minisforum has announced an upcoming NAS built from the ground up to run large language models locally. The yet-to-be-released N5 Max AI NAS is powered by a Ryzen AI Max+ 395 "Strix Halo" APU and ships with OpenClaw, an open-source AI framework that can be configured to run a variety of tasks, pre-installed. Pricing and a release date have yet to be announced.
The small-form-factor manufacturer has not shared the NAS's full specifications, notably the unit's storage capacity. All we know officially is the CPU inside: AMD's flagship Strix Halo APU, sporting 16 Zen 5 CPU cores that boost up to 5.1GHz, a Radeon 8060S iGPU with 40 CUs, an XDNA 2 NPU, and 64MB of L3 cache. The Max+ 395 can be configured with 32GB to 128GB of system memory; Minisforum is likely using a higher capacity of 64GB to 128GB, since larger LLMs require correspondingly more memory to run.
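To see why memory capacity matters, a back-of-the-envelope estimate helps: a model's weights occupy roughly its parameter count times the bytes per parameter, plus some overhead. The formula and numbers below are illustrative assumptions, not Minisforum specs.

```python
# Rough LLM memory-footprint estimate (illustrative, not official specs):
# weights ≈ parameters × bits-per-weight / 8, plus a modest overhead
# factor for the KV cache and activations.

def estimated_memory_gb(params_billions: float, bits_per_weight: int,
                        overhead: float = 1.2) -> float:
    """Approximate memory needed to host a model of the given size."""
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# A 70B-parameter model quantized to 4 bits needs on the order of 42 GB,
# which fits comfortably in a 64GB-128GB Strix Halo configuration:
print(round(estimated_memory_gb(70, 4)))  # → 42
```

By the same arithmetic, a 32GB configuration would be limited to much smaller or more aggressively quantized models, which is why a higher-capacity build seems likely here.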
That said, we can make some educated guesses about the system's other specs. Minisforum's Max variant of the N5 series appears to use the same chassis as the outgoing N5 AI NAS and N5 AI Pro NAS. If so, the Max version will likely share the same storage configuration as the N5 AI/N5 AI Pro: five 3.5-inch/2.5-inch drive bays and three M.2 slots, two of which support U.2 drives. The drive bays alone support up to 30TB per drive.
AI acceleration in network-attached storage systems is a rapidly growing trend in the computing industry. The capability lets a single box serve as both a NAS and a local AI server. Keeping the LLM local also improves security, as all data processing and interactions happen on the machine itself and are never sent out to the internet.
OpenClaw is not an LLM like Copilot or Gemini; rather, it is an AI framework that can be programmed to run a variety of tasks. For instance, OpenClaw can be set up as a photo search engine controlled with conversational prompts. It can also be configured to edit videos based on prompts, automate emails, publish social media posts, and more. Under the hood, OpenClaw routes messages to an LLM, which then decides which tools to use to fulfill the user's request.
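The routing pattern described above can be sketched in a few lines. This is a generic illustration of tool dispatch, not OpenClaw's actual API; every name here (`tool`, `choose_tool`, `route`) is hypothetical, and the LLM's decision is stubbed out with a keyword check.

```python
# Minimal sketch of LLM tool routing — NOT OpenClaw's real API.
# A user message is handed to a decision step (a stub standing in for
# the LLM), which picks one of the registered tools to run.

from typing import Callable, Dict

TOOLS: Dict[str, Callable[[str], str]] = {}

def tool(name: str):
    """Register a function as a tool the router can dispatch to."""
    def register(fn: Callable[[str], str]) -> Callable[[str], str]:
        TOOLS[name] = fn
        return fn
    return register

@tool("photo_search")
def photo_search(message: str) -> str:
    return f"searching photo library for: {message}"

@tool("send_email")
def send_email(message: str) -> str:
    return f"drafting email about: {message}"

def choose_tool(message: str) -> str:
    """Stand-in for the LLM's choice; a real framework would ask the
    model which registered tool matches the user's intent."""
    return "send_email" if "email" in message.lower() else "photo_search"

def route(message: str) -> str:
    """Route the message to whichever tool the decision step picked."""
    return TOOLS[choose_tool(message)](message)

print(route("find my beach vacation photos"))
```

In a real agent framework the `choose_tool` step is itself an LLM call, which is exactly why misconfiguration is risky: the model, not the user, decides which tools run.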
OpenClaw has exploded in popularity, but security remains one of the framework's biggest flaws. Beyond the framework's own security issues, which can leak sensitive data to the internet if it is not configured properly, malicious content has been found on ClawHub, a hub where OpenClaw users install third-party extensions.
