NemoClaw Guide
Deploy secure NemoClaw agents on Maritime.
Overview
NemoClaw is a security wrapper around OpenClaw that uses NVIDIA NIM inference. It sandboxes agents with policy-enforced network egress and routes all inference through NVIDIA cloud (Nemotron models).
Docker Image
Image: maritimeai/template-nemoclaw:latest
Built on the official OpenClaw image with the NemoClaw plugin pre-installed.
How It Works
NemoClaw runs the same OpenClaw gateway on port 18789 but configures an NVIDIA inference provider automatically. When you provide an NVIDIA API key, Maritime writes the provider config so the agent routes LLM calls through NVIDIA NIM endpoints.
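The exact schema of the generated provider file is not documented here; as a rough, hypothetical sketch (the field names and model are assumptions, though the base URL is the standard NVIDIA NIM OpenAI-compatible endpoint), it might resemble:

```json
{
  "provider": "nvidia",
  "baseUrl": "https://integrate.api.nvidia.com/v1",
  "apiKey": "${NVIDIA_API_KEY}",
  "model": "nvidia/llama-3.1-nemotron-70b-instruct"
}
```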
# The container runs with:
OPENCLAW_HEADLESS=true
NVIDIA_API_KEY=nvapi-...
# Provider config is auto-generated at:
# /data/.openclaw/providers/nvidia.json
Environment Variables
Set NVIDIA_API_KEY during agent creation. This key is stored encrypted and injected at runtime. You can also set a gateway password to secure the OpenClaw dashboard.
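For local testing outside Maritime, you could pass the same variables to the container yourself. This is a hypothetical invocation: the image name and port come from this guide, but `OPENCLAW_GATEWAY_PASSWORD` is an assumed variable name for the dashboard password, not a confirmed one.

```shell
# Hypothetical local run; in production Maritime injects these at runtime.
docker run -d \
  -e OPENCLAW_HEADLESS=true \
  -e NVIDIA_API_KEY=nvapi-... \
  -e OPENCLAW_GATEWAY_PASSWORD=changeme \
  -p 18789:18789 \
  maritimeai/template-nemoclaw:latest
```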
Deploy
Click the green NemoClaw button in the Create Agent modal. Provide your NVIDIA API key and an optional gateway password, then deploy.
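After the agent comes up, one quick sanity check is to hit the gateway port; this assumes the dashboard answers plain HTTP on 18789 and that you substitute your agent's actual address for the placeholder host.

```shell
# Replace <agent-host> with your deployed agent's address.
curl -i http://<agent-host>:18789/
```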