Maritime Documentation

CrewAI Guide

Deploy CrewAI agents on Maritime.

Overview

The CrewAI template deploys a Python container running a FastAPI server that wraps CrewAI. The server exposes a /run endpoint that accepts a task description, builds a single-agent Crew (Agent + Task + Crew), executes it, and returns the result.

Docker Image

Image: maritimeai/template-crewai:latest

Built on python:3.12-slim with crewai, fastapi, and uvicorn installed. The server listens on port 8080.

API Endpoints

GET  /health  → {"status": "ok"}
POST /run     → {"task": "..."} → {"result": "..."}
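As a quick illustration, a minimal Python client for these endpoints might look like the following sketch. The base URL `http://localhost:8080` is an assumption for a locally running container; substitute your deployed agent's address.

```python
import json
import urllib.request

# Assumed base URL for a locally running container; replace with
# your deployed agent's address.
BASE_URL = "http://localhost:8080"

def build_run_request(task: str, base_url: str = BASE_URL) -> urllib.request.Request:
    """Build a POST /run request carrying the task as a JSON body."""
    body = json.dumps({"task": task}).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/run",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def run_task(task: str, base_url: str = BASE_URL) -> str:
    """Send the task to the agent and return the 'result' field."""
    req = build_run_request(task, base_url)
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["result"]
```

For example, `run_task("Summarize the benefits of containerized agents")` posts the task and returns the crew's string result.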

Source Code

main.py
from crewai import Agent, Crew, Task
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class RunRequest(BaseModel):
    task: str

@app.get("/health")
def health():
    return {"status": "ok"}

@app.post("/run")
def run(req: RunRequest):
    # A single general-purpose agent is created per request.
    agent = Agent(
        role="Assistant",
        goal="Complete the given task accurately",
        backstory="You are a helpful AI assistant.",
    )
    # Wrap the incoming task description in a CrewAI Task and run it.
    task = Task(description=req.task, agent=agent, expected_output="A detailed response")
    crew = Crew(agents=[agent], tasks=[task], verbose=False)
    result = crew.kickoff()
    return {"result": str(result)}

Environment Variables

Set OPENAI_API_KEY in your agent's environment variables for CrewAI to use an LLM.
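If the key is missing, CrewAI will only fail when a request arrives. A fail-fast check at startup (a sketch, not part of the shipped template; the `require_api_key` helper is hypothetical) surfaces the misconfiguration earlier:

```python
import os

def require_api_key(env=os.environ) -> str:
    """Return OPENAI_API_KEY from the environment, or raise a clear
    error so a misconfigured container fails at startup rather than
    on the first /run request."""
    key = env.get("OPENAI_API_KEY")
    if not key:
        raise RuntimeError(
            "OPENAI_API_KEY is not set; configure it in the agent's "
            "environment variables before deploying."
        )
    return key
```

Calling this once at module import time (before `app` is created) turns a silent misconfiguration into an immediate, legible crash.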

Deploy

Select Template → CrewAI Agent in the Create Agent modal. Maritime pulls the image and starts the container automatically.
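After deployment, you can confirm the container is serving traffic by polling /health until it reports ok. The sketch below is illustrative, not part of the template; the fetch function is injectable so the retry logic can be exercised without a live server.

```python
import json
import time
import urllib.request
from typing import Callable

def default_fetch(url: str) -> dict:
    """GET a URL and parse the JSON response body."""
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read())

def wait_until_healthy(
    base_url: str,
    fetch: Callable[[str], dict] = default_fetch,
    attempts: int = 10,
    delay: float = 1.0,
) -> bool:
    """Poll GET /health until it returns {"status": "ok"}, or give up
    after the given number of attempts."""
    for _ in range(attempts):
        try:
            if fetch(f"{base_url}/health").get("status") == "ok":
                return True
        except OSError:
            pass  # container may still be starting
        time.sleep(delay)
    return False
```

For example, `wait_until_healthy("http://localhost:8080")` (the local URL is an assumption) blocks briefly while the container boots, then returns True once the server is up.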