# humanus.cpp

Humanus (meaning "human" in Latin) is a lightweight framework inspired by OpenManus and mem0 and integrated with the Model Context Protocol (MCP). humanus.cpp enables more flexible tool choices and provides a foundation for building powerful local LLM agents.

Let's embrace local LLM agents w/ humanus.cpp!

## Project Demo

## How to Build

```bash
git submodule update --init --recursive

cmake -B build
cmake --build build --config Release
```

## How to Run

Switch to your own configuration first:

  1. Replace `base_url`, `api_key`, etc. in `config/config_llm.toml` according to your needs (see the sketch after this list).
  2. Fill in the `args` after `"@modelcontextprotocol/server-filesystem"` for the `filesystem` server to control which files it may access. For example:
```toml
[filesystem]
type = "stdio"
command = "npx"
args = ["-y",
        "@modelcontextprotocol/server-filesystem",
        "/Users/{username}/Desktop",
        "other/path/to/your/files"]
```

Start an MCP server with the tool `python_execute` on port 8818:

```bash
./build/bin/mcp_server  # Unix/MacOS
.\build\bin\Release\mcp_server.exe  # Windows
```
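
How the agent reaches this server is not spelled out here; if the config supports a network transport, an entry along the lines of the sketch below is one plausible shape. The `type = "sse"`, `host`, and `port` keys are assumptions patterned on common MCP client configs, not a documented humanus.cpp schema:

```toml
# Hypothetical entry for a networked MCP server; the key names are assumptions
# mirroring the stdio entry above, not a documented schema.
[python_execute]
type = "sse"
host = "localhost"
port = 8818
```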

Run the agent Humanus with the tools `python_execute`, `filesystem` and `puppeteer` (for browser use):

```bash
./build/bin/humanus_cli  # Unix/MacOS
.\build\bin\Release\humanus_cli.exe  # Windows
```
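
The `puppeteer` tool comes from a separate MCP server. An entry patterned on the `filesystem` example above would look roughly like this; the section name is an assumption, while `@modelcontextprotocol/server-puppeteer` is the reference Puppeteer server published on npm:

```toml
# Hypothetical entry for the browser-use tool, patterned on the filesystem
# example; the [puppeteer] section name is an assumption.
[puppeteer]
type = "stdio"
command = "npx"
args = ["-y", "@modelcontextprotocol/server-puppeteer"]
```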

Run the experimental planning flow (with only the agent Humanus as executor):

```bash
./build/bin/humanus_cli_plan  # Unix/MacOS
.\build\bin\Release\humanus_cli_plan.exe  # Windows
```