English | 中文
# humanus.cpp
Humanus (Latin for "human") is a lightweight C++ framework inspired by OpenManus and mem0, integrated with the Model Context Protocol (MCP). This project aims to provide a fast, modular foundation for building local LLM agents.
Key Features:
- C++ Implementation: Core logic in efficient C++, optimized for speed and minimal overhead
- Lightweight Design: Minimal dependencies and simple architecture, ideal for embedded or resource-constrained environments
- Cross-platform Compatibility: Runs on Linux, macOS, and Windows
- MCP Protocol Integration: Native support for standardized tool interaction via MCP
- Vectorized Memory: Context retrieval using HNSW-based similarity search (see the sketch below)
- Modular Architecture: Easy to plug in new models, tools, or storage backends
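
As a rough illustration of the vectorized-memory idea, here is a minimal, self-contained sketch of HNSW similarity search using hnswlib. Whether humanus.cpp actually uses hnswlib, and how its memory API is shaped, are assumptions here; the sketch only demonstrates the retrieval technique.

```cpp
// Minimal HNSW retrieval sketch, assuming hnswlib as the index backend.
// This is NOT humanus.cpp's memory API — just the underlying technique.
#include <cstdio>
#include <vector>
#include "hnswlib/hnswlib.h"

int main() {
    const int dim = 4;             // embedding dimension (toy value)
    const size_t max_elems = 100;  // index capacity

    hnswlib::L2Space space(dim);
    hnswlib::HierarchicalNSW<float> index(&space, max_elems);

    // Insert two "memories" as embedding vectors with integer labels.
    std::vector<float> a = {0.1f, 0.2f, 0.3f, 0.4f};
    std::vector<float> b = {0.9f, 0.8f, 0.7f, 0.6f};
    index.addPoint(a.data(), 0);
    index.addPoint(b.data(), 1);

    // Retrieve the nearest memory to a query embedding.
    std::vector<float> query = {0.1f, 0.2f, 0.25f, 0.4f};
    auto result = index.searchKnn(query.data(), 1);
    printf("nearest label: %zu (dist %.3f)\n",
           result.top().second, result.top().first);
    return 0;
}
```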
Humanus is still in its early stages and evolving rapidly. We're iterating openly, improving as we go, and we always welcome feedback, ideas, and contributions.
Let's explore the potential of local LLM agents with humanus.cpp!
## Project Demo

## How to Build
```bash
git submodule update --init
cmake -B build
cmake --build build --config Release
```
## How to Run

### Configuration
To set up your custom configuration, follow these steps:

- Copy all files from `config/example` to `config`.
- Replace `base_url`, `api_key`, etc. in `config/config_llm.toml`, and adjust the other settings in `config/config*.toml` as needed (a hypothetical `config_llm.toml` is sketched after the filesystem example below). Note: `llama-server` in llama.cpp also supports embedding models for vectorized memory.
- Fill in `args` after `"@modelcontextprotocol/server-filesystem"` for `filesystem` to control which files may be accessed. For example:
```toml
[filesystem]
type = "stdio"
command = "npx"
args = ["-y",
        "@modelcontextprotocol/server-filesystem",
        "/Users/{Username}/Desktop",
        "other/path/to/your/files"]
```
### mcp_server

(for tools; currently only `python_execute` is provided as an example)

Start an MCP server exposing the tool `python_execute` on port 8895 (or pass a different port as an argument):

```bash
./build/bin/mcp_server <port> # Unix/macOS
.\build\bin\Release\mcp_server.exe <port> # Windows
```
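
By analogy with the `[filesystem]` entry above, an agent could then be pointed at this server over SSE. The key names below are assumptions rather than humanus.cpp's verified schema; check `config/example` for the real one.

```toml
# Hypothetical SSE tool entry — key names are assumptions, not verified.
[python_execute]
type = "sse"
url = "http://localhost:8895/sse"
```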
### humanus_cli

Run with the tools `python_execute`, `filesystem`, and `playwright` (for browser use):

```bash
./build/bin/humanus_cli # Unix/macOS
.\build\bin\Release\humanus_cli.exe # Windows
```
### humanus_cli_plan (WIP)

Run the planning flow (with only the agent `humanus` as executor):

```bash
./build/bin/humanus_cli_plan # Unix/macOS
.\build\bin\Release\humanus_cli_plan.exe # Windows
```
### humanus_server (WIP)

Run agents in an MCP server (listening on port 8896 by default). It exposes the following tools:

- `humanus_initialize`: Pass a JSON configuration (like the one in `config/config.toml`) to initialize an agent for a session. (Only one agent is maintained per session/client.)
- `humanus_run`: Pass a `prompt` telling the agent what to do (see the example call below). (Only one task at a time.)
- `humanus_terminate`: Stop the current task.
- `humanus_status`: Get the current state and other information about the agent and the task. Returns:
  - `state`: agent state.
  - `current_step`: current step index of the agent.
  - `max_steps`: maximum number of steps to execute without user interaction.
  - `prompt_tokens`: prompt (input) token consumption.
  - `completion_tokens`: completion (output) token consumption.
  - `log_buffer`: buffered logs, as in `humanus_cli`; cleared once fetched.
  - `result`: a summary of what the agent did; non-empty once the task is finished.
```bash
./build/bin/humanus_server <port> # Unix/macOS
.\build\bin\Release\humanus_server.exe <port> # Windows
```
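
For reference, a `humanus_run` invocation in the generic MCP wire format would look roughly like this. The shape follows the standard MCP `tools/call` method; the prompt text is purely illustrative and the exact argument schema accepted by humanus_server has not been verified here.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "humanus_run",
    "arguments": { "prompt": "Summarize the files on my Desktop." }
  }
}
```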
Configure it in Cursor:

```json
{
  "mcpServers": {
    "humanus": {
      "url": "http://localhost:8896/sse"
    }
  }
}
```
Experimental feature: MCP in MCP! You can run `humanus_server` and connect to it from another MCP server or `humanus_cli`.
## Acknowledgement
This work was supported by the National Natural Science Foundation of China (No. 62306216) and the Natural Science Foundation of Hubei Province of China (No. 2023AFB816).
## Cite

```bibtex
@misc{humanus_cpp,
  author = {Zihong Zhang and Zuchao Li},
  title  = {humanus.cpp: A Lightweight C++ Framework for Local LLM Agents},
  year   = {2025}
}
```