# humanus.cpp
Humanus (meaning "human" in Latin) is a lightweight framework inspired by OpenManus and mem0 and integrated with the Model Context Protocol (MCP). humanus.cpp enables more flexible tool choices and provides a foundation for building powerful local LLM agents.
Let's embrace local LLM agents w/ humanus.cpp!
## Project Demo
## How to Build

```shell
git submodule update --init --recursive

cmake -B build
cmake --build build --config Release
```
## How to Run

Switch to your own configuration first:

- Replace `base_url`, `api_key`, etc. in `config/config_llm.toml` according to your needs.
- Fill in `args` after `"@modelcontextprotocol/server-filesystem"` for `filesystem` to control which files the agent may access. For example:

```toml
[filesystem]
type = "stdio"
command = "npx"
args = ["-y",
        "@modelcontextprotocol/server-filesystem",
        "/Users/{username}/Desktop",
        "other/path/to/your/files"]
```
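For reference, an LLM entry in `config/config_llm.toml` might look like the sketch below. Only `base_url` and `api_key` are named above; the section name and the `model` field are assumptions, so check the `config/config_llm.toml` shipped with the repo for the exact schema:

```toml
# Hypothetical sketch of an LLM provider entry.
# base_url/api_key are the fields mentioned in the instructions above;
# the section name and model field are illustrative assumptions.
[llm]
model = "gpt-4o"                     # assumed field; replace with your model name
base_url = "https://api.openai.com"  # point at your provider or a local server
api_key = "sk-..."                   # your API key
```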
Start an MCP server with tool `python_execute` on port 8818:

```shell
./build/bin/mcp_server                  # Unix/macOS
.\build\bin\Release\mcp_server.exe      # Windows
```
Run agent Humanus with tools `python_execute`, `filesystem` and `puppeteer` (for browser use):

```shell
./build/bin/humanus_cli                  # Unix/macOS
.\build\bin\Release\humanus_cli.exe      # Windows
```
Run the experimental planning flow (with agent Humanus as the only executor):

```shell
./build/bin/humanus_cli_plan                  # Unix/macOS
.\build\bin\Release\humanus_cli_plan.exe      # Windows
```
## Acknowledgement