<p align="center">
<img src="assets/logo.png" width="200"/>
</p>
# humanus.cpp
Humanus (meaning "human" in Latin) is a lightweight framework inspired by [OpenManus](https://github.com/mannaandpoem/OpenManus) and [mem0](https://github.com/mem0ai/mem0), integrated with the Model Context Protocol (MCP). `humanus.cpp` enables more flexible tool choices, and provides a foundation for building powerful local LLM agents.
Let's embrace local LLM agents **w/** humanus.cpp!
## Project Demo
## How to Build
```bash
git submodule update --init --recursive
cmake -B build
cmake --build build --config Release
```
## How to Run
Switch to your own configuration first:
1. Replace `base_url`, `api_key`, etc. in `config/config_llm.toml` according to your needs.
2. Fill in `args` after `"@modelcontextprotocol/server-filesystem"` for `filesystem` to control which files it can access. For example:
```toml
[filesystem]
type = "stdio"
command = "npx"
args = ["-y",
        "@modelcontextprotocol/server-filesystem",
        "/Users/{username}/Desktop",
        "other/path/to/your/files"]
```
Start an MCP server with the tool `python_execute` on port 8818:
```bash
./build/bin/mcp_server # Unix/MacOS
```
```shell
.\build\bin\Release\mcp_server.exe # Windows
```
Run agent `Humanus` with tools `python_execute`, `filesystem` and `puppeteer` (for browser use):
```bash
./build/bin/humanus_cli # Unix/MacOS
```
```shell
.\build\bin\Release\humanus_cli.exe # Windows
```
Run the experimental planning flow (with only agent `Humanus` as the executor):
```bash
./build/bin/humanus_cli_plan # Unix/MacOS
```
```shell
.\build\bin\Release\humanus_cli_plan.exe # Windows
```