Customization
Details
Full Explanation of config.yaml
We use a single `config.yaml` file as the source to configure the Memobase Backend. An example is like this:
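A minimal sketch of what such a file can look like, assuming the fields documented in the sections below (values here are the documented defaults, not a recommendation):

```yaml
# config.yaml (sketch)
max_chat_blob_buffer_token_size: 1024
persistent_chat_blobs: false
language: en
llm_base_url: https://api.openai.com/v1/
llm_api_key: null   # set your LLM API key here
best_llm_model: gpt-4o-mini
```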
Storage Config
- `max_chat_blob_buffer_token_size`: int, default to `1024`. Controls the size of Memobase's chat buffer. The larger the number, the lower your LLM cost will be, but the more profile updates will lag.
- `max_pre_profile_token_size`: int, default to `512`. The maximum token size a single profile slot can hold. When a profile slot grows larger than this, a re-summary is triggered.
- `max_profile_subtopics`: int, default to `15`. The maximum number of subtopics a single topic can have. When a topic has more than this, a re-organization is triggered.
- `persistent_chat_blobs`: bool, default to `false`. If set to `true`, chat blobs will be persisted in the database.
Profile Config
Check what a profile is in Memobase here.
- `additional_user_profiles`: list, default to `[]`. Adds additional user profile slots. Each profile should have a `topic` and a list of `sub_topics`.
  - For `topic`, it must have a `topic` field and may have a `description` field.
  - For each `sub_topic`, it must have a `name` field (or just be a string) and may have a `description` field.
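The `topic`/`sub_topics` structure described above can be sketched as follows; the topic and subtopic names here are illustrative, not built-in:

```yaml
additional_user_profiles:
  - topic: "Gaming"                 # required `topic` field
    description: "Games the user plays"   # optional `description`
    sub_topics:
      - name: "FPS"                 # a sub_topic with a `name` field
        description: "Favorite FPS titles"  # optional `description`
      - "Console"                   # a sub_topic as a plain string
```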
- `overwrite_user_profiles`: list, default to `null`. The format is the same as `additional_user_profiles`. Memobase ships with some built-in profile slots, such as `work_title` and `name`. If you want full control of the slots, use this parameter: the final profile slots will be exactly the ones you define here.
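For example, to replace all built-in slots with a single topic of your own, a sketch might look like this (the topic and subtopic names are illustrative):

```yaml
overwrite_user_profiles:
  - topic: "Basic Info"
    sub_topics:
      - "name"
      - "work_title"
```

With this in place, Memobase would track only `Basic Info` and its two subtopics, ignoring the other built-in slots.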
LLM Config
- `language`: string, default to `en`; available options: `{'en', 'zh'}`. The prompt language you would like Memobase to use.
- `llm_base_url`: string, default to `https://api.openai.com/v1/`. The base URL of any OpenAI-compatible API.
- `llm_api_key`: string, default to `null`. Your LLM API key.
- `best_llm_model`: string, default to `gpt-4o-mini`. The AI model to use.
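As a fragment of `config.yaml`, the LLM settings above look like this (endpoint and model are the documented defaults; swap in your own for a different OpenAI-compatible provider):

```yaml
# LLM config
language: en                            # or zh
llm_base_url: https://api.openai.com/v1/
llm_api_key: null                       # set your LLM API key here
best_llm_model: gpt-4o-mini
```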