# Conversation Summary Buffer Memory

This memory keeps a buffer of recent interactions and compiles older ones into a running summary, using both when building the conversation context. Instead of pruning old interactions based solely on their count, it uses the total token length of the buffer to decide when to fold them into the summary.

<figure><img src="https://662370747-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F3VoWwSsyrEg0DEvIIjv9%2Fuploads%2Fgit-blob-54e7a64e1291fcd50fe56bc319ae54125b4ffa4b%2Fimage%20(4)%20(1)%20(2).png?alt=media" alt="" width="297"><figcaption></figcaption></figure>

## Input

| Parameter       | Description                                                                   | Default       |
| --------------- | ----------------------------------------------------------------------------- | ------------- |
| Chat Model      | LLM used to perform summarization                                             |               |
| Max Token Limit | Token count at which older messages are summarized                            | 2000          |
| Session Id      | An ID to retrieve/store messages. If not specified, a random ID will be used. |               |
| Memory Key      | A key used to format messages in prompt template                              | chat\_history |
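The mechanism can be sketched as follows. This is a minimal, self-contained illustration, not the node's actual implementation: `count_tokens` is a crude whitespace counter standing in for a real tokenizer, and `summarize` is a placeholder for the call to the configured Chat Model that would normally write the summary.

```python
def count_tokens(text):
    # Crude stand-in for a real tokenizer.
    return len(text.split())

def summarize(summary, messages):
    # Placeholder: a real implementation would ask the chat model
    # to condense these messages into (or onto) the running summary.
    lines = [f"{role}: {content}" for role, content in messages]
    return (summary + " " if summary else "") + "Summary of: " + "; ".join(lines)

class SummaryBufferMemory:
    def __init__(self, max_token_limit=2000):
        self.max_token_limit = max_token_limit
        self.buffer = []   # recent (role, content) pairs, kept verbatim
        self.summary = ""  # compiled summary of older turns

    def add_message(self, role, content):
        self.buffer.append((role, content))
        self._prune()

    def _prune(self):
        # Fold the oldest messages into the summary once the buffer's
        # total token length exceeds the limit.
        pruned = []
        while self.buffer and sum(count_tokens(c) for _, c in self.buffer) > self.max_token_limit:
            pruned.append(self.buffer.pop(0))
        if pruned:
            self.summary = summarize(self.summary, pruned)

    def load_context(self):
        # Both the summary and the recent buffer are used in the prompt;
        # the buffer is exposed under the configured Memory Key.
        return {"summary": self.summary, "chat_history": list(self.buffer)}
```

With a small limit, adding messages past the threshold moves the oldest turns out of the buffer and into the summary, while recent turns stay verbatim.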
