You can prepend history messages to give the LLM some context of the conversation. For example, if you want the LLM to remember the user's name:
Python:

import requests

API_URL = "http://localhost:3000/api/v1/prediction/<chatflowid>"

def query(payload):
    response = requests.post(API_URL, json=payload)
    return response.json()

output = query({
    "question": "Hey, how are you?",
    "history": [
        {
            "role": "apiMessage",
            "content": "Hello how can I help?"
        },
        {
            "role": "userMessage",
            "content": "Hi my name is Brian"
        },
        {
            "role": "apiMessage",
            "content": "Hi Brian, how can I help?"
        }
    ]
})
JavaScript:

async function query(data) {
    const response = await fetch(
        "http://localhost:3000/api/v1/prediction/<chatflowid>",
        {
            method: "POST",
            headers: {
                "Content-Type": "application/json"
            },
            body: JSON.stringify(data)
        }
    );
    const result = await response.json();
    return result;
}

query({
    "question": "Hey, how are you?",
    "history": [
        {
            "role": "apiMessage",
            "content": "Hello how can I help?"
        },
        {
            "role": "userMessage",
            "content": "Hi my name is Brian"
        },
        {
            "role": "apiMessage",
            "content": "Hi Brian, how can I help?"
        }
    ]
}).then((response) => {
    console.log(response);
});
Persist Memory
If the chatflow contains Memory nodes, you can pass a sessionId to persist the state of the conversation, so that every subsequent API call has context from the previous conversation. Otherwise, a new session is generated for each call.
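As a minimal sketch of this idea: one common pattern is to pass the sessionId inside overrideConfig. The session value "demo-session-001" below is an arbitrary example, not a required format; adjust the field placement to your flow's configuration.

```python
# Sketch: keep a conversation going across calls by reusing one sessionId.
def build_payload(question, session_id):
    """Build a prediction request body pinned to a conversation session."""
    return {
        "question": question,
        "overrideConfig": {"sessionId": session_id},
    }

first = build_payload("Hi my name is Brian", "demo-session-001")
followup = build_payload("What is my name?", "demo-session-001")
# Both requests carry the same sessionId, so a Memory node can retrieve
# the earlier turn when answering the follow-up.
```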
You can assign an API key to the Prediction API from the UI. Refer to Chatflows and APIs for more details.
The Authorization header must be provided in the HTTP call with the correct API key:
"Authorization": "Bearer <your-api-key>"
2. Vector Upsert API
POST /api/v1/vector/upsert/{your-chatflowid}
Request Body
| Key | Description | Type | Required |
| --- | --- | --- | --- |
| overrideConfig | Override existing flow configuration | object | No |
| stopNodeId | Node ID of the vector store. When a flow has multiple vector stores, you might not want to upsert all of them; specifying stopNodeId ensures only that specific vector store node is upserted. | array | No |
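A sketch of a request body using these keys. The node id "chromaUpsert_0" and the chunkSize override are hypothetical examples; use the actual node id and settings from your own flow.

```python
# Hypothetical values for illustration only.
upsert_body = {
    "overrideConfig": {
        "chunkSize": 1000,  # example flow setting being overridden
    },
    # Restrict the upsert to one vector store node when the flow has several.
    "stopNodeId": ["chromaUpsert_0"],
}
```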
Document Loaders with Upload
Some document loaders in Tailwinds allow the user to upload files:
If the flow contains Document Loaders with the Upload File functionality, the API call looks slightly different: instead of passing the body as JSON, form-data is used, which allows you to upload files to the API.
It is the user's responsibility to make sure the file type matches what the document loader expects. For example, if a Text File Loader is used, you should only upload files with a .txt extension.
Python:

import requests

API_URL = "http://<yourtenant>.tailwinds.innovativesol.com/api/v1/vector/upsert/<chatflowid>"

# use form data to upload files
form_data = {
    "files": ("state_of_the_union.txt", open("state_of_the_union.txt", "rb"))
}
body_data = {
    "returnSourceDocuments": True
}

def query(form_data):
    response = requests.post(API_URL, files=form_data, data=body_data)
    print(response)
    return response.json()

output = query(form_data)
print(output)
JavaScript:

// use FormData to upload files
let formData = new FormData();
formData.append("files", input.files[0]);
formData.append("returnSourceDocuments", true);

async function query(formData) {
    const response = await fetch(
        "http://localhost:3000/api/v1/vector/upsert/<chatflowid>",
        {
            method: "POST",
            body: formData
        }
    );
    const result = await response.json();
    return result;
}

query(formData).then((response) => {
    console.log(response);
});
Document Loaders without Upload
For other Document Loader nodes without the Upload File functionality, the API body is in JSON format, similar to the Prediction API.
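For instance, a sketch reusing the requests pattern from the upload example, with the body sent as plain JSON; the returnSourceDocuments option mirrors the one used above:

```python
import requests

API_URL = "http://localhost:3000/api/v1/vector/upsert/<chatflowid>"

# Plain JSON body, same shape as the Prediction API calls.
payload = {"returnSourceDocuments": True}

def query(payload):
    # No form-data needed when the loader does not take file uploads.
    response = requests.post(API_URL, json=payload)
    return response.json()
```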