Prepend history messages at the start of conversation
array
No
You can use the chatflow as an API and connect it to frontend applications.
Override Config
You also have the flexibility to override the input configuration with the overrideConfig property.
import requests

API_URL = "http://<yourtenant>.tailwinds.innovativesol.com/api/v1/prediction/<chatflowid>"

def query(payload):
    response = requests.post(API_URL, json=payload)
    return response.json()

output = query({
    "question": "Hey, how are you?",
    "overrideConfig": {
        "sessionId": "123",
        "returnSourceDocuments": True
    }
})
async function query(data) {
    const response = await fetch(
        "http://<yourtenant>.tailwinds.innovativesol.com/api/v1/prediction/<chatflowid>",
        {
            method: "POST",
            headers: { "Content-Type": "application/json" },
            body: JSON.stringify(data)
        }
    );
    const result = await response.json();
    return result;
}

query({
    "question": "Hey, how are you?",
    "overrideConfig": {
        "sessionId": "123",
        "returnSourceDocuments": true
    }
}).then((response) => {
    console.log(response);
});
History
You can prepend history messages to give some context to the LLM. For example, if you want the LLM to remember the user's name:
import requests

API_URL = "http://localhost:3000/api/v1/prediction/<chatflowid>"

def query(payload):
    response = requests.post(API_URL, json=payload)
    return response.json()

output = query({
    "question": "Hey, how are you?",
    "history": [
        { "role": "apiMessage", "content": "Hello how can I help?" },
        { "role": "userMessage", "content": "Hi my name is Brian" },
        { "role": "apiMessage", "content": "Hi Brian, how can I help?" }
    ]
})
async function query(data) {
    const response = await fetch(
        "http://localhost:3000/api/v1/prediction/<chatflowid>",
        {
            method: "POST",
            headers: { "Content-Type": "application/json" },
            body: JSON.stringify(data)
        }
    );
    const result = await response.json();
    return result;
}

query({
    "question": "Hey, how are you?",
    "history": [
        { "role": "apiMessage", "content": "Hello how can I help?" },
        { "role": "userMessage", "content": "Hi my name is Brian" },
        { "role": "apiMessage", "content": "Hi Brian, how can I help?" }
    ]
}).then((response) => {
    console.log(response);
});
Persist Memory
If the chatflow contains Memory nodes, you can pass a sessionId to persist the state of the conversation, so that every subsequent API call has context about the previous conversation. Otherwise, a new session is generated for each call.
import requests

API_URL = "http://<yourtenant>.tailwinds.innovativesol.com/api/v1/prediction/<chatflowid>"

def query(payload):
    response = requests.post(API_URL, json=payload)
    return response.json()

output = query({
    "question": "Hey, how are you?",
    "overrideConfig": {
        "sessionId": "123"
    }
})
async function query(data) {
    const response = await fetch(
        "http://<yourtenant>.tailwinds.innovativesol.com/api/v1/prediction/<chatflowid>",
        {
            method: "POST",
            headers: { "Content-Type": "application/json" },
            body: JSON.stringify(data)
        }
    );
    const result = await response.json();
    return result;
}

query({
    "question": "Hey, how are you?",
    "overrideConfig": {
        "sessionId": "123"
    }
}).then((response) => {
    console.log(response);
});
Image Uploads
When Allow Image Upload is enabled, images can be uploaded from the chat interface.
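Images can also be sent programmatically through the Prediction API. The sketch below assumes the request body accepts an uploads array whose entries carry a base64 data URI plus type, name, and mime fields; this shape is an assumption and may differ from your deployment, so verify it against your chatflow before relying on it.

```python
import base64
import requests

API_URL = "http://<yourtenant>.tailwinds.innovativesol.com/api/v1/prediction/<chatflowid>"

def build_image_payload(question, image_path):
    # Encode the image as a base64 data URI. The "uploads" entry shape
    # (data / type / name / mime) is an assumed structure, not confirmed
    # by this document.
    with open(image_path, "rb") as f:
        encoded = base64.b64encode(f.read()).decode("utf-8")
    return {
        "question": question,
        "uploads": [
            {
                "data": f"data:image/png;base64,{encoded}",
                "type": "file",
                "name": image_path,
                "mime": "image/png",
            }
        ],
    }

def query(payload):
    response = requests.post(API_URL, json=payload)
    return response.json()
```

You would then call query(build_image_payload("What is in this image?", "photo.png")) against your own tenant URL and chatflow ID.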
You can assign an API key to the Prediction API from the UI. Refer to Chatflows and APIs for more details.
The Authorization header, containing the correct API key, must be provided with the HTTP call.
"Authorization": "Bearer <your-api-key>"
2. Vector Upsert API
POST /api/v1/vector/upsert/{your-chatflowid}
Request Body
Key
Description
Type
Required
overrideConfig
Override existing flow configuration
object
No
stopNodeId
Node ID of the vector store. When you have multiple vector stores in a flow, you might not want to upsert all of them. Specifying stopNodeId will ensure only that specific vector store node is upserted.
array
No
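Based on the table above, stopNodeId is passed in the upsert request body alongside overrideConfig. A minimal Python sketch, assuming the node ID is sent as an array; the node ID shown is a hypothetical placeholder, so copy the real ID from the vector store node in your flow.

```python
import requests

API_URL = "http://<yourtenant>.tailwinds.innovativesol.com/api/v1/vector/upsert/<chatflowid>"

def build_upsert_body(stop_node_ids, override_config=None):
    # Only the vector store node(s) listed in stopNodeId are upserted;
    # other vector stores in the flow are left untouched.
    body = {"stopNodeId": stop_node_ids}
    if override_config:
        body["overrideConfig"] = override_config
    return body

def query(body):
    response = requests.post(API_URL, json=body)
    return response.json()

# "pineconeUpsert_0" is a hypothetical node ID for illustration only
body = build_upsert_body(["pineconeUpsert_0"])
```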
Document Loaders with Upload
Some document loaders in Tailwinds allow users to upload files:
If the flow contains Document Loaders with Upload File functionality, the API looks slightly different: instead of passing the body as JSON, form-data is used. This allows you to upload files to the API.
It is the user's responsibility to make sure the file type is compatible with the file type expected by the document loader. For example, if a Text File Loader is being used, you should only upload files with a .txt extension.
import requests

API_URL = "http://<yourtenant>.tailwinds.innovativesol.com/api/v1/vector/upsert/<chatflowid>"

# use form data to upload files
form_data = {
    "files": ("state_of_the_union.txt", open("state_of_the_union.txt", "rb"))
}
body_data = {
    "returnSourceDocuments": True
}

def query(form_data):
    response = requests.post(API_URL, files=form_data, data=body_data)
    print(response)
    return response.json()

output = query(form_data)
print(output)
// use FormData to upload files
let formData = new FormData();
formData.append("files", input.files[0]);
formData.append("returnSourceDocuments", true);

async function query(formData) {
    const response = await fetch(
        "http://localhost:3000/api/v1/vector/upsert/<chatflowid>",
        {
            method: "POST",
            body: formData
        }
    );
    const result = await response.json();
    return result;
}

query(formData).then((response) => {
    console.log(response);
});
Document Loaders without Upload
For other Document Loader nodes without the Upload File functionality, the API body is in JSON format, similar to the Prediction API.
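As a sketch, such an upsert call is a plain JSON POST, optionally carrying an overrideConfig object. The chunkSize override shown here is only an illustrative parameter name and may not apply to your flow's nodes.

```python
import requests

API_URL = "http://<yourtenant>.tailwinds.innovativesol.com/api/v1/vector/upsert/<chatflowid>"

def build_body(override_config=None):
    # Plain JSON body, mirroring the Prediction API shape
    body = {}
    if override_config:
        body["overrideConfig"] = override_config
    return body

def query(body):
    response = requests.post(API_URL, json=body)
    return response.json()

# "chunkSize" is an illustrative override, not confirmed by this document
body = build_body({"chunkSize": 1000})
```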