I am trying to process the LLM response from a Flowise endpoint in a structured way, e.g. to stream it into Streamlit as you would with OpenAI.
Unfortunately I have not managed to do this. The response itself works, but it does not arrive as a stream.
import requests

API_URL = "http://192.168.0.133:7000/api/v1/prediction/e8c074c0-0956-4cdf-9786-86b0aa47a989"

def query(payload):
    response = requests.post(API_URL, json=payload, stream=True)
    if response.status_code == 200:
        for line in response.iter_lines():
            if line:
                # Process the streaming data here
                data = line.decode('utf-8')
                print("Stream:", data)
    else:
        print("Error:", response.status_code)

# Example query
query({
    "question": "How fast is the light?",
    "overrideConfig": {
        "sessionId": "user1"
    }
})
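For context, this is roughly what I am aiming for on the Streamlit side. It is only a minimal sketch, assuming the Flowise prediction endpoint actually emits incremental chunks when called with stream=True, and assuming Streamlit >= 1.31 for st.write_stream; the stream_tokens helper is hypothetical, not part of the Flowise API.

import requests
import streamlit as st

API_URL = "http://192.168.0.133:7000/api/v1/prediction/e8c074c0-0956-4cdf-9786-86b0aa47a989"

def stream_tokens(payload):
    # Hypothetical generator: yields each decoded line of the streamed response.
    # If the endpoint only returns the full answer at once, this yields a single item.
    with requests.post(API_URL, json=payload, stream=True) as response:
        response.raise_for_status()
        for line in response.iter_lines():
            if line:
                yield line.decode("utf-8")

st.title("Flowise chat")
question = st.text_input("Question", "How fast is the light?")
if st.button("Ask"):
    # st.write_stream consumes the generator and renders chunks as they arrive,
    # similar to the OpenAI streaming examples.
    st.write_stream(stream_tokens({
        "question": question,
        "overrideConfig": {"sessionId": "user1"},
    }))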
Does anyone have any ideas on how to do this?
Here is the Flowise part: