load_chat
gradio.load_chat(···)

Description

Load a chat interface from an OpenAI-API-compatible chat endpoint.
Example Usage
import gradio as gr
demo = gr.load_chat("http://localhost:11434/v1", model="deepseek-r1")
demo.launch()

Initialization
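Because the endpoint only needs to be OpenAI-API compatible, the same call works against a hosted provider by passing an API token. A minimal sketch, assuming a hypothetical hosted base URL, model name, and key (placeholders, not values from this page):

import gradio as gr

# Sketch: hosted OpenAI-compatible endpoint; the URL, model name, and key
# are placeholders to be replaced with your own values.
demo = gr.load_chat(
    "https://api.openai.com/v1",
    model="gpt-4o-mini",
    token="sk-...",  # your API key
)
demo.launch()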
Parameters
token: str | None
default = None
The API token, or a placeholder string if you are using a local model, e.g. "ollama".
file_types: Literal['text_encoded', 'image'] | list[Literal['text_encoded', 'image']] | None
default = "text_encoded"
The file types the user is allowed to upload. "text_encoded" allows uploading any text-encoded file (which is simply appended to the prompt), and "image" adds image upload support. Set to None to disable file uploads.