To install Gradio from main, run the following command:
pip install https://gradio-builds.s3.amazonaws.com/3fcd23a09df280468351eac4fb081e79c8e58216/gradio-6.0.1-py3-none-any.whl

Note: Setting share=True in launch() will not work.
load
gradio.load(···)

Description
Constructs a Gradio app automatically from a Hugging Face model/Space repo name or a 3rd-party API provider. Note that if a Space repo is loaded, certain high-level attributes of the Blocks (e.g. custom css, js, and head attributes) will not be loaded.
Example Usage
import gradio as gr
demo = gr.load("gradio/question-answering", src="spaces")
demo.launch()

Initialization
name: str
the name of the model (e.g. "google/vit-base-patch16-224") or Space (e.g. "flax-community/spanish-gpt2"). This is the first parameter passed into the `src` function. Can also be formatted as {src}/{repo name} (e.g. "models/google/vit-base-patch16-224") if `src` is not provided.
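For example, the prefixed form below is equivalent to passing `src` explicitly (using the model repo cited above):

import gradio as gr
# The "models/" prefix tells gr.load to treat the name as a Hugging Face model
demo = gr.load("models/google/vit-base-patch16-224")
# equivalent: demo = gr.load("google/vit-base-patch16-224", src="models")
demo.launch()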
src: Callable[[str, str | None], Blocks] | Literal['models', 'spaces', 'huggingface'] | None = None
function that accepts a string model `name` and a string or None `token` and returns a Gradio app. Alternatively, this parameter takes one of two strings for convenience: "models" (for loading a Hugging Face model through the Inference API) or "spaces" (for loading a Hugging Face Space). If None, uses the prefix of the `name` parameter to determine `src`.
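As a rough sketch of the callable form (the loader and repo name below are hypothetical), `src` receives the `name` and `token` arguments and must return a Blocks app:

import gradio as gr

def my_loader(name, token):
    # Hypothetical custom loader: build and return any Blocks app for the given name
    with gr.Blocks() as blocks:
        gr.Markdown(f"Loaded: {name}")
    return blocks

demo = gr.load("my-org/my-model", src=my_loader)
demo.launch()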
token: str | None = None
optional token that is passed as the second parameter to the `src` function. If not explicitly provided, will use the HF_TOKEN environment variable or fall back to the locally-saved HF token when loading models, but not Spaces (when loading Spaces, only provide a token if you are loading a trusted private Space, as the token can be read by the Space you are loading). Find your HF tokens here: https://huggingface.co/settings/tokens.
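For instance, a token could be read from the environment and passed explicitly when loading a private model (the repo name below is a placeholder):

import os
import gradio as gr
demo = gr.load("models/my-org/private-model", token=os.environ["HF_TOKEN"])
demo.launch()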
accept_token: bool | LoginButton = False
if True, a Textbox component is first rendered to allow the user to provide a token, which will be used instead of the `token` parameter when calling the loaded model or Space. Can also provide an instance of a gr.LoginButton in the same Blocks scope, which allows the user to log in with a Hugging Face account whose token will be used instead of the `token` parameter when calling the loaded model or Space.
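A minimal sketch of both variants (the repo names are placeholders):

import gradio as gr

# Render a Textbox asking the user for a token before calling the model
demo = gr.load("models/my-org/private-model", accept_token=True)

# Or let the user log in with their Hugging Face account instead
with gr.Blocks() as demo:
    login_button = gr.LoginButton()
    gr.load("models/my-org/private-model", accept_token=login_button)

demo.launch()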
provider: PROVIDER_T | None = None
the name of the third-party (non-Hugging Face) provider to use for model inference (e.g. "replicate", "sambanova", "fal-ai", etc.). Should be one of the providers supported by `huggingface_hub.InferenceClient`. This parameter is only used when `src` is "models".
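For example, a model could be routed through one of the supported providers (the model/provider pairing below is illustrative):

import gradio as gr
demo = gr.load("models/black-forest-labs/FLUX.1-dev", provider="fal-ai")
demo.launch()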