
09 User Interface

Gradio

  • A quick way to build web interfaces for LLMs.

Setup Gradio

  1. Install the "Desktop Development with C++" workload using the Visual Studio Installer.
  2. Download and install Rust.
  3. Install the gradio package in the Python virtual environment.
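Step 3 boils down to the following commands (a sketch; the environment name .venv is arbitrary):

```shell
# Windows shown; on Linux/macOS use `source .venv/bin/activate` instead.
python -m venv .venv
.venv\Scripts\activate
pip install gradio
```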

Gradio Examples

Simple Example

import gradio as gr

def greet(name: str):
    # Return a greeting for the given name.
    return f'Hello, {name}!'

view = gr.Interface(
    fn=greet,
    inputs=[gr.Textbox(label='Name', lines=2)],
    outputs=[gr.Textbox(label='Greeting', lines=2)]
)
view.launch()
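Since fn is an ordinary Python callable (Gradio maps each input component to one positional argument), the logic can be exercised without launching the UI:

```python
def greet(name: str) -> str:
    return f'Hello, {name}!'

# gr.Interface will call greet(<Textbox value>) positionally,
# so the function can be tested directly:
print(greet('Ada'))
```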

Disable Flagging Button

view = gr.Interface(
    fn=greet,
    inputs=[gr.Textbox(label='Topic', lines=10)],
    outputs=[gr.Textbox(label='Output')],
    allow_flagging='never'  # renamed to flagging_mode in Gradio 5.x
)
view.launch()

Render Markdown

view = gr.Interface(
    fn=greet,
    inputs=[gr.Textbox(label='Topic', lines=10)],
    outputs=[gr.Markdown(label='Output')],
    allow_flagging='never'
)
view.launch()

The same pattern works with multiple inputs, for example a dropdown to choose the Ollama model:

view = gr.Interface(  
    fn=ask_ollama,  
    inputs=[gr.Dropdown(['llama3.2', 'gemma'], label='Model'), gr.Textbox(label='Input')],  
    outputs=[gr.Markdown(label='Output')],  
    allow_flagging='never'  
)  
view.launch()
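ask_ollama is never defined in these notes; a minimal sketch of what it might look like, assuming the ollama Python package and a running Ollama server (the function name and signature are inferred from the two inputs above):

```python
def ask_ollama(model: str, prompt: str) -> str:
    # Assumes the `ollama` package is installed and an Ollama server is
    # running; imported lazily so the module loads even without the package.
    import ollama

    # Non-streaming single-turn request; the reply text is in message.content.
    response = ollama.chat(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return response["message"]["content"]
```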

Chat Interface

import gradio as gr
import ollama

def chat(message: str, history: list):
    model = 'llama3.2'
    # Update this system prompt to change the behavior of the model.
    system_prompt = "You are a helpful assistant."
    messages = [
        {"role": "system", "content": system_prompt},
    ]

    # history is a list of (user, assistant) pairs (Gradio's tuple format).
    for user_message, assistant_message in history:
        messages.append({"role": "user", "content": user_message})  
        messages.append({"role": "assistant", "content": assistant_message})  
    messages.append({"role": "user", "content": message})  
    print("History: ", history)  
    print("Messages: ", messages)  
    response = ollama.chat(model=model, messages=messages, stream=True)  
    all_content = ''  
    for chunk in response:  
        content = chunk.get("message", {}).get("content", "")  
        all_content += content  
        yield all_content


view = gr.ChatInterface(fn=chat)  
view.launch()
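The history-handling part of chat is plain Python and can be tested in isolation; a sketch using a hypothetical helper name history_to_messages, matching the tuple-style history above:

```python
def history_to_messages(history, system_prompt, message):
    # Flatten Gradio's (user, assistant) history pairs into the
    # role/content message list that the Ollama chat API expects.
    messages = [{"role": "system", "content": system_prompt}]
    for user_message, assistant_message in history:
        messages.append({"role": "user", "content": user_message})
        messages.append({"role": "assistant", "content": assistant_message})
    messages.append({"role": "user", "content": message})
    return messages
```

Keeping this conversion separate from the streaming loop makes the chat function easier to reason about and to unit-test.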