Building a Discord bot for your fine-tuned LLM
After fine-tuning our model and playing with a Gradio GUI for the application, it's time to take it "upstream": either via a bot, like the one we built for Slack, via Discord, or via a web app. In this post we are going to explore the Discord bot approach using Hugging Face Spaces.
Take, for example, this execute-cdaiapp.py file:
# conda info --envs
# conda activate nameofvenv
# This uses the Lamini API key
# export API__KEY="yourapikey"
import logging
import os
from utilities import *
from lamini import Lamini
import gradio as gr
logger = logging.getLogger(__name__)
global_config = None
# Configure the Lamini API key (the model was fine-tuned with Lamini)
import lamini
print(os.getenv("API__KEY"))
lamini.api_key = os.getenv("API__KEY")
## Using a trained model with your devops data
llm = Lamini(model_name='f6f3eda8e5275a33781f47c622d0a6f1aa5524a35ec8203b89f468afa83bfade0')
def generate_response(prompt):
    response = llm.generate(prompt, output_type={"Response": "str"})
    return response

iface = gr.Interface(
    fn=generate_response,
    allow_flagging="never",
    inputs=[gr.Textbox(label="Question", lines=3)],
    outputs=[gr.Textbox(label="Answer", lines=3)],
    title="Devops AI",
)
iface.launch(share=False)

And then, to deploy it to Hugging Face Spaces, you could do something similar to:
gradio deploy --title devops-ai --app-file…
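Once the Gradio app is working, the same model call can be wired into a Discord bot. Below is a minimal sketch using the discord.py library, assuming you have already created a bot in the Discord developer portal, enabled the message content intent, and exported its token as a DISCORD_TOKEN environment variable; that variable name and the !ask command prefix are illustrative choices, not part of the deployment above. The Lamini client and model ID are reused from execute-cdaiapp.py.

import os
import asyncio
import discord
import lamini
from lamini import Lamini

# Reuse the same Lamini key and fine-tuned model as in execute-cdaiapp.py
lamini.api_key = os.getenv("API__KEY")
llm = Lamini(model_name='f6f3eda8e5275a33781f47c622d0a6f1aa5524a35ec8203b89f468afa83bfade0')

intents = discord.Intents.default()
intents.message_content = True  # required to read message text
client = discord.Client(intents=intents)

@client.event
async def on_ready():
    print(f"Logged in as {client.user}")

@client.event
async def on_message(message):
    # Ignore the bot's own messages
    if message.author == client.user:
        return
    # Answer messages that start with the !ask prefix
    if message.content.startswith("!ask"):
        prompt = message.content[len("!ask"):].strip()
        # Run the blocking Lamini call off the event loop
        answer = await asyncio.to_thread(llm.generate, prompt, output_type={"Response": "str"})
        # The client may return a dict keyed by "Response"; fall back to a plain string
        text = answer.get("Response", str(answer)) if isinstance(answer, dict) else str(answer)
        await message.channel.send(text)

client.run(os.getenv("DISCORD_TOKEN"))

If you end up hosting the bot on a Hugging Face Space, both API__KEY and DISCORD_TOKEN can be stored as Space secrets, which the app then reads as environment variables instead of hard-coding them.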