Connecting your applications
You can deploy your model with Databricks Model Serving, as you did with your RAG chatbot model in Chapter 7. In this section, we introduce how to host ML demo apps on Hugging Face (HF). Having an easy way to host ML apps lets you build your ML portfolio, showcase your projects at conferences or to stakeholders, and collaborate with others in the ML ecosystem. With HF Spaces, you can choose from several Python libraries to create a web app; two common ones are Streamlit and Gradio.
We prefer Gradio. It is an open source Python package that lets you quickly build a demo or web application for your machine learning model, API, or arbitrary Python function. You can then share a link to your demo or web application in just a few seconds using Gradio’s built-in sharing features. No JavaScript, CSS, or web hosting experience is needed – we love it!
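To make this concrete, here is a minimal sketch of a Gradio demo. The `classify` function is a hypothetical placeholder standing in for your own model or serving endpoint; the rest uses Gradio's standard `Interface` and `launch` APIs.

```python
import gradio as gr


def classify(text: str) -> str:
    # Hypothetical placeholder: swap in a call to your own model,
    # pipeline, or serving endpoint here.
    return "positive" if "good" in text.lower() else "negative"


demo = gr.Interface(
    fn=classify,                              # the Python function to wrap
    inputs=gr.Textbox(label="Review text"),   # UI component for the input
    outputs="text",                           # plain text output component
    title="Sentiment demo",
)

if __name__ == "__main__":
    # share=True generates a temporary public link you can send to others;
    # omit it to run the app locally only.
    demo.launch(share=True)
```

Running this script starts a local web server with an auto-generated UI for the wrapped function; the same file, committed to an HF Space, is how the demo gets hosted there.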
We will walk you through deploying...