From Zero to GenAI Hero: Building Your GenAI App with HuggingFace and Databricks | Databricks Blog
A comprehensive guide to building a GenAI app using a HuggingFace model, MLflow, Unity Catalog, and Databricks Apps, covering setup, development, and deployment.
What is Databricks Feature Serving? - Azure Databricks
Feature Serving provides structured data for RAG applications and makes data in the Databricks platform available to applications deployed outside of Databricks.
With Databricks Feature Serving, you can serve structured data for retrieval-augmented generation (RAG) applications, as well as features required by other applications, such as models served outside of Databricks or any other application that needs features based on data in Unity Catalog.
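As a rough sketch of what an external application's lookup against a Feature Serving endpoint can look like: the endpoint is invoked over REST like any Databricks serving endpoint, with a JSON body containing the primary-key values to fetch features for. The workspace URL, endpoint name (`user-features`), and key column (`user_id`) below are all hypothetical placeholders; the actual HTTP POST (authenticated with a bearer token) is omitted so the snippet stays side-effect free.

```python
import json

# Hypothetical workspace URL and endpoint name -- substitute your own.
WORKSPACE_URL = "https://<your-workspace>.azuredatabricks.net"
ENDPOINT = "user-features"


def build_lookup_payload(keys: list[dict]) -> dict:
    """Build the JSON body for a feature lookup: one record per
    primary-key combination whose features should be returned."""
    return {"dataframe_records": keys}


def invocation_url(workspace: str, endpoint: str) -> str:
    """Feature Serving endpoints are invoked via the same
    /serving-endpoints/<name>/invocations route as model serving."""
    return f"{workspace}/serving-endpoints/{endpoint}/invocations"


payload = build_lookup_payload([{"user_id": 123}, {"user_id": 456}])
print(invocation_url(WORKSPACE_URL, ENDPOINT))
print(json.dumps(payload))
```

In a real application you would POST this payload to the invocation URL with an `Authorization: Bearer <token>` header and read the returned feature values from the JSON response.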
Databricks Foundation Model APIs - Azure Databricks
This article provides an overview of the Foundation Model APIs in Databricks. It includes requirements for use, supported models, and limitations.
Using the Foundation Model APIs you can:
Query a generalized LLM to verify a project’s validity before investing more resources.
Query a generalized LLM to create a quick proof of concept for an LLM-based application before investing in training and deploying a custom model.
Use a foundation model, along with a vector database, to build a chatbot using retrieval augmented generation (RAG).
Replace proprietary models with open alternatives to optimize for cost and performance.
Efficiently compare LLMs to see which is the best candidate for your use case, or swap a production model for a better-performing one.
Build an LLM application for development or production on top of a scalable, SLA-backed LLM serving solution that can support your production traffic spikes.
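The comparison and swapping use cases above can be sketched as follows: because Foundation Model API endpoints accept an OpenAI-style chat payload, trying several candidate models amounts to sending the same prompt to different endpoint names. The workspace URL is a placeholder and the model names are illustrative pay-per-token-style names; check the Serving page in your workspace for the endpoints actually available. The HTTP call itself is omitted here.

```python
import json

# Hypothetical workspace URL -- substitute your own.
WORKSPACE_URL = "https://<your-workspace>.azuredatabricks.net"


def chat_request(model: str, prompt: str, max_tokens: int = 256) -> tuple[str, dict]:
    """Build the invocation URL and an OpenAI-compatible chat payload
    for a serving endpoint. The authenticated POST is left to the
    caller so this helper stays side-effect free."""
    url = f"{WORKSPACE_URL}/serving-endpoints/{model}/invocations"
    body = {
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }
    return url, body


# Compare candidate LLMs by issuing the same prompt to each endpoint,
# or swap models in production by changing a single endpoint name.
candidates = ["databricks-dbrx-instruct", "databricks-meta-llama-3-1-70b-instruct"]
for model in candidates:
    url, body = chat_request(model, "Summarize our return policy in one sentence.")
    print(url)
    print(json.dumps(body))
```

Because only the endpoint name changes between requests, moving from a proof of concept on a pay-per-token endpoint to a provisioned or custom-model endpoint is a one-line change in the application.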