Developing and deploying a basic LLM app on Hugging Face
Table of Contents
- What is...🤓?
- Streamlit
- Features of Streamlit:
- ChatPromptTemplate
- ChatOpenAI Model
- LLMs vs Chat Models
- Requirements & Setup:
- Code
- Output
What is...🤓?
Streamlit
Streamlit is an open-source Python library that lets you create interactive web applications with minimal code.
Features of Streamlit:
Rapid Prototyping: Streamlit makes it easy to quickly build and deploy data-driven web applications with just a few lines of Python code.
Interactive Components: Streamlit provides a wide range of interactive components like sliders, dropdowns, file uploaders, and more, which can be easily integrated into your application.
Data Visualization: Streamlit seamlessly integrates with popular data visualization libraries like Matplotlib, Plotly, and Altair, allowing you to create interactive visualizations.
Deployment: Streamlit applications can be easily deployed to cloud platforms like Heroku, AWS, and Google Cloud.
ChatPromptTemplate
Here are the key points about ChatPromptTemplate and its components in LangChain:
ChatPromptTemplate:
- ChatPromptTemplate is a specialized prompt template in LangChain designed for chat-based models.
- It allows you to create a structured prompt that consists of a sequence of chat messages, each with a specific role (e.g., system, human, AI).
- This is in contrast to the more general PromptTemplate, which is designed for a single prompt string.
Components of ChatPromptTemplate:
- Input Variables: These are the placeholders within the template that will be replaced with actual values when the template is used.
- Messages: The list of chat messages that make up the prompt. Each message has a role (e.g., "system", "human", "ai") and a template string.
- Message Classes: LangChain provides specialized message classes like `SystemMessage`, `HumanMessage`, and `AIMessage` that can be used to define the chat messages. The message classes help structure the input and output of chat-based language models, which expect a sequence of messages rather than a single prompt.
- Formatting: The `format_messages()` method is used to replace the input variables in the chat messages with the provided values, creating the final prompt.
Advantages of ChatPromptTemplate:
- Allows for more structured and contextual prompts, especially for chat-based models.
- Enables the use of different message types (system, human, AI) to guide the language model's responses.
- Integrates with LangChain's memory management features to maintain conversation state.
- Provides a consistent and reusable way to define chat-based prompts.
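To make the structure concrete, here is a minimal plain-Python sketch of what a chat prompt template does: it holds role-tagged message templates and fills in the input variables at format time. This deliberately avoids the LangChain dependency; the real API is `ChatPromptTemplate.from_messages(...)` together with `format_messages(...)`, and the function and variable names below are illustrative.

```python
# Plain-Python sketch mimicking LangChain's ChatPromptTemplate.format_messages():
# a template is a list of (role, text) pairs whose placeholders are filled in
# with the provided values to produce the final list of messages.
def format_messages(template, **values):
    """Substitute input variables into every message of the template."""
    return [(role, text.format(**values)) for role, text in template]

chat_template = [
    ("system", "You are a helpful assistant that answers questions about {topic}."),
    ("human", "{question}"),
]

messages = format_messages(chat_template, topic="Python", question="What is a list?")
print(messages[0])
# ('system', 'You are a helpful assistant that answers questions about Python.')
```

The key idea carries over directly: the same template can be reused across requests, with only the input variables changing.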
ChatOpenAI Model
The `ChatOpenAI` model in LangChain is a specialized language model that is optimized for chat-based interactions. Here are the key points about this model:
Integration with OpenAI's Chat Models:
- The `ChatOpenAI` model allows you to use OpenAI's chat-based language models, such as GPT-3.5 Turbo and GPT-4, within your LangChain-powered applications.
- This integration provides access to the advanced conversational capabilities of OpenAI's chat models.
Structured Prompts and Messages:
- When using the `ChatOpenAI` model, you can leverage LangChain's `ChatPromptTemplate` to define structured prompts that include different message types (system, human, AI).
- This allows you to provide more context and guidance to the language model, leading to more coherent and relevant responses.
Conversational Workflows:
- The `ChatOpenAI` model is designed to work with LangChain's conversational features, such as the `ConversationalRetrievalChain`.
- This enables you to build chat-based applications that maintain context and state across multiple user interactions.
Customization and Fine-tuning:
- The `ChatOpenAI` model supports customization, such as setting the temperature and using fine-tuned OpenAI models.
- This allows you to tailor the language model's behavior to your specific use case and requirements.
Comparison to Regular OpenAI API Calls:
- The `ChatOpenAI` model in LangChain provides a more structured and streamlined way to interact with OpenAI's chat models, compared to making direct API calls.
- LangChain's abstractions and features, such as prompt templates and message classes, can simplify the development of chat-based applications.
LLMs vs Chat Models
The key differences between LangChain LLMs (Large Language Models) and LangChain ChatModels are:
Input and Output Format:
- LLMs: LLMs in LangChain take a single string as input and return a single string as output.
- ChatModels: ChatModels in LangChain take a list of messages as input, where each message has a role (e.g., "human", "ai", "system") and content. The ChatModel then returns a new chat message.
Prompt Structure:
- LLMs: LLMs use PromptTemplates, which allow you to define a template with placeholders that can be filled in with dynamic values.
- ChatModels: ChatModels use ChatPromptTemplates, which allow you to define a sequence of messages with different roles, creating a more structured and contextual prompt.
Conversational Capabilities:
- LLMs: LLMs are more general-purpose language models that can be used for a variety of tasks, such as text generation, summarization, and question answering.
- ChatModels: ChatModels are specifically designed for chat-based interactions, allowing for the maintenance of conversation state and the ability to respond to a sequence of messages.
Integrations:
- LLMs: LangChain supports a wide range of LLMs, including models from OpenAI, Anthropic, Hugging Face, and others.
- ChatModels: LangChain's chat model integrations include OpenAI's chat-based models, such as GPT-3.5 Turbo and GPT-4, as well as chat models from other providers like Anthropic.
In summary, the key difference is that LangChain LLMs are designed for general language tasks, while LangChain ChatModels are specifically tailored for chat-based interactions, with a more structured prompt format and the ability to maintain conversation state. The choice between using an LLM or a ChatModel in LangChain depends on the specific requirements of your application.
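The input/output difference described above can be sketched in plain Python. The two stub classes below are illustrative stand-ins for a real model call, not the actual LangChain API: an LLM maps a string to a string, while a chat model maps a list of role-tagged messages to a new message.

```python
# Illustrative stubs contrasting the two interfaces. The canned replies
# stand in for a real model call.
class FakeLLM:
    """String in, string out: the LLM interface."""
    def invoke(self, prompt: str) -> str:
        return f"(completion for: {prompt})"

class FakeChatModel:
    """Messages in, message out: the ChatModel interface."""
    def invoke(self, messages: list) -> dict:
        last = messages[-1]["content"]
        return {"role": "ai", "content": f"(reply to: {last})"}

llm_out = FakeLLM().invoke("Summarize LangChain in one line.")
chat_out = FakeChatModel().invoke([
    {"role": "system", "content": "You are a concise assistant."},
    {"role": "human", "content": "What is LangChain?"},
])
print(type(llm_out).__name__)   # str
print(chat_out["role"])         # ai
```

Because the chat model receives the whole message list, earlier turns can simply be appended to it, which is what makes conversation state natural to maintain.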
Requirements & Setup:
Python >= 3.8.1

Create and activate a virtual environment:

```shell
# Option 1: use the virtualenv package
pip install virtualenv
virtualenv venv
source venv/bin/activate    # Linux/macOS

# Option 2: use the `venv` module from Python's standard library
python -m venv venv
source venv/bin/activate    # Linux/macOS
venv\Scripts\activate       # Windows

# Install the app's dependencies
pip install -r requirements.txt

# The following packages are installed explicitly since we need them only in
# the development stage and not when the app is deployed
pip install ipykernel
pip install jupyter

# Run `deactivate` to exit the virtual environment when you are done
```
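For reference, a minimal requirements.txt covering the imports used in this app might look like the following (package names only; this is an assumption on my part, and you should pin versions appropriate for your deployment):

```text
streamlit
langchain
openai
python-dotenv
```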
Code
```python
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage
from dotenv import load_dotenv
import os
import streamlit as st

# Load the OpenAI API key from the .env file
load_dotenv(dotenv_path='./.env')
openai_api_key = os.getenv('OPENAI_API_KEY')

def retrieve_model_response(query):
    # A higher temperature (0.9) makes the responses more creative
    llm = ChatOpenAI(openai_api_key=openai_api_key, model_name="gpt-3.5-turbo", temperature=0.9)
    human_message = HumanMessage(content=query)
    response = llm([human_message])
    return response.content

## Initialize our Streamlit app
st.set_page_config(page_title="Q&A Demo")
st.header("LangChain Application")
user_input = st.text_input("Input: ", key="input")
submit = st.button("Ask the question")

## If the ask button is clicked
if submit:
    response = retrieve_model_response(user_input)
    st.subheader("The Response is")
    st.write(response)
```
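Assuming the script is saved as app.py with a .env file containing your OPENAI_API_KEY next to it (filenames here are assumptions), the app can be launched locally with the command below; when deployed to a Hugging Face Space, the platform runs it for you.

```shell
streamlit run app.py
```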
🐱 Github
Output
Figure 1: Output of a simple LangChain app leveraging ChatModels