📦 ClawHub
Pywayne LLM Chat Bot
by @wangyendt
LLM chat interface using OpenAI-compatible APIs with streaming support and session management. Use when working with pywayne.llm.chat_bot module for creating...
💡 Examples
from pywayne.llm.chat_bot import LLMChat

# Create chat instance
chat = LLMChat(
    base_url="https://api.example.com/v1",
    api_key="your_api_key",
    model="deepseek-chat"
)

# Single-turn conversation (non-streaming)
response = chat.ask("Hello, LLM!", stream=False)
print(response)

# Streaming response
for token in chat.ask("Explain recursion", stream=True):
    print(token, end='', flush=True)
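When streaming, a common pattern is to display tokens as they arrive while also accumulating them into the complete reply. A minimal sketch of that accumulation, using a hypothetical stand-in generator in place of chat.ask (which needs a live API):

```python
def fake_stream():
    # Stand-in for chat.ask("...", stream=True); yields tokens
    # incrementally the way a streaming response would.
    for token in ["Recursion ", "is ", "a ", "function ", "calling ", "itself."]:
        yield token

# Display each token immediately, but keep the pieces for later use
parts = []
for token in fake_stream():
    print(token, end='', flush=True)  # incremental display
    parts.append(token)
print()  # newline once the stream ends

full_response = ''.join(parts)
```

The same loop works unchanged with the real chat.ask(..., stream=True) iterator.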
⚙️ Configuration
LLMConfig Class
from pywayne.llm.chat_bot import LLMConfig

config = LLMConfig(
    base_url="https://api.example.com/v1",
    api_key="your_api_key",
    model="deepseek-chat",
    temperature=0.7,
    max_tokens=8192,
    top_p=1.0,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    system_prompt="You are a helpful assistant"
)
chat = LLMChat(**config.to_dict())
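The to_dict() call turns the config object into keyword arguments that can be splatted into LLMChat. A hypothetical equivalent using a plain dataclass (illustration of the pattern only, not pywayne's actual implementation):

```python
from dataclasses import dataclass, asdict

@dataclass
class SketchConfig:
    # Hypothetical stand-in for LLMConfig
    base_url: str
    api_key: str
    model: str
    temperature: float = 0.7

cfg = SketchConfig(
    base_url="https://api.example.com/v1",
    api_key="your_api_key",
    model="deepseek-chat",
)
kwargs = asdict(cfg)  # analogous to config.to_dict()
# kwargs can then be unpacked: LLMChat(**kwargs)
```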
Dynamic System Prompt Update
chat.update_system_prompt("You are now a Python expert, provide code examples")
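In OpenAI-compatible chat APIs, the system prompt is conventionally the first message in the conversation history, so updating it typically means replacing that entry. A sketch of that pattern (an assumption about what update_system_prompt does internally, not pywayne's actual code):

```python
# Conversation history in the OpenAI chat-message format
messages = [
    {"role": "system", "content": "You are a helpful assistant"},
    {"role": "user", "content": "Hello, LLM!"},
]

def update_system_prompt(history, prompt):
    # Replace the leading system message, or insert one if absent
    if history and history[0]["role"] == "system":
        history[0]["content"] = prompt
    else:
        history.insert(0, {"role": "system", "content": prompt})

update_system_prompt(
    messages, "You are now a Python expert, provide code examples"
)
```

Because only the system entry changes, the rest of the session history is preserved across the update.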
TERMINAL
clawhub install chat-bot