RubyLLM: A delightful Ruby way to work with AI
A delightful Ruby way to work with AI. No configuration madness, no complex callbacks, no handler hell – just beautiful, expressive Ruby code.
🤺 Battle tested at 💬 Chat with Work
The problem with AI libraries
Every AI provider ships its own client library, its own response format, its own streaming conventions, and its own way of handling errors. Want to use multiple providers? Get ready to juggle incompatible APIs and bloated dependencies.
RubyLLM fixes all that. One beautiful API for everything. One consistent format. Minimal dependencies: just Faraday and Zeitwerk. Because working with AI should be a joy, not a chore.
Features
- 💬 Chat with OpenAI, Anthropic, Gemini, and DeepSeek models
- 👁️ Vision and audio understanding
- 📄 PDF analysis for document understanding
- 🖼️ Image generation with DALL-E and other providers
- 📊 Embeddings for vector search and semantic analysis
- 🔧 Tools that let the AI use your Ruby code
- 🚂 Rails integration to persist chats and messages with ActiveRecord
- 🌊 Streaming responses with proper Ruby patterns
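The streaming style in the last bullet is ordinary Ruby block iteration. A gem-free sketch of the pattern (the `Chunk` struct and `fake_stream` method are stand-ins invented for illustration, not part of RubyLLM):

```ruby
# Stand-in chunk type: RubyLLM yields chunk objects with a #content reader.
Chunk = Struct.new(:content)

# Stand-in stream source: yields one chunk per character.
def fake_stream(text)
  text.each_char { |c| yield Chunk.new(c) }
end

buffer = +""
fake_stream("Hello") { |chunk| buffer << chunk.content }
puts buffer # => "Hello"
```

The real `chat.ask` call later in this README uses the same shape: pass a block, receive chunks as they arrive.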
What makes it great
```ruby
# Just ask questions
chat = RubyLLM.chat
chat.ask "What's the best way to learn Ruby?"

# Analyze images
chat.ask "What's in this image?", with: { image: "ruby_conf.jpg" }

# Analyze audio recordings
chat.ask "Describe this meeting", with: { audio: "meeting.wav" }

# Analyze documents
chat.ask "Summarize this document", with: { pdf: "contract.pdf" }

# Generate images
RubyLLM.paint "a sunset over mountains in watercolor style"

# Create vector embeddings
RubyLLM.embed "Ruby is elegant and expressive"

# Let AI use your code
class Calculator < RubyLLM::Tool
  description "Performs calculations"
  param :expression, type: :string, desc: "Math expression to evaluate"

  def execute(expression:)
    eval(expression).to_s
  end
end

chat.with_tool(Calculator).ask "What's 123 * 456?"
```
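Note that the Calculator above passes model-supplied text straight to `eval`, which is fine for a demo but risky with untrusted input. A gem-free sketch of one possible guard (the `SAFE_EXPRESSION` constant and `safe_calculate` method are illustrative names, not part of RubyLLM):

```ruby
# Illustrative guard: only digits, whitespace, and arithmetic punctuation
# are allowed through before the expression reaches eval.
SAFE_EXPRESSION = /\A[\d\s+\-*\/().]+\z/

def safe_calculate(expression)
  raise ArgumentError, "unsafe expression" unless expression.match?(SAFE_EXPRESSION)
  eval(expression).to_s
end

puts safe_calculate("123 * 456") # => "56088"
```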
Installation

```ruby
# In your Gemfile
gem 'ruby_llm'

# Then run
bundle install

# Or install it yourself
gem install ruby_llm
```
Configure with your API keys:
```ruby
RubyLLM.configure do |config|
  config.openai_api_key = ENV['OPENAI_API_KEY']
  config.anthropic_api_key = ENV['ANTHROPIC_API_KEY']
  config.gemini_api_key = ENV['GEMINI_API_KEY']
  config.deepseek_api_key = ENV['DEEPSEEK_API_KEY'] # Optional
end
```
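In a Rails app, this block typically lives in an initializer. A sketch (the file path is conventional rather than required, and the credentials keys are assumptions about how you store secrets):

```ruby
# config/initializers/ruby_llm.rb (conventional path, not mandated)
RubyLLM.configure do |config|
  config.openai_api_key    = Rails.application.credentials.dig(:openai, :api_key)
  config.anthropic_api_key = Rails.application.credentials.dig(:anthropic, :api_key)
end
```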
Having great conversations
```ruby
# Start a chat with the default model (GPT-4o-mini)
chat = RubyLLM.chat

# Or specify what you want
chat = RubyLLM.chat(model: 'claude-3-7-sonnet-20250219')

# Simple questions just work
chat.ask "What's the difference between attr_reader and attr_accessor?"

# Multi-turn conversations are seamless
chat.ask "Could you give me an example?"

# Stream responses in real-time
chat.ask "Tell me a story about a Ruby programmer" do |chunk|
  print chunk.content
end

# Understand content in multiple forms
chat.ask "Compare these diagrams", with: { image: ["diagram1.png", "diagram2.png"] }
chat.ask "Summarize this document", with: { pdf: "contract.pdf" }
chat.ask "What's being said?", with: { audio: "meeting.wav" }

# Need a different model mid-conversation? No problem
chat.with_model('gemini-2.0-flash').ask "What's your favorite algorithm?"
```
Rails integration that makes sense
```ruby
# app/models/chat.rb
class Chat < ApplicationRecord
  acts_as_chat

  # Works great with Turbo
  broadcasts_to ->(chat) { "chat_#{chat.id}" }
end

# app/models/message.rb
class Message < ApplicationRecord
  acts_as_message
end

# app/models/tool_call.rb
class ToolCall < ApplicationRecord
  acts_as_tool_call
end

# In your controller
chat = Chat.create!(model_id: "gpt-4o-mini")
chat.ask("What's your favorite Ruby gem?") do |chunk|
  Turbo::StreamsChannel.broadcast_append_to(
    chat,
    target: "response",
    partial: "messages/chunk",
    locals: { chunk: chunk }
  )
end

# That's it - chat history is automatically saved
```
Creating tools is a breeze
```ruby
class Search < RubyLLM::Tool
  description "Searches a knowledge base"

  param :query, desc: "The search query"
  param :limit, type: :integer, desc: "Max results", required: false

  def execute(query:, limit: 5)
    # Your search logic here
    Document.search(query).limit(limit).map(&:title)
  end
end

# Let the AI use it
chat.with_tool(Search).ask "Find documents about Ruby 3.3 features"
Learn more
Check out the guides at https://rubyllm.com for deep dives into using tools in conversations, streaming responses, generating embeddings, and more.
License
Released under the MIT License.