# RubyLLM: A delightful Ruby way to work with AI

crmne/ruby_llm

A delightful Ruby way to work with AI. No configuration madness, no complex callbacks, no handler hell – just beautiful, expressive Ruby code.

rubyllm.com/


## The problem with AI libraries

Every AI provider ships its own client library, its own response format, its own streaming conventions, and its own approach to error handling. Want to use multiple providers? Get ready to juggle incompatible APIs and bloated dependencies.

RubyLLM fixes all of that. One beautiful API for everything. One consistent format. Minimal dependencies – just Faraday and Zeitwerk. Because working with AI should be a joy, not a chore.

## What makes it great

```ruby
# Just ask questions
chat = RubyLLM.chat
chat.ask "What's the best way to learn Ruby?"

# Analyze images
chat.ask "What's in this image?", with: { image: "ruby_conf.jpg" }

# Analyze audio recordings
chat.ask "Describe this meeting", with: { audio: "meeting.wav" }

# Analyze documents
chat.ask "Summarize this document", with: { pdf: "contract.pdf" }

# Generate images
RubyLLM.paint "a sunset over mountains in watercolor style"

# Create vector embeddings
RubyLLM.embed "Ruby is elegant and expressive"

# Let AI use your code
class Calculator < RubyLLM::Tool
  description "Performs calculations"
  param :expression, type: :string, desc: "Math expression to evaluate"

  def execute(expression:)
    eval(expression).to_s
  end
end

chat.with_tool(Calculator).ask "What's 123 * 456?"
```
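One caveat worth noting: the Calculator above passes model-generated input to `eval`, which will execute arbitrary Ruby. A minimal hardening sketch (SafeCalculator and its character whitelist are illustrative, not part of RubyLLM) restricts the expression to arithmetic characters before evaluating:

```ruby
# Validate model-generated input before eval: only digits, whitespace,
# and arithmetic operators are allowed through.
# (SafeCalculator is a hypothetical name, not part of RubyLLM.)
class SafeCalculator
  ALLOWED = %r{\A[\d\s+\-*/().%]+\z}

  def execute(expression:)
    return "Error: invalid expression" unless expression.match?(ALLOWED)

    eval(expression).to_s # input restricted to arithmetic characters above
  rescue StandardError, SyntaxError => e
    "Error: #{e.message}"
  end
end
```

Returning an error string instead of raising keeps the failure visible to the model, which can then retry with a corrected expression.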

## Installation

```ruby
# In your Gemfile
gem 'ruby_llm'
```

```bash
# Then run
bundle install

# Or install it yourself
gem install ruby_llm
```

Configure with your API keys:

```ruby
RubyLLM.configure do |config|
  config.openai_api_key = ENV['OPENAI_API_KEY']
  config.anthropic_api_key = ENV['ANTHROPIC_API_KEY']
  config.gemini_api_key = ENV['GEMINI_API_KEY']
  config.deepseek_api_key = ENV['DEEPSEEK_API_KEY'] # Optional
end
```

## Have great conversations

```ruby
# Start a chat with the default model (GPT-4o-mini)
chat = RubyLLM.chat

# Or specify what you want
chat = RubyLLM.chat(model: 'claude-3-7-sonnet-20250219')

# Simple questions just work
chat.ask "What's the difference between attr_reader and attr_accessor?"

# Multi-turn conversations are seamless
chat.ask "Could you give me an example?"

# Stream responses in real-time
chat.ask "Tell me a story about a Ruby programmer" do |chunk|
  print chunk.content
end

# Understand content in multiple forms
chat.ask "Compare these diagrams", with: { image: ["diagram1.png", "diagram2.png"] }
chat.ask "Summarize this document", with: { pdf: "contract.pdf" }
chat.ask "What's being said?", with: { audio: "meeting.wav" }

# Need a different model mid-conversation? No problem
chat.with_model('gemini-2.0-flash').ask "What's your favorite algorithm?"
```
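The streaming form above simply yields chunk objects to the block as they arrive, so accumulating the full response is one string append per chunk. A plain-Ruby sketch of that pattern, with a stub standing in for a real chat (FakeChat and Chunk are hypothetical; no API call is involved):

```ruby
# Each yielded chunk carries a piece of the response in #content.
Chunk = Struct.new(:content)

# A stand-in that "streams" a fixed response, chunk by chunk.
class FakeChat
  def ask(_question)
    ["Once ", "upon ", "a ", "time"].each { |word| yield Chunk.new(word) }
  end
end

full = +""
FakeChat.new.ask("Tell me a story") { |chunk| full << chunk.content }
full # => "Once upon a time"
```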

## Rails integration that makes sense

```ruby
# app/models/chat.rb
class Chat < ApplicationRecord
  acts_as_chat

  # Works great with Turbo
  broadcasts_to ->(chat) { "chat_#{chat.id}" }
end

# app/models/message.rb
class Message < ApplicationRecord
  acts_as_message
end

# app/models/tool_call.rb
class ToolCall < ApplicationRecord
  acts_as_tool_call
end

# In your controller
chat = Chat.create!(model_id: "gpt-4o-mini")
chat.ask("What's your favorite Ruby gem?") do |chunk|
  Turbo::StreamsChannel.broadcast_append_to(
    chat,
    target: "response",
    partial: "messages/chunk",
    locals: { chunk: chunk }
  )
end

# That's it - chat history is automatically saved
```

## Creating tools is a breeze

```ruby
class Search < RubyLLM::Tool
  description "Searches a knowledge base"

  param :query, desc: "The search query"
  param :limit, type: :integer, desc: "Max results", required: false

  def execute(query:, limit: 5)
    # Your search logic here
    Document.search(query).limit(limit).map(&:title)
  end
end

# Let the AI use it
chat.with_tool(Search).ask "Find documents about Ruby 3.3 features"
```
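Because a tool's `execute` method is ordinary Ruby, its logic can be unit-tested directly, without any model call. A self-contained sketch with an in-memory store standing in for `Document` (MemorySearch and its DOCS list are illustrative, not part of RubyLLM):

```ruby
# A tool-shaped class whose #execute searches an in-memory document list,
# mirroring the keyword signature the Search tool above exposes.
# (MemorySearch is hypothetical, not part of RubyLLM.)
class MemorySearch
  DOCS = [
    "Ruby 3.3 YJIT updates",
    "Rails routing guide",
    "Ruby 3.3 Prism parser"
  ].freeze

  def execute(query:, limit: 5)
    DOCS.grep(/#{Regexp.escape(query)}/i).first(limit)
  end
end

MemorySearch.new.execute(query: "Ruby 3.3")
# => ["Ruby 3.3 YJIT updates", "Ruby 3.3 Prism parser"]
```

Keeping the search logic this thin makes the tool trivial to exercise in specs before wiring it into a chat.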

## Learn more

See the guides at https://rubyllm.com for a deeper look at conversations with tools, streaming responses, embedding generation, and more.

## License

Released under the MIT License.
