Show HN: Index – New Open Source browser agent
Index
Index is a SOTA open-source browser agent that autonomously executes complex tasks on the web.
- Powered by reasoning LLMs with vision capabilities:
  - Gemini 2.5 Pro (very fast and accurate)
  - Claude 3.7 Sonnet with extended thinking (reliable and accurate)
  - OpenAI o4-mini (a good balance of speed, cost, and accuracy, depending on the reasoning effort)
  - Gemini 2.5 Flash (very fast, cheap, and great for less complex tasks)
- pip install lmnr-index and use it in your project
- index run to run the agent in the interactive CLI
- Index is also available as a serverless API.
- You can also try out Index via the Chat UI.
- Supports advanced browser agent observability, powered by the open-source platform Laminar.
prompt: go to ycombinator.com. summarize first 3 companies in the W25 batch and make new spreadsheet in google sheets.
Documentation
Check out the full documentation here.
Index API
The easiest way to use Index in production is via the serverless API. The Index API manages remote browser sessions, agent infrastructure, and browser observability. To get started, sign up and create a project API key. Read the docs to learn more.
Install Laminar
pip install lmnr
Use Index via API
from lmnr import Laminar, AsyncLaminarClient
import asyncio

# you can also set the LMNR_PROJECT_API_KEY environment variable
# Initialize tracing
Laminar.initialize(project_api_key="your_api_key")

# Initialize the client
client = AsyncLaminarClient(project_api_key="your_api_key")

async def main():
    response = await client.agent.run(
        prompt="Navigate to news.ycombinator.com, find a post about AI, and summarize it"
    )
    print(response.result)

if __name__ == "__main__":
    asyncio.run(main())
Local Quick Start
Install dependencies
pip install lmnr-index
# Install playwright
playwright install chromium
Setup model API keys
Set your model API keys in a .env file in the project root:
ANTHROPIC_API_KEY=
GEMINI_API_KEY=
OPENAI_API_KEY=
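If you run the agent from your own scripts rather than the CLI and want these keys loaded as environment variables, a minimal sketch using python-dotenv (an assumed helper, not part of lmnr-index) looks like this:

# assumption: python-dotenv is installed (pip install python-dotenv)
from dotenv import load_dotenv

# reads the .env file in the current working directory into os.environ
load_dotenv()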
Run the agent with CLI
You can run Index via the interactive CLI. It features:
- Browser state persistence between sessions
- Follow-up messages with support for the "give human control" action
- Real-time streaming updates
- Beautiful terminal UI built with Textual
You can run the agent with the command below. Remember to set the API key for the selected model in the .env file.
index run
The output will look like this:
Loaded existing browser state
╭───────────────────── Interactive Mode ─────────────────────╮
│ Index Browser Agent Interactive Mode │
│ Type your message and press Enter. The agent will respond. │
│ Press Ctrl+C to exit. │
╰────────────────────────────────────────────────────────────╯
Choose an LLM model:
1. Gemini 2.5 Flash
2. Claude 3.7 Sonnet
3. OpenAI o4-mini
Select model [1/2] (1): 3
Using OpenAI model: o4-mini
Loaded existing browser state
Your message: go to lmnr.ai, summarize pricing page
Agent is working...
Step 1: Opening lmnr.ai
Step 2: Opening Pricing page
Step 3: Scrolling for more pricing details
Step 4: Scrolling back up to view pricing tiers
Step 5: Provided concise summary of the three pricing tiers
Run the agent with code
import asyncio
from index import Agent, AnthropicProvider

async def main():
    llm = AnthropicProvider(
        model="claude-3-7-sonnet-20250219",
        enable_thinking=True,
        thinking_token_budget=2048)

    # you can also use OpenAI models:
    # llm = OpenAIProvider(model="o4-mini")

    agent = Agent(llm=llm)

    output = await agent.run(
        prompt="Navigate to news.ycombinator.com, find a post about AI, and summarize it"
    )
    print(output.result)

if __name__ == "__main__":
    asyncio.run(main())
Stream the agent's output
async for chunk in agent.run_stream(
    prompt="Navigate to news.ycombinator.com, find a post about AI, and summarize it"
):
    print(chunk)
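The snippet above assumes an agent has already been created; a self-contained sketch reusing the AnthropicProvider setup from the previous example could look like this:

import asyncio
from index import Agent, AnthropicProvider

async def main():
    llm = AnthropicProvider(model="claude-3-7-sonnet-20250219")
    agent = Agent(llm=llm)

    # print each chunk as the agent streams its progress
    async for chunk in agent.run_stream(
        prompt="Navigate to news.ycombinator.com, find a post about AI, and summarize it"
    ):
        print(chunk)

if __name__ == "__main__":
    asyncio.run(main())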
Enable browser agent observability
To trace Index agent actions and record browser sessions, you only need to initialize Laminar tracing before running the agent.
from lmnr import Laminar
Laminar.initialize(project_api_key="your_api_key")
You will then get full observability of the agent's actions, synced with the browser session, on the Laminar platform.
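Putting the two pieces together, a minimal traced run (combining the initialization above with the agent example from the Local Quick Start) might look like this:

import asyncio
from lmnr import Laminar
from index import Agent, AnthropicProvider

# initialize tracing before the agent runs
Laminar.initialize(project_api_key="your_api_key")

async def main():
    llm = AnthropicProvider(model="claude-3-7-sonnet-20250219")
    agent = Agent(llm=llm)
    output = await agent.run(
        prompt="Navigate to news.ycombinator.com and find the top story"
    )
    print(output.result)

if __name__ == "__main__":
    asyncio.run(main())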
Run with remote CDP url
import asyncio
from index import Agent, AnthropicProvider, BrowserConfig

async def main():
    # Configure browser to connect to an existing Chrome DevTools Protocol endpoint
    browser_config = BrowserConfig(
        cdp_url="<cdp_url>"
    )

    llm = AnthropicProvider(model="claude-3-7-sonnet-20250219", enable_thinking=True, thinking_token_budget=2048)

    agent = Agent(llm=llm, browser_config=browser_config)

    output = await agent.run(
        prompt="Navigate to news.ycombinator.com and find the top story"
    )
    print(output.result)

if __name__ == "__main__":
    asyncio.run(main())
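For local testing, one way to obtain a CDP URL (an assumption about your environment, not something Index itself provides) is to start Chrome with remote debugging enabled, for example chrome --remote-debugging-port=9222, and point cdp_url at that endpoint:

# assumption: Chrome was started locally with --remote-debugging-port=9222
browser_config = BrowserConfig(
    cdp_url="http://localhost:9222"  # hypothetical local endpoint; replace with your own
)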
Customize browser window size
import asyncio
from index import Agent, AnthropicProvider, BrowserConfig

async def main():
    # Configure browser with custom viewport size
    browser_config = BrowserConfig(
        viewport_size={"width": 1200, "height": 900}
    )

    llm = AnthropicProvider(model="claude-3-7-sonnet-20250219")

    agent = Agent(llm=llm, browser_config=browser_config)

    output = await agent.run(
        "Navigate to a responsive website and capture how it looks in full HD resolution"
    )
    print(output.result)

if __name__ == "__main__":
    asyncio.run(main())
Made with ❤️ by the Laminar team.