Commit aae88dd
docs: updated README
1 parent 7b1f469 commit aae88dd

1 file changed: README.md
Lines changed: 20 additions & 32 deletions
@@ -55,18 +55,13 @@ gem install traceloop-sdk
 
 Then, to start instrumenting your code, just add this line to your code:
 
-```python
-from traceloop.sdk import Traceloop
+```ruby
+require "traceloop/sdk"
 
-Traceloop.init()
+traceloop = Traceloop::SDK::Traceloop.new
 ```
 
 That's it. You're now tracing your code with OpenLLMetry!
-If you're running this locally, you may want to disable batch sending, so you can see the traces immediately:
-
-```python
-Traceloop.init(disable_batch=True)
-```
 
 Now, you need to decide where to export the traces to.
 
@@ -86,31 +81,24 @@ See [our docs](https://traceloop.com/docs/openllmetry/integrations/exporting) fo
 
 ## 🪗 What do we instrument?
 
-OpenLLMetry can instrument everything that [OpenTelemetry already instruments](https://github.com/open-telemetry/opentelemetry-python-contrib/tree/main/instrumentation) - so things like your DB, API calls, and more. On top of that, we built a set of custom extensions that instrument things like your calls to OpenAI or Anthropic, or your Vector DB like Pinecone, Chroma, or Weaviate.
-
-### LLM Providers
+OpenLLMetry is in early-alpha exploratory stage, and we're still figuring out what to instrument.
+As opposed to other languages, there aren't many official LLM libraries (yet?), so for now you'll have to manually log prompts:
 
-- [x] OpenAI / Azure OpenAI
-- [x] Anthropic
-- [x] Cohere
-- [ ] Replicate
-- [x] HuggingFace
-- [ ] Vertex AI (GCP)
-- [ ] Bedrock (AWS)
+```ruby
+require "openai"
 
-### Vector DBs
+client = OpenAI::Client.new
 
-- [x] Pinecone
-- [x] Chroma
-- [ ] Weaviate
-- [ ] Milvus
-
-### Frameworks
-
-- [x] LangChain
-- [x] [Haystack](https://haystack.deepset.ai/integrations/traceloop)
-- [x] [LiteLLM](https://docs.litellm.ai/docs/observability/traceloop_integration)
-- [ ] LlamaIndex
+traceloop.llm_call() do |tracer|
+  tracer.log_prompt(model="gpt-3.5-turbo", user_prompt="Tell me a joke about OpenTelemetry")
+  response = client.chat(
+    parameters: {
+      model: "gpt-3.5-turbo",
+      messages: [{ role: "user", content: "Tell me a joke about OpenTelemetry" }]
+    })
+  tracer.log_response(response)
+end
+```
 
 ## 🌱 Contributing
 
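The `llm_call` block in this hunk yields a tracer that brackets the provider call, so the prompt and response are logged around the same span. A stdlib-only sketch of that yield-a-tracer pattern — the `ToyTracer` class and top-level `llm_call` helper here are hypothetical, not the gem's actual implementation — might look like this (note that idiomatic Ruby keyword arguments are written `model:`, whereas the `model=` form in the committed snippet assigns a local variable and passes the value positionally):

```ruby
# Hypothetical minimal tracer illustrating the block-yielding pattern above;
# the real traceloop-sdk records OpenTelemetry spans instead of an array.
class ToyTracer
  attr_reader :events

  def initialize
    @events = []
  end

  def log_prompt(model:, user_prompt:)
    @events << { type: :prompt, model: model, content: user_prompt }
  end

  def log_response(response)
    @events << { type: :response, content: response }
  end
end

# Stand-in for traceloop.llm_call: yield a fresh tracer to the block,
# then hand back the recorded events as if exporting them.
def llm_call
  tracer = ToyTracer.new
  yield tracer
  tracer
end

tracer = llm_call do |t|
  t.log_prompt(model: "gpt-3.5-turbo", user_prompt: "Tell me a joke about OpenTelemetry")
  t.log_response({ "choices" => [] }) # stand-in for a real client.chat result
end

puts tracer.events.length # prints 2
```

Yielding the tracer into the block keeps the prompt/response pair tied to one call site, which is what lets an exporter later group them into a single trace.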

@@ -124,8 +112,8 @@ Not sure where to get started? You can:
 
 ## 💚 Community & Support
 
 - [Slack](https://join.slack.com/t/traceloopcommunity/shared_invite/zt-1plpfpm6r-zOHKI028VkpcWdobX65C~g) (For live discussion with the community and the Traceloop team)
-- [GitHub Discussions](https://github.com/traceloop/openllmetry/discussions) (For help with building and deeper conversations about features)
-- [GitHub Issues](https://github.com/traceloop/openllmetry/issues) (For any bugs and errors you encounter using OpenLLMetry)
+- [GitHub Discussions](https://github.com/traceloop/openllmetry-ruby/discussions) (For help with building and deeper conversations about features)
+- [GitHub Issues](https://github.com/traceloop/openllmetry-ruby/issues) (For any bugs and errors you encounter using OpenLLMetry)
 - [Twitter](https://twitter.com/traceloopdev) (Get news fast)
 
 ## 🙏 Special Thanks
