README.md: 20 additions, 32 deletions
```
gem install traceloop-sdk
```

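If you manage dependencies with Bundler, the equivalent (a standard Gemfile entry, nothing SDK-specific) is:

```ruby
# Gemfile
gem "traceloop-sdk"
```

Then run `bundle install`.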
Then, to start instrumenting your code, just add these lines:

```ruby
require "traceloop/sdk"

traceloop = Traceloop::SDK::Traceloop.new
```

That's it. You're now tracing your code with OpenLLMetry!
Now, you need to decide where to export the traces to.

See [our docs](https://traceloop.com/docs/openllmetry/integrations/exporting) for more information.

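Like most OpenTelemetry-based SDKs, the exporter is typically configured through environment variables. As a sketch, assuming this SDK honors the standard OTel variables (check the docs above for the exact names it reads):

```shell
# Point traces at any OTLP-compatible collector running locally.
# OTEL_EXPORTER_OTLP_ENDPOINT is the standard OpenTelemetry variable;
# whether this SDK reads it is an assumption - consult the exporter docs.
export OTEL_EXPORTER_OTLP_ENDPOINT="http://localhost:4318"
ruby app.rb
```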
## 🪗 What do we instrument?

OpenLLMetry is in an early-alpha, exploratory stage, and we're still figuring out what to instrument.

Unlike in other languages, there aren't yet many official LLM client libraries, so for now you'll have to log prompts and responses manually:

```ruby
traceloop.llm_call do |tracer|
  tracer.log_prompt(model="gpt-3.5-turbo", user_prompt="Tell me a joke about OpenTelemetry")
  response = client.chat(
    parameters: {
      model: "gpt-3.5-turbo",
      messages: [{ role: "user", content: "Tell me a joke about OpenTelemetry" }]
    }
  )
  tracer.log_response(response)
end
```
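The `do |tracer| ... end` form is a common Ruby resource pattern: the SDK method yields a tracer for the duration of the call and can finalize the span once the block returns. A toy illustration of the pattern in plain Ruby (NOT the SDK's actual internals):

```ruby
# Toy yield-based tracing helper, illustrating the block pattern only;
# the real SDK's internals are not shown here.
class ToyTracer
  attr_reader :events

  def initialize
    @events = []
  end

  def log_prompt(prompt)
    @events << [:prompt, prompt]
  end

  def log_response(response)
    @events << [:response, response]
  end
end

# Yields a tracer, collects whatever the block logs, and "finalizes" the
# span by returning the recorded events once the block completes.
def llm_call
  tracer = ToyTracer.new
  yield tracer
  tracer.events
end

events = llm_call do |tracer|
  tracer.log_prompt("Tell me a joke about OpenTelemetry")
  tracer.log_response("Why did the span cross the road?")
end
# events now holds one :prompt entry and one :response entry
```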
## 🌱 Contributing

Not sure where to get started? You can:

## 💚 Community & Support

- [Slack](https://join.slack.com/t/traceloopcommunity/shared_invite/zt-1plpfpm6r-zOHKI028VkpcWdobX65C~g) (For live discussion with the community and the Traceloop team)
- [GitHub Discussions](https://github.com/traceloop/openllmetry-ruby/discussions) (For help with building and deeper conversations about features)
- [GitHub Issues](https://github.com/traceloop/openllmetry-ruby/issues) (For any bugs and errors you encounter using OpenLLMetry)