
Commit 93e69c7

Regenerate llms.txt to include bedrock
1 parent 6ed81e6 commit 93e69c7

File tree

1 file changed: +35 -0 lines changed


llms.txt

Lines changed: 35 additions & 0 deletions
@@ -1432,6 +1432,41 @@ You can use OpenInference's `using_attributes` context manager to capture additi
 ```python
 from openinference.instrumentation import using_attributes
 
+def main():
+    with using_attributes(
+        session_id="my-test-session",
+        user_id="my-test-user",
+        tags=["tag-1", "tag-2"],
+        metadata={"foo": "bar"},
+    ):
+        # Your LLM call
+```
+
+### AWS Bedrock
+
+Installation:
+
+```bash
+pip install openinference-instrumentation-bedrock
+```
+
+Then, instrument your AWS calls:
+
+```python
+from openinference.instrumentation.bedrock import BedrockInstrumentor
+
+BedrockInstrumentor().instrument()
+```
+
+That's it! You can now see the traces for your AWS Bedrock calls in the LangWatch dashboard.
+
+## Capturing Metadata
+
+You can use OpenInference's `using_attributes` context manager to capture additional information for your LLM calls, such as the user_id, session_id (equivalent to thread id), tags and metadata:
+
+```python
+from openinference.instrumentation import using_attributes
+
 def main():
     with using_attributes(
         session_id="my-test-session",
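
For context, here is a minimal end-to-end sketch of how the two snippets added by this commit fit together: `BedrockInstrumentor` instruments the Bedrock runtime client so its calls emit spans, and `using_attributes` attaches session, user, tag, and metadata attributes to those spans. This sketch is not part of the diff; it assumes boto3 is installed with AWS credentials configured, that an OpenTelemetry exporter is already pointed at LangWatch (e.g. via `OTEL_EXPORTER_OTLP_*` environment variables), and the Claude model ID is only an example.

```python
# Hypothetical usage sketch -- not part of the llms.txt diff above.
# Assumes: boto3 installed, AWS credentials configured, and an OTLP exporter
# already set up to send traces to LangWatch.
import json

import boto3
from openinference.instrumentation import using_attributes
from openinference.instrumentation.bedrock import BedrockInstrumentor

# Instrument before creating the client (the instrumentor patches client creation).
BedrockInstrumentor().instrument()

client = boto3.client("bedrock-runtime")


def main():
    # Spans created inside this block carry the session, user, tags and metadata.
    with using_attributes(
        session_id="my-test-session",
        user_id="my-test-user",
        tags=["tag-1", "tag-2"],
        metadata={"foo": "bar"},
    ):
        response = client.invoke_model(
            modelId="anthropic.claude-3-haiku-20240307-v1:0",  # example model ID
            body=json.dumps(
                {
                    "anthropic_version": "bedrock-2023-05-31",
                    "max_tokens": 256,
                    "messages": [{"role": "user", "content": "Hello!"}],
                }
            ),
        )
        print(json.loads(response["body"].read()))


if __name__ == "__main__":
    main()
```

The span emitted for the `invoke_model` call should then appear in the LangWatch dashboard with the session, user, tags, and metadata from `using_attributes` attached.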
