Releases · run-llama/llama_deploy
v0.0.9
v0.0.8
What's Changed
New Features 🎉
New Contributors
- @zhaoweiguo made their first contribution in #127
- @satvik314 made their first contribution in #126
Full Changelog: v0.0.7...v0.0.8
v0.0.7
What's Changed
New Features 🎉
- [Feature] Add RabbitMQMessageQueue (RabbitMQ Integration) by @nerdai in #85
- feat: add redis as message broker by @0xthierry in #113 (a configuration sketch for both brokers follows the bug-fix list below)
Bug Fixes 🐛
- [Fix] - RabbitMQMessageQueue vhost param by @nerdai in #104
- [FIX] Remove overriding of `publish` in `BaseService` by @nerdai in #114
- [fix] support nested service components in pipelines by @logan-markewich in #118
- [fix] lazily init object index to avoid embeddings init by @logan-markewich in #119
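
Both new brokers plug in as message-queue backends. Below is a minimal configuration sketch; the import paths and the `url` parameter are assumptions inferred from these release notes, not a verified v0.0.7 API:

```python
# Sketch only: the import paths and constructor parameters below are
# assumptions inferred from the release notes, not a verified v0.0.7 API.
from llama_agents.message_queues.rabbitmq import RabbitMQMessageQueue
from llama_agents.message_queues.redis import RedisMessageQueue

# The vhost fix in #104 suggests RabbitMQ connections honor the vhost
# segment of a standard AMQP URL: amqp://<user>:<password>@<host>:<port>/<vhost>
rabbitmq_queue = RabbitMQMessageQueue(url="amqp://guest:guest@localhost:5672/llama")

# Redis as an alternative broker (#113); a standard redis:// URL is assumed.
redis_queue = RedisMessageQueue(url="redis://localhost:6379")
```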
Documentation 📚
- [docs] Add docstrings by @logan-markewich in #98
- [example] Add CRAG implementation by @jerryjliu in #82
- nits: minor crag edits by @jerryjliu in #107
- Example: multi-agent app with RabbitMQ (with/without Docker + K8s) by @nerdai in #105
- [docs] Two fixes to provide better out-of-box experience to get started with llama agents by @peteryxu in #106
New Contributors
- @peteryxu made their first contribution in #106
- @0xthierry made their first contribution in #113
Full Changelog: v0.0.4...v0.0.7
v0.0.6
v0.0.5
- Added RabbitMQ Message Queue Support
- Refactored the base message queue so that registering a consumer returns a callable that starts consuming (a common pattern for most queues); see the sketch below
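
The "return a callable" pattern can be illustrated in isolation. The toy queue below is purely hypothetical (none of these names come from the library): registering a consumer hands back a zero-argument coroutine function that starts the consume loop, so the caller decides when consumption begins rather than it starting as a side effect of registration.

```python
import asyncio
from typing import Awaitable, Callable


class ToyMessageQueue:
    """Illustrative stand-in for a message queue; not the library's API."""

    def __init__(self) -> None:
        self._messages: asyncio.Queue = asyncio.Queue()

    async def publish(self, message: str) -> None:
        await self._messages.put(message)

    def register_consumer(
        self, handler: Callable[[str], Awaitable[None]]
    ) -> Callable[[], Awaitable[None]]:
        # Instead of starting the consume loop as a side effect of
        # registration, hand the caller a callable that starts it on demand.
        async def start_consuming() -> None:
            while True:
                message = await self._messages.get()
                await handler(message)

        return start_consuming


async def main() -> None:
    queue = ToyMessageQueue()

    async def handler(message: str) -> None:
        print(f"consumed: {message}")

    start_consuming = queue.register_consumer(handler)
    consume_task = asyncio.create_task(start_consuming())
    await queue.publish("hello")
    await asyncio.sleep(0.1)  # give the consumer a chance to drain the queue
    consume_task.cancel()


asyncio.run(main())
```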