
Pass client timeout to underlying client #452

Merged: 3 commits merged from apost71:main into run-llama:main on Feb 9, 2025
Conversation

apost71 (Contributor) commented Feb 7, 2025

This is a PR to address #374

@masci I think you're right about the fine-grained timeout configurations. Adding that capability would complicate serialization and make configuration via environment variables harder, so it may not be worth the effort.
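
For illustration, a minimal sketch of the single-value approach under discussion, assuming the client wraps `httpx` (the class and attribute names below are hypothetical, not the actual `llama_deploy` code):

```python
import httpx


class Client:
    """Hypothetical wrapper that forwards one timeout value to httpx."""

    def __init__(self, base_url: str, timeout: float | None = 120.0) -> None:
        # A single float applies to connect/read/write/pool alike, so it
        # serializes to one value and maps onto one environment variable,
        # e.g. timeout=float(os.environ["LLAMA_DEPLOY_TIMEOUT"]).
        self._client = httpx.AsyncClient(base_url=base_url, timeout=timeout)
```

The fine-grained alternative would be passing an `httpx.Timeout(connect=5.0, read=60.0, write=10.0, pool=5.0)` object instead, which is harder to round-trip through a single config value; that is the flexibility being traded away here.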

coveralls commented Feb 7, 2025

Coverage Status: coverage remained the same at 79.146% when pulling 0617de3 on apost71:main into 9c33c0d on run-llama:main.

masci (Member) left a comment
Thanks for the PR! Would you mind also adding the timeout to the async client here: https://github.com/run-llama/llama_deploy/blob/main/llama_deploy/client/async_client.py#L137? We should really deprecate those two classes in favor of Client, but until then it's probably better to stay consistent.
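
A hedged sketch of the kind of change being requested, assuming the async client opens a fresh `httpx.AsyncClient` per call; the class name, method, and endpoint path below are illustrative, not the actual code at the linked line:

```python
import httpx


class AsyncLlamaDeployClient:
    """Hypothetical excerpt of the async client."""

    def __init__(self, control_plane_url: str, timeout: float | None = 120.0):
        self.control_plane_url = control_plane_url
        self.timeout = timeout

    async def list_services(self) -> list:
        # Forward the configured timeout instead of falling back to
        # httpx's default.
        async with httpx.AsyncClient(timeout=self.timeout) as client:
            response = await client.get(f"{self.control_plane_url}/services")
            response.raise_for_status()
            return response.json()
```

Without passing it through, `httpx` falls back to its 5-second default, which is easy to exceed on long-running control-plane calls.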

apost71 (Contributor, Author) commented Feb 8, 2025

Good catch, updated.

apost71 requested a review from masci on February 8, 2025 at 15:53
masci (Member) left a comment

Thank you, merging!

masci merged commit 904d7d3 into run-llama:main on Feb 9, 2025
31 checks passed