codiumai/pr-agent in Azure DevOps with the default .yml
Issues details
The recent commit broke the integration for me since the OpenAI account I am using is not yet Tier 3. This is an easy fix, but due to OpenAI's tier handling, I have to wait 7 days.
Error message: pr_agent.algo.ai_handlers.litellm_ai_handler:chat_completion:276 - Error during LLM inference: litellm.NotFoundError: OpenAIException - Error code: 404 - {'error': {'message': 'The model o3-mini does not exist or you do not have access to it.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}
Additionally, the fallback model doesn't seem to be used and another change seems to have made the previous default incompatible. Locking manually to gpt-4o-2024-11-20 results in the following error:
pr_agent.algo.ai_handlers.litellm_ai_handler:chat_completion:276 - Error during LLM inference: litellm.APIError: AzureException APIError - argument of type 'NoneType' is not iterable
This issue will resolve itself once o3-mini becomes available for this account next week. However, new users may be confused that the default configuration effectively requires OpenAI Tier 3 API access.
You can modify the configuration to use any model you have access to while waiting for your Tier 3 access to activate. For immediate compatibility, you can set the model to Claude 3.7 Sonnet or GPT-4o (if you have access) in your configuration file. This would bypass the requirement for the o3-mini model that's currently causing the error.
Regarding the fallback model issue, the "argument of type 'NoneType' is not iterable" error suggests a parameter-initialization problem when certain model strings are used with Azure. Trying a different fallback model format may help. We would appreciate a contribution investigating this further.
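As a sketch of the workaround described above, the model and fallback can be overridden in pr-agent's configuration file. The model strings below are placeholders, not recommendations; substitute any model your account actually has access to:

```toml
[config]
# Pin the primary model to one your OpenAI tier can use
# (example value; replace with a model available to your account)
model = "gpt-4o-2024-08-06"
# Fallback models are tried if the primary model fails
fallback_models = ["gpt-4o-mini"]
```

Note that, per the report above, the fallback does not currently appear to be invoked on the 404, so overriding the primary `model` is the reliable part of this workaround.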
Git provider (optional): Azure
System Info (optional): codiumai/pr-agent in Azure DevOps with the default .yml