Currently, you can’t connect your own OpenAI or other LLM provider keys to Localazy AI. The system is built as an integrated service, not an API wrapper, and connecting your own tokens would lead to unexpected errors and unpredictable behavior.
Rate limits vary by tier. Your OpenAI account might sit on a tier with strict limits, and those limits differ between tiers, between models, and with your usage patterns. Localazy AI is not subject to these per-account limits.
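To illustrate what using your own key would involve: callers of a rate-limited API typically have to wrap every request in retry logic with exponential backoff. This is a minimal sketch, not Localazy or OpenAI code; `RateLimitError` stands in for a provider's HTTP 429 error.

```python
import time


class RateLimitError(Exception):
    """Stand-in for a provider's HTTP 429 (rate limit) error."""


def with_backoff(call, max_retries=5, base_delay=1.0):
    """Retry `call` on rate-limit errors, doubling the wait each time."""
    for attempt in range(max_retries):
        try:
            return call()
        except RateLimitError:
            if attempt == max_retries - 1:
                raise  # give up after the last attempt
            time.sleep(base_delay * 2 ** attempt)
```

With an integrated service, this bookkeeping (and tuning the delays per tier and per model) is handled for you rather than in your own code.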
Models disappear or change. If OpenAI deprecates a model or changes its behavior, your integration could break, and you would need to update the model manually, regenerate the token, and so on. With Localazy AI, we handle model updates and ensure translations remain consistent when underlying models change.
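For context on what manual model maintenance looks like, here is a hedged sketch of the kind of fallback logic an integration owner would otherwise maintain themselves. The model names and the `MODEL_FALLBACKS` map are purely illustrative, not real deprecation data.

```python
# Hypothetical map from deprecated model snapshots to their replacements.
MODEL_FALLBACKS = {
    "example-model-0301": "example-model",  # illustrative names only
}


def resolve_model(requested, available):
    """Return `requested` if still served, else a mapped replacement.

    `available` is the set of model names the provider currently serves.
    Raises ValueError when neither the model nor a fallback exists.
    """
    if requested in available:
        return requested
    replacement = MODEL_FALLBACKS.get(requested)
    if replacement in available:
        return replacement
    raise ValueError(f"Model {requested!r} is no longer available")
```

Every deprecation notice would require updating this map, re-testing output quality, and possibly rotating credentials; an integrated service absorbs that churn instead.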
Please help us improve our documentation.
We appreciate all your feedback.