#go #ai_gateway #ai_native #api_gateway #cloud_native #envoy
Higress is an AI-native API gateway built on top of Envoy and Istio. It helps manage traffic for AI services, microservices, and other applications. Here are the key benefits:
- **Easy to Use**: You can start Higress with a single Docker command, making it simple for individual developers to set up and use.
- **Production Grade**: It is designed for large-scale scenarios, handling tens of thousands of requests per second without disrupting the service.
- **Security**: It includes WAF protection, multiple authentication strategies, and automatic SSL certificate management, keeping your applications secure.
Overall, Higress makes managing and scaling your applications easier, more efficient, and more secure.
https://github.com/alibaba/higress
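A minimal sketch of sending an OpenAI-style chat request through a locally running Higress AI gateway, assuming it was started with the Docker quickstart. The port, route path, and API key placeholder are illustrative assumptions, not confirmed defaults; check your own gateway configuration.

```python
# Hypothetical example: routing an OpenAI-style chat request through a local
# Higress AI gateway. The base URL, route, and header are assumptions for
# illustration only.
import requests

GATEWAY_BASE = "http://localhost:8080"  # assumed local Higress listener

resp = requests.post(
    f"{GATEWAY_BASE}/v1/chat/completions",   # assumed OpenAI-compatible route
    headers={"Authorization": "Bearer <your-key>"},
    json={
        "model": "gpt-4o-mini",
        "messages": [{"role": "user", "content": "Hello from behind the gateway"}],
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```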
#python #ai_gateway #anthropic #azure_openai #bedrock #gateway #langchain #llm #llm_gateway #llmops #openai #openai_proxy #vertex_ai
LiteLLM is a tool that lets you call AI models from many providers, such as OpenAI, Azure, and Hugging Face, in a simple, uniform way. Here's how it benefits you:
- **Unified Interface**: You can call any supported model using a consistent request format, making it easier to switch between providers.
- **Consistent Output**: Text responses are always available at `['choices'][0]['message']['content']`, regardless of provider.
- **Retry and Fallback Logic**: Failed requests can be retried or routed to fallback deployments, improving reliability.
- **Budgets and Rate Limits**: You can set budgets and rate limits per project, helping you manage costs and usage efficiently.
- **Streaming and Async Support**: It supports streaming responses and asynchronous calls, which can improve performance.
- **Logging and Observability**: You can log data to tools like Lunary, Langfuse, and Slack, helping you monitor and analyze your AI usage.
Overall, LiteLLM simplifies working with multiple AI providers, makes your code cleaner, and helps you manage resources better.
https://github.com/BerriAI/litellm
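A short sketch of the unified calling format, assuming the `litellm` package is installed and the matching provider API keys are set in the environment; the specific model names are illustrative.

```python
# Illustrative sketch: calling two different providers through LiteLLM's
# single completion() interface. Model names are examples; set the matching
# provider keys (e.g. OPENAI_API_KEY, ANTHROPIC_API_KEY) first.
from litellm import completion

messages = [{"role": "user", "content": "Summarize what an AI gateway does."}]

# OpenAI-hosted model
openai_resp = completion(model="gpt-4o-mini", messages=messages)

# Anthropic-hosted model, same call shape
claude_resp = completion(model="anthropic/claude-3-5-sonnet-20240620", messages=messages)

# Consistent output location regardless of provider
print(openai_resp["choices"][0]["message"]["content"])
print(claude_resp["choices"][0]["message"]["content"])
```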
#lua #ai #ai_gateway #api_gateway #api_management #apis #artificial_intelligence #cloud_native #consul #devops #docker #kong #kubernetes #kubernetes_ingress #kubernetes_ingress_controller #luajit #microservice #microservices #nginx #reverse_proxy #serverless
Kong API Gateway is a powerful tool for managing and securing your APIs. It offers advanced routing, load balancing, health checking, and authentication, making it easier to handle API traffic. Kong is highly scalable and runs on many platforms, including Kubernetes, and it supports plugins for additional functionality such as AI traffic management and custom extensions. By using Kong, you can centralize API management and keep your APIs secure and performant. You can get started quickly with a free trial or by following the installation steps.
https://github.com/Kong/kong
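A brief sketch of registering an upstream service and route through Kong's Admin API and enabling key-based authentication. It assumes a local Kong instance with the Admin API on its default port 8001; the service name and upstream URL are hypothetical.

```python
# Sketch: configuring Kong via its Admin API (default port 8001) using the
# requests library. The service name and upstream URL are hypothetical.
import requests

ADMIN = "http://localhost:8001"

# 1. Register an upstream service
requests.post(f"{ADMIN}/services", data={
    "name": "orders-api",                  # hypothetical service name
    "url": "http://orders.internal:5000",  # hypothetical upstream
}).raise_for_status()

# 2. Expose it on a public path
requests.post(f"{ADMIN}/services/orders-api/routes", data={
    "name": "orders-route",
    "paths[]": "/orders",
}).raise_for_status()

# 3. Require API keys on that service via the key-auth plugin
requests.post(f"{ADMIN}/services/orders-api/plugins", data={
    "name": "key-auth",
}).raise_for_status()

print("Service, route, and key-auth plugin configured.")
```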
#typescript #ai_gateway #gateway #generative_ai #hacktoberfest #langchain #llama_index #llmops #llms #openai #prompt_engineering #router
The AI Gateway by Portkey lets you connect to over 1600 AI models quickly and securely through one simple API, making it easy to integrate any language, vision, or audio AI model in under two minutes. It ensures fast responses with less than 1ms latency, automatic retries, load balancing, and fallback options to keep your AI apps reliable and scalable. It also offers strong security with role-based access, guardrails, and compliance with standards like SOC2 and GDPR. You can save costs with smart caching and optimize usage without changing your code. This helps you build powerful, cost-effective, and secure AI applications faster and with less hassle.
https://github.com/Portkey-AI/gateway
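A minimal sketch of routing an OpenAI-format request through a locally running Portkey gateway. The port 8787, `/v1` path, and `x-portkey-provider` header follow the project's quickstart, but treat them as assumptions to verify against your gateway version.

```python
# Sketch: sending an OpenAI-format chat request through a local Portkey
# gateway. Port, path, and header names are assumed from the quickstart.
import os
import requests

resp = requests.post(
    "http://localhost:8787/v1/chat/completions",
    headers={
        "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",
        "x-portkey-provider": "openai",  # switch providers by changing this header
    },
    json={
        "model": "gpt-4o-mini",
        "messages": [{"role": "user", "content": "What does an AI gateway do?"}],
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```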