#go #golang #http #api_gateway #raft #gateway #proxy_server #traffic #load_balancer #etcd #reverse_proxy #mesh #cloud_native #sidecar
Easegress is a cloud-native traffic orchestration system.
https://github.com/megaease/easegress
#typescript #api #api_keys #authentication #authorization #gateway #hacktoberfest #open_source #rate_limiter
Unkey is an open-source platform for API authentication and authorization. It lets developers issue and verify API keys, apply rate limits, and securely manage access to their APIs. Because it is free and community-driven, it can be customized and extended by anyone, and contributions to the project are welcome.
https://github.com/unkeyed/unkey
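To make the key-verification flow concrete, here is a minimal Python sketch of a backend checking a caller-supplied key against Unkey's REST API. The endpoint path, request fields, and response shape are recalled from the public docs and should be treated as assumptions to verify against the Unkey documentation.

```python
# Minimal sketch of server-side API key verification against Unkey's REST API.
# The endpoint path, request body, and response fields below are assumptions
# based on the public docs -- confirm them against the Unkey documentation.
import os
import requests

UNKEY_ROOT_KEY = os.environ["UNKEY_ROOT_KEY"]   # root key from the Unkey dashboard (assumed name)
UNKEY_API_ID = os.environ["UNKEY_API_ID"]       # the API this key belongs to (assumed name)

def verify_key(user_supplied_key: str) -> bool:
    resp = requests.post(
        "https://api.unkey.dev/v1/keys.verifyKey",
        headers={"Authorization": f"Bearer {UNKEY_ROOT_KEY}"},
        json={"apiId": UNKEY_API_ID, "key": user_supplied_key},
        timeout=5,
    )
    resp.raise_for_status()
    body = resp.json()
    # 'valid' is expected to be False for revoked, expired, or rate-limited keys.
    return bool(body.get("valid"))

if __name__ == "__main__":
    print(verify_key("uk_example_123"))  # hypothetical key, for illustration only
```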
#python #ai_gateway #anthropic #azure_openai #bedrock #gateway #langchain #llm #llm_gateway #llmops #openai #openai_proxy #vertex_ai
LiteLLM is a tool that lets you use AI models from different providers such as OpenAI, Azure, and Hugging Face in a simple way. Here's how it benefits you:
- **Unified Interface**: You can call any AI model using a consistent, OpenAI-style format, making it easy to switch between providers (see the sketch after the project link).
- **Consistent Output**: Text responses are always available in the same place in the response object, regardless of provider.
- **Retry and Fallback Logic**: Failed calls can be retried or routed to fallback deployments; streaming responses and asynchronous calls are also supported, which can improve performance.
- **Budgets and Rate Limits**: You can set budgets and rate limits per project, helping you manage costs and usage efficiently.
- **Logging and Observability**: You can log data to tools like Lunary, Langfuse, and Slack to monitor and analyze your AI usage.
Overall, LiteLLM simplifies working with multiple AI providers, keeps your code cleaner, and helps you manage resources better.
https://github.com/BerriAI/litellm
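The "consistent format" point is easiest to see in code. A small sketch using the LiteLLM Python SDK, where only the model string changes between providers; the model names are examples, and provider credentials are read from the usual environment variables.

```python
# pip install litellm
# Provider credentials come from the usual env vars (OPENAI_API_KEY, ANTHROPIC_API_KEY, ...).
from litellm import completion

messages = [{"role": "user", "content": "Summarize what an AI gateway does in one sentence."}]

# Same call shape for every provider; only the model string changes.
openai_resp = completion(model="gpt-4o-mini", messages=messages)
anthropic_resp = completion(model="anthropic/claude-3-haiku-20240307", messages=messages)

# Consistent output: the text is always in the same, OpenAI-style place.
print(openai_resp.choices[0].message.content)
print(anthropic_resp.choices[0].message.content)
```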
#typescript #ai_gateway #gateway #generative_ai #hacktoberfest #langchain #llama_index #llmops #llms #openai #prompt_engineering #router
The AI Gateway by Portkey lets you connect to over 1600 AI models through one simple, consistent API, making it easy to integrate any language, vision, or audio model in under two minutes. The gateway itself adds less than 1 ms of latency and keeps your AI apps reliable and scalable with automatic retries, load balancing, and fallback routing. It also offers strong security through role-based access control, guardrails, and compliance with standards such as SOC 2 and GDPR, while smart caching can cut costs without changes to your code. This helps you build powerful, cost-effective, and secure AI applications faster and with less hassle.
https://github.com/Portkey-AI/gateway
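Because the gateway exposes an OpenAI-compatible API, a client can point at it instead of a provider directly. A rough Python sketch against a locally running gateway follows; the default port (8787), the `npx @portkey-ai/gateway` start command, and the `x-portkey-provider` header are recalled from the project docs and should be confirmed there.

```python
# Sketch of routing a chat request through a locally running Portkey gateway
# (e.g. started with `npx @portkey-ai/gateway`). Port 8787 and the
# `x-portkey-provider` header are assumptions -- check the Portkey docs.
import os
import requests

resp = requests.post(
    "http://localhost:8787/v1/chat/completions",
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",  # forwarded to the upstream provider
        "x-portkey-provider": "openai",  # tells the gateway which upstream to route to
    },
    json={
        "model": "gpt-4o-mini",
        "messages": [{"role": "user", "content": "Hello from behind the gateway"}],
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```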
#python #agents #ai #api_gateway #asyncio #authentication_middleware #devops #docker #fastapi #federation #gateway #generative_ai #jwt #kubernetes #llm_agents #mcp #model_context_protocol #observability #prompt_engineering #tools
The MCP Gateway is a powerful tool that unifies different AI service protocols like REST and MCP into one easy-to-use endpoint. It helps you manage multiple AI tools and services securely with features like authentication, retries, rate-limiting, and real-time monitoring through an admin UI. You can run it locally or in scalable cloud environments using Docker or Kubernetes. It supports various communication methods (HTTP, WebSocket, SSE, stdio) and offers observability with OpenTelemetry for tracking AI tool usage and performance. This gateway simplifies connecting AI clients to diverse services, making development and management more efficient and secure.
https://github.com/IBM/mcp-context-forge
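For a feel of how a client might talk to the gateway over HTTP, here is a hedged Python sketch that authenticates with a bearer token, checks health, and lists registered tools. The base URL, port, `/health` and `/tools` routes, environment variable names, and response fields are illustrative placeholders rather than the project's documented API; consult the mcp-context-forge README for the real ones.

```python
# Hypothetical sketch of querying an MCP Gateway instance over HTTP.
# The base URL, the /health and /tools routes, and the JSON fields are
# illustrative assumptions -- the actual routes are documented in the README.
import os
import requests

GATEWAY_URL = os.environ.get("MCPGATEWAY_URL", "http://localhost:4444")  # assumed default address
TOKEN = os.environ["MCPGATEWAY_BEARER_TOKEN"]  # JWT issued for the admin/API user (assumed name)
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def list_tools() -> list[dict]:
    """Return the tools currently registered with the gateway (assumed /tools route)."""
    resp = requests.get(f"{GATEWAY_URL}/tools", headers=HEADERS, timeout=10)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    health = requests.get(f"{GATEWAY_URL}/health", timeout=5)  # assumed health endpoint
    print("gateway health:", health.status_code)
    for tool in list_tools():
        print(tool.get("name"), "-", tool.get("description"))
```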