GitHub Trends
See what the GitHub community is most excited about today.

A bot automatically fetches new repositories from https://github.com/trending and sends them to the channel.

Author and maintainer: https://github.com/katursis
#go #cqrs #event_driven #event_sourcing #events #golang #kafka #nats #rabbitmq #reactive #sagas #stream_processing #watermill

Watermill is a tool for working with message streams in Go. It helps you build event-driven applications easily and efficiently. You can use it with various messaging systems like Kafka, RabbitMQ, or even HTTP and MySQL. Watermill is designed to be easy to understand, fast, flexible, and resilient. It provides many examples and a getting started guide to help you get going quickly. Using Watermill, you can handle messages in a simple way, similar to how you work with HTTP requests, making it easier to build distributed and scalable services without needing deep knowledge of complex systems. This makes it beneficial for developers who want to focus on their application logic rather than the underlying messaging infrastructure.
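
As a rough illustration of that style, here is a minimal sketch using Watermill's in-memory Pub/Sub; the topic name and payload are made up, and a real service would swap in the Kafka or RabbitMQ implementation and handle shutdown properly.

```go
package main

import (
	"context"
	"log"
	"time"

	"github.com/ThreeDotsLabs/watermill"
	"github.com/ThreeDotsLabs/watermill/message"
	"github.com/ThreeDotsLabs/watermill/pubsub/gochannel"
)

func main() {
	// In-memory Pub/Sub for local experimentation; the Kafka/RabbitMQ
	// implementations expose the same Publisher/Subscriber interfaces.
	pubSub := gochannel.NewGoChannel(gochannel.Config{}, watermill.NewStdLogger(false, false))

	// Subscribe returns a channel of messages, handled much like incoming HTTP requests.
	messages, err := pubSub.Subscribe(context.Background(), "example.topic")
	if err != nil {
		log.Fatal(err)
	}
	go func() {
		for msg := range messages {
			log.Printf("received: %s", msg.Payload)
			msg.Ack() // acknowledge so the message is not redelivered
		}
	}()

	// Publish a single message with a generated UUID.
	if err := pubSub.Publish("example.topic", message.NewMessage(watermill.NewUUID(), []byte("hello"))); err != nil {
		log.Fatal(err)
	}

	time.Sleep(time.Second) // give the subscriber a moment; a real service would block on shutdown signals
}
```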

https://github.com/ThreeDotsLabs/watermill
#go #gnmi #golang #influxdb #json #kafka #logs #metrics #modbus #monitoring #mqtt #opcua #telegraf #time_series #windows_eventlog #windows_management_instrumentation #xpath

Telegraf is a tool that helps collect, process, and send various types of data like metrics, logs, and more. It has over 300 plugins for different tasks such as system monitoring, cloud services, and messaging. You can easily configure it using TOML, and it runs as a standalone binary without extra dependencies. This makes it easy to set up and use. With Telegraf, you can choose plugins to monitor your devices, logs, networks, and more, making it very flexible and powerful for managing your data efficiently.
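
For a sense of what that TOML configuration looks like, here is a minimal sketch that samples CPU metrics and prints them to stdout; the cpu input and file output are standard bundled plugins, but confirm the exact option names against the plugin docs.

```toml
# Minimal telegraf.conf sketch: collect CPU metrics every 10s and print them to stdout.
[agent]
  interval = "10s"

[[inputs.cpu]]
  percpu   = true   # per-core metrics
  totalcpu = true   # aggregate metric across all cores

[[outputs.file]]
  files = ["stdout"]
```

Run it with `telegraf --config telegraf.conf`; switching the destination to InfluxDB, Kafka, or another system is a matter of adding the corresponding `[[outputs.*]]` block.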

https://github.com/influxdata/telegraf
#cplusplus #consul #dag #http #kafka #mysql #redis #sogou #tasking

Sogou C++ Workflow is a powerful tool for building back-end services in C++. It supports creating HTTP servers, asynchronous clients for protocols like HTTP, Redis, MySQL, and Kafka, and even custom protocols. You can use it to build complex workflows, parallel computing tasks, and high-performance services with ease. It works on various platforms including Linux, macOS, Windows, and more. The benefit to you is that it simplifies the development of robust and efficient back-end services, allowing you to focus on your business logic without worrying about the underlying complexities.

https://github.com/sogou/workflow
#java #kafka #scala

To use Apache Kafka, you need to have Java installed. Here’s what you can do (see the command sketch below):
- **Building**: Use commands like `./gradlew jar` to build a jar file, then follow the quickstart guide for running Kafka.
- **Testing**: Use tools like Checkstyle and Spotbugs to make sure your code meets the project's standards.
- **Contributing**: Follow the guidelines on the Apache Kafka website to contribute changes back to the project.

This helps you build, test, and maintain high-quality code for Apache Kafka, making it easier to work with the project.
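
The gradle tasks involved look roughly like this; the Checkstyle and Spotbugs task names follow the project's README as best I recall, so verify them against the current build.

```bash
# Build the Kafka jars (requires a suitable JDK on PATH)
./gradlew jar

# Run the tests
./gradlew test

# Static analysis: Checkstyle and Spotbugs
./gradlew checkstyleMain checkstyleTest
./gradlew spotbugsMain spotbugsTest
```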

https://github.com/apache/kafka
#java #batch #cdc #change_data_capture #data_integration #data_pipeline #distributed #elt #etl #flink #kafka #mysql #paimon #postgresql #real_time #schema_evolution

Flink CDC is a tool that helps you move and transform data in real-time or in batches. It makes data integration simple by using YAML files to describe how data should be moved and transformed. This tool offers features like full database synchronization, table sharding, schema evolution, and data transformation. To use it, you need to set up an Apache Flink cluster, download Flink CDC, create a YAML file to define your data sources and sinks, and then run the job. This benefits you by making it easier to manage and integrate your data efficiently across different databases.
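
A pipeline definition is just such a YAML file. The sketch below mirrors the shape of the MySQL-to-Doris quickstart, but the connector option names, addresses, and credentials are assumptions to check against the Flink CDC documentation.

```yaml
# Hypothetical pipeline.yaml: sync every table in app_db from MySQL into Doris.
source:
  type: mysql
  hostname: localhost
  port: 3306
  username: root
  password: "123456"
  tables: app_db.\.*

sink:
  type: doris
  fenodes: 127.0.0.1:8030
  username: root
  password: ""

pipeline:
  name: Sync app_db to Doris
  parallelism: 2
```

The job is then submitted to the running Flink cluster with the bundled launcher script (e.g. `bin/flink-cdc.sh pipeline.yaml` in the quickstart).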

https://github.com/apache/flink-cdc
#java #data_stream #data_streaming #data_streams #hacktoberfest #kafka #kafka_connect #kafka_streams #kubernetes #kubernetes_controller #kubernetes_operator #messaging #openshift

Strimzi helps you run Apache Kafka on Kubernetes or OpenShift easily. It provides quick start guides, detailed documentation, and a community support system. You can get help through Slack, mailing lists, or GitHub discussions. Strimzi also allows you to contribute by fixing issues, improving documentation, or participating in community meetings. This makes it easier to manage and use Kafka clusters in a cloud-native environment, which is beneficial for users who need reliable and scalable messaging systems.
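
In practice, you install the Strimzi operator and then describe a cluster with a Kafka custom resource like the hedged sketch below; the fields follow older ZooKeeper-based examples with ephemeral storage, so check the current API version and KRaft-era examples in the docs.

```yaml
# Hypothetical Kafka custom resource: a small 3-broker cluster managed by the Strimzi operator.
apiVersion: kafka.strimzi.io/v1beta2
kind: Kafka
metadata:
  name: my-cluster
spec:
  kafka:
    replicas: 3
    listeners:
      - name: plain
        port: 9092
        type: internal
        tls: false
    storage:
      type: ephemeral
  zookeeper:
    replicas: 3
    storage:
      type: ephemeral
  entityOperator:
    topicOperator: {}
    userOperator: {}
```

Applying it with `kubectl apply -f kafka.yaml` lets the operator create the cluster and keep it reconciled.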

https://github.com/strimzi/strimzi-kafka-operator
#java #apache_kafka #big_data #cluster_management #event_streaming #hacktoberfest #kafka #kafka_brokers #kafka_client #kafka_cluster #kafka_connect #kafka_manager #kafka_producer #kafka_streams #kafka_ui #opensource #streaming_data #streams #web_ui

UI for Apache Kafka is a free, open-source web tool that helps you manage and monitor Apache Kafka clusters easily. It's lightweight and fast, making it simple to track brokers, topics, partitions, and message production and consumption. You can set it up quickly with a few commands and run it locally or in the cloud. The tool offers features like multi-cluster management, performance monitoring, browsing messages, and dynamic topic configuration. It also supports secure authentication and role-based access control. This makes it easier to observe data flows, troubleshoot issues, and ensure optimal performance of your Kafka clusters.
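
As an illustration of the quick setup, a commonly shown way to run it is the Docker image below; the image name and environment variables are taken from the project's README as best I recall, so double-check them there.

```bash
# Run UI for Apache Kafka locally and point it at an existing cluster (names/addresses are placeholders).
docker run -it -p 8080:8080 \
  -e KAFKA_CLUSTERS_0_NAME=local \
  -e KAFKA_CLUSTERS_0_BOOTSTRAPSERVERS=host.docker.internal:9092 \
  provectuslabs/kafka-ui
```

The UI is then available at http://localhost:8080.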

https://github.com/provectus/kafka-ui
#c_lang #apache_kafka #c #c_plus_plus #consumer #high_performance #kafka #kafka_consumer #kafka_producer #librdkafka

librdkafka is a powerful library that helps you work with Apache Kafka using C or C++. It allows you to produce and consume messages very quickly, handling over 1 million messages per second for producers and 3 million messages per second for consumers. It supports advanced features like exactly-once semantics, compression, SSL, and SASL security. This library is reliable, high-performance, and easy to use, making it a great tool for developers who need to integrate Kafka into their applications. It also has detailed documentation and community support, making it easier to get started and resolve any issues.
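
librdkafka itself is a C/C++ library, but it also underpins higher-level clients. To keep the examples here in Go, the sketch below produces a message through confluent-kafka-go, which wraps librdkafka and passes these configuration properties straight to it; the broker address and topic are placeholders.

```go
package main

import (
	"log"

	"github.com/confluentinc/confluent-kafka-go/v2/kafka"
)

func main() {
	// The ConfigMap keys are librdkafka configuration properties.
	p, err := kafka.NewProducer(&kafka.ConfigMap{"bootstrap.servers": "localhost:9092"})
	if err != nil {
		log.Fatal(err)
	}
	defer p.Close()

	topic := "example"
	err = p.Produce(&kafka.Message{
		TopicPartition: kafka.TopicPartition{Topic: &topic, Partition: kafka.PartitionAny},
		Value:          []byte("hello from librdkafka"),
	}, nil)
	if err != nil {
		log.Fatal(err)
	}

	// Block until outstanding delivery reports are received (up to 5s).
	p.Flush(5000)
}
```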

https://github.com/confluentinc/librdkafka
#java #ai #apache_kafka #aws #azure #cloud #cloud_first #cloud_native #ebs #gcp #kafka #llm #messaging #minio #s3 #serverless #spot #streaming

AutoMQ provides a cloud-native alternative to Apache Kafka that runs on S3 storage, cutting costs by up to 90% while enabling instant scaling and eliminating cross-zone traffic fees. It offers high reliability, serverless operation, and full Kafka compatibility, making it easier and cheaper to manage large-scale data streaming without sacrificing performance or features.

https://github.com/AutoMQ/automq
#jupyter_notebook #a2a #agentic_ai #dapr #dapr_pub_sub #dapr_service_invocation #dapr_sidecar #dapr_workflow #docker #kafka #kubernetes #langmem #mcp #openai #openai_agents_sdk #openai_api #postgresql_database #rabbitmq #rancher_desktop #redis #serverless_containers

The Dapr Agentic Cloud Ascent (DACA) design pattern helps you build powerful, scalable AI systems that can coordinate millions of AI agents reliably. It uses Dapr with Kubernetes to manage many AI agents as lightweight virtual actors, ensuring fast responses, reliability, and easy scaling. You can start small with free or low-cost cloud tools and grow to planet-scale systems. The OpenAI Agents SDK is recommended for beginners because it is simple, flexible, and gives you good control to develop AI agents quickly. This approach saves costs, avoids vendor lock-in, and supports resilient, event-driven AI workflows, making it well suited to developers building advanced, cloud-native AI applications.

https://github.com/panaversity/learn-agentic-ai
#java #cloud #coap #dashboard #iot #iot_analytics #iot_platform #iot_solutions #kafka #lwm2m #microservices #middleware #mqtt #netty #platform #snmp #thingsboard #visualization #websockets #widgets

ThingsBoard is an open-source IoT platform that helps manage and analyze data from connected devices. It allows users to collect data, create real-time dashboards, and automate tasks using a powerful rule engine. This platform supports various protocols like MQTT and HTTP, making it easy to connect devices. Users can also define relationships between devices and assets, and trigger alarms based on specific conditions. The benefit is that it simplifies IoT project development, making it scalable and efficient for applications like smart farming, smart offices, and more.
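
For example, device telemetry is typically pushed over MQTT to the `v1/devices/me/telemetry` topic, with the device's access token used as the MQTT username. The Go sketch below uses the Eclipse Paho client; the host and token are placeholders.

```go
package main

import (
	"log"
	"time"

	mqtt "github.com/eclipse/paho.mqtt.golang"
)

func main() {
	// Placeholder broker address and device access token.
	opts := mqtt.NewClientOptions().
		AddBroker("tcp://thingsboard.example.com:1883").
		SetUsername("DEVICE_ACCESS_TOKEN")

	client := mqtt.NewClient(opts)
	if token := client.Connect(); token.Wait() && token.Error() != nil {
		log.Fatal(token.Error())
	}
	defer client.Disconnect(250)

	// Publish a simple JSON telemetry payload; ThingsBoard stores it against the device.
	pub := client.Publish("v1/devices/me/telemetry", 1, false, `{"temperature": 21.5}`)
	pub.WaitTimeout(5 * time.Second)
}
```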

https://github.com/thingsboard/thingsboard