#python #text_classification #text #transformer #vision #image_classification #feedforward_neural_network #language_model #fourier_transform #fnet
https://github.com/rishikksh20/FNet-pytorch
GitHub
GitHub - rishikksh20/FNet-pytorch: Unofficial implementation of Google's FNet: Mixing Tokens with Fourier Transforms
Unofficial implementation of Google's FNet: Mixing Tokens with Fourier Transforms - GitHub - rishikksh20/FNet-pytorch: Unofficial implementation of Google's FNet: Mixing Tokens with...
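The core idea named in the title, replacing self-attention with a Fourier transform over the tokens, is simple enough to sketch. Below is a minimal conceptual version of one FNet block in PyTorch, based on the paper's description rather than this repo's exact code; the layer sizes are illustrative.

```python
import torch
import torch.nn as nn

class FourierMixing(nn.Module):
    """Token mixing via a 2D FFT, keeping only the real part (the FNet mixing step)."""
    def forward(self, x):  # x: (batch, seq_len, hidden)
        return torch.fft.fft2(x, dim=(-2, -1)).real

class FNetBlock(nn.Module):
    """One FNet encoder block: Fourier mixing sublayer + feed-forward sublayer."""
    def __init__(self, hidden=256, ff_dim=1024, dropout=0.1):
        super().__init__()
        self.mixing = FourierMixing()
        self.norm1 = nn.LayerNorm(hidden)
        self.ff = nn.Sequential(
            nn.Linear(hidden, ff_dim), nn.GELU(),
            nn.Dropout(dropout), nn.Linear(ff_dim, hidden),
        )
        self.norm2 = nn.LayerNorm(hidden)

    def forward(self, x):
        x = self.norm1(x + self.mixing(x))  # mixing sublayer with residual
        return self.norm2(x + self.ff(x))   # feed-forward sublayer with residual
```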
#python #ai #bert #dpr #elasticsearch #information_retrieval #language_model #machine_learning #natural_language_processing #neural_search #nlp #pytorch #question_answering #search_engine #semantic_search #squad #summarization #transfer_learning #transformers
https://github.com/deepset-ai/haystack
GitHub
GitHub - deepset-ai/haystack: AI orchestration framework to build customizable, production-ready LLM applications. Connect components…
AI orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data...
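As a sketch of what connecting components into a pipeline looks like, here is a minimal two-component example, assuming the Haystack 2.x (`haystack-ai`) API and an OpenAI key in the environment; the model name and template are illustrative.

```python
from haystack import Pipeline
from haystack.components.builders import PromptBuilder
from haystack.components.generators import OpenAIGenerator

# Render a prompt from a template, then send it to an LLM.
pipe = Pipeline()
pipe.add_component("prompt", PromptBuilder(template="Answer briefly: {{ question }}"))
pipe.add_component("llm", OpenAIGenerator(model="gpt-4o-mini"))  # illustrative model name
pipe.connect("prompt", "llm")  # feed the rendered prompt into the generator

result = pipe.run({"prompt": {"question": "What is retrieval-augmented generation?"}})
print(result["llm"]["replies"][0])
```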
#python #callcenter #conformer #ctc_decode #deepspeech #fastspeech2 #language_model #mandarin_language #ngram #parallel_wavegan #punctuation_restoration #speech_alignment #speech_recognition #speech_to_text #speech_translation #streaming_asr #text_frontend #text_to_speech #transformer
https://github.com/PaddlePaddle/PaddleSpeech
GitHub
GitHub - PaddlePaddle/PaddleSpeech: Easy-to-use Speech Toolkit including Self-Supervised Learning model, SOTA/Streaming ASR with…
Easy-to-use Speech Toolkit including Self-Supervised Learning model, SOTA/Streaming ASR with punctuation, Streaming TTS with text frontend, Speaker Verification System, End-to-End Speech Translatio...
#python #computer_vision #contrastive_loss #deep_learning #language_model #multi_modal_learning #pretrained_models #pytorch #zero_shot_classification
https://github.com/mlfoundations/open_clip
GitHub
GitHub - mlfoundations/open_clip: An open source implementation of CLIP.
An open source implementation of CLIP. Contribute to mlfoundations/open_clip development by creating an account on GitHub.
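A typical zero-shot classification flow with open_clip looks roughly like this, following the project's documented usage; the checkpoint tag and image path are illustrative.

```python
import torch
import open_clip
from PIL import Image

model, _, preprocess = open_clip.create_model_and_transforms(
    "ViT-B-32", pretrained="laion2b_s34b_b79k"  # illustrative pretrained tag
)
tokenizer = open_clip.get_tokenizer("ViT-B-32")

image = preprocess(Image.open("cat.jpg")).unsqueeze(0)
text = tokenizer(["a photo of a cat", "a photo of a dog"])

with torch.no_grad():
    img_feats = model.encode_image(image)
    txt_feats = model.encode_text(text)
    # Cosine similarity between normalized embeddings gives zero-shot class scores.
    img_feats /= img_feats.norm(dim=-1, keepdim=True)
    txt_feats /= txt_feats.norm(dim=-1, keepdim=True)
    probs = (100.0 * img_feats @ txt_feats.T).softmax(dim=-1)

print(probs)  # probability of each caption matching the image
```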
#python #bert #deep_learning #language_model #language_models #machine_learning #natural_language_processing #nlp #pytorch #transformer
https://github.com/extreme-bert/extreme-bert
GitHub
GitHub - extreme-bert/extreme-bert: ExtremeBERT is a toolkit that accelerates the pretraining of customized language models on…
ExtremeBERT is a toolkit that accelerates the pretraining of customized language models on customized datasets, described in the paper “ExtremeBERT: A Toolkit for Accelerating Pretraining of Custom...
#python #attention_mechanism #deep_learning #gpt #gpt_2 #gpt_3 #language_model #linear_attention #lstm #pytorch #rnn #rwkv #transformer #transformers
https://github.com/BlinkDL/RWKV-LM
GitHub
GitHub - BlinkDL/RWKV-LM: RWKV (pronounced RwaKuv) is an RNN with great LLM performance, which can also be directly trained like…
RWKV (pronounced RwaKuv) is an RNN with great LLM performance, which can also be directly trained like a GPT transformer (parallelizable). We are at RWKV-7 "Goose". So it'...
#python #deepspeed_library #gpt_3 #language_model #transformers
https://github.com/EleutherAI/gpt-neox
GitHub
GitHub - EleutherAI/gpt-neox: An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron and…
An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries - EleutherAI/gpt-neox
#python #agi #gpt #language_model #llm #lm #lmops #nlp #pretraining #prompt #promptist #x_prompt
https://github.com/microsoft/LMOps
GitHub
GitHub - microsoft/LMOps: General technology for enabling AI capabilities w/ LLMs and MLLMs
General technology for enabling AI capabilities w/ LLMs and MLLMs - microsoft/LMOps
#python #deep_learning #flax #jax #language_model #large_language_models #natural_language_processing #transformer
https://github.com/young-geng/EasyLM
GitHub
GitHub - young-geng/EasyLM: Large language models (LLMs) made easy, EasyLM is a one stop solution for pre-training, finetuning…
Large language models (LLMs) made easy, EasyLM is a one stop solution for pre-training, finetuning, evaluating and serving LLMs in JAX/Flax. - young-geng/EasyLM
#python #embeddings #information_retrieval #language_model #large_language_models #llm #machine_learning #nearest_neighbor_search #neural_search #nlp #search #search_engine #semantic_search #sentence_embeddings #similarity_search #transformers #txtai #vector_database #vector_search #vector_search_engine
https://github.com/neuml/txtai
GitHub
GitHub - neuml/txtai: 💡 All-in-one AI framework for semantic search, LLM orchestration and language model workflows
💡 All-in-one AI framework for semantic search, LLM orchestration and language model workflows - neuml/txtai
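A minimal semantic-search sketch with txtai, assuming the classic `Embeddings` API; the model path and example texts are illustrative.

```python
from txtai.embeddings import Embeddings

# Build an index over a few documents and run a natural-language query.
embeddings = Embeddings({"path": "sentence-transformers/all-MiniLM-L6-v2"})
data = [
    "US tops 5 million confirmed virus cases",
    "Maine man wins $1M from $25 lottery ticket",
]
embeddings.index([(i, text, None) for i, text in enumerate(data)])
print(embeddings.search("lottery winner", 1))  # [(1, score)] best semantic match
```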
#jupyter_notebook #ai #azure #chatgpt #dall_e #generative_ai #generativeai #gpt #language_model #llms #openai #prompt_engineering #semantic_search #transformers
This course teaches you how to build Generative AI applications with 21 comprehensive lessons from Microsoft Cloud Advocates. You'll learn about Generative AI, Large Language Models (LLMs), prompt engineering, and how to build various applications like text generation, chat apps, and image generation using Python and TypeScript. The course includes videos, written lessons, code samples, and additional learning resources. You can start anywhere and even join a Discord server for support and networking with other learners. This helps you gain practical skills in building and deploying Generative AI applications responsibly and effectively.
https://github.com/microsoft/generative-ai-for-beginners
GitHub
GitHub - microsoft/generative-ai-for-beginners: 21 Lessons, Get Started Building with Generative AI
21 Lessons, Get Started Building with Generative AI - GitHub - microsoft/generative-ai-for-beginners: 21 Lessons, Get Started Building with Generative AI
#python #bert #deep_learning #flax #hacktoberfest #jax #language_model #language_models #machine_learning #model_hub #natural_language_processing #nlp #nlp_library #pretrained_models #python #pytorch #pytorch_transformers #seq2seq #speech_recognition #tensorflow #transformer
The Hugging Face Transformers library provides thousands of pretrained models for various tasks like text, image, and audio processing. These models can be used for tasks such as text classification, image detection, speech recognition, and more. The library supports popular deep learning frameworks like JAX, PyTorch, and TensorFlow, making it easy to switch between them.
The benefit to the user is that you can quickly download and use these pretrained models with just a few lines of code, saving time and computational resources. You can also fine-tune these models on your own datasets and share them with the community. Additionally, the library offers a simple `pipeline` API for immediate use on different inputs, making it user-friendly for both researchers and practitioners. This helps in reducing compute costs and carbon footprint while enabling high-performance results across various machine learning tasks.
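The `pipeline` API mentioned above takes only a few lines; the default sentiment-analysis model is downloaded on first use.

```python
from transformers import pipeline

# Create a ready-to-use sentiment classifier backed by a pretrained model.
classifier = pipeline("sentiment-analysis")
print(classifier("We are very happy to show you the 🤗 Transformers library."))
# e.g. [{'label': 'POSITIVE', 'score': 0.9998}]
```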
https://github.com/huggingface/transformers
GitHub
GitHub - huggingface/transformers: 🤗 Transformers: the model-definition framework for state-of-the-art machine learning models…
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in text, vision, audio, and multimodal models, for both inference and training. - GitHub - huggingface/t...
#python #chatbot #chatbots #chatgpt #chatgpt_4 #chatgpt_api #chatgpt_free #chatgpt4 #free_gpt #gpt #gpt_3 #gpt_4 #gpt3 #gpt4 #gpt4_api #language_model #openai #openai_api #openai_chatgpt #python #reverse_engineering
The `gpt4free` project is a tool that allows you to use various AI models like GPT-3.5 and DALL-E 3 without paying for expensive APIs. Here’s how it helps you:
- **Multiple providers**: It supports multiple AI providers, including OpenAI, Bing, and others, allowing you to choose the best option for your needs.
- **Local web UI**: It includes a web UI that you can access locally, making it easy to interact with the AI models.
- **Open source**: The project is open-source and actively maintained by a community of contributors, ensuring continuous improvements and new features.
Overall, `gpt4free` provides a flexible and cost-effective way to leverage advanced AI capabilities.
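For a sense of how that looks in code, here is a minimal sketch using the project's OpenAI-style Python client, assuming the `g4f` package is installed; the model name is illustrative.

```python
from g4f.client import Client

client = Client()
response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative; g4f routes to whichever provider serves it
    messages=[{"role": "user", "content": "Explain transformers in one sentence."}],
)
print(response.choices[0].message.content)
```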
https://github.com/xtekky/gpt4free
GitHub
GitHub - xtekky/gpt4free: The official gpt4free repository | various collection of powerful language models | o4, o3 and deepseek…
The official gpt4free repository | various collection of powerful language models | o4, o3 and deepseek r1, gpt-4.1, gemini 2.5 - xtekky/gpt4free
#python #asr #audio #audio_processing #deep_learning #huggingface #language_model #pytorch #speaker_diarization #speaker_recognition #speaker_verification #speech_enhancement #speech_processing #speech_recognition #speech_separation #speech_to_text #speech_toolkit #speechrecognition #spoken_language_understanding #transformers #voice_recognition
SpeechBrain is an open-source toolkit that helps you quickly develop Conversational AI technologies, such as speech assistants, chatbots, and language models. It uses PyTorch and offers many pre-trained models and tutorials to make it easy to get started. You can train models for various tasks like speech recognition, speaker recognition, and text processing with just a few lines of code. SpeechBrain also supports GPU training, dynamic batching, and integration with HuggingFace models, making it powerful and efficient. This toolkit is beneficial because it simplifies the development process, provides extensive documentation and tutorials, and is highly customizable, making it ideal for research, prototyping, and educational purposes.
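As an example of the few-lines-of-code claim, here is a hedged sketch of transcribing a file with one of the published pretrained models; note the import path moved from `speechbrain.pretrained` (0.5.x) to `speechbrain.inference` (1.0+).

```python
from speechbrain.inference.ASR import EncoderDecoderASR  # speechbrain.pretrained on 0.5.x

# Download a pretrained LibriSpeech ASR model and transcribe an audio file.
asr = EncoderDecoderASR.from_hparams(
    source="speechbrain/asr-crdnn-rnnlm-librispeech",
    savedir="pretrained_models/asr-crdnn-rnnlm-librispeech",
)
print(asr.transcribe_file("path/to/audio.wav"))  # replace with a real .wav file
```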
https://github.com/speechbrain/speechbrain
GitHub
GitHub - speechbrain/speechbrain: A PyTorch-based Speech Toolkit
A PyTorch-based Speech Toolkit. Contribute to speechbrain/speechbrain development by creating an account on GitHub.
#python #agent #ai #chatglm #fine_tuning #gpt #instruction_tuning #language_model #large_language_models #llama #llama3 #llm #lora #mistral #moe #peft #qlora #quantization #qwen #rlhf #transformers
LLaMA Factory is a tool that makes it easy to fine-tune large language models. It supports many different models like LLaMA, ChatGLM, and Qwen, among others. You can use various training methods such as full-tuning, freeze-tuning, LoRA, and QLoRA, which are efficient and save GPU memory. The tool also includes advanced algorithms and practical tricks to improve performance.
Using LLaMA Factory, you can train models up to 3.7 times faster with better results compared to other methods. It provides a user-friendly interface through Colab, PAI-DSW, or local machines, and even offers a web UI for easier management. The benefit to you is that it simplifies the process of fine-tuning large language models, making it faster and more efficient, which can be very useful for research and development projects.
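To make the LoRA memory savings concrete, here is a conceptual sketch of the LoRA idea the toolkit implements, a frozen base weight plus a small trainable low-rank update; this is a generic illustration, not LLaMA Factory's own code.

```python
import torch.nn as nn

class LoRALinear(nn.Module):
    """Wrap a pretrained linear layer with a trainable low-rank update (LoRA)."""
    def __init__(self, base: nn.Linear, rank: int = 8, alpha: int = 16):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad_(False)                 # freeze the pretrained weights
        self.lora_a = nn.Linear(base.in_features, rank, bias=False)
        self.lora_b = nn.Linear(rank, base.out_features, bias=False)
        nn.init.zeros_(self.lora_b.weight)          # update starts as a no-op
        self.scale = alpha / rank

    def forward(self, x):
        # Only lora_a and lora_b (a tiny fraction of the parameters) receive gradients.
        return self.base(x) + self.scale * self.lora_b(self.lora_a(x))
```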
https://github.com/hiyouga/LLaMA-Factory
GitHub
GitHub - hiyouga/LLaMA-Factory: Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024)
Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024) - hiyouga/LLaMA-Factory
#mdx #chatgpt #deep_learning #generative_ai #language_model #openai #prompt_engineering
Prompt engineering helps you use language models more effectively by designing better prompts. This skill is useful for various tasks like question answering, arithmetic reasoning, and coding. With prompt engineering, you can improve how language models perform and understand their capabilities and limitations. There are resources available, such as guides, courses, and tools, to help you learn and apply prompt engineering techniques. These resources include detailed guides, video lectures, and self-paced courses that can enhance your skills and make you more efficient in using language models.
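As a small taste of what the guide covers, a few-shot, chain-of-thought style prompt (a standard pattern from the prompting literature) might look like the following; the wording is illustrative and not taken from the guide.

```python
# A few-shot prompt that demonstrates the reasoning format before asking a new question.
prompt = """Q: A cafeteria had 23 apples. It used 20 and bought 6 more. How many apples are left?
A: 23 - 20 = 3, then 3 + 6 = 9. The answer is 9.

Q: Roger has 5 tennis balls. He buys 2 cans with 3 balls each. How many balls does he have now?
A:"""
print(prompt)  # send this to any chat or completion model of your choice
```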
https://github.com/dair-ai/Prompt-Engineering-Guide
GitHub
GitHub - dair-ai/Prompt-Engineering-Guide: 🐙 Guides, papers, lessons, notebooks and resources for prompt engineering, context engineering…
🐙 Guides, papers, lessons, notebooks and resources for prompt engineering, context engineering, RAG, and AI Agents. - dair-ai/Prompt-Engineering-Guide
#jupyter_notebook #ai #artificial_intelligence #chatgpt #deep_learning #from_scratch #gpt #language_model #large_language_models #llm #machine_learning #python #pytorch #transformer
You can learn how to build your own large language model (LLM) like GPT from scratch with clear, step-by-step guidance, including coding, training, and fine-tuning, all explained with examples and diagrams. This approach mirrors how big models like ChatGPT are made but is designed to run on a regular laptop without special hardware. You also get access to code for loading pretrained models and fine-tuning them for tasks like text classification or instruction following. This helps you deeply understand how LLMs work inside and lets you create your own functional AI assistant, gaining practical skills in AI development.
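The kind of building block implemented along the way can be sketched briefly. Below is a minimal single-head causal self-attention layer in PyTorch, a generic illustration rather than the book's exact code.

```python
import torch
import torch.nn as nn

class CausalSelfAttention(nn.Module):
    """Single-head causal self-attention: each token attends only to earlier tokens."""
    def __init__(self, d_model: int, context_len: int):
        super().__init__()
        self.qkv = nn.Linear(d_model, 3 * d_model, bias=False)
        self.out = nn.Linear(d_model, d_model, bias=False)
        mask = torch.triu(torch.ones(context_len, context_len), diagonal=1).bool()
        self.register_buffer("mask", mask)

    def forward(self, x):                            # x: (batch, tokens, d_model)
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        scores = q @ k.transpose(-2, -1) / (k.shape[-1] ** 0.5)
        T = x.shape[1]
        scores = scores.masked_fill(self.mask[:T, :T], float("-inf"))
        weights = torch.softmax(scores, dim=-1)      # attention weights over past tokens
        return self.out(weights @ v)
```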
https://github.com/rasbt/LLMs-from-scratch
GitHub
GitHub - rasbt/LLMs-from-scratch: Implement a ChatGPT-like LLM in PyTorch from scratch, step by step
Implement a ChatGPT-like LLM in PyTorch from scratch, step by step - rasbt/LLMs-from-scratch
#typescript #ai #chatgpt #docsgpt #hacktoberfest #information_retrieval #language_model #llm #machine_learning #natural_language_processing #python #pytorch #rag #react #semantic_search #transformers #web_app
DocsGPT is an open-source AI tool that helps you quickly find accurate answers from many types of documents and web sources without errors. It supports formats like PDF, DOCX, images, and integrates with websites, APIs, and chat platforms like Discord and Telegram. You can deploy it privately for security, customize it to fit your brand, and connect it to tools for advanced actions. This means you save time searching for information, get reliable answers with sources, and improve productivity whether you’re a developer, support team, or business user. It’s easy to set up and scales well for many users.
https://github.com/arc53/DocsGPT
GitHub
GitHub - arc53/DocsGPT: Private AI platform for agents, assistants and enterprise search. Built-in Agent Builder, Deep research…
Private AI platform for agents, assistants and enterprise search. Built-in Agent Builder, Deep research, Document analysis, Multi-model support, and API connectivity for agents. - arc53/DocsGPT