#python #document_understanding #language_generation #language_understanding #layoutlm #minilm #nlp #pre_trained_model #s2s_ft #small_pre_trained_model #unilm
https://github.com/microsoft/unilm
GitHub
GitHub - microsoft/unilm: Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities - microsoft/unilm
#python #deep_learning #pre_trained #model #awesome #nlp #vision #paddlehub #ai_models
https://github.com/PaddlePaddle/PaddleHub
GitHub
GitHub - PaddlePaddle/PaddleHub: 400+ AI Models: Rich, high-quality AI models, including CV, NLP, Speech, Video and Cross-Modal.…
400+ AI Models: Rich, high-quality AI models, including CV, NLP, Speech, Video and Cross-Modal. Easy to Use: 3 lines of code to predict 400+ AI models. - PaddlePaddle/PaddleHub
#python #ai #deep_learning #natural_language_processing #natural_language_understanding #nlp #nlp_library #nlp_machine_learning #pre_trained_language_models #pre_trained_model #pytorch #transformer
https://github.com/thunlp/OpenPrompt
GitHub
GitHub - thunlp/OpenPrompt: An Open-Source Framework for Prompt-Learning.
An Open-Source Framework for Prompt-Learning. Contribute to thunlp/OpenPrompt development by creating an account on GitHub.
#python #bert #document_embedding #pre_trained_language_models #semantic_search #sentence_encoder #sentence_transformers #text_search #text_semantic_similarity #top2vec #topic_modeling #topic_modelling #topic_search #topic_vector #word_embeddings
https://github.com/ddangelov/Top2Vec
GitHub
GitHub - ddangelov/Top2Vec: Top2Vec learns jointly embedded topic, document and word vectors.
Top2Vec learns jointly embedded topic, document and word vectors. - ddangelov/Top2Vec
#python #beit #beit_3 #bitnet #deepnet #document_ai #foundation_models #kosmos #kosmos_1 #layoutlm #layoutxlm #llm #minilm #mllm #multimodal #nlp #pre_trained_model #textdiffuser #trocr #unilm #xlm_e
Microsoft is developing advanced AI models through large-scale self-supervised pre-training across tasks, languages, and modalities. These models, such as Foundation Transformers (Magneto) and Kosmos-2.5, are designed to generalize broadly and handle multiple tasks spanning language understanding, vision, speech, and multimodal interaction. Users benefit from state-of-the-art performance in document AI, speech recognition, machine translation, and more, making the models versatile and efficient across a wide range of applications. Companion tools like TorchScale and Aggressive Decoding improve the stability, efficiency, and speed of model training and deployment.
https://github.com/microsoft/unilm