Automating Optimization of Quantized Deep Learning Models on CUDA
https://tvm.ai/2019/04/29/opt-cuda-quantized.html
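As a hedged illustration of what such pipelines accelerate (my own toy sketch, not TVM code): symmetric per-tensor int8 quantization maps float weights/activations onto [-127, 127] with a single scale factor.

```python
import numpy as np

def quantize_int8(x: np.ndarray):
    """Symmetric per-tensor int8 quantization: map floats to [-127, 127]."""
    scale = np.max(np.abs(x)) / 127.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

x = np.array([-1.0, -0.5, 0.0, 0.5, 1.0], dtype=np.float32)
q, s = quantize_int8(x)
x_hat = dequantize(q, s)
# The round trip loses at most half a quantization step per element.
print(np.max(np.abs(x - x_hat)))
```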
A Gentle Introduction to the ImageNet Large Scale Visual Recognition Challenge (ILSVRC)
https://machinelearningmastery.com/introduction-to-the-imagenet-large-scale-visual-recognition-challenge-ilsvrc/
The rise in popularity and use of deep learning neural network techniques can be traced back to the innovations in the application of convolutional neural networks to image classification tasks.
Some of the most important innovations have sprung from submissions…
A Step-by-Step Guide to Building a Voice Assistant with Python
https://habr.com/ru/post/450224/
Real-Time Patch-Based Stylization of Portraits Using Generative Adversarial Network
http://dcgi.fel.cvut.cz/home/sykorad/facestyleGAN.html
Ensemble methods: bagging, boosting and stacking
https://towardsdatascience.com/ensemble-methods-bagging-boosting-and-stacking-c9214a10a205
It's time to explore the world of bagging and boosting. With these powerful techniques, you can improve the performance of your models...
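As a quick hedged sketch of the bagging idea (toy data and stump learners of my own, not the article's code): train several weak learners on bootstrap resamples and combine them by majority vote.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D dataset: the true label is 1 when x > 0.5.
X = rng.uniform(0, 1, size=200)
y = (X > 0.5).astype(int)

def fit_stump(X, y):
    """Pick the threshold that best separates the given sample."""
    thresholds = np.linspace(0, 1, 51)
    accs = [np.mean((X > t).astype(int) == y) for t in thresholds]
    return thresholds[int(np.argmax(accs))]

# Bagging: each stump is trained on a different bootstrap resample.
stumps = []
for _ in range(25):
    idx = rng.integers(0, len(X), size=len(X))
    stumps.append(fit_stump(X[idx], y[idx]))

def predict(x):
    votes = np.array([(x > t).astype(int) for t in stumps])
    return (votes.mean(axis=0) > 0.5).astype(int)  # majority vote

acc = np.mean(predict(X) == y)
print(acc)
```

Boosting differs in that learners are trained sequentially, each reweighting the examples the previous ones got wrong, rather than independently on resamples.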
BoTorch is a library for Bayesian Optimization built on PyTorch.
Facebook open-sources Ax and BoTorch to simplify AI model optimization
github: https://github.com/pytorch/botorch
description: https://techcrunch.com/2019/05/01/facebook-open-sources-ax-and-botorch-to-simplify-ai-model-optimization/
Announcing Google-Landmarks-v2: An Improved Dataset for Landmark Recognition & Retrieval
http://ai.googleblog.com/2019/05/announcing-google-landmarks-v2-improved.html
Posted by Bingyi Cao and Tobias Weyand, Software Engineers, Google AI Last year we released Google-Landmarks, the largest world-wide landmark rec...
Best Practices for Preparing and Augmenting Image Data for Convolutional Neural Networks
https://machinelearningmastery.com/best-practices-for-preparing-and-augmenting-image-data-for-convolutional-neural-networks/
It is challenging to know how best to prepare image data when training a convolutional neural network. This involves both scaling the pixel values and using image data augmentation techniques during both the training and evaluation of the model. Instead…
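A minimal numpy sketch of two of those practices (my own toy example, not the article's code): pixel scaling to [0, 1] and random horizontal-flip augmentation.

```python
import numpy as np

rng = np.random.default_rng(42)

# Fake batch of 8-bit images: (batch, height, width, channels).
images = rng.integers(0, 256, size=(4, 32, 32, 3)).astype(np.uint8)

# Pixel scaling: rescale intensities from [0, 255] to [0, 1].
scaled = images.astype(np.float32) / 255.0

def random_flip(batch, p=0.5):
    """Augmentation: flip each image horizontally with probability p."""
    out = batch.copy()
    for i in range(len(out)):
        if rng.random() < p:
            out[i] = out[i][:, ::-1, :]  # reverse the width axis
    return out

augmented = random_flip(scaled)
print(scaled.min(), scaled.max())
```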
Reinforcement Learning, Fast and Slow
https://www.cell.com/trends/cognitive-sciences/fulltext/S1364-6613(19)30061-0
Billion-scale semi-supervised learning for image classification
Weakly-supervised pre-training + semi-supervised pre-training + distillation + transfer/fine-tuning = 81.2% top-1 accuracy on ImageNet with ResNet-50, 84.8% with ResNeXt-101-32x16.
article: https://arxiv.org/abs/1905.00546
announce: https://www.facebook.com/i.zeki.yalniz/posts/10157311492509962
This paper presents a study of semi-supervised learning with large convolutional networks. We propose a pipeline, based on a teacher/student paradigm, that leverages a large collection of...
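The selection step of such a teacher/student pipeline can be sketched as follows (my own simplification on toy scores, not the paper's code): the teacher scores the unlabeled pool, and the top-K examples per class become the student's pseudo-labeled training set.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy teacher softmax scores: 100 unlabeled images over 5 classes.
scores = rng.dirichlet(np.ones(5), size=100)

def select_top_k_per_class(scores, k):
    """For each class, take the k unlabeled examples the teacher scores
    highest for that class (an example may be selected for several classes)."""
    selected = {}
    for c in range(scores.shape[1]):
        top = np.argsort(scores[:, c])[::-1][:k]
        selected[c] = top
    return selected

pseudo_labeled = select_top_k_per_class(scores, k=10)
print({c: len(idx) for c, idx in pseudo_labeled.items()})
```

The student is then trained on these pseudo-labels and fine-tuned on the true labeled data.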
How to Visualize Filters and Feature Maps in Convolutional Neural Networks
https://machinelearningmastery.com/how-to-visualize-filters-and-feature-maps-in-convolutional-neural-networks/
A Huge Open Dataset of Russian Speech
https://habr.com/ru/post/450760/
Speech recognition specialists have long lacked a large open corpus of spoken Russian, so only big companies could afford to work on this task, but they did not...
Forwarded from Artificial Intelligence
Google at ICLR 2019
http://ai.googleblog.com/2019/05/google-at-iclr-2019.html
How to Develop a Convolutional Neural Network From Scratch for MNIST Handwritten Digit Classification
https://machinelearningmastery.com/blog/
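The core operation any such from-scratch CNN tutorial builds is the 2-D convolution; here is a minimal numpy version (my own sketch, not the tutorial's code), applied as an edge detector to a toy image.

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Naive 'valid' 2-D cross-correlation, the forward pass of a conv layer."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow), dtype=np.float32)
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A vertical-edge kernel applied to a toy image with one vertical edge.
img = np.zeros((5, 5), dtype=np.float32)
img[:, 3:] = 1.0
sobel_x = np.array([[1, 0, -1], [2, 0, -2], [1, 0, -1]], dtype=np.float32)
edges = conv2d_valid(img, sobel_x)
print(edges)
```

Trained CNNs learn such kernels from data instead of hand-specifying them.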
Artificial Intelligence Through the Example of a Simple Game. Part 2
https://habr.com/ru/post/451070/
Teaching a Neural Network to Play Snake, and Writing a Server for Competitions
This time the game Snake was chosen. A neural-network library was written in Go, a training principle dependent on the "depth" of memory was found, and a server for games was written...
Forwarded from Artificial Intelligence
Announcing Open Images V5 and the ICCV 2019 Open Images Challenge
http://ai.googleblog.com/2019/05/announcing-open-images-v5-and-iccv-2019.html
Posted by Vittorio Ferrari, Research Scientist, Machine Perception In 2016, we introduced Open Images, a collaborative release of ~9 million imag...
Digging Into Self-Supervised Monocular Depth Estimation
Article: https://arxiv.org/abs/1806.01260
Github: https://github.com/nianticlabs/monodepth2
Per-pixel ground-truth depth data is challenging to acquire at scale. To overcome this limitation, self-supervised learning has emerged as a promising alternative for training models to perform...
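A hedged sketch of the photometric reprojection idea at the heart of such self-supervised training (my own toy reduction: the paper also uses SSIM and a per-pixel minimum over source frames, omitted here): a source frame warped into the target view by the predicted depth and pose should match the target frame pixel-for-pixel.

```python
import numpy as np

def photometric_l1(target, warped):
    """Mean absolute per-pixel error between the target frame and a
    source frame warped into the target view by predicted depth/pose."""
    return float(np.mean(np.abs(target - warped)))

rng = np.random.default_rng(0)
target = rng.random((16, 16))
good_warp = target + 0.01 * rng.standard_normal((16, 16))  # near-correct warp
bad_warp = rng.random((16, 16))                            # wrong depth/pose

# A correct depth/pose estimate yields a much lower photometric error,
# which is the training signal in place of ground-truth depth.
print(photometric_l1(target, good_warp), photometric_l1(target, bad_warp))
```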
How to Develop a Deep Convolutional Neural Network From Scratch for Fashion MNIST Clothing Classification
https://machinelearningmastery.com/how-to-develop-a-cnn-from-scratch-for-fashion-mnist-clothing-classification/