NHL Analytics: Shots, Rebounds, and Weak Signals
https://towardsdatascience.com/nhl-analytics-shots-rebounds-and-weak-signals-c293ba8c635f
Medium
NHL Analytics: Shots, Rebounds, and Weak Signals
Any pro sport is an exercise in percentages. Pro players are in the top 99.99 percentile of all human beings in that particular sport…
super-cheatsheet-deep-learning.pdf
4.5 MB
Super VIP Cheatsheet: Deep Learning
Forensic Deep Learning: Kaggle Camera Model Identification Challenge
https://towardsdatascience.com/forensic-deep-learning-kaggle-camera-model-identification-challenge-f6a3892561bd
Medium
Forensic Deep Learning: Kaggle Camera Model Identification Challenge
There was a computer vision challenge that was hosted at kaggle.com about a year ago named IEEE’s Signal Processing Society — Camera Model…
Great tool for visualizing neural network, deep learning, and machine learning models.
https://github.com/lutzroeder/netron
GitHub
GitHub - lutzroeder/netron: Visualizer for neural network, deep learning and machine learning models
Visualizer for neural network, deep learning and machine learning models - lutzroeder/netron
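Netron can be used as a desktop app or straight from Python. A minimal sketch after pip install netron; "model.onnx" is just a placeholder for whatever model file you want to inspect:

import netron

# Launches a local web server and opens the viewer on the given model file.
netron.start("model.onnx")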
A Gentle Introduction to Early Stopping to Avoid Overtraining Deep Learning Neural Network Models
https://machinelearningmastery.com/early-stopping-to-avoid-overtraining-neural-network-models/
How to Stop Training Deep Neural Networks At the Right Time Using Early Stopping
https://machinelearningmastery.com/how-to-stop-training-deep-neural-networks-at-the-right-time-using-early-stopping/
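Both early-stopping articles above come down to the same Keras callback. A minimal sketch on synthetic data; the dataset, architecture, and patience value are illustrative assumptions, not taken from the articles:

import numpy as np
from tensorflow import keras

# Synthetic binary-classification data: 1,000 samples, 20 features.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))
y = (X[:, 0] + X[:, 1] > 0).astype("float32")

model = keras.Sequential([
    keras.Input(shape=(20,)),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Stop when validation loss has not improved for 5 epochs and
# roll back to the best weights seen so far.
early_stop = keras.callbacks.EarlyStopping(
    monitor="val_loss", patience=5, restore_best_weights=True
)

model.fit(X, y, validation_split=0.2, epochs=200, callbacks=[early_stop], verbose=0)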
Train Neural Networks With Noise to Reduce Overfitting
https://machinelearningmastery.com/train-neural-networks-with-noise-to-reduce-overfitting/
MachineLearningMastery.com
Train Neural Networks With Noise to Reduce Overfitting - MachineLearningMastery.com
Training a neural network with a small dataset can cause the network to memorize all training examples, in turn leading to overfitting and poor performance on a holdout dataset. Small datasets may also represent a harder mapping problem for neural networks…
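A minimal Keras sketch of the idea: perturb the inputs with noise during training so a small dataset is harder to memorize. The toy data and the noise standard deviation are illustrative assumptions, not the article's setup:

import numpy as np
from tensorflow import keras

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 10))
y = (X.sum(axis=1) > 0).astype("float32")

model = keras.Sequential([
    keras.Input(shape=(10,)),
    # GaussianNoise is only active during training, acting as a simple
    # regularizer / data-augmentation step for small datasets.
    keras.layers.GaussianNoise(0.1),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, validation_split=0.2, epochs=50, verbose=0)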
Machine Learning Top 10 Articles for the Past Month (v.Dec 2018)
https://medium.mybridge.co/machine-learning-top-10-articles-for-the-past-month-v-dec-2018-37b229f930a1
Medium
Machine Learning Top 10 Articles for the Past Month (v.Dec 2018)
For the past month, we ranked nearly 1,400 Machine Learning articles to pick the Top 10 stories that can help advance your career (0.7%…
Facebook has released #PyText, a new framework on top of #PyTorch.
This framework is built to make it easier for developers to build #NLP models.
https://code.fb.com/ai-research/pytext-open-source-nl..
Github: https://github.com/facebookresearch/pytext
Engineering at Meta
Open-sourcing PyText for faster NLP development
To make it easier to build and deploy natural language processing (NLP) systems, we are open-sourcing PyText, a modeling framework that blurs the boundaries between experimentation and large-scale …
The major advancements in Deep Learning in 2018
https://tryolabs.com/blog/2018/12/19/major-advancements-deep-learning-2018/
Tryolabs
The major advancements in Deep Learning in 2018
Deep Learning has changed the entire landscape over the past few years and its results are steadily improving. This article presents some of the main advances and accomplishments in Deep Learning for 2018.
30 Data Science Punchlines
A holiday reading list condensed into 30 quotes
https://towardsdatascience.com/data-science-conversation-starters-84affd2347f6
Medium
30 Data Science Punchlines
A year of blog posts condensed into 30 quotes to help you avoid (cause?) awkward silences at family events and holiday parties.
Generating New Ideas for Machine Learning Projects Through Machine Learning
https://towardsdatascience.com/generating-new-ideas-for-machine-learning-projects-through-machine-learning-ce3fee50ec2
Medium
Generating New Ideas for Machine Learning Projects Through Machine Learning
Generating style-specific text from a small corpus of 2.5k sentences using a pre-trained language model. Code in PyTorch
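For flavor, a minimal PyTorch sketch of sampling from a pre-trained language model, using Hugging Face's GPT-2 purely as a stand-in; the article's actual model, its fine-tuning on the 2.5k-sentence corpus, and its code differ:

import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "A machine learning project idea:"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# Top-k sampling keeps generations varied instead of greedy and repetitive.
with torch.no_grad():
    output = model.generate(
        input_ids,
        max_length=40,
        do_sample=True,
        top_k=50,
        temperature=0.9,
        pad_token_id=tokenizer.eos_token_id,
    )

print(tokenizer.decode(output[0], skip_special_tokens=True))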
How to Create a Random-Split, Cross-Validation, and Bagging Ensemble for Deep Learning in Keras
https://machinelearningmastery.com/how-to-create-a-random-split-cross-validation-and-bagging-ensemble-for-deep-learning-in-keras/
MachineLearningMastery.com
How to Create a Bagging Ensemble of Deep Learning Models in Keras - MachineLearningMastery.com
Ensemble learning methods combine the predictions from multiple models.
It is important in ensemble learning that the models that comprise the ensemble are good, making different prediction errors. Predictions that are good in different ways can…
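A minimal sketch of the bagging idea in Keras: train several small models on bootstrap resamples of the training data and average their predicted probabilities. The toy data, architecture, and ensemble size are illustrative assumptions rather than the tutorial's exact code:

import numpy as np
from tensorflow import keras

rng = np.random.default_rng(2)
X = rng.normal(size=(1000, 20))
y = (X[:, 0] * X[:, 1] > 0).astype("float32")
X_train, y_train, X_test, y_test = X[:800], y[:800], X[800:], y[800:]

def make_model():
    model = keras.Sequential([
        keras.Input(shape=(20,)),
        keras.layers.Dense(32, activation="relu"),
        keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy")
    return model

n_members = 5
preds = []
for _ in range(n_members):
    # Bootstrap resample: draw training rows with replacement.
    idx = rng.integers(0, len(X_train), size=len(X_train))
    member = make_model()
    member.fit(X_train[idx], y_train[idx], epochs=30, verbose=0)
    preds.append(member.predict(X_test, verbose=0))

# Average the members' predicted probabilities, then threshold.
ensemble_prob = np.mean(preds, axis=0).ravel()
accuracy = np.mean((ensemble_prob > 0.5) == y_test)
print(f"ensemble accuracy: {accuracy:.3f}")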
Deep learning cheatsheets, covering content of Stanford’s CS 230 class
CNN: https://stanford.edu/~shervine/teaching/cs-230/cheatsheet-convolutional-neural-networks
RNN: https://stanford.edu/~shervine/teaching/cs-230/cheatsheet-recurrent-neural-networks
TipsAndTricks: https://stanford.edu/~shervine/teaching/cs-230/cheatsheet-deep-learning-tips-and-tricks
#cheatsheet #Stanford #dl #cnn #rnn #tipsntricks
stanford.edu
CS 230 - Convolutional Neural Networks Cheatsheet
Teaching page of Shervine Amidi, Graduate Student at Stanford University.
A radical new neural network design
https://www.technologyreview.com/s/612561/a-radical-new-neural-network-design-could-overcome-big-challenges-in-ai/
MIT Technology Review
A radical new neural network design could overcome big challenges in AI
David Duvenaud was collaborating on a project involving medical data when he ran up against a major shortcoming in AI. An AI researcher at the University of Toronto, he wanted to build a deep-learning model that would predict a patient’s health over time.…
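The "radical design" in question is the Neural ODE: instead of a fixed stack of layers, a small network parameterizes continuous dynamics and an ODE solver integrates them. A minimal sketch with the torchdiffeq library released alongside the paper; the tiny dynamics network and time grid here are illustrative assumptions:

import torch
import torch.nn as nn
from torchdiffeq import odeint

class Dynamics(nn.Module):
    """Parameterizes dy/dt = f(t, y) with a small neural network."""
    def __init__(self, dim=2):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, 16), nn.Tanh(), nn.Linear(16, dim))

    def forward(self, t, y):
        return self.net(y)

func = Dynamics()
y0 = torch.randn(8, 2)                  # batch of initial states
t = torch.linspace(0.0, 1.0, steps=10)  # continuous "depth" instead of discrete layers

# Integrating the learned dynamics replaces a stack of residual blocks;
# gradients flow through the solver, so func can be trained end to end.
trajectory = odeint(func, y0, t)        # shape: (10, 8, 2)
print(trajectory.shape)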