This post is part of a series on machine learning with unbalanced datasets; it focuses on the training aspect. For background, please see the setup post.
This post is the first in a series on working with unbalanced data. We’ll answer questions like how to train a model, how to validate it, and how to test it. Is it better for your datasets to be balanced or representative of the real-world distribution?
As machine learning has continued to expand, so has the need for data. I’ve put together some of my favorite resources for finding datasets. I hope they are of some service.
I find it difficult to keep up with the latest in machine learning, even though it’s part of my full-time job. Fortunately, there are a lot of resources out there to help sort through it all. I thought I would put together a list of resources that I use in case it helps anyone else. If you know of a great resource that I’m missing, please let me know!
This post is going to demonstrate how to use the Albumentations library with TensorFlow.
In this blog post, I will discuss how to use loss functions in PyTorch. I will cover how loss functions work in both regression and classification tasks, how they interact with NumPy arrays, the shapes and types PyTorch losses expect, and demonstrate some common losses.
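As a small sketch of the kind of thing the post walks through, here are a regression loss and a classification loss from `torch.nn`; the particular values are made up for illustration.

```python
import numpy as np
import torch
import torch.nn as nn

# Regression: MSELoss expects predictions and targets of the same shape.
mse = nn.MSELoss()
pred = torch.tensor([2.5, 0.0, 2.0])
target = torch.tensor([3.0, -0.5, 2.0])
mse_val = mse(pred, target).item()  # mean of the squared errors

# Classification: CrossEntropyLoss expects raw logits of shape
# (batch, num_classes) and integer class indices of shape (batch,).
ce = nn.CrossEntropyLoss()
logits = torch.tensor([[2.0, 0.5, 0.1], [0.2, 3.0, 0.3]])
class_labels = torch.tensor([0, 1])
ce_val = ce(logits, class_labels).item()

# PyTorch losses want tensors, not numpy arrays; convert first.
arr = np.array([1.0, 2.0], dtype=np.float32)
t = torch.from_numpy(arr)

print(mse_val, ce_val)
```

Note that `CrossEntropyLoss` applies softmax internally, so you pass it raw logits rather than probabilities.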
In this blog post, I will discuss how to use loss functions in TensorFlow. I’ll focus on binary cross entropy loss.
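As a minimal sketch of the topic: binary cross entropy is `-mean(y*log(p) + (1-y)*log(1-p))`, and `tf.keras.losses.BinaryCrossentropy` computes it for you. The labels and predictions below are invented for illustration.

```python
import numpy as np
import tensorflow as tf

# Made-up binary labels and predicted probabilities.
y_true = np.array([1.0, 0.0, 1.0, 0.0], dtype=np.float32)
y_pred = np.array([0.9, 0.1, 0.8, 0.3], dtype=np.float32)

# from_logits=False (the default) means y_pred holds probabilities in [0, 1].
bce = tf.keras.losses.BinaryCrossentropy()
tf_loss = bce(y_true, y_pred).numpy()

# The same quantity by hand: -mean(y*log(p) + (1-y)*log(1-p)).
manual = -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

print(tf_loss, manual)
```

The two values agree to within a small tolerance; Keras clips predictions away from 0 and 1 by a tiny epsilon, so they are not bit-identical. If your model outputs raw logits instead of probabilities, pass `from_logits=True`.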