As machine learning has continued to expand, so has the need for data. I’ve put together some of my favorite resources for finding datasets. I hope they are of some service.
I find it difficult to keep up with the latest in machine learning, even though it’s part of my full-time job. Fortunately, there are a lot of resources out there to help sort through it all. I thought I would put together a list of resources that I use in case it helps anyone else. If you know of a great resource that I’m missing, please let me know!
This post demonstrates how to use the Albumentations library with TensorFlow.
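Since Albumentations operates on NumPy arrays, the usual trick is to wrap the augmentation in `tf.numpy_function` so it can run inside a `tf.data` pipeline. Here’s a minimal sketch of that pattern; the specific transforms and the toy dataset are placeholders, not the post’s actual pipeline:

```python
import albumentations as A
import numpy as np
import tensorflow as tf

# An example augmentation pipeline; swap in whatever transforms you need.
transforms = A.Compose([
    A.HorizontalFlip(p=0.5),
    A.RandomBrightnessContrast(p=0.2),
])

def aug_fn(image):
    # Albumentations works on NumPy arrays and returns a dict.
    return transforms(image=image)["image"]

def process(image, label):
    # tf.numpy_function lets the NumPy-based augmentation run inside a
    # tf.data pipeline, but it loses static shape information, so we
    # restore the shape afterwards.
    aug_img = tf.numpy_function(func=aug_fn, inp=[image], Tout=tf.uint8)
    aug_img.set_shape(image.shape)
    return aug_img, label

# A toy dataset of 32x32 RGB images, just for illustration.
images = np.random.randint(0, 256, size=(8, 32, 32, 3), dtype=np.uint8)
labels = np.zeros(8, dtype=np.int64)
dataset = (
    tf.data.Dataset.from_tensor_slices((images, labels))
    .map(process, num_parallel_calls=tf.data.AUTOTUNE)
    .batch(4)
)
```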
In this blog post, I will discuss how to use loss functions in PyTorch. I will cover how loss functions work for both regression and classification tasks, how to work with NumPy arrays, and the input shapes and types that PyTorch loss functions expect, and I will demonstrate some common types of losses.
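As a preview, here’s a minimal sketch of the shapes and dtypes involved for a regression loss (`nn.MSELoss`) and a classification loss (`nn.CrossEntropyLoss`), plus converting NumPy arrays to tensors; the values are arbitrary:

```python
import numpy as np
import torch
import torch.nn as nn

# Regression: MSELoss expects predictions and targets of the same shape.
mse = nn.MSELoss()
preds = torch.randn(4, 1)            # (batch_size, 1) float predictions
targets = torch.randn(4, 1)
print(mse(preds, targets))           # scalar tensor

# Classification: CrossEntropyLoss expects raw logits of shape
# (batch_size, num_classes) and integer class indices of shape (batch_size,).
ce = nn.CrossEntropyLoss()
logits = torch.randn(4, 3)           # 4 samples, 3 classes
labels = torch.tensor([0, 2, 1, 1])  # int64 class indices, not one-hot
print(ce(logits, labels))

# Loss functions take torch tensors, not NumPy arrays; convert first.
np_preds = np.array([[0.5], [1.5]], dtype=np.float32)
np_targets = np.array([[1.0], [1.0]], dtype=np.float32)
print(mse(torch.from_numpy(np_preds), torch.from_numpy(np_targets)))
```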
In this blog post, I will discuss how to use loss functions in TensorFlow. I’ll focus on binary cross entropy loss.
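For a quick taste, here’s a minimal sketch using `tf.keras.losses.BinaryCrossentropy`, both on probabilities (the default) and on raw logits via `from_logits=True`; the numbers are arbitrary:

```python
import tensorflow as tf

# Labels are 0/1; these predictions are probabilities in [0, 1].
y_true = tf.constant([[0.0], [1.0], [1.0], [0.0]])
y_prob = tf.constant([[0.1], [0.8], [0.6], [0.3]])

# By default, BinaryCrossentropy expects probabilities...
bce = tf.keras.losses.BinaryCrossentropy()
print(bce(y_true, y_prob).numpy())

# ...but with from_logits=True it accepts raw logits, which is usually
# more numerically stable.
y_logits = tf.constant([[-2.2], [1.4], [0.4], [-0.85]])
bce_logits = tf.keras.losses.BinaryCrossentropy(from_logits=True)
print(bce_logits(y_true, y_logits).numpy())
```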
In this post, I’ll demonstrate how to connect to Google Cloud Platform (GCP) instances using VSCode’s Remote SSH extension. We’ll assume you already have working GCP instances that you can SSH into. To connect to a remote host, VSCode needs to know the HostName, User, and IdentityFile. In this guide, we’ll go over how to find these. For simplicity, we’ll assume you’re trying to connect to an instance named my_instance and your zone is europe-west4-b. You’ll need to find these values and change them in the instructions below.
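To give a sense of where this ends up, here’s a sketch of the `~/.ssh/config` entry that VSCode’s Remote SSH extension reads; the IP, username, and key path are placeholders you’d replace with the values found in the guide:

```
# Entry in ~/.ssh/config -- every value below is a placeholder.
Host my_instance
    HostName 203.0.113.10                      # instance's external IP
    User your_gcp_username                     # your username on the VM
    IdentityFile ~/.ssh/google_compute_engine  # private key used for SSH
```

If you’ve been connecting with `gcloud compute ssh`, the private key it generates typically lives at `~/.ssh/google_compute_engine`.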
It’s important to know whether a model has been trained well. One way to check is to look at the model weights, but it’s hard to know exactly what they’re telling you without something to compare them against. In this post, I’m going to look at the weight statistics for a couple of well-trained networks, which can serve as comparison points.
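As a sketch of the kind of statistics I mean, assuming PyTorch and a pretrained torchvision model as the well-trained reference (this is an illustration, not the post’s exact setup), you could summarize each weight tensor like this:

```python
from torchvision import models

# Load a well-trained reference network; "IMAGENET1K_V1" selects the
# pretrained ImageNet weights in recent torchvision versions.
model = models.resnet18(weights="IMAGENET1K_V1")

# Summarize each weight tensor with a few simple statistics.
for name, param in model.named_parameters():
    if "weight" in name:
        w = param.detach()
        print(f"{name:<40} mean={w.mean().item():+.4f} "
              f"std={w.std().item():.4f} "
              f"min={w.min().item():+.4f} max={w.max().item():+.4f}")
```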