It’s often a good idea to structure deep learning code so that experiments can be run straight from the command line, letting you iterate quickly and record results. argparse makes this super easy: configurations are simply passed as command-line arguments. But sometimes you want to go the other way. You find a repo with a ready-to-go train.py, but you want to walk through it line by line in a Jupyter Notebook.
This post contains some of my notes on the argparse package.
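To make both workflows concrete, here’s a minimal sketch (the flag names are made up for illustration): a parser like the one a train.py might define, plus the trick of handing parse_args an explicit list so the same code runs inside a notebook.

```python
import argparse

def get_parser():
    # Hypothetical training flags; a real train.py will define its own.
    parser = argparse.ArgumentParser(description="Train a model")
    parser.add_argument("--lr", type=float, default=1e-3, help="learning rate")
    parser.add_argument("--epochs", type=int, default=10, help="number of epochs")
    parser.add_argument("--data_dir", type=str, default="data/", help="dataset location")
    return parser

# From the command line, parse_args() reads sys.argv:
#     python train.py --lr 3e-4 --epochs 20

# In a Jupyter notebook, pass an explicit list instead (or [] for the defaults)
# so argparse doesn't try to parse the notebook kernel's own arguments:
args = get_parser().parse_args(["--lr", "3e-4", "--epochs", "20"])
print(args.lr, args.epochs, args.data_dir)
```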
In the research paper Group Normalization, Yuxin Wu and Kaiming He introduce group normalization and show that it is easy to implement by including the necessary code directly in the paper. This post walks through the code behind group normalization.
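As a rough companion (not the paper’s exact listing), here’s a minimal PyTorch-style sketch of the computation, assuming NCHW input and per-channel scale and shift parameters: reshape the channels into groups, normalize each group over its channels and spatial dimensions, then reshape back and apply the affine transform.

```python
import torch

def group_norm(x, num_groups, gamma, beta, eps=1e-5):
    # x: (N, C, H, W); gamma, beta: learnable per-channel scale/shift of shape (1, C, 1, 1)
    N, C, H, W = x.shape
    x = x.reshape(N, num_groups, C // num_groups, H, W)
    # Normalize over the channels within each group and the spatial dimensions
    mean = x.mean(dim=(2, 3, 4), keepdim=True)
    var = ((x - mean) ** 2).mean(dim=(2, 3, 4), keepdim=True)
    x = (x - mean) / torch.sqrt(var + eps)
    x = x.reshape(N, C, H, W)
    return x * gamma + beta

# Example: 32 groups over 64 channels
x = torch.randn(8, 64, 32, 32)
gamma = torch.ones(1, 64, 1, 1)
beta = torch.zeros(1, 64, 1, 1)
out = group_norm(x, num_groups=32, gamma=gamma, beta=beta)
```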
There are a lot of great resources out there, so I thought I would try to compile some of my favorites. This list contains books on deep learning, machine learning, and computer vision, all of which have been made freely available by their publishers.
This post demonstrates how to do data augmentation for computer vision using the albumentations library. The exact augmentations you use will be specific to your use case. For example, if you’re training on overhead imagery, the augmentations will probably differ somewhat from those you’d use on an ImageNet-like dataset (although there will also be considerable overlap).
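To give a flavor of the API, a pipeline for overhead imagery might look something like the sketch below; the specific transforms and probabilities are placeholders, not recommendations.

```python
import albumentations as A
import numpy as np

# Hypothetical pipeline; the right transforms depend on your data.
# Vertical flips and 90-degree rotations tend to make sense for overhead
# imagery, where there's no canonical "up" direction.
transform = A.Compose([
    A.HorizontalFlip(p=0.5),
    A.VerticalFlip(p=0.5),
    A.RandomRotate90(p=0.5),
    A.RandomBrightnessContrast(p=0.2),
])

image = np.random.randint(0, 256, (512, 512, 3), dtype=np.uint8)  # placeholder image
augmented = transform(image=image)["image"]
```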
This post is a quick walkthrough of the different data augmentation methods available in Detectron2 and their utility for augmenting overhead imagery. I’ll also show a simple way to implement them.
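As a rough sketch of what that can look like (the particular augmentations and sizes here are placeholders), Detectron2 exposes its augmentations through detectron2.data.transforms, which can be bundled into an AugmentationList and applied via an AugInput.

```python
import numpy as np
from detectron2.data import transforms as T

# Hypothetical set of augmentations; flips and rotations are natural
# choices for overhead imagery.
augs = T.AugmentationList([
    T.RandomFlip(horizontal=True),
    T.RandomRotation(angle=[0, 90, 180, 270], sample_style="choice"),
    T.ResizeShortestEdge(short_edge_length=(640, 672, 704), max_size=1333, sample_style="choice"),
])

image = np.random.randint(0, 256, (800, 800, 3), dtype=np.uint8)  # placeholder image
aug_input = T.AugInput(image)
tfm = augs(aug_input)              # applies the augmentations in place and returns the Transform
augmented_image = aug_input.image  # the transformed image
```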
This post contains details of how I set up my shell and environment. I use Windows, Mac, and Linux on a daily basis, so I have different setups for different purposes, but I try to make them similar when I can.