This post is a collection of some notes and thoughts I’ve had when working with FastAI.
Table of Contents
- Working on Windows
- Models
- Working on GPUs
- Transitioning From FastAI Version 1
- There’s Hidden Stuff All Over
- Encoding and Decoding Images
- Things I don’t like
- Conclusion
Working on Windows
There seems to be an issue when training some models on Windows machines that I haven’t run into on Mac or Linux. Let’s create a simple example to start.
```python
from fastai.vision.all import *

path = untar_data(URLs.PETS)
files = get_image_files(path/"images")

def label_func(f):
    return f[0].isupper()

dls = ImageDataLoaders.from_name_func(path, files, label_func, item_tfms=Resize(224))
learn = cnn_learner(dls, resnet34, metrics=error_rate)
```
On Windows, FastAI warns that it can’t use multiprocessing here:

> Due to IPython and Windows limitation, python multiprocessing isn't available now. So `number_workers` is changed to 0 to avoid getting stuck
When I try to train this model with `learn.fine_tune(1)`, I run into an `OSError`.
The solution is to add the following before training your model:
```python
from PIL import ImageFile
ImageFile.LOAD_TRUNCATED_IMAGES = True

learn.fine_tune(1)
```
| epoch | train_loss | valid_loss | error_rate | time |
|---|---|---|---|---|
| 0 | 1.202993 | 1.158934 | 0.472727 | 00:07 |

| epoch | train_loss | valid_loss | error_rate | time |
|---|---|---|---|---|
| 0 | 0.418523 | 0.513215 | 0.181818 | 00:05 |
Now it works.
Models
One thing I really like is that `learn.model` is a PyTorch object, which makes it immediately familiar to anyone working in PyTorch.

```python
type(learn.model)
```

    torch.nn.modules.container.Sequential
This means that everything you would normally do with a PyTorch model, you can do with a FastAI model.
```python
learn.model.eval();
```
Working on GPUs
FastAI makes working with GPUs easy. You need to make sure your models and your dataset are on the GPU. This might not be the case, especially if you load them from disk.
To check your dataset:
```python
dls.device
```

    device(type='cuda', index=0)
Then you can put it on the GPU with:
```python
learn.dls.cuda()
```

    <fastai.data.core.DataLoaders at 0x201a6eb6730>
If you just have a single `DataLoader`, you can move it like this:

```python
dl = dls.train
dl.to('cuda')
# dl.to('cpu')  # if you wanted to put it back on the CPU
```

    <fastai.data.core.TfmdDL at 0x201a6e972b0>
To check whether a model is on the GPU, use the standard PyTorch approach:

```python
next(learn.model.parameters()).is_cuda
```

    True
To move a model, you’ll need to:

```python
learn.model = learn.model.cuda()
```
You can also load a learner or build your `DataLoaders` directly on the GPU:

```python
load_learner('my_model.pkl', cpu=False);
new_dls = new_dblock.dataloaders(final_df, device='cuda')
```
Transitioning From FastAI Version 1
For those of you who used the first version of FastAI, there are a lot of differences. When loading data, you might be looking for `DataBunch`es. Those no longer exist, but you will find similar functionality in `DataBlock`s. Also, lots of smaller data classes, such as `ImageList` and `ImageImageList`, no longer exist. Check out my data tutorial to see how to work with `DataBlock`s.
Also, the `import *` statement has changed:

```python
from fastai.vision import *
```

is now

```python
from fastai.vision.all import *
```
There’s Hidden Stuff All Over
There are a lot of cool features hidden around FastAI. These are great when you know they’re there, but surprising when you run into one unexpectedly.
Let’s say you’ve got a model at `analysis/catsvdogs/models/my_model` that you want to load. So you create a `Path` with it and point `learn.load` to it. But instead of loading, this causes an error.
```python
path = Path('analysis/catsvdogs/models/my_model')
try:
    learn.load(path)
except FileNotFoundError as err:
    print(err)
```

    [Errno 2] No such file or directory: 'C:\\Users\\Julius\\.fastai\\data\\oxford-iiit-pet\\models\\analysis\\catsvdogs\\models\\my_model.pth'
Instead of using the path as given, `learn.load` prepends the learner’s own `models` directory to it. This is fine if you know it’s going to do that, but surprising for anyone who isn’t expecting it, and it’s not obvious from the default arguments how to turn it off. This is a minor annoyance, but there are a lot of these. I have found that FastAI makes more assumptions about what you want than other libraries do.
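The path-joining behavior can be reproduced with plain `pathlib`. This is a minimal sketch of what I believe is happening internally; the variable names here are illustrative, not fastai’s actual internals.

```python
from pathlib import Path

# The learner carries its own base path and a model_dir (illustrative values):
learner_path = Path('C:/Users/Julius/.fastai/data/oxford-iiit-pet')
model_dir = 'models'

# The path you actually wanted to load:
name = Path('analysis/catsvdogs/models/my_model')

# The learner's path and model_dir get joined in front of your path,
# which is how the nested path in the error message above appears:
full = learner_path/model_dir/f'{name}.pth'
print(full)
```

This also suggests why passing a longer relative path doesn’t help: whatever you pass still lands inside the learner’s own models directory.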
Encoding and Decoding Images
If you load an image through a test `DataLoader` like this, it will be normalized:

```python
x, = first(dls.test_dl([img]))
```

So if you want to decode it, you’ll have to do something like this:

```python
dls.train.decode((x,))[0][0].shape
```
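To see what decoding is undoing, here is a back-of-the-envelope sketch of the normalization arithmetic on a single value. The numbers are the ImageNet red-channel statistics that fastai’s pretrained-model pipelines typically apply; decoding just inverts the affine map.

```python
# ImageNet red-channel stats (mean, std):
mean, std = 0.485, 0.229

pixel = 0.5                      # original pixel value in [0, 1]
encoded = (pixel - mean) / std   # what the model sees after Normalize
decoded = encoded * std + mean   # what decode recovers
print(round(decoded, 6))  # 0.5
```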
Things I don’t like
There are a lot of things I like about FastAI, but there are also some things that I don’t.
Evaluation
I find the way evaluation works in FastAI to be counterintuitive. If you’re familiar with `keras`, you’re used to calling `.evaluate` on a model after it’s been trained. But in `fastai`, there are only two splits, train and valid. So if you want to evaluate on the test data, you have to create new `DataLoaders` with your training data, then swap in your test data by calling it validation data. At least, that’s the way I’ve been getting it to work. If I find a better way, I’ll update this post.
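The swap boils down to index bookkeeping. Here is a pure-Python sketch of the idea; the file names are made up for illustration, and the final step assumes you hand the indices to something like fastai’s `IndexSplitter`.

```python
# Hypothetical item lists:
train_items = ['t1.jpg', 't2.jpg', 't3.jpg']
test_items = ['x1.jpg', 'x2.jpg']

# Concatenate, then mark the test rows as the "validation" indices:
all_items = train_items + test_items
valid_idxs = list(range(len(train_items), len(all_items)))

# Something like IndexSplitter(valid_idxs) would then treat the test
# rows as the validation set, so the reported "validation" metrics are
# really test-set metrics.
print(valid_idxs)  # [3, 4]
```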
Variable Names
The first one that sticks out is the use of single-letter variables. For me, this adds confusion and does not save time. There are lots of these. For example, take a look at the `params_size` function, which takes an input `m`. It’s not obvious to me what `m` refers to. I find that many times I have to poke around a bit to see what functions are expecting, mostly because I can’t figure it out from the argument names.
It’s almost like it’s designed for people who are going to use the library all the time and know it intimately, but difficult for anyone else.
Splitters
Datasets require splitters. I don’t mind this, but I think there should be a way to not include a splitter. It’s not that I don’t think splitting between training and validation sets is a good idea, it’s just that there are some cases when, for whatever reason, that’s not what I’m trying to do at the moment. But it seems like when you try to go your own way, it’s not easy.
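For context, a splitter is just a callable that takes the items and returns a pair of index lists. One way to effectively opt out is to return an empty validation set, sketched below; whether downstream fastai code tolerates an empty validation set depends on your version, so treat this as an assumption to verify rather than a guarantee.

```python
# A "don't really split" splitter: everything goes to training.
def no_split(items):
    return list(range(len(items))), []

train_idxs, valid_idxs = no_split(['a.jpg', 'b.jpg', 'c.jpg'])
print(train_idxs, valid_idxs)  # [0, 1, 2] []
```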
Namespace Collisions
One thing that I know annoys people about FastAI is the liberal use of `import *` (as I did at the top of the notebook). It’s a Python anti-pattern, so seeing it everywhere can be disconcerting.
I see where the annoyance comes from: you have no idea what’s in your namespace, and that’s a problem. For example, `image_size` is a function in FastAI that is automatically imported. I often write something like `image_size = (244, 244)` in my code, but doing so overwrites a function that might still be needed. This leads to nasty bugs.
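The failure mode is easy to reproduce without fastai at all. The `image_size` below is a stand-in for the wildcard-imported function, not the real fastai one.

```python
# Pretend this arrived via a wildcard import:
def image_size(img):
    return (img.width, img.height)

# Later, an innocent-looking assignment silently shadows the function...
image_size = (244, 244)

# ...so any code that still expects the function now breaks:
try:
    image_size(None)
except TypeError as err:
    print(err)  # 'tuple' object is not callable
```

The error only surfaces at the call site, possibly far from the assignment, which is what makes these bugs nasty.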
Inheriting from Outer Context
There’s a lot of inheriting from the outer context in FastAI. This works well in a Jupyter Notebook environment but makes the library harder to use in production.
Conclusion
I didn’t want to end on a negative note, so I’ll say that there’s a lot I really like about FastAI, and I hope the library continues to be developed. I’ve noticed that there’s not a big development community around it: right now there’s only one core developer, and keeping a library going is a lot of work for a single person.