It could predict when no bear is present at all, rather than being restricted to exactly one of the three classes
One-hot encoding, via MultiCategoryBlock
df.iloc[r,c]
df.column or df['column']
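A quick sketch of both access styles (the toy DataFrame is made up for illustration):

```python
import pandas as pd

df = pd.DataFrame({'species': ['grizzly', 'black', 'teddy'],
                   'weight':  [300, 250, 1]})

df.iloc[0, 1]   # matrix-style: row 0, column 1 -> 300
df['species']   # column by name
df.species      # same column via attribute access
```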
A dataset is a list of (x, y) tuples. A DataLoader takes a dataset, shuffles it, and collates it into mini-batches
A list with 2 lists: one containing the x inputs, the other containing the y targets
A list of mini-batches, each of which is a list of (x, y) tuples
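A minimal plain-PyTorch sketch of the dataset/DataLoader relationship, with made-up toy data:

```python
import torch
from torch.utils.data import DataLoader

# a dataset can be as simple as a list of (x, y) tuples
dataset = [(torch.tensor(float(i)), i % 2) for i in range(10)]

# the DataLoader shuffles the dataset and collates it into mini-batches
dl = DataLoader(dataset, batch_size=4, shuffle=True)
for xb, yb in dl:
    print(xb.shape, yb.shape)   # each batch: a tensor of xs and a tensor of ys
```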
A shorthand way to create anonymous functions on the fly
get_x and get_y. You pass in your own functions based on what the input data is (e.g. an image path vs. a DataFrame row); a sketch combining lambdas with get_x/get_y follows below
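Both come together in the chapter's PASCAL multi-label example, where each item is a DataFrame row; a sketch:

```python
from fastai.vision.all import *

path = untar_data(URLs.PASCAL_2007)
df = pd.read_csv(path/'train.csv')

# lambdas let you define get_x/get_y inline; note they can't be pickled,
# so prefer named functions if you plan to export the Learner
dblock = DataBlock(
    blocks=(ImageBlock, MultiCategoryBlock),
    get_x=lambda r: path/'train'/r['fname'],
    get_y=lambda r: r['labels'].split(' '),
)
dls = dblock.dataloaders(df)
```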
Softmax pushes one output toward 1 at the expense of the others, since its outputs must sum to 1. With multiple labels potentially present per item, that competition between classes is wrong: each label needs an independent probability, which sigmoid provides
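A small demonstration with made-up activations for three classes:

```python
import torch

acts = torch.tensor([[2.0, 0.5, -1.0]])

torch.softmax(acts, dim=1)   # sums to 1: classes compete, one gets pushed up
torch.sigmoid(acts)          # independent 0-1 score per class: several can be "on" at once
```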
nll_loss is for single-label datasets, where each target is a single integer class index rather than a one-hot-encoded vector
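For example (random activations, integer class indices as targets):

```python
import torch
import torch.nn.functional as F

acts = torch.randn(4, 3)              # 4 samples, 3 classes
targets = torch.tensor([0, 2, 1, 0])  # integer indices, not one-hot vectors

# nll_loss expects log-probabilities and simply picks out the target column
loss = F.nll_loss(F.log_softmax(acts, dim=1), targets)
```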
nn.BCELoss does not apply a sigmoid and expects probabilities as input; nn.BCEWithLogitsLoss applies a sigmoid to the raw activations before computing the binary cross-entropy
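A quick check of the relationship on random data:

```python
import torch
import torch.nn as nn

logits  = torch.randn(4, 3)                     # raw activations
targets = torch.randint(0, 2, (4, 3)).float()   # multi-hot labels

loss_a = nn.BCEWithLogitsLoss()(logits, targets)
loss_b = nn.BCELoss()(torch.sigmoid(logits), targets)
assert torch.allclose(loss_a, loss_b)   # same value; the logits version is more numerically stable
```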
accuracy assumed a single-label problem and took the prediction from the highest-scoring class. Multi-label problems have multiple outputs, and we need to check the accuracy of each class separately: sigmoid each activation, apply a threshold, and compare the result to the target labels
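A sketch along the lines of fastai's accuracy_multi:

```python
import torch

def accuracy_multi(preds, targs, thresh=0.5, sigmoid=True):
    "Per-class accuracy: sigmoid the activations, threshold, compare to multi-hot targets."
    if sigmoid: preds = preds.sigmoid()
    return ((preds > thresh) == targs.bool()).float().mean()
```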
Because the accuracy-vs-threshold sweep curve on the validation set has a broad, flat top: picking a threshold from that flat region means we are not overfitting to the validation set, so the choice generalizes (see the sweep sketch below)
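A sketch of the sweep, assuming preds and targs come from learn.get_preds() (which already applies the sigmoid, hence sigmoid=False):

```python
import torch

# preds, targs = learn.get_preds()   # validation predictions and targets
xs = torch.linspace(0.05, 0.95, 29)
accs = [accuracy_multi(preds, targs, thresh=t, sigmoid=False) for t in xs]
# plotting accs against xs shows a broad flat top: any threshold in that
# region gives near-identical accuracy, so the choice generalizes
```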
y_range
How is y_range implemented in fastai? (See if you can implement it yourself and test it without peeking!)
y_range gives you the min and max of the desired range. Sigmoid squishes its input between 0 and 1, so you just need to rescale to the target width -> sigmoid(x) * (max - min), then recenter -> sigmoid(x) * (max - min) + min
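A minimal implementation along those lines (this mirrors fastai's sigmoid_range):

```python
import torch

def sigmoid_range(x, lo, hi):
    "Squash x into (lo, hi): scale the sigmoid by the range width, then shift by the minimum."
    return torch.sigmoid(x) * (hi - lo) + lo
```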
A regression problem is one where the labels/targets are continuous values. Use a loss such as mean absolute error (L1) or mean squared error (MSE)
Inside the DataBlock, set blocks=(ImageBlock, PointBlock). PointBlock tells fastai that the labels are coordinates, so it applies the same augmentations to the points as to the image
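A sketch of such a DataBlock, loosely following the chapter's key-point example; get_ctr stands in for a labelling function that returns the target point for each image:

```python
from fastai.vision.all import *

def get_ctr(f):
    "Hypothetical labelling function: return the target (x, y) point for image `f`."
    ...

dblock = DataBlock(
    blocks=(ImageBlock, PointBlock),             # PointBlock: targets are coordinates
    get_items=get_image_files,
    get_y=get_ctr,
    batch_tfms=aug_transforms(size=(240, 320)),  # points get the same augmentations as the images
)
```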