Provides faster inference scripts

Learner.get_preds[source]

Learner.get_preds(ds_idx=1, dl=None, with_input=False, with_decoded=False, with_loss=False, raw=False, act=None, inner=False, reorder=True, cbs=None, **kwargs)

This function is almost exactly the same as fastai's. The big difference is that we can return the raw outputs or the decoded class names.
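For example, a minimal sketch (assuming a trained learn and a list of items to run inference on, built as in the examples below; the exact return structure follows the Vision example further down):

# Sketch only: `learn` and `items` are assumed to already exist (see the examples below)
dl = learn.dls.test_dl(items)
# with_decoded=True also returns the decoded class names and their indices
probs, classes, idxs = learn.get_preds(dl=dl, with_decoded=True)
# raw=True keeps the un-activated model outputs instead of probabilities
raw_outs = learn.get_preds(dl=dl, raw=True)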

Learner.predict[source]

Learner.predict(x:Learner, item, with_input=False, rm_type_tfms=None)

TabularLearner.predict[source]

TabularLearner.predict(x:TabularLearner, row, with_input=False, rm_type_tfms=None)

Examples

Below are examples with Vision, Tabular, and Text:

Vision

We'll use the PETs dataset:

from fastai.vision.all import *
path = untar_data(URLs.PETS)
fnames = get_image_files(path/'images')
pat = r'(.+)_\d+.jpg$'
batch_tfms = [*aug_transforms(size=224, max_warp=0), Normalize.from_stats(*imagenet_stats)]
item_tfms = RandomResizedCrop(460, min_scale=0.75, ratio=(1.,1.))
bs=64
dls = ImageDataLoaders.from_name_re(path, fnames, pat, batch_tfms=batch_tfms, 
                                   item_tfms=item_tfms, bs=bs)
learn = cnn_learner(dls, resnet18, metrics=accuracy)
o = learn.predict(fnames[0], with_input=False)
o
('scottish_terrier',
 tensor(32),
 tensor([1.2242e-02, 9.9534e-06, 1.2894e-07, 1.4275e-04, 2.8886e-05, 3.6590e-06,
         2.2287e-03, 3.1244e-05, 5.2848e-05, 3.5833e-07, 1.1884e-04, 3.1537e-02,
         1.2701e-02, 2.5066e-03, 1.0196e-01, 1.4401e-03, 1.7709e-05, 9.4003e-03,
         1.2571e-03, 9.1903e-03, 2.6914e-03, 2.2574e-02, 3.5947e-03, 3.5112e-06,
         3.7433e-05, 4.8709e-02, 5.6559e-07, 3.8592e-09, 5.3575e-03, 1.2965e-02,
         1.0298e-01, 9.9666e-04, 6.0892e-01, 3.7930e-05, 5.9146e-05, 6.1699e-03,
         3.5687e-05]))
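
predict returns the decoded class name, the class index, and the full probability tensor. A quick sketch of reading the model's confidence in its prediction from o above:

cls, idx, pred_probs = o
# pred_probs[idx] is the probability assigned to the predicted class
print(f'{cls}: {pred_probs[idx].item():.2%}')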
imgs, probs, classes, clas_idx = learn.get_preds(dl=learn.dls.test_dl(fnames[:3]), with_input=True, with_decoded=True)

The first index will contain our raw transformed images:

TensorImage(imgs[0]).show()
Clipping input data to the valid range for imshow with RGB data ([0..1] for floats or [0..255] for integers).
<matplotlib.axes._subplots.AxesSubplot at 0x7fd7f95fe690>
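
The clipping warning appears because the inputs are still normalized with imagenet_stats; a sketch of undoing that normalization by hand before plotting (using imgs and imagenet_stats from above):

# Reverse (x - mean) / std channel-wise, then show the de-normalized image
mean, std = imagenet_stats
denorm = TensorImage(imgs[0].cpu()) * tensor(std)[:, None, None] + tensor(mean)[:, None, None]
denorm.show()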

Second, the probabilities:

probs[0]
tensor([1.2242e-02, 9.9534e-06, 1.2894e-07, 1.4275e-04, 2.8886e-05, 3.6590e-06,
        2.2287e-03, 3.1244e-05, 5.2849e-05, 3.5833e-07, 1.1884e-04, 3.1537e-02,
        1.2701e-02, 2.5067e-03, 1.0196e-01, 1.4401e-03, 1.7709e-05, 9.4004e-03,
        1.2571e-03, 9.1903e-03, 2.6914e-03, 2.2574e-02, 3.5947e-03, 3.5112e-06,
        3.7434e-05, 4.8710e-02, 5.6560e-07, 3.8592e-09, 5.3575e-03, 1.2965e-02,
        1.0298e-01, 9.9666e-04, 6.0892e-01, 3.7930e-05, 5.9146e-05, 6.1699e-03,
        3.5687e-05])

And third our class names:

classes
['scottish_terrier', 'american_bulldog', 'Abyssinian']
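
To pair each file with its decoded class and the probability assigned to it, a short sketch using the variables unpacked above:

# probs is (n_items, n_classes); clas_idx holds the predicted class index per item
for fname, cls, idx, p in zip(fnames[:3], classes, clas_idx, probs):
    print(f'{fname.name}: {cls} ({p[idx].item():.2%})')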

Tabular

We'll use the ADULT_SAMPLE dataset:

from fastai.tabular.all import *
path = untar_data(URLs.ADULT_SAMPLE)
df = pd.read_csv(path/'adult.csv')
splits = RandomSplitter()(range_of(df))
cat_names = ['workclass', 'education', 'marital-status', 'occupation', 'relationship', 'race']
cont_names = ['age', 'fnlwgt', 'education-num']
procs = [Categorify, FillMissing, Normalize]
y_names = 'salary'
to = TabularPandas(df, procs=procs, cat_names=cat_names, cont_names=cont_names,
                   y_names=y_names, splits=splits)
dls = to.dataloaders()
learn = tabular_learner(dls, layers=[200,100])
inp, probs, name, cat_idx = learn.predict(df.iloc[0], with_input=True)
inp
(tensor([[5, 8, 3, 0, 6, 5, 1]]), tensor([[ 0.7608, -0.8399,  0.7488]]))

For tabular, our input is a TabularPandas row. We can decode that row with the DataLoaders' decode method and show it:

learn.dls.decode(inp).show()
   workclass  education   marital-status      occupation  relationship  race   education-num_na  age   fnlwgt         education-num
0  Private    Assoc-acdm  Married-civ-spouse  #na#        Wife          White  False             49.0  101319.999368  12.0
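
predict works one row at a time; for many rows it is usually faster to build a test DataLoader and reuse get_preds, as in the Vision example (a sketch, assuming the return structure matches that example minus the inputs):

# Sketch: batch inference over the first 100 rows of the original DataFrame
test_dl = learn.dls.test_dl(df.iloc[:100])
probs, names, idxs = learn.get_preds(dl=test_dl, with_decoded=True)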