---
title: TSLearners (TSClassifier, TSRegressor, TSForecaster)
keywords: fastai
sidebar: home_sidebar
summary: "New set of time series learners with a sklearn-like API that simplifies learner creation."
description: "New set of time series learners with a sklearn-like API that simplifies learner creation."
nb_path: "nbs/052b_tslearner.ipynb"
---
TSClassifier

Commonly used arguments:

- `tfms`: item transforms applied to each sample individually. For example, `[None, TSClassification()]`, which is commonly used in most single-label datasets.
- `bs`: batch size. An int or a list of ints (for example `[64, 128]`) can be passed. If a list of ints, the first one will be used for training and the second for validation (the validation batch size can be larger since validation doesn't require backpropagation, which consumes more memory).

Less frequently used arguments:

- `sel_vars`: selects which variables will be used, for example `[0, 3, 5]`.
- `sel_steps`: selects which steps will be used (for example, `slice(-50, None)` will select the last 50 steps from each time series). See the sketch after the example below for how these arguments are passed.
```python
from tsai.all import *  # provides the data utilities, transforms and learners used below
from tsai.models.InceptionTimePlus import *

X, y, splits = get_classification_data('OliveOil', split_data=False)
batch_tfms = [TSStandardize(by_sample=True)]
learn = TSClassifier(X, y, splits=splits, batch_tfms=batch_tfms, metrics=accuracy,
                     arch=InceptionTimePlus, arch_config=dict(fc_dropout=.5))
learn.fit_one_cycle(1)
```
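The less frequently used arguments from the list above can be combined with the same workflow. Below is a minimal sketch, not part of the original examples: it assumes the multivariate UEA 'NATOPS' dataset (chosen only because variable selection needs more than one variable), and the `sel_vars`, `sel_steps` and `bs` values are arbitrary illustration values.

```python
from tsai.all import *

# 'NATOPS' is used here only because it is multivariate (24 variables),
# which makes variable selection meaningful; any multivariate dataset works.
X, y, splits = get_classification_data('NATOPS', split_data=False)

tfms = [None, TSClassification()]             # single-label classification targets
batch_tfms = [TSStandardize(by_sample=True)]

learn = TSClassifier(X, y, splits=splits, tfms=tfms, batch_tfms=batch_tfms,
                     bs=[64, 128],                # bs=64 for training, bs=128 for validation
                     sel_vars=[0, 3, 5],          # use only variables 0, 3 and 5
                     sel_steps=slice(-50, None),  # use only the last 50 steps of each series
                     arch=InceptionTimePlus, metrics=accuracy)
learn.fit_one_cycle(1)
```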
TSRegressor

Commonly used arguments:

- `tfms`: item transforms applied to each sample individually. For example, `[None, TSRegression()]`, which is commonly used in most single-label datasets.
- `bs`: batch size. An int or a list of ints (for example `[64, 128]`) can be passed. If a list of ints, the first one will be used for training and the second for validation (the validation batch size can be larger since validation doesn't require backpropagation, which consumes more memory).

Less frequently used arguments:

- `sel_vars`: selects which variables will be used, for example `[0, 3, 5]`.
- `sel_steps`: selects which steps will be used (for example, `slice(-50, None)` will select the last 50 steps from each time series).
```python
from tsai.all import *
from tsai.models.TST import *

X, y, splits = get_regression_data('AppliancesEnergy', split_data=False)
if X is not None:  # prevents a test failure when the data server is not available
    batch_tfms = [TSStandardize()]
    learn = TSRegressor(X, y, splits=splits, batch_tfms=batch_tfms, arch=TST, metrics=mae, bs=512)
    learn.fit_one_cycle(1, 1e-4)
```
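After training, you would typically get predictions for new samples. The following is a minimal sketch (not part of the original example): it assumes the learner and data from the cell above are available, that tsai's `get_X_preds` helper (a convenience wrapper around fastai's `get_preds` for raw arrays) is used for inference, and the export filename is just an illustration.

```python
# Predict on the validation samples as if they were new data
X_new = X[splits[1]]
raw_preds, _, preds = learn.get_X_preds(X_new, with_decoded=True)
print(preds[:5])  # first 5 predicted target values

# The trained learner can be saved and reloaded with fastai's standard export API
learn.export("tsregressor.pkl")
new_learn = load_learner("tsregressor.pkl")
```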
TSForecaster

Commonly used arguments:

- `tfms`: item transforms applied to each sample individually. For example, `[None, TSForecasting()]`, which is commonly used in most single-label datasets.
- `bs`: batch size. An int or a list of ints (for example `[64, 128]`) can be passed. If a list of ints, the first one will be used for training and the second for validation (the validation batch size can be larger since validation doesn't require backpropagation, which consumes more memory).

Less frequently used arguments:

- `sel_vars`: selects which variables will be used, for example `[0, 3, 5]`.
- `sel_steps`: selects which steps will be used (for example, `slice(-50, None)` will select the last 50 steps from each time series).
```python
from tsai.all import *
from tsai.models.TSTPlus import *

ts = get_forecasting_time_series('Sunspots')
if ts is not None:  # prevents a test failure when the data server is not available
    X, y = SlidingWindow(60, horizon=1)(ts)
    splits = TSSplitter(235)(y)
    batch_tfms = [TSStandardize(by_var=True)]
    learn = TSForecaster(X, y, splits=splits, batch_tfms=batch_tfms, arch=TSTPlus,
                         arch_config=dict(fc_dropout=.5), metrics=mae, bs=512, partial_n=.1)
    learn.fit_one_cycle(1)
```
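To forecast more than one step ahead you would build the windows with a larger horizon, so each target contains one value per forecasted step. The following is a minimal sketch under the same 'Sunspots' setup; the 12-step horizon is an arbitrary illustration value, not part of the original example.

```python
from tsai.all import *
from tsai.models.TSTPlus import *

ts = get_forecasting_time_series('Sunspots')
if ts is not None:
    # 60 past steps as input, 12 future steps as the target (y has one column per forecasted step)
    X, y = SlidingWindow(60, horizon=12)(ts)
    splits = TSSplitter(235)(y)  # keep the last 235 windows for validation
    batch_tfms = [TSStandardize(by_var=True)]
    learn = TSForecaster(X, y, splits=splits, batch_tfms=batch_tfms,
                         arch=TSTPlus, arch_config=dict(fc_dropout=.5),
                         metrics=mae, bs=512)
    learn.fit_one_cycle(1)
```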