Authorizations
HTTPBearer
Body
The frequency of the data, represented as a string: 'D' for daily, 'M' for monthly, 'H' for hourly, and 'W' for weekly.
Model to use, as a string. Common options are (but not restricted to) timegpt-1
and timegpt-1-long-horizon.
The full set of options varies by user; contact ops@nixtla.io for more information. We recommend timegpt-1-long-horizon
for forecasting when you want to predict more than one seasonal period given the frequency of your data.
The number of tuning steps used to train the large time model on the data. Set this value to 0 for zero-shot inference, i.e., to make predictions without any further model tuning.
x >= 0
The loss used to train the large time model on the data. Select from ['default', 'mae', 'mse', 'rmse', 'mape', 'smape', 'poisson']. It is only used if finetune_steps is larger than 0. The default is a robust loss function that is less sensitive to outliers.
Allowed values: default, mae, mse, rmse, mape, smape, poisson
The depth of the finetuning, on a scale from 1 to 5, where 1 means minimal finetuning and 5 means the entire model is finetuned. Defaults to 1.
Allowed values: 1, 2, 3, 4, 5
ID to assign to the finetuned model.
ID of a previously finetuned model.
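To tie the parameters above together, here is a minimal sketch of assembling a request body with these fields. The payload shape, field names as JSON keys, and the bearer-token header follow the descriptions above, but the exact endpoint contract is an assumption; consult the official Nixtla API reference for the authoritative details.

```python
import json

# Hypothetical request body using the parameters documented above.
payload = {
    "freq": "D",                        # daily frequency
    "model": "timegpt-1-long-horizon",  # recommended beyond one seasonal period
    "finetune_steps": 10,               # 0 would mean zero-shot inference
    "finetune_loss": "default",         # robust loss, less sensitive to outliers
    "finetune_depth": 1,                # 1 = minimal finetuning, 5 = full model
}

# HTTPBearer authorization, as listed above. Replace the placeholder
# with your actual API key before sending a real request.
headers = {
    "Authorization": "Bearer <YOUR_API_KEY>",
    "Content-Type": "application/json",
}

body = json.dumps(payload)
print(body)
```

Sending this body to the forecast endpoint with any HTTP client (e.g. `requests.post(url, data=body, headers=headers)`) would then exercise the finetuning controls described above; with `finetune_steps` set to 0, `finetune_loss` and `finetune_depth` have no effect.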