Simulate an lts object based on a supplied time series model.
gen_lts(
n,
model,
start = 0,
end = NULL,
freq = 1,
unit_ts = NULL,
unit_time = NULL,
name_ts = NULL,
name_time = NULL,
process = NULL
)
n: An integer indicating the number of observations to generate.
model: A ts.model or simts object containing one of the allowed models.
start: A numeric that provides the time of the first observation.
end: A numeric that provides the time of the last observation.
freq: A numeric that provides the rate/frequency at which the time series is sampled. The default value is 1.
unit_ts: A string that contains the unit of measure of the time series. The default value is NULL.
unit_time: A string that contains the unit of measure of the time. The default value is NULL.
name_ts: A string that provides an identifier for the time series data. The default value is NULL.
name_time: A string that provides an identifier for the time. The default value is NULL.
process: A vector that contains the model names of each column in the data object, where the last name is the sum of the previous names.
An lts object with the following attributes:
start: The time of the first observation.
end: The time of the last observation.
freq: Numeric representation of the sampling frequency/rate.
unit: A string reporting the unit of measurement.
name: Name of the generated dataset.
process: A vector that contains the model names of the decomposed and combined processes.
This function accepts either a ts.model object (e.g. AR1(phi = .3, sigma2 = 1) + WN(sigma2 = 1)) or a simts object.
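
For illustration, a minimal sketch of a call, assuming the simts package is available; the latent model is the AR1-plus-white-noise example above, and the parameter values (n = 1000, freq = 1, unit_time = "s") are chosen arbitrarily.

library(simts)

# Specify the latent model as the sum of an AR1 and a white noise process
model <- AR1(phi = .3, sigma2 = 1) + WN(sigma2 = 1)

# Generate 1000 observations sampled at rate 1, with time measured in seconds
x <- gen_lts(n = 1000, model = model, freq = 1, unit_time = "s")

# Plot the decomposed latent processes together with their sum
plot(x)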