We introduce a concept of autoregressive (AR) state-space realization that can be applied to any transfer function satisfying a suitable invertibility condition. We show that a theorem of Kalman implies every vector autoregressive model (with exogenous variables) has a minimal AR state-space realization in which the state-transition matrix is a nilpotent Jordan matrix and the remaining realization matrices satisfy certain rank conditions. The simplest case corresponds to reduced-rank regression. As in that case, for a fixed Jordan form, one set of coefficient matrices can be estimated by least squares as a function of the remaining parameters. The likelihood function is a determinant ratio generalizing the Rayleigh quotient. It is unchanged when the realization matrices are transformed by an invertible matrix commuting with the Jordan matrix. Using this invariance, the search space for the maximum likelihood estimate can be restricted to equivalence classes of matrices satisfying a number of orthogonality relations, extending the results of reduced-rank analysis. Our results can be viewed as a multi-lag generalization of canonical correlation analysis. The method considered here provides a solution, in the general case, to the polynomial product regression model of Velu et al. We provide estimation examples. We also explore how the estimates vary across different Jordan matrix configurations and discuss methods for selecting a configuration. Our approach offers a dimension reduction technique with potential applications in time series analysis and linear system identification. In the appendix, we link the reduced configuration space with a geometric object called a vector bundle.
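The abstract notes that a special case of the model corresponds to reduced-rank regression. As a point of reference only, the classical rank-constrained least-squares estimator for that special case can be sketched as below; this is not the authors' general multi-lag procedure, and the function name and simulated data are illustrative.

```python
# Sketch of classical reduced-rank regression (the single-lag special case
# referenced in the abstract), assuming a Frobenius-norm criterion.
import numpy as np

def reduced_rank_regression(X, Y, r):
    """Rank-r least-squares estimate of B in Y ~ X B."""
    # Unconstrained ordinary least-squares coefficient matrix.
    B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)
    # Project the fitted values onto their top-r right singular directions;
    # this yields the optimal rank-r coefficient matrix.
    _, _, Vt = np.linalg.svd(X @ B_ols, full_matrices=False)
    P = Vt[:r].T @ Vt[:r]  # rank-r orthogonal projection
    return B_ols @ P

# Illustrative data with a true rank-1 coefficient matrix.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
B_true = np.outer(rng.standard_normal(5), rng.standard_normal(4))
Y = X @ B_true + 0.1 * rng.standard_normal((200, 4))

B_hat = reduced_rank_regression(X, Y, r=1)
print(np.linalg.matrix_rank(B_hat))  # prints 1
```

The estimator first fits by ordinary least squares and then projects onto the leading singular subspace of the fitted values, which enforces the rank constraint while staying optimal for the least-squares criterion.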