# Introduction

We start off with a quote from Engle: "There is nothing in our chosen career that is as exhilarating as having a good idea. But a very close second is seeing someone develop a wonderful new application from your idea."

Much econometric research these days is about modeling financial time series, which seem to be crowding out macroeconomic data as objects of econometric research. The reasons are obvious:

- In financial time series one can get thousands of observations, whereas with macro data a few hundred observations is a rare luxury; and
- The data are exact; no revisions are necessary.

The evolution of financial data shows a high degree of volatility of the series, coupled with forecasting difficulties that grow as the time horizon shortens when standard (linear) forecasting methods are used. Alternative forecasting and other econometric methods for nonlinear time series, based on the literature on complex dynamic systems, have recently been developed and can be particularly useful in the analysis of financial time series. Since the nonlinear methods require very long time series, the availability of high frequency data makes financial variables the best candidates among economic and financial time series for the application of this methodology. The long financial series are particularly useful for testing recent nonlinear models that do not require second-order stationarity, i.e., methods that owe their roots to the early work of Granger and Engle.

# II. Using High Frequency Data

Although the beta coefficient appeared in the 1970s, a debate is still going on about whether betas are constant or vary over time. Anderson et al (2006) have eloquently shown that although individual variances and covariances are highly persistent, the betas of some major company shares are not, owing to a nonlinear fractional cointegration between individual equities and the market. Using the mean square error as the measure of accuracy in beta estimation, Anderson et al (2006) obtained the optimal pair of sampling frequency and trailing window, which turn out to be as short as 1 minute and 1 week, respectively. This sampling result may be due to the low market noise resulting from high liquidity and to the econometric properties of the errors-in-variables model. Moreover, the realized beta obtained from the optimal pair outperformed the constant beta from the CAPM when overnight returns are excluded; unlike the constant period-by-period beta from the CAPM, the realized beta allows continuous evaluation of the beta estimate. The comparison further strengthens the argument that beta is time-varying.

A non-parametric approach using high frequency data is one of the recent methods used to estimate financial measures such as market volatility. The method uses price data with a very short time horizon, which is now widely available. By using observed variables for the calculation, the approach is very handy in that it trivializes computation and avoids many distortive assumptions necessary for parameterized modeling. Realized measures such as realized variances are known to be efficient estimators of underlying quantities like variances and covariances (Zhang et al (2005)). The realized variance converges to the integrated variance plus the jump component as the time between observations, i.e., the sampling interval, approaches zero:

$$RV_t = \sum_{i=1}^{n} r_{t,i}^2 \;\longrightarrow\; \int_{t-1}^{t} \sigma^2(s)\,ds + \sum_{j=1}^{N_t} \kappa_j^2,$$

where $r_{t,i}$ denotes the i-th intraday return on day t, $\sigma^2(s)$ the spot variance, $N_t$ the number of jumps and $\kappa_j$ their sizes. Of course, were such exploitable regularities known by the public, the gain would be eliminated through arbitrage.
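The realized measures just described reduce to simple sums of high-frequency returns. The following is a minimal sketch, assuming simulated one-minute returns; the function names and the simulated data are illustrative, not taken from the papers cited above.

```python
import numpy as np

def realized_variance(returns):
    """Realized variance: the sum of squared intraday returns."""
    return np.sum(returns ** 2)

def realized_beta(asset_returns, market_returns):
    """Realized beta: realized covariance divided by realized market variance."""
    realized_cov = np.sum(asset_returns * market_returns)
    realized_var = np.sum(market_returns ** 2)
    return realized_cov / realized_var

# Illustration: one trading week of simulated one-minute returns
rng = np.random.default_rng(0)
n = 5 * 390                                        # five 390-minute sessions
market = rng.normal(0.0, 0.0005, n)                # hypothetical market returns
asset = 1.2 * market + rng.normal(0.0, 0.0008, n)  # true beta 1.2 plus noise

print(realized_variance(market))
print(realized_beta(asset, market))                # close to the true beta 1.2
```

In this spirit, the optimal (1 minute, 1 week) pair of Anderson et al (2006) corresponds to recomputing `realized_beta` each day over a trailing one-week window of one-minute returns.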
Still, people often continue to test their linear models against random walk forecasts, only to find that the latter are hard to beat. Important advances in nonlinear time series analysis include the multivariate ARCH and GARCH nonlinear stochastic processes of Engle (1982), nonlinear deterministic chaotic dynamic models (Anderson et al (2001); Trippi (1995)), the non-parametric analysis of Ebens (1999) and the multivariate adaptive regression splines of Lewis et al (1994). These works fuelled analyses of nonlinearities in financial data and opened new possibilities in forecasting and other areas. Recent empirical evidence seems to suggest that financial asset returns are predictable to some degree. Thirty years ago this would have been tantamount to an outright rejection of market efficiency. However, modern financial economics teaches us that the fine structure of securities markets and frictions in the trading process can generate a degree of predictability without contradicting market efficiency.

Interest in nonlinear dynamics as a way to deal with complex processes has been renewed during the last decade owing to the surprising finding that even a very simple deterministic model of a dynamic system can exhibit a very complex 'motion' with the characteristics of chaotic behavior. A chaotic system is one in which long-term prediction of the system's trajectory is impossible, because any uncertainty in its initial state grows exponentially fast over time. This characteristic property is called sensitive dependence on initial conditions, and it is the reason for the rapid loss of predictive power in chaotic systems. However, chaotic systems are deterministic, which marks a crucial difference from random processes.

# III. Chaos and the Nearest Neighbor

Econometrics has long been concerned with complex phenomena, providing successful stochastic models capable of describing financial behavior. The key concept in stochastic models is randomness, assuming that the process under study is governed by chance and probability laws. Based on a philosophy opposite to randomness, nonlinear dynamic systems and chaos offer the possibility of describing a complex phenomenon by a nonlinear dynamic process. The development of econometric techniques for chaotic time series was taken up by Farmer and Sidorowich (1987), Trippi (1995) and others. Starting from a time series embedded in a state space, these authors proposed a forecasting technique that uses delayed coordinates and looks for past patterns of the nearest neighbors (NN). In this way the NN method is a prediction technique in which segments with dynamic behavior similar to the segment at the end of the series are detected, and the next term is then computed as some average of the actually observed terms following those segments (a sketch is given at the end of this section). Although complex economic dynamics suggest the possibility of chaos (Pesaran and Potter (1993)), detecting chaos in financial time series is often an elusive task. A few researchers have proposed the use of nonparametric locally weighted regression to detect deterministic chaos. Recently, one of the most fascinating essays, by Nicolas et al (2005), tests various nonlinear models on the Standard & Poor's 500 index (S&P 500) and comes up with an astonishing result: forecasting the entire distribution one period ahead, they are able to beat the naïve forecast only for the right tail of the distribution (see the last section of this article). At the same time, it is hard to see how one could make money from this knowledge, and the authors do not claim that one could.
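The NN predictor can be stated in a few lines. Below is a minimal sketch, assuming a univariate series, a fixed embedding dimension m and k neighbors; the function name and the logistic-map test series are illustrative choices, not taken from the papers cited above.

```python
import numpy as np

def nn_forecast(series, m=3, k=5):
    """Nearest-neighbor forecast via delay-coordinate embedding.

    Embeds the series in R^m, finds the k past m-histories closest to the
    most recent one, and averages the values that followed each of them.
    """
    x = np.asarray(series, dtype=float)
    n = len(x)
    # All m-histories whose successor is observed (t = m-1, ..., n-2)
    histories = np.array([x[t - m + 1 : t + 1] for t in range(m - 1, n - 1)])
    successors = x[m:]                    # value following each history
    last = x[n - m :]                     # the final m-history
    distances = np.linalg.norm(histories - last, axis=1)
    nearest = np.argsort(distances)[:k]   # indices of the k nearest segments
    return successors[nearest].mean()

# Usage on a noisy deterministic series (chaotic logistic map)
rng = np.random.default_rng(1)
x = [0.4]
for _ in range(500):
    x.append(3.9 * x[-1] * (1.0 - x[-1]))
x = np.asarray(x) + rng.normal(0.0, 0.01, len(x))
print(nn_forecast(x, m=3, k=10))
```

Averaging the successors uniformly is the simplest choice; distance-weighted averages, or the locally weighted regressions mentioned above, are natural refinements.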
However, new procedures are being developed to test model specification and parameter estimation errors simultaneously, often enriched by Kullback-Leibler information.5

1. In line with these considerations, special attention has been paid to testing for predictable components in stock market prices; see, for example, Lo and MacKinlay (1988).
2. The origin of deterministic processes dates back to planetary dynamics, in particular the three-body problem and its unpredictable dynamics. Even though the work on deterministic complex dynamics remained isolated from the main body of science for many years, the publication of Lorenz's work on weather prediction was followed by an outburst of new research on deterministic nonlinear systems with an irregular behavior, which came to be called chaotic behavior.
3. Notice that the philosophy behind the NN approach is quite different from that of the Box-Jenkins methodology. In contrast to Box-Jenkins, where extrapolation of past values into the immediate future is based on the correlation among lagged observations and error terms, the NN method selects relevant prior observations based on their levels and geometric trajectories, not their location in time.
4. When we have a set of simultaneous time series, the NN predictor can be extended to the multivariate case using simultaneous nearest neighbor predictors (SNN). To simplify, consider a set of two time series, $X_t\ (t = 1, \ldots, T)$ and $Y_t\ (t = 1, \ldots, T)$. We are interested in making predictions of one of these series (e.g., $X_{T+1}$) by simultaneously considering nearest neighbors in both series. To this end we embed each series in the vector space $\mathbb{R}^m$ and pay attention to the vector $(x_t^m, y_t^m) \in \mathbb{R}^{2m}$, which gives the last available m-history of each time series. Nearest neighbors are then established relative to this last pair of m-histories.

# IV. Seasonal Time Series Models

Cai and Chen (2006)'s recent seasonal time series model is a local linear factorization of the trend with seasonal components,

$$y_{ij} = \alpha(t_i) + \beta_j(t_i) + \varepsilon_{ij}, \qquad i = 1, \ldots, n, \quad j = 1, \ldots, d,$$

where $t_i = i/n$, $\alpha(\cdot)$ is a trend function on $[0, 1]$, the $\{\beta_j(\cdot)\}$ are smooth seasonal effect functions, either fixed or random, subject to a set of constraints, and the error terms $\{\varepsilon_{ij}\}$ are stationary and weakly dependent random variables.6 For fixed seasonal effects the constraint $\sum_{j=1}^{d} \beta_j(t) = 0$ is needed. The main advantage is that this kind of dependence covers many pertinent examples and can be used in various situations, just as the central limit theorem for weakly dependent variables has been studied in recent years. The authors use a sliding window and a kernel smoother. They derive consistency and asymptotic normality of the weighted least squares estimates obtained by a local linear method, assuming weakly dependent error terms. The proposed methodology is illustrated with a simulated example and two real economic and financial time series, which exhibit nonlinear and nonstationary behavior.

Since Granger and Joyeux (1980) introduced ARFIMA (autoregressive fractionally integrated moving average) models, the maximum likelihood estimation of their parameters has intrigued many researchers. Methods such as exact ML based on the Cholesky decomposition of the covariance matrix tend to be complex and even inefficient, especially in small samples.7 An alternative procedure would be the Levinson-Durbin algorithm.8 But for both, the estimation of the autocorrelations is critical.9
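At the heart of ARFIMA models is the fractional difference operator $(1-L)^d$ with binomial weights $\pi_0 = 1$, $\pi_k = \pi_{k-1}(k-1-d)/k$. A minimal sketch of this filter follows; the expanding-window convolution and the function names are illustrative conventions, not taken from Granger and Joyeux (1980).

```python
import numpy as np

def frac_diff_weights(d, n):
    """Binomial weights pi_k of the fractional difference filter (1 - L)^d."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k   # recursion for binomial coefficients
    return w

def frac_diff(x, d):
    """Apply (1 - L)^d to a series with an expanding window of weights."""
    x = np.asarray(x, dtype=float)
    w = frac_diff_weights(d, len(x))
    # y_t = sum_{k=0}^{t} pi_k * x_{t-k}
    return np.array([np.dot(w[: t + 1], x[t::-1]) for t in range(len(x))])

# Usage: fractionally difference a random walk with d = 0.4
rng = np.random.default_rng(2)
x = np.cumsum(rng.normal(size=300))
print(frac_diff(x, d=0.4)[:5])
```

For 0 < d < 0.5 a series integrated of order d has the long memory the text refers to: autocorrelations decay hyperbolically rather than exponentially.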
# V. The Fed Model and the Stock Market

The "Fed model" is a theory of equity valuation that has found broad application in the investment community. The model compares the stock market's earnings yield (E/P) to the yield on long-term government bonds. In its strongest form the Fed model states that the bond and stock markets are in equilibrium, and fairly valued, when the one-year forward-looking earnings yield equals the 10-year Treasury note yield $Y_{10}$:

$$\frac{E}{P} = Y_{10}.$$

The model is often used as a simple tool to measure the attractiveness of equity, and to help allocate funds between equity and bonds. When, for example, the equity earnings yield is above the government bond yield, investors should shift funds from bonds into equity. The model's name derives from a 1997 Fed monetary policy report, which observed: "Changes in this ratio [P/E of the S&P 500 index] have often been inversely related to changes in the long-term Treasury yields, but this year's stock price gains were not matched by a significant net decline in interest rates. As a result, the yield on ten-year Treasury notes now exceeds the ratio of twelve-month-ahead earnings to prices by the largest amount since 1991, when earnings were depressed by the economic slowdown."

5. In probability theory and information theory, the Kullback-Leibler divergence (also information divergence, information gain, relative entropy, or KLIC) is a non-symmetric measure of the difference between two probability distributions P and Q. Specifically, the Kullback-Leibler divergence of Q from P, denoted $D_{KL}(P\,\|\,Q)$, is a measure of the information lost when Q is used to approximate P. For distributions P and Q of a continuous random variable, the KL divergence is defined as the integral $$D_{KL}(P\,\|\,Q) = \int p(x)\,\ln\!\left(\frac{p(x)}{q(x)}\right) dx,$$ where p and q denote the densities of P and Q (a numerical sketch follows these notes).
6. The concept of weak dependence makes explicit the asymptotic independence between the 'past' and the 'future': the past is progressively forgotten. Roughly speaking, for convenient functions f and g it is assumed that Cov[f('past'), g('future')] is small when the distance between the past and the future is sufficiently large.
7. The Cholesky decomposition (the lower triangular square root) of the covariance matrix for a conditionally independent normal model is obtained under equivariant loss functions. By introducing a special group of lower-triangular block matrices, one obtains the best equivariant estimator of the Cholesky decomposition under each of the four losses. Because both the maximum likelihood estimator and the unbiased estimator belong to the class of equivariant estimators with respect to the special group, they are all inadmissible.
8. The algorithm provides parameterizations of a model by a finite set of positive numbers. It can be used for computing the covariance structure of the process, for testing the validity of such a structure, and for stability testing.
9. Casting ARFIMA models into state-space form leads to exact ML estimates. Estimating the moving average part of time series models has always been the trickier part. Many 'quasi' ML methods have been presented, in which a long AR polynomial approximates the MA side. In simulation experiments Granger and Joyeux (1980) show that all of the ML methods, exact and quasi, are about equally accurate. All have a small downward bias in the estimate of the degree of differencing d.
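The divergence integral in note 5 is easy to evaluate on a grid. Here is a minimal numerical sketch; the grid, the two Gaussian densities and the function name are assumptions made for illustration.

```python
import numpy as np

def kl_divergence(p, q, dx):
    """Numerical KL divergence D(P||Q) for densities sampled on a uniform grid."""
    mask = p > 0                       # integrand vanishes where p = 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask])) * dx

# Two normal densities, N(0,1) and N(1,1), on a fine grid
x = np.linspace(-10.0, 10.0, 10001)
dx = x[1] - x[0]
p = np.exp(-0.5 * x**2) / np.sqrt(2.0 * np.pi)
q = np.exp(-0.5 * (x - 1.0)**2) / np.sqrt(2.0 * np.pi)

print(kl_divergence(p, q, dx))   # analytic value for this pair is 0.5
```

Note that D(P||Q) differs from D(Q||P) in general, which is the non-symmetry the note refers to.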
Surprisingly, the Fed model was never officially endorsed by the Fed, but the former Fed chairman Alan Greenspan seemed to make a reference to it in his memoir: "The decline of real (inflation adjusted) long-term interest rates that has occurred in the last two decades has been associated with rising price-to-earnings ratios for stocks, real estate, and in fact all income-earnings assets." A bond yield versus equity yield comparison had been used in practice long before the model was given this name.10

The recently proposed capital structure substitution theory argues that the Fed model indeed needs to be re-specified. It suggests that supply (company management), rather than demand (investors), drives the relationship between E/P and interest rates. The stock market earnings yield tends to be in equilibrium not with the government bond yield but with the average after-tax corporate bond yield, as companies adjust their capital structure (the mix of equity and bonds) to maximize earnings per share. If managements consistently optimize the capital structure by substituting stocks (repurchasing shares) for bonds or vice versa, equilibrium is reached when

$$\frac{E_x}{P_x} = R\,(1 - T),$$

where $E_x$ is the earnings-per-share of company x, $P_x$ is the share price, R is the nominal interest rate on corporate bonds and T is the corporate tax rate. For a long time, the after-tax interest rate on corporate bonds was roughly equal to the 10-year Treasury rate. But during the 2008 financial crisis this relationship broke down, as Baa-rated corporate bonds peaked at over 9%, and 10-year Treasuries bottomed under 2.5%.

# VI. Stochastic Volatility and Bayesian Estimation

Continuous time models are widely used in modern mathematical finance, providing the basis for option pricing, asset allocation and term structure theory. A classic example is the so-called Black-Scholes model (Black and Scholes (1973)), which characterizes the log of an asset price $x^*(t)$ as the solution of the stochastic differential equation

$$dx^*(t) = \{\mu + \beta\sigma^2\}\,dt + \sigma\,dw(t),$$

where $w(t)$ is standard Brownian motion. Since this implies that aggregate returns are normally distributed with constant variance, well-known stylized features of financial time series such as heavy tails, skewness and volatility clustering are not captured by this model. To improve the model, stochastic volatility has been introduced:

$$dx^*(t) = \{\mu + \beta\sigma^2(t)\}\,dt + \sigma(t)\,dw(t),$$

where the volatility $\sigma^2(t)$ now evolves randomly over time.11 Various assumptions have been made concerning the stochastic nature of the volatility process, most of them based on diffusion-type models, e.g., a square-root process or an Ornstein-Uhlenbeck (OU) process for the log volatility; see Anderson and Lund (1997). Fruhwirth-Schnatter (2001) proposed a Markov chain Monte Carlo estimation approach for such latent-process models. The recently proposed stochastic volatility model with a marginal gamma law12 is derived from the Rosinski representation13 and has the conditional structure described in note 12 (a simulation sketch follows at the end of this section).

12. The problem stems from the fact that the conditional distribution of the aggregated returns $y_n$, although normal, depends on the latent processes $\tau(t)$ and $\sigma^2(t)$: $$y_n \mid \sigma_n^2 \sim N\!\left(\mu\Delta_n + \beta\sigma_n^2,\ \sigma_n^2\right),$$ where $\Delta_n = t_n - t_{n-1}$ and $\sigma_n^2$ is the increment $\tau(t_n) - \tau(t_{n-1})$ of the integrated volatility $\tau(t) = \int_0^t \sigma^2(s)\,ds$; under the OU specification it may be expressed as $\sigma_n^2 = \lambda^{-1}\big[z(\lambda t_n) - z(\lambda t_{n-1}) - \big(\sigma^2(t_n) - \sigma^2(t_{n-1})\big)\big]$, with z the background driving process of the OU volatility.
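The stochastic volatility dynamics above are easy to simulate with an Euler-Maruyama scheme. The sketch below assumes, purely for illustration, a Gaussian OU process for the log variance rather than any particular specification from the literature; all parameter values are invented.

```python
import numpy as np

def simulate_sv(n, dt, mu, beta, lam, rng):
    """Euler-Maruyama simulation of
    dx(t) = (mu + beta * sigma2(t)) dt + sigma(t) dW(t),
    with an Ornstein-Uhlenbeck process for the log variance (an assumption)."""
    h = np.zeros(n)                     # log variance
    x = np.zeros(n)                     # log price
    for t in range(1, n):
        # OU dynamics for the log variance: dh = -lam * h dt + dB
        h[t] = h[t - 1] - lam * h[t - 1] * dt + np.sqrt(dt) * rng.normal()
        sigma2 = np.exp(h[t])
        x[t] = (x[t - 1]
                + (mu + beta * sigma2) * dt
                + np.sqrt(sigma2 * dt) * rng.normal())
    return x, np.exp(h)

rng = np.random.default_rng(3)
log_price, variance = simulate_sv(n=1000, dt=1 / 252, mu=0.05, beta=-0.5,
                                  lam=4.0, rng=rng)
print(log_price[-1], variance.max())
```

Simulated paths like these reproduce the volatility clustering that the constant-variance Black-Scholes model misses.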
# VII. Tail Dependent Time Series

In time series modeling, a key step is to determine a finite dimensional representation of the proposed model, i.e., to determine statistically how many parameters have to be included in the model. In linear time series models, such as AR(p) and MA(q), autocovariance functions or partial autocovariance functions are often used to determine the dimension, i.e., the values of p and q. But in nonlinear time series models these techniques may no longer be applicable. In the context of the max-stable process, since the underlying distribution has no finite variance, the dimension of the model cannot be determined in the usual manner. Danielsson (2002) and Mikosch and Straumann (2002) used the gamma test to determine the order of the lag-k tail dependence existing in financial time series. Based on standardized return series, the test results show that jumps in returns are not transient. New time series models that combine a specific class of max-stable processes, Markov processes and GARCH processes have been proposed and used to model tail dependencies within asset returns.14 The gamma test is used to check whether there exists tail dependence in the S&P 500 return data. The approach is hierarchical: GARCH(1,1) fitting is applied first to obtain estimated standard deviations; then, based on the standardized return series, M3 and Markov process modeling are applied. It is also possible, perhaps, to study the Markov process, the GARCH process and the M3 process simultaneously, but this may require additional research. Attention is restricted to the M3 process. This sub-class has the advantage of efficiently modeling serially tail-dependent financial time series, while, of course, other sub-class specifications are possibly also suitable.
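Lag-k tail dependence has a simple empirical counterpart: the probability that a return exceeds a high threshold given that the return k periods earlier also did. The sketch below is a naive counting estimator of this quantity, not the gamma test used in the papers cited above; the threshold quantile and test data are assumptions.

```python
import numpy as np

def lag_k_tail_dependence(x, k, q=0.95):
    """Empirical lag-k upper tail dependence:
    P(X_{t+k} > u | X_t > u), with u the q-th sample quantile."""
    x = np.asarray(x, dtype=float)
    u = np.quantile(x, q)
    lead, lag = x[k:], x[:-k]
    exceedances = lag > u
    if not exceedances.any():
        return np.nan
    return np.mean(lead[exceedances] > u)

# For i.i.d. data the estimate should be near 1 - q; persistent values
# well above that level would point to serial tail dependence.
rng = np.random.default_rng(4)
returns = rng.standard_t(df=3, size=5000)   # heavy-tailed i.i.d. returns
for k in (1, 2, 5):
    print(k, lag_k_tail_dependence(returns, k))
```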
# VIII. Conclusion and Comments

Many empirical studies have uncovered significant nonlinearities in stock prices. If stock returns are governed by chaos of low complexity, we should be able to make predictions much better than with simple methods such as the random walk. The NN approach to forecasting financial time series is attractive because it represents a certain mixture of technical analysis and chaotic dynamics. There have also been non-parametric approaches to estimating the time-varying beta with high frequency data, the so-called realized beta. Market microstructure noise and the lag of price adjustment to the true asset price are known to cause a high level of distortion when price data are sampled at extremely high frequency. The Fed model, an equity valuation model, does matter because people use it: analysts from JP Morgan to ING to Prudential use the Fed model in their calculations. The Bayesian estimation of stochastic volatility with a marginal gamma law has much to offer in practical parameterization. Recently, statistical evidence of jumps in financial time series has also been observed. A new class of time series models has been introduced combining the Markov process, the GARCH process and the M3 process.

10. The competing asset argument listed above holds that only when stocks have the same yield as government bonds are both asset classes equally attractive to investors. But the earnings yield (E/P) of a stock does not describe what an investor actually receives, as not all earnings are paid out to the investor. And how do corporate bonds (with a yield above the government bond yield) fit into this picture? A number of assumptions need to be made to go from the constant-growth dividend discount model to the Fed model. Estrada starts with the Gordon growth model $$P = \frac{D\,(1 + G)}{R_f + RP - G},$$ where P is the current price, D the current dividend, G the expected long-term growth rate, $R_f$ the risk-free rate (10-year Treasury notes) and RP the equity risk premium. If one now assumes that 100% of earnings are paid out as dividends (D = E), that the growth rate equals zero, and that the equity risk premium also equals zero, one gets the Fed model: $E/P = R_f$. The three assumptions seem unrealistic at best. It is also pointed out that the Fed model compares a real magnitude (E/P) with a nominal interest rate: inflation should affect the bond yield, but not the earnings yield.

# References

* Anderson, T. G., Bollerslev, T., Diebold, F. and Ebens, H. (2003). "The Distribution of Stock Returns." Journal of Financial Economics, 61(2).
* Anderson, T. G., Bollerslev, T., Diebold, F. and Wu, J. (2006). "Realized Beta: Persistence and Predictability." Advances in Econometrics, 20.
* Anderson, T. and Lund, J. (1997). "Estimating Continuous Time Stochastic Volatility Models of the Short-term Interest Rate." Journal of Econometrics, 3.
* Barndorff-Nielsen, O. and Shephard, N. (2002). "Econometric Analysis of Realized Volatility and its Use in Estimating Stochastic Volatility Models." Journal of the Royal Statistical Society, Series B, 64.
* Barndorff-Nielsen, O. E. and Shephard, N. (2001). "Non-Gaussian Ornstein-Uhlenbeck-based Models and Some of their Uses in Financial Economics." Journal of the Royal Statistical Society, Series B, 63.
* Diebold, F. X. and Nason, J. (1990). "Nonparametric Exchange Rate Prediction." Journal of International Economics, 28(1).
* Ebens, H. (1998). "Realized Stock Volatility." Working Paper 32, Department of Economics, Johns Hopkins University.
* Engle, R. (1982). "Autoregressive Conditional Heteroskedasticity with Estimates of the Variance of U.K. Inflation." Econometrica, 50(1).
* Farmer, J. and Sidorowich, J. (1987). "Predicting Chaotic Time Series." Physical Review Letters, 59.
* Fruhwirth-Schnatter, S. (2001). "Markov Chain Monte Carlo Estimation of Classical and Dynamic Switching and Mixture Models." Journal of the American Statistical Association, 96(3).
* Granger, C. and Joyeux, R. (1980). "An Introduction to Long Memory Time Series Models and Fractional Differencing." Journal of Time Series Analysis, 1.
* Lewis, P. A. W., Ray, B. K. and Stevens, J. G. (1994). "Modeling Time Series Using Multivariate Adaptive Regression Splines (MARS)." In A. Weigend and N. A. Gershenfeld (eds.), Time Series Prediction: Forecasting the Future and Understanding the Past. Reading, MA: Addison-Wesley.
* Lo, A. and MacKinlay, A. C. (1988). "Stock Prices do not Follow Random Walks: Evidence from a Simple Specification Test." Review of Financial Studies, 1.
* Mikosch, T. and Straumann, D. (2002). "Whittle Estimation in a Heavy-tailed GARCH(1,1) Model." Stochastic Processes and their Applications, 100.
* Packard, N. H., Crutchfield, J. P., Farmer, J. D. and Shaw, R. S. (1980). "Geometry from a Time Series." Physical Review Letters, 45.
* Pesaran, M. and Potter, S. (1993). Chaos and Econometrics. John Wiley.
* Poincaré, H. (1908). Science et Méthode.
* Rosinski, J. (2001). "Series Representations of Lévy Processes from the Perspective of Point Processes." In O. E. Barndorff-Nielsen, T. Mikosch and S. Resnick (eds.), Lévy Processes: Theory and Applications. Birkhäuser.
* Trippi, R. (1995). Chaos and Nonlinear Dynamics in the Financial Market: Theory, Evidence and Applications. Chicago: Irwin.