
Talk:Autocorrelation


Criticism in Statistics Section - Need to Add Alternative Definition Free from Signal Processing/Determinism (i.e. in the Domain of Stochastic Processes)


In the statistics section, the definition of autocorrelation is not the same as the one often given in books on stochastic processes, and it differs from the definition given in many places on Wikipedia itself. It is a valid convention, however. For example, it is used in the following two books and in other books/notes on "time-series analysis":

  • Time Series: Theory and Methods by Peter J. Brockwell and Richard A. Davis
  • Time Series Analysis: Forecasting and Control by Box and Jenkins

On the other hand, I have found the autocorrelation defined simply as $\operatorname{R}_{XX}(t_1, t_2) = \operatorname{E}\left[X_{t_1} \overline{X}_{t_2}\right]$ for two times $t_1$ and $t_2$ in these:

  • Probability, Random Variables, and Stochastic Processes by Papoulis
  • Probability, Statistics, and Random Processes for Electrical Engineering by Leon-Garcia
  • Discrete Stochastic Processes and Optimal Filtering by Bertein and Ceschi

In other words, the alternative definition of autocorrelation, $\operatorname{R}_{XX}(t_1, t_2) = \operatorname{E}\left[X_{t_1} \overline{X}_{t_2}\right]$, deserves to be expressed with some formality in the context of stochastic processes in the "statistics" section also. And it should be stated explicitly that these two functions carry the same name in differing contexts (apparently "time-series analysis" versus "everywhere else"). Doing so has several advantages, starting with admitting that this alternative definition exists not merely in the context of "signal processing" with its integral definition (rather than the probabilistic one with the expectation):

  • It is consistent with many other articles on Wikipedia (in the context of stochastic processes):
    • http://en.wikipedia.org/wiki/Autocorrelation_matrix . Here, the autocorrelation matrix, a generalization of the autocorrelation function, is defined without normalization.
    • http://en.wikipedia.org/wiki/Spectral_density states that "Using such formal reasoning, one may already guess that for a stationary random process, the power spectral density f(\omega) and the autocorrelation function of this signal \gamma(\tau)=\langle X(t) X(t+\tau)\rangle should be a Fourier transform pair" while clearly showing the Fourier transform of $\gamma(\tau)$.
    • http://en.wikipedia.org/wiki/Wiener%E2%80%93Khinchin_theorem also calls $\langle X(t) X(t+\tau)\rangle$ the "autocorrelation function" (while being the only one of these sources that even bothers to clear up the confusion that autocorrelation is sometimes called autocovariance).
  • It gives a name to the quantity $\operatorname{E}[X(t) X(t+\tau)]$, which is often of theoretical importance (e.g. in the Wiener-Khinchin theorem). The statistics section's terminology offers no name for this quantity that I know of.
  • The autocorrelation defined this way yields the other (normalized) autocorrelation function with only knowledge of the mean functions $\mu_1$ and $\mu_2$. On the other hand, to go the other way, we need the mean functions $\mu_1$ and $\mu_2$ in addition to the variance functions $\sigma_1$ and $\sigma_2$ (to undo the normalization). And yes, the autocorrelation defined this way ALSO yields the covariance function if we accompany it with $\mu_1$ and $\mu_2$ (see the relations spelled out after this list). Further, the autocorrelation defined this way more readily brings out the second moment of the random variable (i.e. it is more readily linked to the standard theory of moments of random variables).
  • The autocorrelation function defined this way exists in situations where the other definition fails to exist (e.g. when one of the variances is zero).
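
To spell the conversions out, here is a sketch of the relations being claimed (notation chosen only for illustration: $\operatorname{R}_{XX}$ for the un-normalized autocorrelation, $\operatorname{K}_{XX}$ for the autocovariance, $\rho_{XX}$ for the normalized Pearson version, $\mu_t$ and $\sigma_t$ for the mean and standard deviation at time $t$):

$\operatorname{K}_{XX}(t_1, t_2) = \operatorname{R}_{XX}(t_1, t_2) - \mu_{t_1} \overline{\mu}_{t_2}$

$\sigma_t^2 = \operatorname{K}_{XX}(t, t) = \operatorname{R}_{XX}(t, t) - |\mu_t|^2$

$\rho_{XX}(t_1, t_2) = \frac{\operatorname{K}_{XX}(t_1, t_2)}{\sigma_{t_1} \sigma_{t_2}}, \qquad \operatorname{R}_{XX}(t_1, t_2) = \rho_{XX}(t_1, t_2)\, \sigma_{t_1} \sigma_{t_2} + \mu_{t_1} \overline{\mu}_{t_2}$

So $\rho_{XX}$ is recoverable from $\operatorname{R}_{XX}$ together with the means alone (the variances come from the diagonal), while recovering $\operatorname{R}_{XX}$ from $\rho_{XX}$ requires both the means and the variances.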


— Preceding unsigned comment added by 71.80.79.67 (talk) 06:59, 4 February 2014 (UTC)[reply]

Since that section cites no sources, you would be perfectly justified in replacing it or supplementing it with sourced content. I agree that there are at least two major sourceable definitions worth mentioning. Dicklyon (talk) 07:33, 4 February 2014 (UTC)[reply]
Please excuse this comment from someone late to this party, but it makes sense to call $\rho(a, b)$ an "autocorrelation" if and only if $-1 \le \rho(a, b) \le 1$ for all $a$ and $b$. Otherwise it's not a correlation (nor an autocorrelation).
The section on "Auto-correlation of stochastic processes" begins by saying, "the autocorrelation of a real or complex random process is the Pearson correlation between values of the process at different times, as a function of the two times or of the time lag." This means that Eq.1 is nonsense unless and the variance = 1 for each .
I suggest deleting expressions Eq.1 and Eq.3 as well as deleting the distinction between $\operatorname{R}_{XX}$ and $\operatorname{K}_{XX}$. DavidMCEddy (talk) 02:48, 13 December 2021 (UTC)[reply]

Unclear passage


This passage:

"For a weak-sense stationarity, wide-sense stationarity (WSS) process, the definition is

where

"

seems unclear to me, because to the uninitiated (like me), the phrase:

"For a weak-sense stationarity, wide-sense stationarity (WSS) process"

seems to be self-contradictory nonsense.

I hope someone knowledgeable about this subject will please rewrite this so that it does not come across as nonsense.

Please note: I am not saying that it is nonsense. But since it appears to be nonsense, this would benefit from some clarification.

Is there supposed to be an "and" between "weak-sense stationarity" and "wide-sense stationarity"? An "or"? Or what? 2601:200:C000:1A0:300E:BD77:DEE5:AA45 (talk) 03:27, 17 August 2022 (UTC)[reply]

The passage you quoted contains a link to a section entitled "Weak or wide-sense stationarity" of the Wikipedia article on "Stationary process". That article begins by saying, "a stationary process (or a strict/strictly stationary process or strong/strongly stationary process) is a stochastic process whose unconditional joint probability distribution does not change when shifted in time." That subsection says that a process is "weak-sense stationarity, wide-sense stationarity (WSS), or covariance stationarity" if the first two moments and the autocovariance are finite and do not change over time.
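For reference, those conditions can be written out explicitly (standard notation, not a direct quote from either article): a process $\{X_t\}$ is weak-sense (wide-sense) stationary when

$\operatorname{E}[X_t] = \mu_X \text{ (constant in } t\text{)}, \qquad \operatorname{E}\left[|X_t|^2\right] < \infty, \qquad \operatorname{K}_{XX}(t+\tau, t) \text{ depends only on the lag } \tau.$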
From that it's clear that we could rewrite that phrase as 'For a weak-sense stationary (WSS) process (also called a "wide-sense stationary" or "covariance stationary" process)'. Please feel free to make that change. (Please excuse: It's 1:16 AM where I am right now, and I'm going back to bed. I believe that I tend to be more productive if I check my email in the middle of the night and spend a few minutes doing things like this. However, I don't feel an urgency to change this myself, but if you wanted to make this change, I would encourage you to do so. You may know that almost anyone can change almost anything on Wikipedia. What stays tends to be written from a neutral point of view citing credible sources. (For the policy regarding articles that can NOT be changed by anyone, see "Wikipedia:Protection policy".) Wikipedia also has a policy encouraging people to make changes like this. The policy is summarized as "be bold but not reckless", i.e., just do it. ;-) Users who repeatedly make changes that are obvious vandalism can be blocked. This clearly isn't. If a change you make is not quite right for some reason, some other user will likely fix it. As an aside, you may know that if you have an account, you can "watch" articles. I get several emails a day telling me that articles I'm "watching" have changed and inviting me to look at the change. That's how I saw your question.) DavidMCEddy (talk) 06:44, 17 August 2022 (UTC)[reply]


Contradictory definition


The section Auto-correlation of stochastic processes begins as follows:

"In statistics, the autocorrelation of a real or complex random process is the Pearson correlation between values of the process at different times, as a function of the two times or of the time lag."

The definition is then presented as

$\operatorname{R}_{XX}(t_1, t_2) = \operatorname{E}\left[X_{t_1} \overline{X}_{t_2}\right]$

(where $\operatorname{E}$ is the expected value operator and the bar represents complex conjugation.)

However, this does not use Pearson correlation, but instead uses covariance.
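
A small worked example of the mismatch (the numbers are invented purely for illustration): take a WSS process with mean $\mu = 2$ and variance $\sigma^2 = 1$. Then at lag zero

$\operatorname{E}\left[X_t \overline{X}_t\right] = \sigma^2 + |\mu|^2 = 5, \qquad \rho_{XX}(0) = \frac{\operatorname{K}_{XX}(0)}{\sigma^2} = 1,$

so the expectation-based quantity is not even bounded by 1, let alone equal to a Pearson correlation.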

Later in the article it is explained that there are two conventions, one using Pearson correlation and another using covariance (and calling it "correlation" anyway).

Very bad idea. The passage I quote above contradicts itself.

And regardless of whether some people misuse the word "correlation" to mean "covariance", that should not be what Wikipedia does. 2601:200:C000:1A0:1841:4827:BAAA:F4DF (talk) 15:52, 17 August 2022 (UTC)[reply]

Agreed. (Eq. 1) is wrong.
Can someone check the two references? If they contain that expression, those references are wrong -- or at least using non-standard definitions.
I do not have time now to fix this, but I would support others doing it. E.g., what about deleting (Eq. 1)? I suggest also deleting all equation numbers unless there are actual references. Simple searches yielded nothing.
FYI: Almost anyone can change almost anything in Wikipedia. Others change new contributions they find inappropriate. What stays tends to be written from a neutral point of view citing credible sources. We encourage users to Wikipedia:Be bold but not reckless.
Thanks for the observation. DavidMCEddy (talk) 16:18, 17 August 2022 (UTC)[reply]
@RaviGaaDu: Can you please check a reference for the formula you changed?
I just reverted it for two reasons:
  • The n or (n - 1) or (n - k) in the denominator would seem to refer to the number of terms in the summation, NOT the number of degrees of freedom in an estimate.
  • The line above says it's "For a discrete process with known mean and variance", BUT this formula does NOT subtract the known mean.
Beyond this, I think all the math in this article should be reviewed carefully and checked against published sources, as suggested in my comment above from almost two years ago. I think there are other errors in this article, but I am not prepared to spend the time myself to check and fix them. DavidMCEddy (talk) 12:41, 24 July 2024 (UTC)[reply]
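
For comparison, the estimate usually given for a discrete process with known mean $\mu$ and variance $\sigma^2$ (as in standard references; I have not re-checked the current article text) is along these lines:

$\hat{R}(k) = \frac{1}{(n - k)\,\sigma^2} \sum_{t=1}^{n-k} (X_t - \mu)(X_{t+k} - \mu),$

in which the known mean is subtracted and the denominator $(n - k)$ simply counts the number of terms in the sum.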