
Alameda County Study

Article snapshot taken from Wikipedia, licensed under the Creative Commons Attribution-ShareAlike license.

A longitudinal study (or longitudinal survey, or panel study) is a research design that involves repeated observations of the same variables (e.g., people) over long periods of time (i.e., uses longitudinal data). It is often a type of observational study, although it can also be structured as a longitudinal randomized experiment.


The Alameda County Study is a longitudinal study of residents from Alameda County, California, which examines the relationship between lifestyle and health. The "1965 cohort" was given health questionnaires in 1965, 1973, 1985, 1988, 1994, and 1999. The researchers found that those who followed five health practices lived healthier and longer lives. Another study of the Alameda cohort suggests that social and community ties can also help an individual to live longer. Later studies of

a cohort (a group of people who share a defining characteristic, typically who experienced a common event in a selected period, such as birth or graduation) and perform cross-section observations at intervals through time. Not all longitudinal studies are cohort studies; some instead include a group of people who do not share a common event. As opposed to observing an entire population, a panel study follows

a (potentially time-dependent) random vector $\mathbf{X}=(X_{1},\ldots ,X_{n})^{\rm T}$ is an $n\times n$ matrix containing as elements the autocorrelations of all pairs of elements of the random vector $\mathbf{X}$. The autocorrelation matrix

a WSS process:
$$\left|\operatorname{R}_{XX}(\tau )\right|\leq \operatorname{R}_{XX}(0)$$
Notice that $\operatorname{R}_{XX}(0)$

a continuous-time white noise signal will have a strong peak (represented by a Dirac delta function) at $\tau =0$ and will be exactly $0$ for all other $\tau$. The Wiener–Khinchin theorem relates the autocorrelation function $\operatorname{R}_{XX}$ to
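
As a rough numerical illustration of the discrete-time analogue of this property, the sketch below estimates the sample autocorrelation of white Gaussian noise and shows that it is concentrated at zero lag. NumPy, the sample size, and the random seed are illustrative assumptions, not part of the article.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)   # zero-mean, unit-variance white noise samples

def sample_autocorr(x, max_lag):
    """Biased sample autocorrelation R(k) ~= (1/N) * sum_n x[n] * x[n - k]."""
    n = len(x)
    return np.array([np.dot(x[k:], x[:n - k]) / n for k in range(max_lag + 1)])

r = sample_autocorr(x, 5)
print(r)   # r[0] is close to the variance (about 1); r[1:] hover near 0
```

In the sampled sketch the continuous-time Dirac delta simply appears as a single dominant value at lag 0.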

a discrete-time signal $y(n)$ is
$$R_{yy}(\ell )=\sum _{n\in \mathbb{Z}}y(n)\,{\overline {y(n-\ell )}}$$
The above definitions work for signals that are square integrable, or square summable, that is, of finite energy. Signals that "last forever" are treated instead as random processes, in which case different definitions are needed, based on expected values. For wide-sense-stationary random processes,
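
A minimal sketch of this finite-energy definition, assuming NumPy and a made-up three-sample signal (neither appears in the article):

```python
import numpy as np

def autocorr_energy(y, lag):
    """R_yy(lag) = sum_n y[n] * conj(y[n - lag]) for a finite-energy signal,
    treating samples outside the recorded range as zero."""
    y = np.asarray(y)
    if lag < 0:                                   # conjugate symmetry of the definition
        return np.conj(autocorr_energy(y, -lag))
    return np.sum(y[lag:] * np.conj(y[:len(y) - lag]))

y = np.array([1.0, 2.0, 3.0])                     # made-up three-sample signal
print([autocorr_energy(y, k) for k in range(-2, 3)])   # values 3, 8, 14, 8, 3
```

For real signals this agrees with `np.correlate(y, y, mode="full")`, which returns the same values over lags $-(N-1),\ldots ,N-1$.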

a point in time, this may mean that 10% of the population are always poor or that the whole population experiences poverty for 10% of the time. Longitudinal studies can be retrospective (looking back in time, thus using existing data such as medical records or claims databases) or prospective (requiring the collection of new data). Cohort studies are one type of longitudinal study which sample

a smaller, selected group - called a 'panel'. When longitudinal studies are observational, in the sense that they observe the state of the world without manipulating it, it has been argued that they may have less power to detect causal relationships than experiments. Others say that because of the repeated observation at the individual level, they have more power than cross-sectional observational studies, by virtue of being able to exclude time-invariant unobserved individual differences and also of observing

a stochastic process is
$$\rho _{XX}(t_{1},t_{2})={\frac {\operatorname {K} _{XX}(t_{1},t_{2})}{\sigma _{t_{1}}\sigma _{t_{2}}}}={\frac {\operatorname {E} \left[(X_{t_{1}}-\mu _{t_{1}}){\overline {(X_{t_{2}}-\mu _{t_{2}})}}\right]}{\sigma _{t_{1}}\sigma _{t_{2}}}}.$$
If

is a complex random vector, the autocorrelation matrix is instead defined by
$$\operatorname {R} _{\mathbf {Z} \mathbf {Z} }\triangleq \operatorname {E} [\mathbf {Z} \mathbf {Z} ^{\rm {H}}].$$
Here ${}^{\rm H}$ denotes the Hermitian transpose. For example, if $\mathbf {X} =\left(X_{1},X_{2},X_{3}\right)^{\rm T}$
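
A sketch of how such a matrix might be estimated from repeated draws of a complex random vector; NumPy, the sample count, and the toy distribution are assumptions made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Many draws of a 3-component complex random vector Z (the distribution is made up).
Z = rng.standard_normal((100_000, 3)) + 1j * rng.standard_normal((100_000, 3))

# Sample estimate of R_ZZ = E[Z Z^H]: entry (i, j) approximates E[Z_i * conj(Z_j)].
R_ZZ = (Z[:, :, None] * np.conj(Z[:, None, :])).mean(axis=0)

print(R_ZZ.round(2))
print(np.allclose(R_ZZ, R_ZZ.conj().T))   # True: the estimate is Hermitian by construction
```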

is a mathematical tool for finding repeating patterns, such as the presence of a periodic signal obscured by noise, or identifying the missing fundamental frequency in a signal implied by its harmonic frequencies. It is often used in signal processing for analyzing functions or series of values, such as time domain signals. Different fields of study define autocorrelation differently, and not all of these definitions are equivalent. In some fields,
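
To illustrate the "periodic signal obscured by noise" use case, here is a small sketch; the sample rate, tone frequency, noise level, and the use of NumPy are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
fs = 1000                                        # assumed sample rate (Hz)
t = np.arange(0, 2.0, 1 / fs)
tone = np.sin(2 * np.pi * 50 * t)                # hidden 50 Hz tone: period of 20 samples
x = tone + 1.5 * rng.standard_normal(t.size)     # noise strong enough to hide it by eye

x = x - x.mean()
r = np.correlate(x, x, mode="full")[x.size - 1:]  # sample autocorrelation at lags 0, 1, 2, ...

# The hidden periodicity reappears as a 20-sample oscillation of the autocorrelation:
# strongly positive at one full period (lag 20), strongly negative at the half period (lag 10).
print(r[20], r[10])
```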



is a random vector, then $\operatorname {R} _{\mathbf {X} \mathbf {X} }$ is a $3\times 3$ matrix whose $(i,j)$-th entry is $\operatorname {E} [X_{i}X_{j}]$. In signal processing,

is always real. The Cauchy–Schwarz inequality, in its form for stochastic processes:
$$\left|\operatorname {R} _{XX}(t_{1},t_{2})\right|^{2}\leq \operatorname {E} \left[|X_{t_{1}}|^{2}\right]\operatorname {E} \left[|X_{t_{2}}|^{2}\right]$$
The autocorrelation of

is an even function can be stated as
$$\operatorname {R} _{XX}(t_{1},t_{2})={\overline {\operatorname {R} _{XX}(t_{2},t_{1})}}$$
respectively for a WSS process:
$$\operatorname {R} _{XX}(\tau )={\overline {\operatorname {R} _{XX}(-\tau )}}.$$
For

is common practice in some disciplines (e.g. statistics and time series analysis) to normalize the autocovariance function to get a time-dependent Pearson correlation coefficient. However, in other disciplines (e.g. engineering) the normalization is usually dropped and the terms "autocorrelation" and "autocovariance" are used interchangeably. The definition of the autocorrelation coefficient of
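
The contrast between the two conventions can be sketched numerically; the AR(1) series, the coefficient 0.8, and the use of NumPy below are illustrative assumptions, not from the article.

```python
import numpy as np

def autocovariance(x, lag):
    """Sample autocovariance K(lag): subtract the mean, then average the lagged products."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    return np.mean((x[lag:] - mu) * (x[:x.size - lag] - mu))

def autocorr_coefficient(x, lag):
    """Statistics-style normalization: divide by the variance so the result lies in [-1, 1]."""
    return autocovariance(x, lag) / autocovariance(x, 0)

# Illustrative AR(1) series x[n] = 0.8 x[n-1] + noise; its lag-1 coefficient is about 0.8.
rng = np.random.default_rng(3)
x = np.zeros(5000)
for n in range(1, x.size):
    x[n] = 0.8 * x[n - 1] + rng.standard_normal()

print(autocovariance(x, 1))        # un-normalized ("engineering" usage): scale depends on the variance
print(autocorr_coefficient(x, 1))  # normalized: close to 0.8, scale-free
```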

is important both because the interpretation of the autocorrelation as a correlation provides a scale-free measure of the strength of statistical dependence, and because the normalization has an effect on the statistical properties of the estimated autocorrelations. The fact that the autocorrelation function $\operatorname {R} _{XX}$

is most often defined as the continuous cross-correlation integral of $f(t)$ with itself, at lag $\tau$:
$$R_{ff}(\tau )=\int _{-\infty }^{\infty }f(t+\tau ){\overline {f(t)}}\,{\rm {d}}t=\int _{-\infty }^{\infty }f(t){\overline {f(t-\tau )}}\,{\rm {d}}t$$
where ${\overline {f(t)}}$ represents
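
As a rough numerical check of this integral definition, the sketch below approximates $R_{ff}(\tau )$ for a one-sided exponential pulse, whose autocorrelation is known in closed form. The pulse, step size, and NumPy are assumptions made for illustration.

```python
import numpy as np

# Illustrative finite-energy signal: one-sided exponential pulse f(t) = exp(-t) for t >= 0,
# whose continuous autocorrelation is known in closed form: R_ff(tau) = exp(-|tau|) / 2.
dt = 1e-3
t = np.arange(0.0, 20.0, dt)
f = np.exp(-t)

def R_ff(tau):
    """Riemann-sum approximation of the lag integral (non-negative lags only, for brevity)."""
    shift = int(round(tau / dt))
    return np.sum(f[shift:] * f[:f.size - shift]) * dt

for tau in (0.0, 0.5, 1.0):
    print(R_ff(tau), np.exp(-abs(tau)) / 2)   # numerical value vs. closed form
```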

is not well defined for all time series or processes, because the mean may not exist, or the variance may be zero (for a constant process) or infinite (for processes with a distribution lacking well-behaved moments, such as certain types of power law). If $\left\{X_{t}\right\}$ is a wide-sense stationary process then the mean $\mu$ and

is that, unlike cross-sectional studies, in which different individuals with the same characteristics are compared, longitudinal studies track the same people, and so the differences observed in those people are less likely to be the result of cultural differences across generations, that is, the cohort effect. Longitudinal studies thus make observing changes more accurate and are applied in various other fields. In medicine,

is the expected value operator and the bar represents complex conjugation. Note that the expectation may not be well defined. Subtracting the mean before multiplication yields the auto-covariance function between times $t_{1}$ and $t_{2}$:
$$\operatorname {K} _{XX}(t_{1},t_{2})=\operatorname {E} \left[(X_{t_{1}}-\mu _{t_{1}}){\overline {(X_{t_{2}}-\mu _{t_{2}})}}\right]=\operatorname {E} \left[X_{t_{1}}{\overline {X}}_{t_{2}}\right]-\mu _{t_{1}}{\overline {\mu }}_{t_{2}}$$
Note that this expression

is the value (or realization) produced by a given run of the process at time $t$. Suppose that the process has mean $\mu _{t}$ and variance $\sigma _{t}^{2}$ at time $t$, for each $t$. Then



is used by both researchers and policymakers to better understand how Australians are aging and using health services, to prevent and manage ill-health and disability, and to guide health system decisions. 45 and Up is the largest ongoing study of healthy aging in the Southern Hemisphere. GUiNZ is New Zealand's largest ongoing longitudinal study. It follows approximately 11% of all NZ children born between 2009 and 2010. The study aims to look in depth at

is used in various digital signal processing algorithms. For a random vector $\mathbf {X} =(X_{1},\ldots ,X_{n})^{\rm T}$ containing random elements whose expected value and variance exist, the autocorrelation matrix is defined by
$$\operatorname {R} _{\mathbf {X} \mathbf {X} }\triangleq \operatorname {E} \left[\mathbf {X} \mathbf {X} ^{\rm {T}}\right]$$
where ${}^{\rm T}$ denotes
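
A sketch of estimating this matrix from samples; the Gaussian distribution, its mean and covariance, and NumPy are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

# Many draws of a real random vector X; the mean and covariance below are purely illustrative.
mean = np.array([1.0, 0.0, -1.0])
cov = np.array([[2.0, 0.5, 0.0],
                [0.5, 1.0, 0.3],
                [0.0, 0.3, 1.5]])
X = rng.multivariate_normal(mean, cov, size=200_000)

# Sample estimate of R_XX = E[X X^T]; entry (i, j) approximates E[X_i X_j].
R_XX = X.T @ X / X.shape[0]

print(R_XX.round(2))
# For a zero-mean vector this would coincide with the covariance matrix;
# here it differs from `cov` by the outer product of the mean vector.
print(np.allclose(R_XX, cov + np.outer(mean, mean), atol=0.05))
```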

the auto-covariance function:
$$\operatorname {K} _{XX}(\tau )=\operatorname {E} \left[(X_{t+\tau }-\mu ){\overline {(X_{t}-\mu )}}\right]=\operatorname {E} \left[X_{t+\tau }{\overline {X}}_{t}\right]-\mu {\overline {\mu }}$$
In particular, note that $\operatorname {K} _{XX}(0)=\sigma ^{2}$. It

the complex conjugate of $f(t)$. Note that the parameter $t$ in the integral is a dummy variable and is only necessary to calculate the integral. It has no specific meaning. The discrete autocorrelation $R$ at lag $\ell$ for

the power spectral density $S_{XX}$ via the Fourier transform:
$$\operatorname {R} _{XX}(\tau )=\int _{-\infty }^{\infty }S_{XX}(f)e^{i2\pi f\tau }\,{\rm {d}}f$$
$$S_{XX}(f)=\int _{-\infty }^{\infty }\operatorname {R} _{XX}(\tau )e^{-i2\pi f\tau }\,{\rm {d}}\tau .$$
For real-valued functions,
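
A discrete-time sanity check of this Fourier-pair relationship, assuming NumPy and using the FFT together with a circular (periodic) sample autocorrelation as finite-sample stand-ins for the continuous transforms above:

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.standard_normal(1024)       # one realization of a zero-mean discrete process
x = x - x.mean()
N = x.size

# Periodogram-style power spectrum estimate ...
S = np.abs(np.fft.fft(x)) ** 2 / N

# ... whose inverse DFT is the circular sample autocorrelation: a discrete,
# finite-sample counterpart of the Fourier pair quoted above.
r_from_spectrum = np.fft.ifft(S).real

# Direct circular autocorrelation for comparison.
r_direct = np.array([np.dot(x, np.roll(x, -k)) for k in range(N)]) / N

print(np.allclose(r_from_spectrum, r_direct))   # True
```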

the transposed matrix of dimensions $n\times n$. Written component-wise:
$$\operatorname {R} _{\mathbf {X} \mathbf {X} }={\begin{bmatrix}\operatorname {E} [X_{1}X_{1}]&\operatorname {E} [X_{1}X_{2}]&\cdots &\operatorname {E} [X_{1}X_{n}]\\\operatorname {E} [X_{2}X_{1}]&\operatorname {E} [X_{2}X_{2}]&\cdots &\operatorname {E} [X_{2}X_{n}]\\\vdots &\vdots &\ddots &\vdots \\\operatorname {E} [X_{n}X_{1}]&\operatorname {E} [X_{n}X_{2}]&\cdots &\operatorname {E} [X_{n}X_{n}]\end{bmatrix}}$$
If $\mathbf {Z}$

the above definition is often used without the normalization, that is, without subtracting the mean and dividing by the variance. When the autocorrelation function is normalized by mean and variance, it is sometimes referred to as the autocorrelation coefficient or autocovariance function. Given a signal $f(t)$, the continuous autocorrelation $R_{ff}(\tau )$

the autocovariance and autocorrelation can be expressed as a function of the time-lag, and that this would be an even function of the lag $\tau =t_{2}-t_{1}$. This gives the more familiar forms for the autocorrelation function
$$\operatorname {R} _{XX}(\tau )=\operatorname {E} \left[X_{t+\tau }{\overline {X}}_{t}\right]$$
and

the cohort considered the impact of religiosity, social status, and hearing loss on health outcomes.

Longitudinal study

Longitudinal studies are often used in social-personality and clinical psychology, to study rapid fluctuations in behaviors, thoughts, and emotions from moment to moment or day to day; in developmental psychology, to study developmental trends across the life span; in sociology, to study life events throughout lifetimes or generations; and in consumer research and political polling to study consumer trends. The reason for this

the definition of the autocorrelation function between times $t_{1}$ and $t_{2}$ is
$$\operatorname {R} _{XX}(t_{1},t_{2})=\operatorname {E} \left[X_{t_{1}}{\overline {X}}_{t_{2}}\right]$$
where $\operatorname {E}$


the design is used to uncover predictors of certain diseases. In advertising, the design is used to identify the changes that advertising has produced in the attitudes and behaviors of those within the target audience who have seen the advertising campaign. Longitudinal studies allow social scientists to distinguish short from long-term phenomena, such as poverty. If the poverty rate is 10% at

the function $\rho _{XX}$ is well defined, its value must lie in the range $[-1,1]$, with 1 indicating perfect correlation and −1 indicating perfect anti-correlation. For a wide-sense stationary (WSS) process, the definition is
$$\rho _{XX}(\tau )={\frac {\operatorname {K} _{XX}(\tau )}{\sigma ^{2}}}={\frac {\operatorname {E} \left[(X_{t+\tau }-\mu ){\overline {(X_{t}-\mu )}}\right]}{\sigma ^{2}}}.$$
The normalization

the health and well-being of children (and their parents) growing up in NZ.

Autocorrelation

Autocorrelation, sometimes known as serial correlation in the discrete time case, is the correlation of a signal with a delayed copy of itself as a function of delay. Informally, it is the similarity between observations of a random variable as a function of the time lag between them. The analysis of autocorrelation

the study for various reasons. Under longitudinal research methods, the reduction in the research sample will bias the remaining smaller sample. Practice effects are another problem: because subjects repeat the same procedure many times (potentially introducing autocorrelation), their performance may improve or deteriorate simply through repetition. The Study

the symmetric autocorrelation function has a real symmetric transform, so the Wiener–Khinchin theorem can be re-expressed in terms of real cosines only:
$$\operatorname {R} _{XX}(\tau )=\int _{-\infty }^{\infty }S_{XX}(f)\cos(2\pi f\tau )\,{\rm {d}}f$$
$$S_{XX}(f)=\int _{-\infty }^{\infty }\operatorname {R} _{XX}(\tau )\cos(2\pi f\tau )\,{\rm {d}}\tau .$$
The (potentially time-dependent) autocorrelation matrix (also called second moment) of

the temporal order of events. Longitudinal studies do not require large numbers of participants (as in the examples below). Qualitative longitudinal studies may include only a handful of participants, and longitudinal pilot or feasibility studies often have fewer than 100 participants. Longitudinal studies are time-consuming and expensive. Longitudinal studies cannot avoid an attrition effect: that is, some subjects cannot continue to participate in

the term is used interchangeably with autocovariance. Unit root processes, trend-stationary processes, autoregressive processes, and moving average processes are specific forms of processes with autocorrelation. In statistics, the autocorrelation of a real or complex random process is the Pearson correlation between values of the process at different times, as a function of the two times or of

the time lag. Let $\left\{X_{t}\right\}$ be a random process, and $t$ be any point in time ($t$ may be an integer for a discrete-time process or a real number for a continuous-time process). Then $X_{t}$

the variance $\sigma ^{2}$ are time-independent, and further the autocovariance function depends only on the lag between $t_{1}$ and $t_{2}$: the autocovariance depends only on the time-distance between the pair of values but not on their position in time. This further implies that
