94JCGS03\P0001-------------------------------------------------------
Examples of Scientific Problems and Data Analyses in Demography,
Neurophysiology, and Seismology
David R. Brillinger
Examples of scientific problems and data analyses are presented for
the fields of demography, neurophysiology, and seismology. The
examples are connected by the involvement of space or time. The
demographic problem is to display quantities derived from spatially
aggregated data and associated measures of uncertainty. The
neurophysiological problem is to infer the presence of complex
pathways among groups of neurons given sequences of firing times.
There are two seismological problems: (1) to determine isoseismals
of recorded intensities following the Loma Prieta earthquake and
(2) to relate intensity and acceleration values measured at distinct locations.
The statistical analyses are connected to each other by the
application of smoothing in some form and by the provision of
consequent graphical displays.
Key Words: Aggregated data; Birth rate; Coherence; Contours;
Demography; Isoseismals; Locally weighted likelihood; Networks of
neurons; Neurophysiology; Partial coherence; Point process; Seismic
risk; Seismology; Spatial analysis; Time series.
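The keywords mention locally weighted likelihood and smoothing of spatially aggregated data. As a minimal sketch of this kind of spatial smoothing, a kernel-weighted (Nadaraya-Watson) local average over 2-D locations is shown below; the Gaussian kernel, the bandwidth, and the function name are illustrative choices, not the author's method.

```python
import numpy as np

def spatial_smooth(locations, values, grid, bandwidth=1.0):
    """Kernel-weighted (Nadaraya-Watson) smoother over 2-D locations.

    A hypothetical sketch of spatially local averaging: each grid point
    gets a weighted mean of the observed values, with Gaussian weights
    that decay with distance.
    """
    locations = np.asarray(locations, dtype=float)
    values = np.asarray(values, dtype=float)
    grid = np.asarray(grid, dtype=float)
    out = np.empty(len(grid))
    for i, g in enumerate(grid):
        d2 = np.sum((locations - g) ** 2, axis=1)   # squared distances
        w = np.exp(-0.5 * d2 / bandwidth ** 2)      # Gaussian kernel weights
        out[i] = np.sum(w * values) / np.sum(w)     # weighted local average
    return out
```

Evaluating such a smoother on a fine grid and contouring the result is one route to displays like the isoseismal maps the abstract describes.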
94JCGS03\P0023-------------------------------------------------------
Graphical Sensitivity Analysis for Multidimensional Scaling
Mary McFarlane and Forrest W. Young
This article introduces graphical sensitivity analysis for
multidimensional scaling. This new technique is designed to combat
two problems associated with multidimensional scaling analyses: the
possibility of local minima and the uncertainty regarding the sensitivity of the
solution to changes in the parameters. Graphical sensitivity analysis
is currently available in ViSta-MDS, a test bed for graphical model
examination. By graphically manipulating points in the solution
space, analysts may examine the sensitivity of the solution to changes
in the model parameters. Furthermore, the analyst may search for
alternative solutions that represent local minima. An example of
graphical sensitivity analysis using ViSta-MDS is described.
Key Words: Dynamic graphics; Interactive graphics; Local minima.
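The abstract describes manipulating points in the solution space to probe sensitivity. A numeric analogue of that idea, assuming raw stress as the badness-of-fit measure (ViSta-MDS's actual loss function is not specified here), is to perturb one configuration point and measure the change in stress:

```python
import numpy as np

def stress(D, X):
    """Raw stress: sum of squared differences between the target
    dissimilarities D and the interpoint distances of configuration X."""
    d = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    iu = np.triu_indices(len(X), 1)          # each pair counted once
    return float(((D[iu] - d[iu]) ** 2).sum())

def sensitivity(D, X, point, delta):
    """Move one configuration point by `delta` and report the change in
    stress -- a numeric stand-in for dragging a point interactively."""
    X2 = X.copy()
    X2[point] = X2[point] + delta
    return stress(D, X2) - stress(D, X)
```

A solution whose stress barely changes under such perturbations is insensitive to that point; a large jump flags a direction worth exploring for alternative local minima.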
94JCGS03\P0035-------------------------------------------------------
Fast Implementations of Nonparametric Curve Estimators
Jianqing Fan and James S. Marron
Recent proposals for implementation of kernel-based nonparametric
curve estimators are seen to be faster than naive direct
implementations by factors of up to several hundred. The main ideas behind
the two different approaches are made clear. Careful speed comparisons
in a variety of settings and using a variety of machines and software
are done. Various issues of computational accuracy and stability are
also discussed. Our speed tests show that the fast methods are as fast
or somewhat faster than methods traditionally considered very fast,
such as LOWESS and smoothing splines.
Key Words: Binning; Fast computation; Kernel methods; Nonparametric
curve estimation; Smoothing; Updating.
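The keywords point to binning as one of the fast-computation ideas. A minimal sketch of a binned kernel density estimate is shown below: the data are first binned onto an equally spaced grid, and the kernel is then applied by discrete convolution, so it is evaluated only at grid spacings rather than at every data-grid pair. The nearest-bin rule and Gaussian kernel here are simplifications; the paper's implementations are more refined.

```python
import numpy as np

def binned_kde(x, grid, bandwidth):
    """Binned kernel density estimate (simple sketch).

    Bin counts onto an equally spaced grid, then convolve with the
    Gaussian kernel sampled at grid spacings.  The convolution costs
    O(grid^2) naively (or O(grid log grid) via FFT) instead of
    O(n * grid) direct kernel evaluations.
    """
    x = np.asarray(x, dtype=float)
    g0, dg = grid[0], grid[1] - grid[0]
    n = len(grid)
    counts = np.zeros(n)
    idx = np.clip(np.round((x - g0) / dg).astype(int), 0, n - 1)
    np.add.at(counts, idx, 1.0)                      # nearest-bin binning
    offsets = np.arange(-(n - 1), n) * dg            # kernel on grid spacings
    k = np.exp(-0.5 * (offsets / bandwidth) ** 2)
    k /= bandwidth * np.sqrt(2.0 * np.pi)
    dens = np.convolve(counts, k, mode="valid")      # length-n result
    return dens / len(x)
```

Because the kernel is evaluated on at most 2n - 1 distinct spacings, the cost no longer grows with the sample size once the data are binned, which is the source of the speedups the abstract reports.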
94JCGS03\P0057-------------------------------------------------------
The Construction and Properties of Boundary Kernels
for Smoothing Sparse Multinomials
Jianping Dong and Jeffrey S. Simonoff
In recent years several authors have investigated the use of
smoothing methods for sparse multinomial data. In particular, Hall
and Titterington (1987) studied kernel smoothing in detail. It is
pointed out here that the bias of kernel estimates of probabilities for cells
near the boundaries of the multinomial vector can dominate the mean sum
of squared error of the estimator for most true probability vectors.
Fortunately, boundary kernels devised to correct boundary effects for
kernel regression estimators can achieve the same result for these
estimators. Properties of estimates based on boundary kernels are
investigated and compared to unmodified kernel estimates and maximum
penalized likelihood estimates. Monte Carlo evidence indicates that
the boundary-corrected kernel estimates usually outperform uncorrected
kernel estimates and are quite competitive with penalized likelihood
estimates.
Key Words: Bias; Boundary kernel; Maximum penalized likelihood; Mean
sum of squared error; Smoothing methods.
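To make the boundary issue concrete, the sketch below kernel-smooths multinomial cell probabilities with a cut-and-normalize adjustment: near the ends of the cell index the kernel window is truncated, so the weights are renormalized to sum to one. This is only a simple stand-in for the boundary kernels constructed in the paper, which are designed to correct the boundary bias more carefully.

```python
import numpy as np

def smooth_multinomial(counts, bandwidth):
    """Kernel-smoothed cell probabilities with cut-and-normalize
    boundary handling (an illustrative stand-in, not the paper's
    boundary kernels).
    """
    counts = np.asarray(counts, dtype=float)
    p = counts / counts.sum()             # raw cell probabilities
    k = len(p)
    idx = np.arange(k)
    out = np.empty(k)
    for j in range(k):
        w = np.exp(-0.5 * ((idx - j) / bandwidth) ** 2)
        w /= w.sum()                      # renormalize the truncated window
        out[j] = np.sum(w * p)
    return out / out.sum()                # keep a valid probability vector
```

Without some boundary adjustment, cells at the ends of the vector borrow weight from outside the support, producing the boundary bias that the abstract says can dominate the mean sum of squared error.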
94JCGS03\P0067-------------------------------------------------------
Confidence Intervals for Discrete Approximations to Ill-Posed Problems
Bert W. Rust and Dianne P. O'Leary
We consider the linear model $\mbm{Y} = \mbm{X}\mbm{\beta} + \mbm{\epsilon}$
that is obtained by discretizing a system of first-kind integral equations
describing a set of physical measurements. The $n$-vector
$\mbm{\beta}$ represents the desired quantities, the $m \times n$
matrix $\mbm{X}$ represents the instrument response functions, and the
$m$-vector $\mbm{Y}$ contains the measurements actually obtained.
These measurements are corrupted by random measuring errors $\mbm{\epsilon}$
drawn from a distribution with zero mean vector and known variance
matrix. Solution of first-kind integral equations is an
ill-posed problem, so the least squares solution for the above model is
a highly unstable function of the measurements, and the classical confidence
intervals for the solution are too wide to be useful. The solution
can often be stabilized by imposing physically motivated nonnegativity
constraints. In a previous paper (O'Leary and Rust 1986) we developed
a method for computing sets of nonnegatively constrained simultaneous confidence
intervals. In this article we briefly review the simultaneous
intervals and then show how to compute nonnegativity-constrained
one-at-a-time confidence intervals. The technique gives valid
confidence intervals even for problems with $m < n$.
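The stabilizing effect of a nonnegativity constraint can be illustrated with a small nonnegative least squares solver. The projected-gradient routine below is a hypothetical sketch, assuming the standard NNLS formulation $\min_{\beta \ge 0} \|\mbm{Y} - \mbm{X}\mbm{\beta}\|^2$; the paper's confidence-interval machinery is not reproduced here.

```python
import numpy as np

def nnls_pg(X, y, steps=5000):
    """Nonnegative least squares via projected gradient descent.

    Minimizes ||y - X b||^2 subject to b >= 0 by taking gradient steps
    and projecting onto the nonnegative orthant.  Step size is set from
    the spectral norm of X to guarantee convergence.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    beta = np.zeros(X.shape[1])
    lr = 1.0 / np.linalg.norm(X, 2) ** 2          # 1 / sigma_max(X)^2
    for _ in range(steps):
        grad = X.T @ (X @ beta - y)               # least-squares gradient
        beta = np.maximum(0.0, beta - lr * grad)  # project onto beta >= 0
    return beta
```

Clipping at zero rules out the wildly oscillating sign-alternating solutions that ill-posed least squares produces, which is why the constrained intervals can be far narrower than the classical ones.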