Abstract: In the standard cosmological framework, the Hubble diagram is interpreted by assuming that the light emitted by standard candles propagates in a spatially homogeneous and isotropic spacetime. However, the light from "point sources" -- such as supernovae -- probes the universe on scales where the homogeneity principle is no longer valid.
A modern MCMC framework (CosmoHammer) for cosmological parameter estimation.
"While Metropolis-Hastings is constrained by overheads, CosmoHammer is able to accelerate the sampling process from a wall time of 30 hours on a single machine to 16 minutes by the efficient use of 2048 cores. Such short wall times for complex data sets opens possibilities for extensive model testing and control of systematics."
Discovery of a group of quasars that appears to form a structure over 1 Gpc across. This is much larger than any previously suggested homogeneity scale in LambdaCDM.
"We argue using simple models that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability. We comment on the general implications of this view, and specifically question the application of classical probability theory to cosmology in cases where key questions are known to have no quantum answer."
"Observations of lensing of the 21-cm background from the dark ages will be capable of detecting M>~10^12 Msun/h mass halos, but will require futuristic experiments to overcome the contaminating sources."
You can fix the acoustic scale using low-z luminosity distance measurements instead of the high-z CMB measurement. This would be a useful consistency test for LCDM, and could also be used to constrain N_eff.
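As a schematic reminder (standard definitions, not taken from the paper): the angular acoustic scale compares the comoving sound horizon at recombination to the comoving distance to last scattering,

    \theta_* = \frac{r_s(z_*)}{D_M(z_*)}, \qquad
    r_s(z_*) = \int_{z_*}^{\infty} \frac{c_s(z)}{H(z)} \, dz .

The pre-recombination H(z) carries the radiation density, including the neutrino contribution parametrised by N_eff, so anchoring the distance scale at low z with supernovae (rather than with the CMB) turns a measurement of the acoustic scale into an independent handle on r_s, and hence on N_eff.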
They claim a ~100x speed increase when using GPUs compared with a single processor. Execution time on the GPUs is comparable to that of a 128-processor MPI implementation.
As soon as you move away from a perfectly homogeneous and isotropic cosmological model, what is meant by the "acceleration" of spacetime becomes ambiguous. It's important to be clear on what type of acceleration is being considered in any given observational or theoretical study, since different types can have very different cosmological implications.
The authors find a dipole in the variance of the Hubble flow, which they attribute to local structure; the same structure induces a ~0.5% variation in the distance to last scattering across the sky.
An analysis of the recent history of cosmological parameter estimation. "Of the 28 measurements of Omega_Lambda in our sample published since 2003, only 2 are more than 1 sigma from the WMAP results. Wider use of blind analyses in cosmology could help to avoid this."
The statistical anisotropy of the mean of the CMB temperature fluctuations is tested. The naive inflationary prediction is that the mean a_lm's are zero, but the authors find a deviation from this expectation for l = 221-240.
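For reference (the standard assumption being tested, not a result from the paper): statistical isotropy plus the simplest inflationary models predict

    \langle a_{\ell m} \rangle = 0, \qquad
    \langle a_{\ell m} a^*_{\ell' m'} \rangle = C_\ell \, \delta_{\ell \ell'} \delta_{m m'} ,

so a sample mean of the a_lm's that differs significantly from zero over some multipole range is the signal of interest here.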
Point sources from the WMAP 7 catalogue are stacked, and the results are found to be consistent with the WMAP beam models. The correctness of the beam models had been questioned by Shanks and others, who argued that inaccurate models could introduce a significant bias into the measured CMB power spectrum. The authors also find evidence for spectral steepening above 61 GHz, which changes the estimates of the unresolved point-source spectrum; accounting for this effect, the primordial power spectrum appears closer to scale invariant than previously thought.
Rather than being split into two components with distinct scale heights and stellar populations of different ages -- a thin disc and a thick disc -- the Milky Way seems to have "a continuous and monotonic distribution of disk thicknesses".