## Astronomers Discover a Group of Quasars 4 Billion Light Years Across

New submitter mal0rd writes

*"New Scientist reports a 'collection of galaxies that is a whopping four billion light years long is the biggest cosmic structure ever seen. The group is roughly one-twentieth the diameter of the observable universe – big enough to challenge a principle dating back to Einstein, that, on large scales, the universe looks the same in every direction.' For reference, Andromeda is only 2.5 million light years away."*
## Re:Stupid (but serious) Question (Score:5, Interesting)

Most things in space tend to cluster together: dust around stars forms planets, stars group together in galaxies, and galaxies form a hierarchy of clusters and superclusters, some of the largest of which contain tens of thousands of galaxies. These largest-scale structures aren't caused by gravity pulling galaxies together; the clustering is inbuilt, originating in slight density fluctuations in the very early universe.

## Big (Score:5, Interesting)

Based on the map in the linked article, this quasar group appears to have an angular diameter of about 10 degrees. The Moon is about 0.5 degrees. So if it were bright enough to be visible, this structure would span the sky like a constellation. Of course, if it were that bright, it would have fried most of the observable universe.
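A quick sanity check on those figures (the 10-degree and 0.5-degree values are the comment's own rough estimates, not measured quantities):

```python
# Rough figures from the comment above (estimates, not measured values):
# the quasar group spans ~10 degrees on the sky, the Moon ~0.5 degrees.
structure_deg = 10.0
moon_deg = 0.5

diameter_ratio = structure_deg / moon_deg  # how many Moon-widths across
area_ratio = diameter_ratio ** 2           # ratio of sky area covered

print(diameter_ratio)  # 20.0
print(area_ratio)      # 400.0
```

Twenty Moon-diameters is indeed constellation scale: the Big Dipper spans roughly 25 degrees end to end.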

## Descriptive entropy (Score:5, Interesting)

Consider all the entities [stars, galaxies, or whatnot] in your study as points in 3-space. The descriptive length of the data is the total number of bits that describes the location of all points in your study.

If all points are random and evenly distributed, then the total number of bits required is (number of points)x(number of bits for 1 location).

Suppose you notice a clumping of points. Is this a structure or random variation?

Rework your data description as follows: for each point, use the first bit to record whether the point is a member of the clump, and use subsequent bits to complete the description, which depends on whether the point is in the clump.

For this description, the total number of bits required is 1x(total number of points) + (number of points in clump)x(number of bits for location relative to clump) + (number of points not in clump)x(number of bits for general location).

If the 2nd description is shorter than the 1st description, then by Occam's razor the second description is more likely correct.

In fact, the number of bits directly tells the probability that the 2nd description is correct: if the 2nd description requires 10 fewer bits (total) than the 1st, then the 2nd description is more likely to be correct by a factor of 1024. Alternatively, there is a 1/1024 chance that the 2nd description is *not* the correct description of the data.

If you have lots of data, it's not unusual for a description to be thousands of bits shorter than the baseline, meaning it's virtually certain that the new description is correct and that the new structure does not arise from random variation.
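The bit accounting above can be made concrete with a toy example. All numbers below (16-bit addresses, an 8-bit clump window, 1000 points with 300 in the clump) are invented for illustration:

```python
# Toy sky: positions are quantized to 2**16 cells, so a full "general
# location" costs 16 bits. The clump spans only 2**8 cells, so a
# location relative to the clump costs 8 bits. Numbers are invented.
GENERAL_BITS = 16
CLUMP_BITS = 8

n_total = 1000
n_clump = 300  # points that fall inside the clump window

# Description 1: every point gets a full-resolution address.
baseline_bits = n_total * GENERAL_BITS

# Description 2: one flag bit per point, then a short address for
# clump members and a full address for everyone else.
clumped_bits = (n_total * 1
                + n_clump * CLUMP_BITS
                + (n_total - n_clump) * GENERAL_BITS)

saved = baseline_bits - clumped_bits
print(baseline_bits, clumped_bits, saved)  # 16000 14600 1400
# Saving 1400 bits favors the clump description by a factor of
# 2**1400 on the comment's Occam's-razor reading.
```

So even a modest clump in a modest dataset produces a saving far beyond the "thousands of bits" threshold where random variation is effectively ruled out.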

I haven't seen the data, but I assume that describing all galaxies in the universe using the newly described "clump" as a categorical structure gives a smaller descriptive entropy than describing all galaxies without the extra category of "clump".


## Thought on size and distance (Score:5, Interesting)

What I think this means is: we cannot simply calculate the size of this group from its angular diameter and its distance; that figure doesn't directly reflect reality. The angular diameter comes partly from the different directions in which the individual quasars are receding from us, not just from the group actually being that large. We can only see this quasar group as it was billions of years ago, when it was much smaller; we don't know what it looks like now. Our perception of the group's shape would also be distorted if the directions its components are moving were not caused purely by homogeneous expansion of the universe.


## Central limit theorem (Score:5, Interesting)

When Einstein said it looked the same in every direction, what he meant was that it's all governed by the same laws.

Actually it's more than that: it's also about the distribution of matter and energy on large scales. The assumption is that matter is homogeneous throughout the universe, and "homogeneous" literally means "no lumps" (above a certain scale, the "local" in your post). It's like an ideal gas: at the microscopic level you have all sorts of random "pressure" (the kinetic energy of individual atoms), but at the macroscopic level there is a single pressure that is the same no matter which part of the gas you measure. That's because a macroscopic measurement is an average over all the individual microscopic pressures, and the central limit theorem of statistics says that the average of a big enough sample from a large population will always be very close to the true population average.
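The ideal-gas analogy is easy to simulate. Here the "population" is just uniform random numbers standing in for per-atom kinetic energies, with an invented range; the point is only how sample averages behave:

```python
import random
import statistics

random.seed(0)  # fixed seed so the run is reproducible

# "Microscopic pressures": one random value per atom (invented units).
population = [random.uniform(0.0, 2.0) for _ in range(100_000)]
true_mean = statistics.fmean(population)

# "Macroscopic" measurements: averages over ever-larger samples.
for n in (10, 1_000, 100_000):
    sample_mean = statistics.fmean(random.sample(population, n))
    print(n, round(abs(sample_mean - true_mean), 4))
# The deviation shrinks roughly like 1/sqrt(n): a big enough sample
# always sits very close to the population average, which is why a
# genuine lump on the largest scales is statistically surprising.
```

The same logic is why a 4-billion-light-year clump is a problem: at that sample size, the averaging should have already washed out any lumpiness.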

In other words, the reason it's "odd" is that statistics says the observation can't be brushed aside as a fluke: if the distribution of quasars is lumpy, then either the basic assumption of large-scale homogeneity is wrong or the observation is flawed. The OP's "stupid" question is by far the most insightful thing I've read about this so far: how are they defining the word "structure"?