Astronomers Discover a Group of Quasars 4 Billion Light Years Across

  • by History's Coming To (1059484) on Saturday January 12, 2013 @01:56PM (#42568475) Journal
    From the sounds of it this is a case of proximity rather than being gravitationally bound:

    Since 1982 it has been known that quasars tend to group together in clumps or ‘structures’ of surprisingly large sizes, forming large quasar groups or LQGs.

    Most things in space tend to cluster together - dust around stars forms planets, stars group together into galaxies, and there's a hierarchy of galactic clusters and superclusters, some of the largest of which can contain tens of thousands of galaxies. These large-scale structures aren't caused by gravity pulling galaxies together; it's more of an inbuilt clustering effect that originates in slight density fluctuations in the very early universe.

  • Big (Score:5, Interesting)

    by PPH (736903) on Saturday January 12, 2013 @02:16PM (#42568625)

    Based on the map in the linked article, it appears that this quasar group has an angular diameter of about 10 degrees. The moon is about 0.5 degrees across. So if it were bright enough to be visible, this structure would be the size of a constellation. Of course, if it were that bright, it would have fried most of the observable universe.
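A quick back-of-the-envelope check of those figures. The 10-degree extent is read off the article's map, and the distance used below is a rough placeholder, not a value from the article:

```python
import math

MOON_DEG  = 0.5    # angular diameter of the Moon, roughly
GROUP_DEG = 10.0   # rough angular extent read off the article's map

# How many full moons would span the structure
moons_across = GROUP_DEG / MOON_DEG
print(moons_across)   # 20.0

# Transverse size subtended by 10 degrees at an assumed distance D
# (D_GLY is an illustrative placeholder figure, not from the article)
D_GLY = 9.0
size_gly = D_GLY * math.radians(GROUP_DEG)   # small-angle approximation
print(round(size_gly, 2))
```

So 10 degrees is about 20 full-moon widths side by side, which is indeed constellation-sized.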

  • Descriptive entropy (Score:5, Interesting)

    by Okian Warrior (537106) on Saturday January 12, 2013 @02:28PM (#42568707) Homepage Journal

    Consider all the entities [stars, galaxies, or whatnot] in your study as points in 3-space. The descriptive length of the data is the total number of bits that describes the location of all points in your study.

    If all points are random and evenly distributed, then the total number of bits required is (number of points)x(number of bits for 1 location).

    Suppose you notice a clumping of points. Is this a structure or random variation?

    Rework your data description as follows: for each point, use the first bit to indicate whether it is a member of the clump, and use the subsequent bits to complete the description: coordinates relative to the clump if it is a member, general coordinates otherwise.

    For this description, the total number of bits required is 1x(total number of points) + (number of points in clump)x(number of bits for location relative to clump) + (number of points not in clump)x(number of bits for general location).

    If the 2nd description is shorter than the 1st description, then by Occam's razor the second description is more likely correct.

    In fact, the difference in bits directly gives the odds: if the 2nd description requires 10 fewer bits (total) than the 1st, then the 2nd description is more likely to be correct by a factor of 2^10 = 1024. Alternatively, there is a 1/1024 chance that the 2nd description is *not* the correct description of the data.

    If you have lots of data, it's not unusual for a description to be thousands of bits shorter than the baseline, meaning it's virtually certain that the new description is correct and that the new structure does not arise from random variation.

    I haven't seen the data, but I assume that describing all galaxies in the universe using the newly described "clump" as a categorical structure gives a smaller descriptive entropy than describing all galaxies without the extra category of "clump".
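A minimal sketch of this two-part description-length comparison on synthetic points. All names and numbers here are illustrative, and the small fixed cost of describing the clump's own position and size is ignored for simplicity:

```python
import random

def baseline_bits(points, bits_per_coord):
    # Every point gets full 3-D coordinates
    return len(points) * 3 * bits_per_coord

def two_part_bits(points, clump_lo, clump_size_bits, bits_per_coord):
    # One membership flag bit per point, then short local coordinates
    # for clump members and full coordinates for everything else
    total = len(points)                  # the flag bits
    side = 1 << clump_size_bits
    for p in points:
        inside = all(clump_lo[i] <= p[i] < clump_lo[i] + side
                     for i in range(3))
        total += 3 * (clump_size_bits if inside else bits_per_coord)
    return total

random.seed(0)
B = 16                                   # 16 bits per coordinate
clump_lo = (1000, 2000, 3000)            # corner of a small 64^3 clump box
# 800 uniform "background" points plus a 200-point clump
pts  = [tuple(random.randrange(1 << B) for _ in range(3)) for _ in range(800)]
pts += [tuple(lo + random.randrange(1 << 6) for lo in clump_lo)
        for _ in range(200)]

b1 = baseline_bits(pts, B)
b2 = two_part_bits(pts, clump_lo, 6, B)
print(b1, b2, b1 - b2)   # the clump model saves thousands of bits
```

Each clump member saves 3 x (16 - 6) = 30 bits but pays 1 flag bit, and each background point pays 1 flag bit, so the two-part description comes out about 5000 bits shorter here: per the odds argument above, overwhelming evidence that the clump is real structure rather than random variation.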

  • inflation ok here? (Score:4, Interesting)

    by vistapwns (1103935) on Saturday January 12, 2013 @02:47PM (#42568805)
    Curious question for you physicists or arm-chair physicists: does this have any implications for inflation? I've read here and there that large structures would be problematic for inflation, because nothing would have had time to propagate across that distance in the time available. Does this bump up against that limit, or break it?
  • by Zorpheus (857617) on Saturday January 12, 2013 @05:02PM (#42569771)
    With a redshift of 1.3, this quasar group is a fair fraction of the way across the observable universe. What we see is light from a time a few billion years after the big bang. But at that time the universe was much smaller, so these quasars were much closer together than they are now. They have been flying away from us ever since, in slightly different directions, and away from each other.
    What I think this means is: we cannot simply calculate the size of this group from its angular diameter and its distance; that has little to do with reality. The angular diameter comes from the slightly different directions in which the individual quasars recede from us, not necessarily from the group actually being this large. We can only see this quasar group as it was billions of years ago, when it was much smaller; we don't know what it looks like now. Our perception of the group's shape would also be distorted if the motions of its components were not caused purely by a homogeneous expansion of the universe.
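For a rough sense of the distances involved, here is a sketch that numerically integrates the comoving distance to z = 1.3 in a flat Lambda-CDM model. The H0 and density parameters are assumed round values, not figures from the article:

```python
import math

H0 = 70.0         # Hubble constant, km/s/Mpc (assumed round value)
OM = 0.3          # matter density parameter (assumed)
OL = 0.7          # dark-energy density parameter (assumed)
C  = 299792.458   # speed of light, km/s

def E(z):
    # Dimensionless expansion rate H(z)/H0 for a flat Lambda-CDM model
    return math.sqrt(OM * (1 + z) ** 3 + OL)

def comoving_distance_mpc(z, steps=10000):
    # Trapezoidal integration of (c/H0) * integral_0^z dz'/E(z')
    h = z / steps
    s = 0.5 * (1 / E(0) + 1 / E(z))
    for i in range(1, steps):
        s += 1 / E(i * h)
    return (C / H0) * s * h

d_c = comoving_distance_mpc(1.3)   # comoving distance today, ~4000 Mpc
d_a = d_c / (1 + 1.3)              # angular-diameter distance: smaller,
                                   # since we see the group as it was
print(round(d_c), round(d_a))
```

The gap between the two distances is exactly the point above: the group subtends its angle at the (much smaller) angular-diameter distance, so naively multiplying angle by present-day distance overstates its size.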
  • by History's Coming To (1059484) on Saturday January 12, 2013 @07:16PM (#42570667) Journal
    Fair call, I should clarify: Yes, the clustering happens because of gravity, but not because they were all spread out and gravity pulled them into a structure. The density fluctuations which cause a "structure" were there in the first few moments of the universe, what gravity does is amplify the effect and make the structure more obvious. If there was no gravity these would still be "structures", but they'd be identifiable as fractionally denser areas of matter rather than big, obvious, visible-from-billions-of-light-years-away structures. The structure is caused by the initial state of the universe, gravity makes it even more obvious.
  • by TapeCutter (624760) on Saturday January 12, 2013 @07:18PM (#42570691) Journal

    When Einstein said it looked the same in every direction, what he meant was that it's all governed by the same laws.

    Actually it's more than that; it's also about the distribution of matter and energy on a large scale. It's assumed that matter is homogeneous throughout the universe, and "homogeneous" literally means "no lumps" (above a certain size, defined as "local" in your post). It's like an ideal gas: at the microscopic level you have all sorts of random "pressure" (the kinetic energy of the individual atoms), while at the macroscopic level there is just one pressure, the same no matter what part of the gas you measure. That's because the macroscopic measurement is an average over all the individual microscopic pressures, and the central limit theorem of statistics says that the average of a big enough sample from a large population will always be very close to the true population average.

    In other words, the reason it's "odd" is that statistics says the observation can't be brushed aside as a fluke: if the distribution of quasars is lumpy, then either the basic assumption of large-scale homogeneity is wrong, or the observation is flawed. The OP's stupid question is by far the most insightful thing I've read about it so far: how are they defining the word "structure"?
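The central-limit-theorem point can be illustrated with a quick simulation: individual "microscopic" values are scattered all over, but averages over large samples agree closely with each other and with the population mean. The distribution and sample sizes below are arbitrary choices for illustration:

```python
import random
import statistics

random.seed(42)
# A "lumpy" microscopic quantity: exponentially distributed kinetic
# energies with population mean 1.0 (individual values vary wildly)
population = [random.expovariate(1.0) for _ in range(200_000)]
pop_mean = statistics.fmean(population)

# Macroscopic "measurements" = averages over big samples; the central
# limit theorem says these cluster tightly around the population mean
sample_means = [statistics.fmean(random.sample(population, 5000))
                for _ in range(50)]
spread = max(sample_means) - min(sample_means)
print(round(pop_mean, 3), round(spread, 3))
```

Every one of the 50 "macroscopic" averages lands within a few percent of the true mean, which is why a genuinely lumpy large-scale quasar distribution can't be waved away as sampling noise.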
