Explore the Too-Big-to-Fail Problem: A Comprehensive Assessment

  • Thread starter wolram
  • #1
wolram
There is a lot to digest here; have a browse and see what you think. arXiv:1508.02715
Comprehensive Assessment of the Too-Big-to-Fail Problem
Fangzhou Jiang (1), Frank C. van den Bosch (1) ((1) Yale University)
Comments: 19 pages, 10 figures, accepted for publication in MNRAS
Subjects: Cosmology and Nongalactic Astrophysics (astro-ph.CO); Astrophysics of Galaxies (astro-ph.GA)

We use a semi-analytical model for the substructure of dark matter haloes to assess the too-big-to-fail (TBTF) problem. The model accurately reproduces the average subhalo mass and velocity functions, as well as their halo-to-halo variance, in N-body simulations. We construct thousands of realizations of Milky Way (MW) size host haloes, allowing us to investigate the TBTF problem with unprecedented statistical power. We examine the dependence on host halo mass and cosmology, and explicitly demonstrate that a reliable assessment of TBTF requires large samples of hundreds of host haloes. We argue that previous statistics used to address TBTF suffer from the look-elsewhere effect and/or disregard certain aspects of the data on the MW satellite population. We devise a new statistic that is not hampered by these shortcomings, and, using only data on the 9 known MW satellite galaxies with $V_{\rm max}>15\,{\rm km\,s^{-1}}$, demonstrate that $1.4^{+3.3}_{-1.1}\%$ of MW-size host haloes have a subhalo population in statistical agreement with that of the MW. However, when using data on the MW satellite galaxies down to $V_{\rm max}=8\,{\rm km\,s^{-1}}$, this MW consistent fraction plummets to $<5\times10^{-4}$ (at 68% CL). Hence, if it turns out that the inventory of MW satellite galaxies is complete down to 8 km/s, then the maximum circular velocities of MW satellites are utterly inconsistent with $\Lambda$CDM predictions, unless baryonic effects can drastically increase the spread in $V_{\rm max}$ values of satellite galaxies compared to that of their subhaloes.
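To get a feel for what an "MW-consistent fraction" of host haloes means in practice, here is a rough toy sketch in Python. Everything in it (the power-law subhalo velocity function, the made-up satellite V_max values, and the KS-based agreement criterion) is my own invented stand-in, not the statistic the paper actually devises:

[code=python]
# Toy sketch only: every number and the KS-based "agreement" criterion below are
# invented stand-ins, NOT the statistic devised in the paper.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(42)

# Hypothetical observed satellite V_max values (km/s) above a 15 km/s threshold.
vmax_obs = np.array([18.0, 20.0, 22.0, 25.0, 35.0, 45.0, 60.0, 75.0, 90.0])

def mock_subhalo_vmax(n_sub=200, slope=-3.0, vmin=15.0, vcut=120.0):
    """Draw subhalo V_max values from a toy power-law velocity function dN/dV ~ V^slope."""
    u = rng.uniform(size=n_sub)
    a = slope + 1.0
    return (vmin**a + u * (vcut**a - vmin**a)) ** (1.0 / a)  # inverse-CDF sampling

n_hosts = 2000
consistent = 0
for _ in range(n_hosts):
    vmax_mock = mock_subhalo_vmax()
    vmax_mock = vmax_mock[vmax_mock > 15.0]      # same observational threshold
    if len(vmax_mock) == 0:
        continue
    # Call the mock host "MW-consistent" if a 2-sample KS test cannot reject
    # agreement between its subhalo V_max distribution and the observed one.
    _, p = ks_2samp(vmax_mock, vmax_obs)
    if p > 0.05:
        consistent += 1

print(f"Toy MW-consistent fraction: {consistent / n_hosts:.3f}")
[/code]

The paper's actual statistic is more careful than this (it is designed to avoid the look-elsewhere effect), but the basic idea of comparing many mock subhalo populations against the observed satellite V_max values is the same.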
 
  • #2
Personally, I would have highlighted this sentence of the abstract:
Hence, if it turns out that the inventory of MW satellite galaxies is complete down to 8 km/s, then the maximum circular velocities of MW satellites are utterly inconsistent with [itex]\Lambda[/itex]CDM predictions, unless baryonic effects can drastically increase the spread in [itex]V_{\rm max}[/itex] values of satellite galaxies compared to that of their subhaloes.

Following on from what I said in https://www.physicsforums.com/threads/is-the-lcdm-model-correct.825595/#post-5184192 ("Once a model derived from interpreted data has been established to be the standard one, it will take 'extraordinary evidence' to support the 'extraordinary claim' that it might be wrong"), might this paper actually provide what will be considered such "extraordinary evidence"? Probably not...

They say
In this paper we focus on the Too Big To Fail problem (hereafter simply ‘TBTF’), which is generally considered the most difficult to reconcile with ΛCDM, and which has spurred a frenzy of papers offering possible solutions and/or advocating modifications of the standard paradigm. This includes suggestions to change the nature of the dark matter from ‘cold’ to either ‘warm’ or ‘self-interacting’ (e.g., Macciò & Fontanot 2010; Vogelsberger, Zavala & Loeb 2012; Lovell et al. 2012; Anderhalden et al. 2013; Rocha et al. 2013; Shao et al. 2013; Polisensky & Ricotti 2014), relatively small changes in the normalization, [itex]\sigma_8[/itex], and/or spectral index, [itex]n_s[/itex], of the initial power spectrum (e.g., Polisensky & Ricotti 2014), a highly stochastic star formation efficiency for galactic subhaloes, so that a fraction of the more massive subhaloes remain dark (e.g., Kuhlen, Madau & Krumholz 2013; Rodriguez-Puebla, Avila-Reese & Drory 2013a,b), lowering the mass of the MW host halo to [itex]\sim 10^{11.8}\, h^{-1} M_{\odot}[/itex] (Di Cintio et al. 2011; Wang et al. 2012; Vera-Ciro et al. 2013), and enhanced tidal (impulsive) heating of satellite galaxies due to the stellar disk of the Milky Way (Zolotov et al. 2012; Brooks & Zolotov 2014; Arraki et al. 2014).

Has anyone tried Scalar field dark matter to solve this TBTF problem?

From A brief Review of the Scalar Field Dark Matter model
On the other hand, we also studied the implications of a SFDM/BEC model at galactic scales. We find that the SFDM/BEC model gives a constant density profile that is consistent with RCs of dark matter dominated galaxies. The profile is as good as one of the most frequently used empirical core profiles, but with the advantage of coming from a solid theoretical frame. We fit data within 1 kpc and found a logarithmic slope [itex]\alpha = -0.27\pm 0.18[/itex], in perfect agreement with a core.
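For what it's worth, here is a minimal sketch of how an inner logarithmic slope like the quoted [itex]\alpha = -0.27 \pm 0.18[/itex] is typically extracted: fit a straight line to log(density) versus log(radius) inside 1 kpc. The radii and density values below are fabricated purely to show the mechanics, not taken from the review.

[code=python]
# Minimal sketch: fit the inner logarithmic density slope alpha from rho(r) data
# inside 1 kpc. The radii and densities below are fabricated for illustration.
import numpy as np

r = np.array([0.1, 0.2, 0.3, 0.5, 0.7, 1.0])                   # kpc (hypothetical)
rho = np.array([1.00e8, 8.3e7, 7.4e7, 6.5e7, 5.9e7, 5.4e7])    # Msun/kpc^3 (fabricated)

# Least-squares fit of log10(rho) = alpha * log10(r) + const.
alpha, const = np.polyfit(np.log10(r), np.log10(rho), deg=1)
print(f"Inner logarithmic slope alpha ~ {alpha:.2f}")   # close to 0 => cored profile
[/code]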

Garth
 
  • #3
Hi wolram:

wolram said:
assess the too-big-to-fail (TBTF) problem

The discussion appears to be interesting, except I have no idea what the TBTF problem is. Can you provide a definition and perhaps a link to an article that discusses in some detail what the TBTF problem is all about, rather than solutions?

Regards,
Buzz
 
  • #4
Buzz Bloom said:
The discussion appears to be interesting, except I have no idea what the TBTF problem is. Can you provide a definition and perhaps a link to an article that discusses in some detail what the TBTF problem is all about, rather than solutions?

Did you read the first paragraph of the Introduction section of the arXiv paper to which wolram linked?
 
  • #5
The inventory of MW satellite galaxies is not complete, as evidenced by this paper, which also appeared yesterday: http://arxiv.org/abs/1508.02381, "Digging deeper into the Southern skies: a compact Milky-Way companion discovered in first-year Dark Energy Survey data".

"The Dark Energy Survey (DES) is a 5000 sq. degree survey in the southern hemisphere, which is rapidly reducing the existing north-south asymmetry in the census of MW satellites and other stellar substructure. We use the first-year DES data down to previously unprobed photometric depths to search for stellar systems in the Galactic halo, therefore complementing the previous analysis of the same data carried out by our group earlier this year. Our search is based on a matched filter algorithm that produces stellar density maps consistent with stellar population models of various ages, metallicities, and distances over the survey area. The most conspicuous density peaks in these maps have been identified automatically and ranked according to their significance and recurrence for different input models. We report the discovery of one additional stellar system besides those previously found by several authors using the same first-year DES data. The object is compact, and consistent with being dominated by an old and metal-poor population. DES J0034-4902 is found at high significance and appears in the DES images as a compact concentration of faint blue point sources at ~ 87 {kpc}. Its half-light radius of r_h = 9.88 +/- 4.31 {pc} and total luminosity of M_V ~ -3.05_{-0.42}^{+0.69} are consistent with it being a low mass halo cluster. It is also found to have a very elongated shape. In addition, our deeper probe of DES 1st year data confirms the recently reported satellite galaxy candidate Horologium II as a significant stellar overdensity. We also infer its structural properties and compare them to those reported in the literature."
 
  • #6
Hi @George:

George Jones said:
Did you read the first paragraph of Introduction section of the arXiv paper to which wolram linked?

Here is the "definition" from the introduction:
the overabundance of massive, dense subhaloes predicted by CDM compared to the observed number of relatively luminous galaxies of the Milky Way or the Local Group.
I need some help understanding what this means.
(1) What does "relatively luminous galaxies of the Milky Way" mean?
(2) If "of the Milky Way, or" is ignored, what is a massive subhalo?
(3) Why is the mismatch between CDM and observation called a "too-big-to-fail" problem?

Thanks for your post,
Buzz
 
  • #7
Hi @Chronos:

Chronos said:
The inventory of MW satellite galaxies is not complete

If I understand your post correctly, you are pointing out that the apparent TBTF mismatch between CDM and observation is not necessarily a real mismatch because the survey from which these observations were collected is incomplete. Is that correct?

Thanks for your post,
Buzz
 
  • #8
Buzz Bloom said:
the overabundance of massive, dense subhaloes predicted by CDM compared to the observed number of relatively luminous galaxies of the Milky Way or the Local Group

I think that the actual quote is "the overabundance of massive, dense subhaloes predicted by CDM compared to the observed number of relatively luminous satellite galaxies of the Milky Way or the Local Group"

This means "relatively bright galaxies that we we see orbiting our galaxy or orbiting (i.e., satellite) other large members of our Local Group of galaxies.

A halo is something that surrounds something else. We think that dark matter halos surround the normal matter in galaxies.

Try Sean Carroll's exposition:


http://www.preposterousuniverse.com...ies-that-are-too-big-to-fail-but-fail-anyway/
 
  • #9
Hi George:

Thanks very much for your prompt and excellent answers to my questions. The Sean Carroll article you cited is also an excellent explanation.

Regards,
Buzz
 

