Interesting Map of the Sciences

In summary: Since the article was published in PNAS, it is hoist on its own petard.
  • #1
glondor
http://www.eigenfactor.org/map/maps.htm
 
  • #2
And this means what?
 
  • #3
Evo said:
And this means what?

I think it means biology is WAY cooler than physics, based on popularity, or something like that. :biggrin:
 
  • #4
Moonbear said:
I think it means biology is WAY cooler than physics, based on popularity, or something like that. :biggrin:
:smile:

I guess so.
 
  • #5
glondor said:
http://www.eigenfactor.org/map/maps.htm

No meteorology?! This map sucks.
 
  • #6
I'm not bad at biology; it was one of my favorite subjects, so the map rocks.

http://img70.imageshack.us/img70/6785/pyrodancepleaseft2.gif
 
  • #7
It sucks and is complete nonsense.

Biology (I would rather die than take that crap, or Chemistry).

I don't know, but most EE people hate anything like chemistry/bio... and are math freaks to some extent.
 
  • #8
I think these are fisheye visualizations of three-dimensional maps. If that's true, the perspective is the only reason some subjects show up bigger than others.
 
  • #9
rootX said:
It sucks and is complete nonsense.

Biology (I would rather die than take that crap, or Chemistry).


Heh, I know what you mean.
 
  • #10
Biology is awesome.
 
  • #11
But not as awesome as Biophysics ...
 
  • #12
All of science is awesome IMO.
 
  • #14
The only thing worse than biology is botany.
 
  • #15
Evo said:
And this means what?
Possibilities include:
  • That molecular and cellular biology receive a lot more funding than any other science.
  • That researchers in molecular and cellular biology publish a lot more papers than researchers in any other science.
  • That authors of molecular and cellular biology papers list a lot more references in their papers than do authors of papers in any other field.
  • That a map of the sciences based solely on citations provides a skewed view of the sciences.
 
  • #16
I'm still trying to figure out how Control Theory links to neuroscience and computer science, but not to probability, mathematics, or engineering. Maybe the field is just so old that there are no new contributions from those areas? Economics could probably benefit from a little control theory, too.
 
  • #17
Electrical engineering is spread around in multiple tiny disciplines, and mechanical engineering is non-existent.

The latter omission points out one huge bias: Journal selection. They didn't use any ASME journals! "Death studies" is a scientific pursuit, but mechanical engineering isn't?

Another bias is in the way people in different disciplines write papers. The list of references can be rather short in a mathematics, hard-sciences, or engineering journal paper. In the social sciences, papers whose list of references is longer than the body of the paper are the norm. A math paper with 20 references might well come back with the reviewer comment, "why so many references?" A linguistics paper with 50 references might well come back with the reviewer comment, "why so few references?"

A field where papers typically stand on their own merits will artificially suffer under the methodology apparently used by the developers of this map.
 
  • #18
rootX said:
It sucks and is complete nonsense.

Biology (I would rather die than take that crap, or Chemistry).

I don't know, but most EE people hate anything like chemistry/bio... and are math freaks to some extent.

That's because physicists and engineers are wusses and give up as soon as a subject gets challenging and complicated. :biggrin: :-p

*dons fireproof suit and runs for cover*
 
  • #19
Moonbear said:
That's because physicists and engineers are wusses and give up as soon as a subject gets challenging and complicated. :biggrin: :-p

*dons fireproof suit and runs for cover*
*finds a spare fireproof suit in moonbear's closet and dons it*

Nah. It's just that you biological and medical science people insert dozens of references to unrelated articles in your papers, making the subject appear to be challenging and complicated.

That technique sure fooled this "map".
 
  • #20
Meh, medical science doesn't have to worry about being right, only about being statistically significant.
 
  • #21
Nobody has explained what this "eigenfactor" score means...
 
  • #22
cepheid said:
Nobody has explained what this "eigenfactor" score means...
Described here:
Rosvall, M. and Bergstrom, C. T., "Maps of random walks on complex networks reveal community structure," Proceedings of the National Academy of Sciences USA 105:1118-1123 (2008)
arXiv preprint: http://arxiv.org/abs/0707.0609v3

My reading: The authors did something akin to website rankings by search engines. They
  • Categorized each of the 6,128 journals as belonging to one of several groups of science,
  • Extracted the citations from the 2004-2007 issues of those journals,
  • Eliminated citations to other articles in the same journal as the article in question, and
  • Analyzed the remaining network of citations.

Medicine and biology show up as such huge nodes because papers in journals classified as "medicine" and "molecular and cell biology" tend to have huge lists of references. Even more importantly, papers in journals classified as "medicine" reference articles in journals classified as "molecular and cell biology" (and vice versa). Those cross-grouping references really kick up the eigenfactor score. In other words, medicine and biology show up as such huge nodes in part because of observational bias.
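For the curious, here is a minimal sketch of that kind of ranking, assuming a PageRank-style random walk over a toy citation matrix. The group labels and counts are invented, and the paper's actual map-equation method differs in detail; this only illustrates the random-walk idea.

```python
import numpy as np

# Toy citation matrix: C[i][j] = citations from group i to group j,
# with same-journal self-citations (the diagonal) already removed.
# The groups and counts are invented for illustration.
groups = ["medicine", "cell bio", "physics"]
C = np.array([[0, 30, 2],
              [25, 0, 3],
              [1, 2, 0]], dtype=float)

# Row-normalize into transition probabilities for a random walker.
P = C / C.sum(axis=1, keepdims=True)

# Damped power iteration, as in PageRank; the stationary distribution
# is the fraction of time the walker spends at each group.
d, n = 0.85, len(groups)
rank = np.full(n, 1.0 / n)
for _ in range(100):
    rank = (1 - d) / n + d * rank @ P

print(dict(zip(groups, rank.round(3))))
```

The heavy medicine-to-cell-bio cross-citation dominates the walker's time, which is exactly the cross-grouping effect described above.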

Since the article was published in PNAS, it is hoist on its own petard. Quoting from the article,
We also exclude the only three major journals that span a broad range of scientific disciplines: Science, Nature, and Proceedings of the National Academy of Sciences; the broad scope of these journals otherwise creates an illusion of tighter connections among disciplines, when in fact few readers of the physics articles in Science are also close readers of the biomedical articles therein.
The real problem is of course that Science, Nature, and PNAS cannot be categorized. That the methodology necessarily has to exclude the three most prestigious journals in all of science says something might be amiss.
 
  • #24
The fact that eigenfactor is trademarked, and that the website says this is non-commercial rather than non-profit, of course brings up the profit motive.

So the next question for me is: what is the mechanism for profit? Some journals might see sales increase if librarians or others put stock in this scoring mechanism.
 
  • #25
ZapperZ said:
You guys must have missed a recent Nature News article on something similar. Unfortunately, unless you have a subscription to Nature, the free access to the article is now gone. I did, however, write a little bit on the article elsewhere:

http://physicsandphysicists.blogspot.com/2008/10/is-physics-better-than-biology.html

Zz.

Okay, biologists are more social, then, and don't mind giving credit to more people in more subspecialties! :-p :smile:
 
  • #26
Moonbear said:
Okay, biologists are more social, then, and don't mind giving credit to more people in more subspecialties! :-p :smile:

Heh, no, I think it means that some disciplines have a higher information content than others.
 
  • #27
Since the free access period has expired, I'll have to draw from Zapper's blog,
For example, for papers published in 1999, articles with 100 citations are 50 times more common in developmental biology than in aerospace engineering.
OMG! 100 citations! My reading is mostly in the field of aerospace engineering. If I ran across an article with 100 citations I would immediately regard it as suspect. We do not reference the Principia Mathematica, for example. An article should pretty much stand on its own merits in aerospace. References in aerospace exist to establish context and show the authors aren't just reinventing the H-infinity controller (and don't know that that is what they are doing). Continuing,
But if the citation counts are divided by the average number of citations per paper for the discipline in that year, the resulting statistical distributions are remarkably similar.
This is something that the developers of the eigenfactor™ metric did not do.
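A toy version of that normalization, with invented citation counts (not numbers from the Nature piece), looks like this:

```python
from collections import defaultdict

# Invented (field, citation count) pairs -- not data from the article.
papers = [("devel. biology", 120), ("devel. biology", 80),
          ("aerospace eng.", 9), ("aerospace eng.", 5)]

# Mean citations per paper within each field.
by_field = defaultdict(list)
for field, cites in papers:
    by_field[field].append(cites)
means = {f: sum(v) / len(v) for f, v in by_field.items()}

# Dividing each count by its field's mean puts both disciplines
# on one scale; the normalized values cluster around 1.
for field, cites in papers:
    print(f"{field}: {cites} -> {cites / means[field]:.2f}")
```

Each paper's count divided by its field's mean lands near 1 in both fields, which is the "remarkably similar distributions" effect the blog describes.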
 
  • #28
D H said:
Since the free access period has expired, I'll have to draw from Zapper's blog,
For example, for papers published in 1999, articles with 100 citations are 50 times more common in developmental biology than in aerospace engineering.

I'm not sure that's true either. I've rarely seen articles with anywhere close to 100 citations unless they are long review articles rather than original research. The norm in my field is around 30 to 40, give or take a dozen. The only place one routinely sees references exceeding 100 or so is in grant applications, which for good reason need to show you know the literature very well and have considered everything that could possibly be raised as a concern by a reviewer.

But it seems completely normal to have references outside your specific field... that's necessary. Just because someone is studying, for example, developmental biology doesn't mean they can ignore the literature on a gene they're studying as it applies to adults in another field. Drawing an artificial boundary between disciplines seems counterproductive.

If you're studying aerospace engineering, would you ignore a relevant article in a physics journal presenting some new research on certain materials, or in an electrical engineering journal if you were using those circuits to control your systems? Surely the notion of crossing disciplines isn't unheard of in physics or engineering, is it?
 
  • #29
Moonbear said:
I've rarely seen articles with anywhere close to 100 citations unless they are long review articles rather than original research. The norm in my field is around 30 to 40, give or take a dozen.
The norm in aerospace is 7, plus or minus 2. We know the cognitive limits of our fellow aerospace engineers.
 
  • #30
D H said:
The norm in aerospace is 7, plus or minus 2. We know the cognitive limits of our fellow aerospace engineers.

How is that a thorough literature review? Sounds rather lazy! :bugeye:
 
  • #31
Moonbear said:
How is that a thorough literature review? Sounds rather lazy! :bugeye:
Aerospace is an anomaly. A good chunk of the knowledge in aerospace is in the heads of those with well-aged flatulence (i.e., the old farts in the company). We can't publish a lot of what we know because of security and ITAR restrictions. The articles that are published pretty much stand on their own merit.

Suppose you come up with a spankin' new spacecraft control algorithm. It doesn't have much to compete with: the phase space control schemes used since the 60s, various optimal control schemes such as H-infinity control, and that's about it. You cite one or two articles on phase space control, one article on basic concepts of optimal control, two or three articles on specific optimal controls, and you're done. You don't have to cite the literature on the 'ilities (controllability, stability, fuel frugality, safety, ...) (yes, safety is an 'ility) because everyone who can read your article knows exactly what the metrics are -- and you had dang well better have shown in the text of the article that your spankin' new control algorithm spanks the existing lot of controllers when it comes to the 'ilities.
 
  • #32
tribdog said:
The only thing worse than biology is botany.

Hey! I resemble that remark. :) Didja ever eat cantaloupe, or spaghetti with tomato sauce?
No botany == no veggies, and pretty much no meat either.
 

