Is There a Standard Method for Judging Experiments in Meta-Analysis?

  • Thread starter GFX
In summary, the conversation centers on telekinesis and its supposed relation to quantum theory. Some participants entertain the possibility of telekinesis, while others regard it as a pseudoscientific area to be approached critically. The discussion also touches on the chance of unexpected discoveries in the pursuit of supernatural phenomena, and on the use of science for various purposes, including entertainment. Ultimately, there is no solid evidence to support the existence of telekinesis, and any such claims should be met with skepticism.
  • #36
RE: "I don't know what happened there."

Are you sure he said that? I was thinking he would say something like "In the absence of definite knowledge, an answer to your query will not be forthcoming."
 
  • #37
Phew - you guys made quick work of Radin! I will read the paper when I get a chance, but it sounds like I'll be able to fly a UFO through the holes.

However, although it is tempting to dismiss the other work on the grounds that operator 10 may well have done a bit of 'automatic writing' when it came to recording data...

"operator 10," believed to be a PEAR staff member, "has been involved in 15% of the 14 million trials, yet contributed to a full half of the total excess hits" (McCrone 1994).

... let's not dismiss the possibility that operator 10 might have unusual abilities. Evidence suggesting that the results are not what they seem doesn't prove that nothing paranormal was at work. It just means that, in this case, operator 10 needs to stick to the role of participant, not investigator or other staff member. We don't want to dismiss conclusions outright on circumstantial grounds (e.g. that we can't rule out high jinks) any more than we want to accept them outright on circumstantial grounds (e.g. strong correlations or hearsay). Poorly designed work and wishful thinking do not rule out the possibility that there is something there worth investigating properly.
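
For a rough sense of why that quoted figure raises eyebrows, here's a back-of-envelope check. Only the 15% trial share and 50% excess-hit share come from the McCrone quote above; the rest is simple proportionality, nothing more: if excess hits accumulated at roughly the same rate for every operator, someone who ran 15% of the trials should account for roughly 15% of the excess, not half of it.

```python
# Back-of-envelope check on the quoted "operator 10" figure (McCrone 1994).
# Only the 15% trial share and 50% excess-hit share come from the quote;
# the comparison below is simple proportionality, nothing more.

op10_share_of_trials = 0.15   # operator 10 ran ~15% of the ~14 million trials
op10_share_of_excess = 0.50   # ...yet accounts for ~half of the excess hits

# If excess hits accumulated at roughly the same rate for every operator,
# an operator's share of the excess should track their share of the trials.
expected_share = op10_share_of_trials

print(f"expected share of excess hits: {expected_share:.0%}")
print(f"reported share of excess hits: {op10_share_of_excess:.0%}")
print(f"over-representation: {op10_share_of_excess / expected_share:.1f}x")
```

A better-than-threefold over-representation from a single operator is exactly the sort of thing that keeps the 'automatic writing' suspicion alive, even if it doesn't settle the matter.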
 
  • #38
RE: "Phew - you guys made quick work of Radin! I will read the paper when I get a chance, but it sounds like I'll be able to fly a UFO through the holes."

If you find reading the paper tough sledding, it won't be because they are smart. They simply can't write fer ****.
 
  • #39
I was wondering if we could backtrack a bit to meta-analysis. I’m curious about the statement made by JohnDubYa on page one:

“The individual studies do not have the same designs. Sure, the experimenters reject those studies that have sufficiently dissimilar designs (as if you can really define "similar"). But all this does is give them one subjective means of throwing out experiments that they know will not help their cause.”

I’d like to ask if there is a standard technique for judging which experiments are allowed into a meta-analysis and which aren’t — some kind of method that everybody can use to lessen the subjectivity of which data are included. Does such a thing exist?
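
To make concrete what I mean by "lessening the subjectivity": something like inclusion criteria written down before anyone looks at results, then applied mechanically to every candidate study. A toy sketch of the idea (the study fields, thresholds, and records here are entirely made up, and I'm not claiming this is how working meta-analysts actually encode it):

```python
# Toy illustration only: the study records and criteria below are invented.
# The point is that the inclusion rules are fixed in advance and applied
# identically to every candidate study.

candidate_studies = [
    {"id": "A", "double_blind": True,  "preregistered": True,  "n_trials": 5000},
    {"id": "B", "double_blind": False, "preregistered": True,  "n_trials": 1200},
    {"id": "C", "double_blind": True,  "preregistered": False, "n_trials": 300},
]

def meets_criteria(study):
    """Pre-specified inclusion rules, applied the same way to every study."""
    return (
        study["double_blind"]
        and study["preregistered"]
        and study["n_trials"] >= 1000
    )

included = [s["id"] for s in candidate_studies if meets_criteria(s)]
print("included:", included)   # -> included: ['A']
```

The particular rules aren't the point; the point is that whatever rules are chosen get applied to every study the same way, so nobody can quietly drop the experiments that don't help their cause.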
 
