Can a supercomputer change the way we view statistics?

  • #1
Evo
This is a cool little video about statistics. I know some of the figures are right; most I haven't bothered to verify yet. Anyone who has statistics that agree or disagree is welcome to post them.

Of course, the part about the supercomputer's capacity and processing power doesn't mean it has the ability to think, so don't get your knickers in a knot thinking otherwise.

(I love that soundtrack).

 
  • #2
So no predictions of flying cars yet?
 
  • #4
The tune is from Vangelis:

 
  • #5
Vangelis is awesome.
 
  • #6
What the heck, I say we just go ahead and make MySpace a country.
 
  • #7
Andre said:
The tune is from Vangelis:

That's beautiful, thanks Andre!
 
  • #8
hypatia said:
What the heck, I say we just go ahead and make MySpace a country.

[Image: online_communities.png]
 
  • #9
Poop-Loops said:
So no predictions of flying cars yet?

The prediction of flying cars in the future is already in the past. :smile:

http://www.paleofuture.com/search/label/flying%20cars
 
  • #10
That's my point. It's 2008 and the only way to fly is to pay an arm and a leg.
 
  • #11
Lies, damned lies, and statistics.

But cool.
 
  • #12
The supercomputer predictions could actually happen. Intel has announced it is working on an 80-core processor, and future CPUs could contain hundreds of independent cores.
 
  • #13
You're still limited by the software. How exactly do you split up the operations among the cores? Moreover, you definitely start getting diminishing returns at some point, because you have to coordinate all of the cores, and that coordination itself takes time.

To make a very simplistic analogy, imagine you are doing a Riemann sum in some brute-force calculation that normally takes a week to finish. The easiest thing would be to split it up so that each of the 80 cores gets a chunk to work on, and then just add up the partial results.

The problem arises when you have methods/functions in your code that can't be split across CPUs; that serial portion ends up being your bottleneck.
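A minimal sketch of that chunking idea in Python, using the standard multiprocessing module. The integrand f and the interval are made up for illustration; the point is just that each worker gets an independent slice of the sum:

```python
# Sketch: a Riemann sum split across worker processes (illustrative only).
from multiprocessing import Pool

def f(x):
    # Hypothetical integrand, standing in for the expensive function.
    return x * x

def partial_sum(chunk):
    # Midpoint Riemann sum over one chunk [a, b] with n subintervals.
    a, b, n = chunk
    dx = (b - a) / n
    return sum(f(a + (i + 0.5) * dx) for i in range(n)) * dx

if __name__ == "__main__":
    a, b = 0.0, 1.0
    cores = 8                      # imagine 80 for the Intel chip
    width = (b - a) / cores
    chunks = [(a + i * width, a + (i + 1) * width, 1_000_000)
              for i in range(cores)]
    with Pool(cores) as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(total)                   # ~1/3 for f(x) = x^2 on [0, 1]
```

The map step is embarrassingly parallel; the final sum over the partial results is the small serial part.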
 
  • #14
Poop-Loops said:
You're still limited by the software. How exactly do you split up the operations among the cores? Moreover, you definitely start getting diminishing returns at some point, because you have to coordinate all of the cores, and that coordination itself takes time.

To make a very simplistic analogy, imagine you are doing a Riemann sum in some brute-force calculation that normally takes a week to finish. The easiest thing would be to split it up so that each of the 80 cores gets a chunk to work on, and then just add up the partial results.

The problem arises when you have methods/functions in your code that can't be split across CPUs; that serial portion ends up being your bottleneck.

Amdahl's law.
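For reference, the usual statement of Amdahl's law: if a fraction $p$ of the work can be parallelized across $N$ cores, the overall speedup is bounded by

$$S(N) = \frac{1}{(1 - p) + \frac{p}{N}} \le \frac{1}{1 - p}.$$

Even at $p = 0.95$, no number of cores gets you past a 20x speedup; the serial remainder dominates.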
 
  • #15
You raised some valid points. However, there are ingenious schemes for dividing the workload among processors; I believe one of them is MPI, and it's the basis for many supercomputers around the world. Blue Gene, for instance, has about 130,000 CPUs hooked together, half of which are used for communication. It's up to the programmer to utilize that computing power effectively; even when it's used inefficiently, the net result is still spectacular.
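A toy example of that message-passing pattern, using the mpi4py Python bindings (real Blue Gene codes are typically C or Fortran; the striping scheme and integrand here are just illustrations):

```python
# Sketch: each MPI rank computes a partial sum, rank 0 combines them.
# Run with, e.g.: mpiexec -n 4 python riemann_mpi.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()        # this process's id, 0..size-1
size = comm.Get_size()        # total number of processes

a, b, n = 0.0, 1.0, 8_000_000
dx = (b - a) / n

# Each rank handles every size-th subinterval (a simple striping scheme).
local = sum((a + (i + 0.5) * dx) ** 2 for i in range(rank, n, size)) * dx

# Combining the partial sums is the communication step.
total = comm.reduce(local, op=MPI.SUM, root=0)
if rank == 0:
    print(total)              # ~1/3
```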
 
  • #16
Of course. I'm just saying that there is bound to be a limit where shoving in more CPUs just won't do anything, no matter how brilliant the programmer.

As for your Blue Gene example, I'm not very familiar with it, but if half the CPUs are devoted to communication, then it's not really one program running but a batch of them communicating with each other. What I am saying is, if you have something like Doom running and you want it to run better, you can only do so much on the hardware side.

Of course, the nice thing is that programs are so complex these days that you can easily split up portions of a program among different CPUs. You're never going to have half of a program that can't be split across cores; there's just too much independent stuff going on for that.
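To put a number on that limit, here is the Amdahl bound from post #14 tabulated for a hypothetical program that is 90% parallelizable (the 90% figure is an assumption for illustration):

```python
# Amdahl speedup for a hypothetical 90%-parallel program.
p = 0.90
for n in (1, 2, 8, 80, 800, 8000):
    speedup = 1.0 / ((1.0 - p) + p / n)
    print(f"{n:>5} cores -> {speedup:5.2f}x")
# The speedup approaches 10x: going from 80 to 8000 cores barely helps.
```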
 
  • #17
The exponential computing curve seems to have hit a snag with voice recognition software a few years ago; it would be great to see some exponential improvements in that field...
 
  • #18
On globalization:

It seems the original video is by an American (probably a patriotic one).

I read somewhere that investors are moving to Asia (especially after the credit crunch), so it's not only Asians who are benefiting from outsourcing. Also, Microsoft is hiring for work in China (at my university ...)
 

Related to Can a supercomputer change the way we view statistics?

1. Can a supercomputer improve the accuracy of statistical analyses?

Yes, supercomputers are capable of processing large amounts of data at a much faster rate than traditional computers. This allows for more complex statistical analyses to be performed, resulting in more accurate results.

2. How can a supercomputer impact the field of statistics?

A supercomputer can greatly impact the field of statistics by allowing for the analysis of larger and more complex datasets. This can lead to new insights and discoveries, as well as improved data-driven decision making.

3. Are there any limitations to using a supercomputer for statistical analyses?

While supercomputers have greatly advanced statistical analyses, they do have limitations. They may not be able to handle certain types of data, such as unstructured data, and may require specialized programming and technical expertise to operate.

4. Can a supercomputer replace human statisticians?

No, a supercomputer cannot replace human statisticians. While they can perform complex calculations and analyses, they lack the critical thinking and problem-solving abilities of a human mind. Supercomputers should be seen as a tool to assist statisticians, not replace them.

5. How does using a supercomputer for statistical analyses impact the time and cost of research?

Using a supercomputer for statistical analyses can greatly reduce the time and cost of research. With its ability to process data at a much faster rate, researchers can analyze larger datasets in a shorter amount of time. This can also lead to cost savings, as it may eliminate the need for purchasing expensive hardware or software for statistical analyses.
