• Question: How close is the engineering community to creating the next supercomputer?

    Asked by Scarlett to Andrew, Lizzie, Nick, Sonia on 23 Jun 2015.

      Elizabeth Kapasa answered on 23 Jun 2015:


      Hi Scarlett, it’s a bit outside my expertise so I’m not sure. I know that lots of countries are continually trying to develop a super-supercomputer. Pretty incredible stuff.
      I only did a quick search but this is the most recent article I found:
      http://www.theinquirer.net/inquirer/news/2404397/nvidia-starts-building-the-worlds-most-powerful-supercomputer


      anon answered on 23 Jun 2015:


      There are many supercomputers in existence; they are mainly used in defence and academic research. They are built to provide computational power, usually measured in floating point operations per second, or flops.

      The fastest PCs run at around 100 GigaFlops, or 100 thousand million floating point operations per second. The most powerful supercomputer is in China: the Tianhe-2, benchmarking at 34 PetaFlops, or 34 thousand million million floating point operations per second. Next comes the Cray XK7, which runs at about half that speed, 17 PetaFlops.
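
      To get a feel for those numbers, here is a rough back-of-the-envelope sketch in Python. The 100 GigaFlops and 34 PetaFlops figures come from the paragraph above; the size of the example workload is invented purely for illustration.

          # Rough scale comparison of a fast PC against the Tianhe-2 supercomputer.
          # Speed figures are from the text above; the workload size is invented.

          PC_FLOPS = 100e9         # 100 GigaFlops = 100 thousand million ops/second
          TIANHE2_FLOPS = 34e15    # 34 PetaFlops = 34 thousand million million ops/second

          WORKLOAD = 1e20          # imaginary simulation needing 1e20 operations

          pc_seconds = WORKLOAD / PC_FLOPS
          tianhe2_seconds = WORKLOAD / TIANHE2_FLOPS

          print(f"Fast PC:  about {pc_seconds / (3600 * 24 * 365):.0f} years")   # ~32 years
          print(f"Tianhe-2: about {tianhe2_seconds / 60:.0f} minutes")           # ~49 minutes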

      IBM is in the process of building an even faster computer under the Blue Gene banner to beat the Tianhe-2. China wants to build the Tianhe-3, a more powerful computer, but the US has prohibited the export of the American processors it needs to China.

      What are these computers used for? In defence research, for modelling how nuclear bombs work. A better use is to provide more accurate weather forecasts, and to make forecasts accurate further into the future; currently they are only accurate to around 5 days. A more sinister use is to break encryption codes, either to spy on a country’s own citizens over the internet or on other governments. An encryption key which could take 20 years to crack on your home PC can take seconds on a supercomputer.
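
      To put that last claim into rough numbers, here is a small sketch; the keys-per-second rates are assumptions made up for illustration, not real benchmark figures.

          # How brute-forcing an encryption key scales with machine speed.
          # The keys-per-second rates below are invented for illustration only.

          KEYSPACE = 2 ** 56                # e.g. an old 56-bit DES-style key

          pc_keys_per_sec = 5e7             # assumed rate for a home PC
          super_keys_per_sec = 1e15         # assumed rate for a supercomputer

          # On average the key turns up after searching half the keyspace.
          tries = KEYSPACE / 2

          pc_years = tries / pc_keys_per_sec / (3600 * 24 * 365)
          super_seconds = tries / super_keys_per_sec

          print(f"Home PC:       about {pc_years:.0f} years")        # ~23 years
          print(f"Supercomputer: about {super_seconds:.0f} seconds") # ~36 seconds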


      Andrew Phillips answered on 24 Jun 2015:


      Hi Scarlett,

      It’s not really my area of expertise, but I’m lucky to have access to a supercomputer that we use to run some of our analyses. It’s pretty amazing to know that a really powerful computer is working on things even when you’re out of the office. While computers are able to run more and more sophisticated models, we need to be careful to think about whether we really understand them. The same goes for processing really large amounts of data: how do you know what is important and what isn’t?

      When we build complex models we normally have simplified versions of them, so we know what we’re expecting to see and can make sense of anything that is different. Often it’s the surprises that tell us the most!
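
      Here is a tiny made-up sketch of that idea in Python: a “complex” model of a thrown ball that includes air drag, checked against the simplified no-drag version, whose answer we know exactly. All the numbers are invented for illustration.

          # Minimal sketch: check a complex model against a simplified one.
          # Complex model: a thrown ball with air drag, solved numerically.
          # Simplified model: the textbook no-drag case, solvable exactly.

          g = 9.81       # gravity, m/s^2
          v0 = 20.0      # initial upward speed, m/s (made up)
          drag = 0.05    # made-up drag coefficient per unit mass, 1/m

          def flight_time_with_drag(dt=1e-4):
              """Step the ball forward in time until it lands again."""
              v, y, t = v0, 0.0, 0.0
              while y >= 0.0:
                  a = -g - drag * v * abs(v)   # drag always opposes the motion
                  v += a * dt
                  y += v * dt
                  t += dt
              return t

          simple_t = 2 * v0 / g                # exact no-drag flight time
          complex_t = flight_time_with_drag()

          print(f"No-drag model (exact): {simple_t:.2f} s")
          print(f"With drag (numeric):   {complex_t:.2f} s")
          # We expect drag to shorten the flight; if it didn’t, we’d go bug-hunting.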
