Astronomers led by the California Institute of Technology (Caltech) have created the most detailed and accurate model yet of the Milky Way galaxy's formation.

The simulation also addresses a long-standing mystery involving the smaller galaxies that should surround the larger Milky Way. Previous simulations predicted that thousands of these smaller galaxies should orbit the Milky Way if the mathematical model were to match the real galaxy. However, astronomers have observed only about 30 of them, and they have been tweaking the simulations in hopes of accounting for the missing galaxies.

Using a network of 2,000 computers running in parallel, the researchers were able to create a new simulation that looks like the actual Milky Way and accurately predicts the number of smaller galaxies.

"That was the aha moment, when I saw that the simulation can finally produce a population of dwarf galaxies like the ones we observe around the Milky Way," Andrew Wetzel, postdoctoral fellow at Caltech and Carnegie Observatories in Pasadena and lead author of the study, said in a press release.

Astronomers had previously suspected that the mismatch meant their understanding of dark matter was incorrect. But with the new model, the scientists found that by accurately simulating the effects of supernovae, they were able to resolve the mystery.

According to the new computer simulation, the missing galaxies started forming as predicted but were blown apart by supernovae before they could reach maturity. A supernova occurs when an extremely hot, massive star runs out of fuel and dies in a dramatic explosion. When a star goes supernova, it sends out powerful winds that can obliterate the young dwarf galaxies surrounding the Milky Way, which explains why only a few of them can be observed today.

"In a galaxy, you have 100 billion stars, all pulling on each other, not to mention other components we don't see like dark matter," Phil Hopkins, associate professor of theoretical astrophysics at Caltech and principal scientist for the study, said in the same statement.

"To simulate this, we give a supercomputer equations describing those interactions and then let it crank through those equations repeatedly and see what comes out at the end."

The new simulation, with its updated treatment of supernovae, took about 700,000 CPU hours to complete, meaning it would have taken roughly 80 years to run on a normal computer.
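As a rough back-of-the-envelope check of that figure (our arithmetic, not a calculation from the study), 700,000 CPU hours divided among a single processor running non-stop comes out to about 80 years:

```python
cpu_hours = 700_000
hours_per_year = 24 * 365                  # ignoring leap years
years = cpu_hours / hours_per_year
print(f"{years:.1f} years")                # ≈ 79.9 years on one CPU running continuously
```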

The scientists are now planning to use up to 20 million CPU hours in the next round of research, aiming to predict even the smallest dwarf galaxies that remain to be found.