Elon Musk's goal of building safer AI for the world is another step closer to fruition. NVIDIA CEO Jen-Hsun Huang has just delivered the first DGX-1 supercomputer to the non-profit organization OpenAI.

The donation is meant to help the non-profit "advance digital intelligence in the way that is most likely to benefit humanity," "unconstrained by a need to generate financial return."

NVIDIA's DGX-1 supercomputer is the first of its kind. It delivers a gigantic 170 teraflops of computing power, equivalent to at least 250 conventional servers, and NVIDIA says its development cost $2 billion.

NVIDIA touts the DGX-1 as the fastest machine of its kind, describing it as an "AI in a box."

And the tool to help it learn? Reddit. According to the NVIDIA blog, Reddit's huge size was not a hindrance for the supercomputer; in fact, the site was chosen as the AI's study material precisely because of its size.

According to Futurism, the AI will use deep learning to learn from Reddit. Deep learning is a special class of models whose results improve as they scale, and the system will sift through two billion Reddit comments. Beyond its 170 teraflops of compute, the DGX-1 packs 7TB of SSD storage, two Xeon processors and eight NVIDIA Tesla P100 GPUs.
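The core idea, that a model trained on conversational data gets better at predicting what people say next as you feed it more comments, can be illustrated with a toy sketch. This is purely a hypothetical demonstration of learning from text, not OpenAI's actual system; the functions `build_model` and `predict_next` are invented for illustration.

```python
from collections import defaultdict, Counter

def build_model(comments):
    """Count which word tends to follow which, across all comments.

    A toy stand-in for the idea that more conversational data
    yields better predictions of what people say next.
    """
    follows = defaultdict(Counter)
    for comment in comments:
        words = comment.lower().split()
        for prev, nxt in zip(words, words[1:]):
            follows[prev][nxt] += 1
    return follows

def predict_next(model, word):
    """Return the most common follower of `word`, or None if unseen."""
    followers = model.get(word.lower())
    if not followers:
        return None
    return followers.most_common(1)[0][0]

comments = [
    "thanks for the link",
    "thanks for sharing this",
    "thanks for the detailed answer",
]
model = build_model(comments)
print(predict_next(model, "thanks"))  # → for
```

With three comments the model already knows "thanks" is usually followed by "for"; with two billion comments, the same counting idea captures far richer conversational patterns.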

OpenAI will set the DGX-1 loose on Reddit so it can learn to train faster and chat more accurately. It will absorb a vast amount of data on just how people talk to each other on the internet. It's like teaching a chatbot how to talk, but it's learning the way computers do.

The supercomputer should make things easier for OpenAI as well. According to OpenAI researcher Andrej Karpathy, the team simply has to increase the size of the model it has with its existing code, and the hardware will do the rest.
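Karpathy's point, that scaling up is a configuration change rather than a rewrite, can be sketched in a few lines. The names here (`ModelConfig`, `parameter_estimate`) are hypothetical and do not reflect OpenAI's actual code; the sketch only shows the principle of growing a model by turning size knobs while the training code stays the same.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class ModelConfig:
    layers: int
    hidden_units: int

def parameter_estimate(cfg):
    """Rough parameter count for a stack of dense layers (illustrative)."""
    return cfg.layers * cfg.hidden_units * cfg.hidden_units

small = ModelConfig(layers=4, hidden_units=256)
# "Increase the size of the model" = bump the config, reuse everything else.
large = replace(small, layers=8, hidden_units=1024)

print(parameter_estimate(small))  # → 262144
print(parameter_estimate(large))  # → 8388608
```

The bigger configuration has 32 times the parameters, but nothing about the surrounding training loop needs to change; the new hardware just makes the larger run feasible.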