Summary: Neural networks are a form of AI that learns to discern patterns in data, but training them takes a lot of time. To address this, Boris Knyazev, a Ph.D. student at the University of Guelph in Ontario, and his colleagues have designed and trained a “hypernetwork” that could speed up the training process. Hypernetworks are a kind of overlord of other neural networks: humans train the parameters of the hypernetwork, which in turn predicts the parameters of other neural networks so that they make better predictions. This approach has proven effective: GHN-2 – the hypernetwork designed by Knyazev – is, on average, nearly as accurate as networks trained with the standard method, stochastic gradient descent (SGD). Most importantly, GHN-2 is about 10,000 times faster.
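The core idea – one network producing the parameters of another instead of those parameters being trained directly – can be illustrated with a minimal sketch. This is not GHN-2's actual architecture; it assumes a toy linear hypernetwork, and all names (`predict_layer_weights`, `target_forward`, etc.) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: the hypernetwork maps a small "layer embedding"
# (a vector describing a target layer) to that layer's flat weights.
embed_dim, target_in, target_out = 4, 3, 2
n_weights = target_in * target_out

# Parameters of the *hypernetwork* -- the only thing one would train.
H = rng.normal(size=(n_weights, embed_dim)) * 0.1

def predict_layer_weights(layer_embedding):
    """Hypernetwork forward pass: embedding -> flat weights -> weight matrix."""
    flat = H @ layer_embedding
    return flat.reshape(target_in, target_out)

def target_forward(x, layer_embedding):
    """The target network uses predicted weights; it holds none of its own."""
    W = predict_layer_weights(layer_embedding)
    return x @ W

x = rng.normal(size=(5, target_in))   # a batch of 5 inputs
emb = rng.normal(size=embed_dim)      # descriptor of the target layer
y = target_forward(x, emb)
print(y.shape)  # (5, 2)
```

The key property this sketch shows: gradient updates would flow into `H` (the hypernetwork), while the target network's weights are always a prediction, which is why a single trained hypernetwork can cheaply produce parameters for many different networks.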