Training neural networks is time-consuming and difficult. Generally, it involves multiple PhDs tweaking the network over time, testing and retesting, adjusting the architecture and parameters, and, hopefully, ending up with a network that understands its inputs and makes accurate judgments. And as we ask neural networks to take on more, and more complex, tasks, the networks themselves are getting more and more complex. All of which raises the question: is there a better way to create deep learning architectures? Is there a better way to train these networks?
We think there is.
In a new paper published in early March, a group of us at Sentient Technologies unveiled a novel, automated approach to training neural networks called CoDeepNEAT. The paper highlights research that uses AI to evolve the neural network architecture itself: genetic algorithms evolve the hyperparameters, topologies, and components of the network, achieving results comparable to the best human designs. This approach does require massive amounts of compute, but as that resource becomes ever more available, we believe it is a promising direction for the years ahead.
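To give a flavor of the evolutionary idea, here is a minimal sketch of a genetic algorithm searching over network hyperparameters. This is not the CoDeepNEAT implementation from the paper (which also evolves topologies and reusable components): the search space, the mutation scheme, and the toy surrogate fitness function below are all illustrative stand-ins, since real fitness evaluation means training each candidate network.

```python
import random

random.seed(0)

# Hypothetical search space -- not the actual CoDeepNEAT hyperparameters.
SPACE = {
    "learning_rate": [1e-4, 1e-3, 1e-2, 1e-1],
    "num_layers": [1, 2, 3, 4],
    "layer_width": [16, 32, 64, 128],
}

def random_genome():
    """Sample one candidate hyperparameter setting."""
    return {k: random.choice(v) for k, v in SPACE.items()}

def fitness(genome):
    # Stand-in for "train the network and measure validation accuracy".
    # This toy surrogate peaks at lr=1e-2, 3 layers, width 64.
    score = 0.0
    score -= abs(genome["num_layers"] - 3)
    score -= abs(genome["layer_width"] - 64) / 32
    score -= abs(genome["learning_rate"] - 1e-2) * 10
    return score

def crossover(a, b):
    """Child inherits each gene from one parent at random."""
    return {k: random.choice([a[k], b[k]]) for k in SPACE}

def mutate(genome, rate=0.3):
    """Randomly resample each gene with probability `rate`."""
    return {
        k: random.choice(options) if random.random() < rate else genome[k]
        for k, options in SPACE.items()
    }

def evolve(pop_size=20, generations=30):
    population = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        # Keep the fittest quarter as elites, breed the rest from them.
        population.sort(key=fitness, reverse=True)
        elite = population[: pop_size // 4]
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            children.append(mutate(crossover(a, b)))
        population = elite + children
    return max(population, key=fitness)

best = evolve()
print(best)
```

The same loop structure carries over to the real setting; the difference is that each fitness evaluation becomes a full training run, which is why the approach needs so much compute.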
The paper discusses our research in the area and gives a concrete example of an evolved neural network (ENN) labeling images for a large online magazine.
Hope you enjoy!