The vast number-crunching capabilities of artificial intelligence systems mean we can predict the future of chaotic systems from ever-smaller samples of their past behavior – and a new algorithm is adding even more accuracy to the process.
Developed through next-generation reservoir computing techniques, which take a more dynamic, speedier approach to machine learning, the new algorithm improves predictions of complex physical processes, such as global weather patterns.
Calculations of these processes – known as spatiotemporal chaotic systems – can now be done in a fraction of the time, with greater accuracy, using fewer computational resources, and based on less training data.
“This is very exciting, as we believe it’s a substantial advance in terms of data processing efficiency and prediction accuracy in the field of machine learning,” says physicist Wendson de Sa Barbosa, from Ohio State University.
Machine learning is just what the name suggests: computer algorithms that use a discovery process to make predictions (such as future weather patterns) based on large data archives (such as past weather patterns).
The reservoir computing approach attempts to more closely mimic the human brain, feeding information into a ‘reservoir’ of randomly connected artificial neurons as a means of discovering useful patterns. The results are then used to inform future cycles of learning.
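To make that concrete, here is a minimal echo state network – the classic form of reservoir computing – written in Python with NumPy. It is a sketch of the general technique, not the researchers' code: the reservoir size, the spectral radius, and the use of the Lorenz-63 system as a stand-in chaotic process are all illustrative assumptions.

```python
# Minimal echo state network: a fixed, randomly connected "reservoir" of
# artificial neurons, with only the final linear readout trained.
import numpy as np

rng = np.random.default_rng(0)

# Generate training data from the chaotic Lorenz-63 system (Euler steps)
def lorenz_step(s, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return s + dt * np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

states = np.empty((5000, 3))
states[0] = [1.0, 1.0, 1.0]
for t in range(1, len(states)):
    states[t] = lorenz_step(states[t - 1])

X, Y = states[:-1], states[1:]            # task: predict the next state

# Fixed random reservoir -- its weights are never trained
n_res = 300
W_in = rng.uniform(-0.5, 0.5, (n_res, 3))
W = rng.normal(0.0, 1.0, (n_res, n_res))
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()   # keep spectral radius < 1

# Drive the reservoir with the input sequence, recording its states
r = np.zeros(n_res)
R = np.empty((len(X), n_res))
for t, u in enumerate(X):
    r = np.tanh(W @ r + W_in @ u)
    R[t] = r

# Train only the linear readout, via ridge regression
ridge = 1e-6
W_out = Y.T @ R @ np.linalg.inv(R.T @ R + ridge * np.eye(n_res))
print("one-step RMSE:", np.sqrt(np.mean((R @ W_out.T - Y) ** 2)))
```

The key point is that learning collapses to a single linear regression at the end; the random reservoir itself never changes.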
Over time, these systems have become more streamlined and efficient. One innovation in machine learning allows distinct components of the predictive model to run in parallel. Combining this kind of architecture with the latest reservoir computing techniques lets algorithms spot potential symmetries in what is otherwise a chaotic mess of information.
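For context, next-generation reservoir computing is known to drop the random network entirely: the feature vector is built from time-delayed copies of the input and their polynomial products, and again only a linear readout is trained; for spatiotemporal systems, many small copies of such a model can run in parallel over local patches. The sketch below illustrates that feature construction under illustrative assumptions (delay count, quadratic features, ridge parameter, toy data) – it is not the paper's implementation.

```python
# Sketch of next-generation reservoir computing features: time-delayed
# copies of the input plus their quadratic products, with a trained
# linear readout. All parameters here are illustrative assumptions.
import numpy as np

def ngrc_features(series, k=2):
    """Stack k time-delayed copies of each sample, then append all
    unique pairwise products of the delayed coordinates."""
    T = len(series)
    lin = np.hstack([series[k - 1 - j : T - j] for j in range(k)])
    quad = np.column_stack([lin[:, i] * lin[:, j]
                            for i in range(lin.shape[1])
                            for j in range(i, lin.shape[1])])
    return np.hstack([lin, quad])

# Toy multivariate signal standing in for one local patch of a larger
# spatiotemporal system (in a parallel scheme, each patch gets a model)
rng = np.random.default_rng(1)
T, d = 2000, 8
t = np.arange(T)[:, None]
data = np.sin(0.05 * t + np.arange(d)) + 0.01 * rng.normal(size=(T, d))

k = 2
F = ngrc_features(data[:-1], k)   # features from current + delayed inputs
Y = data[k:]                      # next-step targets, aligned with F

# As before, training reduces to a single ridge regression
ridge = 1e-4
W_out = Y.T @ F @ np.linalg.inv(F.T @ F + ridge * np.eye(F.shape[1]))
print("one-step RMSE:", np.sqrt(np.mean((F @ W_out.T - Y) ** 2)))
```

Because each feature vector is so small, fitting one of these models per patch of a larger system is cheap – which is what makes the parallel architecture described above practical.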
The researchers tested their new approach on an atmospheric weather model. Using an ordinary laptop running Windows, they were able to make predictions in a fraction of a second that previously required a supercomputer. In this particular case, the calculations ran 240,000 times faster than with traditional algorithms.
“If one knows the equations that accurately describe how these unique processes for a system will evolve, then its behavior could be reproduced and predicted,” says de Sa Barbosa.
Machine learning algorithms can be used to predict all kinds of future events, finding applications in fields ranging from the mundane, such as prospecting for new resources, to the alarming, such as social engineering.
As these scenarios get more complex, there are more and more variables to consider, pushing the limits of computational resources. Machine learning systems can spot patterns in past data that no human eye could catch, then watch for those patterns repeating. They can also feed their results back into themselves to improve their accuracy over time.
According to the researchers, these new and improved algorithms could eventually be used in a wide variety of situations – such as monitoring heartbeat patterns to spot health issues that would otherwise be missed.
“Modern machine learning algorithms are especially well-suited for predicting dynamical systems by learning their underlying physical rules using historical data,” says de Sa Barbosa.
“Once you have enough data and computational power, you can make predictions with machine learning models about any real-world complex system.”
The research has been published in Chaos: An Interdisciplinary Journal of Nonlinear Science.