Pandemics and calamities may soon be predicted by artificial intelligence
Forecasting the timing and scale of natural disasters is a crucial goal for scientists. However, because such events are statistically so rare, there is usually not enough data to predict them reliably.
Researchers from Brown University and the Massachusetts Institute of Technology claim that artificial intelligence-based tools can forecast them.
In a recent study, the scientists effectively eliminated the need for vast amounts of data by combining statistical methods that make accurate predictions from limited data with efficient machine learning (an application of AI).
In a university release, George Karniadakis, study author and professor of applied mathematics and engineering at Brown University, stated, “You have to realise that these are stochastic phenomena.”
“We don’t have a lot of historical data on rare events like the COVID-19 pandemic outbreak, environmental catastrophes in the Gulf of Mexico, earthquakes, massive wildfires in California, and 30-meter waves that capsize ships.”
“To make predictions about them further into the future, we don’t have enough examples from the past. The question we address in the study is: what is the best data we can use to reduce the number of data points we need?”
The group found that sequential sampling combined with active learning was the most effective approach.
These algorithms can analyse incoming data, learn from it, and then identify additional data points that are equally or even more informative. In other words, they make it possible to accomplish more with less information.
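To make the idea concrete, here is a minimal sketch of a sequential sampling loop with active learning in Python. The expensive `run_experiment` function, the polynomial surrogate ensemble, and the variance-based acquisition rule are illustrative assumptions rather than the authors' method; the point is only that the model chooses which data point to acquire next instead of sampling blindly.

```python
# A minimal sketch of sequential sampling with active learning.
# `run_experiment`, the polynomial surrogates, and the variance-based
# acquisition rule are illustrative assumptions, not the study's code.
import numpy as np

rng = np.random.default_rng(0)

def run_experiment(x):
    # Hypothetical stand-in for an expensive simulation or observation.
    return np.sin(5 * x) + 0.1 * rng.normal()

def fit_ensemble(X, y, n_models=5):
    # Fit a small ensemble of polynomial surrogates on bootstrapped data.
    models = []
    for _ in range(n_models):
        idx = rng.integers(0, len(X), len(X))
        models.append(np.polyfit(X[idx], y[idx], deg=3))
    return models

def acquisition(models, candidates):
    # Pick the candidate where the ensemble disagrees most (highest
    # predictive variance), i.e. the point expected to be most informative.
    preds = np.stack([np.polyval(m, candidates) for m in models])
    return candidates[np.argmax(preds.var(axis=0))]

# Start from a handful of labelled points and grow the dataset greedily.
X = rng.uniform(-1, 1, 8)
y = np.array([run_experiment(x) for x in X])
candidates = np.linspace(-1, 1, 200)

for _ in range(10):  # sequential sampling budget
    models = fit_ensemble(X, y)
    x_next = acquisition(models, candidates)
    X = np.append(X, x_next)
    y = np.append(y, run_experiment(x_next))
```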
The machine learning model they used is a kind of artificial neural network called DeepOnet, which employs interconnected and stacked nodes to mimic the neuronal connections of the human brain.
By processing data through both of its networks, the tool merges the capabilities of two neural networks into one.
As a result, it can evaluate enormous amounts of data in a brief period of time while also producing equally large amounts of output in response.
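As a rough illustration of that two-network design, below is a minimal DeepOnet-style forward pass in Python, assuming a branch network that encodes an input function sampled at fixed sensor points and a trunk network that encodes the coordinates where the output is evaluated, with the two embeddings combined by a dot product. The layer sizes, random weights, and NumPy-only setup are illustrative assumptions, not the authors' implementation.

```python
# A minimal DeepOnet-style forward pass: one network (branch) encodes the
# sampled input function, another (trunk) encodes the query coordinates,
# and their embeddings are combined by a dot product. Sizes and weights
# are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def mlp(sizes):
    # Random weights and biases for a small fully connected network.
    return [(rng.normal(size=(m, n)) / np.sqrt(m), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def forward(params, x):
    # Plain tanh MLP forward pass with a linear final layer.
    for W, b in params[:-1]:
        x = np.tanh(x @ W + b)
    W, b = params[-1]
    return x @ W + b

n_sensors, latent = 50, 32
branch = mlp([n_sensors, 64, latent])   # encodes the sampled input function
trunk = mlp([1, 64, latent])            # encodes the evaluation coordinate

u_sensors = np.sin(np.linspace(0, np.pi, n_sensors))[None, :]  # one input function
y_points = np.linspace(0, 1, 10)[:, None]                      # query locations

# Output: dot product of trunk and branch embeddings at each query location.
prediction = forward(trunk, y_points) @ forward(branch, u_sensors).T
print(prediction.shape)  # (10, 1): the operator's output at each query point
```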
The researchers demonstrated that, even without a significant amount of data, DeepOnet combined with active learning methodologies can accurately identify warning signs of a catastrophic event.
The objective is not to collect every piece of information and feed it into the system, according to Karniadakis, but rather to actively look for the events that signal the rare occurrences.
Although there may not be many instances of the actual catastrophe, he said, those precursors may exist. They can be found using mathematics and, supplemented with real-world events, used to train this data-hungry operator.
The group even found that their method may be superior to conventional models, and they agree that their framework could set a standard for more precise predictions of rare natural events.
By examining predicted conditions over time, they found they could forecast when destructive waves more than twice the size of the surrounding waves would emerge. In their publication, the team describes how researchers can structure further experiments to reduce costs and improve forecasting.
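For readers who want the “twice the size of surrounding waves” threshold spelled out, here is a minimal sketch of that rogue-wave criterion, assuming the surrounding sea state is summarised by the significant wave height (conventionally the mean of the highest third of waves); the sample heights are made up for illustration.

```python
# A minimal sketch of the rogue-wave criterion implied above: a wave is
# flagged when its height exceeds twice the significant wave height
# (the mean height of the highest third of surrounding waves).
# The sample data are illustrative assumptions.
import numpy as np

def significant_wave_height(heights):
    # Mean of the highest third of the observed wave heights.
    top_third = np.sort(heights)[-max(1, len(heights) // 3):]
    return top_third.mean()

def is_rogue(wave_height, surrounding_heights, factor=2.0):
    # "More than twice the size of surrounding waves."
    return wave_height > factor * significant_wave_height(surrounding_heights)

surrounding = np.array([1.2, 1.5, 1.1, 1.8, 1.4, 1.6, 1.3])  # metres, illustrative
print(is_rogue(4.0, surrounding))  # True: 4 m against a ~1.7 m significant height
```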