Some mathematicians and computer scientists have lauded DeepMind's efforts and the findings in the paper as breakthroughs.
The results are nonetheless fascinating and can expand scientists' toolbox for discovering and proving mathematical theorems.

To verify the hypothesis, they used computer programs to generate data for both types of objects.
Knots and representations
Knots are closed loops in three-dimensional space that can be defined in various ways.
They become more complex as the number of their crossings grows.

Next, they created a fully connected, feed-forward neural network with three hidden layers, each having 300 units.
They trained the deep learning model to map the values of the hyperbolic invariants to the signature.
Their initial model was able to predict the signature with 78 percent accuracy.
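To make the setup concrete, here is a minimal sketch of such a network in PyTorch. The three hidden layers of 300 units follow the description above, and the twelve-dimensional input matches the feature count mentioned later in the article; the framework, loss, optimizer, and regression-style output head are assumptions for illustration, not details from the paper.

```python
# Hypothetical sketch (PyTorch) of a fully connected, feed-forward network
# with three hidden layers of 300 units that maps a vector of hyperbolic
# knot invariants to a predicted signature.
import torch
import torch.nn as nn

class SignatureNet(nn.Module):
    def __init__(self, num_invariants: int = 12, hidden: int = 300):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_invariants, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),  # predicted signature as a single value (assumption)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

# Training-loop sketch with placeholder data: minimize the error between
# predicted and true signatures.
model = SignatureNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
invariants = torch.randn(64, 12)                        # placeholder invariant vectors
signatures = torch.randint(-10, 10, (64, 1)).float()    # placeholder signatures
for _ in range(10):
    optimizer.zero_grad()
    loss = loss_fn(model(invariants), signatures)
    loss.backward()
    optimizer.step()
```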

The researchers refined their conjecture, generated new data, retrained their models, and reached a final theorem.
The deep learning model takes the interval graph as input and tries to predict the corresponding Kazhdan-Lusztig (KL) polynomial.
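The article does not describe the architecture of this second model. A rough, hypothetical sketch of a graph-to-coefficients predictor might look like the following, with the graph supplied as an adjacency matrix plus node features, a single message-passing step, and a fixed number of predicted polynomial coefficients; all of these choices are assumptions for illustration.

```python
# Hypothetical sketch of a model that consumes a graph (adjacency matrix plus
# node features) and outputs a fixed number of polynomial coefficients.
import torch
import torch.nn as nn

class GraphToCoefficients(nn.Module):
    def __init__(self, node_dim: int = 8, hidden: int = 128, num_coeffs: int = 6):
        super().__init__()
        self.embed = nn.Linear(node_dim, hidden)
        self.message = nn.Linear(hidden, hidden)    # one round of neighbor aggregation
        self.readout = nn.Sequential(
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, num_coeffs),          # predicted polynomial coefficients
        )

    def forward(self, adjacency: torch.Tensor, features: torch.Tensor) -> torch.Tensor:
        h = torch.relu(self.embed(features))         # (num_nodes, hidden)
        h = torch.relu(adjacency @ self.message(h))  # aggregate messages from neighbors
        return self.readout(h.mean(dim=0))           # pool nodes, predict coefficients

# Usage with a toy 4-node graph.
adj = torch.tensor([[0., 1., 0., 1.],
                    [1., 0., 1., 0.],
                    [0., 1., 0., 1.],
                    [1., 0., 1., 0.]])
feats = torch.randn(4, 8)
print(GraphToCoefficients()(adj, feats).shape)  # torch.Size([6])
```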
But he also added that it is unclear how broad its impact will be.

The knot problem had only twelve input features, of which only three turned out to be relevant.
And the mathematical relation between the input features and target variable was simple.
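How might one find that only three of twelve inputs matter? One standard technique is permutation importance: shuffle a single feature column and measure how much the model's error grows. The article does not say which attribution method the researchers used, so the sketch below is purely illustrative.

```python
# Hypothetical permutation-importance sketch: shuffle one input column at a
# time and measure how much the prediction error grows. Features whose
# shuffling barely changes the error are candidates to discard.
import torch

def permutation_importance(model, inputs, targets, loss_fn):
    base = loss_fn(model(inputs), targets).item()
    scores = []
    for j in range(inputs.shape[1]):
        shuffled = inputs.clone()
        shuffled[:, j] = shuffled[torch.randperm(len(shuffled)), j]  # break feature j only
        scores.append(loss_fn(model(shuffled), targets).item() - base)
    return scores  # larger increase in error = more important feature

# Toy example with a placeholder linear model and random data.
toy_model = torch.nn.Linear(12, 1)
toy_inputs = torch.randn(64, 12)
toy_targets = torch.randn(64, 1)
print(permutation_importance(toy_model, toy_inputs, toy_targets, torch.nn.MSELoss()))
```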
In the second project, the role of deep learning was much more relevant, Davis notes.

This suggests that the framework might be applicable to a narrow class of mathematical problems.
The burden of all this lies on the human expert, he writes.
Deep learning can be a powerful tool, but it is not always a robust one.

Further, in some domains the functions of interest may be difficult to learn in this paradigm.
Intuition is one of the key differentiators between human and artificial intelligence.
That is not to be sneezed at, but it should not be exaggerated.
You can read the original article here.