This is just a quick note that some folks have put together a site called Virtual Low-dimensional Topology “to organize virtual activities related to all facets of low-dimensional topology.” It includes working groups, virtual office hours with topologists, links to seminars, and more!
Isotopy in dimension 4
Four-dimensional manifold theory is remarkable for a variety of reasons. It has the only outstanding generalized smooth Poincaré conjecture. It is the only dimension $n$ in which Euclidean space $\mathbb{R}^n$ admits more than one smooth structure. It is the only dimension with an unresolved generalized Schoenflies problem. The list goes on. One issue that is perhaps not discussed enough is the paucity of theorems about smooth isotopy. In dimensions 2 and 3, the Schoenflies and Alexander theorems are the backbone of all theorems about isotopy, allowing one to work from the ground up.
The Topology of Neural Networks, Part 2: Compositions and Dimensions
In Part 1 of this series, I gave an abstract description of one of the main problems in Machine Learning, the Generalization Problem, in which one uses the values of a function at a finite number of points to infer the entire function. The typical approach to this problem is to choose a finite-dimensional subset of the space of all possible functions, then choose the function from this family that minimizes something called a cost function, defined by how accurate each function is on the sampled points. In this post, I will describe how the regression example from the last post generalizes to a family of models called Neural Networks, then describe how I recently used some fairly basic topology to demonstrate restrictions on the types of functions certain neural networks can produce.
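To make that recipe concrete, here is a minimal sketch (my own illustration, not taken from the original posts): the finite-dimensional family is the space of cubic polynomials, a hypothetical choice, the sampled points come from a made-up target function, and the cost function is mean squared error on the samples.

    import numpy as np

    # Sample a handful of points from the (pretend-unknown) target function.
    rng = np.random.default_rng(0)
    x = rng.uniform(-1.0, 1.0, size=20)
    y = np.sin(3.0 * x) + 0.1 * rng.normal(size=x.shape)  # noisy samples

    # Finite-dimensional family: polynomials of degree <= 3,
    # parametrized by their coefficient vectors.
    degree = 3
    features = np.vander(x, degree + 1)  # columns are x^3, x^2, x, 1

    # Cost function: mean squared error on the sampled points.
    def cost(coeffs):
        return np.mean((features @ coeffs - y) ** 2)

    # For a family like this the minimizer has a closed form (least squares);
    # a neural network family would instead be fit by gradient descent.
    coeffs, *_ = np.linalg.lstsq(features, y, rcond=None)
    print("fitted coefficients:", coeffs)
    print("cost at minimum:", cost(coeffs))

The only thing that changes in the neural network setting, taken up in Part 2, is how the finite-dimensional family of functions is parametrized and how the cost is minimized.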
The Topology of Neural Networks, Part 1: The Generalization Problem
I gave a talk a few months ago at the Thompson-Scharlemann-Kirby conference about a theorem I recently proved about topological limitations on certain families of neural networks. Most of the talk was a description of how neural networks work, in more abstract mathematical terms than they're usually presented in, and I thought this would be a good thing to write up in a blog post. I decided to split the post into two parts because it was getting quite long. So the first post will describe the general approach to defining Machine Learning models, and the second post will cover Neural Networks in particular.
Two widely-believed conjectures. One is false.
I haven’t blogged for a long time; for the last few years, my research has been leading me away from low-dimensional topology and more towards the foundations of quantum physics. You can read my latest paper on the topic HERE.
Today I’d like to tell you about a preprint by Malyutin which shows that two widely believed knot theory conjectures are mutually exclusive!
A. Malyutin, On the Question of Genericity of Hyperbolic Knots, https://arxiv.org/abs/1612.03368.
Conjecture 1: Almost all prime knots are hyperbolic. More precisely, the proportion of hyperbolic knots amongst all prime knots of $n$ or fewer crossings approaches 1 as $n$ approaches $\infty$.
Conjecture 2: The crossing number (the minimal number of crossings among all diagrams of a knot) of a composite knot is not less than that of each of its factors.
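For readers who prefer symbols, here is one way to write the pair (my own restatement, not quoted from Malyutin's paper), writing $h_n$ and $p_n$ for the number of hyperbolic prime knots and of all prime knots with at most $n$ crossings, and $c(K)$ for the crossing number of a knot $K$:

$$\text{Conjecture 1:}\qquad \lim_{n \to \infty} \frac{h_n}{p_n} = 1.$$

$$\text{Conjecture 2:}\qquad c(K_1 \# K_2) \;\ge\; \max\bigl(c(K_1),\, c(K_2)\bigr) \quad \text{for all knots } K_1, K_2.$$

Malyutin's result is that these two statements cannot both be true.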