Speaker: Steven Damelin (The American Mathematical Society)

Abstract: A classical problem in geometry goes as follows. Suppose we are given two sets of $D$-dimensional data, that is, sets of points in $R^D$. The data sets are indexed by the same set, and we know that pairwise distances between corresponding points are equal in the two data sets; in other words, the sets are isometric. Can this correspondence be extended to an isometry of the ambient Euclidean space? In this form the question is not terribly interesting; the answer has long been known to be yes (see [Wells and Williams 1975], for example). But a related question is fundamental in data analysis: here the known points are samples from larger, unknown sets, say manifolds in $R^D$, and we ask what can be said about the manifolds themselves. A typical example is a face recognition problem, where all we have are finitely many images of people's faces taken from various views. An added complication is that in general we are not given exact distances: the data are noisy, so instead of demanding that the pairwise distances be equal, we require that they be close in some reasonable metric. Some results on almost isometries in Euclidean spaces can be found in [John 1961; Alestalo et al. 2003].

I will discuss various works in progress on this problem with Michael Werman (Hebrew University), Kai Diethelm (Braunschweig) and Charles Fefferman (Princeton). It turns out that the problem relates to Whitney extension problems, interpolation in $R^D$, and bounds for Hilbert transforms. Moreover, for practical algorithms there is a natural deep-learning framework for both labelled and unlabelled data.
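As a concrete illustration of the exact-correspondence case described above (not of the speaker's own methods), the classical Kabsch / orthogonal-Procrustes algorithm recovers the ambient rigid motion aligning two labelled point sets; the function name and demo data below are illustrative assumptions, and the same least-squares solution degrades gracefully when the pairwise distances are only approximately equal:

```python
import numpy as np

def kabsch(P, Q):
    """Recover a rigid motion (R, t) with R @ p + t ~ q for
    corresponding rows p of P and q of Q (both n x d arrays)."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    Pc, Qc = P - cP, Q - cQ
    # SVD of the cross-covariance gives the optimal rotation
    # (classical orthogonal-Procrustes / Kabsch solution).
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # exclude reflections
    D = np.diag([1.0] * (P.shape[1] - 1) + [d])
    R = Vt.T @ D @ U.T
    t = cQ - R @ cP
    return R, t

# Demo: samples in R^3 related by an unknown rotation and
# translation, plus small noise (the "almost isometric" setting).
rng = np.random.default_rng(0)
P = rng.standard_normal((50, 3))
theta = 0.7
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([1.0, -2.0, 0.5])
Q = P @ R_true.T + t_true + 1e-3 * rng.standard_normal((50, 3))

R, t = kabsch(P, Q)
err = np.abs(P @ R.T + t - Q).max()  # residual of the recovered motion
```

In this sketch the recovered $(R, t)$ agrees with the true motion up to the noise level; with exact distances the fit is exact, matching the classical "yes" answer quoted in the abstract.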
Mon, 18/12/2017 - 14:00 to 16:00
Sprinzak Building, Room 28