1 paper accepted to ICML

Our paper on the convergence of gradient flow on multi-layer linear networks [1] has been accepted to the International Conference on Machine Learning (ICML)! Congrats Hancheng!

[1] H. Min, R. Vidal, and E. Mallada, “On the Convergence of Gradient Flow on Multi-layer Linear Models,” in International Conference on Machine Learning (ICML), 2023, pp. 1–8.

Much of the theory for classical sparse recovery is based on conditions on the dictionary that are both necessary and sufficient (e.g., the nullspace property) or only sufficient (e.g., incoherence and restricted isometry). In contrast, much of the theory for subspace-preserving recovery, the theoretical underpinning for sparse subspace classification and clustering methods, is based on conditions on the subspaces and the data that are only sufficient (e.g., subspace incoherence and data inner-radius). This paper derives a necessary and sufficient condition for subspace-preserving recovery that is inspired by the classical nullspace property. Based on this novel condition, called here the subspace nullspace property, we derive equivalent characterizations that either admit a clear geometric interpretation, relating data distribution and subspace separation to recovery success, or can be verified using a finite set of extreme points of a properly defined set. We further exploit these characterizations to derive new sufficient conditions, based on inner-radius and outer-radius measures and dual bounds, that generalize existing conditions while preserving the geometric interpretations. These results fill an important gap in the subspace-preserving recovery literature.

@inproceedings{mvm2023icml,
  abstract = {Much of the theory for classical sparse recovery is based on conditions on the dictionary that are both necessary and sufficient (e.g., nullspace property) or only sufficient (e.g., incoherence and restricted isometry). In contrast, much of the theory for subspace-preserving recovery, the theoretical underpinnings for sparse subspace classification and clustering methods, is based on conditions on the subspaces and the data that are only sufficient (e.g., subspace incoherence and data inner-radius). This paper derives a necessary and sufficient condition for subspace-preserving recovery that is inspired by the classical nullspace property. Based on this novel condition, called here subspace nullspace property, we derive equivalent characterizations that either admit a clear geometric interpretation, relating data distribution and subspace separation to the recovery success, or can be verified using a finite set of extreme points of a properly defined set. We further exploit these characterizations to derive new sufficient conditions, based on inner-radius and outer-radius measures and dual bounds, that generalize existing conditions, while preserving the geometric interpretations. These results fill an important gap in the subspace-preserving recovery literature.},
  author = {Min, Hancheng and Vidal, Rene and Mallada, Enrique},
  booktitle = {International Conference on Machine Learning (ICML)},
  grants = {TRIPODS-1934979, CAREER-1752362},
  month = {4},
  pages = {1--8},
  pubstate = {accepted, submitted Jan 2023},
  title = {On the Convergence of Gradient Flow on Multi-layer Linear Models},
  url = {https://mallada.ece.jhu.edu/pubs/2023-ICML-MVM.pdf},
  year = {2023}
}