Yuetian Luo
Robust Confidence Intervals for a Binomial Proportion: Local Optimality and Adaptivity
This paper revisits the classical problem of interval estimation of a binomial proportion under Huber contamination. Our main result derives the rate of optimal interval length when the contamination proportion is unknown under a local min…
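As background for this entry (this is the classical, uncontaminated problem, not the paper's robust construction under Huber contamination), a minimal sketch of the standard Wilson score interval; the function name `wilson_interval` is illustrative:

```python
import math

def wilson_interval(successes, n, z=1.96):
    """Classical Wilson score interval for a binomial proportion.

    `z` is the normal quantile for the desired coverage (1.96 ~ 95%).
    Contamination is ignored entirely here.
    """
    p_hat = successes / n
    denom = 1 + z * z / n
    center = (p_hat + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p_hat * (1 - p_hat) / n + z * z / (4 * n * n))
    # Clamp to [0, 1] since a proportion cannot leave the unit interval.
    return max(0.0, center - half), min(1.0, center + half)

# Example: 40 successes out of 100 trials
lo, hi = wilson_interval(40, 100)   # roughly (0.309, 0.498)
```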
Are all models wrong? Fundamental limits in distribution-free empirical model falsification
In statistics and machine learning, when we train a fitted model on available data, we typically want to ensure that we are searching within a model class that contains at least one accurate model -- that is, we would like to ensure an upp…
Adaptive Robust Confidence Intervals
This paper studies the construction of adaptive confidence intervals under Huber's contamination model when the contamination proportion is unknown. For the robust confidence interval of a Gaussian mean, we show that the optimal length of …
Nonconvex Factorization and Manifold Formulations Are Almost Equivalent in Low-Rank Matrix Optimization
In this paper, we consider the geometric landscape connection of the widely studied manifold and factorization formulations in low-rank positive semidefinite (PSD) and general matrix optimization. We establish a sandwich relation on the sp…
Is Algorithmic Stability Testable? A Unified Framework under Computational Constraints
Algorithmic stability is a central notion in learning theory that quantifies the sensitivity of an algorithm to small changes in the training data. If a learning algorithm satisfies certain stability properties, this leads to many importan…
The Limits of Assumption-free Tests for Algorithm Performance
Algorithm evaluation and comparison are fundamental questions in machine learning and statistics -- how well does an algorithm perform at a given modeling task, and which algorithm performs best? Many methods have been developed to assess …
Computational Lower Bounds for Graphon Estimation via Low-degree Polynomials
Graphon estimation has been one of the most fundamental problems in network analysis and has received considerable attention in the past decade. From the statistical perspective, the minimax error rate of graphon estimation has been establ…
Iterative Approximate Cross-Validation
Cross-validation (CV) is one of the most popular tools for assessing and selecting predictive models. However, standard CV suffers from high computational cost when the number of folds is large. Recently, under the empirical risk minimizat…
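For context on the cost that approximate-CV methods target: standard K-fold CV refits the model once per fold, so the expense grows linearly in the number of folds (and reaches n refits for leave-one-out). A minimal sketch, with illustrative names `k_fold_cv`, `fit`, and `loss`:

```python
def k_fold_cv(xs, ys, fit, loss, k):
    """Plain K-fold cross-validation: refit the model K times, each
    time holding out one fold, and average the held-out loss.
    The K full refits are exactly what approximate-CV methods try
    to avoid when K is large (e.g. leave-one-out, K = n).
    """
    n = len(xs)
    fold_losses = []
    for j in range(k):
        test_idx = set(range(j, n, k))  # every k-th point belongs to fold j
        train = [(x, y) for i, (x, y) in enumerate(zip(xs, ys)) if i not in test_idx]
        model = fit([x for x, _ in train], [y for _, y in train])
        held = [(xs[i], ys[i]) for i in sorted(test_idx)]
        fold_losses.append(sum(loss(model, x, y) for x, y in held) / len(held))
    return sum(fold_losses) / k

# Toy usage: a model that predicts the training mean, squared-error loss.
fit = lambda xs, ys: sum(ys) / len(ys)
loss = lambda m, x, y: (y - m) ** 2
cv_error = k_fold_cv(list(range(10)), [1.0] * 10, fit, loss, k=5)
```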
Exact Clustering in Tensor Block Model: Statistical Optimality and Computational Limit
High-order clustering aims to identify heterogeneous substructures in multiway datasets that arise commonly in neuroimaging, genomics, social network studies, etc. The non-convex and discontinuous nature of this problem poses significant ch…
Nonconvex Matrix Factorization is Geodesically Convex: Global Landscape Analysis for Fixed-rank Matrix Optimization From a Riemannian Perspective
We study a general matrix optimization problem with a fixed-rank positive semidefinite (PSD) constraint. We perform the Burer-Monteiro factorization and consider a particular Riemannian quotient geometry in a search space that has a total …
Tensor-on-Tensor Regression: Riemannian Optimization, Over-parameterization, Statistical-computational Gap, and Their Interplay
We study tensor-on-tensor regression, where the goal is to connect tensor responses to tensor covariates with a low Tucker rank parameter tensor/matrix without prior knowledge of its intrinsic rank. We propose the Riemannian gradie…
On Geometric Connections of Embedded and Quotient Geometries in Riemannian Fixed-rank Matrix Optimization
In this paper, we propose a general procedure for establishing the geometric landscape connections of a Riemannian optimization problem under the embedded and quotient geometries. By applying the general procedure to the fixed-rank positiv…
Low-rank Tensor Estimation via Riemannian Gauss-Newton: Statistical Optimality and Second-Order Convergence
In this paper, we consider the estimation of a low Tucker rank tensor from a number of noisy linear measurements. The general problem covers many specific examples arising from applications, including tensor regression, tensor completion, …
A Sharp Blockwise Tensor Perturbation Bound for Orthogonal Iteration
In this paper, we develop novel perturbation bounds for the high-order orthogonal iteration (HOOI) [DLDMV00b]. Under mild regularity conditions, we establish blockwise tensor perturbation bounds for HOOI with guarantees for both tensor rec…
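For intuition (a matrix-case sketch, not the paper's tensor analysis): HOOI alternately applies a blockwise orthogonal iteration to each unfolding of the tensor; the rank-one, symmetric-matrix analogue of that update is plain power iteration, sketched here with the illustrative name `power_iteration`:

```python
def power_iteration(A, iters=100):
    """Power iteration for the leading eigenvector of a symmetric
    matrix A (given as a list of row lists). HOOI runs the blockwise
    (subspace) analogue of this update on each unfolding of a tensor,
    alternating over the modes.
    """
    n = len(A)
    v = [1.0 / n ** 0.5] * n  # arbitrary unit-norm starting vector
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]  # renormalize each iteration
    return v

# Example: leading eigenvector of a 2x2 symmetric matrix.
v = power_iteration([[3.0, 0.0], [0.0, 1.0]])  # converges toward [1, 0]
```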
Recursive Importance Sketching for Rank Constrained Least Squares: Algorithms and High-order Convergence
In this paper, we propose the Recursive Importance Sketching algorithm for Rank constrained least squares Optimization (RISRO). The key step of RISRO is …
Open Problem: Average-Case Hardness of Hypergraphic Planted Clique Detection
We note the significance of hypergraphic planted clique (HPC) detection in the investigation of computational hardness for a range of tensor problems. We ask if more evidence for the computational hardness of HPC detection can be develo…
A Schatten-q Matrix Perturbation Theory via Perturbation Projection Error Bound
This paper studies the Schatten-$q$ error of low-rank matrix estimation by singular value decomposition under perturbation. Specifically, we establish a tight perturbation bound on the low-rank matrix estimation via a perturbation projecti…
A Schatten-$q$ Low-rank Matrix Perturbation Analysis via Perturbation Projection Error Bound
This paper studies the Schatten-$q$ error of low-rank matrix estimation by singular value decomposition under perturbation. We specifically establish a perturbation bound on the low-rank matrix estimation via a perturbation projection erro…
Tensor Clustering with Planted Structures: Statistical Optimality and Computational Limits
This paper studies the statistical and computational limits of high-order clustering with planted structures. We focus on two clustering models, constant high-order clustering (CHC) and rank-one higher-order clustering (ROHC), and study th…
ISLET: Fast and Optimal Low-Rank Tensor Regression via Importance Sketching
In this paper, we develop a novel procedure for low-rank tensor regression, namely Importance Sketching Low-rank Estimation for Tensors (ISLET). The central idea behind ISLET is importance sketching, i.e., carefully designed sketches based…
Diagnosing University Student Subject Proficiency and Predicting Degree Completion in Vector Space
We investigate the issue of undergraduate on-time graduation with respect to subject proficiencies through the lens of representation learning, training student vector embeddings from a dataset of 8 years of course enrollments. We compa…
Model Confidence Bounds for Variable Selection
In this article, we introduce the concept of model confidence bounds (MCB) for variable selection in the context of nested models. Similar to the endpoints of the familiar confidence interval for parameter estimation, the MCB identifies …