Daniel Reichman
Protocols for Verifying Smooth Strategies in Bandits and Games
We study protocols for verifying approximate optimality of strategies in multi-armed bandits and normal-form games. As the number of actions available to each player is often large, we seek protocols where the number of queries to the util…
The Computational Complexity of Counting Linear Regions in ReLU Neural Networks
An established measure of the expressive power of a given ReLU neural network is the number of linear regions into which it partitions the input space. There exist many different, non-equivalent definitions of what a linear region actually…
On the Depth of Monotone ReLU Neural Networks and ICNNs
We study two models of ReLU neural networks: monotone networks (ReLU$^+$) and input convex neural networks (ICNN). Our focus is on expressivity, mostly in terms of depth, and we prove the following lower bounds. For the maximum function MA…
The Karp Dataset
Understanding the mathematical reasoning capabilities of Large Language Models (LLMs) is a central topic in the study of artificial intelligence. This new domain necessitates the creation of datasets of reasoning tasks for both training an…
Nearest Neighbor Complexity and Boolean Circuits
A nearest neighbor representation of a Boolean function f is a set of vectors (anchors) labeled by 0 or 1 such that f(x) = 1 if and only if the closest anchor to x is labeled by 1. This model was introduced by Hajnal, Liu and Turán [2022],…
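The representation described above is easy to state in code. The sketch below evaluates a function from a set of labeled anchors; the two AND anchors are an illustrative choice to show that two anchors can suffice, and are not taken from the paper.

```python
# Sketch of a nearest neighbor (NN) representation of a Boolean function:
# anchors are labeled real vectors, and f(x) is the label of the closest one.
import math

def nn_eval(anchors, x):
    """Return the label of the anchor closest to x (Euclidean distance)."""
    closest = min(anchors, key=lambda a: math.dist(a[0], x))
    return closest[1]

# Two anchors represent AND on {0,1}^2: (1,1) labeled 1, (1/3,1/3) labeled 0.
and_anchors = [((1.0, 1.0), 1), ((1/3, 1/3), 0)]
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    assert nn_eval(and_anchors, x) == (x[0] & x[1])
```

The interesting questions in the paper concern how few anchors (and how little precision) are needed; the checker above only makes the model concrete.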
Time Lower Bounds for the Metropolis Process and Simulated Annealing
The Metropolis process (MP) and Simulated Annealing (SA) are stochastic local search heuristics that are often used in solving combinatorial optimization problems. Despite significant interest, there are very few theoretical results regard…
Improved Lower Bounds on the Expected Length of Longest Common Subsequences
It has been proven that, when normalized by $n$, the expected length of a longest common subsequence of $d$ random strings of length $n$ over an alphabet of size $\sigma$ converges to some constant that depends only on $d$ and $\sigma$. These values…
Depth Separations in Neural Networks: Separating the Dimension from the Accuracy
We prove an exponential size separation between depth 2 and depth 3 neural networks (with real inputs), when approximating a $\mathcal{O}(1)$-Lipschitz target function to constant accuracy, with respect to a distribution with support in th…
How Many Neurons Does it Take to Approximate the Maximum?
We study the size of a neural network needed to approximate the maximum function over $d$ inputs, in the most basic setting of approximating with respect to the $L_2$ norm, for continuous distributions, for a network that uses ReLU activat…
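A useful point of reference for this question is the classical identity max(a, b) = b + ReLU(a - b), which computes the maximum of two inputs exactly with a single ReLU and a linear skip connection; folding it pairwise extends this to $d$ inputs (a balanced tournament would give logarithmic depth). This is the standard construction, not the paper's bounds.

```python
# Exact maximum computed only with ReLUs and affine operations.
from functools import reduce

def relu(z):
    return z if z > 0 else 0.0

def max2(a, b):
    # One ReLU neuron plus a linear skip connection: max(a,b) = b + ReLU(a-b).
    return b + relu(a - b)

def relu_max(xs):
    """Maximum of a list of reals via a chain of pairwise ReLU maxima."""
    return reduce(max2, xs)

assert relu_max([3.0, -1.0, 7.0, 2.0]) == 7.0
```

The paper's subject is the harder quantitative question: how the number of such neurons must scale when one only needs an approximation in $L_2$.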
Inoculation strategies for bounded degree graphs
We analyze a game-theoretic abstraction of epidemic containment played on an undirected graph $G$: each player is associated with a node in $G$ and can either acquire protection from a contagious process or risk infection. After decisions …
Testing Connectedness of Images
We investigate algorithms for testing whether an image is connected. Given a proximity parameter $\varepsilon \in (0,1)$ and query access to a black-and-white image represented by an $n \times n$ matrix of Boolean pixel values, a (1-sided error) connectedness te…
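For contrast with the sublinear-query tester studied in the paper, here is the exact (linear-time) decision version of the problem, as a sketch: a BFS over the black pixels of a Boolean matrix under 4-neighbor adjacency.

```python
# Exact connectedness check for a black-and-white image (Boolean matrix).
from collections import deque

def is_connected(img):
    black = {(i, j) for i, row in enumerate(img)
                    for j, v in enumerate(row) if v}
    if not black:
        return True
    start = next(iter(black))
    seen, queue = {start}, deque([start])
    while queue:
        i, j = queue.popleft()
        for nb in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
            if nb in black and nb not in seen:
                seen.add(nb)
                queue.append(nb)
    return seen == black          # every black pixel reached from the first

assert is_connected([[1, 1], [0, 1]])
assert not is_connected([[1, 0], [0, 1]])   # diagonal pixels are not adjacent
```

A property tester must decide far-from-connected versus connected while reading only a small fraction of the $n^2$ pixels, which is what makes the problem nontrivial.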
Size and depth of monotone neural networks: interpolation and approximation
We study monotone neural networks with threshold gates where all the weights (other than the biases) are non-negative. We focus on the expressive power and efficiency of representation of such networks. Our first result establishes that ev…
Local treewidth of random and noisy graphs with applications to stopping contagion in networks
We study the notion of local treewidth in sparse random graphs: the maximum treewidth over all $k$-vertex subgraphs of an $n$-vertex graph. When $k$ is not too large, we give nearly tight bounds for this local treewidth parameter; we also …
The Learning and Communication Complexity of Subsequence Containment
We consider the learning and communication complexity of subsequence containment. In the learning problem, we seek to learn a classifier that positively labels a binary string $x$ if it contains a fixed binary string $y$ as a subsequence. …
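The containment predicate underlying both problems is the standard greedy two-pointer scan, sketched below for concreteness (the paper's contribution is the complexity of learning and communicating this predicate, not computing it).

```python
# Decide whether y occurs in x as a (not necessarily contiguous) subsequence.
def contains_subsequence(x, y):
    pos = 0
    for c in x:
        if pos < len(y) and c == y[pos]:
            pos += 1                      # greedily match the next symbol of y
    return pos == len(y)

assert contains_subsequence("10110", "111")
assert not contains_subsequence("100", "11")
```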
Size and Depth Separation in Approximating Benign Functions with Neural Networks
When studying the expressive power of neural networks, a main challenge is to understand how the size and depth of the network affect its ability to approximate real functions. However, not all functions are interesting from a practical vi…
Tight Hardness Results for Training Depth-2 ReLU Networks
We prove several hardness results for training depth-2 neural networks with the ReLU activation function; these networks are simply weighted sums (that may include negative coefficients) of ReLUs. Our goal is to output a depth-2 neural net…
On the Rational Boundedness of Cognitive Control: Shared Versus Separated Representations
One of the most fundamental and striking limitations of human cognition appears to be a constraint in the number of control-dependent processes that can be executed at one time. This constraint motivates one of the most influential tenets …
A note on the largest induced matching in graphs avoiding a fixed bipartite graph
We give a simple proof that every $n$-vertex $d$-regular graph that does not contain a fixed bipartite graph as a subgraph has an induced matching of size $\Omega((n/d)\log d)$.
A note on the largest induced matching in graphs without short cycles
We show that every $n$-vertex, $d$-regular graph of girth at least $5$ contains an induced matching of size at least $\Omega((n/d)(\log d / \log \log d))$.
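Both notes concern induced matchings: a set of vertex-disjoint edges with no edge of the graph between endpoints of two distinct matching edges. A small verifier makes the definition concrete (an illustrative sketch, not taken from either note).

```python
# Check whether `matching` is an induced matching in the graph given by
# its edge list `graph_edges`.
def is_induced_matching(graph_edges, matching):
    verts = [v for e in matching for v in e]
    if len(verts) != len(set(verts)):
        return False                      # matching edges must be vertex-disjoint
    vset = set(verts)
    m = {frozenset(e) for e in matching}
    # every graph edge with both endpoints matched must itself be in the matching
    return all(frozenset(e) in m for e in graph_edges if set(e) <= vset)

# On the path 0-1-2-3, the endpoints of (0,1) and (2,3) are joined by (1,2):
assert not is_induced_matching([(0, 1), (1, 2), (2, 3)], [(0, 1), (2, 3)])
assert is_induced_matching([(0, 1), (2, 3)], [(0, 1), (2, 3)])
```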
Cognitive Model Priors for Predicting Human Decisions
Human decision-making underlies all economic behavior. For the past four decades, human decision-making under uncertainty has continued to be explained by theoretical models based on prospect theory, a framework that was awarded the Nobel …
String Matching: Communication, Circuits, and Learning
String matching is the problem of deciding whether a given $n$-bit string contains a given $k$-bit pattern. We study the complexity of this problem in three settings.
- Communication complexity. For small $k$, we provide near-optimal upper an…
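For reference, the decision problem itself is elementary; a naive scan decides it in O(nk) time (a sketch of the problem statement, not of the constructions studied in the paper).

```python
# Does the k-bit pattern occur as a contiguous substring of the n-bit text?
def matches(text, pattern):
    n, k = len(text), len(pattern)
    return any(text[i:i + k] == pattern for i in range(n - k + 1))

assert matches("0110100", "101")
assert not matches("0110100", "111")
```

The paper's question is how hard this predicate is for communication protocols, small circuits, and learners, not how to compute it sequentially.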
The Computational Complexity of Training ReLU(s)
We consider the computational complexity of training depth-2 neural networks composed of rectified linear units (ReLUs). We show that, even for the case of a single ReLU, finding a set of weights that minimizes the squared error (even appr…
Multitasking Capacity: Hardness Results and Improved Constructions
We consider the problem of determining the maximal $\alpha \in (0,1]$ such that every matching $M$ of size $k$ (or at most $k$) in a bipartite graph $G$ contains an induced matching of size at least $\alpha|M|$. This measure was recently introduced i…
The Computational Challenges of Pursuing Multiple Goals: Network Structure of Goal Systems Predicts Human Performance
Extant psychological theories attribute people’s failure to achieve their goals primarily to failures of self-control, insufficient motivation, or lacking skills. We develop a complementary theory specifying conditions under which the comp…
A rationale for biopsying embryos reaching the morula stage on Day 6 in women undergoing preimplantation genetic testing for aneuploidy
Contagious sets in random graphs
We consider the following activation process in undirected graphs: a vertex is active either if it belongs to a set of initially activated vertices or if at some point it has at least $r$ active neighbors. A \emph{contagious set} is a set …
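The activation process described above can be simulated directly to fixpoint. The sketch below does so; the threshold $r = 2$ and the 4-cycle example are illustrative choices, not taken from the paper.

```python
# r-neighbor activation: a vertex becomes active once it has >= r active
# neighbors; run until no further vertex activates.
def activate(adj, seeds, r):
    """Return the final active set for seed set `seeds` and threshold r."""
    active = set(seeds)
    changed = True
    while changed:
        changed = False
        for v, nbrs in adj.items():
            if v not in active and sum(u in active for u in nbrs) >= r:
                active.add(v)
                changed = True
    return active

# On the 4-cycle with r = 2, the two opposite vertices {0, 2} are contagious:
cycle4 = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
assert activate(cycle4, {0, 2}, r=2) == {0, 1, 2, 3}
```

A contagious set is a seed set whose final active set is the whole vertex set; the paper asks how small such sets can be in random graphs.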
Detecting Patterns Can Be Hard: Circuit Lower Bounds for the String Matching Problem
Detecting patterns in strings and images is a fundamental and well studied problem. We study the circuit complexity of the string matching problem under two popular choices of gates: De Morgan and threshold gates. For strings of length $n$…