B. A. Knyazev
Generating π-Functional Molecules Using STGG+ with Active Learning
Generating novel molecules with out-of-distribution properties is a major challenge in molecular discovery. While supervised learning methods generate high-quality molecules similar to those in a dataset, they struggle to generalize to out…
Celo: Training Versatile Learned Optimizers on a Compute Diet
Learned optimization has emerged as a promising alternative to hand-crafted optimizers, with the potential to discover stronger learned update rules that enable faster, hyperparameter-free training of neural networks. A critical element fo…
Accelerating Training with Neuron Interaction and Nowcasting Networks
Neural network training can be accelerated when a learnable update rule is used in lieu of classic adaptive optimizers (e.g. Adam). However, learnable update rules can be costly and unstable to train and use. Recently, Jang et al. (2023) p…
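As a reader's aid, here is a minimal sketch of the idea this abstract describes: replacing a hand-crafted optimizer such as Adam with a small neural network that maps per-parameter features to parameter updates. The class name, featurization, and architecture below are illustrative assumptions, not the paper's actual method.

```python
import torch
import torch.nn as nn

# Minimal sketch of a learnable update rule (illustrative, not the
# paper's architecture): a small MLP maps per-parameter features
# (gradient, momentum) to an update, in lieu of a hand-crafted rule
# such as Adam.
class LearnedUpdateRule(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(2, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 1))

    @torch.no_grad()
    def step(self, params, grads, momenta, beta=0.9):
        new_params, new_momenta = [], []
        for p, g, m in zip(params, grads, momenta):
            m = beta * m + (1 - beta) * g                         # momentum update
            feats = torch.stack([g.flatten(), m.flatten()], -1)   # (numel, 2)
            update = self.net(feats).view_as(p)                   # learned update
            new_params.append(p - update)
            new_momenta.append(m)
        return new_params, new_momenta
```

The MLP's own weights would be meta-trained across many optimization tasks; the cost and instability of that meta-training is exactly the difficulty the abstract refers to.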
Any-Property-Conditional Molecule Generation with Self-Criticism using Spanning Trees
Generating novel molecules is challenging, with most representations leading to generative models producing many invalid molecules. Spanning Tree-based Graph Generation (STGG) is a promising approach to ensure the generation of valid molec…
μLO: Compute-Efficient Meta-Generalization of Learned Optimizers
Learned optimizers (LOs) have the potential to significantly reduce the wall-clock training time of neural networks. However, they can struggle to optimize unseen tasks (meta-generalize), especially when training networks wider than…
LoGAH: Predicting 774-Million-Parameter Transformers using Graph HyperNetworks with 1/100 Parameters
A good initialization of deep learning models is essential, since it helps them converge better and faster. However, pretraining large models is unaffordable for many researchers, which makes predicting good initial parameters m…
Graph Neural Networks for Learning Equivariant Representations of Neural Networks
Neural networks that process the parameters of other neural networks find applications in domains as diverse as classifying implicit neural representations, generating neural network weights, and predicting generalization errors. However, …
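The core data structure here is a graph built from another network's parameters. The sketch below is a hypothetical minimal version for a plain MLP (nodes are neurons, edge features are the connecting weights); the paper's actual featurization may differ.

```python
import torch

# Hypothetical sketch: represent an MLP's parameters as a graph whose
# nodes are neurons and whose edge features are connecting weights,
# the kind of input a GNN-based weight-space model could consume.
def mlp_to_graph(weight_matrices):
    """weight_matrices: list of (out_dim, in_dim) tensors, one per layer."""
    sizes = [weight_matrices[0].shape[1]] + [W.shape[0] for W in weight_matrices]
    offsets = [sum(sizes[:k]) for k in range(len(sizes))]  # first node id per layer
    edges, edge_feats = [], []
    for layer, W in enumerate(weight_matrices):
        out_dim, in_dim = W.shape
        for i in range(out_dim):
            for j in range(in_dim):
                edges.append((offsets[layer] + j, offsets[layer + 1] + i))
                edge_feats.append(W[i, j].item())
    edge_index = torch.tensor(edges).t()                # shape (2, num_edges)
    edge_attr = torch.tensor(edge_feats).unsqueeze(-1)  # shape (num_edges, 1)
    return edge_index, edge_attr, sum(sizes)
```

Because permuting a layer's hidden neurons permutes graph nodes without changing the graph's structure, a GNN over this representation is naturally equivariant to those neuron permutations.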
Meta-learning Optimizers for Communication-Efficient Learning
Communication-efficient variants of SGD, specifically local SGD, have received a great deal of interest in recent years. These approaches compute multiple gradient steps locally on each worker, before averaging model parameters, helping re…
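A minimal sketch of the local SGD scheme the abstract describes, simulated serially for clarity: each worker takes several local gradient steps, then parameters are averaged in a single communication step. Function names and the serial loop are illustrative; real implementations distribute workers and average via all-reduce.

```python
import copy
import torch

# Local SGD sketch: H local steps per worker, then parameter averaging.
# Assumes a model with floating-point parameters and per-worker loaders.
def local_sgd_round(global_model, worker_loaders, loss_fn, lr=0.1, local_steps=4):
    worker_states = []
    for loader in worker_loaders:
        model = copy.deepcopy(global_model)           # each worker starts from the global model
        opt = torch.optim.SGD(model.parameters(), lr=lr)
        data_iter = iter(loader)
        for _ in range(local_steps):                  # H local gradient steps
            x, y = next(data_iter)
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
        worker_states.append(model.state_dict())
    # Average parameters across workers: the single communication step
    # that replaces per-step gradient synchronization.
    avg = {k: torch.stack([s[k].float() for s in worker_states]).mean(0)
           for k in worker_states[0]}
    global_model.load_state_dict(avg)
    return global_model
```

Communicating once every `local_steps` steps instead of every step is what reduces the communication volume, at the cost of workers temporarily drifting apart between averages.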
Terahertz Bessel Beams Formed by Binary and Holographic Axicons
The characteristics of high-power vortex Bessel beams in the terahertz range (λ=141 μm) obtained with the use of diffractive axicons (DAs) illuminated by a Gaussian beam of the Novosibirsk free-electron laser were studied. Two of the three…
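For context, the standard textbook form of an l-th order vortex Bessel beam, the kind produced when an axicon imposes a conical phase on a Gaussian beam, is given below; the paper's specific beam parameters are not reproduced here.

```latex
% Textbook vortex Bessel beam of order l (not the paper's exact parameters):
\[
  E_l(r, \varphi, z) \propto J_l(k_r r)\, e^{i l \varphi}\, e^{i k_z z},
  \qquad k_r^2 + k_z^2 = \left(\frac{2\pi}{\lambda}\right)^2 ,
\]
% where $J_l$ is the Bessel function of the first kind, $l$ is the
% topological charge, and $k_r = k \sin\gamma$ is set by the axicon's
% deflection angle $\gamma$ (here $\lambda = 141\,\mu\mathrm{m}$).
```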
Can We Scale Transformers to Predict Parameters of Diverse ImageNet Models?
Pretraining a neural network on a large dataset is becoming a cornerstone of machine learning that is within the reach of only a few well-resourced communities. We aim at the ambitious goal of democratizing pretraining. Towards that g…
Optical Characterization of Thin Films by Surface Plasmon Resonance Spectroscopy Using an Acousto-Optic Tunable Filter
The paper presents the application of the acousto-optic tunable filter (AOTF) in surface plasmon resonance (SPR) spectroscopy to measure the optical thickness of thin dielectric coatings. The technique presented uses combined angular and s…
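For orientation, the standard SPR resonance condition in the Kretschmann geometry is sketched below in textbook form; the paper's combined angular and spectral scheme scans both the angle and, via the AOTF, the wavelength.

```latex
% Textbook SPR matching condition (not the paper's specific formulation):
\[
  \frac{2\pi}{\lambda}\, n_p \sin\theta_{\mathrm{res}}
  = \operatorname{Re}\!\left[ \frac{2\pi}{\lambda}
    \sqrt{\frac{\varepsilon_m \varepsilon_d}{\varepsilon_m + \varepsilon_d}} \right],
\]
% where $n_p$ is the prism refractive index and $\varepsilon_m$,
% $\varepsilon_d$ are the permittivities of the metal film and the
% adjacent dielectric. A thin coating shifts
% $\theta_{\mathrm{res}}(\lambda)$, from which its optical thickness
% can be inferred.
```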
Subwavelength Diffractive Optical Elements for Generation of Terahertz Coherent Beams with Pre-Given Polarization State
Coherent terahertz beams with radial polarization of the 1st, 2nd, and 3rd orders have been generated with the use of silicon subwavelength diffractive optical elements (DOEs). Silicon elements were fabricated by a technology similar to th…
Optimization, fabrication and characterization of a binary subwavelength cylindrical terahertz lens
The problem of optimizing the subwavelength microrelief of a binary cylindrical transmissive diffractive lens (DL) with a 300 mm focal length at a wavelength of λ = 141 μm is considered. High-resistivity silicon was chosen as the DL substrat…
Symbiotic and photosynthetic activities of soybean plants depending on soil moisture in the steppe zone
Abstract. This work studies the effect of the level of moisture supply to soybean plants during the formation of yield components. The studies were carried out in the steppe zone of Kabardino-Balkaria in 2020–2022. Soybean yield depends on the amount of pro…
Model Zoos: A Dataset of Diverse Populations of Neural Network Models
In recent years, neural networks (NNs) have evolved from laboratory environments to the state of the art for many real-world problems. It has been shown that NN models (i.e., their weights and biases) evolve on unique trajectories in weight sp…
Hyper-Representations as Generative Models: Sampling Unseen Neural Network Weights
Learning representations of neural network weights given a model zoo is an emerging and challenging area with many potential applications, from model inspection to neural architecture search and knowledge distillation. Recently, an autoenco…
Sparsified Model Zoo Twins: A Dataset of Sparsified Populations of Neural Network Models - SVHN
In recent years, neural networks have evolved from laboratory environments to the state of the art for many real-world problems. Our hypothesis is that neural network models (i.e., their weights and biases) evolve on unique, smo…
Sparsified Model Zoo Twins: A Dataset of Sparsified Populations of Neural Network Models - MNIST
In recent years, neural networks have evolved from laboratory environments to the state of the art for many real-world problems. Our hypothesis is that neural network models (i.e., their weights and biases) evolve on unique, smo…
Model Zoo: A Dataset of Diverse Populations of Resnet-18 Models - Tiny ImageNet
In recent years, neural networks have evolved from laboratory environments to the state of the art for many real-world problems. Our hypothesis is that neural network models (i.e., their weights and biases) evolve on unique, smo…
Model Zoo: A Dataset of Diverse Populations of Resnet-18 Models - CIFAR-100
In recent years, neural networks have evolved from laboratory environments to the state of the art for many real-world problems. Our hypothesis is that neural network models (i.e., their weights and biases) evolve on unique, smo…
Model Zoo: A Dataset of Diverse Populations of Resnet-18 Models - CIFAR-10
In recent years, neural networks have evolved from laboratory environments to the state of the art for many real-world problems. Our hypothesis is that neural network models (i.e., their weights and biases) evolve on unique, smo…
Detection of Ramsey Oscillations in Germanium Doped with Shallow Donors upon the Excitation of the 1s → 2p0 Transition
Full text: 283231.pdf (publisher's version, open access).
Hyper-Representations for Pre-Training and Transfer Learning
Learning representations of neural network weights given a model zoo is an emerging and challenging area with many potential applications, from model inspection to neural architecture search and knowledge distillation. Recently, an autoenco…
Pretraining a Neural Network before Knowing Its Architecture
Training large neural networks is possible by training a smaller hypernetwork that predicts parameters for the large ones. A recently released Graph HyperNetwork (GHN) trained this way on one million smaller ImageNet architectures is able …
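A minimal sketch of the idea the abstract describes: a small hypernetwork predicts the parameters of a larger target network from a compact description of its architecture. This toy version is illustrative only; the class name and the flat embedding are assumptions, and GHNs actually operate on the computational graph of the architecture.

```python
import torch
import torch.nn as nn

# Toy hypernetwork: maps an architecture embedding to the full set of
# parameters of a (larger) target network. Illustrative sketch, not the
# Graph HyperNetwork from the paper.
class TinyHyperNetwork(nn.Module):
    def __init__(self, embed_dim, target_shapes):
        super().__init__()
        self.target_shapes = target_shapes
        total = sum(torch.Size(s).numel() for s in target_shapes)
        self.net = nn.Sequential(nn.Linear(embed_dim, 128), nn.ReLU(),
                                 nn.Linear(128, total))

    def forward(self, arch_embedding):
        flat = self.net(arch_embedding)       # one long parameter vector
        params, i = [], 0
        for s in self.target_shapes:          # slice it into per-layer tensors
            n = torch.Size(s).numel()
            params.append(flat[i:i + n].view(s))
            i += n
        return params                          # predicted weights per layer
```

The hypernetwork itself can be far smaller than the total number of parameters it emits, which is why predicted parameters can serve as a cheap initialization before the target architecture is ever trained directly.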
Model Zoo: A Dataset of Diverse Populations of Neural Network Models - STL10 - Preprocessed Datasets
In recent years, neural networks have evolved from laboratory environments to the state of the art for many real-world problems. Our hypothesis is that neural network models (i.e., their weights and biases) evolve on unique, smo…