Samarth Tripathi
Evolving GANs: When Contradictions Turn into Compliance
Limited availability of labeled data makes any supervised learning problem challenging. Alternative learning settings like semi-supervised and universum learning alleviate the dependency on labeled data, but still require a large amount of…
Universum GANs: Improving GANs through contradictions
Limited availability of labeled data makes any supervised learning problem challenging. Alternative learning settings like semi-supervised and universum learning alleviate the dependency on labeled data, but still require a large amount of…
Improving Model Training by Periodic Sampling over Weight Distributions
In this paper, we explore techniques centered around periodic sampling of model weights that provide convergence improvements on gradient update methods (vanilla SGD, Momentum, Adam) for a variety of vision problems (classification, …
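The abstract above mentions periodic sampling over model weights. As a rough, hypothetical illustration only (not the paper's actual algorithm), one simple variant is to snapshot the SGD iterates every few steps and average the snapshots, in the spirit of averaged-weight methods:

```python
import numpy as np

def train_with_periodic_sampling(w, grad_fn, lr=0.1, steps=100, sample_every=10):
    """Vanilla SGD that snapshots the weights every `sample_every` steps
    and returns the average of the snapshots (a stand-in for sampling
    over the weight distribution; interface is illustrative)."""
    snapshots = []
    for t in range(1, steps + 1):
        w = w - lr * grad_fn(w)        # plain SGD update
        if t % sample_every == 0:
            snapshots.append(w.copy())  # periodic snapshot of the weights
    return np.mean(snapshots, axis=0)   # aggregate the sampled weights

# Toy quadratic loss L(w) = ||w||^2 / 2, so grad(w) = w.
w_final = train_with_periodic_sampling(np.array([4.0, -2.0]), lambda w: w)
```

On this toy convex problem the averaged iterate ends up close to the optimum at the origin; the paper's methods target real vision models and optimizers.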
Pruning Algorithms to Accelerate Convolutional Neural Networks for Edge Applications: A Survey
With the general trend of increasing Convolutional Neural Network (CNN) model sizes, model compression and acceleration techniques have become critical for the deployment of these models on edge devices. In this paper, we provide a compreh…
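Magnitude-based pruning is one of the standard compression criteria a survey like this covers. A minimal numpy sketch of unstructured magnitude pruning (the function name and interface are illustrative, not taken from the paper):

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out (approximately) the smallest-magnitude fraction
    `sparsity` of the entries in `weights`."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)       # number of weights to prune
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    mask = np.abs(weights) > threshold  # keep only larger-magnitude weights
    return weights * mask

pruned = magnitude_prune(np.array([1.0, -2.0, 3.0, -4.0]), 0.5)
```

In practice pruning is applied per-layer or with structured (filter/channel) granularity, and is usually followed by fine-tuning to recover accuracy.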
Auptimizer -- an Extensible, Open-Source Framework for Hyperparameter Tuning
Tuning machine learning models at scale, especially finding the right hyperparameter values, can be difficult and time-consuming. In addition to the computational effort required, this process also requires some ancillary efforts including…
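Frameworks like this typically wrap multiple search strategies behind one interface. As a self-contained toy (this is not Auptimizer's API), the simplest such strategy, random search over a small grid, can be sketched as:

```python
import random

def random_search(objective, space, trials=20, seed=0):
    """Minimal random-search tuner: sample configurations from `space`,
    evaluate each, and keep the best (lowest) score."""
    rng = random.Random(seed)
    best_cfg, best_score = None, float("inf")
    for _ in range(trials):
        cfg = {name: rng.choice(values) for name, values in space.items()}
        score = objective(cfg)          # e.g. validation loss of a model
        if score < best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

# Toy objective standing in for an expensive training run.
space = {"lr": [0.001, 0.01, 0.1], "batch": [16, 32, 64]}
best, score = random_search(
    lambda c: (c["lr"] - 0.01) ** 2 + (c["batch"] - 32) ** 2 / 1e4, space)
```

A real tuner replaces the lambda with an actual training job and adds resource scheduling, which is the ancillary effort the abstract alludes to.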
On-Device Machine Learning: An Algorithms and Learning Theory Perspective
The predominant paradigm for using machine learning models on a device is to train a model in the cloud and perform inference using the trained model on the device. However, with the increasing number of smart devices and improved hardware, th…
Robust Neural Network Training using Periodic Sampling over Model Weights
Deep neural networks provide best-in-class performance for a number of computer vision problems. However, training these networks is computationally intensive and requires fine-tuning various hyperparameters. In addition, performance swing…
Make (Nearly) Every Neural Network Better: Generating Neural Network Ensembles by Weight Parameter Resampling
Deep Neural Networks (DNNs) have become increasingly popular in computer vision, natural language processing, and other areas. However, training and fine-tuning a deep learning model is computationally intensive and time-consuming. We prop…
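One plausible reading of "weight parameter resampling" (hedged: this is a guessed simplification, not necessarily the paper's exact procedure) is to draw ensemble members from a distribution centered on a single trained weight vector and average their predictions:

```python
import numpy as np

def resample_ensemble(weights, n_members=5, scale=0.01, seed=0):
    """Create ensemble members by perturbing one trained weight vector
    with small Gaussian noise (illustrative resampling scheme)."""
    rng = np.random.default_rng(seed)
    return [weights + rng.normal(0.0, scale, size=weights.shape)
            for _ in range(n_members)]

def ensemble_predict(members, predict_fn, x):
    """Average the predictions of all ensemble members."""
    return np.mean([predict_fn(w, x) for w in members], axis=0)

members = resample_ensemble(np.zeros(3))
avg = ensemble_predict(members, lambda w, x: float(w @ x), np.ones(3))
```

The appeal described in the abstract is that such ensembles reuse one expensive training run instead of training several models from scratch.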
Towards Deeper Generative Architectures for GANs using Dense connections
In this paper, we present the result of adopting skip connections and dense layers, previously used in image classification tasks, in the Fisher GAN implementation. We have experimented with different numbers of layers and inserting these …
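The dense-connection pattern the abstract refers to can be shown in a few lines of numpy rather than a full GAN framework; the toy "layers" here are placeholders for the generator's convolutional layers:

```python
import numpy as np

def dense_block(x, layers):
    """DenseNet-style connectivity: each layer receives the channel-wise
    concatenation of the block input and all preceding layer outputs."""
    features = [x]
    for layer in layers:
        out = layer(np.concatenate(features, axis=-1))
        features.append(out)
    # The block's output also concatenates everything produced so far.
    return np.concatenate(features, axis=-1)

# Two toy "layers" that each reduce their input to 2 channels.
to_two = lambda z: z[..., :2]
y = dense_block(np.ones((1, 4)), [to_two, to_two])
```

With a 4-channel input and two 2-channel layers, the output has 4 + 2 + 2 = 8 channels, which is the feature reuse that makes deeper generators trainable.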
Multi-Modal Emotion recognition on IEMOCAP Dataset using Deep Learning
Emotion recognition has become an important field of research in Human-Computer Interaction as we improve upon the techniques for modelling the various aspects of behaviour. With the advancement of technology, our understanding of emotions…
Using Deep and Convolutional Neural Networks for Accurate Emotion Classification on DEAP Data
Emotion recognition is an important field of research in Brain-Computer Interaction. As technology and our understanding of emotions advance, there are growing opportunities for automatic emotion recognition systems. Neural networks…
Predicting Online Doctor Ratings from User Reviews Using Convolutional Neural Networks
Individuals are increasingly turning to the web to seek and share healthcare information, and this trend in online health information has resulted in a proliferation of user-generated, health-centric content, especially online physician revi…