Award Date
5-1-2020
Degree Type
Dissertation
Degree Name
Doctor of Philosophy (PhD)
Department
Computer Science
First Committee Member
Justin Zhan
Second Committee Member
Kazem Taghva
Third Committee Member
Laxmi Gewali
Fourth Committee Member
Yoohwan Kim
Fifth Committee Member
Ge Kan
Number of Pages
172
Abstract
The vast majority of advances in deep neural network research operate on the basis of a real-valued weight space. Recent work in alternative spaces has challenged and complemented this idea; for instance, the use of complex- or binary-valued weights has yielded promising and fascinating results. We propose a framework for a novel weight space consisting of vector values, which we christen VectorNet. We first develop the theoretical foundations of our proposed approach, including formalizing the requisite theory for forward- and backpropagating values in a vector-weighted layer. We also introduce the concept of expansion and aggregation functions for conversion between real and vector values. These contributions enable the seamless integration of vector-weighted layers with conventional layers, resulting in network architectures exhibiting height in addition to width and depth, and consequently models which we might be inclined to call tall learning. As a means of evaluating its effect on model performance, we apply our framework on top of three neural network architectural families—the multilayer perceptron (MLP), convolutional neural network (CNN), and directed acyclic graph neural network (DAG-NN)—trained on multiple classic machine learning and image classification benchmarks. We also consider evolutionary algorithms for performing neural architecture search over the new hyperparameters introduced by our framework. Lastly, we solidify the case for the utility of our contributions by applying our approach to real-world data in the domains of mental illness diagnosis and static malware detection, achieving state-of-the-art results in both. Our implementations are made publicly available to drive further investigation into the exciting potential of VectorNet.
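To make the abstract's description of vector-weighted layers concrete, the following is a minimal, hypothetical sketch of a forward pass, not the dissertation's actual code or formulation: it assumes a "height" H, an expansion function that replicates real inputs along the height axis, and an aggregation function that averages back to real values. The function and parameter names (expand, aggregate, vector_dense_forward, H) are illustrative only.

import numpy as np

# Hypothetical sketch of a vector-weighted dense layer of height H.
# Each scalar weight of a conventional layer is replaced by an
# H-dimensional vector, so the weight tensor has shape (H, in, out).

def expand(x, height):
    # Expansion function (one simple choice): lift real-valued inputs
    # to vector values by replicating along the height axis.
    return np.repeat(x[np.newaxis, :, :], height, axis=0)   # (H, batch, in)

def aggregate(v):
    # Aggregation function (one simple choice): collapse vector-valued
    # activations back to real values by averaging over the height axis.
    return v.mean(axis=0)                                    # (batch, out)

def vector_dense_forward(x, W, b):
    # x: (batch, in) real input; W: (H, in, out) vector weights;
    # b: (H, out) vector biases. Returns a real (batch, out) output.
    height = W.shape[0]
    v = expand(x, height)                                    # (H, batch, in)
    z = np.einsum('hbi,hio->hbo', v, W) + b[:, np.newaxis, :]
    a = np.maximum(z, 0.0)                                   # ReLU per component
    return aggregate(a)

# Tiny usage example with random data.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))            # batch of 4, 8 input features
W = rng.normal(size=(3, 8, 5)) * 0.1   # height 3, mapping 8 -> 5
b = np.zeros((3, 5))
print(vector_dense_forward(x, W, b).shape)   # (4, 5)

Because the layer consumes and produces real values at its boundaries, a block like this could sit between conventional layers, which is the integration property the abstract describes; the specific expansion and aggregation choices above are assumptions for illustration.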
Keywords
Deep learning; Deep neural networks; Machine learning; Malware detection; Neural architecture search; Schizophrenia diagnosis
Disciplines
Artificial Intelligence and Robotics | Computer Engineering | Computer Sciences
File Format
File Size
1.8 MB
Degree Grantor
University of Nevada, Las Vegas
Language
English
Repository Citation
Chiu, Carter, "A Framework for Vector-Weighted Deep Neural Networks" (2020). UNLV Theses, Dissertations, Professional Papers, and Capstones. 3876.
http://dx.doi.org/10.34917/19412042
Rights
IN COPYRIGHT. For more information about this rights statement, please visit http://rightsstatements.org/vocab/InC/1.0/