Fixed Size Ordinally-Forgetting Encoding and its Applications
In this thesis, we propose a new Fixed-size Ordinally-Forgetting Encoding (FOFE) method, which can almost uniquely encode any variable-length sequence of words into a fixed-size representation. FOFE can model the word ...
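The abstract's core idea can be sketched in a few lines. This is a minimal illustration, not the thesis's code: it assumes the standard FOFE recursion z_t = alpha * z_{t-1} + e_t, where e_t is the one-hot vector of the t-th word and 0 < alpha < 1 is a forgetting factor; the function name and parameters are hypothetical.

```python
import numpy as np

def fofe_encode(word_ids, vocab_size, alpha=0.7):
    """Encode a variable-length sequence of word IDs into one
    fixed-size vector via the FOFE recursion z_t = alpha*z_{t-1} + e_t."""
    z = np.zeros(vocab_size)
    for w in word_ids:
        z = alpha * z   # earlier words are exponentially discounted
        z[w] += 1.0     # add the one-hot vector of the current word
    return z

# Example: vocabulary {0: "A", 1: "B", 2: "C"}, sequence "A B A"
# -> z = [alpha^2 + 1, alpha, 0] = [1.49, 0.7, 0.0] for alpha = 0.7
```

Because the weights alpha^k decay with position, two different word orders generally yield different encodings, which is the sense in which FOFE is "almost unique".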
Context-FOFE Based Deep Learning Models for Text Classification and Modeling
Text classification is a fundamental task in natural language processing. Many recently proposed deep learning models have leveraged context information in documents and achieved great success. However, most of these ...
What's Missing in your Shopping Cart? A Set Based Recommendation Method for "Cold-Start" Prediction
This thesis studies the problem of predicting the missing items in the current user's session when no additional side information is available. In general, many recommender systems fail to provide a precise set of ...
Two-Stream Convolutional Networks for Dynamic Texture Synthesis
This thesis introduces a two-stream model for dynamic texture synthesis. The model is based on pre-trained convolutional networks (ConvNets) that target two independent tasks: (i) object recognition, and (ii) optical flow ...
A Study on Deep Learning: Training, Models and Applications
In the past few years, deep learning has become a very important research field that has attracted much research interest, owing to the development of computational hardware such as high-performance GPUs, ...
Dual Fixed-Size Ordinally Forgetting Encoding (FOFE) For Natural Language Processing
In this thesis, we propose a new approach to employing fixed-size ordinally-forgetting encoding (FOFE) in Natural Language Processing (NLP) tasks, called dual-FOFE. The main idea behind dual-FOFE is that it allows the encoding ...
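The snippet above can be illustrated with a small sketch. This is an assumption-laden illustration rather than the thesis's implementation: it assumes dual-FOFE concatenates two FOFE encodings of the same sequence computed with two different forgetting factors, and the chosen factors (0.5 and 0.9) and function names are hypothetical.

```python
import numpy as np

def fofe(word_ids, vocab_size, alpha):
    """Standard FOFE recursion z_t = alpha*z_{t-1} + e_t."""
    z = np.zeros(vocab_size)
    for w in word_ids:
        z = alpha * z
        z[w] += 1.0
    return z

def dual_fofe(word_ids, vocab_size, alpha1=0.5, alpha2=0.9):
    """Concatenate two FOFE views of the same sequence, one with a
    small forgetting factor and one with a large one (illustrative values)."""
    return np.concatenate([fofe(word_ids, vocab_size, alpha1),
                           fofe(word_ids, vocab_size, alpha2)])
```

Intuitively, the small-alpha view emphasizes recent words while the large-alpha view retains more of the long-range history, so the concatenated vector carries both.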