Indexed in KCI, SCIE, and SCOPUS
Robustness of Differentiable Neural Computer Using Limited Retention Vector-based Memory Deallocation in Language Model
Donghyun Lee, Hosung Park, Soonshin Seo, Hyunsoo Son, Gyujin Kim, Ji-hwan Kim
UCI I410-ECN-0102-2022-500-000583409

Recurrent neural network (RNN) architectures have been used for language modeling (LM) tasks that require learning long-range word or character sequences. However, RNN architectures still suffer from unstable gradients on long-range sequences. To address this issue, attention mechanisms have been adopted, achieving state-of-the-art (SOTA) performance on many LM tasks. A differentiable neural computer (DNC) is a deep learning architecture that uses an attention mechanism. The DNC architecture is a neural network augmented with a content-addressable external memory. However, during the write operation, some information unrelated to the input word remains in memory. Moreover, DNCs have been found to perform poorly when the number of weight parameters is small. Therefore, we propose a robust memory deallocation method using a limited retention vector. The limited retention vector determines whether the network increases or decreases its usage of information in external memory according to a threshold. We experimentally evaluate the robustness of a DNC implementing the proposed approach with respect to the size of the controller and the external memory on the enwik8 LM task. When the number of weight parameters was decreased by 32.47%, the proposed DNC showed a bits-per-character (BPC) degradation of only 4.30%, demonstrating the effectiveness of our approach in language modeling tasks.
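
The abstract describes the mechanism only at a high level, so the sketch below shows one plausible reading of it in plain NumPy: the standard DNC retention vector ψ is computed from the free gates and the previous read weightings (as in Graves et al.'s DNC), and slots whose retention falls below a threshold are treated as fully deallocated (retention forced to zero) before the usage update. The function names, the threshold value of 0.5, and the exact zeroing rule are illustrative assumptions, not the paper's definitive formulation.

```python
import numpy as np

def retention_vector(free_gates, read_weights):
    # Standard DNC retention vector: psi[j] = prod_i (1 - f_i * w_i[j]),
    # where f_i is the free gate of read head i and w_i is its previous
    # read weighting over the N memory slots.
    # free_gates: shape (R,); read_weights: shape (R, N)
    return np.prod(1.0 - free_gates[:, None] * read_weights, axis=0)

def limited_retention_vector(psi, threshold=0.5):
    # Assumed thresholding rule: slots whose retention falls below the
    # threshold are fully deallocated (retention set to 0), purging
    # information unrelated to the current input from external memory.
    return np.where(psi < threshold, 0.0, psi)

def update_usage(usage_prev, write_weights_prev, psi_limited):
    # DNC usage update, gated by the (limited) retention vector:
    # u_t = (u_{t-1} + w^w_{t-1} - u_{t-1} * w^w_{t-1}) * psi_t
    return (usage_prev + write_weights_prev
            - usage_prev * write_weights_prev) * psi_limited

# Toy example: 3 memory slots, 2 read heads (values are illustrative)
free_gates = np.array([0.9, 0.8])
read_weights = np.array([[0.7, 0.2, 0.1],
                         [0.1, 0.8, 0.1]])
psi = retention_vector(free_gates, read_weights)          # [0.34, 0.30, 0.84]
psi_limited = limited_retention_vector(psi)               # [0.0, 0.0, 0.84]
usage = update_usage(np.array([0.5, 0.5, 0.5]),
                     np.array([0.2, 0.1, 0.0]), psi_limited)
```

Under this reading, hard-zeroing low-retention slots frees them for reallocation outright instead of letting weakly retained content linger, which matches the abstract's claim that stale write-time information is removed from memory.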

1. Introduction
2. Previous Works
3. Differentiable Neural Computer
4. Differentiable Neural Computer Using Limited Retention Vector-based Memory Deallocation
5. Experiments and Discussion
6. Conclusion
References