The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Soon-Ho JUNG, Hyunsoo YOON, "Binary Second-Order Recurrent Neural Networks for Inferring Regular Grammars" in IEICE TRANSACTIONS on Information, vol. E83-D, no. 11, pp. 1996-2007, November 2000.
Abstract: This paper proposes binary second-order recurrent neural networks (BSRNN) equivalent to modified finite automata (MFA) and presents a learning algorithm for constructing a stable BSRNN for inferring regular grammars. The network combines two trends: one is to transform the strings of a regular grammar into a recurrent neural network through training, with no restriction on the number of neurons, the number of strings, or the length of strings; the other is to transform the network directly into a finite automaton. Since the neurons in the BSRNN employ a hard-limiter activation function, the proposed BSRNN can be a good alternative for the hardware implementation of regular grammars and finite automata as well as for grammatical inference.
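The update rule behind second-order recurrent networks of this kind is well established in the grammatical-inference literature: the next binary state vector is obtained by thresholding a bilinear combination of the current state and the one-hot-encoded input symbol, s_i(t+1) = H(sum over j, k of W_ijk * s_j(t) * x_k(t)), where H is the hard limiter. The Python sketch below illustrates that mechanism under illustrative assumptions (random weights, a fixed start state, a single designated acceptance unit); it is not the paper's BSRNN construction or its learning algorithm.

import numpy as np

# Sketch of a binary second-order recurrent network with a hard-limiter
# activation. Weight initialization, the start state, and the choice of
# acceptance unit are illustrative assumptions, not the paper's method.

def hard_limiter(net):
    # Binary threshold: fire (1) when the net input is positive, else 0.
    return (net > 0).astype(float)

class SecondOrderRNN:
    def __init__(self, n_states, n_symbols, seed=0):
        rng = np.random.default_rng(seed)
        # W[i, j, k] couples (current state j, input symbol k) to next state i.
        self.W = rng.standard_normal((n_states, n_states, n_symbols))
        self.n_states = n_states
        self.n_symbols = n_symbols

    def step(self, state, symbol):
        # One-hot encode the current input symbol.
        x = np.zeros(self.n_symbols)
        x[symbol] = 1.0
        # Second-order update: net_i = sum over j, k of W[i, j, k] * s_j * x_k.
        net = np.einsum('ijk,j,k->i', self.W, state, x)
        return hard_limiter(net)

    def accepts(self, symbols, accept_unit=0):
        # Fixed binary start state; the run is a walk through binary states.
        state = np.zeros(self.n_states)
        state[0] = 1.0
        for symbol in symbols:
            state = self.step(state, symbol)
        return bool(state[accept_unit])

# Example: classify the string "0110" over the alphabet {0, 1}.
net = SecondOrderRNN(n_states=4, n_symbols=2)
print(net.accepts([0, 1, 1, 0]))

Because every reachable state is a binary vector, such a network has at most 2^n distinct states, which is what makes the direct extraction of an equivalent finite automaton possible.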
URL: https://global.ieice.org/en_transactions/information/10.1587/e83-d_11_1996/_p
@ARTICLE{e83-d_11_1996,
author={Soon-Ho JUNG and Hyunsoo YOON},
journal={IEICE TRANSACTIONS on Information},
title={Binary Second-Order Recurrent Neural Networks for Inferring Regular Grammars},
year={2000},
volume={E83-D},
number={11},
pages={1996-2007},
abstract={This paper proposes binary second-order recurrent neural networks (BSRNN) equivalent to modified finite automata (MFA) and presents a learning algorithm for constructing a stable BSRNN for inferring regular grammars. The network combines two trends: one is to transform the strings of a regular grammar into a recurrent neural network through training, with no restriction on the number of neurons, the number of strings, or the length of strings; the other is to transform the network directly into a finite automaton. Since the neurons in the BSRNN employ a hard-limiter activation function, the proposed BSRNN can be a good alternative for the hardware implementation of regular grammars and finite automata as well as for grammatical inference.},
keywords={},
doi={},
ISSN={},
month={November},}
TY - JOUR
TI - Binary Second-Order Recurrent Neural Networks for Inferring Regular Grammars
T2 - IEICE TRANSACTIONS on Information
SP - 1996
EP - 2007
AU - Soon-Ho JUNG
AU - Hyunsoo YOON
PY - 2000
DO -
JO - IEICE TRANSACTIONS on Information
SN -
VL - E83-D
IS - 11
JA - IEICE TRANSACTIONS on Information
Y1 - November 2000
AB - This paper proposes binary second-order recurrent neural networks (BSRNN) equivalent to modified finite automata (MFA) and presents a learning algorithm for constructing a stable BSRNN for inferring regular grammars. The network combines two trends: one is to transform the strings of a regular grammar into a recurrent neural network through training, with no restriction on the number of neurons, the number of strings, or the length of strings; the other is to transform the network directly into a finite automaton. Since the neurons in the BSRNN employ a hard-limiter activation function, the proposed BSRNN can be a good alternative for the hardware implementation of regular grammars and finite automata as well as for grammatical inference.
ER -