Copyright notice
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Peter GECZY, Shiro USUI, "Superlinear Conjugate Gradient Method with Adaptable Step Length and Constant Momentum Term" in IEICE TRANSACTIONS on Fundamentals,
vol. E83-A, no. 11, pp. 2320-2328, November 2000.
Abstract: First order line search optimization techniques gained essential practical importance over second order optimization techniques due to their computational simplicity and low memory requirements. The computational excess of second order methods becomes unbearable for large optimization tasks. The only applicable optimization techniques in such cases are variations of first order approaches. This article presents one such variation of a first order line search optimization technique. The presented algorithm substantially simplifies the line search subproblem into a single-step calculation of the appropriate value of the step length. This remarkably simplifies the implementation and computational complexity of the line search subproblem, yet does not harm the stability of the method. The algorithm is theoretically proven convergent, with superlinear convergence rates, and exactly classified within the formerly proposed classification framework for first order optimization. Performance of the proposed algorithm is practically evaluated on five data sets and compared to the relevant standard first order optimization technique. The results indicate superior performance of the presented algorithm over the standard first order method.
URL: https://global.ieice.org/en_transactions/fundamentals/10.1587/e83-a_11_2320/_p
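For orientation, here is a minimal Python sketch of the kind of update the abstract describes: a conjugate-gradient-style first order iteration in which the momentum coefficient is a fixed constant and the step length comes from a single closed-form calculation rather than an iterative line search. The function name, the default constants, and the Barzilai-Borwein step-length rule are illustrative assumptions of this sketch; the paper derives its own single-step rule, which is not reproduced here.

import numpy as np

def cg_constant_momentum(grad, x0, alpha0=0.1, beta=0.2, tol=1e-6, max_iter=1000):
    # Illustrative sketch only: a first order, conjugate-gradient-style update
    # with a CONSTANT momentum coefficient `beta` and a step length obtained
    # in a single closed-form computation instead of an iterative line search.
    # The Barzilai-Borwein rule below is a stand-in heuristic, NOT the exact
    # step-length formula derived in the paper.
    x = np.asarray(x0, dtype=float).copy()
    g = grad(x)
    d = -g                       # start along the steepest descent direction
    alpha = alpha0
    for _ in range(max_iter):
        x_new = x + alpha * d    # one step; no inner line search loop
        g_new = grad(x_new)
        if np.linalg.norm(g_new) < tol:
            return x_new
        # Single-step step-length update (Barzilai-Borwein heuristic).
        s, y = x_new - x, g_new - g
        denom = float(s @ y)
        alpha = float(s @ s) / denom if denom > 1e-12 else alpha0
        # Conjugate-gradient-style direction with a constant momentum term,
        # replacing the usual Fletcher-Reeves / Polak-Ribiere coefficient.
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage: minimize the well-conditioned quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.diag([1.0, 2.0, 4.0])
b = np.ones(3)
x_star = cg_constant_momentum(lambda x: A @ x - b, np.zeros(3))

The single-step step-length computation is what removes the inner line search loop; the constant beta is what distinguishes such a method from classical conjugate gradient, where the momentum coefficient is recomputed at every iteration.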
@ARTICLE{e83-a_11_2320,
author={Peter GECZY and Shiro USUI},
journal={IEICE TRANSACTIONS on Fundamentals},
title={Superlinear Conjugate Gradient Method with Adaptable Step Length and Constant Momentum Term},
year={2000},
volume={E83-A},
number={11},
pages={2320-2328},
abstract={First order line search optimization techniques gained essential practical importance over second order optimization techniques due to their computational simplicity and low memory requirements. The computational excess of second order methods becomes unbearable for large optimization tasks. The only applicable optimization techniques in such cases are variations of first order approaches. This article presents one such variation of a first order line search optimization technique. The presented algorithm substantially simplifies the line search subproblem into a single-step calculation of the appropriate value of the step length. This remarkably simplifies the implementation and computational complexity of the line search subproblem, yet does not harm the stability of the method. The algorithm is theoretically proven convergent, with superlinear convergence rates, and exactly classified within the formerly proposed classification framework for first order optimization. Performance of the proposed algorithm is practically evaluated on five data sets and compared to the relevant standard first order optimization technique. The results indicate superior performance of the presented algorithm over the standard first order method.},
keywords={},
doi={},
ISSN={},
month={November},}
TY - JOUR
TI - Superlinear Conjugate Gradient Method with Adaptable Step Length and Constant Momentum Term
T2 - IEICE TRANSACTIONS on Fundamentals
SP - 2320
EP - 2328
AU - Peter GECZY
AU - Shiro USUI
PY - 2000
DO -
JO - IEICE TRANSACTIONS on Fundamentals
SN -
VL - E83-A
IS - 11
JA - IEICE TRANSACTIONS on Fundamentals
Y1 - 2000/11//
AB - First order line search optimization techniques gained essential practical importance over second order optimization techniques due to their computational simplicity and low memory requirements. The computational excess of second order methods becomes unbearable for large optimization tasks. The only applicable optimization techniques in such cases are variations of first order approaches. This article presents one such variation of a first order line search optimization technique. The presented algorithm substantially simplifies the line search subproblem into a single-step calculation of the appropriate value of the step length. This remarkably simplifies the implementation and computational complexity of the line search subproblem, yet does not harm the stability of the method. The algorithm is theoretically proven convergent, with superlinear convergence rates, and exactly classified within the formerly proposed classification framework for first order optimization. Performance of the proposed algorithm is practically evaluated on five data sets and compared to the relevant standard first order optimization technique. The results indicate superior performance of the presented algorithm over the standard first order method.
ER -