Copyright notice
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Junbo CHEN, Bo ZHOU, Xinyu WANG, Yiqun DING, Lu CHEN, "Mining Noise-Tolerant Frequent Closed Itemsets in Very Large Database" in IEICE TRANSACTIONS on Information,
vol. E92-D, no. 8, pp. 1523-1533, August 2009, doi: 10.1587/transinf.E92.D.1523.
Abstract: Frequent Itemsets (FI) mining is a popular and important first step in analyzing datasets across a broad range of applications. There are two main problems with the traditional approach to finding frequent itemsets. First, it often derives an undesirably huge set of frequent itemsets and association rules. Second, it is vulnerable to noise. Two approaches have been proposed to address these problems individually. The first problem is addressed by Frequent Closed Itemsets (FCI), which removes all redundant information from the result while guaranteeing no information loss. The second problem is addressed by Approximate Frequent Itemsets (AFI), which can identify and correct noise in the datasets. Each of these two concepts has its own limitations; however, the authors find that when FCI and AFI are put together, they help each other overcome those limitations and amplify their advantages. The new integrated approach is termed Noise-tolerant Frequent Closed Itemset (NFCI). The experimental results demonstrate the advantages of the new approach: (1) it is noise tolerant; (2) the number of itemsets generated is dramatically reduced with almost no information loss except for the noise and the infrequent patterns; (3) hence, it is both time and space efficient; and (4) the result contains no redundant information.
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.E92.D.1523/_p
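The abstract contrasts plain frequent itemsets (FI), frequent closed itemsets (FCI), and noise-tolerant (approximate) support in the spirit of AFI. The minimal Python sketch below illustrates these three notions on a toy transaction database; it is not the authors' NFCI algorithm, and the data, thresholds (min_support, max_missing), and function names are illustrative assumptions only.

# Toy sketch (not the paper's NFCI algorithm): contrasts exact frequent
# itemsets (FI), frequent closed itemsets (FCI), and a relaxed,
# noise-tolerant support count in the spirit of AFI.
from itertools import combinations

# Small 0/1 transaction database: each row is the set of items it contains.
transactions = [
    {"a", "b", "c"},
    {"a", "b", "c"},
    {"a", "b"},        # 'c' possibly missing due to noise
    {"a", "b", "c", "d"},
    {"d"},
]
items = sorted(set().union(*transactions))
min_support = 3          # absolute support threshold (assumed)
max_missing = 1          # noise tolerance: at most 1 missing item per supporting row (assumed)

def support(itemset):
    """Exact support: rows containing every item of the itemset."""
    return sum(1 for t in transactions if itemset <= t)

def noise_tolerant_support(itemset):
    """Relaxed support: rows missing at most `max_missing` items of the itemset."""
    return sum(1 for t in transactions if len(itemset - t) <= max_missing)

# Enumerate all frequent itemsets by brute force (fine for a toy example).
frequent = {
    frozenset(c): support(set(c))
    for k in range(1, len(items) + 1)
    for c in combinations(items, k)
    if support(set(c)) >= min_support
}

# An itemset is closed if no proper superset has the same support.
closed = {
    iset: sup
    for iset, sup in frequent.items()
    if not any(iset < other and sup == frequent[other] for other in frequent)
}

print("frequent itemsets:", {tuple(sorted(i)): s for i, s in frequent.items()})
print("closed itemsets:  ", {tuple(sorted(i)): s for i, s in closed.items()})
print("noise-tolerant support of {a,b,c}:", noise_tolerant_support({"a", "b", "c"}))

On this toy data, seven itemsets are frequent but only {a, b} and {a, b, c} are closed, so reporting closed itemsets alone already shrinks the output without losing any support information; the relaxed count shows how one noisy row (the third transaction, which is missing c) can be absorbed rather than breaking the {a, b, c} pattern.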
@ARTICLE{e92-d_8_1523,
author={Junbo CHEN and Bo ZHOU and Xinyu WANG and Yiqun DING and Lu CHEN},
journal={IEICE TRANSACTIONS on Information},
title={Mining Noise-Tolerant Frequent Closed Itemsets in Very Large Database},
year={2009},
volume={E92-D},
number={8},
pages={1523-1533},
abstract={Frequent Itemsets (FI) mining is a popular and important first step in analyzing datasets across a broad range of applications. There are two main problems with the traditional approach to finding frequent itemsets. First, it often derives an undesirably huge set of frequent itemsets and association rules. Second, it is vulnerable to noise. Two approaches have been proposed to address these problems individually. The first problem is addressed by Frequent Closed Itemsets (FCI), which removes all redundant information from the result while guaranteeing no information loss. The second problem is addressed by Approximate Frequent Itemsets (AFI), which can identify and correct noise in the datasets. Each of these two concepts has its own limitations; however, the authors find that when FCI and AFI are put together, they help each other overcome those limitations and amplify their advantages. The new integrated approach is termed Noise-tolerant Frequent Closed Itemset (NFCI). The experimental results demonstrate the advantages of the new approach: (1) it is noise tolerant; (2) the number of itemsets generated is dramatically reduced with almost no information loss except for the noise and the infrequent patterns; (3) hence, it is both time and space efficient; and (4) the result contains no redundant information.},
keywords={},
doi={10.1587/transinf.E92.D.1523},
ISSN={1745-1361},
month={August},}
TY - JOUR
TI - Mining Noise-Tolerant Frequent Closed Itemsets in Very Large Database
T2 - IEICE TRANSACTIONS on Information
SP - 1523
EP - 1533
AU - Junbo CHEN
AU - Bo ZHOU
AU - Xinyu WANG
AU - Yiqun DING
AU - Lu CHEN
PY - 2009
DO - 10.1587/transinf.E92.D.1523
JO - IEICE TRANSACTIONS on Information
SN - 1745-1361
VL - E92-D
IS - 8
JA - IEICE TRANSACTIONS on Information
Y1 - August 2009
AB - Frequent Itemsets (FI) mining is a popular and important first step in analyzing datasets across a broad range of applications. There are two main problems with the traditional approach to finding frequent itemsets. First, it often derives an undesirably huge set of frequent itemsets and association rules. Second, it is vulnerable to noise. Two approaches have been proposed to address these problems individually. The first problem is addressed by Frequent Closed Itemsets (FCI), which removes all redundant information from the result while guaranteeing no information loss. The second problem is addressed by Approximate Frequent Itemsets (AFI), which can identify and correct noise in the datasets. Each of these two concepts has its own limitations; however, the authors find that when FCI and AFI are put together, they help each other overcome those limitations and amplify their advantages. The new integrated approach is termed Noise-tolerant Frequent Closed Itemset (NFCI). The experimental results demonstrate the advantages of the new approach: (1) it is noise tolerant; (2) the number of itemsets generated is dramatically reduced with almost no information loss except for the noise and the infrequent patterns; (3) hence, it is both time and space efficient; and (4) the result contains no redundant information.
ER -