Jinmao Wei, Wei-Guo Yi, Ming-Yang Wang, Shuqin Wang
{"title":"面向数据特征的数据划分与加权数据挖掘","authors":"Jinmao Wei, Wei-Guo Yi, Ming-Yang Wang, Shuqin Wang","doi":"10.1109/ICIA.2004.1373371","DOIUrl":null,"url":null,"abstract":"It is comprehensible that to find as much interesting knowledge as possible is the initial and main aim to mine data, no matter which pattern (parallel or sequential) is utilized in data mining, though parallelism is practically important as well. We present a principle, called DFDP, for partitioning large dataset-the first step for parallelization. Data subsets after partitioning are treated tendentiously for possible parallel or distributed processing. One feasible logical structure for parallel processing is recommended in the paper. Also experimental comparisons are reported in the paper, which shows that weighted data mining will find more interesting rules from data.","PeriodicalId":297178,"journal":{"name":"International Conference on Information Acquisition, 2004. Proceedings.","volume":"3 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2004-06-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Data feature oriented data partition and weighted data mining\",\"authors\":\"Jinmao Wei, Wei-Guo Yi, Ming-Yang Wang, Shuqin Wang\",\"doi\":\"10.1109/ICIA.2004.1373371\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"It is comprehensible that to find as much interesting knowledge as possible is the initial and main aim to mine data, no matter which pattern (parallel or sequential) is utilized in data mining, though parallelism is practically important as well. We present a principle, called DFDP, for partitioning large dataset-the first step for parallelization. Data subsets after partitioning are treated tendentiously for possible parallel or distributed processing. One feasible logical structure for parallel processing is recommended in the paper. 
Also experimental comparisons are reported in the paper, which shows that weighted data mining will find more interesting rules from data.\",\"PeriodicalId\":297178,\"journal\":{\"name\":\"International Conference on Information Acquisition, 2004. Proceedings.\",\"volume\":\"3 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2004-06-21\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Conference on Information Acquisition, 2004. Proceedings.\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICIA.2004.1373371\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Conference on Information Acquisition, 2004. Proceedings.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICIA.2004.1373371","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Data feature oriented data partition and weighted data mining
Regardless of which processing pattern (parallel or sequential) is used, the initial and main aim of data mining is to find as much interesting knowledge as possible, although parallelism is of practical importance as well. We present a principle, called DFDP, for partitioning large datasets, which is the first step toward parallelization. After partitioning, the data subsets are treated preferentially for possible parallel or distributed processing. A feasible logical structure for parallel processing is recommended, and experimental comparisons are reported which show that weighted data mining finds more interesting rules in the data.
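The abstract does not spell out the weighting scheme, but weighted data mining is commonly formulated by counting support with per-transaction (or per-partition) weights instead of raw counts. The following is a minimal illustrative sketch under that assumption; the weight values and data are hypothetical, not taken from the paper.

```python
# Hypothetical sketch of weighted support counting for association rule
# mining. The paper's DFDP partitioning principle and its exact weighting
# scheme are not given here, so the weights below are illustrative.

def weighted_support(transactions, weights, itemset):
    """Weighted support: the summed weight of transactions containing the
    itemset, divided by the total weight (a common weighted-mining form)."""
    itemset = set(itemset)
    total = sum(weights)
    hit = sum(w for t, w in zip(transactions, weights) if itemset <= set(t))
    return hit / total

transactions = [
    ["a", "b", "c"],
    ["a", "c"],
    ["b", "d"],
    ["a", "b", "c", "d"],
]
# Assumed per-transaction weights, e.g. inherited from the data subset
# (partition) each transaction belongs to.
weights = [2.0, 1.0, 1.0, 2.0]

print(weighted_support(transactions, weights, ["a", "c"]))  # 5/6 ≈ 0.833
```

With uniform weights this reduces to ordinary support; skewing weights toward partitions judged more informative lets rules from those subsets clear the support threshold, which is one way "more interesting rules" can surface.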