Cluster Interfaced Objective Function for Decision Tree Classifiers for Mining Data with Uncertainty
S. Chidambaranathan
S. Chidambaranathan, Department of MCA, St. Xavier’s College, Palayamkottai, Tamil Nadu, India.
Manuscript received on 15 April 2019 | Revised Manuscript received on 22 April 2019 | Manuscript Published on 26 July 2019 | PP: 1554-1562 | Volume-8 Issue-6S4 April 2019 | Retrieval Number: F13140486S419/19©BEIESP | DOI: 10.35940/ijitee.F1314.0486S419
© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open-access article under the CC-BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)
Abstract: Ordinary decision tree classifiers work with known and certain data values. With modern data collection techniques, however, a considerable portion of the measurements is uncertain. In almost all applications, these uncertain attributes strongly influence the data set during classification and decision tree construction, so uncertainty must be handled carefully. Uncertainty arises from data staleness, repeated measurements, and measurement and quantization errors. The uncertainty of a data item is represented in terms of different functions. Usually, uncertain data are abstracted using statistical derivatives (e.g., mean, standard deviation, median, etc.). Using the complete information about a data item, such as its Probability Density Function (PDF), improves the accuracy of the decision tree classifier. In this paper, the proposed work enhances the pruning of the decision tree classifier algorithm by clustering with distance bounds and partitioning of uncertain attribute values. The clustering methods increase the speed of decision tree construction and reduce the pruning time to a great extent. The distance-bound clustering technique works on the criterion of the lower- and upper-bound distances of the uncertain attribute values. Partitioning is performed with an objective function defined over the probability distribution, based on its density levels; the introduced objective function evaluates the discrete value of the uncertain data item. Experiments are planned to evaluate performance on heart disease diagnosis and prediction using UCI repository data sets.
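To make the idea concrete, the sketch below illustrates, under stated assumptions, how an uncertain attribute value might be stored with its lower/upper bounds and a discretised PDF, how items could be grouped by bound distances, and how a candidate split could be scored. The names (UncertainValue, bound_distance, split_objective) and the expected-entropy objective are hypothetical stand-ins for the paper's density-based objective function, not the authors' actual formulation.

```python
# Illustrative sketch only (not the paper's exact algorithm): an uncertain
# attribute is kept as an interval [lo, hi] plus a discretised PDF, items are
# compared by lower/upper-bound distances, and a stand-in objective (expected
# entropy of the fractional partition) scores a candidate split point.
from dataclasses import dataclass
from math import log2
from typing import List, Tuple


@dataclass
class UncertainValue:
    lo: float                       # lower bound of the uncertain attribute
    hi: float                       # upper bound of the uncertain attribute
    pdf: List[Tuple[float, float]]  # (sample point, probability) pairs summing to 1.0


def bound_distance(a: UncertainValue, b: UncertainValue) -> float:
    """Grouping criterion based on the gap between lower and upper bounds."""
    return max(abs(a.lo - b.lo), abs(a.hi - b.hi))


def prob_below(v: UncertainValue, split: float) -> float:
    """Probability mass of v falling at or below the candidate split point."""
    return sum(p for x, p in v.pdf if x <= split)


def split_objective(values: List[UncertainValue], labels: List[int], split: float) -> float:
    """Expected entropy of the two fractional partitions induced by `split`
    (lower is better); a hypothetical stand-in for the density-based objective."""
    left, right = {}, {}
    for v, y in zip(values, labels):
        w = prob_below(v, split)               # fraction of the item sent left
        left[y] = left.get(y, 0.0) + w
        right[y] = right.get(y, 0.0) + (1.0 - w)

    def entropy(counts):
        total = sum(counts.values())
        if total == 0.0:
            return 0.0
        return -sum((c / total) * log2(c / total) for c in counts.values() if c > 0)

    n = float(len(values))
    return (sum(left.values()) / n) * entropy(left) + (sum(right.values()) / n) * entropy(right)
```

In this sketch a split point with a smaller objective value spreads less class impurity across the two fractional partitions, which is one plausible way an objective function over PDFs could guide construction and pruning of the tree.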
Keywords: Objective Function, Decision Tree Classifiers, Cluster Interface, Uncertainty.
Scope of the Article: Classification