1991 | DAVID W. AHA, DENNIS KIBLER, MARC K. ALBERT
Instance-based learning (IBL) algorithms use specific stored instances, rather than maintained abstractions, to generate classification predictions. These algorithms, exemplified by the nearest neighbor method, store training instances but can reduce storage requirements with minimal loss in accuracy. The paper discusses IBL's application to supervised learning, emphasizing its support for incremental learning and its behavior on noisy data. IBL algorithms such as IB1 and IB2 are compared with decision tree algorithms; IB2 reduces storage by saving only misclassified instances. IB2 is sensitive to noise, however, which motivated IB3, an extension that applies a significance test to filter noisy instances from the concept description. Empirical studies show that IBL performs well on noise-free databases but struggles with noisy data. The paper concludes that IBL is effective across a variety of domains but requires such modifications to handle noise, with IB3 offering improved noise tolerance.
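The storage-reduction idea behind IB2 described above can be sketched in a few lines: an instance is added to the stored set only when the instances stored so far misclassify it. This is a minimal illustrative sketch, not the paper's implementation; it assumes numeric features, Euclidean distance, and single-nearest-neighbor classification, and the function and variable names are the author's own.

```python
import math

def ib2_train(instances):
    """IB2-style storage reduction (illustrative sketch):
    store an instance only when the currently stored instances
    misclassify it; the first instance seen is always stored.
    Assumes numeric feature tuples and Euclidean distance."""
    stored = []
    for features, label in instances:
        if not stored:
            stored.append((features, label))
            continue
        # Classify by the single nearest stored neighbor.
        nearest = min(stored, key=lambda s: math.dist(s[0], features))
        if nearest[1] != label:
            # Misclassified: keep this instance in the concept description.
            stored.append((features, label))
    return stored

# Example: two well-separated clusters. After one instance of each
# class is stored, the remaining instances are classified correctly
# and discarded, so only 2 of the 5 instances are retained.
data = [((0.0, 0.0), "a"), ((5.0, 5.0), "b"),
        ((0.1, 0.2), "a"), ((5.1, 4.9), "b"), ((0.2, 0.1), "a")]
concept = ib2_train(data)
```

Because a hypothetical noisy instance would typically be misclassified by its neighbors, this same rule is what makes IB2 prone to storing noise, which is the weakness IB3's significance test addresses.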