This paper compares three approaches to multiclass learning problems: direct multiclass methods (using decision trees), the one-per-class approach, and the meaningful distributed output approach. It introduces a technique based on error-correcting output codes (ECOC) and shows that this method improves generalization performance on a wide range of multiclass tasks.

The ECOC approach uses distributed output representations: each class is assigned a unique binary codeword, and during training a binary function is learned for each bit position of the codewords. To classify a new example, the outputs of these functions are concatenated into a bit string, which is then compared to the codewords; the class whose codeword is nearest is chosen.

The paper evaluates the ECOC approach against the three existing methods on several data sets, including the glass, vowel, soybean, audiologyS, ISOLET, letter, and NETtalk data sets, and finds that it performs better than the other methods in most cases. It further shows that the ECOC approach is robust to changes in training sample size, codeword assignment, and overfitting-avoidance (pruning) techniques, and that it provides reliable class probability estimates. Finally, the paper addresses an open question: why the errors made in different bit positions are somewhat independent, a property that is important for the effectiveness of the ECOC approach.
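The codeword-decoding step described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the four 7-bit codewords and the predicted bit string below are invented for the example, and the per-bit binary classifiers are assumed to have already been trained and applied.

```python
# Hypothetical ECOC code for a 4-class problem: each class gets a unique
# 7-bit codeword. These codewords are illustrative; any code whose minimum
# pairwise Hamming distance is d can correct floor((d - 1) / 2) bit errors.
CODEWORDS = {
    0: (0, 0, 0, 0, 0, 0, 0),
    1: (0, 1, 1, 1, 1, 0, 0),
    2: (1, 0, 1, 1, 0, 1, 0),
    3: (1, 1, 0, 1, 0, 0, 1),
}

def hamming(a, b):
    """Number of bit positions in which two bit strings differ."""
    return sum(x != y for x, y in zip(a, b))

def decode(output_bits):
    """Return the class whose codeword is nearest the predicted bit string.

    output_bits is the concatenation of the outputs of the per-bit binary
    classifiers on one example.
    """
    return min(CODEWORDS, key=lambda c: hamming(CODEWORDS[c], output_bits))

# One of the seven binary classifiers errs (bit 4 is flipped relative to
# class 2's codeword), yet decoding still recovers class 2, since the code
# above has minimum distance 4 and so tolerates any single-bit error.
print(decode((1, 0, 1, 1, 1, 1, 0)))
```

This error-correcting behavior is the point of the method: as long as fewer than half of the code's minimum distance in bit positions are misclassified, the nearest codeword is still the correct one.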
The paper also discusses the relationship between the ECOC approach and other machine learning methods, such as ensemble methods, and suggests that further research is needed to understand its underlying principles.