This paper introduces GenBaB, a general framework for neural network (NN) verification that applies branch-and-bound (BaB) to general nonlinearities. GenBaB extends BaB beyond ReLU to activations such as Sigmoid, Tanh, Sine, and GeLU, as well as multi-dimensional operations such as multiplication and division. The framework uses linear bound propagation to compute verified bounds and introduces a new branching heuristic, "Bound Propagation with Shortcuts" (BBPS), which efficiently estimates the potential bound improvement of each candidate branching decision. It also pre-optimizes branching points offline to further improve verification efficiency.
GenBaB is integrated into the latest α,β-CROWN, the winner of the 4th International Verification of Neural Networks Competition (VNN-COMP 2023). The framework is evaluated on a range of models, including feedforward networks with diverse activation functions, LSTMs, Vision Transformers (ViTs), and networks for AC Optimal Power Flow (ACOPF). Results show that GenBaB significantly improves verification accuracy, especially for models with strong nonlinearities, outperforming existing baselines and enabling verification of general computational graphs beyond simple feedforward networks. The method handles a wide range of nonlinearities and demonstrates practical value in real-world applications.
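To illustrate the core idea of branch-and-bound over a general nonlinearity, the toy sketch below bounds sin(x) on an input interval using a simple chord-based linear relaxation, then splits the domain at the midpoint and recurses. This is only a minimal illustration of the BaB principle: the relaxation (chord ± a curvature error term) and the midpoint branching rule are assumptions for this example, not the optimized per-neuron relaxations, BBPS heuristic, or pre-optimized branching points that GenBaB actually uses.

```python
import math

def chord_relaxation(a, b):
    """Sound linear bounds for sin(x) on [a, b]: the chord through the
    endpoints, shifted down/up by the interpolation error bound
    (b - a)^2 / 8, valid because |sin''(x)| <= 1.  This generic
    relaxation is for illustration only; CROWN-style verifiers choose
    tighter tangent/secant lines per sub-domain."""
    slope = (math.sin(b) - math.sin(a)) / (b - a)
    intercept = math.sin(a) - slope * a
    err = (b - a) ** 2 / 8.0
    return (slope, intercept - err), (slope, intercept + err)

def lower_bound(a, b, depth):
    """Branch-and-bound lower bound on min_x sin(x) over [a, b].
    A linear function attains its minimum at an interval endpoint, so
    the sub-domain bound is cheap; branching (here: always at the
    midpoint) shrinks the relaxation error quadratically."""
    (sl, il), _ = chord_relaxation(a, b)
    bound = min(sl * a + il, sl * b + il)
    if depth == 0:
        return bound
    mid = (a + b) / 2.0
    return min(lower_bound(a, mid, depth - 1),
               lower_bound(mid, b, depth - 1))

# Branching tightens the verified bound toward the true minimum,
# which is sin(4) on [0, 4]:
print(lower_bound(0.0, 4.0, 0))  # loose bound without branching
print(lower_bound(0.0, 4.0, 4))  # much tighter after 4 levels of branching
```

The same pattern generalizes to networks: each branch restricts the pre-activation range of one nonlinearity, the relaxation tightens on the restricted range, and bound propagation recomputes verified output bounds per branch.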