This paper presents a novel neuro-symbolic learning framework that improves the integration of neural network learning and symbolic reasoning through a softened symbol grounding process. The framework addresses the challenge of symbol grounding, which is crucial for connecting the neural world (continuous, stochastic) with the symbolic world (discrete, deterministic). The key innovations include modeling the states of symbol solutions as a Boltzmann distribution, a new MCMC technique that combines projection with SMT solvers for efficient sampling, and an annealing mechanism for escaping sub-optimal groundings. The framework optimizes the joint distribution over neural network parameters and symbol groundings, enabling more effective and efficient learning. Experiments on three tasks (handwritten formula evaluation, visual Sudoku classification, and shortest-path search) demonstrate that the proposed method outperforms existing approaches in both symbol accuracy and calculation accuracy. The MCMC sampling with projection overcomes connectivity barriers in the solution space, while the annealing strategy gradually drives the softened distribution toward a deterministic symbol grounding. Theoretical analysis shows that the approach converges to an approximate stationary point; although the gradient estimate is biased, the bias is offset within the stochastic gradient descent procedure. The framework is effective across a range of neuro-symbolic learning tasks, including weakly supervised settings, and yields a generalizable mapping from raw inputs to latent symbols. The results highlight the effectiveness of softened symbol grounding in bridging the gap between neural learning and symbolic reasoning.
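The core sampling-with-annealing idea described above can be sketched in a few lines. The following is a minimal, self-contained illustration only: the two-digit task, the constraint a + b == 7, and the hand-written energy function are hypothetical stand-ins, not the paper's actual formulation, and this plain Metropolis chain omits the projection and SMT-solver machinery the paper uses to cross connectivity barriers in the solution space.

```python
import math
import random

def anneal_sample(energy, init, neighbors, steps=3000, t_start=2.0, t_end=0.05, seed=0):
    """Metropolis sampling from the Boltzmann distribution p(z) ∝ exp(-energy(z)/T).

    The temperature T is annealed geometrically from t_start to t_end, so the
    chain explores broadly at first and then concentrates on low-energy
    (constraint-satisfying) symbol groundings, mimicking the gradual collapse
    of a softened grounding into a deterministic one.
    """
    rng = random.Random(seed)
    z = init
    decay = (t_end / t_start) ** (1.0 / steps)
    t = t_start
    for _ in range(steps):
        cand = rng.choice(neighbors(z))          # propose a nearby grounding
        delta = energy(cand) - energy(z)
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            z = cand                             # Metropolis accept rule
        t *= decay                               # cool the distribution
    return z

# Hypothetical toy grounding task: the latent symbols are two digits (a, b)
# that must satisfy the symbolic constraint a + b == 7, while the "network"
# softly prefers a ≈ 3. Energy = constraint violation + network disagreement.
def energy(z):
    a, b = z
    return abs(a + b - 7) + 0.5 * abs(a - 3)

def neighbors(z):
    a, b = z
    return [((a + 1) % 10, b), ((a - 1) % 10, b),
            (a, (b + 1) % 10), (a, (b - 1) % 10)]

z_final = anneal_sample(energy, init=(9, 9), neighbors=neighbors)
```

At high temperature the chain accepts many constraint-violating moves, which is what "softening" buys; as the temperature drops, the Boltzmann distribution sharpens and the final sample lands on a grounding that satisfies the constraint.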