ParaHome is a system for parameterizing the detailed 3D movements of human bodies, hands, and objects during human-object interactions in a home environment. Its capture setup combines 70 synchronized RGB cameras with wearable motion capture devices, including an IMU-based body suit and hand motion capture gloves, and adds post-processing techniques to improve tracking accuracy and recover from tracking failures. This combination of hardware and software overcomes the main challenges of in-room capture, notably severe occlusions and motion occurring at multiple scales, enabling the high-quality 3D data needed to train and evaluate models of human-object interaction.

Using this setup, the authors collect a large-scale dataset of human-object interactions comprising 30 participants, 22 objects, 101 scenarios, and 440 minutes of sequences. The dataset captures 3D body and hand motion together with articulated object movement, accompanied by detailed descriptions of interactions involving multiple objects across varied scenarios. By representing human and object motion in a common parameterized 3D space, it exposes the nuanced relationship between human movements and the resulting object movements.

The dataset is publicly available and is intended to support research on generative modeling of human-object interactions in a real-world room setting. It addresses limitations of existing datasets by capturing more natural interactions, including articulated objects and multiple objects within a single room environment.
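To make the idea of a common parameterized 3D space concrete, here is a minimal sketch of what a per-frame representation of body, hand, and articulated object state could look like. All field names, dimensions, and the example values are illustrative assumptions, not the actual ParaHome data format.

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class ArticulatedObjectState:
    # Hypothetical object state: a rigid base pose plus one scalar
    # parameter per articulated part (e.g. a door hinge angle).
    name: str
    root_pose: np.ndarray     # 4x4 rigid transform of the object base
    joint_params: np.ndarray  # (num_parts,) hinge angles / slide offsets

@dataclass
class InteractionFrame:
    # Hypothetical human state: per-joint rotations for the body and
    # both hands, alongside the states of all objects in the scene.
    body_pose: np.ndarray     # (num_body_joints, 3) axis-angle rotations
    hand_pose: np.ndarray     # (2, num_hand_joints, 3) left/right hands
    objects: list = field(default_factory=list)

# Example frame: a person mid-way through opening a cabinet door.
frame = InteractionFrame(
    body_pose=np.zeros((22, 3)),
    hand_pose=np.zeros((2, 15, 3)),
    objects=[ArticulatedObjectState(
        name="cabinet",
        root_pose=np.eye(4),
        joint_params=np.array([0.6]),  # door opened by 0.6 rad
    )],
)
print(frame.objects[0].joint_params[0])  # 0.6
```

Parameterizing object motion as a base pose plus low-dimensional joint parameters, rather than raw point clouds or meshes, is what makes articulated object state tractable for generative models alongside human pose.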
The dataset is expected to contribute to advancements in generative modeling and understanding of human-object interactions in real-world environments.