
Physics and Sensor Simulation

This chapter focuses on creating realistic physics simulation and sensor models for humanoid robots in Gazebo. Proper physics and sensor simulation are critical for developing and testing humanoid robot behaviors before deployment on real hardware.

Learning Objectives

By the end of this chapter, you will be able to:

  • Configure realistic physics parameters for humanoid robots
  • Model and simulate various types of sensors (cameras, IMUs, force/torque)
  • Create custom sensor noise models for realistic behavior
  • Tune physics parameters for stable humanoid simulation
  • Validate sensor simulation against real hardware characteristics
  • Implement advanced sensor fusion techniques in simulation

Physics Simulation Fundamentals

Understanding the Physics Pipeline

Physics simulation in Gazebo advances the world in small discrete steps. Each step runs the following pipeline (the toy example after the list makes the stages concrete):

  1. Force Application: Joint torques, gravity, external forces
  2. Integration: Computing motion from forces using numerical integration
  3. Collision Detection: Identifying contacting bodies
  4. Contact Resolution: Computing contact forces and responses
  5. State Update: Updating positions and velocities
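
To see the stages in order, here is a self-contained toy integrator for a single point mass falling onto a ground plane. It is an illustration of the pipeline, not Gazebo's actual solver:

# Minimal illustration of the five pipeline stages for a point mass
# bouncing on a ground plane (semi-implicit Euler integration).
def step(z, vz, dt, mass=1.0, g=9.81, restitution=0.0):
    f = -mass * g                 # 1. Force application (gravity only)
    vz = vz + (f / mass) * dt     # 2. Integration (velocity update)
    z_new = z + vz * dt           #    Integration (position update)
    if z_new < 0.0:               # 3. Collision detection (ground plane at z=0)
        z_new = 0.0               # 4. Contact resolution (clamp + bounce)
        vz = -restitution * vz
    return z_new, vz              # 5. State update

z, vz = 1.0, 0.0
for _ in range(2000):             # 2 s at a 1 ms step, as in Gazebo's default
    z, vz = step(z, vz, dt=0.001)
print(f"z = {z:.3f} m")           # the mass has settled on the ground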

Physics Engine Configuration

<physics type="ode" name="default_physics">
<max_step_size>0.001</max_step_size> <!-- Simulation time step -->
<real_time_factor>1.0</real_time_factor> <!-- Simulation speed -->
<real_time_update_rate>1000</real_time_update_rate> <!-- Hz -->
<gravity>0 0 -9.8</gravity>

<ode>
<solver>
<type>quick</type> <!-- Fast iterative solver -->
<iters>1000</iters> <!-- More iterations = more stable -->
<sor>1.3</sor> <!-- Successive Over-Relaxation parameter -->
</solver>

<constraints>
<cfm>0.0</cfm> <!-- Constraint Force Mixing -->
<erp>0.2</erp> <!-- Error Reduction Parameter -->
<contact_max_correcting_vel>100.0</contact_max_correcting_vel>
<contact_surface_layer>0.001</contact_surface_layer>
</constraints>
</ode>
</physics>
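
Note how the timing parameters interact: the achievable real-time factor is capped at max_step_size times real_time_update_rate, so the values above (0.001 s at 1000 Hz) target exactly real time. A quick sanity check in Python:

# Sanity-check Gazebo timing parameters: the achievable real-time factor
# is capped at max_step_size * real_time_update_rate.
max_step_size = 0.001           # seconds of simulated time per physics step
real_time_update_rate = 1000    # physics steps per second of wall-clock time

achievable_rtf = max_step_size * real_time_update_rate
print(f"Max achievable real-time factor: {achievable_rtf}")  # 1.0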

Physics Parameters for Humanoid Robots

Humanoid robots require special attention to physics parameters due to their complex kinematics and balance requirements:

<physics type="ode" name="humanoid_physics">
<max_step_size>0.0005</max_step_size> <!-- Smaller for stability -->
<real_time_factor>0.5</real_time_factor> <!-- May need slower simulation -->
<real_time_update_rate>2000</real_time_update_rate>
<gravity>0 0 -9.8</gravity>

<ode>
<solver>
<type>quick</type>
<iters>2000</iters> <!-- More iterations for complex joints -->
<sor>1.2</sor>
</solver>

<constraints>
<cfm>0.0001</cfm> <!-- Small CFM for tight constraints -->
<erp>0.1</erp> <!-- Moderate ERP for stability -->
<contact_max_correcting_vel>10.0</contact_max_correcting_vel>
<contact_surface_layer>0.0005</contact_surface_layer>
</constraints>
</ode>
</physics>

Collision Properties and Contact Modeling

Contact Parameters for Different Body Parts

Different parts of a humanoid robot need different contact properties:

<!-- Feet - need high friction for stable walking -->
<link name="left_foot">
  <collision name="foot_collision">
    <geometry>
      <box>
        <size>0.18 0.08 0.02</size>
      </box>
    </geometry>
    <surface>
      <friction>
        <ode>
          <mu>1.0</mu>   <!-- High friction for grip -->
          <mu2>1.0</mu2>
        </ode>
      </friction>
      <contact>
        <ode>
          <kp>1e7</kp>   <!-- High stiffness for solid contact -->
          <kd>100</kd>   <!-- Damping to prevent oscillation -->
          <max_vel>100.0</max_vel>
          <min_depth>0.001</min_depth> <!-- Allowed penetration depth (m) -->
        </ode>
      </contact>
    </surface>
  </collision>
</link>

<!-- Hands - need moderate friction for grasping -->
<link name="left_hand">
  <collision name="hand_collision">
    <geometry>
      <box>
        <size>0.08 0.06 0.04</size>
      </box>
    </geometry>
    <surface>
      <friction>
        <ode>
          <mu>0.8</mu>   <!-- Good for grasping objects -->
          <mu2>0.8</mu2>
        </ode>
      </friction>
      <contact>
        <ode>
          <kp>1e6</kp>   <!-- Medium stiffness -->
          <kd>50</kd>
        </ode>
      </contact>
    </surface>
  </collision>
</link>

<!-- Torso - less critical for friction -->
<link name="torso">
  <collision name="torso_collision">
    <geometry>
      <box>
        <size>0.2 0.15 0.4</size>
      </box>
    </geometry>
    <surface>
      <friction>
        <ode>
          <mu>0.5</mu>
          <mu2>0.5</mu2>
        </ode>
      </friction>
      <contact>
        <ode>
          <kp>1e5</kp>   <!-- Lower stiffness -->
          <kd>20</kd>
        </ode>
      </contact>
    </surface>
  </collision>
</link>
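
Under the hood, ODE converts each kp/kd pair into per-contact ERP and CFM values using the standard relations from the ODE manual: ERP = h*kp / (h*kp + kd) and CFM = 1 / (h*kp + kd), where h is the step size. A small helper makes it easy to see what the values above mean at a given step size:

# Convert spring-damper contact parameters (kp, kd) to ODE's ERP/CFM
# using the standard ODE relations:
#   ERP = h*kp / (h*kp + kd),  CFM = 1 / (h*kp + kd)
def contact_erp_cfm(kp, kd, h):
    denom = h * kp + kd
    return h * kp / denom, 1.0 / denom

# Foot contact from the example above, at the 0.5 ms humanoid step size
erp, cfm = contact_erp_cfm(kp=1e7, kd=100, h=0.0005)
print(f"ERP = {erp:.4f}, CFM = {cfm:.2e}")  # ERP ~0.98, CFM ~2e-4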

Sensor Simulation

Camera Sensors

<link name="head">
<sensor name="rgb_camera" type="camera">
<pose>0.05 0 0.05 0 0 0</pose> <!-- Position in head -->
<camera name="head_camera">
<horizontal_fov>1.047</horizontal_fov> <!-- 60 degrees -->
<image>
<width>640</width>
<height>480</height>
<format>R8G8B8</format>
</image>
<clip>
<near>0.1</near>
<far>10.0</far>
</clip>
<distortion>
<k1>0.0</k1>
<k2>0.0</k2>
<k3>0.0</k3>
<p1>0.0</p1>
<p2>0.0</p2>
<center>0.5 0.5</center>
</distortion>
</camera>
<always_on>1</always_on>
<update_rate>30</update_rate>
<visualize>true</visualize>
</sensor>

<!-- Depth camera -->
<sensor name="depth_camera" type="depth">
<pose>0.07 0 0.05 0 0 0</pose> <!-- Slightly offset -->
<camera name="head_depth_camera">
<horizontal_fov>1.047</horizontal_fov>
<image>
<width>320</width>
<height>240</height>
<format>L8</format>
</image>
<clip>
<near>0.1</near>
<far>5.0</far>
</clip>
</camera>
<always_on>1</always_on>
<update_rate>30</update_rate>
<visualize>false</visualize>
</sensor>
</link>
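
Gazebo derives the pinhole intrinsics from horizontal_fov and the image size, which matters when you compare simulated images against a calibrated real camera. The implied focal length and vertical field of view for the RGB camera above:

import math

# Pinhole intrinsics implied by the SDF camera above:
# fx = (width / 2) / tan(hfov / 2); square pixels are assumed, so fy = fx.
width, height = 640, 480
hfov = 1.047  # rad (~60 deg)

fx = (width / 2) / math.tan(hfov / 2)
vfov = 2 * math.atan((height / 2) / fx)
print(f"fx = fy = {fx:.1f} px, vertical FOV = {math.degrees(vfov):.1f} deg")
# fx = fy = 554.3 px, vertical FOV = 46.8 deg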

IMU Sensors

<link name="imu_link">  <!-- Usually in torso or pelvis -->
<sensor name="imu_sensor" type="imu">
<pose>0 0 0 0 0 0</pose>
<always_on>1</always_on>
<update_rate>100</update_rate>
<imu>
<!-- Angular velocity noise -->
<angular_velocity>
<x>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>2e-4</stddev> <!-- ~0.01 deg/s RMS -->
<bias_mean>0.001</bias_mean>
<bias_stddev>0.0001</bias_stddev>
</noise>
</x>
<y>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>2e-4</stddev>
<bias_mean>0.001</bias_mean>
<bias_stddev>0.0001</bias_stddev>
</noise>
</y>
<z>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>2e-4</stddev>
<bias_mean>0.001</bias_mean>
<bias_stddev>0.0001</bias_stddev>
</noise>
</z>
</angular_velocity>

<!-- Linear acceleration noise -->
<linear_acceleration>
<x>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>1.7e-2</stddev> <!-- ~0.0017g RMS -->
<bias_mean>0.01</bias_mean>
<bias_stddev>0.001</bias_stddev>
</noise>
</x>
<y>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>1.7e-2</stddev>
<bias_mean>0.01</bias_mean>
<bias_stddev>0.001</bias_stddev>
</noise>
</y>
<z>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>1.7e-2</stddev>
<bias_mean>0.01</bias_mean>
<bias_stddev>0.001</bias_stddev>
</noise>
</z>
</linear_acceleration>
</imu>
</sensor>
</link>

Force/Torque Sensors

<link name="left_foot">
<sensor name="ft_sensor" type="force_torque">
<pose>0 0 -0.01 0 0 0</pose> <!-- At foot contact point -->
<always_on>1</always_on>
<update_rate>100</update_rate>
<force_torque>
<frame>child</frame> <!-- Measurements in child link frame -->
<measure_direction>child_to_parent</measure_direction>

<!-- Noise parameters -->
<force>
<x>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>0.5</stddev> <!-- 0.5N RMS -->
</noise>
</x>
<y>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>0.5</stddev>
</noise>
</y>
<z>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>1.0</stddev> <!-- 1.0N RMS (higher for Z axis) -->
</noise>
</z>
</force>

<torque>
<x>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>0.01</stddev> <!-- 0.01Nm RMS -->
</noise>
</x>
<y>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>0.01</stddev>
</noise>
</y>
<z>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>0.01</stddev>
</noise>
</z>
</torque>
</force_torque>
</sensor>
</link>
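
A common use of foot force/torque readings in humanoid control is estimating the center of pressure (CoP) under each foot. For a flat foot with the sensor frame at the sole (z up), the standard relations are p_x = -tau_y / f_z and p_y = tau_x / f_z; a minimal sketch:

# Center of pressure (CoP) under a flat foot from a force/torque reading.
# Assumes the sensor frame sits at the sole with z pointing up; for a sensor
# mounted a height d above the sole, use px = (-ty - fx*d) / fz and
# py = (tx - fy*d) / fz instead.
def center_of_pressure(fx, fy, fz, tx, ty, min_fz=1.0):
    if fz < min_fz:  # foot not loaded; CoP is undefined
        return None
    px = -ty / fz
    py = tx / fz
    return px, py

print(center_of_pressure(fx=0.0, fy=0.0, fz=400.0, tx=4.0, ty=-8.0))
# -> (0.02, 0.01): CoP 2 cm forward and 1 cm left of the sensor origin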

LIDAR Sensors

<link name="head">
<sensor name="lidar" type="ray">
<pose>0.05 0 0.1 0 0 0</pose> <!-- On top of head -->
<ray>
<scan>
<horizontal>
<samples>360</samples>
<resolution>1</resolution>
<min_angle>-3.14159</min_angle> <!-- -π -->
<max_angle>3.14159</max_angle> <!-- π -->
</horizontal>
</scan>
<range>
<min>0.1</min>
<max>10.0</max>
<resolution>0.01</resolution>
</range>
</ray>
<always_on>1</always_on>
<update_rate>10</update_rate>
<visualize>false</visualize>
</sensor>
</link>
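
Downstream code usually converts the polar ranges into Cartesian points in the sensor frame. A minimal conversion, following the field conventions of the ROS sensor_msgs/LaserScan message:

import numpy as np

# Convert a planar laser scan (polar ranges) to 2D points in the sensor frame.
# Field names follow ROS sensor_msgs/msg/LaserScan conventions.
def scan_to_points(ranges, angle_min, angle_increment, range_min, range_max):
    r = np.asarray(ranges, dtype=float)
    angles = angle_min + angle_increment * np.arange(len(r))
    valid = np.isfinite(r) & (r >= range_min) & (r <= range_max)
    return np.column_stack((r[valid] * np.cos(angles[valid]),
                            r[valid] * np.sin(angles[valid])))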

Advanced Physics Tuning for Humanoid Robots

Balance and Stability Considerations

Because a walking humanoid is only marginally stable, the solver needs tighter settings than the general-purpose defaults shown earlier:

<!-- In the world file -->
<physics type="ode" name="humanoid_simulation">
  <max_step_size>0.0005</max_step_size>
  <real_time_factor>0.8</real_time_factor>
  <real_time_update_rate>2000</real_time_update_rate>
  <gravity>0 0 -9.81</gravity>

  <ode>
    <solver>
      <type>quick</type>
      <iters>1500</iters> <!-- High for stability -->
      <sor>1.2</sor>
    </solver>

    <constraints>
      <cfm>1e-5</cfm>  <!-- Very low CFM for tight constraints -->
      <erp>0.05</erp>  <!-- Low ERP for accuracy -->
      <contact_max_correcting_vel>5.0</contact_max_correcting_vel>
      <contact_surface_layer>0.0001</contact_surface_layer>
    </constraints>
  </ode>
</physics>

Joint-Level Physics Parameters

<!-- In the robot SDF (parent/child link names are illustrative) -->
<joint name="left_knee" type="revolute">
  <parent>left_thigh</parent>
  <child>left_lower_leg</child>
  <axis>
    <xyz>0 1 0</xyz>
    <limit>
      <lower>-0.1</lower>    <!-- rad -->
      <upper>2.5</upper>     <!-- rad -->
      <effort>200</effort>   <!-- Nm -->
      <velocity>5</velocity> <!-- rad/s -->
    </limit>
    <dynamics>
      <damping>1.0</damping> <!-- Joint-specific damping -->
      <friction>0.1</friction>
    </dynamics>
  </axis>
</joint>
<!-- In URDF, soft limits can additionally be expressed with
     <safety_controller k_position="10" k_velocity="1"
                        soft_lower_limit="-0.05" soft_upper_limit="2.45"/> -->

Sensor Noise Modeling

Realistic Noise Characteristics

Real sensors exhibit several distinct noise processes (white noise, bias offset, and slow bias drift), and modeling them separately makes simulated data far more realistic:

<!-- Gyroscope with realistic noise -->
<sensor name="gyro" type="imu">
  <imu>
    <angular_velocity>
      <x>
        <noise type="gaussian">
          <!-- Noise density ~0.005 deg/s/sqrt(Hz)
               (angle random walk ~0.3 deg/sqrt(hr)), in rad/s -->
          <stddev>8.7e-5</stddev>
          <!-- In-run bias stability ~0.02 deg/s, in rad/s -->
          <bias_stddev>3.5e-4</bias_stddev>
        </noise>
      </x>
      <!-- Similar for the y and z axes -->
    </angular_velocity>
  </imu>
</sensor>

<!-- Accelerometer with realistic noise -->
<sensor name="accel" type="imu">
  <imu>
    <linear_acceleration>
      <x>
        <noise type="gaussian">
          <!-- Noise density ~100 ug/sqrt(Hz), in m/s^2 -->
          <stddev>9.8e-4</stddev>
          <!-- Bias instability ~50 ug, in m/s^2 -->
          <bias_stddev>4.9e-4</bias_stddev>
        </noise>
      </x>
      <!-- Similar for the y and z axes -->
    </linear_acceleration>
  </imu>
</sensor>
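
Datasheets quote noise as a density (per sqrt(Hz)), while Gazebo applies <stddev> per sample, so some conversion is usually needed. A common approximation takes the effective bandwidth as half the sample rate, giving sigma_sample = density * sqrt(rate/2); a sketch of the conversion:

import math

G = 9.80665  # m/s^2

# Approximate per-sample stddev from a datasheet noise density, assuming the
# effective bandwidth is half the sample rate (a common rule of thumb).
def per_sample_stddev(noise_density, sample_rate_hz):
    return noise_density * math.sqrt(sample_rate_hz / 2.0)

# Accelerometer: 100 ug/sqrt(Hz) sampled at 100 Hz
accel_density = 100e-6 * G                        # m/s^2 per sqrt(Hz)
print(per_sample_stddev(accel_density, 100.0))    # ~6.9e-3 m/s^2

# Gyroscope: 0.005 deg/s/sqrt(Hz) sampled at 100 Hz
gyro_density = math.radians(0.005)                # rad/s per sqrt(Hz)
print(per_sample_stddev(gyro_density, 100.0))     # ~6.2e-4 rad/s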

Validation and Calibration

Comparing Simulation to Reality

To validate your sensor models:

  1. Collect real data: Record sensor readings from real robot
  2. Simulate same scenario: Run identical motions in simulation
  3. Compare statistics: Analyze noise characteristics, biases
  4. Adjust parameters: Tune simulation to match real behavior

Calibration Workflows

# Example Python code for sensor validation
import numpy as np
from scipy import stats

def validate_imu_simulation(real_data, sim_data):
    """Compare noise characteristics of real and simulated IMU data."""

    # Compare noise magnitudes
    real_std = np.std(real_data)
    sim_std = np.std(sim_data)

    print(f"Real data std: {real_std}")
    print(f"Sim data std: {sim_std}")
    print(f"Ratio: {sim_std / real_std}")

    # Two-sample Kolmogorov-Smirnov test on the empirical distributions
    ks_stat, p_value = stats.ks_2samp(real_data, sim_data)
    print(f"Kolmogorov-Smirnov test p-value: {p_value}")

    # If p > 0.05, we cannot reject the hypothesis that both samples
    # come from the same distribution
    return p_value > 0.05
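
For IMUs specifically, the Allan deviation is the standard tool for separating white noise (the -1/2 slope region) from bias instability (the flat region), which maps directly onto tuning <stddev> and <bias_stddev>. A minimal overlapping Allan deviation computed from integrated angle:

import numpy as np

def allan_deviation(rate, fs, taus):
    """Overlapping Allan deviation of a rate signal (e.g. gyro, rad/s)
    sampled at fs Hz, evaluated at the averaging times in taus (s)."""
    theta = np.cumsum(rate) / fs  # integrate rate to angle
    n = len(theta)
    out = []
    for tau in taus:
        m = int(round(tau * fs))  # samples per cluster
        if m < 1 or 2 * m >= n:
            out.append(np.nan)
            continue
        tau_m = m / fs
        # AVAR(tau) = <(theta[k+2m] - 2*theta[k+m] + theta[k])^2> / (2*tau^2)
        d = theta[2 * m:] - 2 * theta[m:-m] + theta[:-2 * m]
        out.append(np.sqrt(np.mean(d ** 2)) / (np.sqrt(2) * tau_m))
    return np.array(out)

# Usage: log several minutes of stationary IMU data, then compare
# allan_deviation(real_gyro, 100, taus) against the simulated equivalent.
taus = np.logspace(-1, 2, 30)  # 0.1 s ... 100 s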

Advanced Sensor Fusion in Simulation

Creating Multi-Sensor Nodes

import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Imu, Image, PointCloud2
from nav_msgs.msg import Odometry
from geometry_msgs.msg import PoseWithCovarianceStamped

class SensorFusionNode(Node):
    def __init__(self):
        super().__init__('sensor_fusion_node')

        # Subscribe to the various sensors
        self.imu_sub = self.create_subscription(
            Imu, '/imu/data', self.imu_callback, 10)
        self.camera_sub = self.create_subscription(
            Image, '/camera/image_raw', self.camera_callback, 10)
        self.odom_sub = self.create_subscription(
            Odometry, '/odom', self.odom_callback, 10)

        # Publish the fused estimate
        self.pose_pub = self.create_publisher(
            PoseWithCovarianceStamped, '/fused_pose', 10)

        self.get_logger().info('Sensor fusion node initialized')

    def imu_callback(self, msg):
        """Process IMU data for attitude estimation"""
        # Implement attitude estimation here
        pass

    def camera_callback(self, msg):
        """Process camera data for visual odometry"""
        # Implement visual processing here
        pass

    def odom_callback(self, msg):
        """Process odometry data"""
        # Implement odometry fusion here
        pass

def main(args=None):
    rclpy.init(args=args)
    node = SensorFusionNode()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()

if __name__ == '__main__':
    main()
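
As a concrete starting point for imu_callback, a complementary filter is the simplest way to fuse gyro integration (smooth but drifting) with the accelerometer's gravity reference (noisy but drift-free). A one-axis sketch; the blend factor alpha is a tuning choice:

import math

# One-axis complementary filter: blend the integrated gyro rate with the
# pitch angle implied by the accelerometer's gravity vector.
def complementary_update(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# Example: pitch from the accelerometer (sign convention varies with
# mounting), then one filter step at 100 Hz
ax, ay, az = 0.5, 0.0, 9.79        # m/s^2
accel_pitch = math.atan2(-ax, az)  # rad
pitch = complementary_update(angle=0.0, gyro_rate=0.01,
                             accel_angle=accel_pitch, dt=0.01)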

Performance Optimization

Reducing Computational Load

For complex humanoid simulations:

<!-- Optimize sensor update rates -->
<sensor name="low_freq_camera" type="camera">
  <update_rate>10</update_rate>  <!-- Lower for background tasks -->
</sensor>

<sensor name="high_freq_imu" type="imu">
  <update_rate>200</update_rate> <!-- Higher for control -->
</sensor>

<!-- Simplify collision geometries for performance -->
<link name="torso">
  <collision name="simplified_collision">
    <geometry>
      <box>
        <size>0.25 0.2 0.5</size> <!-- Primitive stand-in for a complex mesh -->
      </box>
    </geometry>
  </collision>
</link>

Multi-Threading Considerations

<!-- In the world file: ODE can thread quickstep position correction and
     simulate disconnected "islands" of bodies in parallel (SDF 1.6+) -->
<physics type="ode">
  <ode>
    <solver>
      <type>quick</type>
      <thread_position_correction>true</thread_position_correction>
      <island_threads>4</island_threads>
    </solver>
  </ode>
</physics>

Debugging Physics and Sensor Issues

Common Problems and Solutions

Problem: Robot falls through floor

Solution: Check collision properties, mass, and inertia values

Problem: Unstable joint oscillation

Solution: Increase damping, adjust ERP/CFM, reduce step size

Problem: Sensor data looks unrealistic

Solution: Verify noise parameters, update rates, and coordinate frames

Problem: Simulation runs too slowly

Solution: Simplify collision geometries, reduce update rates, optimize physics

Diagnostic Tools

# Monitor physics performance
gz topic -e /gazebo/performance_metrics

# Check sensor topics
ros2 topic echo /camera/image_raw --field header.stamp

# Monitor TF tree
ros2 run tf2_tools view_frames

Integration with ROS 2 Control

Sensor and Joint State Integration

<!-- In the robot description (URDF/xacro) -->
<gazebo>
  <plugin name="gazebo_ros2_control" filename="libgazebo_ros2_control.so">
    <robot_param>robot_description</robot_param>
    <robot_param_node>robot_state_publisher</robot_param_node>
    <parameters>/path/to/controllers.yaml</parameters> <!-- controller manager config -->
  </plugin>
</gazebo>

<gazebo>
  <plugin name="joint_state_publisher" filename="libgazebo_ros_joint_state_publisher.so">
    <ros>
      <namespace>/humanoid_robot</namespace>
    </ros>
    <update_rate>100</update_rate>
    <joint_name>left_hip_joint</joint_name>
    <joint_name>left_knee_joint</joint_name>
    <!-- Add all joints to be published -->
  </plugin>
</gazebo>

Best Practices for Humanoid Simulation

1. Gradual Complexity

  • Start with simple models and basic physics
  • Add complexity incrementally
  • Test each addition thoroughly

2. Realistic Parameters

  • Use real robot specifications for masses and dimensions
  • Model sensor noise based on actual hardware
  • Validate physics parameters against real robot behavior

3. Performance vs Realism Balance

  • Use high-fidelity models for final validation
  • Use simplified models for rapid testing
  • Adjust parameters based on simulation purpose

4. Comprehensive Testing

  • Test extreme scenarios safely in simulation
  • Validate across different environments
  • Compare simulation and real robot performance

Summary

Physics and sensor simulation are crucial for creating effective digital twins of humanoid robots. Properly configured physics parameters ensure stable simulation, while realistic sensor models enable robust algorithm development. The key is balancing computational efficiency with simulation accuracy to create a useful training and testing environment.

Next Steps

In the next chapter, we'll explore Unity integration for robot visualization and advanced simulation capabilities, complementing the physics-based simulation with high-quality graphics.

