<?xml version="1.0" encoding="UTF-8"?><rss version="2.0" xmlns:content="http://purl.org/rss/1.0/modules/content/"><channel><title>Robot Docs | Blog</title><description/><link>https://zeulewan.github.io/</link><language>en</language><item><title>Switching to Full NVIDIA Isaac ROS Image</title><link>https://zeulewan.github.io/robot-docs-astro/blog/2026-02-08-isaac-ros-full-image/</link><guid isPermaLink="true">https://zeulewan.github.io/robot-docs-astro/blog/2026-02-08-isaac-ros-full-image/</guid><pubDate>Sun, 08 Feb 2026 00:00:00 GMT</pubDate><content:encoded>&lt;p&gt;I’m ditching the hand-built Isaac ROS container in favor of NVIDIA’s full &lt;code dir=&quot;auto&quot;&gt;noble-ros2_jazzy&lt;/code&gt; image. The current setup is a house of cards and it’s time to fix that before I go any further with G1 development.&lt;/p&gt;

&lt;div&gt;&lt;h2 id=&quot;the-problem&quot;&gt;The Problem&lt;/h2&gt;&lt;/div&gt;
&lt;p&gt;My &lt;code dir=&quot;auto&quot;&gt;isaac_ros_dev_container&lt;/code&gt; was built interactively — &lt;code dir=&quot;auto&quot;&gt;docker exec&lt;/code&gt; to get a shell, install packages by hand, &lt;code dir=&quot;auto&quot;&gt;docker commit&lt;/code&gt; to save. No Dockerfile, completely non-reproducible. Worse, if &lt;code dir=&quot;auto&quot;&gt;isaac-ros activate&lt;/code&gt; runs on a stopped container, it wipes everything. I’ve already lost work to this once. Not again.&lt;/p&gt;
&lt;div&gt;&lt;h2 id=&quot;what-nvidia-gives-you&quot;&gt;What NVIDIA Gives You&lt;/h2&gt;&lt;/div&gt;
&lt;p&gt;NVIDIA provides three Dockerfile layers that stack into one image:&lt;/p&gt;
&lt;table&gt;&lt;thead&gt;&lt;tr&gt;&lt;th&gt;Layer&lt;/th&gt;&lt;th&gt;Size&lt;/th&gt;&lt;th&gt;Contents&lt;/th&gt;&lt;/tr&gt;&lt;/thead&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td&gt;&lt;code dir=&quot;auto&quot;&gt;Dockerfile.isaac_ros&lt;/code&gt;&lt;/td&gt;&lt;td&gt;~1.5 GB&lt;/td&gt;&lt;td&gt;Ubuntu Noble, ROS base, apt repos&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;&lt;code dir=&quot;auto&quot;&gt;Dockerfile.noble&lt;/code&gt;&lt;/td&gt;&lt;td&gt;~15 GB&lt;/td&gt;&lt;td&gt;TensorRT, PyTorch, CUDA dev, VPI, Triton&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;&lt;code dir=&quot;auto&quot;&gt;Dockerfile.ros2_jazzy&lt;/code&gt;&lt;/td&gt;&lt;td&gt;~3 GB&lt;/td&gt;&lt;td&gt;50+ ROS 2 packages, nav2, rviz2, foxglove-bridge, slam_toolbox&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;/table&gt;
&lt;p&gt;My current container uses only the first layer plus manually installed packages, and it’s already ~19 GB. The full image is the same size but includes nav2, rviz2, slam_toolbox, and a lot more. No reason not to use it.&lt;/p&gt;
&lt;div&gt;&lt;h2 id=&quot;why-now&quot;&gt;Why Now&lt;/h2&gt;&lt;/div&gt;
&lt;p&gt;I’m starting to build toward actual Unitree G1 development, which needs both simulation (Isaac Sim) and real robot (Jetson Orin) support. The full image already includes everything I was going to need:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;foxglove-bridge&lt;/strong&gt; — WebSocket visualization&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;nav2&lt;/strong&gt; — navigation stack for the G1&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;rviz2&lt;/strong&gt; — 3D visualization&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;slam_toolbox&lt;/strong&gt; — SLAM algorithms&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;CycloneDDS&lt;/strong&gt; — matches the G1’s DDS middleware&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Only a handful of packages need adding on top: Isaac ROS perception (AprilTag, cuVSLAM, nvblox), H.264 NVENC transport, and custom entrypoint scripts.&lt;/p&gt;
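&lt;p&gt;As a rough sketch, the Dockerfile I have in mind looks something like this — the base tag is the full image below, but the &lt;code dir=&quot;auto&quot;&gt;ros-jazzy-isaac-ros-*&lt;/code&gt; apt package names and the entrypoint path are assumptions I still need to verify:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;# Sketch only — the apt package names below are guesses, not verified
FROM nvcr.io/nvidia/isaac/ros:noble-ros2_jazzy_d3e84470d576702a380478a513fb3fc6-amd64

# Isaac ROS perception (AprilTag, cuVSLAM, nvblox) + H.264 NVENC transport
RUN apt-get update
RUN apt-get install -y --no-install-recommends \
    ros-jazzy-isaac-ros-apriltag \
    ros-jazzy-isaac-ros-visual-slam \
    ros-jazzy-isaac-ros-nvblox \
    ros-jazzy-isaac-ros-h264-encoder

# Custom entrypoint scripts
COPY entrypoint.sh /usr/local/bin/entrypoint.sh
ENTRYPOINT ["/usr/local/bin/entrypoint.sh"]&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Even if the exact package names shift, the point stands: the whole delta over the base fits in a screenful.&lt;/p&gt;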
&lt;div&gt;&lt;h2 id=&quot;image-pulled&quot;&gt;Image Pulled&lt;/h2&gt;&lt;/div&gt;
&lt;pre&gt;&lt;code&gt;nvcr.io/nvidia/isaac/ros:noble-ros2_jazzy_d3e84470d576702a380478a513fb3fc6-amd64&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Most layers already existed locally from the base image, so the pull was fast.&lt;/p&gt;
&lt;div&gt;&lt;h2 id=&quot;next-steps&quot;&gt;Next Steps&lt;/h2&gt;&lt;/div&gt;
&lt;ul&gt;
&lt;li&gt;Write the Dockerfile (~40 lines on top of the full base)&lt;/li&gt;
&lt;li&gt;Create docker-compose.yml&lt;/li&gt;
&lt;li&gt;Build, verify, and swap the container&lt;/li&gt;
&lt;li&gt;Update &lt;code dir=&quot;auto&quot;&gt;~/bin/ros2&lt;/code&gt; wrapper to use &lt;code dir=&quot;auto&quot;&gt;/etc/fastdds_no_shm.xml&lt;/code&gt; instead of &lt;code dir=&quot;auto&quot;&gt;/tmp/&lt;/code&gt;&lt;/li&gt;
&lt;/ul&gt;</content:encoded><category>isaac-ros</category><category>docker</category><category>unitree-g1</category></item><item><title>ROS 2 Bridge, Headless Isaac Sim, and Wrapper Scripts</title><link>https://zeulewan.github.io/robot-docs-astro/blog/2026-02-07-ros2-bridge-and-headless-sim/</link><guid isPermaLink="true">https://zeulewan.github.io/robot-docs-astro/blog/2026-02-07-ros2-bridge-and-headless-sim/</guid><pubDate>Sat, 07 Feb 2026 00:00:00 GMT</pubDate><content:encoded>&lt;p&gt;Big day. I got Isaac Sim’s ROS 2 bridge actually publishing topics, wrote a headless launch script, and verified the whole pipeline end-to-end with Foxglove. This is the first time I’ve seen camera feeds flow from simulation all the way through to a visualization tool.&lt;/p&gt;

&lt;div&gt;&lt;h2 id=&quot;getting-the-ros-2-bridge-to-work&quot;&gt;Getting the ROS 2 Bridge to Work&lt;/h2&gt;&lt;/div&gt;
&lt;p&gt;This took longer than it should have. Isaac Sim 5.1 bundles its own ROS 2 Jazzy bridge built against Python 3.11, and it absolutely does not play nice with a system ROS 2 install. The ABI conflicts between the bundled rclpy and system libraries are fatal — I’m talking instant segfaults.&lt;/p&gt;
&lt;p&gt;The solution turned out to be: don’t install system ROS 2 at all. Instead:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Isaac Sim side&lt;/strong&gt;: Set &lt;code dir=&quot;auto&quot;&gt;ROS_DISTRO=jazzy&lt;/code&gt;, &lt;code dir=&quot;auto&quot;&gt;RMW_IMPLEMENTATION=rmw_fastrtps_cpp&lt;/code&gt;, and add the bundled bridge’s &lt;code dir=&quot;auto&quot;&gt;lib/&lt;/code&gt; and &lt;code dir=&quot;auto&quot;&gt;rclpy/&lt;/code&gt; dirs to &lt;code dir=&quot;auto&quot;&gt;LD_LIBRARY_PATH&lt;/code&gt; and &lt;code dir=&quot;auto&quot;&gt;PYTHONPATH&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;ROS 2 CLI&lt;/strong&gt;: Run &lt;code dir=&quot;auto&quot;&gt;ros2&lt;/code&gt; commands inside the Isaac ROS Docker container via a wrapper script&lt;/li&gt;
&lt;/ul&gt;
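&lt;p&gt;Concretely, the Isaac Sim side is a handful of environment variables set before launch — a sketch, where &lt;code dir=&quot;auto&quot;&gt;ISAACSIM_PATH&lt;/code&gt; and the extension subdirectory are placeholders for wherever the bundled bridge actually lives on disk:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;export ROS_DISTRO=jazzy
export RMW_IMPLEMENTATION=rmw_fastrtps_cpp

# Placeholder path — point BRIDGE at the bundled bridge extension
BRIDGE=&quot;${ISAACSIM_PATH:-$HOME/isaacsim}/exts/isaacsim.ros2.bridge/jazzy&quot;
export LD_LIBRARY_PATH=&quot;$BRIDGE/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}&quot;
export PYTHONPATH=&quot;$BRIDGE/rclpy${PYTHONPATH:+:$PYTHONPATH}&quot;&lt;/code&gt;&lt;/pre&gt;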
&lt;p&gt;Two completely separate ROS environments that talk over DDS. It works.&lt;/p&gt;
&lt;div&gt;&lt;h2 id=&quot;wrapper-scripts&quot;&gt;Wrapper Scripts&lt;/h2&gt;&lt;/div&gt;
&lt;p&gt;To make this less painful day-to-day, I wrote two scripts in &lt;code dir=&quot;auto&quot;&gt;~/bin/&lt;/code&gt;:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;&lt;code dir=&quot;auto&quot;&gt;isaacsim-ros&lt;/code&gt;&lt;/strong&gt; — launches Isaac Sim with all the ROS 2 bridge environment variables pre-configured. Just run &lt;code dir=&quot;auto&quot;&gt;isaacsim-ros&lt;/code&gt; and it handles the rest.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;&lt;code dir=&quot;auto&quot;&gt;ros2&lt;/code&gt;&lt;/strong&gt; — runs &lt;code dir=&quot;auto&quot;&gt;ros2&lt;/code&gt; commands inside the &lt;code dir=&quot;auto&quot;&gt;isaac_ros_dev_container&lt;/code&gt; Docker container with FastDDS SHM-bypass config applied. So &lt;code dir=&quot;auto&quot;&gt;ros2 topic list&lt;/code&gt; and &lt;code dir=&quot;auto&quot;&gt;ros2 topic hz /clock&lt;/code&gt; just work from my terminal like normal.&lt;/li&gt;
&lt;/ul&gt;
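&lt;p&gt;For reference, the &lt;code dir=&quot;auto&quot;&gt;ros2&lt;/code&gt; wrapper is only a few lines. This is a sketch of the idea rather than the exact script: the container name and the &lt;code dir=&quot;auto&quot;&gt;/tmp/fastdds_no_shm.xml&lt;/code&gt; profile path match my setup, but the &lt;code dir=&quot;auto&quot;&gt;setup.bash&lt;/code&gt; path is assumed, and the naive &lt;code dir=&quot;auto&quot;&gt;$*&lt;/code&gt; forwarding loses quoting on arguments with spaces:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;mkdir -p ~/bin
cat &amp;gt; ~/bin/ros2 &amp;lt;&amp;lt;'EOF'
#!/usr/bin/env bash
# Run ros2 inside the dev container with the UDP-only DDS profile applied.
exec docker exec -it \
  -e FASTRTPS_DEFAULT_PROFILES_FILE=/tmp/fastdds_no_shm.xml \
  isaac_ros_dev_container \
  bash -lc &quot;source /opt/ros/jazzy/setup.bash; exec ros2 $*&quot;
EOF
chmod +x ~/bin/ros2&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;That's enough for &lt;code dir=&quot;auto&quot;&gt;topic list&lt;/code&gt; and &lt;code dir=&quot;auto&quot;&gt;topic hz&lt;/code&gt;, which is most of what I do from the host.&lt;/p&gt;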
&lt;div&gt;&lt;h2 id=&quot;headless-launch-script&quot;&gt;Headless Launch Script&lt;/h2&gt;&lt;/div&gt;
&lt;p&gt;I created &lt;code dir=&quot;auto&quot;&gt;~/Documents/isaac-sim-scripts/headless-sample-scene.sh&lt;/code&gt; to run Isaac Sim without a GUI. The tricky part: headless mode doesn’t auto-enable the ROS 2 bridge extension, so OmniGraph nodes like &lt;code dir=&quot;auto&quot;&gt;ROS2CameraHelper&lt;/code&gt; silently do nothing. You have to explicitly enable &lt;code dir=&quot;auto&quot;&gt;isaacsim.ros2.bridge&lt;/code&gt; via the extension manager. That one cost me a solid hour of wondering why my camera topics were empty.&lt;/p&gt;
&lt;p&gt;The script loads the carter warehouse AprilTag scene via &lt;code dir=&quot;auto&quot;&gt;get_assets_root_path()&lt;/code&gt; (resolves to NVIDIA’s S3 CDN) and logs cache hit/miss stats so I can track asset downloads.&lt;/p&gt;
&lt;div&gt;&lt;h2 id=&quot;the-fastdds-shared-memory-trap&quot;&gt;The FastDDS Shared Memory Trap&lt;/h2&gt;&lt;/div&gt;
&lt;p&gt;This was the sneakiest bug of the day. Cross-process DDS between Isaac Sim on the host and the Isaac ROS Docker container &lt;em&gt;looks&lt;/em&gt; like it works — &lt;code dir=&quot;auto&quot;&gt;ros2 topic list&lt;/code&gt; shows all the topics. But &lt;code dir=&quot;auto&quot;&gt;ros2 topic hz&lt;/code&gt; shows nothing. Zero data flowing.&lt;/p&gt;
&lt;p&gt;The culprit: FastDDS shared memory transport. It can discover topics across the container boundary but can’t actually move data. The fix is a FastDDS XML profile that forces UDP-only transport, set via &lt;code dir=&quot;auto&quot;&gt;FASTRTPS_DEFAULT_PROFILES_FILE&lt;/code&gt;. Once I figured that out, everything lit up.&lt;/p&gt;
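&lt;p&gt;The profile itself is short. This is a sketch of the standard eProsima recipe — declare a UDPv4 transport and disable the builtin transports — with arbitrary profile names:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;&amp;lt;?xml version=&quot;1.0&quot; encoding=&quot;UTF-8&quot;?&amp;gt;
&amp;lt;profiles xmlns=&quot;http://www.eprosima.com/XMLSchemas/fastRTPS_Profiles&quot;&amp;gt;
  &amp;lt;transport_descriptors&amp;gt;
    &amp;lt;transport_descriptor&amp;gt;
      &amp;lt;transport_id&amp;gt;udp_only&amp;lt;/transport_id&amp;gt;
      &amp;lt;type&amp;gt;UDPv4&amp;lt;/type&amp;gt;
    &amp;lt;/transport_descriptor&amp;gt;
  &amp;lt;/transport_descriptors&amp;gt;
  &amp;lt;participant profile_name=&quot;udp_only&quot; is_default_profile=&quot;true&quot;&amp;gt;
    &amp;lt;rtps&amp;gt;
      &amp;lt;userTransports&amp;gt;
        &amp;lt;transport_id&amp;gt;udp_only&amp;lt;/transport_id&amp;gt;
      &amp;lt;/userTransports&amp;gt;
      &amp;lt;useBuiltinTransports&amp;gt;false&amp;lt;/useBuiltinTransports&amp;gt;
    &amp;lt;/rtps&amp;gt;
  &amp;lt;/participant&amp;gt;
&amp;lt;/profiles&amp;gt;&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Both sides need &lt;code dir=&quot;auto&quot;&gt;FASTRTPS_DEFAULT_PROFILES_FILE&lt;/code&gt; pointing at this file, or one side will happily keep trying shared memory.&lt;/p&gt;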
&lt;div&gt;&lt;h2 id=&quot;foxglove-verification&quot;&gt;Foxglove Verification&lt;/h2&gt;&lt;/div&gt;
&lt;p&gt;Launched Foxglove Studio and there it was — camera feeds, &lt;code dir=&quot;auto&quot;&gt;/clock&lt;/code&gt;, &lt;code dir=&quot;auto&quot;&gt;/tf&lt;/code&gt;, and &lt;code dir=&quot;auto&quot;&gt;/tag_detections&lt;/code&gt; all flowing from headless Isaac Sim through the Isaac ROS container. Genuinely satisfying to see it all come together.&lt;/p&gt;
&lt;div&gt;&lt;h2 id=&quot;storage-architecture-plan&quot;&gt;Storage Architecture Plan&lt;/h2&gt;&lt;/div&gt;
&lt;p&gt;Also drafted a plan to mount the 1.8 TB Seagate FireCuda 520 as &lt;code dir=&quot;auto&quot;&gt;/data&lt;/code&gt;, replace the oversized 121 GB swap partition with a 16 GB swapfile, and bind-mount the Isaac Sim asset cache (&lt;code dir=&quot;auto&quot;&gt;~/.cache/ov/&lt;/code&gt;) to the faster drive. That’s a project for another day.&lt;/p&gt;
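&lt;p&gt;The swapfile half of that plan is standard Ubuntu procedure. Sketched here as a script to review before running (16 GB per the plan above; the old partition's &lt;code dir=&quot;auto&quot;&gt;fstab&lt;/code&gt; entry still has to be removed by hand):&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;cat &amp;gt; /tmp/make-swapfile.sh &amp;lt;&amp;lt;'EOF'
#!/usr/bin/env bash
set -euo pipefail
# Swap the 121 GB swap partition for a 16 GB swapfile.
sudo swapoff -a
sudo fallocate -l 16G /swapfile
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile
# Persist across reboots
echo '/swapfile none swap sw 0 0' | sudo tee -a /etc/fstab
EOF
chmod +x /tmp/make-swapfile.sh&lt;/code&gt;&lt;/pre&gt;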
&lt;div&gt;&lt;h2 id=&quot;skills-and-docs&quot;&gt;Skills and Docs&lt;/h2&gt;&lt;/div&gt;
&lt;p&gt;Updated the Isaac Sim skill docs with the new ROS 2 bridge architecture, headless launch workflow, and all the lessons learned. Added workstation, networking, and Moonlight skills too.&lt;/p&gt;</content:encoded><category>isaac-sim</category><category>isaac-ros</category><category>ros2</category><category>foxglove</category></item><item><title>Ubuntu 24.04 Upgrade and Sunshine Rebuild</title><link>https://zeulewan.github.io/robot-docs-astro/blog/2026-02-06-ubuntu-upgrade-and-sunshine/</link><guid isPermaLink="true">https://zeulewan.github.io/robot-docs-astro/blog/2026-02-06-ubuntu-upgrade-and-sunshine/</guid><pubDate>Fri, 06 Feb 2026 00:00:00 GMT</pubDate><content:encoded>&lt;p&gt;Today I finally pulled the trigger on upgrading the workstation from Ubuntu 22.04 to 24.04. I’d been putting it off because this machine runs my entire robotics stack, but it had to happen eventually.&lt;/p&gt;

&lt;div&gt;&lt;h2 id=&quot;the-os-upgrade&quot;&gt;The OS Upgrade&lt;/h2&gt;&lt;/div&gt;
&lt;p&gt;Before touching anything, I backed up the configs I knew I’d regret losing:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;code dir=&quot;auto&quot;&gt;/etc/X11/xorg.conf&lt;/code&gt; (my custom virtual display + EDID config for headless Sunshine streaming)&lt;/li&gt;
&lt;li&gt;&lt;code dir=&quot;auto&quot;&gt;~/.zshrc&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;&lt;code dir=&quot;auto&quot;&gt;~/.config/sunshine/&lt;/code&gt; and the systemd overrides&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Then I ran &lt;code dir=&quot;auto&quot;&gt;do-release-upgrade&lt;/code&gt; and let it churn. It went surprisingly smoothly — post-reboot health checks confirmed Docker 29.2.1, nvidia-container-toolkit 1.18.2, and all my conda environments survived intact.&lt;/p&gt;
&lt;div&gt;&lt;h2 id=&quot;nvidia-driver-570-to-580&quot;&gt;NVIDIA Driver 570 to 580&lt;/h2&gt;&lt;/div&gt;
&lt;p&gt;While I was at it, I upgraded to driver 580.126.09 (CUDA 13.0). Reboot, &lt;code dir=&quot;auto&quot;&gt;nvidia-smi&lt;/code&gt;, RTX 3090 looking healthy. Nothing exciting, which is exactly how driver upgrades should go.&lt;/p&gt;
&lt;div&gt;&lt;h2 id=&quot;sunshine-broke-of-course&quot;&gt;Sunshine Broke (Of Course)&lt;/h2&gt;&lt;/div&gt;
&lt;p&gt;This one I expected. The 22.04 Sunshine .deb immediately complained about a missing &lt;code dir=&quot;auto&quot;&gt;libicuuc.so.70&lt;/code&gt; — 24.04 ships &lt;code dir=&quot;auto&quot;&gt;libicuuc.so.74&lt;/code&gt; instead. Grabbed the 24.04 .deb from the LizardByte releases page, installed it, restored my config backup and systemd overrides, and confirmed NvFBC capture and NVENC encoding were back.&lt;/p&gt;
&lt;p&gt;I also wrote &lt;code dir=&quot;auto&quot;&gt;~/set-resolution.sh&lt;/code&gt;, a Sunshine pre-session script that matches the display resolution to whatever the client requests via &lt;code dir=&quot;auto&quot;&gt;nvidia-settings&lt;/code&gt;. Nice quality-of-life improvement for streaming from my MacBook.&lt;/p&gt;
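&lt;p&gt;The script is a sketch along these lines — Sunshine exposes the client's requested size to prep commands via &lt;code dir=&quot;auto&quot;&gt;SUNSHINE_CLIENT_WIDTH&lt;/code&gt; and &lt;code dir=&quot;auto&quot;&gt;SUNSHINE_CLIENT_HEIGHT&lt;/code&gt;, while the &lt;code dir=&quot;auto&quot;&gt;DP-0&lt;/code&gt; connector and MetaMode string are specific to my virtual-display setup and will differ on other machines:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;cat &amp;gt; ~/set-resolution.sh &amp;lt;&amp;lt;'EOF'
#!/usr/bin/env bash
# Match the virtual display to whatever resolution the client asked for.
W=&quot;${SUNSHINE_CLIENT_WIDTH:-1920}&quot;
H=&quot;${SUNSHINE_CLIENT_HEIGHT:-1080}&quot;
nvidia-settings --assign CurrentMetaMode=&quot;DP-0: ${W}x${H} +0+0&quot;
EOF
chmod +x ~/set-resolution.sh&lt;/code&gt;&lt;/pre&gt;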
&lt;div&gt;&lt;h2 id=&quot;toshy-venv-fix&quot;&gt;Toshy Venv Fix&lt;/h2&gt;&lt;/div&gt;
&lt;p&gt;The Toshy keyboard remapper broke too — its Python venv was built against 3.10 and 24.04 ships 3.12. Quick fix: re-ran the Toshy setup script to rebuild the venv. One of those things you only figure out after staring at a cryptic import error for a few minutes.&lt;/p&gt;
&lt;div&gt;&lt;h2 id=&quot;claude-skills-cleanup&quot;&gt;Claude Skills Cleanup&lt;/h2&gt;&lt;/div&gt;
&lt;p&gt;Since I was already in spring-cleaning mode, I reorganized my Claude skill docs in &lt;code dir=&quot;auto&quot;&gt;~/.claude/skills/&lt;/code&gt; — overhauled the Isaac Sim skill for the pip-based install, consolidated the dock app skills into a single file, and updated the README.&lt;/p&gt;</content:encoded><category>workstation</category><category>ubuntu</category><category>sunshine</category><category>nvidia</category></item></channel></rss>