The ROS 2 Perception Node accelerated application implements a subset of image_pipeline, one of the most popular packages in the ROS 2 ecosystem and a core piece of the ROS perception stack. It builds a simple computational graph of two hardware-accelerated nodes, resize and rectify, and leverages the KRS (Kria Robotics Stack) framework for tracing and benchmarking.
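The two-node graph described above can be sketched as a ROS 2 launch file that composes the rectify and resize components from image_proc into a single container. This is a minimal illustration, not the application's actual launch file: the container name, topic remappings, and parameter values are assumptions, and the accelerated variants shipped with KRS may use different plugin names.

```python
# Hypothetical launch sketch for a rectify -> resize pipeline.
# Plugin names come from the standard image_proc package; the KRS
# hardware-accelerated components may be named differently.
from launch import LaunchDescription
from launch_ros.actions import ComposableNodeContainer
from launch_ros.descriptions import ComposableNode


def generate_launch_description():
    container = ComposableNodeContainer(
        name='perception_container',          # assumed container name
        namespace='',
        package='rclcpp_components',
        executable='component_container',
        composable_node_descriptions=[
            # Rectify: undistorts the raw camera image using camera_info.
            ComposableNode(
                package='image_proc',
                plugin='image_proc::RectifyNode',
                name='rectify_node',
                remappings=[('image', '/camera/image_raw')],  # assumed topic
            ),
            # Resize: downscales the rectified image.
            ComposableNode(
                package='image_proc',
                plugin='image_proc::ResizeNode',
                name='resize_node',
                parameters=[{'scale_width': 0.5,              # assumed scale
                             'scale_height': 0.5}],
                remappings=[('image', '/image_rect')],        # assumed topic
            ),
        ],
    )
    return LaunchDescription([container])
```

Composing both nodes in one container keeps the image transport intra-process, which avoids serialization between the two stages.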
No, the app does not require any experience in FPGA design.
This app is free of charge from Xilinx.
No, a real camera is not mandatory for this application. By default it uses Gazebo to simulate the camera; various ROS 2 cameras are supported, so using a real one is optional.