Parameters of the XR-MROS-A3 Intelligent Driving Robot Car

The XR-MROS-A3 intelligent driving 5G robot car is an unmanned vehicle kit built around the Jetson Nano main control board. The car uses Mecanum wheels as its travel mechanism, so it can translate sideways as well as forward and backward, and it integrates a lidar, a depth camera, and a multi-degree-of-freedom robotic arm. It supports lidar mapping and navigation, real-time mapping with fixed-point and multi-point navigation, autonomous path planning, dynamic obstacle avoidance, and visual 3D mapping navigation, and through computer vision it also provides face detection, object recognition, color tracking, and other visual interaction functions.

The robot supports multiple control terminals, including a mobile phone APP, RViz running in a PC virtual machine, and a PS4 gamepad, and can quickly and easily enter mapping and navigation modes.

The robot runs the "XR-ROS human-computer interaction system" independently developed by Xiao R Technology and works with the dedicated "Xiao R Technology Ros-SLAM robot" APP, so real-time mapping, navigation, and other interactive operations can be performed from a mobile phone. Based on the map it has built, the robot plans and follows paths automatically and avoids obstacles it encounters along the way.

The robot is equipped with the XR-A3 high-precision, high-torque robotic arm, which works with the MoveIt plug-in for arm motion and path planning. The arm's maximum gripping weight is 500 g.
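As a rough illustration of how the arm might be driven through MoveIt from Python, the sketch below uses the standard moveit_commander interface to plan and execute a small joint-space motion. The planning group name "arm" and the joint offset are placeholders, not taken from the vendor's configuration, so check the robot's actual MoveIt setup before trying anything like this.

```python
#!/usr/bin/env python
# Minimal MoveIt planning sketch for a robotic arm via moveit_commander.
# Assumptions: ROS and MoveIt run on the car and the arm is exposed as a
# MoveIt planning group; the group name "arm" and the joint offset below
# are placeholders -- check the robot's actual MoveIt configuration.
import sys
import rospy
import moveit_commander

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("xr_a3_arm_demo")

arm = moveit_commander.MoveGroupCommander("arm")  # group name is an assumption

# Plan and execute a small joint-space motion (values in radians).
target = arm.get_current_joint_values()
target[0] += 0.2                   # nudge the base joint slightly
arm.set_joint_value_target(target)
arm.go(wait=True)                  # MoveIt plans the path and executes it

arm.stop()
moveit_commander.roscpp_shutdown()
```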

Key technologies: Python/ROS/SLAM autonomous navigation/Moveit robotic arm

  1. Hardware parameters

Size: 542 × 265 × 386 mm;

Weight: 3.4kg;

Material: 3D printed epoxy resin/acrylic/aluminum alloy;

Process: anodizing;

Travel mechanism: 4WD Mecanum wheel;

Main chip platform: quad-core ARM Cortex-A57; 128-core Maxwell GPU;

Memory RAM: 4GB

Storage ROM: 64GB high-speed TF card

Maximum vehicle speed: ≥0.8m/s;

Sub-control chip: STM32F103RCT6;

Network port: RJ45 network port, 10/100/1000M adaptive

WiFi: 802.11ac/a/b/g/n, 2.4/5 GHz

Bluetooth: Bluetooth 4.1

Interface type: 4PIN-ZH1.5;

Camera: binocular depth camera, 1080P;

Display: 7-inch HDMI Display high-definition touch screen;

Rated torque: 0.3 kg·cm;

Drive voltage: 6V-12V;

Robotic arm servo: XR-ST-3215-C010, 30kg·cm; 6-12.6V;

Control system: XR-GUI human-computer interaction system;

★Mecanum wheel travel mechanism: The car uses Mecanum wheels, so it can translate sideways, move in a cross pattern, and rotate 360 degrees in place; it runs stably and handles rough surfaces well (see the kinematics sketch after this list).

★4-DOF robotic arm: 180-degree free rotation, a dedicated APP control interface with synchronized virtual-arm control for free control of the arm's grasping movements, and support for the MoveIt! plug-in for arm motion path planning.

★WiFi transmission, APP control: After the car is powered on, it provides a WiFi signal; a mobile phone or tablet can connect to the car over WiFi and control it through the dedicated APP.

★Wireless gamepad control: The car supports real-time remote control with a 2.4G wireless gamepad, including control of the robotic arm movements, over a stable control link.

★5G remote communication support: The car supports a 5G communication module. Plug a USB 5G module into the Jetson Nano motherboard, then install the remote control software on the car's Ubuntu system to enable remote control.

★Wireless video transmission: Over WiFi, video captured by the car's camera is streamed in real time to the APP or to the PC control software for a first-person view.

★RPLIDAR A1 lidar mapping and navigation: 360-degree scanning and ranging; supports the gmapping, Hector, and Karto mapping algorithms; supports mapping driven by keyboard, mouse, or a selected area; and enables SLAM lidar mapping and autonomous navigation from the APP or a virtual machine. When the car encounters an obstacle on the way, it automatically plans a new route around it (see the navigation sketch after this list).

★XR-ROS GUI human-computer interaction system: An intuitive, convenient interface that provides one-key mapping and one-key navigation and shows the status of every system module at a glance.

★MoveIt! intelligent robotic arm: The MoveIt! plug-in provides motion path planning for the arm, enabling automatic grasping with collision avoidance.
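To show how a Mecanum chassis turns a desired body velocity into wheel commands, the sketch below applies the standard inverse kinematics for an X-configured four-wheel Mecanum base. The wheel radius and chassis half-dimensions are illustrative placeholders, not measurements of the A3.

```python
# Standard Mecanum inverse kinematics: body velocity -> wheel angular speeds.
# The geometry constants below are placeholders, not the A3's measured values.
WHEEL_RADIUS = 0.04    # m (assumed)
HALF_WHEELBASE = 0.10  # m, axle to chassis center, front-to-back (assumed)
HALF_TRACK = 0.12      # m, wheel to chassis center, side-to-side (assumed)

def mecanum_wheel_speeds(vx, vy, wz):
    """vx: forward m/s, vy: leftward m/s, wz: counter-clockwise rad/s.
    Returns (front_left, front_right, rear_left, rear_right) in rad/s."""
    k = HALF_WHEELBASE + HALF_TRACK
    fl = (vx - vy - k * wz) / WHEEL_RADIUS
    fr = (vx + vy + k * wz) / WHEEL_RADIUS
    rl = (vx + vy - k * wz) / WHEEL_RADIUS
    rr = (vx - vy + k * wz) / WHEEL_RADIUS
    return fl, fr, rl, rr

# Pure sideways translation to the left: only vy is non-zero.
print(mecanum_wheel_speeds(0.0, 0.3, 0.0))
```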
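And as a sketch of how a navigation goal could be sent once a map has been built, the snippet below uses the standard ROS move_base action interface. It assumes the car's navigation stack exposes the usual move_base action server and a frame named "map"; verify both against the supplied course code.

```python
#!/usr/bin/env python
# Sends one navigation goal through the standard ROS move_base action API.
# Assumption: the car's navigation stack runs a move_base server and uses a
# frame called "map" -- verify both against the supplied course materials.
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

rospy.init_node("send_nav_goal")
client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
client.wait_for_server()

goal = MoveBaseGoal()
goal.target_pose.header.frame_id = "map"
goal.target_pose.header.stamp = rospy.Time.now()
goal.target_pose.pose.position.x = 1.0     # example target pose on the map
goal.target_pose.pose.position.y = 0.5
goal.target_pose.pose.orientation.w = 1.0  # face along the map's x axis

client.send_goal(goal)
client.wait_for_result()
rospy.loginfo("Navigation result state: %d", client.get_state())
```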

  2. Teaching platform features

5G remote communication: With the 5G communication module installed, the robot car can be controlled remotely across regions, not limited to the local network environment.

Lidar SLAM mapping: 360-degree scanning and ranging; supports the gmapping, Hector, and Karto mapping algorithms; supports mapping driven by keyboard, mouse, or a selected area.

Depth-vision SLAM mapping: Equipped with a depth camera, the car can perform visual SLAM mapping and navigation.

Real-time video transmission: The car's depth camera streams its video feed to the client interface in real time.

Visual interaction: Using the camera and computer vision, the car provides face detection, object recognition, color tracking, and other visual interaction functions (see the sketch after this list).

Indoor automatic obstacle avoidance and positioning: The car plans its path autonomously from the indoor map and re-plans when the route is blocked, achieving dynamic obstacle avoidance.

GUI touch-screen interaction and function-trigger system: Through the GUI human-computer interaction system on the touch screen, one tap can start mapping or navigation, calibrate the IMU, end processes, and view the IP address, camera status, lidar status, chassis drive status, and other information.
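To give a flavour of these visual interaction features, the sketch below subscribes to a ROS camera topic and runs OpenCV's stock Haar face detector on each frame. The topic name is an assumption and the Haar detector is a generic stand-in, not the vendor's own vision pipeline.

```python
#!/usr/bin/env python
# Generic face-detection sketch: subscribe to a camera topic and run
# OpenCV's stock Haar cascade on each frame. The topic name below is an
# assumption; this is a stand-in, not the vendor's own vision code.
import rospy
import cv2
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

bridge = CvBridge()
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def on_image(msg):
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.2, minNeighbors=5)
    rospy.loginfo("Detected %d face(s) in the current frame", len(faces))

rospy.init_node("face_detect_demo")
rospy.Subscriber("/camera/color/image_raw", Image, on_image)  # topic name assumed
rospy.spin()
```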

  3. Course resources

A complete set of course lists, learning frameworks, related tools, and firmware is provided. The video tutorials cover the use of the control software and the host computer software and include annotated source code, so secondary development can be carried out on the existing code.
