XR X3 Pi ROS MINI radar navigation AI robot car manual
I. Product Description

1.1 Product pictures

(Image: sunrise ROS2 lidar robot car)

1.2 Product introduction

The XR X3 Pi ROS MINI radar navigation AI robot car is a Mecanum-wheel ROS robot jointly designed and developed by XiaoR Technology and Horizon.

This product supports real-time video transmission: the video captured by the robot's eyes (an HD camera) is streamed to the control terminal for live display.

The robot uses Mecanum wheels and is built on ROS, the system most popular with robot makers today. Combined with the excellent BPU computing performance and AI vision algorithms of the Horizon X3, it supports AI functions such as remote control from a mobile phone APP (forward, backward, left, right, rotation, translation, and oblique movement), video transmission, mapping, navigation, gesture-recognition control, and vision-based human following. It is well suited to science museums, campus science festivals and exhibitions, autonomous-driving robot competitions, and similar events.

This product also provides secondary-development SDK code and an open API for users to call. Secondary development uses Python, a programming language widely popular with makers; with only basic Python skills, you can easily begin programming the robot's actions. It can also serve as a training platform for artificial-intelligence laboratories in educational institutions covering electronics and information, artificial intelligence, the Internet of Things, computer science, automation, and related fields.
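The movement modes mentioned above (forward, backward, translation, rotation, oblique movement) all follow from standard Mecanum-wheel kinematics. The sketch below is an illustrative Python helper, not part of the official SDK; the wheel ordering and sign conventions are assumptions.

```python
def mecanum_wheel_speeds(vx, vy, wz):
    """Map a body velocity command to four Mecanum wheel speeds.

    vx: forward (+) / backward (-); vy: left (+) / right (-);
    wz: counter-clockwise rotation (+). Wheel order (an assumption):
    front-left, front-right, rear-left, rear-right.
    """
    fl = vx - vy - wz
    fr = vx + vy + wz
    rl = vx + vy - wz
    rr = vx - vy + wz
    return [fl, fr, rl, rr]

# Pure forward motion: all four wheels spin at the same speed.
print(mecanum_wheel_speeds(1.0, 0.0, 0.0))   # [1.0, 1.0, 1.0, 1.0]

# Pure left translation: wheels spin in an alternating pattern.
print(mecanum_wheel_speeds(0.0, 1.0, 0.0))   # [-1.0, 1.0, 1.0, -1.0]

# Rotation in place: the left and right sides spin opposite ways.
print(mecanum_wheel_speeds(0.0, 0.0, 1.0))   # [-1.0, 1.0, -1.0, 1.0]
```

Mixing the three inputs produces the combined motions, such as oblique (zigzag) movement.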

 

1.3 Product Parameters

1.3.1 Car body parameters

Parameter name Parameter details
Size 190 x 131 x 74 mm
Weight 0.6 kg
Coprocessor STM32F103RCT6
Supply voltage DC 8 V
Working current 1.5 A
Max speed 35 cm/s
Battery capacity 3500 mAh
Battery life 2 hours
Charging time 3-4 hours
Camera resolution 720P
Control distance 20 m
Control mode WiFi wireless 802.11b/g/n
Control terminal Android phone, tablet, PC
Programming language Python/C++
Radar detection range 8 m
Radar scanning frequency 10 Hz
Angle resolution 0.9°
Radar supply voltage 5 V
Scanning angle 0-360°
Measurement error 6 mm
Radar operating current 200 mA

1.3.2 Drive board parameters

 

Parameter name Parameter details
Coprocessor STM32F103RCT6
Secondary development interface Provides one-click firmware burning and supports OTA upgrades
Voltage stabilization system Dual voltage stabilization system, using a logical-level isolation design to reduce interference
Voltage stabilizing ability Accepts 7-12 V wide-voltage input; provides stable 5 V and 3.3 V DC outputs
Power supply load capacity Load power not less than 20 W
Drive ability Uses the XRDRV2.0 drive scheme to drive four independent motors; current not less than 1.5 A and drive power not less than 12 W
Control interface Provides at least 5 control interfaces compatible with Bluetooth, WiFi, 433 MHz, serial port, IIC, etc.

 

II. AI Functions

2.1 APP control

The XiaoR ROS XR Master APP provides basic motion control and video capture for the car, as well as mapping and navigation functions.

First, install the control APP on the mobile phone. The Android version of the control APP is named XR-SLAM Robot, and the download address is: https://www.xiaorgeek.com/Software/index.html

When running the APP for the first time, tap the + button in the upper-right corner of the interface to add a new robot, and enter the robot's IP address, 10.5.5.1, in the Master URI field. Leave the other parameters unchanged.

When connecting to the WiFi for the first time, you need to set a static IP on the mobile phone: turn on the phone's WLAN, search for the car's WiFi signal sunrise, and long-press it to configure the options shown in the picture below (the IP address can be specified as 10.5.5.1).

Then return to the APP interface. If the gray dot in front of Robot1 turns green, the robot is online; tap Robot1 to enter the control interface.

The control interface is shown below. The blue virtual joystick on the right side of the interface controls the car's forward, backward, left, and right movement. Speed adjustment, mapping, and navigation buttons are at the top of the APP.

2.2 Laser mapping

Click the "Mapping" button on the control interface, and the APP will prompt to reload resources. After a moment, the APP automatically jumps back to the main interface. Click "Mapping" again to enter the mapping interface.

 

In the mapping interface, use the virtual joystick on the right to drive the car around indoors. The car's lidar scans the surrounding obstacles and builds an electronic map.

Due to the capacity limit of the memory card, the map should not be too large, preferably within 20 square meters. After mapping is complete, tap the red + button on the left, then tap the Save button. The car saves the map, and the saved map is not lost when the power is turned off.

2.3 Navigation

After the map has been built, enter the navigation interface from the main APP interface.

 

First select "Set Initial Position" in the lower-left corner, then press and hold on the electronic map to set the position, and drag your finger to set the direction, matching the car's actual position and orientation. The initial position should be as close as possible to the car's actual position so that the car can quickly enter a precise navigation state. If the set initial position is too far from the actual position, map matching may fail and the car will not enter the navigation state.

After setting the initial position, select "Set Navigation Point" in the lower-left corner, long-press the target position on the electronic map, and drag your finger to set the direction. When you release your finger, the car automatically plans a route to the navigation point. If it encounters obstacles along the way, the car automatically plans a new path around them. On reaching the navigation point, the car stops and turns its head to match the direction set for that point.

Note: During mapping and navigation, the car's wheels must remain in full contact with the ground without slipping; otherwise the odometry data will drift and mapping will fail.

2.4 Gesture recognition control

 

The gesture-recognition control function is independent of the SLAM lidar navigation function. Before using this function, you need to exit the SLAM navigation process.

① Install PuTTY or another SSH tool on the computer, power on the car, and connect the computer's WiFi to the car's sunrise hotspot.

② Open PuTTY, create a new SSH session with host name 10.5.5.1, and click the "Login" button to log in to the car's system. Both the user name and the password are root.

③ Use the cd command to change to the /home/sunrise/work/rosts/ directory, then run sh ./stop.sh to stop the SLAM process.

cd /home/sunrise/work/rosts/

sh ./stop.sh

④ Run the command source /opt/tros/setup.bash to initialize the environment variables.

source /opt/tros/setup.bash

⑤ Use the cd command to change to the /userdata directory, then run ros2 launch hobot_app_xrrobot_gesture_control hobot_app_xrrobot_gesture_control.launch.py to start the gesture-recognition control demo.

cd /userdata

ros2 launch hobot_app_xrrobot_gesture_control hobot_app_xrrobot_gesture_control.launch.py

⑥ After the gesture-recognition command starts, stand facing the car about 2 meters away, raise your right hand, and make the OK gesture to unlock the car. A thumbs-up gesture moves the car forward; the V gesture moves it backward; a thumbs-up tilted left turns the car left; a thumbs-up tilted right turns it right; an open vertical palm stops the car.

⑦ For the best experience, watch the real-time camera image while using this function. Change to the car's /opt/tros/lib/websocket/webservice directory and run the command chmod +x ./sbin/nginx && ./sbin/nginx -p . ; then open a browser on the computer and visit 10.5.5.1 to see the real-time recognition video.

cd /opt/tros/lib/websocket/webservice

chmod +x ./sbin/nginx && ./sbin/nginx -p .
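The vision-following demo in section 2.5 repeats the same stop-SLAM → source → launch pattern with a different package, so the sequence can be captured in a small Python helper. This is an illustrative sketch, not part of the official SDK; the printed commands are the ones you would run over SSH.

```python
def demo_commands(package):
    """Compose the shell command sequence from steps 3-5 above
    for launching an on-car TROS demo of the given package name."""
    return [
        "cd /home/sunrise/work/rosts/ && sh ./stop.sh",  # stop the SLAM process
        "source /opt/tros/setup.bash",                   # init environment variables
        f"cd /userdata && ros2 launch {package} {package}.launch.py",
    ]

# Print the commands for the gesture-recognition demo.
for cmd in demo_commands("hobot_app_xrrobot_gesture_control"):
    print(cmd)
```

Passing "hobot_app_xrrobot_body_tracking" instead yields the command sequence for the human-following demo.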

 

2.5 Vision-based human following

 

The vision-based human-following function is independent of the SLAM lidar navigation function. Before using this function, you need to exit the SLAM navigation process.

① Install PuTTY or another SSH tool on the computer, power on the car, and connect the computer's WiFi to the car's sunrise hotspot.

② Open PuTTY, create a new SSH session with host name 10.5.5.1, and click the "Login" button to log in to the car's system. The user name is root and the password is root.

③ Use the cd command to change to the /home/sunrise/work/rosts/ directory, then run sh ./stop.sh to stop the SLAM process.

cd /home/sunrise/work/rosts/

sh ./stop.sh

④ Run the command source /opt/tros/setup.bash to initialize the environment variables.

source /opt/tros/setup.bash

⑤ Use the cd command to change to the /userdata directory, then run ros2 launch hobot_app_xrrobot_body_tracking hobot_app_xrrobot_body_tracking.launch.py to start the human-following demo.

cd /userdata

ros2 launch hobot_app_xrrobot_body_tracking hobot_app_xrrobot_body_tracking.launch.py

⑥ After starting, stand facing the car about 1.5 meters away, raise your right hand, and make the OK gesture to unlock the car. When you step back, the car moves forward; when you move slowly left or right, the car rotates to follow. When the car is about 1 meter from you, it stops moving forward.

Note: Do not move too fast, or the car's camera may lose the target and the car will be unable to perform the correct action.
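The stop-at-about-1-meter behavior described above can be modeled as a simple proportional controller on the measured distance to the person. The gain and target distance below are illustrative assumptions, not the robot's actual parameters; only the speed cap matches the 35 cm/s spec from section 1.3.

```python
def follow_speed(distance_m, target_m=1.0, gain=0.5, max_speed=0.35):
    """Forward speed (m/s) toward a person distance_m meters away.

    Drives forward while farther than target_m and stops (never
    backs up) once within it; speed is capped at max_speed.
    """
    error = distance_m - target_m
    if error <= 0:
        return 0.0                      # within ~1 m: stop moving forward
    return min(gain * error, max_speed)  # proportional speed, capped

print(follow_speed(2.0))   # 0.35 -> far away, capped at max speed
print(follow_speed(1.5))   # 0.25 -> closing in, slowing down
print(follow_speed(0.8))   # 0.0  -> inside the stop distance
```

A real controller would also rotate toward the target's bearing; this sketch covers only the forward-speed component.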

⑦ For the best experience, watch the real-time camera image while using this function. Change to the car's /opt/tros/lib/websocket/webservice directory and run the command chmod +x ./sbin/nginx && ./sbin/nginx -p . ; then open a browser on the computer and visit 10.5.5.1 to see the real-time recognition video.

cd /opt/tros/lib/websocket/webservice

chmod +x ./sbin/nginx && ./sbin/nginx -p .

 

III. Usage Precautions

Because Mecanum wheels have a small contact area, they slip relatively easily. Do not run the car on overly smooth ground; the ground should be flat, with no obvious bumps or depressions. During mapping and navigation, any wheel slip makes the odometry data inaccurate. Within a certain range of deviation, the car can correct automatically with its algorithms; if the deviation is too large, it cannot navigate normally.

During navigation, if the car finds that the real-time contour scanned by its radar differs too much from the saved map contour, it automatically rotates in place to re-match the contour. If matching still fails after 3 rotations, the car considers its position lost, exits the navigation function, and stops in place, waiting for the user to re-set the starting point and navigation point in the APP.

IV. Contact Us

Technical support: service@xiaorgeek.com

9:00-18:00 on working days (excluding legal holidays).

Official website: https://www.xiaorgeek.net

Official forum: http://www.wifi-robots.com

TEL: 0755-28915204

Address: 1106, North Area, Block B, Building 18, Hisense Innovation Industrial City, Jihua Street, Longgang District, Shenzhen, Guangdong

Material: https://drive.google.com/drive/folders/1szkN1bLVWrgJot_D08tflrT1aAhCfNlb?usp=share_link

About the control software, download from the link:

https://www.xiaorgeek.com/Software/index.html

