Auto-Tracking Vehicle | Maqueen Plus Advanced Tutorial Lesson 3
ShadowNie 2021-05-21 16:12:10
projectImage

When we are at the airport, we always have to keep one hand on the suitcase, which is inconvenient. How great would it be if there were an “auto-driving” suitcase that could simply follow us!

The auto-tracking suitcase mainly uses an AI vision camera to recognize the human skeleton, face, and other characteristic features, then combines this with tracking algorithms to recognize its user stably and accurately.

Let’s think about it: can we use the principle of the auto-tracking suitcase to create an auto-tracking Maqueen Plus?

projectImage

Function

The project mainly uses the object tracking function of HUSKYLENS to let Maqueen Plus flexibly follow the vehicle in front.

Bill of Materials

projectImage

Hardware Connection

1. The Leading Vehicle: This is the first vehicle. It does not need any peripheral hardware, but a target sticker should be attached to it. The steps are as follows.

(1) Print out the target sticker sheet, choose the pattern you like, cut it out along its outline, and paste it onto a piece of paperboard. Finally, stick it onto the Maqueen Plus vehicle.

projectImage


Note: If you are using more than 2 Maqueen Plus cars, stick the target pattern on the back of each car in turn.

Knowledge Field

To track a moving object without manual operation, visual object tracking is needed. This technology is already widely used in daily life, for example in video surveillance and UAV follow shooting. In this project, we make use of the object tracking function of HUSKYLENS.

1. What is object tracking?

As one of the vital functions of AI visual recognition, object tracking is a type of behavior recognition. It is a key topic in computer vision, referring to the process of making continuous inferences about a target’s state in a video sequence; it can simply be regarded as recognizing and tracking objects moving within the camera’s field of view.

projectImage

2. Operating principles:

The camera collects image information and sends it to the computer. After analysis and processing, the computer works out the relative position of the moving object and rotates the camera to track it in real time. An object tracking system is mainly divided into four steps: object recognition, object tracking, movement prediction, and camera control.

projectImage
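As a toy illustration of the “movement prediction” step, here is a minimal constant-velocity predictor in Python. It is only a sketch of the idea, not the algorithm HUSKYLENS actually uses:

```python
def predict_next(prev, curr):
    """Constant-velocity prediction: assume the object's displacement
    between the last two frames repeats in the next frame."""
    dx = curr[0] - prev[0]
    dy = curr[1] - prev[1]
    return (curr[0] + dx, curr[1] + dy)

# If the center moved from (100, 100) to (110, 105),
# predict it will appear near (120, 110) in the next frame.
print(predict_next((100, 100), (110, 105)))
```

A real tracker refines such predictions with filtering (e.g. a Kalman filter), but even this simple rule narrows down where to search for the object in the next frame.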

Object recognition: Accurate appearance information of the object is obtained through image processing algorithms against a static background; the shape of the object can then be recognized and marked, as shown in the figure.

projectImage

Object tracking: The subsequent image sequence is tracked by algorithms according to the appearance characteristics obtained in the previous step, with further learning carried out during tracking to make it more and more accurate.

projectImage

Movement prediction: It means predicting where the moving object will appear in the next frame using algorithms, so as to optimize the algorithm and improve efficiency.

projectImage

Camera control: It means moving the camera according to the object’s direction of motion while collecting image information. This usually requires a pan-tilt gimbal or other movement mechanism.

projectImage

1. Application Fields

Smart Video Surveillance: Based on motion recognition (human recognition based on gait, automatic object detection), automatic monitoring (watching for suspicious behavior), and traffic monitoring (collecting real-time traffic data to direct traffic).

projectImage

Human-computer Interaction: Traditional human-computer interaction relies on a keyboard and mouse. Tracking technology becomes key when a computer needs to recognize and understand posture, movement, and gesture.

projectImage

VR: 3D interaction and virtual character action simulation in virtual environments benefit directly from research on video-based human motion analysis, providing richer forms of interaction for participants; human tracking and analysis are the key technologies.

projectImage

2. Demonstration of the HUSKYLENS Sensor’s Object Tracking Function

HUSKYLENS has a built-in object tracking function that allows it to learn the features of an object, track the object’s position on the screen, and feed the position information back to the main control board.

Different from color recognition and face recognition, object tracking learns and recognizes an object (or person) as a whole. Color recognition works only on color, and face recognition only on one part of the body, while object tracking learns the overall characteristics of the object in order to track it.

The object tracking function can track only one object at a time; tracking multiple objects is not supported yet. Therefore, the learned object should have a clear outline, so that it can be recognized more easily.

1. Select “Object Tracking” Function

Dial the “function button” to the left until “Object Tracking” shows at the top of the screen.

projectImage

2. Learning

Point HUSKYLENS at the target object and adjust the distance until the object is enclosed by the yellow frame in the center of the screen. If the object cannot be completely contained in the yellow frame, containing its distinctive features is also fine. Then long-press the “learning button” to learn the object from various angles and distances. During the learning process, the words “Learning: ID1” with a yellow frame will be displayed on the screen.

projectImage

When HUSKYLENS can track the object at different angles and distances, release the “learning button” to end the learning.

Note: If there is no yellow frame in the center of the screen, it means that HUSKYLENS has learned an object before. Please let it forget the learned one and learn again.

3. Keep Learning

The object tracking feature allows HUSKYLENS to keep learning the current state of the object as soon as the camera sees it, which helps it capture moving objects.

Operation method: Long-press the “function button” to enter the submenu of the object tracking function. Select “Learning Enable”, short-press the “function button”, then dial the “function button” to turn the “Learning Enable” switch on, that is, the progress bar turns blue and the square sits at the right end of the bar. When exiting, select “Yes” to save the parameters.

projectImage

4. Save the Model

When HUSKYLENS is rebooted, by default it does not keep the last object it learned, but you can keep it by turning on “Auto Save”.

Operation method: Same as above. After entering the submenu, turn on the “Auto Save” function; the object learned last time will then be saved.

projectImage

Program Practice:

How does HUSKYLENS realize object recognition in the auto-tracking vehicle project? How does it make the vehicle follow the Maqueen Plus in front? The project is completed in 4 steps.

First, design the program of the first vehicle, using the infrared remote controller to control it to move forward, turn left, and turn right. Next, use the object tracking function of HUSKYLENS to output the parameters of the target object (X coordinate and height) on the OLED screen. Then learn how to make the vehicle follow the specified target, adjusting whenever the target moves. Finally, complete the whole project so that Maqueen Plus can follow the vehicle ahead as required.

Note: The programs from Task 2 onward are all for the second (following) vehicle.

Task1: Design Program for the First Vehicle

1. Program Design

Step1: Learn the Controller

An infrared remote controller is used in this project, so we need to understand the code value corresponding to each key on it.

projectImage

Step2: Instruction Learning

projectImage

Step3: Function Analysis

The infrared remote controller controls the first Maqueen Plus vehicle to move forward, move backward, turn left, turn right, and stop. The program flowchart is as follows.

projectImage
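The dispatch logic of the flowchart can be sketched as a plain Python lookup. The mapping uses the button labels from this lesson (2 = forward, 8 = backward, 4 = left, 6 = right, 5 = stop); the raw IR code values from the key-code table are not reproduced here:

```python
# Hypothetical key-to-action table; the real program compares the
# raw IR code value received from the remote controller instead.
ACTIONS = {
    "2": "forward",
    "8": "backward",
    "4": "turn_left",
    "6": "turn_right",
    "5": "stop",
}

def handle_key(key):
    """Return the motion command for a pressed key, or None for unmapped keys."""
    return ACTIONS.get(key)
```

In the block program, each branch of this lookup corresponds to one “if IR key received” block driving the Maqueen Plus motors.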

2. Sample Program

projectImage

 

3. Operating Effect

Turn on Maqueen Plus and press buttons 2, 8, 4, 6 and 5 on the remote controller to make Maqueen Plus go forward, go backward, turn left, turn right, and stop accordingly.

projectImage

Task2: Learn Object Tracking

1. Program Design

Step1: Learning and Recognition

Select a target pattern you like and let HUSKYLENS learn it.

 

Step2: Instruction Learning

projectImage

Step3: Function Analysis

Coordinate Analysis

The screen resolution of the HUSKYLENS sensor is 320×240, as shown in the picture. The center-point coordinates of the object obtained through the program are also within this range. For example, if the obtained coordinate is (160, 120), the tracked object is at the center of the screen.

projectImage

The "X center" and "Y center" of the frame parameter refer to the position of the center point of the recognition frame in the screen coordinate system.

The "width" and "height" of the frame parameters refer to the size of the recognition frame. Under the object tracking function, the recognition frame is a square, so the width and height are equal.

projectImage
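The coordinate convention can be checked with a small helper function. The tolerance value here is an arbitrary choice for illustration:

```python
# HUSKYLENS screen resolution: 320 x 240, center at (160, 120).
SCREEN_W, SCREEN_H = 320, 240

def is_centered(x_center, y_center, tolerance=40):
    """True if the recognition frame's center point lies within
    `tolerance` pixels of the screen center."""
    return (abs(x_center - SCREEN_W // 2) <= tolerance
            and abs(y_center - SCREEN_H // 2) <= tolerance)
```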

Function Analysis

This task is mainly to display the X center, Y center, width, and height of the frame through OLED.

2. Sample Program

projectImage

 

 

3. Operating Effect

We can see the parameters on the OLED. Try moving the object left and right and observe how the X value changes; up and down for the Y value; forward and backward for the width and height values.

 

projectImage

Task3: Front and Back Adjustment

1. Program Design

Step1: Function Analysis

Maqueen Plus must move forward and backward following the "steps" of the vehicle in front.

Based on the data obtained in Task 2, we can judge the distance of the target object from the height of the recognition frame, and then adjust accordingly.

projectImage
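The height-based distance judgment can be written as a pure decision function. The threshold values below are illustrative assumptions, not the exact numbers from the sample program; tune them to your own setup:

```python
NEAR_HEIGHT = 90  # frame taller than this -> target too close, back up (assumed value)
FAR_HEIGHT = 60   # frame shorter than this -> target too far, catch up (assumed value)

def distance_command(frame_height):
    """Choose a front/back motion from the recognition frame's height:
    a taller frame means the target is closer."""
    if frame_height > NEAR_HEIGHT:
        return "backward"
    if frame_height < FAR_HEIGHT:
        return "forward"
    return "stop"
```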

Step2: Flowchart Analysis

projectImage

2. Sample Program

projectImage

 

3. Operating Effect

After learning the target pattern behind the vehicle in front, Maqueen Plus will automatically follow the pattern to move forward and backward, and always stay within a suitable distance range. If the vehicle in front moves, Maqueen Plus will follow it.

Task4: Left and Right Adjustment

1. Program Design

Step1: Function Analysis

As shown in the picture, the screen is divided into 3 sections along the X axis of the camera’s screen coordinate system, and the middle section is the target section. The camera continuously detects the state of the target object. When Maqueen Plus is moving forward: if the X center is between 120 and 200, the target is in the center of the field of view and no adjustment is needed; if the X center is between 0 and 120, Maqueen Plus turns to the left; if the X center is between 200 and 320, Maqueen Plus turns to the right. When Maqueen Plus is moving backward: if the X center is between 0 and 140, it backs up to the right; if the X center is between 180 and 320, it backs up to the left.

 

projectImage
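The section boundaries above translate directly into two small decision functions, one for each driving direction (a sketch in Python; the command names are illustrative):

```python
def steer_forward(x_center):
    """Left/right adjustment while moving forward (320-pixel-wide screen)."""
    if x_center < 120:
        return "turn_left"
    if x_center > 200:
        return "turn_right"
    return "straight"

def steer_backward(x_center):
    """Left/right adjustment while reversing."""
    if x_center < 140:
        return "back_right"
    if x_center > 180:
        return "back_left"
    return "straight"
```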

Step2: Flowchart Analysis

Based on task 2, the direction of the vehicle in front is judged by the position of the X center point. Maqueen Plus makes left-right adjustments based on the results.

Note: Left-right adjustment is divided into adjustment while moving forward (forward left-right adjustment) and adjustment while moving backward (backward left-right adjustment).

projectImage

The overall flowchart is as follows:

projectImage

2. Sample Program

Based on Task 3, the “adjust left and right forward” and “adjust left and right backward” functions are added. The final program is as follows:

projectImage
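The combined follow logic of Tasks 3 and 4 can be summarized as one pure decision function, testable without hardware. The height thresholds are illustrative assumptions; the X-section boundaries follow Task 4:

```python
def follow_command(x_center, frame_height, near=90, far=60):
    """Map the tracked frame's position and size to a motion command.
    `near`/`far` are assumed height thresholds, not values from the
    actual sample program."""
    if frame_height > near:        # too close: back up, steering as in Task 4
        if x_center < 140:
            return "back_right"
        if x_center > 180:
            return "back_left"
        return "backward"
    if frame_height < far:         # too far: advance, steering as in Task 4
        if x_center < 120:
            return "forward_left"
        if x_center > 200:
            return "forward_right"
        return "forward"
    return "stop"                  # proper distance: hold position
```

In the actual block program, each returned command corresponds to a Maqueen Plus motor block called inside the main loop after reading the HUSKYLENS frame data.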

 

3. Operating Effect

After learning the target behind the vehicle in front, Maqueen Plus will automatically follow it forward, backward, left, and right, always keeping the object frame in the center of the screen and maintaining a proper distance between the two vehicles.

Project Development

In this project, we used two vehicles for the experiment. Can you use more vehicles, each following the one ahead? Let them always follow the locomotive, just like a train.

Note: During the experiment, please adjust the wheel speed according to the actual situation.

FILE: Program.zip (599KB)
License: All Rights Reserved