  
==== 4. Object Tracking ====
By using an object tracking algorithm we can uniquely identify an object instance, in our case the drone. The goal of an object tracker is to locate a moving object by estimating the position of the target in future frames, and to check whether the object in the current frame is the same one that was in the previous frame.
  
The tracking process starts by defining an initial bounding box around the target object.

A good tracker must model both the motion and the appearance of an object, and search the predicted motion space to localize the object in future frames using the knowledge gained from past frames.
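The loop this describes can be sketched as follows. The helper names `predict_roi` and `locate_in_roi` are hypothetical placeholders standing in for the motion model and the appearance model; this is a minimal sketch of the idea, not the project's actual implementation.

```python
def track(frames, init_bbox, predict_roi, locate_in_roi):
    """Generic tracking loop sketch.

    A bounding box is a tuple (x, y, w, h). For every new frame the
    motion model proposes a region of interest from past knowledge,
    and the appearance model finds the exact box inside that region.
    `predict_roi` and `locate_in_roi` are hypothetical callables.
    """
    history = [init_bbox]                  # knowledge from past frames
    for frame in frames:
        roi = predict_roi(history)         # motion model: where to look
        bbox = locate_in_roi(frame, roi)   # appearance model: exact box
        history.append(bbox)
    return history
```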
  
== Motion modelling ==
  
Objects do not move randomly in space; rather, they have movement characteristics and patterns that can be modeled.
  
Therefore, a successful object tracker must build a movement estimation model that remembers how the object moved in past frames in order to predict the region where the object is likely to appear next. This also makes the algorithm faster, because it reduces the size of the region of interest that the tracker needs to scan for the object.
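As an illustration, a constant-velocity assumption already gives a very small movement estimation model; real trackers often use something richer (e.g. a Kalman filter), so treat the function below as a sketch, with `margin` as a made-up tuning parameter.

```python
def predict_roi(centers, box_w, box_h, margin=1.5):
    """Constant-velocity motion model sketch.

    From the last two object centers, estimate the velocity, predict
    the next center, and return an enlarged search region (x, y, w, h)
    so the appearance model only scans part of the frame.
    """
    (x0, y0), (x1, y1) = centers[-2], centers[-1]
    vx, vy = x1 - x0, y1 - y0          # pixels moved per frame
    px, py = x1 + vx, y1 + vy          # predicted next center
    w, h = box_w * margin, box_h * margin
    return (px - w / 2, py - h / 2, w, h)
```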
  
== Appearance Modelling ==
  
An object instance also has appearance characteristics. A good tracker must learn the appearance of the object it tracks by using previous frames to train an appearance model, and it must also learn to differentiate the object from the background of the image.
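For illustration only, one of the simplest appearance models is a color histogram of the object patch, compared to candidate patches by histogram intersection; the appearance model actually used in this project may differ.

```python
import numpy as np

def color_histogram(patch, bins=8):
    """Appearance model sketch: a joint RGB histogram of an image
    patch (H x W x 3 uint8 array), normalized to sum to 1."""
    hist, _ = np.histogramdd(patch.reshape(-1, 3).astype(float),
                             bins=(bins,) * 3, range=((0, 256),) * 3)
    return hist / hist.sum()

def similarity(hist_a, hist_b):
    """Histogram intersection: 1.0 means identical color appearance,
    0.0 means no overlap (object easy to separate from background)."""
    return float(np.minimum(hist_a, hist_b).sum())
```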
  
To sum up, if the tracker has reliable models of the object's look and behavior, it can use this knowledge to find the exact location of the target object in the current frame.
  
== Types of object trackers ==
  
**Offline learning trackers** are used when we have recorded media; in that case we can also use the future frames to make tracking predictions.
  
**Online learning trackers** train themselves on the object, which is given to the tracker by drawing a bounding box around it. These trackers use an array of frames, starting from the initial frame up to the frame just before the current one.
  
A decision has to be made:
  - Use an online tracker that trains itself.
  - Use an offline tracker that has already been trained.
  - Train an offline tracker to identify only drones.
  - Train an offline tracker to identify drones and many other objects.
  
Offline trackers do not need to learn anything during the tracking process, which makes them faster, but training them is not easy because we can never train a CNN for every possibility. Online learning trackers, on the other hand, learn only about the object we are interested in at that moment. For example, if the object is red and the background contains no red at all, tracking will be very easy; in the opposite case it may be very challenging. This is not a physics problem that we can explain and formulate using mathematics, but rather an engineering problem that requires experimentation, and different trackers have their advantages in different cases. Therefore, I have decided to implement several tracking algorithms and let the user decide which one to use in each scenario.
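One way to let the user pick a tracking algorithm per scenario is a small registry keyed by name; the tracker classes below are empty placeholders for illustration, not the real implementations.

```python
TRACKERS = {}

def register(name):
    """Decorator that registers a tracker factory under a user-facing name."""
    def wrap(factory):
        TRACKERS[name] = factory
        return factory
    return wrap

def create_tracker(name):
    """Create the tracker the user selected, e.g. via a command-line flag."""
    if name not in TRACKERS:
        raise ValueError(f"unknown tracker {name!r}, "
                         f"choose one of {sorted(TRACKERS)}")
    return TRACKERS[name]()

@register("kcf")
class KCFTracker:          # placeholder, not the real implementation
    pass

@register("goturn")
class GOTURNTracker:       # placeholder, not the real implementation
    pass
```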
  
I have been implementing various tracking algorithms and will continue working on this in the future. For more information, updates, and guidance on when to use which algorithm, please check the GitLab page referenced in this document.
==== 5. PID Controller ====
  
  
The output of the tracking algorithm is a bounding box that represents the location of the object we track, i.e. the drone. From the tracker output we calculate the error, which is the distance between the center of the current frame and the center of the drone object in that frame. This error is the input to the PID controller, which tells the PTU (Pan and Tilt Unit) in which direction to move in order to keep the object in the center of the frame.
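The error-to-motion step can be sketched with a textbook PID loop. The gains and the (dx, dy) error convention below are assumptions for illustration, not tuned values from this project.

```python
class PID:
    """Textbook PID controller:
    output = Kp*error + Ki*sum(error*dt) + Kd*d(error)/dt."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error, dt):
        """Advance one control cycle; returns the actuator command."""
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return (self.kp * error + self.ki * self.integral
                + self.kd * derivative)

def center_error(frame_w, frame_h, bbox):
    """Distance between the frame center and the bounding-box center,
    as (dx, dy) in pixels: dx would drive the pan axis, dy the tilt axis."""
    x, y, w, h = bbox
    return (x + w / 2 - frame_w / 2, y + h / 2 - frame_h / 2)
```

One PID instance per axis (pan and tilt) would then turn each error component into a PTU velocity command.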
==== 6. References ====
user/deniz001.1613673804.txt.gz · Last modified: 2021/08/24 17:34 (external edit)