Commit: addyolo (author: Hong)

Showing 58 changed files with 3803 additions and 20 deletions
1 code/web/package-lock.json
2 -code/web/node_modules/**
\ No newline at end of file
2 +code/web/node_modules/**
3 +/code/venv
4 +/code/deep_sort_yolov4/model_data/yolov4.weights
5 +/code/deep_sort_yolov4/model_data/yolo4_weight.h5
\ No newline at end of file
1 +# YOLOv4 + Deep_SORT
2 +
3 +<img src="https://github.com/yehengchen/Object-Detection-and-Tracking/blob/master/OneStage/yolo/deep_sort_yolov4/output/comparison.png" width="81%" height="81%"> <img src="https://github.com/yehengchen/video_demo/blob/master/video_demo/output.gif" width="40%" height="40%"> <img src="https://github.com/yehengchen/video_demo/blob/master/video_demo/TownCentreXVID_output.gif" width="40%" height="40%">
4 +
5 +__Object Tracking & Counting Demo - [[BiliBili]](https://www.bilibili.com/video/BV1Ug4y1i71w#reply3014975828) [[Chinese Version]](https://blog.csdn.net/weixin_38107271/article/details/96741706)__
6 +## Requirements
7 +__Development Environment: [Deep-Learning-Environment-Setup](https://github.com/yehengchen/Ubuntu-16.04-Deep-Learning-Environment-Setup)__
8 +
9 +* OpenCV
10 +* scikit-learn
11 +* pillow
12 +* numpy 1.15.0
13 +* torch 1.3.0
14 +* tensorflow-gpu 1.13.1
15 +* CUDA 10.0
16 +***
17 +
18 +It uses:
19 +
20 +* __Detection__: [YOLOv4](https://github.com/yehengchen/Object-Detection-and-Tracking/tree/master/OneStage/yolo/Train-a-YOLOv4-model) to detect objects in each video frame. - Train a YOLOv4 model on your own data
21 +
22 +* __Tracking__: [Deep_SORT](https://github.com/nwojke/deep_sort) to track those objects over different frames.
23 +
24 +*This repository contains code for Simple Online and Realtime Tracking with a Deep Association Metric (Deep SORT). We extend the original SORT algorithm to integrate appearance information based on a deep appearance descriptor. See the [arXiv preprint](https://arxiv.org/abs/1703.07402) for more information.*
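
Condensed, the per-frame pipeline implemented in `main.py` (later in this commit) looks like the sketch below; the names and signatures are the ones used in that file:

```
# Sketch of the detect -> encode -> track loop from main.py.
from deep_sort import nn_matching
from deep_sort.detection import Detection
from deep_sort.tracker import Tracker
from tools import generate_detections as gdet
from yolo import YOLO

yolo = YOLO()
encoder = gdet.create_box_encoder('model_data/market1501.pb', batch_size=1)
tracker = Tracker(nn_matching.NearestNeighborDistanceMetric("cosine", 0.3, None))

# inside the video loop: `frame` is a BGR ndarray from cv2, `image` its RGB PIL view
boxs, confidence, class_names = yolo.detect_image(image)   # 1. detect objects
features = encoder(frame, boxs)                            # 2. appearance descriptors
detections = [Detection(b, 1.0, f) for b, f in zip(boxs, features)]
tracker.predict()                                          # 3. Kalman motion step
tracker.update(detections)                                 # 4. association + update
for track in tracker.tracks:
    if track.is_confirmed() and track.time_since_update <= 1:
        print(track.track_id, track.to_tlbr())
```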
25 +
26 +## Quick Start
27 +
28 +__0. Requirements__
29 +
30 + pip install -r requirements.txt
31 +
32 +__1. Download the code to your computer.__
33 +
34 + git clone https://github.com/yehengchen/Object-Detection-and-Tracking.git
35 +
36 +__2. Download [[yolov4.weights]](https://drive.google.com/file/d/1cewMfusmPjYWbrnuJRuKhPMwRe_b9PaT/view) [[Baidu]](https://pan.baidu.com/s/1jRudrrXAS3DRGqT6mL4L3A) (extraction code: `mnv6`)__ and place it in `deep_sort_yolov4/model_data/`.
37 +
38 +*Here you can download my trained [[yolo4_weight.h5]](https://pan.baidu.com/s/1JuT4KCUFaE2Gvme0_S37DQ) (extraction code: `w17w`) weights for detecting person/car/bicycle, etc.*
39 +
40 +__3. Convert the Darknet YOLO model to a Keras model:__
41 +```
42 +$ python convert.py model_data/yolov4.cfg model_data/yolov4.weights model_data/yolo.h5
43 +```
44 +__4. Run the YOLO_DEEP_SORT:__
45 +
46 +```
47 +$ python main.py -c [CLASS NAME] -i [INPUT VIDEO PATH]
48 +
49 +$ python main.py -c person -i ./test_video/testvideo.avi
50 +```
51 +
52 +__5. You can change [deep_sort_yolov4/yolo.py] at __Line 100__ to set your tracking object:__
53 +
54 +*The DeepSORT pre-trained weights were trained on person re-ID datasets, so they are only suitable for tracking people.*
55 +```
56 + if predicted_class != args["class"]:
57 + continue
58 +
59 + if predicted_class != 'person' and predicted_class != 'car':
60 + continue
61 +```
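
To keep several classes at once, one simple variation (a sketch reusing the `predicted_class` name from the snippet above) is to test membership in a set:

```
    # Hypothetical multi-class filter; names must match the model's class labels.
    wanted_classes = {"person", "car", "bicycle"}
    if predicted_class not in wanted_classes:
        continue
```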
62 +
63 +## Train on Market1501 & MARS
64 +*People Re-identification model*
65 +
66 +Use [cosine_metric_learning](https://github.com/nwojke/cosine_metric_learning) to train a metric feature representation for use with the deep_sort tracker.
67 +
68 +## Citation
69 +
70 +### YOLOv4 :
71 +
72 + @misc{bochkovskiy2020yolov4,
73 + title={YOLOv4: Optimal Speed and Accuracy of Object Detection},
74 + author={Alexey Bochkovskiy and Chien-Yao Wang and Hong-Yuan Mark Liao},
75 + year={2020},
76 + eprint={2004.10934},
77 + archivePrefix={arXiv},
78 + primaryClass={cs.CV}
79 + }
80 +
81 +### Deep_SORT :
82 +
83 + @inproceedings{Wojke2017simple,
84 + title={Simple Online and Realtime Tracking with a Deep Association Metric},
85 + author={Wojke, Nicolai and Bewley, Alex and Paulus, Dietrich},
86 + booktitle={2017 IEEE International Conference on Image Processing (ICIP)},
87 + year={2017},
88 + pages={3645--3649},
89 + organization={IEEE},
90 + doi={10.1109/ICIP.2017.8296962}
91 + }
92 +
93 + @inproceedings{Wojke2018deep,
94 + title={Deep Cosine Metric Learning for Person Re-identification},
95 + author={Wojke, Nicolai and Bewley, Alex},
96 + booktitle={2018 IEEE Winter Conference on Applications of Computer Vision (WACV)},
97 + year={2018},
98 + pages={748--756},
99 + organization={IEEE},
100 + doi={10.1109/WACV.2018.00087}
101 + }
102 +
103 +## Reference
104 +#### Github:deep_sort@[Nicolai Wojke nwojke](https://github.com/nwojke/deep_sort)
105 +#### Github:deep_sort_yolov3@[Qidian213 ](https://github.com/Qidian213/deep_sort_yolov3)
106 +#### Github:Deep-SORT-YOLOv4@[LeonLok](https://github.com/LeonLok/Deep-SORT-YOLOv4)
107 +
1 +import os
2 +import colorsys
3 +
4 +import numpy as np
5 +from keras import backend as K
6 +from keras.models import load_model
7 +from keras.layers import Input
8 +
9 +from yolo4.model import yolo_eval, yolo4_body
10 +from yolo4.utils import letterbox_image
11 +
12 +from PIL import Image, ImageFont, ImageDraw
13 +from timeit import default_timer as timer
14 +import matplotlib.pyplot as plt
15 +
16 +from operator import itemgetter
17 +
18 +class Yolo4(object):
19 + def get_class(self):
20 + classes_path = os.path.expanduser(self.classes_path)
21 + with open(classes_path) as f:
22 + class_names = f.readlines()
23 + class_names = [c.strip() for c in class_names]
24 + return class_names
25 +
26 + def get_anchors(self):
27 + anchors_path = os.path.expanduser(self.anchors_path)
28 + with open(anchors_path) as f:
29 + anchors = f.readline()
30 + anchors = [float(x) for x in anchors.split(',')]
31 + return np.array(anchors).reshape(-1, 2)
32 +
33 + def load_yolo(self):
34 + model_path = os.path.expanduser(self.model_path)
35 + assert model_path.endswith('.h5'), 'Keras model or weights must be a .h5 file.'
36 +
37 + self.class_names = self.get_class()
38 + self.anchors = self.get_anchors()
39 +
40 + num_anchors = len(self.anchors)
41 + num_classes = len(self.class_names)
42 +
43 + # Generate colors for drawing bounding boxes.
44 + hsv_tuples = [(x / len(self.class_names), 1., 1.)
45 + for x in range(len(self.class_names))]
46 + self.colors = list(map(lambda x: colorsys.hsv_to_rgb(*x), hsv_tuples))
47 + self.colors = list(
48 + map(lambda x: (int(x[0] * 255), int(x[1] * 255), int(x[2] * 255)),
49 + self.colors))
50 +
51 + self.sess = K.get_session()
52 +
53 + # Load model, or construct model and load weights.
54 + self.yolo4_model = yolo4_body(Input(shape=(608, 608, 3)), num_anchors//3, num_classes)
55 +
56 + # Read and convert darknet weight
57 + print('Loading weights.')
58 + weights_file = open(self.weights_path, 'rb')
59 + major, minor, revision = np.ndarray(
60 + shape=(3, ), dtype='int32', buffer=weights_file.read(12))
61 + if (major*10+minor)>=2 and major<1000 and minor<1000:
62 + seen = np.ndarray(shape=(1,), dtype='int64', buffer=weights_file.read(8))
63 + else:
64 + seen = np.ndarray(shape=(1,), dtype='int32', buffer=weights_file.read(4))
65 + print('Weights Header: ', major, minor, revision, seen)
66 +
67 + convs_to_load = []
68 + bns_to_load = []
69 + for i in range(len(self.yolo4_model.layers)):
70 + layer_name = self.yolo4_model.layers[i].name
71 + if layer_name.startswith('conv2d_'):
72 + convs_to_load.append((int(layer_name[7:]), i))
73 + if layer_name.startswith('batch_normalization_'):
74 + bns_to_load.append((int(layer_name[20:]), i))
75 +
76 + convs_sorted = sorted(convs_to_load, key=itemgetter(0))
77 + bns_sorted = sorted(bns_to_load, key=itemgetter(0))
78 +
79 + bn_index = 0
80 + for i in range(len(convs_sorted)):
81 + print('Converting ', i)
82 + if i == 93 or i == 101 or i == 109:
83 + #no bn, with bias
84 + weights_shape = self.yolo4_model.layers[convs_sorted[i][1]].get_weights()[0].shape
85 + bias_shape = self.yolo4_model.layers[convs_sorted[i][1]].get_weights()[0].shape[3]
86 + filters = bias_shape
87 + size = weights_shape[0]
88 + darknet_w_shape = (filters, weights_shape[2], size, size)
89 + weights_size = np.product(weights_shape)
90 +
91 + conv_bias = np.ndarray(
92 + shape=(filters, ),
93 + dtype='float32',
94 + buffer=weights_file.read(filters * 4))
95 + conv_weights = np.ndarray(
96 + shape=darknet_w_shape,
97 + dtype='float32',
98 + buffer=weights_file.read(weights_size * 4))
99 + conv_weights = np.transpose(conv_weights, [2, 3, 1, 0])
100 + self.yolo4_model.layers[convs_sorted[i][1]].set_weights([conv_weights, conv_bias])
101 + else:
102 + #with bn, no bias
103 + weights_shape = self.yolo4_model.layers[convs_sorted[i][1]].get_weights()[0].shape
104 + size = weights_shape[0]
105 + bn_shape = self.yolo4_model.layers[bns_sorted[bn_index][1]].get_weights()[0].shape
106 + filters = bn_shape[0]
107 + darknet_w_shape = (filters, weights_shape[2], size, size)
108 + weights_size = np.product(weights_shape)
109 +
110 + conv_bias = np.ndarray(
111 + shape=(filters, ),
112 + dtype='float32',
113 + buffer=weights_file.read(filters * 4))
114 + bn_weights = np.ndarray(
115 + shape=(3, filters),
116 + dtype='float32',
117 + buffer=weights_file.read(filters * 12))
118 +
119 + bn_weight_list = [
120 + bn_weights[0], # scale gamma
121 + conv_bias, # shift beta
122 + bn_weights[1], # running mean
123 + bn_weights[2] # running var
124 + ]
125 + self.yolo4_model.layers[bns_sorted[bn_index][1]].set_weights(bn_weight_list)
126 +
127 + conv_weights = np.ndarray(
128 + shape=darknet_w_shape,
129 + dtype='float32',
130 + buffer=weights_file.read(weights_size * 4))
131 + conv_weights = np.transpose(conv_weights, [2, 3, 1, 0])
132 + self.yolo4_model.layers[convs_sorted[i][1]].set_weights([conv_weights])
133 +
134 + bn_index += 1
135 +
136 + weights_file.close()
137 +
138 + self.yolo4_model.save(self.model_path)
139 +
140 +
141 + if self.gpu_num >= 2:
142 + from keras.utils import multi_gpu_model  # local import; only needed on the multi-GPU path
143 + self.yolo4_model = multi_gpu_model(self.yolo4_model, gpus=self.gpu_num)
143 +
144 + self.input_image_shape = K.placeholder(shape=(2, ))
145 + self.boxes, self.scores, self.classes = yolo_eval(self.yolo4_model.output, self.anchors,
146 + len(self.class_names), self.input_image_shape,
147 + score_threshold=self.score)
148 +
149 + def __init__(self, score, iou, anchors_path, classes_path, model_path, weights_path, gpu_num=1):
150 + self.score = score
151 + self.iou = iou
152 + self.anchors_path = anchors_path
153 + self.classes_path = classes_path
154 + self.weights_path = weights_path
155 + self.model_path = model_path
156 + self.gpu_num = gpu_num
157 + self.load_yolo()
158 +
159 + def close_session(self):
160 + self.sess.close()
161 +
162 +if __name__ == '__main__':
163 + model_path = 'model_data/yolo4_weight.h5'
164 + anchors_path = 'model_data/yolo_anchors.txt'
165 + classes_path = 'model_data/coco_classes.txt'
166 + weights_path = 'model_data/yolov4.weights'
167 +
168 + score = 0.5
169 + iou = 0.5
170 +
171 + model_image_size = (608, 608)
172 +
173 + yolo4_model = Yolo4(score, iou, anchors_path, classes_path, model_path, weights_path)
174 +
175 + yolo4_model.close_session()
\ No newline at end of file
1 +# vim: expandtab:ts=4:sw=4
2 +import numpy as np
3 +
4 +
5 +class Detection(object):
6 + """
7 + This class represents a bounding box detection in a single image.
8 +
9 + Parameters
10 + ----------
11 + tlwh : array_like
12 + Bounding box in format `(x, y, w, h)`.
13 + confidence : float
14 + Detector confidence score.
15 + feature : array_like
16 + A feature vector that describes the object contained in this image.
17 +
18 + Attributes
19 + ----------
20 + tlwh : ndarray
21 + Bounding box in format `(top left x, top left y, width, height)`.
22 + confidence : ndarray
23 + Detector confidence score.
24 + feature : ndarray | NoneType
25 + A feature vector that describes the object contained in this image.
26 +
27 + """
28 +
29 + def __init__(self, tlwh, confidence, feature):
30 + self.tlwh = np.asarray(tlwh, dtype=np.float)
31 + self.confidence = float(confidence)
32 + self.feature = np.asarray(feature, dtype=np.float32)
33 +
34 + def to_tlbr(self):
35 + """Convert bounding box to format `(min x, min y, max x, max y)`, i.e.,
36 + `(top left, bottom right)`.
37 + """
38 + ret = self.tlwh.copy()
39 + ret[2:] += ret[:2]
40 + return ret
41 +
42 + def to_xyah(self):
43 + """Convert bounding box to format `(center x, center y, aspect ratio,
44 + height)`, where the aspect ratio is `width / height`.
45 + """
46 + ret = self.tlwh.copy()
47 + ret[:2] += ret[2:] / 2
48 + ret[2] /= ret[3]
49 + return ret
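
A quick numeric check of the two conversions in `Detection` above (a sketch; the `deep_sort.detection` import path assumes this package layout):

```
import numpy as np
from deep_sort.detection import Detection

d = Detection([100., 50., 40., 80.], 0.9, np.zeros(128))  # tlwh, confidence, feature
print(d.to_tlbr())  # [100.  50. 140. 130.]  bottom-right = top-left + (w, h)
print(d.to_xyah())  # [120.  90.   0.5  80.] center (120, 90), aspect 40/80, height 80
```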
1 +# vim: expandtab:ts=4:sw=4
2 +import numpy as np
3 +
4 +
5 +class Detection_YOLO(object):
6 + """
7 + This class represents a bounding box detection in a single image.
8 + Parameters
9 + ----------
10 + tlwh : array_like
11 + Bounding box in format `(x, y, w, h)`.
12 + confidence : float
13 + Detector confidence score.
14 + feature : array_like
15 + A feature vector that describes the object contained in this image.
16 + Attributes
17 + ----------
18 + tlwh : ndarray
19 + Bounding box in format `(top left x, top left y, width, height)`.
20 + confidence : ndarray
21 + Detector confidence score.
22 + feature : ndarray | NoneType
23 + A feature vector that describes the object contained in this image.
24 + """
25 +
26 + def __init__(self, tlwh, confidence, cls):
27 + self.tlwh = np.asarray(tlwh, dtype=np.float)
28 + self.confidence = float(confidence)
29 + self.cls = cls
30 +
31 + def to_tlbr(self):
32 + """Convert bounding box to format `(min x, min y, max x, max y)`, i.e.,
33 + `(top left, bottom right)`.
34 + """
35 + ret = self.tlwh.copy()
36 + ret[2:] += ret[:2]
37 + return ret
38 +
39 + def to_xyah(self):
40 + """Convert bounding box to format `(center x, center y, aspect ratio,
41 + height)`, where the aspect ratio is `width / height`.
42 + """
43 + ret = self.tlwh.copy()
44 + ret[:2] += ret[2:] / 2
45 + ret[2] /= ret[3]
46 + return ret
1 +# vim: expandtab:ts=4:sw=4
2 +from __future__ import absolute_import
3 +import numpy as np
4 +from . import linear_assignment
5 +
6 +
7 +def iou(bbox, candidates):
8 + """Computer intersection over union.
9 +
10 + Parameters
11 + ----------
12 + bbox : ndarray
13 + A bounding box in format `(top left x, top left y, width, height)`.
14 + candidates : ndarray
15 + A matrix of candidate bounding boxes (one per row) in the same format
16 + as `bbox`.
17 +
18 + Returns
19 + -------
20 + ndarray
21 + The intersection over union in [0, 1] between the `bbox` and each
22 + candidate. A higher score means a larger fraction of the `bbox` is
23 + occluded by the candidate.
24 +
25 + """
26 + bbox_tl, bbox_br = bbox[:2], bbox[:2] + bbox[2:]
27 + candidates_tl = candidates[:, :2]
28 + candidates_br = candidates[:, :2] + candidates[:, 2:]
29 +
30 + tl = np.c_[np.maximum(bbox_tl[0], candidates_tl[:, 0])[:, np.newaxis],
31 + np.maximum(bbox_tl[1], candidates_tl[:, 1])[:, np.newaxis]]
32 + br = np.c_[np.minimum(bbox_br[0], candidates_br[:, 0])[:, np.newaxis],
33 + np.minimum(bbox_br[1], candidates_br[:, 1])[:, np.newaxis]]
34 + wh = np.maximum(0., br - tl)
35 +
36 + area_intersection = wh.prod(axis=1)
37 + area_bbox = bbox[2:].prod()
38 + area_candidates = candidates[:, 2:].prod(axis=1)
39 + return area_intersection / (area_bbox + area_candidates - area_intersection)
40 +
41 +
42 +def iou_cost(tracks, detections, track_indices=None,
43 + detection_indices=None):
44 + """An intersection over union distance metric.
45 +
46 + Parameters
47 + ----------
48 + tracks : List[deep_sort.track.Track]
49 + A list of tracks.
50 + detections : List[deep_sort.detection.Detection]
51 + A list of detections.
52 + track_indices : Optional[List[int]]
53 + A list of indices to tracks that should be matched. Defaults to
54 + all `tracks`.
55 + detection_indices : Optional[List[int]]
56 + A list of indices to detections that should be matched. Defaults
57 + to all `detections`.
58 +
59 + Returns
60 + -------
61 + ndarray
62 + Returns a cost matrix of shape
63 + len(track_indices), len(detection_indices) where entry (i, j) is
64 + `1 - iou(tracks[track_indices[i]], detections[detection_indices[j]])`.
65 +
66 + """
67 + if track_indices is None:
68 + track_indices = np.arange(len(tracks))
69 + if detection_indices is None:
70 + detection_indices = np.arange(len(detections))
71 +
72 + cost_matrix = np.zeros((len(track_indices), len(detection_indices)))
73 + for row, track_idx in enumerate(track_indices):
74 + if tracks[track_idx].time_since_update > 1:
75 + cost_matrix[row, :] = linear_assignment.INFTY_COST
76 + continue
77 +
78 + bbox = tracks[track_idx].to_tlwh()
79 + candidates = np.asarray([detections[i].tlwh for i in detection_indices])
80 + cost_matrix[row, :] = 1. - iou(bbox, candidates)
81 + return cost_matrix
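
For intuition, a small worked example of `iou` above (a sketch with hand-picked boxes; the import path assumes this package layout):

```
import numpy as np
from deep_sort.iou_matching import iou

bbox = np.array([0., 0., 10., 10.])             # tlwh: 10x10 box at the origin
candidates = np.array([[5., 5., 10., 10.],      # overlaps in a 5x5 corner
                       [20., 20., 10., 10.]])   # disjoint
print(iou(bbox, candidates))                    # [0.14285714 0.] = 25/175 and 0
```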
1 +# vim: expandtab:ts=4:sw=4
2 +import numpy as np
3 +import scipy.linalg
4 +
5 +
6 +"""
7 +Table for the 0.95 quantile of the chi-square distribution with N degrees of
8 +freedom (contains values for N=1, ..., 9). Taken from MATLAB/Octave's chi2inv
9 +function and used as Mahalanobis gating threshold.
10 +"""
11 +chi2inv95 = {
12 + 1: 3.8415,
13 + 2: 5.9915,
14 + 3: 7.8147,
15 + 4: 9.4877,
16 + 5: 11.070,
17 + 6: 12.592,
18 + 7: 14.067,
19 + 8: 15.507,
20 + 9: 16.919}
21 +
22 +
23 +class KalmanFilter(object):
24 + """
25 + A simple Kalman filter for tracking bounding boxes in image space.
26 +
27 + The 8-dimensional state space
28 +
29 + x, y, a, h, vx, vy, va, vh
30 +
31 + contains the bounding box center position (x, y), aspect ratio a, height h,
32 + and their respective velocities.
33 +
34 + Object motion follows a constant velocity model. The bounding box location
35 + (x, y, a, h) is taken as direct observation of the state space (linear
36 + observation model).
37 +
38 + """
39 +
40 + def __init__(self):
41 + ndim, dt = 4, 1.
42 +
43 + # Create Kalman filter model matrices.
44 + self._motion_mat = np.eye(2 * ndim, 2 * ndim)
45 + for i in range(ndim):
46 + self._motion_mat[i, ndim + i] = dt
47 + self._update_mat = np.eye(ndim, 2 * ndim)
48 +
49 + # Motion and observation uncertainty are chosen relative to the current
50 + # state estimate. These weights control the amount of uncertainty in
51 + # the model. This is a bit hacky.
52 + self._std_weight_position = 1. / 20
53 + self._std_weight_velocity = 1. / 160
54 +
55 + def initiate(self, measurement):
56 + """Create track from unassociated measurement.
57 +
58 + Parameters
59 + ----------
60 + measurement : ndarray
61 + Bounding box coordinates (x, y, a, h) with center position (x, y),
62 + aspect ratio a, and height h.
63 +
64 + Returns
65 + -------
66 + (ndarray, ndarray)
67 + Returns the mean vector (8 dimensional) and covariance matrix (8x8
68 + dimensional) of the new track. Unobserved velocities are initialized
69 + to 0 mean.
70 +
71 + """
72 + mean_pos = measurement
73 + mean_vel = np.zeros_like(mean_pos)
74 + mean = np.r_[mean_pos, mean_vel]
75 +
76 + std = [
77 + 2 * self._std_weight_position * measurement[3],
78 + 2 * self._std_weight_position * measurement[3],
79 + 1e-2,
80 + 2 * self._std_weight_position * measurement[3],
81 + 10 * self._std_weight_velocity * measurement[3],
82 + 10 * self._std_weight_velocity * measurement[3],
83 + 1e-5,
84 + 10 * self._std_weight_velocity * measurement[3]]
85 + covariance = np.diag(np.square(std))
86 + return mean, covariance
87 +
88 + def predict(self, mean, covariance):
89 + """Run Kalman filter prediction step.
90 +
91 + Parameters
92 + ----------
93 + mean : ndarray
94 + The 8 dimensional mean vector of the object state at the previous
95 + time step.
96 + covariance : ndarray
97 + The 8x8 dimensional covariance matrix of the object state at the
98 + previous time step.
99 +
100 + Returns
101 + -------
102 + (ndarray, ndarray)
103 + Returns the mean vector and covariance matrix of the predicted
104 + state. Unobserved velocities are initialized to 0 mean.
105 +
106 + """
107 + std_pos = [
108 + self._std_weight_position * mean[3],
109 + self._std_weight_position * mean[3],
110 + 1e-2,
111 + self._std_weight_position * mean[3]]
112 + std_vel = [
113 + self._std_weight_velocity * mean[3],
114 + self._std_weight_velocity * mean[3],
115 + 1e-5,
116 + self._std_weight_velocity * mean[3]]
117 + motion_cov = np.diag(np.square(np.r_[std_pos, std_vel]))
118 +
119 + mean = np.dot(self._motion_mat, mean)
120 + covariance = np.linalg.multi_dot((
121 + self._motion_mat, covariance, self._motion_mat.T)) + motion_cov
122 +
123 + return mean, covariance
124 +
125 + def project(self, mean, covariance):
126 + """Project state distribution to measurement space.
127 +
128 + Parameters
129 + ----------
130 + mean : ndarray
131 + The state's mean vector (8 dimensional array).
132 + covariance : ndarray
133 + The state's covariance matrix (8x8 dimensional).
134 +
135 + Returns
136 + -------
137 + (ndarray, ndarray)
138 + Returns the projected mean and covariance matrix of the given state
139 + estimate.
140 +
141 + """
142 + std = [
143 + self._std_weight_position * mean[3],
144 + self._std_weight_position * mean[3],
145 + 1e-1,
146 + self._std_weight_position * mean[3]]
147 + innovation_cov = np.diag(np.square(std))
148 +
149 + mean = np.dot(self._update_mat, mean)
150 + covariance = np.linalg.multi_dot((
151 + self._update_mat, covariance, self._update_mat.T))
152 + return mean, covariance + innovation_cov
153 +
154 + def update(self, mean, covariance, measurement):
155 + """Run Kalman filter correction step.
156 +
157 + Parameters
158 + ----------
159 + mean : ndarray
160 + The predicted state's mean vector (8 dimensional).
161 + covariance : ndarray
162 + The state's covariance matrix (8x8 dimensional).
163 + measurement : ndarray
164 + The 4 dimensional measurement vector (x, y, a, h), where (x, y)
165 + is the center position, a the aspect ratio, and h the height of the
166 + bounding box.
167 +
168 + Returns
169 + -------
170 + (ndarray, ndarray)
171 + Returns the measurement-corrected state distribution.
172 +
173 + """
174 + projected_mean, projected_cov = self.project(mean, covariance)
175 +
176 + chol_factor, lower = scipy.linalg.cho_factor(
177 + projected_cov, lower=True, check_finite=False)
178 + kalman_gain = scipy.linalg.cho_solve(
179 + (chol_factor, lower), np.dot(covariance, self._update_mat.T).T,
180 + check_finite=False).T
181 + innovation = measurement - projected_mean
182 +
183 + new_mean = mean + np.dot(innovation, kalman_gain.T)
184 + new_covariance = covariance - np.linalg.multi_dot((
185 + kalman_gain, projected_cov, kalman_gain.T))
186 + return new_mean, new_covariance
187 +
188 + def gating_distance(self, mean, covariance, measurements,
189 + only_position=False):
190 + """Compute gating distance between state distribution and measurements.
191 +
192 + A suitable distance threshold can be obtained from `chi2inv95`. If
193 + `only_position` is False, the chi-square distribution has 4 degrees of
194 + freedom, otherwise 2.
195 +
196 + Parameters
197 + ----------
198 + mean : ndarray
199 + Mean vector over the state distribution (8 dimensional).
200 + covariance : ndarray
201 + Covariance of the state distribution (8x8 dimensional).
202 + measurements : ndarray
203 + An Nx4 dimensional matrix of N measurements, each in
204 + format (x, y, a, h) where (x, y) is the bounding box center
205 + position, a the aspect ratio, and h the height.
206 + only_position : Optional[bool]
207 + If True, distance computation is done with respect to the bounding
208 + box center position only.
209 +
210 + Returns
211 + -------
212 + ndarray
213 + Returns an array of length N, where the i-th element contains the
214 + squared Mahalanobis distance between (mean, covariance) and
215 + `measurements[i]`.
216 +
217 + """
218 + mean, covariance = self.project(mean, covariance)
219 + if only_position:
220 + mean, covariance = mean[:2], covariance[:2, :2]
221 + measurements = measurements[:, :2]
222 +
223 + cholesky_factor = np.linalg.cholesky(covariance)
224 + d = measurements - mean
225 + z = scipy.linalg.solve_triangular(
226 + cholesky_factor, d.T, lower=True, check_finite=False,
227 + overwrite_b=True)
228 + squared_maha = np.sum(z * z, axis=0)
229 + return squared_maha
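
A minimal usage sketch of the filter above (synthetic numbers; one initiate/predict/update cycle; the import path assumes this package layout):

```
import numpy as np
from deep_sort.kalman_filter import KalmanFilter

kf = KalmanFilter()
z0 = np.array([320., 240., 0.5, 100.])   # measurement (x, y, a, h)
mean, cov = kf.initiate(z0)              # 8-dim state, zero initial velocities
mean, cov = kf.predict(mean, cov)        # constant-velocity time step
z1 = np.array([324., 241., 0.5, 101.])   # next frame's measurement
mean, cov = kf.update(mean, cov, z1)     # measurement-corrected state
print(mean[:4])                          # filtered (x, y, a, h)
```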
1 +# vim: expandtab:ts=4:sw=4
2 +from __future__ import absolute_import
3 +import numpy as np
4 +from sklearn.utils.linear_assignment_ import linear_assignment
5 +from . import kalman_filter
6 +
7 +
8 +INFTY_COST = 1e+5
9 +
10 +
11 +def min_cost_matching(
12 + distance_metric, max_distance, tracks, detections, track_indices=None,
13 + detection_indices=None):
14 + """Solve linear assignment problem.
15 +
16 + Parameters
17 + ----------
18 + distance_metric : Callable[List[Track], List[Detection], List[int], List[int]) -> ndarray
19 + The distance metric is given a list of tracks and detections as well as
20 + a list of N track indices and M detection indices. The metric should
21 + return the NxM dimensional cost matrix, where element (i, j) is the
22 + association cost between the i-th track in the given track indices and
23 + the j-th detection in the given detection_indices.
24 + max_distance : float
25 + Gating threshold. Associations with cost larger than this value are
26 + disregarded.
27 + tracks : List[track.Track]
28 + A list of predicted tracks at the current time step.
29 + detections : List[detection.Detection]
30 + A list of detections at the current time step.
31 + track_indices : List[int]
32 + List of track indices that maps rows in `cost_matrix` to tracks in
33 + `tracks` (see description above).
34 + detection_indices : List[int]
35 + List of detection indices that maps columns in `cost_matrix` to
36 + detections in `detections` (see description above).
37 +
38 + Returns
39 + -------
40 + (List[(int, int)], List[int], List[int])
41 + Returns a tuple with the following three entries:
42 + * A list of matched track and detection indices.
43 + * A list of unmatched track indices.
44 + * A list of unmatched detection indices.
45 +
46 + """
47 + if track_indices is None:
48 + track_indices = np.arange(len(tracks))
49 + if detection_indices is None:
50 + detection_indices = np.arange(len(detections))
51 +
52 + if len(detection_indices) == 0 or len(track_indices) == 0:
53 + return [], track_indices, detection_indices # Nothing to match.
54 +
55 + cost_matrix = distance_metric(
56 + tracks, detections, track_indices, detection_indices)
57 + cost_matrix[cost_matrix > max_distance] = max_distance + 1e-5
58 + indices = linear_assignment(cost_matrix)
59 +
60 + matches, unmatched_tracks, unmatched_detections = [], [], []
61 + for col, detection_idx in enumerate(detection_indices):
62 + if col not in indices[:, 1]:
63 + unmatched_detections.append(detection_idx)
64 + for row, track_idx in enumerate(track_indices):
65 + if row not in indices[:, 0]:
66 + unmatched_tracks.append(track_idx)
67 + for row, col in indices:
68 + track_idx = track_indices[row]
69 + detection_idx = detection_indices[col]
70 + if cost_matrix[row, col] > max_distance:
71 + unmatched_tracks.append(track_idx)
72 + unmatched_detections.append(detection_idx)
73 + else:
74 + matches.append((track_idx, detection_idx))
75 + return matches, unmatched_tracks, unmatched_detections
76 +
77 +
78 +def matching_cascade(
79 + distance_metric, max_distance, cascade_depth, tracks, detections,
80 + track_indices=None, detection_indices=None):
81 + """Run matching cascade.
82 +
83 + Parameters
84 + ----------
85 + distance_metric : Callable[List[Track], List[Detection], List[int], List[int]) -> ndarray
86 + The distance metric is given a list of tracks and detections as well as
87 + a list of N track indices and M detection indices. The metric should
88 + return the NxM dimensional cost matrix, where element (i, j) is the
89 + association cost between the i-th track in the given track indices and
90 + the j-th detection in the given detection indices.
91 + max_distance : float
92 + Gating threshold. Associations with cost larger than this value are
93 + disregarded.
94 + cascade_depth: int
95 + The cascade depth; should be set to the maximum track age.
96 + tracks : List[track.Track]
97 + A list of predicted tracks at the current time step.
98 + detections : List[detection.Detection]
99 + A list of detections at the current time step.
100 + track_indices : Optional[List[int]]
101 + List of track indices that maps rows in `cost_matrix` to tracks in
102 + `tracks` (see description above). Defaults to all tracks.
103 + detection_indices : Optional[List[int]]
104 + List of detection indices that maps columns in `cost_matrix` to
105 + detections in `detections` (see description above). Defaults to all
106 + detections.
107 +
108 + Returns
109 + -------
110 + (List[(int, int)], List[int], List[int])
111 + Returns a tuple with the following three entries:
112 + * A list of matched track and detection indices.
113 + * A list of unmatched track indices.
114 + * A list of unmatched detection indices.
115 +
116 + """
117 + if track_indices is None:
118 + track_indices = list(range(len(tracks)))
119 + if detection_indices is None:
120 + detection_indices = list(range(len(detections)))
121 +
122 + unmatched_detections = detection_indices
123 + matches = []
124 + for level in range(cascade_depth):
125 + if len(unmatched_detections) == 0: # No detections left
126 + break
127 +
128 + track_indices_l = [
129 + k for k in track_indices
130 + if tracks[k].time_since_update == 1 + level
131 + ]
132 + if len(track_indices_l) == 0: # Nothing to match at this level
133 + continue
134 +
135 + matches_l, _, unmatched_detections = \
136 + min_cost_matching(
137 + distance_metric, max_distance, tracks, detections,
138 + track_indices_l, unmatched_detections)
139 + matches += matches_l
140 + unmatched_tracks = list(set(track_indices) - set(k for k, _ in matches))
141 + return matches, unmatched_tracks, unmatched_detections
142 +
143 +
144 +def gate_cost_matrix(
145 + kf, cost_matrix, tracks, detections, track_indices, detection_indices,
146 + gated_cost=INFTY_COST, only_position=False):
147 + """Invalidate infeasible entries in cost matrix based on the state
148 + distributions obtained by Kalman filtering.
149 +
150 + Parameters
151 + ----------
152 + kf : The Kalman filter.
153 + cost_matrix : ndarray
154 + The NxM dimensional cost matrix, where N is the number of track indices
155 + and M is the number of detection indices, such that entry (i, j) is the
156 + association cost between `tracks[track_indices[i]]` and
157 + `detections[detection_indices[j]]`.
158 + tracks : List[track.Track]
159 + A list of predicted tracks at the current time step.
160 + detections : List[detection.Detection]
161 + A list of detections at the current time step.
162 + track_indices : List[int]
163 + List of track indices that maps rows in `cost_matrix` to tracks in
164 + `tracks` (see description above).
165 + detection_indices : List[int]
166 + List of detection indices that maps columns in `cost_matrix` to
167 + detections in `detections` (see description above).
168 + gated_cost : Optional[float]
169 + Entries in the cost matrix corresponding to infeasible associations are
170 + set this value. Defaults to a very large value.
171 + only_position : Optional[bool]
172 + If True, only the x, y position of the state distribution is considered
173 + during gating. Defaults to False.
174 +
175 + Returns
176 + -------
177 + ndarray
178 + Returns the modified cost matrix.
179 +
180 + """
181 + gating_dim = 2 if only_position else 4
182 + gating_threshold = kalman_filter.chi2inv95[gating_dim]
183 + measurements = np.asarray(
184 + [detections[i].to_xyah() for i in detection_indices])
185 + for row, track_idx in enumerate(track_indices):
186 + track = tracks[track_idx]
187 + gating_distance = kf.gating_distance(
188 + track.mean, track.covariance, measurements, only_position)
189 + cost_matrix[row, gating_distance > gating_threshold] = gated_cost
190 + return cost_matrix
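
Note that `sklearn.utils.linear_assignment_` was deprecated in scikit-learn 0.21 and removed in 0.23, so the import at the top of this file only works with an older scikit-learn. On newer versions, a common workaround (a sketch, not part of this commit) is to wrap `scipy.optimize.linear_sum_assignment`, which solves the same problem but returns two index arrays instead of one Nx2 array:

```
# Drop-in stand-in for the removed sklearn helper (assumes scipy is available).
import numpy as np
from scipy.optimize import linear_sum_assignment


def linear_assignment(cost_matrix):
    rows, cols = linear_sum_assignment(cost_matrix)
    return np.array(list(zip(rows, cols)))  # same Nx2 shape sklearn returned
```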
1 +# vim: expandtab:ts=4:sw=4
2 +import numpy as np
3 +
4 +
5 +def _pdist(a, b):
6 + """Compute pair-wise squared distance between points in `a` and `b`.
7 +
8 + Parameters
9 + ----------
10 + a : array_like
11 + An NxM matrix of N samples of dimensionality M.
12 + b : array_like
13 + An LxM matrix of L samples of dimensionality M.
14 +
15 + Returns
16 + -------
17 + ndarray
18 + Returns a matrix of size len(a), len(b) such that element (i, j)
19 + contains the squared distance between `a[i]` and `b[j]`.
20 +
21 + """
22 + a, b = np.asarray(a), np.asarray(b)
23 + if len(a) == 0 or len(b) == 0:
24 + return np.zeros((len(a), len(b)))
25 + a2, b2 = np.square(a).sum(axis=1), np.square(b).sum(axis=1)
26 + r2 = -2. * np.dot(a, b.T) + a2[:, None] + b2[None, :]
27 + r2 = np.clip(r2, 0., float(np.inf))
28 + return r2
29 +
30 +
31 +def _cosine_distance(a, b, data_is_normalized=False):
32 + """Compute pair-wise cosine distance between points in `a` and `b`.
33 +
34 + Parameters
35 + ----------
36 + a : array_like
37 + An NxM matrix of N samples of dimensionality M.
38 + b : array_like
39 + An LxM matrix of L samples of dimensionality M.
40 + data_is_normalized : Optional[bool]
41 + If True, assumes rows in a and b are unit length vectors.
42 + Otherwise, a and b are explicitly normalized to length 1.
43 +
44 + Returns
45 + -------
46 + ndarray
47 + Returns a matrix of size len(a), len(b) such that element (i, j)
48 + contains the cosine distance between `a[i]` and `b[j]`.
49 +
50 + """
51 + if not data_is_normalized:
52 + a = np.asarray(a) / np.linalg.norm(a, axis=1, keepdims=True)
53 + b = np.asarray(b) / np.linalg.norm(b, axis=1, keepdims=True)
54 + return 1. - np.dot(a, b.T)
55 +
56 +
57 +def _nn_euclidean_distance(x, y):
58 + """ Helper function for nearest neighbor distance metric (Euclidean).
59 +
60 + Parameters
61 + ----------
62 + x : ndarray
63 + A matrix of N row-vectors (sample points).
64 + y : ndarray
65 + A matrix of M row-vectors (query points).
66 +
67 + Returns
68 + -------
69 + ndarray
70 + A vector of length M that contains for each entry in `y` the
71 + smallest Euclidean distance to a sample in `x`.
72 +
73 + """
74 + distances = _pdist(x, y)
75 + return np.maximum(0.0, distances.min(axis=0))
76 +
77 +
78 +def _nn_cosine_distance(x, y):
79 + """ Helper function for nearest neighbor distance metric (cosine).
80 +
81 + Parameters
82 + ----------
83 + x : ndarray
84 + A matrix of N row-vectors (sample points).
85 + y : ndarray
86 + A matrix of M row-vectors (query points).
87 +
88 + Returns
89 + -------
90 + ndarray
91 + A vector of length M that contains for each entry in `y` the
92 + smallest cosine distance to a sample in `x`.
93 +
94 + """
95 + distances = _cosine_distance(x, y)
96 + return distances.min(axis=0)
97 +
98 +
99 +class NearestNeighborDistanceMetric(object):
100 + """
101 + A nearest neighbor distance metric that, for each target, returns
102 + the closest distance to any sample that has been observed so far.
103 +
104 + Parameters
105 + ----------
106 + metric : str
107 + Either "euclidean" or "cosine".
108 + matching_threshold: float
109 + The matching threshold. Samples with larger distance are considered an
110 + invalid match.
111 + budget : Optional[int]
112 + If not None, fix samples per class to at most this number. Removes
113 + the oldest samples when the budget is reached.
114 +
115 + Attributes
116 + ----------
117 + samples : Dict[int -> List[ndarray]]
118 + A dictionary that maps from target identities to the list of samples
119 + that have been observed so far.
120 +
121 + """
122 +
123 + def __init__(self, metric, matching_threshold, budget=None):
126 + if metric == "euclidean":
127 + self._metric = _nn_euclidean_distance
128 + elif metric == "cosine":
129 + self._metric = _nn_cosine_distance
130 + else:
131 + raise ValueError(
132 + "Invalid metric; must be either 'euclidean' or 'cosine'")
133 + self.matching_threshold = matching_threshold
134 + self.budget = budget
135 + self.samples = {}
136 +
137 + def partial_fit(self, features, targets, active_targets):
138 + """Update the distance metric with new data.
139 +
140 + Parameters
141 + ----------
142 + features : ndarray
143 + An NxM matrix of N features of dimensionality M.
144 + targets : ndarray
145 + An integer array of associated target identities.
146 + active_targets : List[int]
147 + A list of targets that are currently present in the scene.
148 +
149 + """
150 + for feature, target in zip(features, targets):
151 + self.samples.setdefault(target, []).append(feature)
152 + if self.budget is not None:
153 + self.samples[target] = self.samples[target][-self.budget:]
154 + self.samples = {k: self.samples[k] for k in active_targets}
155 +
156 + def distance(self, features, targets):
157 + """Compute distance between features and targets.
158 +
159 + Parameters
160 + ----------
161 + features : ndarray
162 + An NxM matrix of N features of dimensionality M.
163 + targets : List[int]
164 + A list of targets to match the given `features` against.
165 +
166 + Returns
167 + -------
168 + ndarray
169 + Returns a cost matrix of shape len(targets), len(features), where
170 + element (i, j) contains the smallest distance (under the chosen metric) between
171 + `targets[i]` and `features[j]`.
172 +
173 + """
174 + cost_matrix = np.zeros((len(targets), len(features)))
175 + for i, target in enumerate(targets):
176 + cost_matrix[i, :] = self._metric(self.samples[target], features)
177 + return cost_matrix
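
A short usage sketch of `NearestNeighborDistanceMetric` (random features, purely illustrative; the import path assumes this package layout):

```
import numpy as np
from deep_sort.nn_matching import NearestNeighborDistanceMetric

metric = NearestNeighborDistanceMetric("cosine", matching_threshold=0.2, budget=100)
feats = np.random.randn(4, 128).astype(np.float32)   # two samples per target
targets = np.array([1, 1, 2, 2])
metric.partial_fit(feats, targets, active_targets=[1, 2])

cost = metric.distance(np.random.randn(3, 128), targets=[1, 2])
print(cost.shape)  # (2, 3): one row per target, one column per query feature
```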
1 +# vim: expandtab:ts=4:sw=4
2 +import numpy as np
3 +import cv2
4 +
5 +
6 +def non_max_suppression(boxes, max_bbox_overlap, scores=None):
7 + """Suppress overlapping detections.
8 +
9 + Original code from [1]_ has been adapted to include confidence score.
10 +
11 + .. [1] http://www.pyimagesearch.com/2015/02/16/
12 + faster-non-maximum-suppression-python/
13 +
14 + Examples
15 + --------
16 +
17 + >>> boxes = [d.roi for d in detections]
18 + >>> scores = [d.confidence for d in detections]
19 + >>> indices = non_max_suppression(boxes, max_bbox_overlap, scores)
20 + >>> detections = [detections[i] for i in indices]
21 +
22 + Parameters
23 + ----------
24 + boxes : ndarray
25 + Array of ROIs (x, y, width, height).
26 + max_bbox_overlap : float
27 + ROIs that overlap more than this value are suppressed.
28 + scores : Optional[array_like]
29 + Detector confidence score.
30 +
31 + Returns
32 + -------
33 + List[int]
34 + Returns indices of detections that have survived non-maxima suppression.
35 +
36 + """
37 + if len(boxes) == 0:
38 + return []
39 +
40 + boxes = boxes.astype(np.float)
41 + pick = []
42 +
43 + x1 = boxes[:, 0]
44 + y1 = boxes[:, 1]
45 + x2 = boxes[:, 2] + boxes[:, 0]
46 + y2 = boxes[:, 3] + boxes[:, 1]
47 +
48 + area = (x2 - x1 + 1) * (y2 - y1 + 1)
49 + if scores is not None:
50 + idxs = np.argsort(scores)
51 + else:
52 + idxs = np.argsort(y2)
53 +
54 + while len(idxs) > 0:
55 + last = len(idxs) - 1
56 + i = idxs[last]
57 + pick.append(i)
58 +
59 + xx1 = np.maximum(x1[i], x1[idxs[:last]])
60 + yy1 = np.maximum(y1[i], y1[idxs[:last]])
61 + xx2 = np.minimum(x2[i], x2[idxs[:last]])
62 + yy2 = np.minimum(y2[i], y2[idxs[:last]])
63 +
64 + w = np.maximum(0, xx2 - xx1 + 1)
65 + h = np.maximum(0, yy2 - yy1 + 1)
66 +
67 + overlap = (w * h) / area[idxs[:last]]
68 +
69 + idxs = np.delete(
70 + idxs, np.concatenate(
71 + ([last], np.where(overlap > max_bbox_overlap)[0])))
72 +
73 + return pick
1 +# vim: expandtab:ts=4:sw=4
2 +
3 +
4 +class TrackState:
5 + """
6 + Enumeration type for the single target track state. Newly created tracks are
7 + classified as `tentative` until enough evidence has been collected. Then,
8 + the track state is changed to `confirmed`. Tracks that are no longer alive
9 + are classified as `deleted` to mark them for removal from the set of active
10 + tracks.
11 +
12 + """
13 +
14 + Tentative = 1
15 + Confirmed = 2
16 + Deleted = 3
17 +
18 +
19 +class Track:
20 + """
21 + A single target track with state space `(x, y, a, h)` and associated
22 + velocities, where `(x, y)` is the center of the bounding box, `a` is the
23 + aspect ratio and `h` is the height.
24 +
25 + Parameters
26 + ----------
27 + mean : ndarray
28 + Mean vector of the initial state distribution.
29 + covariance : ndarray
30 + Covariance matrix of the initial state distribution.
31 + track_id : int
32 + A unique track identifier.
33 + n_init : int
34 + Number of consecutive detections before the track is confirmed. The
35 + track state is set to `Deleted` if a miss occurs within the first
36 + `n_init` frames.
37 + max_age : int
38 + The maximum number of consecutive misses before the track state is
39 + set to `Deleted`.
40 + feature : Optional[ndarray]
41 + Feature vector of the detection this track originates from. If not None,
42 + this feature is added to the `features` cache.
43 +
44 + Attributes
45 + ----------
46 + mean : ndarray
47 + Mean vector of the initial state distribution.
48 + covariance : ndarray
49 + Covariance matrix of the initial state distribution.
50 + track_id : int
51 + A unique track identifier.
52 + hits : int
53 + Total number of measurement updates.
54 + age : int
55 + Total number of frames since first occurrence.
56 + time_since_update : int
57 + Total number of frames since last measurement update.
58 + state : TrackState
59 + The current track state.
60 + features : List[ndarray]
61 + A cache of features. On each measurement update, the associated feature
62 + vector is added to this list.
63 +
64 + """
65 +
66 + def __init__(self, mean, covariance, track_id, n_init, max_age,
67 + feature=None):
68 + self.mean = mean
69 + self.covariance = covariance
70 + self.track_id = track_id
71 + self.hits = 1
72 + self.age = 1
73 + self.time_since_update = 0
74 +
75 + self.state = TrackState.Tentative
76 + self.features = []
77 + if feature is not None:
78 + self.features.append(feature)
79 +
80 + self._n_init = n_init
81 + self._max_age = max_age
82 +
83 + def to_tlwh(self):
84 + """Get current position in bounding box format `(top left x, top left y,
85 + width, height)`.
86 +
87 + Returns
88 + -------
89 + ndarray
90 + The bounding box.
91 +
92 + """
93 + ret = self.mean[:4].copy()
94 + ret[2] *= ret[3]
95 + ret[:2] -= ret[2:] / 2
96 + return ret
97 +
98 + def to_tlbr(self):
99 + """Get current position in bounding box format `(min x, miny, max x,
100 + max y)`.
101 +
102 + Returns
103 + -------
104 + ndarray
105 + The bounding box.
106 +
107 + """
108 + ret = self.to_tlwh()
109 + ret[2:] = ret[:2] + ret[2:]
110 + return ret
111 +
112 + def predict(self, kf):
113 + """Propagate the state distribution to the current time step using a
114 + Kalman filter prediction step.
115 +
116 + Parameters
117 + ----------
118 + kf : kalman_filter.KalmanFilter
119 + The Kalman filter.
120 +
121 + """
122 + self.mean, self.covariance = kf.predict(self.mean, self.covariance)
123 + self.age += 1
124 + self.time_since_update += 1
125 +
126 + def update(self, kf, detection):
127 + """Perform Kalman filter measurement update step and update the feature
128 + cache.
129 +
130 + Parameters
131 + ----------
132 + kf : kalman_filter.KalmanFilter
133 + The Kalman filter.
134 + detection : Detection
135 + The associated detection.
136 +
137 + """
138 + self.mean, self.covariance = kf.update(
139 + self.mean, self.covariance, detection.to_xyah())
140 + self.features.append(detection.feature)
141 +
142 + self.hits += 1
143 + self.time_since_update = 0
144 + if self.state == TrackState.Tentative and self.hits >= self._n_init:
145 + self.state = TrackState.Confirmed
146 +
147 + def mark_missed(self):
148 + """Mark this track as missed (no association at the current time step).
149 + """
150 + if self.state == TrackState.Tentative:
151 + self.state = TrackState.Deleted
152 + elif self.time_since_update > self._max_age:
153 + self.state = TrackState.Deleted
154 +
155 + def is_tentative(self):
156 + """Returns True if this track is tentative (unconfirmed).
157 + """
158 + return self.state == TrackState.Tentative
159 +
160 + def is_confirmed(self):
161 + """Returns True if this track is confirmed."""
162 + return self.state == TrackState.Confirmed
163 +
164 + def is_deleted(self):
165 + """Returns True if this track is dead and should be deleted."""
166 + return self.state == TrackState.Deleted
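
A sketch of the `Track` state machine (import paths assume this package layout):

```
import numpy as np
from deep_sort.kalman_filter import KalmanFilter
from deep_sort.track import Track

kf = KalmanFilter()
mean, cov = kf.initiate(np.array([320., 240., 0.5, 100.]))  # (x, y, a, h)
track = Track(mean, cov, track_id=1, n_init=3, max_age=30)

assert track.is_tentative()   # starts Tentative; needs n_init hits to confirm
track.predict(kf)             # one time step with no associated detection...
track.mark_missed()
assert track.is_deleted()     # ...deletes a still-tentative track immediately
```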
1 +# vim: expandtab:ts=4:sw=4
2 +from __future__ import absolute_import
3 +import numpy as np
4 +from . import kalman_filter
5 +from . import linear_assignment
6 +from . import iou_matching
7 +from .track import Track
8 +
9 +
10 +class Tracker:
11 + """
12 + This is the multi-target tracker.
13 +
14 + Parameters
15 + ----------
16 + metric : nn_matching.NearestNeighborDistanceMetric
17 + A distance metric for measurement-to-track association.
18 + max_age : int
18 + Maximum number of consecutive misses before a track is deleted.
20 + n_init : int
21 + Number of consecutive detections before the track is confirmed. The
22 + track state is set to `Deleted` if a miss occurs within the first
23 + `n_init` frames.
24 +
25 + Attributes
26 + ----------
27 + metric : nn_matching.NearestNeighborDistanceMetric
28 + The distance metric used for measurement to track association.
29 + max_age : int
30 + Maximum number of consecutive misses before a track is deleted.
31 + n_init : int
32 + Number of frames that a track remains in initialization phase.
33 + kf : kalman_filter.KalmanFilter
34 + A Kalman filter to filter target trajectories in image space.
35 + tracks : List[Track]
36 + The list of active tracks at the current time step.
37 +
38 + """
39 +
40 + def __init__(self, metric, max_iou_distance=0.7, max_age=30, n_init=3):
41 + self.metric = metric
42 + self.max_iou_distance = max_iou_distance
43 + self.max_age = max_age
44 + self.n_init = n_init
45 +
46 + self.kf = kalman_filter.KalmanFilter()
47 + self.tracks = []
48 + self._next_id = 1
49 +
50 + def predict(self):
51 + """Propagate track state distributions one time step forward.
52 +
53 + This function should be called once every time step, before `update`.
54 + """
55 + for track in self.tracks:
56 + track.predict(self.kf)
57 +
58 + def update(self, detections):
59 + """Perform measurement update and track management.
60 +
61 + Parameters
62 + ----------
63 + detections : List[deep_sort.detection.Detection]
64 + A list of detections at the current time step.
65 +
66 + """
67 + # Run matching cascade.
68 + matches, unmatched_tracks, unmatched_detections = \
69 + self._match(detections)
70 +
71 + # Update track set.
72 + for track_idx, detection_idx in matches:
73 + self.tracks[track_idx].update(
74 + self.kf, detections[detection_idx])
75 + for track_idx in unmatched_tracks:
76 + self.tracks[track_idx].mark_missed()
77 + for detection_idx in unmatched_detections:
78 + self._initiate_track(detections[detection_idx])
79 + self.tracks = [t for t in self.tracks if not t.is_deleted()]
80 +
81 + # Update distance metric.
82 + active_targets = [t.track_id for t in self.tracks if t.is_confirmed()]
83 + features, targets = [], []
84 + for track in self.tracks:
85 + if not track.is_confirmed():
86 + continue
87 + features += track.features
88 + targets += [track.track_id for _ in track.features]
89 + track.features = []
90 + self.metric.partial_fit(
91 + np.asarray(features), np.asarray(targets), active_targets)
92 +
93 + def _match(self, detections):
94 +
95 + def gated_metric(tracks, dets, track_indices, detection_indices):
96 + features = np.array([dets[i].feature for i in detection_indices])
97 + targets = np.array([tracks[i].track_id for i in track_indices])
98 + cost_matrix = self.metric.distance(features, targets)
99 + cost_matrix = linear_assignment.gate_cost_matrix(
100 + self.kf, cost_matrix, tracks, dets, track_indices,
101 + detection_indices)
102 +
103 + return cost_matrix
104 +
105 + # Split track set into confirmed and unconfirmed tracks.
106 + confirmed_tracks = [
107 + i for i, t in enumerate(self.tracks) if t.is_confirmed()]
108 + unconfirmed_tracks = [
109 + i for i, t in enumerate(self.tracks) if not t.is_confirmed()]
110 +
111 + # Associate confirmed tracks using appearance features.
112 + matches_a, unmatched_tracks_a, unmatched_detections = \
113 + linear_assignment.matching_cascade(
114 + gated_metric, self.metric.matching_threshold, self.max_age,
115 + self.tracks, detections, confirmed_tracks)
116 +
117 + # Associate remaining tracks together with unconfirmed tracks using IOU.
118 + iou_track_candidates = unconfirmed_tracks + [
119 + k for k in unmatched_tracks_a if
120 + self.tracks[k].time_since_update == 1]
121 + unmatched_tracks_a = [
122 + k for k in unmatched_tracks_a if
123 + self.tracks[k].time_since_update != 1]
124 + matches_b, unmatched_tracks_b, unmatched_detections = \
125 + linear_assignment.min_cost_matching(
126 + iou_matching.iou_cost, self.max_iou_distance, self.tracks,
127 + detections, iou_track_candidates, unmatched_detections)
128 +
129 + matches = matches_a + matches_b
130 + unmatched_tracks = list(set(unmatched_tracks_a + unmatched_tracks_b))
131 + return matches, unmatched_tracks, unmatched_detections
132 +
133 + def _initiate_track(self, detection):
134 + mean, covariance = self.kf.initiate(detection.to_xyah())
135 + self.tracks.append(Track(
136 + mean, covariance, self._next_id, self.n_init, self.max_age,
137 + detection.feature))
138 + self._next_id += 1
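
A self-contained sketch that drives `Tracker` with synthetic detections (no detector or encoder needed; a constant feature vector stands in for a re-ID descriptor; import paths assume this package layout):

```
import numpy as np
from deep_sort import nn_matching
from deep_sort.detection import Detection
from deep_sort.tracker import Tracker

metric = nn_matching.NearestNeighborDistanceMetric("cosine", 0.3, budget=None)
tracker = Tracker(metric, max_iou_distance=0.7, max_age=30, n_init=3)

for frame in range(5):
    # one object drifting right: tlwh box plus a fixed 128-d "appearance" feature
    det = Detection([100. + 5 * frame, 50., 40., 80.], 1.0, np.ones(128, np.float32))
    tracker.predict()
    tracker.update([det])

print([(t.track_id, t.is_confirmed()) for t in tracker.tracks])  # [(1, True)]
```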
This diff could not be displayed because it is too large.
1 +#! /usr/bin/env python
2 +# -*- coding: utf-8 -*-
3 +
4 +from __future__ import division, print_function, absolute_import
5 +import sys
6 +#sys.path.remove('/opt/ros/kinetic/lib/python2.7/dist-packages')
7 +import os
8 +import datetime
9 +from timeit import time
10 +import warnings
11 +import cv2
12 +import numpy as np
13 +import argparse
14 +from sklearn.cluster import KMeans
15 +import matplotlib.pyplot as plt
16 +from PIL import Image
17 +from yolo import YOLO
18 +import matplotlib.cbook as cbook
19 +
20 +from deep_sort import preprocessing
21 +from deep_sort import nn_matching
22 +from deep_sort.detection import Detection
23 +from deep_sort.tracker import Tracker
24 +from tools import generate_detections as gdet
25 +from deep_sort.detection import Detection as ddet
26 +from collections import deque
27 +from keras import backend
28 +import tensorflow as tf
29 +from tensorflow.compat.v1 import InteractiveSession
30 +import pandas as pd
31 +import json
32 +
33 +def createKmeans(file_name):
34 + data = json.load(open(os.getcwd() + '/../deep_sort_yolov4/output/'+ file_name +'_xy.json'))
35 +
36 + data = pd.DataFrame(data["data"])
37 +
38 + xy = data[['x','y']]
39 +
40 + km = KMeans(n_clusters=5)
41 + km.fit(xy)
42 +
43 + predict = pd.DataFrame(km.predict(xy))
44 + predict.columns=['predict']
45 + r = pd.concat([xy,predict],axis=1)
46 +
47 + plt.scatter(r['x'],r['y'],c=r['predict'],alpha=0.3, s=200)
48 +
49 +
50 + imageFile = cbook.get_sample_data(os.getcwd() + '/../deep_sort_yolov4/output/'+ file_name + '_img.png')
51 + image = plt.imread(imageFile)
52 + plt.imshow(image)
53 + plt.savefig(os.getcwd() + '/public/data/'+file_name+'_kmeans.png')
54 +
55 +def draw_border(img, pt1, pt2, color, thickness, r, d):
56 + x1,y1 = pt1
57 + x2,y2 = pt2
58 +
59 + # Top left
60 + cv2.line(img, (x1 + r, y1), (x1 + r + d, y1), color, thickness)
61 + cv2.line(img, (x1, y1 + r), (x1, y1 + r + d), color, thickness)
62 + cv2.ellipse(img, (x1 + r, y1 + r), (r, r), 180, 0, 90, color, thickness)
63 +
64 + # Top right
65 + cv2.line(img, (x2 - r, y1), (x2 - r - d, y1), color, thickness)
66 + cv2.line(img, (x2, y1 + r), (x2, y1 + r + d), color, thickness)
67 + cv2.ellipse(img, (x2 - r, y1 + r), (r, r), 270, 0, 90, color, thickness)
68 +
69 + # Bottom left
70 + cv2.line(img, (x1 + r, y2), (x1 + r + d, y2), color, thickness)
71 + cv2.line(img, (x1, y2 - r), (x1, y2 - r - d), color, thickness)
72 + cv2.ellipse(img, (x1 + r, y2 - r), (r, r), 90, 0, 90, color, thickness)
73 +
74 + # Bottom right
75 + cv2.line(img, (x2 - r, y2), (x2 - r - d, y2), color, thickness)
76 + cv2.line(img, (x2, y2 - r), (x2, y2 - r - d), color, thickness)
77 + cv2.ellipse(img, (x2 - r, y2 - r), (r, r), 0, 0, 90, color, thickness)
78 +
79 +config = tf.ConfigProto()
80 +config.gpu_options.allow_growth = True
81 +session = InteractiveSession(config=config)
82 +
83 +
84 +#ap = argparse.ArgumentParser()
85 +#ap.add_argument("-i", "--input",help="path to input video", default = "./test_video/test3.mp4")
86 +#ap.add_argument("-c", "--class",help="name of class", default = "person")
87 +#args = vars(ap.parse_args())
88 +
89 +pts = [deque(maxlen=30) for _ in range(9999)]
90 +warnings.filterwarnings('ignore')
91 +
92 +# initialize a list of colors to represent each possible class label
93 +np.random.seed(100)
94 +COLORS = np.random.randint(0, 255, size=(200, 3),
95 + dtype="uint8")
96 +#list = [[] for _ in range(100)]
97 +
98 +def main(yolo):
99 + df1 = pd.DataFrame(columns=['x','y','id','time','s'])
100 + df2 = pd.DataFrame(columns=['total','now','time','s'])
101 + start = datetime.datetime.now()
102 + max_cosine_distance = 0.3
103 + nn_budget = None
104 + nms_max_overlap = 1.0
105 +
106 + counter = []
107 + #deep_sort
108 + model_filename = os.getcwd() + '/../deep_sort_yolov4/model_data/market1501.pb'
109 + encoder = gdet.create_box_encoder(model_filename,batch_size=1)
110 +
111 + find_objects = ['person']
112 + metric = nn_matching.NearestNeighborDistanceMetric("cosine", max_cosine_distance, nn_budget)
113 + tracker = Tracker(metric)
114 +
115 + writeVideo_flag = True
116 + video_capture = cv2.VideoCapture(os.getcwd() + '/../deep_sort_yolov4/' + sys.argv[1])
117 +
118 + file_name = sys.argv[1].split('.')[0]
119 + if writeVideo_flag:
120 + # Define the codec and create VideoWriter object
121 + w = int(video_capture.get(3))
122 + h = int(video_capture.get(4))
123 + total_frame = int(video_capture.get(cv2.CAP_PROP_FRAME_COUNT))
124 + fourcc = cv2.VideoWriter_fourcc(*'MP4V')
125 + out = cv2.VideoWriter(os.getcwd() + '/public/data/'+file_name+'.mp4', fourcc, 15, (w, h))
126 + list_file = open(os.getcwd() + '/../deep_sort_yolov4/detection_rslt.txt', 'w')
127 + frame_index = -1
128 +
129 + fps = 0.0
130 + test = 1
131 + while True:
132 +
133 + ret, frame = video_capture.read() # frame shape 640*480*3
134 + if not ret:
135 + break
136 + cFrame = int(video_capture.get(cv2.CAP_PROP_POS_MSEC)/76)
137 + #temp = pd.DataFrame({'num':cFrame},index = [0])
138 + #temp.to_csv(os.getcwd() + '/../deep_sort_yolov4/output/temp.csv')
139 + t1 = time.time()
140 + if(test == 1 ):
141 + cv2.imwrite(os.getcwd() + '/../deep_sort_yolov4/output/' + file_name + '_img.png',frame)
142 + test +=1
143 +
144 + #image = Image.fromarray(frame)
145 + image = Image.fromarray(frame[...,::-1]) #bgr to rgb
146 + boxs, confidence, class_names = yolo.detect_image(image)
147 + features = encoder(frame,boxs)
148 + # score to 1.0 here).
149 + detections = [Detection(bbox, 1.0, feature) for bbox, feature in zip(boxs, features)]
150 + # Run non-maxima suppression.
151 + boxes = np.array([d.tlwh for d in detections])
152 + scores = np.array([d.confidence for d in detections])
153 + indices = preprocessing.non_max_suppression(boxes, nms_max_overlap, scores)
154 + detections = [detections[i] for i in indices]
155 + cs = float(59)*int(cFrame)/total_frame
156 + t = start + datetime.timedelta(milliseconds=cs*1000)
157 + time_format = "%Y-%m-%d %H:%M:%S.%f"
158 + time_str = t.strftime(time_format)
159 + cTime = time_str
160 +
161 + # Call the tracker
162 + tracker.predict()
163 + tracker.update(detections)
164 +
165 + i = int(0)
166 + indexIDs = []
167 + c = []
168 + boxes = []
169 +
170 + for track in tracker.tracks:
171 + if not track.is_confirmed() or track.time_since_update > 1:
172 + continue
173 + #boxes.append([track[0], track[1], track[2], track[3]])
174 + indexIDs.append(int(track.track_id))
175 + counter.append(int(track.track_id))
176 + bbox = track.to_tlbr()
177 + color = [int(c) for c in COLORS[indexIDs[i] % len(COLORS)]]
178 + #print(frame_index)
179 + list_file.write(str(frame_index)+',')
180 + list_file.write(str(track.track_id)+',')
181 + center = (int(((bbox[0])+(bbox[2]))/2),int(((bbox[1])+(bbox[3]))/2))
182 + r = (int(bbox[2]) - int(bbox[0]))/2
183 + draw_border(frame,(int(((bbox[0])+(bbox[2]))/2) - int(r), int(((bbox[1])+(bbox[3]))/2) - int(r)), (int(((bbox[0])+(bbox[2]))/2) + int(r), int(((bbox[1])+(bbox[3]))/2) + int(r)),(color),4,int(r/5),int(r/5))
184 + #cv2.rectangle(frame, (int(bbox[0]), int(bbox[1])), (int(bbox[2]), int(bbox[3])),(color), 3)
185 + b0 = str(bbox[0])#.split('.')[0] + '.' + str(bbox[0]).split('.')[0][:1]
186 + b1 = str(bbox[1])#.split('.')[0] + '.' + str(bbox[1]).split('.')[0][:1]
187 + b2 = str(bbox[2]-bbox[0])#.split('.')[0] + '.' + str(bbox[3]).split('.')[0][:1]
188 + b3 = str(bbox[3]-bbox[1])
189 +
190 + list_file.write(str(b0) + ','+str(b1) + ','+str(b2) + ','+str(b3))
191 + #print(str(track.track_id))
192 + list_file.write('\n')
193 + #list_file.write(str(track.track_id)+',')
194 + #cv2.putText(frame,str(track.track_id),(int(bbox[0]), int(bbox[1] + 30)),0, 5e-3 * 150, (color),2)
195 + cv2.putText(frame,str(track.track_id),(int(((bbox[0])+(bbox[2]))/2), int(((bbox[1])+(bbox[3]))/2 + 30)),0, 5e-3 * 150, (color),2)
196 +
197 + if len(class_names) > 0:
198 + class_name = class_names[0] # note: every track is labelled with the first detection's class name
199 + cv2.putText(frame, str(class_name), (int((bbox[0] + bbox[2]) / 2), int((bbox[1] + bbox[3]) / 2 + 10)), 0, 5e-3 * 150, color, 2)
200 +
201 + i += 1
202 + #bbox_center_point(x,y)
203 + #track_id[center]
204 + x, y = center # track center in pixels
205 + # sample the trajectory roughly every 10th frame counter tick (~0.76 s)
206 + if cFrame % 10 == 0:
207 + df1 = df1.append({'x': x, 'y': y, 'id': int(track.track_id), 'time': cTime, 's': cs}, ignore_index=True) # DataFrame.append was removed in pandas 2.0; use pd.concat on newer pandas
208 +
209 + pts[track.track_id].append(center)
210 +
211 + thickness = 5
212 + #center point
213 + #cv2.circle(frame, (center), 1, color, thickness)
214 +
215 + # draw motion path
216 + # for j in range(1, len(pts[track.track_id])):
217 + # if pts[track.track_id][j - 1] is None or pts[track.track_id][j] is None:
218 + # continue
219 + # thickness = int(np.sqrt(64 / float(j + 1)) * 2)
220 + #cv2.line(frame,(pts[track.track_id][j-1]), (pts[track.track_id][j]),(color),thickness)
221 + #cv2.putText(frame, str(class_names[j]),(int(bbox[0]), int(bbox[1] -20)),0, 5e-3 * 150, (255,255,255),2)
222 +
223 + count = len(set(counter)) # total = distinct track IDs seen so far
224 + if cFrame % 10 == 0:
225 + df2 = df2.append({'total': count, 'now': i, 'time': cTime, 's': cs}, ignore_index=True)
226 + cv2.putText(frame, "Total Pedestrian Counter: "+str(count),(int(30), int(105)),0, 5e-3 * 150, (0,255,0),2)
227 + cv2.putText(frame, "Current Pedestrian Counter: "+str(i),(int(30), int(80)),0, 5e-3 * 150, (0,255,0),2)
228 + #cv2.putText(frame, "FPS: %f"%(fps),(int(20), int(40)),0, 5e-3 * 200, (0,255,0),3)
229 + cv2.putText(frame, "Time: " + str(cTime),(int(30), int(55)),0, 5e-3 * 150, (0,255,0),3)
230 + cv2.namedWindow("YOLO4_Deep_SORT", 0)
231 + cv2.resizeWindow('YOLO4_Deep_SORT', 1024, 768)
232 + cv2.imshow('YOLO4_Deep_SORT', frame)
233 +
234 +
235 + if writeVideo_flag:
236 + # save the annotated frame once per loop iteration
237 + out.write(frame)
238 + frame_index = frame_index + 1
239 +
240 +
241 + # running average of the per-frame processing rate
242 + fps = (fps + (1. / (time.time() - t1))) / 2
243 +
244 +
245 + # Press Q to stop!
246 + if cv2.waitKey(1) & 0xFF == ord('q'):
247 + break
248 + df1.to_json(os.getcwd() + '/../deep_sort_yolov4/output/' + file_name + '_xy.json',orient='table')
249 + df2.to_json(os.getcwd() + '/../deep_sort_yolov4/output/' + file_name + '_count.json',orient='table')
250 + createKmeans(file_name)
251 + print(" ")
252 + print("[Finish]")
253 + end = time.time()
254 +
255 + #print("[INFO]: model_image_size = (960, 960)")
256 + video_capture.release()
257 + if writeVideo_flag:
258 + out.release()
259 + list_file.close()
260 + cv2.destroyAllWindows()
261 +
262 +if __name__ == '__main__':
263 + main(YOLO())
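
Each run of the script above leaves three artifacts behind: `detection_rslt.txt` (one `frame,track_id,x,y,w,h` row per drawn track), `<video>_xy.json` (sampled track centers) and `<video>_count.json` (sampled counts), the last two written with `DataFrame.to_json(orient='table')`. A minimal sketch of reading them back, assuming the default output paths and the `testvideo` input from the Quick Start:

```python
import pandas as pd

# detection_rslt.txt has no header: frame index, track ID, then the
# top-left-based box (x, y, width, height) in pixels
detections = pd.read_csv('detection_rslt.txt',
                         names=['frame', 'track_id', 'x', 'y', 'w', 'h'])

# the JSON exports keep their column schema because they were written
# with orient='table', so they round-trip directly
xy = pd.read_json('output/testvideo_xy.json', orient='table')
counts = pd.read_json('output/testvideo_count.json', orient='table')

print(detections.head())
print(counts[['time', 'total', 'now']].tail())
```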
1 +person
2 +bicycle
3 +car
4 +motorbike
5 +aeroplane
6 +bus
7 +train
8 +truck
9 +boat
10 +traffic light
11 +fire hydrant
12 +stop sign
13 +parking meter
14 +bench
15 +bird
16 +cat
17 +dog
18 +horse
19 +sheep
20 +cow
21 +elephant
22 +bear
23 +zebra
24 +giraffe
25 +backpack
26 +umbrella
27 +handbag
28 +tie
29 +suitcase
30 +frisbee
31 +skis
32 +snowboard
33 +sports ball
34 +kite
35 +baseball bat
36 +baseball glove
37 +skateboard
38 +surfboard
39 +tennis racket
40 +bottle
41 +wine glass
42 +cup
43 +fork
44 +knife
45 +spoon
46 +bowl
47 +banana
48 +apple
49 +sandwich
50 +orange
51 +broccoli
52 +carrot
53 +hot dog
54 +pizza
55 +donut
56 +cake
57 +chair
58 +sofa
59 +pottedplant
60 +bed
61 +diningtable
62 +toilet
63 +tvmonitor
64 +laptop
65 +mouse
66 +remote
67 +keyboard
68 +cell phone
69 +microwave
70 +oven
71 +toaster
72 +sink
73 +refrigerator
74 +book
75 +clock
76 +vase
77 +scissors
78 +teddy bear
79 +hair drier
80 +toothbrush
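
The list above is the standard 80-class COCO label file (with the older VOC-style spellings such as `sofa` and `tvmonitor`), one class per line in network index order. A minimal sketch of how such a file is typically loaded, keras-yolo3 style; the path is an assumption:

```python
def load_class_names(path='model_data/coco_classes.txt'):
    # one class name per line, ordered by the index the network predicts
    with open(path) as f:
        return [line.strip() for line in f if line.strip()]

class_names = load_class_names()
assert len(class_names) == 80 and class_names[0] == 'person'
```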
1 +12, 16, 19, 36, 40, 28, 36, 75, 76, 55, 72, 146, 142, 110, 192, 243, 459, 401
1 +{"schema":{"fields":[{"name":"index","type":"integer"},{"name":"total","type":"string"},{"name":"now","type":"string"},{"name":"time","type":"string"},{"name":"s","type":"number"}],"primaryKey":["index"],"pandas_version":"0.20.0"},"data":[{"index":0,"total":0,"now":0,"time":"2020-06-17 03:27:10.288851","s":0.0},{"index":1,"total":11,"now":7,"time":"2020-06-17 03:27:11.051125","s":0.7622739018},{"index":2,"total":13,"now":8,"time":"2020-06-17 03:27:11.813399","s":1.5245478036},{"index":3,"total":15,"now":10,"time":"2020-06-17 03:27:12.575673","s":2.2868217054},{"index":4,"total":21,"now":13,"time":"2020-06-17 03:27:13.337947","s":3.0490956072},{"index":5,"total":22,"now":13,"time":"2020-06-17 03:27:14.100221","s":3.811369509},{"index":6,"total":23,"now":12,"time":"2020-06-17 03:27:14.862494","s":4.5736434109},{"index":7,"total":24,"now":11,"time":"2020-06-17 03:27:15.624768","s":5.3359173127},{"index":8,"total":24,"now":7,"time":"2020-06-17 03:27:16.387042","s":6.0981912145},{"index":9,"total":25,"now":8,"time":"2020-06-17 03:27:17.149316","s":6.8604651163},{"index":10,"total":26,"now":9,"time":"2020-06-17 03:27:17.911590","s":7.6227390181},{"index":11,"total":26,"now":10,"time":"2020-06-17 03:27:18.673864","s":8.3850129199},{"index":12,"total":26,"now":7,"time":"2020-06-17 03:27:19.436138","s":9.1472868217},{"index":13,"total":29,"now":8,"time":"2020-06-17 03:27:20.198412","s":9.9095607235},{"index":14,"total":31,"now":11,"time":"2020-06-17 03:27:20.960686","s":10.6718346253},{"index":15,"total":31,"now":10,"time":"2020-06-17 03:27:21.722960","s":11.4341085271},{"index":16,"total":32,"now":13,"time":"2020-06-17 03:27:22.485233","s":12.1963824289},{"index":17,"total":33,"now":11,"time":"2020-06-17 03:27:23.247507","s":12.9586563307},{"index":18,"total":33,"now":11,"time":"2020-06-17 03:27:24.009781","s":13.7209302326},{"index":19,"total":35,"now":11,"time":"2020-06-17 03:27:24.772055","s":14.4832041344},{"index":20,"total":37,"now":9,"time":"2020-06-17 03:27:25.534329","s":15.2454780362},{"index":21,"total":38,"now":10,"time":"2020-06-17 03:27:26.296603","s":16.007751938},{"index":22,"total":39,"now":11,"time":"2020-06-17 03:27:27.058877","s":16.7700258398},{"index":23,"total":41,"now":12,"time":"2020-06-17 03:27:27.821151","s":17.5322997416},{"index":24,"total":42,"now":11,"time":"2020-06-17 03:27:28.583425","s":18.2945736434},{"index":25,"total":42,"now":12,"time":"2020-06-17 03:27:29.345699","s":19.0568475452},{"index":26,"total":42,"now":10,"time":"2020-06-17 03:27:30.107972","s":19.819121447},{"index":27,"total":42,"now":8,"time":"2020-06-17 03:27:30.870246","s":20.5813953488},{"index":28,"total":43,"now":9,"time":"2020-06-17 03:27:31.632520","s":21.3436692506},{"index":29,"total":45,"now":11,"time":"2020-06-17 03:27:32.394794","s":22.1059431525},{"index":30,"total":45,"now":9,"time":"2020-06-17 03:27:33.157068","s":22.8682170543},{"index":31,"total":46,"now":11,"time":"2020-06-17 03:27:33.919342","s":23.6304909561},{"index":32,"total":46,"now":11,"time":"2020-06-17 03:27:34.681616","s":24.3927648579},{"index":33,"total":46,"now":11,"time":"2020-06-17 03:27:35.443890","s":25.1550387597},{"index":34,"total":47,"now":10,"time":"2020-06-17 03:27:36.206164","s":25.9173126615},{"index":35,"total":47,"now":9,"time":"2020-06-17 03:27:36.968438","s":26.6795865633},{"index":36,"total":48,"now":10,"time":"2020-06-17 03:27:37.730711","s":27.4418604651},{"index":37,"total":48,"now":10,"time":"2020-06-17 03:27:38.492985","s":28.2041343669},{"index":38,"total":50,"now":10,"time":"2020-06-17 
03:27:39.255259","s":28.9664082687},{"index":39,"total":51,"now":10,"time":"2020-06-17 03:27:40.017533","s":29.7286821705},{"index":40,"total":52,"now":9,"time":"2020-06-17 03:27:40.779807","s":30.4909560724},{"index":41,"total":52,"now":8,"time":"2020-06-17 03:27:41.542081","s":31.2532299742},{"index":42,"total":53,"now":7,"time":"2020-06-17 03:27:42.304355","s":32.015503876},{"index":43,"total":55,"now":8,"time":"2020-06-17 03:27:43.066629","s":32.7777777778},{"index":44,"total":56,"now":8,"time":"2020-06-17 03:27:43.828903","s":33.5400516796},{"index":45,"total":56,"now":10,"time":"2020-06-17 03:27:44.591177","s":34.3023255814},{"index":46,"total":56,"now":9,"time":"2020-06-17 03:27:45.353450","s":35.0645994832},{"index":47,"total":57,"now":10,"time":"2020-06-17 03:27:46.115724","s":35.826873385},{"index":48,"total":57,"now":8,"time":"2020-06-17 03:27:46.877998","s":36.5891472868},{"index":49,"total":58,"now":10,"time":"2020-06-17 03:27:47.640272","s":37.3514211886},{"index":50,"total":58,"now":8,"time":"2020-06-17 03:27:48.402546","s":38.1136950904},{"index":51,"total":60,"now":8,"time":"2020-06-17 03:27:49.164820","s":38.8759689922},{"index":52,"total":63,"now":12,"time":"2020-06-17 03:27:49.927094","s":39.6382428941},{"index":53,"total":63,"now":12,"time":"2020-06-17 03:27:50.689368","s":40.4005167959},{"index":54,"total":63,"now":8,"time":"2020-06-17 03:27:51.451642","s":41.1627906977},{"index":55,"total":63,"now":10,"time":"2020-06-17 03:27:52.213916","s":41.9250645995},{"index":56,"total":64,"now":11,"time":"2020-06-17 03:27:52.976190","s":42.6873385013},{"index":57,"total":64,"now":9,"time":"2020-06-17 03:27:53.738463","s":43.4496124031},{"index":58,"total":64,"now":10,"time":"2020-06-17 03:27:54.500737","s":44.2118863049},{"index":59,"total":64,"now":9,"time":"2020-06-17 03:27:55.263011","s":44.9741602067},{"index":60,"total":64,"now":8,"time":"2020-06-17 03:27:56.025285","s":45.7364341085},{"index":61,"total":64,"now":9,"time":"2020-06-17 03:27:56.787559","s":46.4987080103},{"index":62,"total":65,"now":8,"time":"2020-06-17 03:27:57.549833","s":47.2609819121},{"index":63,"total":66,"now":9,"time":"2020-06-17 03:27:58.312107","s":48.023255814},{"index":64,"total":67,"now":10,"time":"2020-06-17 03:27:59.074381","s":48.7855297158},{"index":65,"total":68,"now":10,"time":"2020-06-17 03:27:59.836655","s":49.5478036176},{"index":66,"total":68,"now":9,"time":"2020-06-17 03:28:00.598929","s":50.3100775194},{"index":67,"total":69,"now":9,"time":"2020-06-17 03:28:01.361202","s":51.0723514212},{"index":68,"total":69,"now":8,"time":"2020-06-17 03:28:02.123476","s":51.834625323},{"index":69,"total":69,"now":10,"time":"2020-06-17 03:28:02.885750","s":52.5968992248},{"index":70,"total":70,"now":8,"time":"2020-06-17 03:28:03.648024","s":53.3591731266},{"index":71,"total":71,"now":9,"time":"2020-06-17 03:28:04.410298","s":54.1214470284},{"index":72,"total":73,"now":11,"time":"2020-06-17 03:28:05.172572","s":54.8837209302},{"index":73,"total":73,"now":10,"time":"2020-06-17 03:28:05.934846","s":55.645994832},{"index":74,"total":74,"now":9,"time":"2020-06-17 03:28:06.697120","s":56.4082687339},{"index":75,"total":75,"now":10,"time":"2020-06-17 03:28:07.459394","s":57.1705426357},{"index":76,"total":75,"now":7,"time":"2020-06-17 03:28:08.221668","s":57.9328165375},{"index":77,"total":77,"now":10,"time":"2020-06-17 03:28:08.983941","s":58.6950904393}]}
...\ No newline at end of file ...\ No newline at end of file
1 +{"schema":{"fields":[{"name":"index","type":"integer"},{"name":"x","type":"string"},{"name":"y","type":"string"},{"name":"id","type":"string"},{"name":"time","type":"string"},{"name":"s","type":"number"}],"primaryKey":["index"],"pandas_version":"0.20.0"},"data":[{"index":0,"x":902,"y":316,"id":1,"time":"2020-06-17 03:27:11.051125","s":0.7622739018},{"index":1,"x":747,"y":541,"id":2,"time":"2020-06-17 03:27:11.051125","s":0.7622739018},{"index":2,"x":75,"y":354,"id":3,"time":"2020-06-17 03:27:11.051125","s":0.7622739018},{"index":3,"x":918,"y":559,"id":4,"time":"2020-06-17 03:27:11.051125","s":0.7622739018},{"index":4,"x":652,"y":104,"id":5,"time":"2020-06-17 03:27:11.051125","s":0.7622739018},{"index":5,"x":288,"y":205,"id":7,"time":"2020-06-17 03:27:11.051125","s":0.7622739018},{"index":6,"x":290,"y":110,"id":11,"time":"2020-06-17 03:27:11.051125","s":0.7622739018},{"index":7,"x":919,"y":296,"id":1,"time":"2020-06-17 03:27:11.813399","s":1.5245478036},{"index":8,"x":766,"y":470,"id":2,"time":"2020-06-17 03:27:11.813399","s":1.5245478036},{"index":9,"x":75,"y":352,"id":3,"time":"2020-06-17 03:27:11.813399","s":1.5245478036},{"index":10,"x":927,"y":551,"id":4,"time":"2020-06-17 03:27:11.813399","s":1.5245478036},{"index":11,"x":655,"y":104,"id":5,"time":"2020-06-17 03:27:11.813399","s":1.5245478036},{"index":12,"x":285,"y":202,"id":7,"time":"2020-06-17 03:27:11.813399","s":1.5245478036},{"index":13,"x":364,"y":188,"id":8,"time":"2020-06-17 03:27:11.813399","s":1.5245478036},{"index":14,"x":112,"y":250,"id":14,"time":"2020-06-17 03:27:11.813399","s":1.5245478036},{"index":15,"x":903,"y":281,"id":1,"time":"2020-06-17 03:27:12.575673","s":2.2868217054},{"index":16,"x":827,"y":435,"id":2,"time":"2020-06-17 03:27:12.575673","s":2.2868217054},{"index":17,"x":73,"y":351,"id":3,"time":"2020-06-17 03:27:12.575673","s":2.2868217054},{"index":18,"x":926,"y":569,"id":4,"time":"2020-06-17 03:27:12.575673","s":2.2868217054},{"index":19,"x":653,"y":104,"id":5,"time":"2020-06-17 03:27:12.575673","s":2.2868217054},{"index":20,"x":273,"y":188,"id":7,"time":"2020-06-17 03:27:12.575673","s":2.2868217054},{"index":21,"x":438,"y":205,"id":8,"time":"2020-06-17 03:27:12.575673","s":2.2868217054},{"index":22,"x":139,"y":248,"id":14,"time":"2020-06-17 03:27:12.575673","s":2.2868217054},{"index":23,"x":307,"y":143,"id":20,"time":"2020-06-17 03:27:12.575673","s":2.2868217054},{"index":24,"x":1051,"y":567,"id":24,"time":"2020-06-17 03:27:12.575673","s":2.2868217054},{"index":25,"x":901,"y":227,"id":1,"time":"2020-06-17 03:27:13.337947","s":3.0490956072},{"index":26,"x":81,"y":349,"id":3,"time":"2020-06-17 03:27:13.337947","s":3.0490956072},{"index":27,"x":867,"y":575,"id":4,"time":"2020-06-17 03:27:13.337947","s":3.0490956072},{"index":28,"x":650,"y":104,"id":5,"time":"2020-06-17 03:27:13.337947","s":3.0490956072},{"index":29,"x":275,"y":193,"id":7,"time":"2020-06-17 03:27:13.337947","s":3.0490956072},{"index":30,"x":446,"y":206,"id":8,"time":"2020-06-17 03:27:13.337947","s":3.0490956072},{"index":31,"x":141,"y":248,"id":14,"time":"2020-06-17 03:27:13.337947","s":3.0490956072},{"index":32,"x":293,"y":116,"id":20,"time":"2020-06-17 03:27:13.337947","s":3.0490956072},{"index":33,"x":390,"y":180,"id":26,"time":"2020-06-17 03:27:13.337947","s":3.0490956072},{"index":34,"x":836,"y":406,"id":27,"time":"2020-06-17 03:27:13.337947","s":3.0490956072},{"index":35,"x":394,"y":53,"id":29,"time":"2020-06-17 03:27:13.337947","s":3.0490956072},{"index":36,"x":1014,"y":582,"id":30,"time":"2020-06-17 
03:27:13.337947","s":3.0490956072},{"index":37,"x":947,"y":337,"id":32,"time":"2020-06-17 03:27:13.337947","s":3.0490956072},{"index":38,"x":900,"y":274,"id":1,"time":"2020-06-17 03:27:14.100221","s":3.811369509},{"index":39,"x":1013,"y":375,"id":2,"time":"2020-06-17 03:27:14.100221","s":3.811369509},{"index":40,"x":88,"y":341,"id":3,"time":"2020-06-17 03:27:14.100221","s":3.811369509},{"index":41,"x":794,"y":560,"id":4,"time":"2020-06-17 03:27:14.100221","s":3.811369509},{"index":42,"x":654,"y":105,"id":5,"time":"2020-06-17 03:27:14.100221","s":3.811369509},{"index":43,"x":295,"y":236,"id":7,"time":"2020-06-17 03:27:14.100221","s":3.811369509},{"index":44,"x":447,"y":205,"id":8,"time":"2020-06-17 03:27:14.100221","s":3.811369509},{"index":45,"x":111,"y":202,"id":14,"time":"2020-06-17 03:27:14.100221","s":3.811369509},{"index":46,"x":271,"y":97,"id":20,"time":"2020-06-17 03:27:14.100221","s":3.811369509},{"index":47,"x":387,"y":149,"id":26,"time":"2020-06-17 03:27:14.100221","s":3.811369509},{"index":48,"x":917,"y":385,"id":27,"time":"2020-06-17 03:27:14.100221","s":3.811369509},{"index":49,"x":1006,"y":569,"id":30,"time":"2020-06-17 03:27:14.100221","s":3.811369509},{"index":50,"x":340,"y":117,"id":34,"time":"2020-06-17 03:27:14.100221","s":3.811369509},{"index":51,"x":902,"y":294,"id":1,"time":"2020-06-17 03:27:14.862494","s":4.5736434109},{"index":52,"x":92,"y":298,"id":3,"time":"2020-06-17 03:27:14.862494","s":4.5736434109},{"index":53,"x":746,"y":566,"id":4,"time":"2020-06-17 03:27:14.862494","s":4.5736434109},{"index":54,"x":653,"y":107,"id":5,"time":"2020-06-17 03:27:14.862494","s":4.5736434109},{"index":55,"x":296,"y":232,"id":7,"time":"2020-06-17 03:27:14.862494","s":4.5736434109},{"index":56,"x":394,"y":198,"id":8,"time":"2020-06-17 03:27:14.862494","s":4.5736434109},{"index":57,"x":105,"y":204,"id":14,"time":"2020-06-17 03:27:14.862494","s":4.5736434109},{"index":58,"x":271,"y":105,"id":20,"time":"2020-06-17 03:27:14.862494","s":4.5736434109},{"index":59,"x":972,"y":351,"id":27,"time":"2020-06-17 03:27:14.862494","s":4.5736434109},{"index":60,"x":1015,"y":506,"id":30,"time":"2020-06-17 03:27:14.862494","s":4.5736434109},{"index":61,"x":336,"y":116,"id":34,"time":"2020-06-17 03:27:14.862494","s":4.5736434109},{"index":62,"x":223,"y":186,"id":36,"time":"2020-06-17 03:27:14.862494","s":4.5736434109},{"index":63,"x":904,"y":296,"id":1,"time":"2020-06-17 03:27:15.624768","s":5.3359173127},{"index":64,"x":96,"y":322,"id":3,"time":"2020-06-17 03:27:15.624768","s":5.3359173127},{"index":65,"x":741,"y":567,"id":4,"time":"2020-06-17 03:27:15.624768","s":5.3359173127},{"index":66,"x":651,"y":105,"id":5,"time":"2020-06-17 03:27:15.624768","s":5.3359173127},{"index":67,"x":300,"y":231,"id":7,"time":"2020-06-17 03:27:15.624768","s":5.3359173127},{"index":68,"x":387,"y":173,"id":8,"time":"2020-06-17 03:27:15.624768","s":5.3359173127},{"index":69,"x":108,"y":202,"id":14,"time":"2020-06-17 03:27:15.624768","s":5.3359173127},{"index":70,"x":268,"y":89,"id":20,"time":"2020-06-17 03:27:15.624768","s":5.3359173127},{"index":71,"x":1052,"y":446,"id":30,"time":"2020-06-17 03:27:15.624768","s":5.3359173127},{"index":72,"x":225,"y":187,"id":36,"time":"2020-06-17 03:27:15.624768","s":5.3359173127},{"index":73,"x":476,"y":70,"id":41,"time":"2020-06-17 03:27:15.624768","s":5.3359173127},{"index":74,"x":904,"y":296,"id":1,"time":"2020-06-17 03:27:16.387042","s":6.0981912145},{"index":75,"x":112,"y":319,"id":3,"time":"2020-06-17 
03:27:16.387042","s":6.0981912145},{"index":76,"x":748,"y":568,"id":4,"time":"2020-06-17 03:27:16.387042","s":6.0981912145},{"index":77,"x":632,"y":100,"id":5,"time":"2020-06-17 03:27:16.387042","s":6.0981912145},{"index":78,"x":263,"y":237,"id":7,"time":"2020-06-17 03:27:16.387042","s":6.0981912145},{"index":79,"x":1079,"y":420,"id":30,"time":"2020-06-17 03:27:16.387042","s":6.0981912145},{"index":80,"x":276,"y":97,"id":34,"time":"2020-06-17 03:27:16.387042","s":6.0981912145},{"index":81,"x":897,"y":284,"id":1,"time":"2020-06-17 03:27:17.149316","s":6.8604651163},{"index":82,"x":118,"y":319,"id":3,"time":"2020-06-17 03:27:17.149316","s":6.8604651163},{"index":83,"x":723,"y":572,"id":4,"time":"2020-06-17 03:27:17.149316","s":6.8604651163},{"index":84,"x":637,"y":102,"id":5,"time":"2020-06-17 03:27:17.149316","s":6.8604651163},{"index":85,"x":251,"y":236,"id":7,"time":"2020-06-17 03:27:17.149316","s":6.8604651163},{"index":86,"x":325,"y":148,"id":8,"time":"2020-06-17 03:27:17.149316","s":6.8604651163},{"index":87,"x":1087,"y":416,"id":30,"time":"2020-06-17 03:27:17.149316","s":6.8604651163},{"index":88,"x":276,"y":110,"id":34,"time":"2020-06-17 03:27:17.149316","s":6.8604651163},{"index":89,"x":915,"y":286,"id":1,"time":"2020-06-17 03:27:17.911590","s":7.6227390181},{"index":90,"x":120,"y":318,"id":3,"time":"2020-06-17 03:27:17.911590","s":7.6227390181},{"index":91,"x":740,"y":572,"id":4,"time":"2020-06-17 03:27:17.911590","s":7.6227390181},{"index":92,"x":614,"y":101,"id":5,"time":"2020-06-17 03:27:17.911590","s":7.6227390181},{"index":93,"x":255,"y":202,"id":7,"time":"2020-06-17 03:27:17.911590","s":7.6227390181},{"index":94,"x":314,"y":143,"id":8,"time":"2020-06-17 03:27:17.911590","s":7.6227390181},{"index":95,"x":1058,"y":406,"id":30,"time":"2020-06-17 03:27:17.911590","s":7.6227390181},{"index":96,"x":383,"y":104,"id":46,"time":"2020-06-17 03:27:17.911590","s":7.6227390181},{"index":97,"x":326,"y":112,"id":49,"time":"2020-06-17 03:27:17.911590","s":7.6227390181},{"index":98,"x":904,"y":274,"id":1,"time":"2020-06-17 03:27:18.673864","s":8.3850129199},{"index":99,"x":120,"y":318,"id":3,"time":"2020-06-17 03:27:18.673864","s":8.3850129199},{"index":100,"x":725,"y":576,"id":4,"time":"2020-06-17 03:27:18.673864","s":8.3850129199},{"index":101,"x":640,"y":109,"id":5,"time":"2020-06-17 03:27:18.673864","s":8.3850129199},{"index":102,"x":259,"y":204,"id":7,"time":"2020-06-17 03:27:18.673864","s":8.3850129199},{"index":103,"x":760,"y":183,"id":27,"time":"2020-06-17 03:27:18.673864","s":8.3850129199},{"index":104,"x":1042,"y":371,"id":30,"time":"2020-06-17 03:27:18.673864","s":8.3850129199},{"index":105,"x":284,"y":119,"id":34,"time":"2020-06-17 03:27:18.673864","s":8.3850129199},{"index":106,"x":382,"y":105,"id":46,"time":"2020-06-17 03:27:18.673864","s":8.3850129199},{"index":107,"x":324,"y":119,"id":49,"time":"2020-06-17 03:27:18.673864","s":8.3850129199},{"index":108,"x":872,"y":247,"id":1,"time":"2020-06-17 03:27:19.436138","s":9.1472868217},{"index":109,"x":120,"y":317,"id":3,"time":"2020-06-17 03:27:19.436138","s":9.1472868217},{"index":110,"x":726,"y":574,"id":4,"time":"2020-06-17 03:27:19.436138","s":9.1472868217},{"index":111,"x":693,"y":114,"id":5,"time":"2020-06-17 03:27:19.436138","s":9.1472868217},{"index":112,"x":278,"y":184,"id":7,"time":"2020-06-17 03:27:19.436138","s":9.1472868217},{"index":113,"x":1006,"y":339,"id":30,"time":"2020-06-17 03:27:19.436138","s":9.1472868217},{"index":114,"x":381,"y":107,"id":46,"time":"2020-06-17 
03:27:19.436138","s":9.1472868217},{"index":115,"x":845,"y":190,"id":1,"time":"2020-06-17 03:27:20.198412","s":9.9095607235},{"index":116,"x":116,"y":314,"id":3,"time":"2020-06-17 03:27:20.198412","s":9.9095607235},{"index":117,"x":719,"y":570,"id":4,"time":"2020-06-17 03:27:20.198412","s":9.9095607235},{"index":118,"x":750,"y":132,"id":5,"time":"2020-06-17 03:27:20.198412","s":9.9095607235},{"index":119,"x":958,"y":310,"id":30,"time":"2020-06-17 03:27:20.198412","s":9.9095607235},{"index":120,"x":272,"y":96,"id":34,"time":"2020-06-17 03:27:20.198412","s":9.9095607235},{"index":121,"x":381,"y":116,"id":46,"time":"2020-06-17 03:27:20.198412","s":9.9095607235},{"index":122,"x":675,"y":120,"id":60,"time":"2020-06-17 03:27:20.198412","s":9.9095607235},{"index":123,"x":850,"y":181,"id":1,"time":"2020-06-17 03:27:20.960686","s":10.6718346253},{"index":124,"x":116,"y":326,"id":3,"time":"2020-06-17 03:27:20.960686","s":10.6718346253},{"index":125,"x":715,"y":569,"id":4,"time":"2020-06-17 03:27:20.960686","s":10.6718346253},{"index":126,"x":764,"y":163,"id":5,"time":"2020-06-17 03:27:20.960686","s":10.6718346253},{"index":127,"x":277,"y":184,"id":7,"time":"2020-06-17 03:27:20.960686","s":10.6718346253},{"index":128,"x":955,"y":304,"id":30,"time":"2020-06-17 03:27:20.960686","s":10.6718346253},{"index":129,"x":275,"y":91,"id":34,"time":"2020-06-17 03:27:20.960686","s":10.6718346253},{"index":130,"x":381,"y":117,"id":46,"time":"2020-06-17 03:27:20.960686","s":10.6718346253},{"index":131,"x":328,"y":122,"id":49,"time":"2020-06-17 03:27:20.960686","s":10.6718346253},{"index":132,"x":586,"y":62,"id":57,"time":"2020-06-17 03:27:20.960686","s":10.6718346253},{"index":133,"x":280,"y":228,"id":58,"time":"2020-06-17 03:27:20.960686","s":10.6718346253},{"index":134,"x":828,"y":169,"id":1,"time":"2020-06-17 03:27:21.722960","s":11.4341085271},{"index":135,"x":117,"y":317,"id":3,"time":"2020-06-17 03:27:21.722960","s":11.4341085271},{"index":136,"x":720,"y":569,"id":4,"time":"2020-06-17 03:27:21.722960","s":11.4341085271},{"index":137,"x":857,"y":236,"id":5,"time":"2020-06-17 03:27:21.722960","s":11.4341085271},{"index":138,"x":958,"y":307,"id":30,"time":"2020-06-17 03:27:21.722960","s":11.4341085271},{"index":139,"x":275,"y":95,"id":34,"time":"2020-06-17 03:27:21.722960","s":11.4341085271},{"index":140,"x":379,"y":117,"id":46,"time":"2020-06-17 03:27:21.722960","s":11.4341085271},{"index":141,"x":328,"y":115,"id":49,"time":"2020-06-17 03:27:21.722960","s":11.4341085271},{"index":142,"x":650,"y":84,"id":57,"time":"2020-06-17 03:27:21.722960","s":11.4341085271},{"index":143,"x":280,"y":195,"id":58,"time":"2020-06-17 03:27:21.722960","s":11.4341085271},{"index":144,"x":823,"y":171,"id":1,"time":"2020-06-17 03:27:22.485233","s":12.1963824289},{"index":145,"x":115,"y":332,"id":3,"time":"2020-06-17 03:27:22.485233","s":12.1963824289},{"index":146,"x":731,"y":577,"id":4,"time":"2020-06-17 03:27:22.485233","s":12.1963824289},{"index":147,"x":913,"y":316,"id":5,"time":"2020-06-17 03:27:22.485233","s":12.1963824289},{"index":148,"x":976,"y":307,"id":30,"time":"2020-06-17 03:27:22.485233","s":12.1963824289},{"index":149,"x":275,"y":85,"id":34,"time":"2020-06-17 03:27:22.485233","s":12.1963824289},{"index":150,"x":378,"y":130,"id":46,"time":"2020-06-17 03:27:22.485233","s":12.1963824289},{"index":151,"x":326,"y":114,"id":49,"time":"2020-06-17 03:27:22.485233","s":12.1963824289},{"index":152,"x":725,"y":129,"id":57,"time":"2020-06-17 
03:27:22.485233","s":12.1963824289},{"index":153,"x":282,"y":191,"id":58,"time":"2020-06-17 03:27:22.485233","s":12.1963824289},{"index":154,"x":661,"y":131,"id":60,"time":"2020-06-17 03:27:22.485233","s":12.1963824289},{"index":155,"x":216,"y":178,"id":62,"time":"2020-06-17 03:27:22.485233","s":12.1963824289},{"index":156,"x":1090,"y":569,"id":65,"time":"2020-06-17 03:27:22.485233","s":12.1963824289},{"index":157,"x":116,"y":317,"id":3,"time":"2020-06-17 03:27:23.247507","s":12.9586563307},{"index":158,"x":792,"y":553,"id":4,"time":"2020-06-17 03:27:23.247507","s":12.9586563307},{"index":159,"x":977,"y":403,"id":5,"time":"2020-06-17 03:27:23.247507","s":12.9586563307},{"index":160,"x":283,"y":231,"id":7,"time":"2020-06-17 03:27:23.247507","s":12.9586563307},{"index":161,"x":278,"y":96,"id":34,"time":"2020-06-17 03:27:23.247507","s":12.9586563307},{"index":162,"x":373,"y":120,"id":46,"time":"2020-06-17 03:27:23.247507","s":12.9586563307},{"index":163,"x":326,"y":132,"id":49,"time":"2020-06-17 03:27:23.247507","s":12.9586563307},{"index":164,"x":845,"y":198,"id":57,"time":"2020-06-17 03:27:23.247507","s":12.9586563307},{"index":165,"x":677,"y":117,"id":60,"time":"2020-06-17 03:27:23.247507","s":12.9586563307},{"index":166,"x":1027,"y":557,"id":65,"time":"2020-06-17 03:27:23.247507","s":12.9586563307},{"index":167,"x":85,"y":186,"id":67,"time":"2020-06-17 03:27:23.247507","s":12.9586563307},{"index":168,"x":766,"y":148,"id":1,"time":"2020-06-17 03:27:24.009781","s":13.7209302326},{"index":169,"x":112,"y":320,"id":3,"time":"2020-06-17 03:27:24.009781","s":13.7209302326},{"index":170,"x":808,"y":511,"id":4,"time":"2020-06-17 03:27:24.009781","s":13.7209302326},{"index":171,"x":1172,"y":458,"id":5,"time":"2020-06-17 03:27:24.009781","s":13.7209302326},{"index":172,"x":274,"y":230,"id":7,"time":"2020-06-17 03:27:24.009781","s":13.7209302326},{"index":173,"x":276,"y":85,"id":34,"time":"2020-06-17 03:27:24.009781","s":13.7209302326},{"index":174,"x":373,"y":121,"id":46,"time":"2020-06-17 03:27:24.009781","s":13.7209302326},{"index":175,"x":328,"y":117,"id":49,"time":"2020-06-17 03:27:24.009781","s":13.7209302326},{"index":176,"x":904,"y":272,"id":57,"time":"2020-06-17 03:27:24.009781","s":13.7209302326},{"index":177,"x":676,"y":116,"id":60,"time":"2020-06-17 03:27:24.009781","s":13.7209302326},{"index":178,"x":1083,"y":571,"id":65,"time":"2020-06-17 03:27:24.009781","s":13.7209302326},{"index":179,"x":772,"y":141,"id":1,"time":"2020-06-17 03:27:24.772055","s":14.4832041344},{"index":180,"x":95,"y":328,"id":3,"time":"2020-06-17 03:27:24.772055","s":14.4832041344},{"index":181,"x":812,"y":468,"id":4,"time":"2020-06-17 03:27:24.772055","s":14.4832041344},{"index":182,"x":275,"y":234,"id":7,"time":"2020-06-17 03:27:24.772055","s":14.4832041344},{"index":183,"x":281,"y":100,"id":34,"time":"2020-06-17 03:27:24.772055","s":14.4832041344},{"index":184,"x":371,"y":121,"id":46,"time":"2020-06-17 03:27:24.772055","s":14.4832041344},{"index":185,"x":322,"y":134,"id":49,"time":"2020-06-17 03:27:24.772055","s":14.4832041344},{"index":186,"x":882,"y":246,"id":57,"time":"2020-06-17 03:27:24.772055","s":14.4832041344},{"index":187,"x":661,"y":115,"id":60,"time":"2020-06-17 03:27:24.772055","s":14.4832041344},{"index":188,"x":902,"y":586,"id":65,"time":"2020-06-17 03:27:24.772055","s":14.4832041344},{"index":189,"x":93,"y":199,"id":67,"time":"2020-06-17 03:27:24.772055","s":14.4832041344},{"index":190,"x":778,"y":134,"id":1,"time":"2020-06-17 
03:27:25.534329","s":15.2454780362},{"index":191,"x":838,"y":427,"id":4,"time":"2020-06-17 03:27:25.534329","s":15.2454780362},{"index":192,"x":277,"y":99,"id":34,"time":"2020-06-17 03:27:25.534329","s":15.2454780362},{"index":193,"x":369,"y":121,"id":46,"time":"2020-06-17 03:27:25.534329","s":15.2454780362},{"index":194,"x":319,"y":114,"id":49,"time":"2020-06-17 03:27:25.534329","s":15.2454780362},{"index":195,"x":704,"y":338,"id":57,"time":"2020-06-17 03:27:25.534329","s":15.2454780362},{"index":196,"x":642,"y":116,"id":60,"time":"2020-06-17 03:27:25.534329","s":15.2454780362},{"index":197,"x":815,"y":642,"id":65,"time":"2020-06-17 03:27:25.534329","s":15.2454780362},{"index":198,"x":850,"y":185,"id":79,"time":"2020-06-17 03:27:25.534329","s":15.2454780362},{"index":199,"x":775,"y":138,"id":1,"time":"2020-06-17 03:27:26.296603","s":16.007751938},{"index":200,"x":77,"y":321,"id":3,"time":"2020-06-17 03:27:26.296603","s":16.007751938},{"index":201,"x":843,"y":398,"id":4,"time":"2020-06-17 03:27:26.296603","s":16.007751938},{"index":202,"x":271,"y":184,"id":7,"time":"2020-06-17 03:27:26.296603","s":16.007751938},{"index":203,"x":278,"y":103,"id":34,"time":"2020-06-17 03:27:26.296603","s":16.007751938},{"index":204,"x":370,"y":120,"id":46,"time":"2020-06-17 03:27:26.296603","s":16.007751938},{"index":205,"x":649,"y":117,"id":60,"time":"2020-06-17 03:27:26.296603","s":16.007751938},{"index":206,"x":677,"y":636,"id":65,"time":"2020-06-17 03:27:26.296603","s":16.007751938},{"index":207,"x":850,"y":170,"id":79,"time":"2020-06-17 03:27:26.296603","s":16.007751938},{"index":208,"x":213,"y":149,"id":81,"time":"2020-06-17 03:27:26.296603","s":16.007751938},{"index":209,"x":762,"y":140,"id":1,"time":"2020-06-17 03:27:27.058877","s":16.7700258398},{"index":210,"x":80,"y":327,"id":3,"time":"2020-06-17 03:27:27.058877","s":16.7700258398},{"index":211,"x":892,"y":351,"id":4,"time":"2020-06-17 03:27:27.058877","s":16.7700258398},{"index":212,"x":277,"y":190,"id":7,"time":"2020-06-17 03:27:27.058877","s":16.7700258398},{"index":213,"x":371,"y":128,"id":46,"time":"2020-06-17 03:27:27.058877","s":16.7700258398},{"index":214,"x":653,"y":116,"id":60,"time":"2020-06-17 03:27:27.058877","s":16.7700258398},{"index":215,"x":653,"y":656,"id":65,"time":"2020-06-17 03:27:27.058877","s":16.7700258398},{"index":216,"x":93,"y":200,"id":67,"time":"2020-06-17 03:27:27.058877","s":16.7700258398},{"index":217,"x":855,"y":203,"id":79,"time":"2020-06-17 03:27:27.058877","s":16.7700258398},{"index":218,"x":205,"y":153,"id":81,"time":"2020-06-17 03:27:27.058877","s":16.7700258398},{"index":219,"x":269,"y":451,"id":82,"time":"2020-06-17 03:27:27.058877","s":16.7700258398},{"index":220,"x":759,"y":145,"id":1,"time":"2020-06-17 03:27:27.821151","s":17.5322997416},{"index":221,"x":73,"y":294,"id":3,"time":"2020-06-17 03:27:27.821151","s":17.5322997416},{"index":222,"x":886,"y":302,"id":4,"time":"2020-06-17 03:27:27.821151","s":17.5322997416},{"index":223,"x":271,"y":182,"id":7,"time":"2020-06-17 03:27:27.821151","s":17.5322997416},{"index":224,"x":322,"y":141,"id":34,"time":"2020-06-17 03:27:27.821151","s":17.5322997416},{"index":225,"x":371,"y":120,"id":46,"time":"2020-06-17 03:27:27.821151","s":17.5322997416},{"index":226,"x":661,"y":116,"id":60,"time":"2020-06-17 03:27:27.821151","s":17.5322997416},{"index":227,"x":88,"y":185,"id":67,"time":"2020-06-17 03:27:27.821151","s":17.5322997416},{"index":228,"x":850,"y":179,"id":79,"time":"2020-06-17 
03:27:27.821151","s":17.5322997416},{"index":229,"x":184,"y":489,"id":82,"time":"2020-06-17 03:27:27.821151","s":17.5322997416},{"index":230,"x":693,"y":671,"id":84,"time":"2020-06-17 03:27:27.821151","s":17.5322997416},{"index":231,"x":279,"y":109,"id":86,"time":"2020-06-17 03:27:27.821151","s":17.5322997416},{"index":232,"x":760,"y":146,"id":1,"time":"2020-06-17 03:27:28.583425","s":18.2945736434},{"index":233,"x":69,"y":321,"id":3,"time":"2020-06-17 03:27:28.583425","s":18.2945736434},{"index":234,"x":893,"y":275,"id":4,"time":"2020-06-17 03:27:28.583425","s":18.2945736434},{"index":235,"x":373,"y":120,"id":46,"time":"2020-06-17 03:27:28.583425","s":18.2945736434},{"index":236,"x":102,"y":197,"id":67,"time":"2020-06-17 03:27:28.583425","s":18.2945736434},{"index":237,"x":689,"y":145,"id":76,"time":"2020-06-17 03:27:28.583425","s":18.2945736434},{"index":238,"x":830,"y":196,"id":79,"time":"2020-06-17 03:27:28.583425","s":18.2945736434},{"index":239,"x":209,"y":538,"id":82,"time":"2020-06-17 03:27:28.583425","s":18.2945736434},{"index":240,"x":652,"y":670,"id":84,"time":"2020-06-17 03:27:28.583425","s":18.2945736434},{"index":241,"x":280,"y":107,"id":86,"time":"2020-06-17 03:27:28.583425","s":18.2945736434},{"index":242,"x":252,"y":143,"id":87,"time":"2020-06-17 03:27:28.583425","s":18.2945736434},{"index":243,"x":757,"y":148,"id":1,"time":"2020-06-17 03:27:29.345699","s":19.0568475452},{"index":244,"x":76,"y":349,"id":3,"time":"2020-06-17 03:27:29.345699","s":19.0568475452},{"index":245,"x":844,"y":203,"id":4,"time":"2020-06-17 03:27:29.345699","s":19.0568475452},{"index":246,"x":253,"y":186,"id":7,"time":"2020-06-17 03:27:29.345699","s":19.0568475452},{"index":247,"x":319,"y":114,"id":34,"time":"2020-06-17 03:27:29.345699","s":19.0568475452},{"index":248,"x":373,"y":119,"id":46,"time":"2020-06-17 03:27:29.345699","s":19.0568475452},{"index":249,"x":105,"y":213,"id":67,"time":"2020-06-17 03:27:29.345699","s":19.0568475452},{"index":250,"x":694,"y":146,"id":76,"time":"2020-06-17 03:27:29.345699","s":19.0568475452},{"index":251,"x":214,"y":599,"id":82,"time":"2020-06-17 03:27:29.345699","s":19.0568475452},{"index":252,"x":654,"y":672,"id":84,"time":"2020-06-17 03:27:29.345699","s":19.0568475452},{"index":253,"x":278,"y":106,"id":86,"time":"2020-06-17 03:27:29.345699","s":19.0568475452},{"index":254,"x":246,"y":145,"id":87,"time":"2020-06-17 03:27:29.345699","s":19.0568475452},{"index":255,"x":749,"y":149,"id":1,"time":"2020-06-17 03:27:30.107972","s":19.819121447},{"index":256,"x":79,"y":354,"id":3,"time":"2020-06-17 03:27:30.107972","s":19.819121447},{"index":257,"x":820,"y":178,"id":4,"time":"2020-06-17 03:27:30.107972","s":19.819121447},{"index":258,"x":253,"y":198,"id":7,"time":"2020-06-17 03:27:30.107972","s":19.819121447},{"index":259,"x":273,"y":105,"id":34,"time":"2020-06-17 03:27:30.107972","s":19.819121447},{"index":260,"x":372,"y":121,"id":46,"time":"2020-06-17 03:27:30.107972","s":19.819121447},{"index":261,"x":108,"y":208,"id":67,"time":"2020-06-17 03:27:30.107972","s":19.819121447},{"index":262,"x":212,"y":592,"id":82,"time":"2020-06-17 03:27:30.107972","s":19.819121447},{"index":263,"x":285,"y":138,"id":86,"time":"2020-06-17 03:27:30.107972","s":19.819121447},{"index":264,"x":245,"y":145,"id":87,"time":"2020-06-17 03:27:30.107972","s":19.819121447},{"index":265,"x":64,"y":347,"id":3,"time":"2020-06-17 03:27:30.870246","s":20.5813953488},{"index":266,"x":788,"y":173,"id":4,"time":"2020-06-17 
03:27:30.870246","s":20.5813953488},{"index":267,"x":253,"y":223,"id":7,"time":"2020-06-17 03:27:30.870246","s":20.5813953488},{"index":268,"x":372,"y":122,"id":46,"time":"2020-06-17 03:27:30.870246","s":20.5813953488},{"index":269,"x":105,"y":221,"id":67,"time":"2020-06-17 03:27:30.870246","s":20.5813953488},{"index":270,"x":219,"y":493,"id":82,"time":"2020-06-17 03:27:30.870246","s":20.5813953488},{"index":271,"x":286,"y":137,"id":86,"time":"2020-06-17 03:27:30.870246","s":20.5813953488},{"index":272,"x":248,"y":135,"id":87,"time":"2020-06-17 03:27:30.870246","s":20.5813953488},{"index":273,"x":69,"y":280,"id":3,"time":"2020-06-17 03:27:31.632520","s":21.3436692506},{"index":274,"x":804,"y":175,"id":4,"time":"2020-06-17 03:27:31.632520","s":21.3436692506},{"index":275,"x":275,"y":180,"id":7,"time":"2020-06-17 03:27:31.632520","s":21.3436692506},{"index":276,"x":277,"y":102,"id":34,"time":"2020-06-17 03:27:31.632520","s":21.3436692506},{"index":277,"x":372,"y":121,"id":46,"time":"2020-06-17 03:27:31.632520","s":21.3436692506},{"index":278,"x":108,"y":203,"id":67,"time":"2020-06-17 03:27:31.632520","s":21.3436692506},{"index":279,"x":193,"y":489,"id":82,"time":"2020-06-17 03:27:31.632520","s":21.3436692506},{"index":280,"x":680,"y":676,"id":84,"time":"2020-06-17 03:27:31.632520","s":21.3436692506},{"index":281,"x":287,"y":243,"id":91,"time":"2020-06-17 03:27:31.632520","s":21.3436692506},{"index":282,"x":750,"y":107,"id":1,"time":"2020-06-17 03:27:32.394794","s":22.1059431525},{"index":283,"x":78,"y":334,"id":3,"time":"2020-06-17 03:27:32.394794","s":22.1059431525},{"index":284,"x":793,"y":172,"id":4,"time":"2020-06-17 03:27:32.394794","s":22.1059431525},{"index":285,"x":318,"y":110,"id":34,"time":"2020-06-17 03:27:32.394794","s":22.1059431525},{"index":286,"x":372,"y":121,"id":46,"time":"2020-06-17 03:27:32.394794","s":22.1059431525},{"index":287,"x":122,"y":209,"id":67,"time":"2020-06-17 03:27:32.394794","s":22.1059431525},{"index":288,"x":228,"y":471,"id":82,"time":"2020-06-17 03:27:32.394794","s":22.1059431525},{"index":289,"x":272,"y":85,"id":86,"time":"2020-06-17 03:27:32.394794","s":22.1059431525},{"index":290,"x":300,"y":240,"id":91,"time":"2020-06-17 03:27:32.394794","s":22.1059431525},{"index":291,"x":647,"y":111,"id":92,"time":"2020-06-17 03:27:32.394794","s":22.1059431525},{"index":292,"x":227,"y":149,"id":93,"time":"2020-06-17 03:27:32.394794","s":22.1059431525},{"index":293,"x":84,"y":343,"id":3,"time":"2020-06-17 03:27:33.157068","s":22.8682170543},{"index":294,"x":791,"y":175,"id":4,"time":"2020-06-17 03:27:33.157068","s":22.8682170543},{"index":295,"x":304,"y":205,"id":7,"time":"2020-06-17 03:27:33.157068","s":22.8682170543},{"index":296,"x":318,"y":114,"id":34,"time":"2020-06-17 03:27:33.157068","s":22.8682170543},{"index":297,"x":372,"y":120,"id":46,"time":"2020-06-17 03:27:33.157068","s":22.8682170543},{"index":298,"x":94,"y":207,"id":67,"time":"2020-06-17 03:27:33.157068","s":22.8682170543},{"index":299,"x":292,"y":442,"id":82,"time":"2020-06-17 03:27:33.157068","s":22.8682170543},{"index":300,"x":273,"y":87,"id":86,"time":"2020-06-17 03:27:33.157068","s":22.8682170543},{"index":301,"x":217,"y":168,"id":93,"time":"2020-06-17 03:27:33.157068","s":22.8682170543},{"index":302,"x":744,"y":101,"id":1,"time":"2020-06-17 03:27:33.919342","s":23.6304909561},{"index":303,"x":85,"y":342,"id":3,"time":"2020-06-17 03:27:33.919342","s":23.6304909561},{"index":304,"x":779,"y":179,"id":4,"time":"2020-06-17 
03:27:33.919342","s":23.6304909561},{"index":305,"x":319,"y":115,"id":34,"time":"2020-06-17 03:27:33.919342","s":23.6304909561},{"index":306,"x":373,"y":113,"id":46,"time":"2020-06-17 03:27:33.919342","s":23.6304909561},{"index":307,"x":107,"y":211,"id":67,"time":"2020-06-17 03:27:33.919342","s":23.6304909561},{"index":308,"x":401,"y":356,"id":82,"time":"2020-06-17 03:27:33.919342","s":23.6304909561},{"index":309,"x":272,"y":90,"id":86,"time":"2020-06-17 03:27:33.919342","s":23.6304909561},{"index":310,"x":289,"y":212,"id":91,"time":"2020-06-17 03:27:33.919342","s":23.6304909561},{"index":311,"x":214,"y":170,"id":93,"time":"2020-06-17 03:27:33.919342","s":23.6304909561},{"index":312,"x":688,"y":143,"id":94,"time":"2020-06-17 03:27:33.919342","s":23.6304909561},{"index":313,"x":746,"y":101,"id":1,"time":"2020-06-17 03:27:34.681616","s":24.3927648579},{"index":314,"x":100,"y":331,"id":3,"time":"2020-06-17 03:27:34.681616","s":24.3927648579},{"index":315,"x":778,"y":175,"id":4,"time":"2020-06-17 03:27:34.681616","s":24.3927648579},{"index":316,"x":320,"y":112,"id":34,"time":"2020-06-17 03:27:34.681616","s":24.3927648579},{"index":317,"x":373,"y":121,"id":46,"time":"2020-06-17 03:27:34.681616","s":24.3927648579},{"index":318,"x":436,"y":299,"id":82,"time":"2020-06-17 03:27:34.681616","s":24.3927648579},{"index":319,"x":271,"y":99,"id":86,"time":"2020-06-17 03:27:34.681616","s":24.3927648579},{"index":320,"x":293,"y":244,"id":91,"time":"2020-06-17 03:27:34.681616","s":24.3927648579},{"index":321,"x":656,"y":105,"id":92,"time":"2020-06-17 03:27:34.681616","s":24.3927648579},{"index":322,"x":215,"y":166,"id":93,"time":"2020-06-17 03:27:34.681616","s":24.3927648579},{"index":323,"x":689,"y":144,"id":94,"time":"2020-06-17 03:27:34.681616","s":24.3927648579},{"index":324,"x":747,"y":105,"id":1,"time":"2020-06-17 03:27:35.443890","s":25.1550387597},{"index":325,"x":106,"y":317,"id":3,"time":"2020-06-17 03:27:35.443890","s":25.1550387597},{"index":326,"x":772,"y":167,"id":4,"time":"2020-06-17 03:27:35.443890","s":25.1550387597},{"index":327,"x":321,"y":109,"id":34,"time":"2020-06-17 03:27:35.443890","s":25.1550387597},{"index":328,"x":372,"y":120,"id":46,"time":"2020-06-17 03:27:35.443890","s":25.1550387597},{"index":329,"x":116,"y":201,"id":67,"time":"2020-06-17 03:27:35.443890","s":25.1550387597},{"index":330,"x":460,"y":234,"id":82,"time":"2020-06-17 03:27:35.443890","s":25.1550387597},{"index":331,"x":274,"y":85,"id":86,"time":"2020-06-17 03:27:35.443890","s":25.1550387597},{"index":332,"x":301,"y":239,"id":91,"time":"2020-06-17 03:27:35.443890","s":25.1550387597},{"index":333,"x":639,"y":106,"id":92,"time":"2020-06-17 03:27:35.443890","s":25.1550387597},{"index":334,"x":229,"y":166,"id":93,"time":"2020-06-17 03:27:35.443890","s":25.1550387597},{"index":335,"x":100,"y":319,"id":3,"time":"2020-06-17 03:27:36.206164","s":25.9173126615},{"index":336,"x":761,"y":167,"id":4,"time":"2020-06-17 03:27:36.206164","s":25.9173126615},{"index":337,"x":312,"y":116,"id":34,"time":"2020-06-17 03:27:36.206164","s":25.9173126615},{"index":338,"x":371,"y":121,"id":46,"time":"2020-06-17 03:27:36.206164","s":25.9173126615},{"index":339,"x":122,"y":204,"id":67,"time":"2020-06-17 03:27:36.206164","s":25.9173126615},{"index":340,"x":431,"y":216,"id":82,"time":"2020-06-17 03:27:36.206164","s":25.9173126615},{"index":341,"x":255,"y":85,"id":86,"time":"2020-06-17 03:27:36.206164","s":25.9173126615},{"index":342,"x":280,"y":236,"id":91,"time":"2020-06-17 
03:27:36.206164","s":25.9173126615},{"index":343,"x":630,"y":108,"id":92,"time":"2020-06-17 03:27:36.206164","s":25.9173126615},{"index":344,"x":826,"y":182,"id":96,"time":"2020-06-17 03:27:36.206164","s":25.9173126615},{"index":345,"x":97,"y":326,"id":3,"time":"2020-06-17 03:27:36.968438","s":26.6795865633},{"index":346,"x":722,"y":150,"id":4,"time":"2020-06-17 03:27:36.968438","s":26.6795865633},{"index":347,"x":315,"y":110,"id":34,"time":"2020-06-17 03:27:36.968438","s":26.6795865633},{"index":348,"x":119,"y":210,"id":67,"time":"2020-06-17 03:27:36.968438","s":26.6795865633},{"index":349,"x":382,"y":174,"id":82,"time":"2020-06-17 03:27:36.968438","s":26.6795865633},{"index":350,"x":271,"y":83,"id":86,"time":"2020-06-17 03:27:36.968438","s":26.6795865633},{"index":351,"x":288,"y":242,"id":91,"time":"2020-06-17 03:27:36.968438","s":26.6795865633},{"index":352,"x":653,"y":109,"id":92,"time":"2020-06-17 03:27:36.968438","s":26.6795865633},{"index":353,"x":829,"y":194,"id":96,"time":"2020-06-17 03:27:36.968438","s":26.6795865633},{"index":354,"x":692,"y":122,"id":1,"time":"2020-06-17 03:27:37.730711","s":27.4418604651},{"index":355,"x":102,"y":318,"id":3,"time":"2020-06-17 03:27:37.730711","s":27.4418604651},{"index":356,"x":752,"y":157,"id":4,"time":"2020-06-17 03:27:37.730711","s":27.4418604651},{"index":357,"x":310,"y":99,"id":34,"time":"2020-06-17 03:27:37.730711","s":27.4418604651},{"index":358,"x":120,"y":211,"id":67,"time":"2020-06-17 03:27:37.730711","s":27.4418604651},{"index":359,"x":369,"y":147,"id":82,"time":"2020-06-17 03:27:37.730711","s":27.4418604651},{"index":360,"x":258,"y":85,"id":86,"time":"2020-06-17 03:27:37.730711","s":27.4418604651},{"index":361,"x":315,"y":247,"id":91,"time":"2020-06-17 03:27:37.730711","s":27.4418604651},{"index":362,"x":825,"y":186,"id":96,"time":"2020-06-17 03:27:37.730711","s":27.4418604651},{"index":363,"x":242,"y":169,"id":99,"time":"2020-06-17 03:27:37.730711","s":27.4418604651},{"index":364,"x":705,"y":131,"id":1,"time":"2020-06-17 03:27:38.492985","s":28.2041343669},{"index":365,"x":101,"y":318,"id":3,"time":"2020-06-17 03:27:38.492985","s":28.2041343669},{"index":366,"x":745,"y":157,"id":4,"time":"2020-06-17 03:27:38.492985","s":28.2041343669},{"index":367,"x":305,"y":86,"id":34,"time":"2020-06-17 03:27:38.492985","s":28.2041343669},{"index":368,"x":123,"y":205,"id":67,"time":"2020-06-17 03:27:38.492985","s":28.2041343669},{"index":369,"x":384,"y":134,"id":82,"time":"2020-06-17 03:27:38.492985","s":28.2041343669},{"index":370,"x":255,"y":85,"id":86,"time":"2020-06-17 03:27:38.492985","s":28.2041343669},{"index":371,"x":314,"y":246,"id":91,"time":"2020-06-17 03:27:38.492985","s":28.2041343669},{"index":372,"x":820,"y":191,"id":96,"time":"2020-06-17 03:27:38.492985","s":28.2041343669},{"index":373,"x":243,"y":151,"id":99,"time":"2020-06-17 03:27:38.492985","s":28.2041343669},{"index":374,"x":104,"y":318,"id":3,"time":"2020-06-17 03:27:39.255259","s":28.9664082687},{"index":375,"x":713,"y":147,"id":4,"time":"2020-06-17 03:27:39.255259","s":28.9664082687},{"index":376,"x":294,"y":84,"id":34,"time":"2020-06-17 03:27:39.255259","s":28.9664082687},{"index":377,"x":125,"y":207,"id":67,"time":"2020-06-17 03:27:39.255259","s":28.9664082687},{"index":378,"x":253,"y":84,"id":86,"time":"2020-06-17 03:27:39.255259","s":28.9664082687},{"index":379,"x":319,"y":246,"id":91,"time":"2020-06-17 03:27:39.255259","s":28.9664082687},{"index":380,"x":819,"y":197,"id":96,"time":"2020-06-17 
03:27:39.255259","s":28.9664082687},{"index":381,"x":244,"y":174,"id":99,"time":"2020-06-17 03:27:39.255259","s":28.9664082687},{"index":382,"x":344,"y":101,"id":101,"time":"2020-06-17 03:27:39.255259","s":28.9664082687},{"index":383,"x":471,"y":68,"id":103,"time":"2020-06-17 03:27:39.255259","s":28.9664082687},{"index":384,"x":100,"y":319,"id":3,"time":"2020-06-17 03:27:40.017533","s":29.7286821705},{"index":385,"x":711,"y":149,"id":4,"time":"2020-06-17 03:27:40.017533","s":29.7286821705},{"index":386,"x":126,"y":213,"id":67,"time":"2020-06-17 03:27:40.017533","s":29.7286821705},{"index":387,"x":250,"y":78,"id":86,"time":"2020-06-17 03:27:40.017533","s":29.7286821705},{"index":388,"x":292,"y":243,"id":91,"time":"2020-06-17 03:27:40.017533","s":29.7286821705},{"index":389,"x":818,"y":198,"id":96,"time":"2020-06-17 03:27:40.017533","s":29.7286821705},{"index":390,"x":242,"y":133,"id":99,"time":"2020-06-17 03:27:40.017533","s":29.7286821705},{"index":391,"x":348,"y":132,"id":101,"time":"2020-06-17 03:27:40.017533","s":29.7286821705},{"index":392,"x":512,"y":56,"id":103,"time":"2020-06-17 03:27:40.017533","s":29.7286821705},{"index":393,"x":661,"y":113,"id":104,"time":"2020-06-17 03:27:40.017533","s":29.7286821705},{"index":394,"x":101,"y":317,"id":3,"time":"2020-06-17 03:27:40.779807","s":30.4909560724},{"index":395,"x":713,"y":143,"id":4,"time":"2020-06-17 03:27:40.779807","s":30.4909560724},{"index":396,"x":120,"y":206,"id":67,"time":"2020-06-17 03:27:40.779807","s":30.4909560724},{"index":397,"x":296,"y":249,"id":91,"time":"2020-06-17 03:27:40.779807","s":30.4909560724},{"index":398,"x":820,"y":198,"id":96,"time":"2020-06-17 03:27:40.779807","s":30.4909560724},{"index":399,"x":349,"y":132,"id":101,"time":"2020-06-17 03:27:40.779807","s":30.4909560724},{"index":400,"x":488,"y":48,"id":103,"time":"2020-06-17 03:27:40.779807","s":30.4909560724},{"index":401,"x":659,"y":108,"id":104,"time":"2020-06-17 03:27:40.779807","s":30.4909560724},{"index":402,"x":305,"y":101,"id":105,"time":"2020-06-17 03:27:40.779807","s":30.4909560724},{"index":403,"x":88,"y":318,"id":3,"time":"2020-06-17 03:27:41.542081","s":31.2532299742},{"index":404,"x":715,"y":153,"id":4,"time":"2020-06-17 03:27:41.542081","s":31.2532299742},{"index":405,"x":306,"y":87,"id":34,"time":"2020-06-17 03:27:41.542081","s":31.2532299742},{"index":406,"x":258,"y":81,"id":86,"time":"2020-06-17 03:27:41.542081","s":31.2532299742},{"index":407,"x":308,"y":247,"id":91,"time":"2020-06-17 03:27:41.542081","s":31.2532299742},{"index":408,"x":815,"y":198,"id":96,"time":"2020-06-17 03:27:41.542081","s":31.2532299742},{"index":409,"x":256,"y":140,"id":99,"time":"2020-06-17 03:27:41.542081","s":31.2532299742},{"index":410,"x":351,"y":97,"id":101,"time":"2020-06-17 03:27:41.542081","s":31.2532299742},{"index":411,"x":88,"y":314,"id":3,"time":"2020-06-17 03:27:42.304355","s":32.015503876},{"index":412,"x":306,"y":85,"id":34,"time":"2020-06-17 03:27:42.304355","s":32.015503876},{"index":413,"x":257,"y":78,"id":86,"time":"2020-06-17 03:27:42.304355","s":32.015503876},{"index":414,"x":310,"y":250,"id":91,"time":"2020-06-17 03:27:42.304355","s":32.015503876},{"index":415,"x":823,"y":196,"id":96,"time":"2020-06-17 03:27:42.304355","s":32.015503876},{"index":416,"x":250,"y":138,"id":99,"time":"2020-06-17 03:27:42.304355","s":32.015503876},{"index":417,"x":349,"y":99,"id":101,"time":"2020-06-17 03:27:42.304355","s":32.015503876},{"index":418,"x":92,"y":308,"id":3,"time":"2020-06-17 
03:27:43.066629","s":32.7777777778},{"index":419,"x":367,"y":267,"id":91,"time":"2020-06-17 03:27:43.066629","s":32.7777777778},{"index":420,"x":823,"y":197,"id":96,"time":"2020-06-17 03:27:43.066629","s":32.7777777778},{"index":421,"x":261,"y":197,"id":99,"time":"2020-06-17 03:27:43.066629","s":32.7777777778},{"index":422,"x":351,"y":87,"id":101,"time":"2020-06-17 03:27:43.066629","s":32.7777777778},{"index":423,"x":306,"y":113,"id":105,"time":"2020-06-17 03:27:43.066629","s":32.7777777778},{"index":424,"x":657,"y":671,"id":112,"time":"2020-06-17 03:27:43.066629","s":32.7777777778},{"index":425,"x":124,"y":266,"id":113,"time":"2020-06-17 03:27:43.066629","s":32.7777777778},{"index":426,"x":94,"y":316,"id":3,"time":"2020-06-17 03:27:43.828903","s":33.5400516796},{"index":427,"x":105,"y":207,"id":67,"time":"2020-06-17 03:27:43.828903","s":33.5400516796},{"index":428,"x":410,"y":276,"id":91,"time":"2020-06-17 03:27:43.828903","s":33.5400516796},{"index":429,"x":823,"y":196,"id":96,"time":"2020-06-17 03:27:43.828903","s":33.5400516796},{"index":430,"x":261,"y":205,"id":99,"time":"2020-06-17 03:27:43.828903","s":33.5400516796},{"index":431,"x":309,"y":141,"id":105,"time":"2020-06-17 03:27:43.828903","s":33.5400516796},{"index":432,"x":656,"y":671,"id":112,"time":"2020-06-17 03:27:43.828903","s":33.5400516796},{"index":433,"x":348,"y":139,"id":114,"time":"2020-06-17 03:27:43.828903","s":33.5400516796},{"index":434,"x":96,"y":314,"id":3,"time":"2020-06-17 03:27:44.591177","s":34.3023255814},{"index":435,"x":731,"y":147,"id":4,"time":"2020-06-17 03:27:44.591177","s":34.3023255814},{"index":436,"x":94,"y":204,"id":67,"time":"2020-06-17 03:27:44.591177","s":34.3023255814},{"index":437,"x":423,"y":268,"id":91,"time":"2020-06-17 03:27:44.591177","s":34.3023255814},{"index":438,"x":822,"y":198,"id":96,"time":"2020-06-17 03:27:44.591177","s":34.3023255814},{"index":439,"x":271,"y":204,"id":99,"time":"2020-06-17 03:27:44.591177","s":34.3023255814},{"index":440,"x":309,"y":139,"id":105,"time":"2020-06-17 03:27:44.591177","s":34.3023255814},{"index":441,"x":671,"y":99,"id":108,"time":"2020-06-17 03:27:44.591177","s":34.3023255814},{"index":442,"x":657,"y":671,"id":112,"time":"2020-06-17 03:27:44.591177","s":34.3023255814},{"index":443,"x":347,"y":136,"id":114,"time":"2020-06-17 03:27:44.591177","s":34.3023255814},{"index":444,"x":97,"y":311,"id":3,"time":"2020-06-17 03:27:45.353450","s":35.0645994832},{"index":445,"x":738,"y":143,"id":4,"time":"2020-06-17 03:27:45.353450","s":35.0645994832},{"index":446,"x":95,"y":207,"id":67,"time":"2020-06-17 03:27:45.353450","s":35.0645994832},{"index":447,"x":476,"y":281,"id":91,"time":"2020-06-17 03:27:45.353450","s":35.0645994832},{"index":448,"x":822,"y":197,"id":96,"time":"2020-06-17 03:27:45.353450","s":35.0645994832},{"index":449,"x":243,"y":168,"id":99,"time":"2020-06-17 03:27:45.353450","s":35.0645994832},{"index":450,"x":667,"y":101,"id":108,"time":"2020-06-17 03:27:45.353450","s":35.0645994832},{"index":451,"x":661,"y":672,"id":112,"time":"2020-06-17 03:27:45.353450","s":35.0645994832},{"index":452,"x":349,"y":134,"id":114,"time":"2020-06-17 03:27:45.353450","s":35.0645994832},{"index":453,"x":95,"y":289,"id":3,"time":"2020-06-17 03:27:46.115724","s":35.826873385},{"index":454,"x":738,"y":140,"id":4,"time":"2020-06-17 03:27:46.115724","s":35.826873385},{"index":455,"x":96,"y":212,"id":67,"time":"2020-06-17 03:27:46.115724","s":35.826873385},{"index":456,"x":824,"y":196,"id":96,"time":"2020-06-17 
03:27:46.115724","s":35.826873385},{"index":457,"x":252,"y":172,"id":99,"time":"2020-06-17 03:27:46.115724","s":35.826873385},{"index":458,"x":261,"y":101,"id":105,"time":"2020-06-17 03:27:46.115724","s":35.826873385},{"index":459,"x":662,"y":101,"id":108,"time":"2020-06-17 03:27:46.115724","s":35.826873385},{"index":460,"x":661,"y":672,"id":112,"time":"2020-06-17 03:27:46.115724","s":35.826873385},{"index":461,"x":349,"y":134,"id":114,"time":"2020-06-17 03:27:46.115724","s":35.826873385},{"index":462,"x":632,"y":364,"id":115,"time":"2020-06-17 03:27:46.115724","s":35.826873385},{"index":463,"x":83,"y":327,"id":3,"time":"2020-06-17 03:27:46.877998","s":36.5891472868},{"index":464,"x":739,"y":149,"id":4,"time":"2020-06-17 03:27:46.877998","s":36.5891472868},{"index":465,"x":822,"y":197,"id":96,"time":"2020-06-17 03:27:46.877998","s":36.5891472868},{"index":466,"x":247,"y":184,"id":99,"time":"2020-06-17 03:27:46.877998","s":36.5891472868},{"index":467,"x":643,"y":108,"id":108,"time":"2020-06-17 03:27:46.877998","s":36.5891472868},{"index":468,"x":654,"y":666,"id":112,"time":"2020-06-17 03:27:46.877998","s":36.5891472868},{"index":469,"x":345,"y":134,"id":114,"time":"2020-06-17 03:27:46.877998","s":36.5891472868},{"index":470,"x":811,"y":551,"id":115,"time":"2020-06-17 03:27:46.877998","s":36.5891472868},{"index":471,"x":76,"y":336,"id":3,"time":"2020-06-17 03:27:47.640272","s":37.3514211886},{"index":472,"x":738,"y":146,"id":4,"time":"2020-06-17 03:27:47.640272","s":37.3514211886},{"index":473,"x":113,"y":217,"id":67,"time":"2020-06-17 03:27:47.640272","s":37.3514211886},{"index":474,"x":824,"y":196,"id":96,"time":"2020-06-17 03:27:47.640272","s":37.3514211886},{"index":475,"x":239,"y":168,"id":99,"time":"2020-06-17 03:27:47.640272","s":37.3514211886},{"index":476,"x":648,"y":104,"id":108,"time":"2020-06-17 03:27:47.640272","s":37.3514211886},{"index":477,"x":612,"y":639,"id":112,"time":"2020-06-17 03:27:47.640272","s":37.3514211886},{"index":478,"x":347,"y":134,"id":114,"time":"2020-06-17 03:27:47.640272","s":37.3514211886},{"index":479,"x":943,"y":642,"id":115,"time":"2020-06-17 03:27:47.640272","s":37.3514211886},{"index":480,"x":112,"y":240,"id":116,"time":"2020-06-17 03:27:47.640272","s":37.3514211886},{"index":481,"x":105,"y":331,"id":3,"time":"2020-06-17 03:27:48.402546","s":38.1136950904},{"index":482,"x":742,"y":147,"id":4,"time":"2020-06-17 03:27:48.402546","s":38.1136950904},{"index":483,"x":823,"y":197,"id":96,"time":"2020-06-17 03:27:48.402546","s":38.1136950904},{"index":484,"x":245,"y":166,"id":99,"time":"2020-06-17 03:27:48.402546","s":38.1136950904},{"index":485,"x":261,"y":83,"id":105,"time":"2020-06-17 03:27:48.402546","s":38.1136950904},{"index":486,"x":648,"y":119,"id":108,"time":"2020-06-17 03:27:48.402546","s":38.1136950904},{"index":487,"x":532,"y":674,"id":112,"time":"2020-06-17 03:27:48.402546","s":38.1136950904},{"index":488,"x":347,"y":134,"id":114,"time":"2020-06-17 03:27:48.402546","s":38.1136950904},{"index":489,"x":104,"y":332,"id":3,"time":"2020-06-17 03:27:49.164820","s":38.8759689922},{"index":490,"x":717,"y":142,"id":4,"time":"2020-06-17 03:27:49.164820","s":38.8759689922},{"index":491,"x":821,"y":197,"id":96,"time":"2020-06-17 03:27:49.164820","s":38.8759689922},{"index":492,"x":296,"y":171,"id":99,"time":"2020-06-17 03:27:49.164820","s":38.8759689922},{"index":493,"x":646,"y":104,"id":108,"time":"2020-06-17 03:27:49.164820","s":38.8759689922},{"index":494,"x":344,"y":133,"id":114,"time":"2020-06-17 
03:27:49.164820","s":38.8759689922},{"index":495,"x":247,"y":143,"id":120,"time":"2020-06-17 03:27:49.164820","s":38.8759689922},{"index":496,"x":218,"y":185,"id":121,"time":"2020-06-17 03:27:49.164820","s":38.8759689922},{"index":497,"x":78,"y":333,"id":3,"time":"2020-06-17 03:27:49.927094","s":39.6382428941},{"index":498,"x":740,"y":148,"id":4,"time":"2020-06-17 03:27:49.927094","s":39.6382428941},{"index":499,"x":819,"y":198,"id":96,"time":"2020-06-17 03:27:49.927094","s":39.6382428941},{"index":500,"x":295,"y":179,"id":99,"time":"2020-06-17 03:27:49.927094","s":39.6382428941},{"index":501,"x":648,"y":109,"id":108,"time":"2020-06-17 03:27:49.927094","s":39.6382428941},{"index":502,"x":345,"y":135,"id":114,"time":"2020-06-17 03:27:49.927094","s":39.6382428941},{"index":503,"x":109,"y":236,"id":116,"time":"2020-06-17 03:27:49.927094","s":39.6382428941},{"index":504,"x":243,"y":143,"id":120,"time":"2020-06-17 03:27:49.927094","s":39.6382428941},{"index":505,"x":220,"y":186,"id":121,"time":"2020-06-17 03:27:49.927094","s":39.6382428941},{"index":506,"x":491,"y":676,"id":123,"time":"2020-06-17 03:27:49.927094","s":39.6382428941},{"index":507,"x":948,"y":607,"id":124,"time":"2020-06-17 03:27:49.927094","s":39.6382428941},{"index":508,"x":323,"y":264,"id":125,"time":"2020-06-17 03:27:49.927094","s":39.6382428941},{"index":509,"x":74,"y":338,"id":3,"time":"2020-06-17 03:27:50.689368","s":40.4005167959},{"index":510,"x":719,"y":149,"id":4,"time":"2020-06-17 03:27:50.689368","s":40.4005167959},{"index":511,"x":820,"y":197,"id":96,"time":"2020-06-17 03:27:50.689368","s":40.4005167959},{"index":512,"x":295,"y":204,"id":99,"time":"2020-06-17 03:27:50.689368","s":40.4005167959},{"index":513,"x":647,"y":108,"id":108,"time":"2020-06-17 03:27:50.689368","s":40.4005167959},{"index":514,"x":346,"y":135,"id":114,"time":"2020-06-17 03:27:50.689368","s":40.4005167959},{"index":515,"x":104,"y":240,"id":116,"time":"2020-06-17 03:27:50.689368","s":40.4005167959},{"index":516,"x":247,"y":147,"id":120,"time":"2020-06-17 03:27:50.689368","s":40.4005167959},{"index":517,"x":219,"y":186,"id":121,"time":"2020-06-17 03:27:50.689368","s":40.4005167959},{"index":518,"x":522,"y":665,"id":123,"time":"2020-06-17 03:27:50.689368","s":40.4005167959},{"index":519,"x":965,"y":577,"id":124,"time":"2020-06-17 03:27:50.689368","s":40.4005167959},{"index":520,"x":342,"y":267,"id":125,"time":"2020-06-17 03:27:50.689368","s":40.4005167959},{"index":521,"x":76,"y":331,"id":3,"time":"2020-06-17 03:27:51.451642","s":41.1627906977},{"index":522,"x":709,"y":147,"id":4,"time":"2020-06-17 03:27:51.451642","s":41.1627906977},{"index":523,"x":821,"y":197,"id":96,"time":"2020-06-17 03:27:51.451642","s":41.1627906977},{"index":524,"x":270,"y":207,"id":99,"time":"2020-06-17 03:27:51.451642","s":41.1627906977},{"index":525,"x":640,"y":107,"id":108,"time":"2020-06-17 03:27:51.451642","s":41.1627906977},{"index":526,"x":99,"y":234,"id":116,"time":"2020-06-17 03:27:51.451642","s":41.1627906977},{"index":527,"x":520,"y":663,"id":123,"time":"2020-06-17 03:27:51.451642","s":41.1627906977},{"index":528,"x":898,"y":523,"id":124,"time":"2020-06-17 03:27:51.451642","s":41.1627906977},{"index":529,"x":76,"y":330,"id":3,"time":"2020-06-17 03:27:52.213916","s":41.9250645995},{"index":530,"x":716,"y":132,"id":4,"time":"2020-06-17 03:27:52.213916","s":41.9250645995},{"index":531,"x":819,"y":198,"id":96,"time":"2020-06-17 03:27:52.213916","s":41.9250645995},{"index":532,"x":259,"y":222,"id":99,"time":"2020-06-17 
03:27:52.213916","s":41.9250645995},{"index":533,"x":639,"y":115,"id":108,"time":"2020-06-17 03:27:52.213916","s":41.9250645995},{"index":534,"x":348,"y":135,"id":114,"time":"2020-06-17 03:27:52.213916","s":41.9250645995},{"index":535,"x":91,"y":233,"id":116,"time":"2020-06-17 03:27:52.213916","s":41.9250645995},{"index":536,"x":515,"y":653,"id":123,"time":"2020-06-17 03:27:52.213916","s":41.9250645995},{"index":537,"x":738,"y":418,"id":124,"time":"2020-06-17 03:27:52.213916","s":41.9250645995},{"index":538,"x":340,"y":270,"id":125,"time":"2020-06-17 03:27:52.213916","s":41.9250645995},{"index":539,"x":61,"y":346,"id":3,"time":"2020-06-17 03:27:52.976190","s":42.6873385013},{"index":540,"x":729,"y":141,"id":4,"time":"2020-06-17 03:27:52.976190","s":42.6873385013},{"index":541,"x":819,"y":198,"id":96,"time":"2020-06-17 03:27:52.976190","s":42.6873385013},{"index":542,"x":293,"y":235,"id":99,"time":"2020-06-17 03:27:52.976190","s":42.6873385013},{"index":543,"x":646,"y":117,"id":108,"time":"2020-06-17 03:27:52.976190","s":42.6873385013},{"index":544,"x":347,"y":135,"id":114,"time":"2020-06-17 03:27:52.976190","s":42.6873385013},{"index":545,"x":92,"y":234,"id":116,"time":"2020-06-17 03:27:52.976190","s":42.6873385013},{"index":546,"x":224,"y":187,"id":121,"time":"2020-06-17 03:27:52.976190","s":42.6873385013},{"index":547,"x":511,"y":650,"id":123,"time":"2020-06-17 03:27:52.976190","s":42.6873385013},{"index":548,"x":588,"y":288,"id":124,"time":"2020-06-17 03:27:52.976190","s":42.6873385013},{"index":549,"x":342,"y":269,"id":125,"time":"2020-06-17 03:27:52.976190","s":42.6873385013},{"index":550,"x":72,"y":349,"id":3,"time":"2020-06-17 03:27:53.738463","s":43.4496124031},{"index":551,"x":713,"y":147,"id":4,"time":"2020-06-17 03:27:53.738463","s":43.4496124031},{"index":552,"x":819,"y":197,"id":96,"time":"2020-06-17 03:27:53.738463","s":43.4496124031},{"index":553,"x":631,"y":111,"id":108,"time":"2020-06-17 03:27:53.738463","s":43.4496124031},{"index":554,"x":346,"y":134,"id":114,"time":"2020-06-17 03:27:53.738463","s":43.4496124031},{"index":555,"x":92,"y":235,"id":116,"time":"2020-06-17 03:27:53.738463","s":43.4496124031},{"index":556,"x":506,"y":647,"id":123,"time":"2020-06-17 03:27:53.738463","s":43.4496124031},{"index":557,"x":457,"y":264,"id":124,"time":"2020-06-17 03:27:53.738463","s":43.4496124031},{"index":558,"x":340,"y":266,"id":125,"time":"2020-06-17 03:27:53.738463","s":43.4496124031},{"index":559,"x":78,"y":372,"id":3,"time":"2020-06-17 03:27:54.500737","s":44.2118863049},{"index":560,"x":708,"y":143,"id":4,"time":"2020-06-17 03:27:54.500737","s":44.2118863049},{"index":561,"x":821,"y":198,"id":96,"time":"2020-06-17 03:27:54.500737","s":44.2118863049},{"index":562,"x":286,"y":233,"id":99,"time":"2020-06-17 03:27:54.500737","s":44.2118863049},{"index":563,"x":627,"y":109,"id":108,"time":"2020-06-17 03:27:54.500737","s":44.2118863049},{"index":564,"x":347,"y":113,"id":114,"time":"2020-06-17 03:27:54.500737","s":44.2118863049},{"index":565,"x":98,"y":241,"id":116,"time":"2020-06-17 03:27:54.500737","s":44.2118863049},{"index":566,"x":513,"y":651,"id":123,"time":"2020-06-17 03:27:54.500737","s":44.2118863049},{"index":567,"x":406,"y":248,"id":124,"time":"2020-06-17 03:27:54.500737","s":44.2118863049},{"index":568,"x":256,"y":114,"id":130,"time":"2020-06-17 03:27:54.500737","s":44.2118863049},{"index":569,"x":82,"y":364,"id":3,"time":"2020-06-17 03:27:55.263011","s":44.9741602067},{"index":570,"x":719,"y":142,"id":4,"time":"2020-06-17 
03:27:55.263011","s":44.9741602067},{"index":571,"x":821,"y":197,"id":96,"time":"2020-06-17 03:27:55.263011","s":44.9741602067},{"index":572,"x":286,"y":213,"id":99,"time":"2020-06-17 03:27:55.263011","s":44.9741602067},{"index":573,"x":629,"y":110,"id":108,"time":"2020-06-17 03:27:55.263011","s":44.9741602067},{"index":574,"x":110,"y":250,"id":116,"time":"2020-06-17 03:27:55.263011","s":44.9741602067},{"index":575,"x":524,"y":654,"id":123,"time":"2020-06-17 03:27:55.263011","s":44.9741602067},{"index":576,"x":380,"y":236,"id":124,"time":"2020-06-17 03:27:55.263011","s":44.9741602067},{"index":577,"x":253,"y":112,"id":130,"time":"2020-06-17 03:27:55.263011","s":44.9741602067},{"index":578,"x":81,"y":348,"id":3,"time":"2020-06-17 03:27:56.025285","s":45.7364341085},{"index":579,"x":820,"y":198,"id":96,"time":"2020-06-17 03:27:56.025285","s":45.7364341085},{"index":580,"x":260,"y":211,"id":99,"time":"2020-06-17 03:27:56.025285","s":45.7364341085},{"index":581,"x":637,"y":112,"id":108,"time":"2020-06-17 03:27:56.025285","s":45.7364341085},{"index":582,"x":348,"y":111,"id":114,"time":"2020-06-17 03:27:56.025285","s":45.7364341085},{"index":583,"x":120,"y":244,"id":116,"time":"2020-06-17 03:27:56.025285","s":45.7364341085},{"index":584,"x":544,"y":647,"id":123,"time":"2020-06-17 03:27:56.025285","s":45.7364341085},{"index":585,"x":392,"y":244,"id":124,"time":"2020-06-17 03:27:56.025285","s":45.7364341085},{"index":586,"x":74,"y":379,"id":3,"time":"2020-06-17 03:27:56.787559","s":46.4987080103},{"index":587,"x":805,"y":174,"id":4,"time":"2020-06-17 03:27:56.787559","s":46.4987080103},{"index":588,"x":259,"y":213,"id":99,"time":"2020-06-17 03:27:56.787559","s":46.4987080103},{"index":589,"x":638,"y":111,"id":108,"time":"2020-06-17 03:27:56.787559","s":46.4987080103},{"index":590,"x":347,"y":130,"id":114,"time":"2020-06-17 03:27:56.787559","s":46.4987080103},{"index":591,"x":106,"y":246,"id":116,"time":"2020-06-17 03:27:56.787559","s":46.4987080103},{"index":592,"x":547,"y":634,"id":123,"time":"2020-06-17 03:27:56.787559","s":46.4987080103},{"index":593,"x":440,"y":272,"id":124,"time":"2020-06-17 03:27:56.787559","s":46.4987080103},{"index":594,"x":343,"y":267,"id":125,"time":"2020-06-17 03:27:56.787559","s":46.4987080103},{"index":595,"x":139,"y":429,"id":3,"time":"2020-06-17 03:27:57.549833","s":47.2609819121},{"index":596,"x":854,"y":219,"id":4,"time":"2020-06-17 03:27:57.549833","s":47.2609819121},{"index":597,"x":257,"y":216,"id":99,"time":"2020-06-17 03:27:57.549833","s":47.2609819121},{"index":598,"x":629,"y":110,"id":108,"time":"2020-06-17 03:27:57.549833","s":47.2609819121},{"index":599,"x":347,"y":135,"id":114,"time":"2020-06-17 03:27:57.549833","s":47.2609819121},{"index":600,"x":549,"y":616,"id":123,"time":"2020-06-17 03:27:57.549833","s":47.2609819121},{"index":601,"x":322,"y":276,"id":125,"time":"2020-06-17 03:27:57.549833","s":47.2609819121},{"index":602,"x":707,"y":131,"id":135,"time":"2020-06-17 03:27:57.549833","s":47.2609819121},{"index":603,"x":179,"y":448,"id":3,"time":"2020-06-17 03:27:58.312107","s":48.023255814},{"index":604,"x":915,"y":292,"id":4,"time":"2020-06-17 03:27:58.312107","s":48.023255814},{"index":605,"x":811,"y":195,"id":96,"time":"2020-06-17 03:27:58.312107","s":48.023255814},{"index":606,"x":260,"y":201,"id":99,"time":"2020-06-17 03:27:58.312107","s":48.023255814},{"index":607,"x":639,"y":112,"id":108,"time":"2020-06-17 03:27:58.312107","s":48.023255814},{"index":608,"x":539,"y":613,"id":123,"time":"2020-06-17 
03:27:58.312107","s":48.023255814},{"index":609,"x":451,"y":347,"id":124,"time":"2020-06-17 03:27:58.312107","s":48.023255814},{"index":610,"x":319,"y":289,"id":125,"time":"2020-06-17 03:27:58.312107","s":48.023255814},{"index":611,"x":94,"y":234,"id":137,"time":"2020-06-17 03:27:58.312107","s":48.023255814},{"index":612,"x":249,"y":473,"id":3,"time":"2020-06-17 03:27:59.074381","s":48.7855297158},{"index":613,"x":908,"y":342,"id":4,"time":"2020-06-17 03:27:59.074381","s":48.7855297158},{"index":614,"x":793,"y":188,"id":96,"time":"2020-06-17 03:27:59.074381","s":48.7855297158},{"index":615,"x":243,"y":211,"id":99,"time":"2020-06-17 03:27:59.074381","s":48.7855297158},{"index":616,"x":640,"y":111,"id":108,"time":"2020-06-17 03:27:59.074381","s":48.7855297158},{"index":617,"x":523,"y":614,"id":123,"time":"2020-06-17 03:27:59.074381","s":48.7855297158},{"index":618,"x":469,"y":296,"id":124,"time":"2020-06-17 03:27:59.074381","s":48.7855297158},{"index":619,"x":322,"y":286,"id":125,"time":"2020-06-17 03:27:59.074381","s":48.7855297158},{"index":620,"x":106,"y":245,"id":137,"time":"2020-06-17 03:27:59.074381","s":48.7855297158},{"index":621,"x":718,"y":152,"id":141,"time":"2020-06-17 03:27:59.074381","s":48.7855297158},{"index":622,"x":348,"y":428,"id":3,"time":"2020-06-17 03:27:59.836655","s":49.5478036176},{"index":623,"x":791,"y":374,"id":4,"time":"2020-06-17 03:27:59.836655","s":49.5478036176},{"index":624,"x":789,"y":183,"id":96,"time":"2020-06-17 03:27:59.836655","s":49.5478036176},{"index":625,"x":255,"y":183,"id":99,"time":"2020-06-17 03:27:59.836655","s":49.5478036176},{"index":626,"x":635,"y":112,"id":108,"time":"2020-06-17 03:27:59.836655","s":49.5478036176},{"index":627,"x":348,"y":133,"id":114,"time":"2020-06-17 03:27:59.836655","s":49.5478036176},{"index":628,"x":545,"y":620,"id":123,"time":"2020-06-17 03:27:59.836655","s":49.5478036176},{"index":629,"x":512,"y":280,"id":124,"time":"2020-06-17 03:27:59.836655","s":49.5478036176},{"index":630,"x":103,"y":243,"id":137,"time":"2020-06-17 03:27:59.836655","s":49.5478036176},{"index":631,"x":254,"y":97,"id":143,"time":"2020-06-17 03:27:59.836655","s":49.5478036176},{"index":632,"x":374,"y":433,"id":3,"time":"2020-06-17 03:28:00.598929","s":50.3100775194},{"index":633,"x":626,"y":300,"id":4,"time":"2020-06-17 03:28:00.598929","s":50.3100775194},{"index":634,"x":254,"y":209,"id":99,"time":"2020-06-17 03:28:00.598929","s":50.3100775194},{"index":635,"x":665,"y":107,"id":108,"time":"2020-06-17 03:28:00.598929","s":50.3100775194},{"index":636,"x":348,"y":134,"id":114,"time":"2020-06-17 03:28:00.598929","s":50.3100775194},{"index":637,"x":539,"y":623,"id":123,"time":"2020-06-17 03:28:00.598929","s":50.3100775194},{"index":638,"x":492,"y":283,"id":124,"time":"2020-06-17 03:28:00.598929","s":50.3100775194},{"index":639,"x":107,"y":246,"id":137,"time":"2020-06-17 03:28:00.598929","s":50.3100775194},{"index":640,"x":256,"y":87,"id":143,"time":"2020-06-17 03:28:00.598929","s":50.3100775194},{"index":641,"x":371,"y":427,"id":3,"time":"2020-06-17 03:28:01.361202","s":51.0723514212},{"index":642,"x":584,"y":233,"id":4,"time":"2020-06-17 03:28:01.361202","s":51.0723514212},{"index":643,"x":274,"y":206,"id":99,"time":"2020-06-17 03:28:01.361202","s":51.0723514212},{"index":644,"x":663,"y":114,"id":108,"time":"2020-06-17 03:28:01.361202","s":51.0723514212},{"index":645,"x":350,"y":132,"id":114,"time":"2020-06-17 03:28:01.361202","s":51.0723514212},{"index":646,"x":522,"y":619,"id":123,"time":"2020-06-17 
03:28:01.361202","s":51.0723514212},{"index":647,"x":487,"y":284,"id":124,"time":"2020-06-17 03:28:01.361202","s":51.0723514212},{"index":648,"x":95,"y":240,"id":137,"time":"2020-06-17 03:28:01.361202","s":51.0723514212},{"index":649,"x":254,"y":84,"id":143,"time":"2020-06-17 03:28:01.361202","s":51.0723514212},{"index":650,"x":371,"y":422,"id":3,"time":"2020-06-17 03:28:02.123476","s":51.834625323},{"index":651,"x":488,"y":153,"id":4,"time":"2020-06-17 03:28:02.123476","s":51.834625323},{"index":652,"x":280,"y":211,"id":99,"time":"2020-06-17 03:28:02.123476","s":51.834625323},{"index":653,"x":659,"y":112,"id":108,"time":"2020-06-17 03:28:02.123476","s":51.834625323},{"index":654,"x":521,"y":622,"id":123,"time":"2020-06-17 03:28:02.123476","s":51.834625323},{"index":655,"x":483,"y":281,"id":124,"time":"2020-06-17 03:28:02.123476","s":51.834625323},{"index":656,"x":87,"y":252,"id":137,"time":"2020-06-17 03:28:02.123476","s":51.834625323},{"index":657,"x":772,"y":145,"id":150,"time":"2020-06-17 03:28:02.123476","s":51.834625323},{"index":658,"x":367,"y":423,"id":3,"time":"2020-06-17 03:28:02.885750","s":52.5968992248},{"index":659,"x":433,"y":171,"id":4,"time":"2020-06-17 03:28:02.885750","s":52.5968992248},{"index":660,"x":290,"y":184,"id":99,"time":"2020-06-17 03:28:02.885750","s":52.5968992248},{"index":661,"x":659,"y":113,"id":108,"time":"2020-06-17 03:28:02.885750","s":52.5968992248},{"index":662,"x":350,"y":132,"id":114,"time":"2020-06-17 03:28:02.885750","s":52.5968992248},{"index":663,"x":517,"y":621,"id":123,"time":"2020-06-17 03:28:02.885750","s":52.5968992248},{"index":664,"x":490,"y":283,"id":124,"time":"2020-06-17 03:28:02.885750","s":52.5968992248},{"index":665,"x":95,"y":252,"id":137,"time":"2020-06-17 03:28:02.885750","s":52.5968992248},{"index":666,"x":253,"y":99,"id":143,"time":"2020-06-17 03:28:02.885750","s":52.5968992248},{"index":667,"x":765,"y":143,"id":150,"time":"2020-06-17 03:28:02.885750","s":52.5968992248},{"index":668,"x":359,"y":421,"id":3,"time":"2020-06-17 03:28:03.648024","s":53.3591731266},{"index":669,"x":404,"y":174,"id":4,"time":"2020-06-17 03:28:03.648024","s":53.3591731266},{"index":670,"x":284,"y":197,"id":99,"time":"2020-06-17 03:28:03.648024","s":53.3591731266},{"index":671,"x":657,"y":113,"id":108,"time":"2020-06-17 03:28:03.648024","s":53.3591731266},{"index":672,"x":490,"y":631,"id":123,"time":"2020-06-17 03:28:03.648024","s":53.3591731266},{"index":673,"x":495,"y":282,"id":124,"time":"2020-06-17 03:28:03.648024","s":53.3591731266},{"index":674,"x":257,"y":110,"id":143,"time":"2020-06-17 03:28:03.648024","s":53.3591731266},{"index":675,"x":1132,"y":577,"id":152,"time":"2020-06-17 03:28:03.648024","s":53.3591731266},{"index":676,"x":372,"y":429,"id":3,"time":"2020-06-17 03:28:04.410298","s":54.1214470284},{"index":677,"x":383,"y":157,"id":4,"time":"2020-06-17 03:28:04.410298","s":54.1214470284},{"index":678,"x":311,"y":201,"id":99,"time":"2020-06-17 03:28:04.410298","s":54.1214470284},{"index":679,"x":658,"y":114,"id":108,"time":"2020-06-17 03:28:04.410298","s":54.1214470284},{"index":680,"x":492,"y":286,"id":124,"time":"2020-06-17 03:28:04.410298","s":54.1214470284},{"index":681,"x":123,"y":250,"id":137,"time":"2020-06-17 03:28:04.410298","s":54.1214470284},{"index":682,"x":269,"y":139,"id":143,"time":"2020-06-17 03:28:04.410298","s":54.1214470284},{"index":683,"x":1024,"y":543,"id":152,"time":"2020-06-17 03:28:04.410298","s":54.1214470284},{"index":684,"x":237,"y":203,"id":155,"time":"2020-06-17 
03:28:04.410298","s":54.1214470284},{"index":685,"x":368,"y":438,"id":3,"time":"2020-06-17 03:28:05.172572","s":54.8837209302},{"index":686,"x":382,"y":110,"id":4,"time":"2020-06-17 03:28:05.172572","s":54.8837209302},{"index":687,"x":365,"y":209,"id":99,"time":"2020-06-17 03:28:05.172572","s":54.8837209302},{"index":688,"x":660,"y":114,"id":108,"time":"2020-06-17 03:28:05.172572","s":54.8837209302},{"index":689,"x":490,"y":287,"id":124,"time":"2020-06-17 03:28:05.172572","s":54.8837209302},{"index":690,"x":124,"y":251,"id":137,"time":"2020-06-17 03:28:05.172572","s":54.8837209302},{"index":691,"x":280,"y":187,"id":143,"time":"2020-06-17 03:28:05.172572","s":54.8837209302},{"index":692,"x":771,"y":141,"id":150,"time":"2020-06-17 03:28:05.172572","s":54.8837209302},{"index":693,"x":1014,"y":527,"id":152,"time":"2020-06-17 03:28:05.172572","s":54.8837209302},{"index":694,"x":621,"y":100,"id":157,"time":"2020-06-17 03:28:05.172572","s":54.8837209302},{"index":695,"x":580,"y":664,"id":159,"time":"2020-06-17 03:28:05.172572","s":54.8837209302},{"index":696,"x":337,"y":469,"id":3,"time":"2020-06-17 03:28:05.934846","s":55.645994832},{"index":697,"x":385,"y":251,"id":99,"time":"2020-06-17 03:28:05.934846","s":55.645994832},{"index":698,"x":658,"y":114,"id":108,"time":"2020-06-17 03:28:05.934846","s":55.645994832},{"index":699,"x":490,"y":361,"id":124,"time":"2020-06-17 03:28:05.934846","s":55.645994832},{"index":700,"x":115,"y":252,"id":137,"time":"2020-06-17 03:28:05.934846","s":55.645994832},{"index":701,"x":300,"y":176,"id":143,"time":"2020-06-17 03:28:05.934846","s":55.645994832},{"index":702,"x":1011,"y":555,"id":152,"time":"2020-06-17 03:28:05.934846","s":55.645994832},{"index":703,"x":240,"y":167,"id":155,"time":"2020-06-17 03:28:05.934846","s":55.645994832},{"index":704,"x":620,"y":101,"id":157,"time":"2020-06-17 03:28:05.934846","s":55.645994832},{"index":705,"x":625,"y":582,"id":159,"time":"2020-06-17 03:28:05.934846","s":55.645994832},{"index":706,"x":283,"y":503,"id":3,"time":"2020-06-17 03:28:06.697120","s":56.4082687339},{"index":707,"x":640,"y":108,"id":108,"time":"2020-06-17 03:28:06.697120","s":56.4082687339},{"index":708,"x":417,"y":333,"id":124,"time":"2020-06-17 03:28:06.697120","s":56.4082687339},{"index":709,"x":105,"y":254,"id":137,"time":"2020-06-17 03:28:06.697120","s":56.4082687339},{"index":710,"x":314,"y":242,"id":143,"time":"2020-06-17 03:28:06.697120","s":56.4082687339},{"index":711,"x":763,"y":144,"id":150,"time":"2020-06-17 03:28:06.697120","s":56.4082687339},{"index":712,"x":968,"y":615,"id":152,"time":"2020-06-17 03:28:06.697120","s":56.4082687339},{"index":713,"x":724,"y":540,"id":159,"time":"2020-06-17 03:28:06.697120","s":56.4082687339},{"index":714,"x":437,"y":177,"id":162,"time":"2020-06-17 03:28:06.697120","s":56.4082687339},{"index":715,"x":253,"y":511,"id":3,"time":"2020-06-17 03:28:07.459394","s":57.1705426357},{"index":716,"x":472,"y":242,"id":99,"time":"2020-06-17 03:28:07.459394","s":57.1705426357},{"index":717,"x":634,"y":107,"id":108,"time":"2020-06-17 03:28:07.459394","s":57.1705426357},{"index":718,"x":370,"y":351,"id":124,"time":"2020-06-17 03:28:07.459394","s":57.1705426357},{"index":719,"x":107,"y":251,"id":137,"time":"2020-06-17 03:28:07.459394","s":57.1705426357},{"index":720,"x":316,"y":175,"id":143,"time":"2020-06-17 03:28:07.459394","s":57.1705426357},{"index":721,"x":764,"y":146,"id":150,"time":"2020-06-17 03:28:07.459394","s":57.1705426357},{"index":722,"x":239,"y":152,"id":155,"time":"2020-06-17 
03:28:07.459394","s":57.1705426357},{"index":723,"x":722,"y":535,"id":159,"time":"2020-06-17 03:28:07.459394","s":57.1705426357},{"index":724,"x":712,"y":134,"id":165,"time":"2020-06-17 03:28:07.459394","s":57.1705426357},{"index":725,"x":272,"y":528,"id":3,"time":"2020-06-17 03:28:08.221668","s":57.9328165375},{"index":726,"x":537,"y":251,"id":99,"time":"2020-06-17 03:28:08.221668","s":57.9328165375},{"index":727,"x":639,"y":106,"id":108,"time":"2020-06-17 03:28:08.221668","s":57.9328165375},{"index":728,"x":361,"y":401,"id":124,"time":"2020-06-17 03:28:08.221668","s":57.9328165375},{"index":729,"x":315,"y":155,"id":143,"time":"2020-06-17 03:28:08.221668","s":57.9328165375},{"index":730,"x":764,"y":150,"id":150,"time":"2020-06-17 03:28:08.221668","s":57.9328165375},{"index":731,"x":762,"y":530,"id":159,"time":"2020-06-17 03:28:08.221668","s":57.9328165375},{"index":732,"x":288,"y":555,"id":3,"time":"2020-06-17 03:28:08.983941","s":58.6950904393},{"index":733,"x":584,"y":274,"id":99,"time":"2020-06-17 03:28:08.983941","s":58.6950904393},{"index":734,"x":609,"y":101,"id":108,"time":"2020-06-17 03:28:08.983941","s":58.6950904393},{"index":735,"x":365,"y":412,"id":124,"time":"2020-06-17 03:28:08.983941","s":58.6950904393},{"index":736,"x":108,"y":251,"id":137,"time":"2020-06-17 03:28:08.983941","s":58.6950904393},{"index":737,"x":314,"y":147,"id":143,"time":"2020-06-17 03:28:08.983941","s":58.6950904393},{"index":738,"x":764,"y":152,"id":150,"time":"2020-06-17 03:28:08.983941","s":58.6950904393},{"index":739,"x":780,"y":517,"id":159,"time":"2020-06-17 03:28:08.983941","s":58.6950904393},{"index":740,"x":694,"y":434,"id":168,"time":"2020-06-17 03:28:08.983941","s":58.6950904393},{"index":741,"x":709,"y":147,"id":169,"time":"2020-06-17 03:28:08.983941","s":58.6950904393}]}
...\ No newline at end of file ...\ No newline at end of file
1 +{"schema":{"fields":[{"name":"index","type":"string"},{"name":"total","type":"string"},{"name":"now","type":"string"},{"name":"time","type":"string"},{"name":"s","type":"string"}],"primaryKey":["index"],"pandas_version":"0.20.0"},"data":[]}
...\ No newline at end of file ...\ No newline at end of file
1 +{"schema":{"fields":[{"name":"index","type":"string"},{"name":"x","type":"string"},{"name":"y","type":"string"},{"name":"id","type":"string"},{"name":"time","type":"string"},{"name":"s","type":"string"}],"primaryKey":["index"],"pandas_version":"0.20.0"},"data":[]}
...\ No newline at end of file ...\ No newline at end of file
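The two JSON stubs above are pandas "table"-orient exports (note the embedded `schema` block with `pandas_version`), matching the layout of the large tracking file earlier: one record per detection with `index, x, y, id, time, s` fields. A minimal read-back sketch, assuming one of these files is saved locally as `track_points.json` (the filename is illustrative, not from this commit):

```
import pandas as pd

# orient="table" consumes the embedded JSON Table Schema, restoring
# the column names and the "index" primary key automatically.
points = pd.read_json("track_points.json", orient="table")

# Regroup the flat record list into one trajectory per tracked id.
for obj_id, track in points.groupby("id"):
    print(obj_id, len(track), "points")
```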
1 +Keras==2.3.1
2 +tensorflow-gpu==1.13.1
3 +numpy==1.15.0
4 +opencv-python==4.2.0.34
5 +scikit-learn==0.23.1
6 +scipy==1.4.1
7 +Pillow
8 +torch==1.4.0
9 +torchvision==0.5.0
(Two files in this commit are too large to display in the diff view.)
1 +import cv2
2 +import os
3 +import moviepy.editor as mp
4 +
5 +video = "00046"
6 +# input frame directory
7 +im_dir = '/home/cai/Desktop/yolo_dataset/t1_video/t1_video_'+video+'/'
8 +
9 +# output video path
10 +video_dir = '/home/cai/Desktop/yolo_dataset/t1_video/test_video/det_t1_video_'+video+'_test_q.avi'
11 +# frames per second of the output video
12 +fps = 50
13 +# number of frames to write
14 +num = 310
15 +# frame size (width, height); must match the input images
16 +img_size = (1920,1080)
17 +fourcc = cv2.VideoWriter_fourcc('M','J','P','G')
18 +# cv2.VideoWriter_fourcc is the OpenCV 3.x API (OpenCV 2 used cv2.cv.CV_FOURCC)
19 +
20 +videoWriter = cv2.VideoWriter(video_dir, fourcc, fps, img_size)
21 +for i in range(0,num):
22 + #im_name = os.path.join(im_dir,'frame-' + str(i) + '.png')
23 + im_name = os.path.join(im_dir,'t1_video_'+video+'_' + "%05d" % i + '.jpg')
24 + frame = cv2.imread(im_name)
25 +    if frame is None:  # skip unreadable/missing frames instead of crashing the writer
26 +        continue
27 + videoWriter.write(frame)
28 +    print(im_name)
29 +videoWriter.release()
30 +print('finish')
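One caveat for the frame-to-video script above: `cv2.VideoWriter.write` silently drops frames whose dimensions differ from the `img_size` passed to the constructor, so a cheap guard before the write call avoids empty output videos. A sketch of that guard (to sit just before `videoWriter.write(frame)`):

```
# frame.shape is (height, width, channels); img_size is (width, height)
h, w = frame.shape[:2]
if (w, h) != img_size:
    frame = cv2.resize(frame, img_size)
```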
1 +import argparse
2 +import tensorflow as tf
3 +import tensorflow.contrib.slim as slim
4 +
5 +
6 +def _batch_norm_fn(x, scope=None):
7 + if scope is None:
8 + scope = tf.get_variable_scope().name + "/bn"
9 + return slim.batch_norm(x, scope=scope)
10 +
11 +
12 +def create_link(
13 + incoming, network_builder, scope, nonlinearity=tf.nn.elu,
14 + weights_initializer=tf.truncated_normal_initializer(stddev=1e-3),
15 + regularizer=None, is_first=False, summarize_activations=True):
16 + if is_first:
17 + network = incoming
18 + else:
19 + network = _batch_norm_fn(incoming, scope=scope + "/bn")
20 + network = nonlinearity(network)
21 + if summarize_activations:
22 + tf.summary.histogram(scope+"/activations", network)
23 +
24 + pre_block_network = network
25 + post_block_network = network_builder(pre_block_network, scope)
26 +
27 + incoming_dim = pre_block_network.get_shape().as_list()[-1]
28 + outgoing_dim = post_block_network.get_shape().as_list()[-1]
29 + if incoming_dim != outgoing_dim:
30 + assert outgoing_dim == 2 * incoming_dim, \
31 + "%d != %d" % (outgoing_dim, 2 * incoming)
32 + projection = slim.conv2d(
33 + incoming, outgoing_dim, 1, 2, padding="SAME", activation_fn=None,
34 + scope=scope+"/projection", weights_initializer=weights_initializer,
35 + biases_initializer=None, weights_regularizer=regularizer)
36 + network = projection + post_block_network
37 + else:
38 + network = incoming + post_block_network
39 + return network
40 +
41 +
42 +def create_inner_block(
43 + incoming, scope, nonlinearity=tf.nn.elu,
44 + weights_initializer=tf.truncated_normal_initializer(1e-3),
45 + bias_initializer=tf.zeros_initializer(), regularizer=None,
46 + increase_dim=False, summarize_activations=True):
47 + n = incoming.get_shape().as_list()[-1]
48 + stride = 1
49 + if increase_dim:
50 + n *= 2
51 + stride = 2
52 +
53 + incoming = slim.conv2d(
54 + incoming, n, [3, 3], stride, activation_fn=nonlinearity, padding="SAME",
55 + normalizer_fn=_batch_norm_fn, weights_initializer=weights_initializer,
56 + biases_initializer=bias_initializer, weights_regularizer=regularizer,
57 + scope=scope + "/1")
58 + if summarize_activations:
59 + tf.summary.histogram(incoming.name + "/activations", incoming)
60 +
61 + incoming = slim.dropout(incoming, keep_prob=0.6)
62 +
63 + incoming = slim.conv2d(
64 + incoming, n, [3, 3], 1, activation_fn=None, padding="SAME",
65 + normalizer_fn=None, weights_initializer=weights_initializer,
66 + biases_initializer=bias_initializer, weights_regularizer=regularizer,
67 + scope=scope + "/2")
68 + return incoming
69 +
70 +
71 +def residual_block(incoming, scope, nonlinearity=tf.nn.elu,
72 +                   weights_initializer=tf.truncated_normal_initializer(stddev=1e-3),
73 + bias_initializer=tf.zeros_initializer(), regularizer=None,
74 + increase_dim=False, is_first=False,
75 + summarize_activations=True):
76 +
77 + def network_builder(x, s):
78 + return create_inner_block(
79 + x, s, nonlinearity, weights_initializer, bias_initializer,
80 + regularizer, increase_dim, summarize_activations)
81 +
82 + return create_link(
83 + incoming, network_builder, scope, nonlinearity, weights_initializer,
84 + regularizer, is_first, summarize_activations)
85 +
86 +
87 +def _create_network(incoming, reuse=None, weight_decay=1e-8):
88 + nonlinearity = tf.nn.elu
89 + conv_weight_init = tf.truncated_normal_initializer(stddev=1e-3)
90 + conv_bias_init = tf.zeros_initializer()
91 + conv_regularizer = slim.l2_regularizer(weight_decay)
92 + fc_weight_init = tf.truncated_normal_initializer(stddev=1e-3)
93 + fc_bias_init = tf.zeros_initializer()
94 + fc_regularizer = slim.l2_regularizer(weight_decay)
95 +
96 + def batch_norm_fn(x):
97 + return slim.batch_norm(x, scope=tf.get_variable_scope().name + "/bn")
98 +
99 + network = incoming
100 + network = slim.conv2d(
101 + network, 32, [3, 3], stride=1, activation_fn=nonlinearity,
102 + padding="SAME", normalizer_fn=batch_norm_fn, scope="conv1_1",
103 + weights_initializer=conv_weight_init, biases_initializer=conv_bias_init,
104 + weights_regularizer=conv_regularizer)
105 + network = slim.conv2d(
106 + network, 32, [3, 3], stride=1, activation_fn=nonlinearity,
107 + padding="SAME", normalizer_fn=batch_norm_fn, scope="conv1_2",
108 + weights_initializer=conv_weight_init, biases_initializer=conv_bias_init,
109 + weights_regularizer=conv_regularizer)
110 +
111 + # NOTE(nwojke): This is missing a padding="SAME" to match the CNN
112 + # architecture in Table 1 of the paper. Information on how this affects
113 + # performance on MOT 16 training sequences can be found in
114 + # issue 10 https://github.com/nwojke/deep_sort/issues/10
115 + network = slim.max_pool2d(network, [3, 3], [2, 2], scope="pool1")
116 +
117 + network = residual_block(
118 + network, "conv2_1", nonlinearity, conv_weight_init, conv_bias_init,
119 + conv_regularizer, increase_dim=False, is_first=True)
120 + network = residual_block(
121 + network, "conv2_3", nonlinearity, conv_weight_init, conv_bias_init,
122 + conv_regularizer, increase_dim=False)
123 +
124 + network = residual_block(
125 + network, "conv3_1", nonlinearity, conv_weight_init, conv_bias_init,
126 + conv_regularizer, increase_dim=True)
127 + network = residual_block(
128 + network, "conv3_3", nonlinearity, conv_weight_init, conv_bias_init,
129 + conv_regularizer, increase_dim=False)
130 +
131 + network = residual_block(
132 + network, "conv4_1", nonlinearity, conv_weight_init, conv_bias_init,
133 + conv_regularizer, increase_dim=True)
134 + network = residual_block(
135 + network, "conv4_3", nonlinearity, conv_weight_init, conv_bias_init,
136 + conv_regularizer, increase_dim=False)
137 +
138 + feature_dim = network.get_shape().as_list()[-1]
139 + network = slim.flatten(network)
140 +
141 + network = slim.dropout(network, keep_prob=0.6)
142 + network = slim.fully_connected(
143 + network, feature_dim, activation_fn=nonlinearity,
144 + normalizer_fn=batch_norm_fn, weights_regularizer=fc_regularizer,
145 + scope="fc1", weights_initializer=fc_weight_init,
146 + biases_initializer=fc_bias_init)
147 +
148 + features = network
149 +
150 + # Features in rows, normalize axis 1.
151 + features = slim.batch_norm(features, scope="ball", reuse=reuse)
152 + feature_norm = tf.sqrt(
153 + tf.constant(1e-8, tf.float32) +
154 + tf.reduce_sum(tf.square(features), [1], keepdims=True))
155 + features = features / feature_norm
156 + return features, None
157 +
158 +
159 +def _network_factory(weight_decay=1e-8):
160 +
161 + def factory_fn(image, reuse):
162 + with slim.arg_scope([slim.batch_norm, slim.dropout],
163 + is_training=False):
164 + with slim.arg_scope([slim.conv2d, slim.fully_connected,
165 + slim.batch_norm, slim.layer_norm],
166 + reuse=reuse):
167 + features, logits = _create_network(
168 + image, reuse=reuse, weight_decay=weight_decay)
169 + return features, logits
170 +
171 + return factory_fn
172 +
173 +
174 +def _preprocess(image):
175 + image = image[:, :, ::-1] # BGR to RGB
176 + return image
177 +
178 +
179 +def parse_args():
180 + """Parse command line arguments.
181 + """
182 + parser = argparse.ArgumentParser(description="Freeze old model")
183 + parser.add_argument(
184 + "--checkpoint_in",
185 + default="resources/networks/mars-small128.ckpt-68577",
186 + help="Path to checkpoint file")
187 + parser.add_argument(
188 + "--graphdef_out",
189 + default="resources/networks/mars-small128.pb")
190 + return parser.parse_args()
191 +
192 +
193 +def main():
194 + args = parse_args()
195 +
196 + with tf.Session(graph=tf.Graph()) as session:
197 + input_var = tf.placeholder(
198 + tf.uint8, (None, 128, 64, 3), name="images")
199 + image_var = tf.map_fn(
200 + lambda x: _preprocess(x), tf.cast(input_var, tf.float32),
201 + back_prop=False)
202 +
203 + factory_fn = _network_factory()
204 + features, _ = factory_fn(image_var, reuse=None)
205 + features = tf.identity(features, name="features")
206 +
207 +        saver = tf.train.Saver(slim.get_variables_to_restore())
208 +        saver.restore(session, args.checkpoint_in)
209 +
210 +        output_graph_def = tf.graph_util.convert_variables_to_constants(
211 +            session, tf.get_default_graph().as_graph_def(),
212 +            [features.name.split(":")[0]])
213 +        with tf.gfile.GFile(args.graphdef_out, "wb") as file_handle:
214 +            file_handle.write(output_graph_def.SerializeToString())
215 +
216 +
217 +if __name__ == "__main__":
218 +    main()
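The script above restores the cosine-metric checkpoint, names the input and output tensors `images` and `features`, and freezes the graph into a single protobuf that the feature extractor below loads. A typical invocation, assuming the script is saved as `freeze_model.py` (the filename is an assumption; the paths are the argparse defaults above):

```
$ python freeze_model.py \
    --checkpoint_in resources/networks/mars-small128.ckpt-68577 \
    --graphdef_out resources/networks/mars-small128.pb
```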
1 +import os
2 +import errno
3 +import argparse
4 +import numpy as np
5 +import cv2
6 +import tensorflow as tf
7 +
8 +
9 +def _run_in_batches(f, data_dict, out, batch_size):
10 + data_len = len(out)
11 + num_batches = int(data_len / batch_size)
12 +
13 + s, e = 0, 0
14 + for i in range(num_batches):
15 + s, e = i * batch_size, (i + 1) * batch_size
16 + batch_data_dict = {k: v[s:e] for k, v in data_dict.items()}
17 + out[s:e] = f(batch_data_dict)
18 + if e < len(out):
19 + batch_data_dict = {k: v[e:] for k, v in data_dict.items()}
20 + out[e:] = f(batch_data_dict)
21 +
22 +
23 +def extract_image_patch(image, bbox, patch_shape):
24 + """Extract image patch from bounding box.
25 + Parameters
26 + ----------
27 + image : ndarray
28 + The full image.
29 + bbox : array_like
30 + The bounding box in format (x, y, width, height).
31 + patch_shape : Optional[array_like]
32 + This parameter can be used to enforce a desired patch shape
33 + (height, width). First, the `bbox` is adapted to the aspect ratio
34 + of the patch shape, then it is clipped at the image boundaries.
35 + If None, the shape is computed from :arg:`bbox`.
36 + Returns
37 + -------
38 + ndarray | NoneType
39 + An image patch showing the :arg:`bbox`, optionally reshaped to
40 + :arg:`patch_shape`.
41 + Returns None if the bounding box is empty or fully outside of the image
42 + boundaries.
43 + """
44 + bbox = np.array(bbox)
45 + if patch_shape is not None:
46 + # correct aspect ratio to patch shape
47 + target_aspect = float(patch_shape[1]) / patch_shape[0]
48 + new_width = target_aspect * bbox[3]
49 + bbox[0] -= (new_width - bbox[2]) / 2
50 + bbox[2] = new_width
51 +
52 + # convert to top left, bottom right
53 + bbox[2:] += bbox[:2]
54 + bbox = bbox.astype(np.int)
55 +
56 + # clip at image boundaries
57 + bbox[:2] = np.maximum(0, bbox[:2])
58 + bbox[2:] = np.minimum(np.asarray(image.shape[:2][::-1]) - 1, bbox[2:])
59 + if np.any(bbox[:2] >= bbox[2:]):
60 + return None
61 + sx, sy, ex, ey = bbox
62 + image = image[sy:ey, sx:ex]
63 + image = cv2.resize(image, tuple(patch_shape[::-1]))
64 + return image
65 +
66 +
67 +class ImageEncoder(object):
68 +
69 + def __init__(self, checkpoint_filename, input_name="images",
70 + output_name="features"):
71 + self.session = tf.Session()
72 + with tf.gfile.GFile(checkpoint_filename, "rb") as file_handle:
73 + graph_def = tf.GraphDef()
74 + graph_def.ParseFromString(file_handle.read())
75 + tf.import_graph_def(graph_def, name="net")
76 + self.input_var = tf.get_default_graph().get_tensor_by_name(
77 + "net/%s:0" % input_name)
78 + self.output_var = tf.get_default_graph().get_tensor_by_name(
79 + "net/%s:0" % output_name)
80 +
81 + assert len(self.output_var.get_shape()) == 2
82 + assert len(self.input_var.get_shape()) == 4
83 + self.feature_dim = self.output_var.get_shape().as_list()[-1]
84 + self.image_shape = self.input_var.get_shape().as_list()[1:]
85 +
86 + def __call__(self, data_x, batch_size=32):
87 + out = np.zeros((len(data_x), self.feature_dim), np.float32)
88 + _run_in_batches(
89 + lambda x: self.session.run(self.output_var, feed_dict=x),
90 + {self.input_var: data_x}, out, batch_size)
91 + return out
92 +
93 +
94 +def create_box_encoder(model_filename, input_name="images",
95 + output_name="features", batch_size=32):
96 + image_encoder = ImageEncoder(model_filename, input_name, output_name)
97 + image_shape = image_encoder.image_shape
98 +
99 + def encoder(image, boxes):
100 + image_patches = []
101 + for box in boxes:
102 + patch = extract_image_patch(image, box, image_shape[:2])
103 + if patch is None:
104 + print("WARNING: Failed to extract image patch: %s." % str(box))
105 + patch = np.random.uniform(
106 + 0., 255., image_shape).astype(np.uint8)
107 + image_patches.append(patch)
108 + image_patches = np.asarray(image_patches)
109 + return image_encoder(image_patches, batch_size)
110 +
111 + return encoder
112 +
113 +
114 +def generate_detections(encoder, mot_dir, output_dir, detection_dir=None):
115 + """Generate detections with features.
116 + Parameters
117 + ----------
118 + encoder : Callable[image, ndarray] -> ndarray
119 + The encoder function takes as input a BGR color image and a matrix of
120 + bounding boxes in format `(x, y, w, h)` and returns a matrix of
121 + corresponding feature vectors.
122 + mot_dir : str
123 + Path to the MOTChallenge directory (can be either train or test).
124 + output_dir
125 + Path to the output directory. Will be created if it does not exist.
126 + detection_dir
127 + Path to custom detections. The directory structure should be the default
128 + MOTChallenge structure: `[sequence]/det/det.txt`. If None, uses the
129 + standard MOTChallenge detections.
130 + """
131 + if detection_dir is None:
132 + detection_dir = mot_dir
133 + try:
134 + os.makedirs(output_dir)
135 + except OSError as exception:
136 + if exception.errno == errno.EEXIST and os.path.isdir(output_dir):
137 + pass
138 + else:
139 + raise ValueError(
140 + "Failed to created output directory '%s'" % output_dir)
141 +
142 + for sequence in os.listdir(mot_dir):
143 + print("Processing %s" % sequence)
144 + sequence_dir = os.path.join(mot_dir, sequence)
145 +
146 + image_dir = os.path.join(sequence_dir, "img1")
147 + image_filenames = {
148 + int(os.path.splitext(f)[0]): os.path.join(image_dir, f)
149 + for f in os.listdir(image_dir)}
150 +
151 + detection_file = os.path.join(
152 + detection_dir, sequence, "det/det.txt")
153 + detections_in = np.loadtxt(detection_file, delimiter=',')
154 + detections_out = []
155 +
156 + frame_indices = detections_in[:, 0].astype(np.int)
157 + min_frame_idx = frame_indices.astype(np.int).min()
158 + max_frame_idx = frame_indices.astype(np.int).max()
159 + for frame_idx in range(min_frame_idx, max_frame_idx + 1):
160 + print("Frame %05d/%05d" % (frame_idx, max_frame_idx))
161 + mask = frame_indices == frame_idx
162 + rows = detections_in[mask]
163 +
164 + if frame_idx not in image_filenames:
165 + print("WARNING could not find image for frame %d" % frame_idx)
166 + continue
167 + bgr_image = cv2.imread(
168 + image_filenames[frame_idx], cv2.IMREAD_COLOR)
169 + features = encoder(bgr_image, rows[:, 2:6].copy())
170 + detections_out += [np.r_[(row, feature)] for row, feature
171 + in zip(rows, features)]
172 +
173 + output_filename = os.path.join(output_dir, "%s.npy" % sequence)
174 + np.save(
175 + output_filename, np.asarray(detections_out), allow_pickle=False)
176 +
177 +
178 +def parse_args():
179 + """Parse command line arguments.
180 + """
181 + parser = argparse.ArgumentParser(description="Re-ID feature extractor")
182 + parser.add_argument(
183 + "--model",
184 + default="resources/networks/mars-small128.pb",
185 + help="Path to freezed inference graph protobuf.")
186 + parser.add_argument(
187 + "--mot_dir", help="Path to MOTChallenge directory (train or test)",
188 + required=True)
189 + parser.add_argument(
190 + "--detection_dir", help="Path to custom detections. Defaults to "
191 + "standard MOT detections Directory structure should be the default "
192 + "MOTChallenge structure: [sequence]/det/det.txt", default=None)
193 + parser.add_argument(
194 + "--output_dir", help="Output directory. Will be created if it does not"
195 + " exist.", default="detections")
196 + return parser.parse_args()
197 +
198 +
199 +def main():
200 + args = parse_args()
201 + encoder = create_box_encoder(args.model, batch_size=32)
202 + generate_detections(encoder, args.mot_dir, args.output_dir,
203 + args.detection_dir)
204 +
205 +
206 +if __name__ == "__main__":
207 + main()
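Based on the argument parser above, a typical run looks like the following, assuming the script is saved as `generate_detections.py` and a MOTChallenge-style directory is available locally (both names are illustrative):

```
$ python generate_detections.py \
    --model resources/networks/mars-small128.pb \
    --mot_dir ./MOT16/train \
    --output_dir ./detections/MOT16_train
```

Each sequence then yields a `[sequence].npy` file whose rows hold the original detection fields followed by the appearance feature vector.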
1 +import cv2
2 +
3 +image_folder = './mask_face'
4 +video_name = './cut_test.m4v'
5 +
6 +vc = cv2.VideoCapture(video_name)
7 +c = 1
8 +if vc.isOpened():
9 +    rval, frame = vc.read()
10 +else:
11 +    rval = False
12 +while rval:
13 +    rval, frame = vc.read()
14 +    if rval:  # the final read returns (False, None); guard before writing
15 +        cv2.imwrite(image_folder + '/IMG_' + str(c) + '.jpg', frame)
16 +        c = c + 1
17 +vc.release()
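Note that `cv2.imwrite` will not create missing directories, so `./mask_face` must exist before the frame splitter above runs. A one-line guard covers it:

```
import os

os.makedirs(image_folder, exist_ok=True)  # create ./mask_face if it is absent
```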
1 +import colorsys
2 +
3 +import numpy as np
4 +from keras import backend as K
5 +from keras.models import load_model
6 +
7 +from yolo4.model import yolo_eval, Mish
8 +from yolo4.utils import letterbox_image
9 +import os
10 +from keras.utils import multi_gpu_model
11 +
12 +class YOLO(object):
13 + def __init__(self):
14 + self.model_path = os.getcwd() + '/../deep_sort_yolov4/model_data/yolo4_weight.h5'
15 + self.anchors_path = os.getcwd() + '/../deep_sort_yolov4/model_data/yolo_anchors.txt'
16 + self.classes_path = os.getcwd() + '/../deep_sort_yolov4/model_data/coco_classes.txt'
17 + self.gpu_num = 1
18 + self.score = 0.5
19 + self.iou = 0.5
20 + self.class_names = self._get_class()
21 + self.anchors = self._get_anchors()
22 + self.sess = K.get_session()
23 + self.model_image_size = (608, 608) # fixed size or (None, None)
24 + self.is_fixed_size = self.model_image_size != (None, None)
25 + self.boxes, self.scores, self.classes = self.generate()
26 +
27 + def _get_class(self):
28 + classes_path = os.path.expanduser(self.classes_path)
29 + with open(classes_path) as f:
30 + class_names = f.readlines()
31 + class_names = [c.strip() for c in class_names]
32 + return class_names
33 +
34 + def _get_anchors(self):
35 + anchors_path = os.path.expanduser(self.anchors_path)
36 + with open(anchors_path) as f:
37 + anchors = f.readline()
38 + anchors = [float(x) for x in anchors.split(',')]
39 + anchors = np.array(anchors).reshape(-1, 2)
40 + return anchors
41 +
42 + def generate(self):
43 + model_path = os.path.expanduser(self.model_path)
44 + assert model_path.endswith('.h5'), 'Keras model or weights must be a .h5 file.'
45 +
46 + self.yolo_model = load_model(model_path, custom_objects={'Mish': Mish}, compile=False)
47 +
48 + print('{} model, anchors, and classes loaded.'.format(model_path))
49 +
50 + # Generate colors for drawing bounding boxes.
51 + hsv_tuples = [(x / len(self.class_names), 1., 1.)
52 + for x in range(len(self.class_names))]
53 + self.colors = list(map(lambda x: colorsys.hsv_to_rgb(*x), hsv_tuples))
54 + self.colors = list(
55 + map(lambda x: (int(x[0] * 255), int(x[1] * 255), int(x[2] * 255)),
56 + self.colors))
57 + np.random.seed(10101) # Fixed seed for consistent colors across runs.
58 + np.random.shuffle(self.colors) # Shuffle colors to decorrelate adjacent classes.
59 + np.random.seed(None) # Reset seed to default.
60 +
61 + # Generate output tensor targets for filtered bounding boxes.
62 + self.input_image_shape = K.placeholder(shape=(2, ))
63 + if self.gpu_num>=2:
64 + self.yolo_model = multi_gpu_model(self.yolo_model, gpus=self.gpu_num)
65 + boxes, scores, classes = yolo_eval(self.yolo_model.output, self.anchors,
66 + len(self.class_names), self.input_image_shape,
67 + score_threshold=self.score, iou_threshold=self.iou)
68 + return boxes, scores, classes
69 +
70 + def detect_image(self, image):
71 +
72 + if self.is_fixed_size:
73 + assert self.model_image_size[0]%32 == 0, 'Multiples of 32 required'
74 + assert self.model_image_size[1]%32 == 0, 'Multiples of 32 required'
75 + boxed_image = letterbox_image(image, tuple(reversed(self.model_image_size)))
76 + else:
77 + new_image_size = (image.width - (image.width % 32),
78 + image.height - (image.height % 32))
79 + boxed_image = letterbox_image(image, new_image_size)
80 + image_data = np.array(boxed_image, dtype='float32')
81 +
82 + # print(image_data.shape)
83 + image_data /= 255.
84 + image_data = np.expand_dims(image_data, 0) # Add batch dimension.
85 +
86 + out_boxes, out_scores, out_classes = self.sess.run(
87 + [self.boxes, self.scores, self.classes],
88 + feed_dict={
89 + self.yolo_model.input: image_data,
90 + self.input_image_shape: [image.size[1], image.size[0]],
91 + K.learning_phase(): 0
92 + })
93 + return_boxes = []
94 + return_scores = []
95 + return_class_names = []
96 + for i, c in reversed(list(enumerate(out_classes))):
97 + predicted_class = self.class_names[c]
98 + if predicted_class != 'person': # Modify to detect other classes.
99 + continue
100 + box = out_boxes[i]
101 + score = out_scores[i]
102 + x = int(box[1])
103 + y = int(box[0])
104 + w = int(box[3] - box[1])
105 + h = int(box[2] - box[0])
106 + if x < 0:
107 + w = w + x
108 + x = 0
109 + if y < 0:
110 + h = h + y
111 + y = 0
112 + return_boxes.append([x, y, w, h])
113 + return_scores.append(score)
114 + return_class_names.append(predicted_class)
115 +
116 + return return_boxes, return_scores, return_class_names
117 +
118 + def close_session(self):
119 + self.sess.close()
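
*A hedged usage sketch for the wrapper above, assuming the weight/anchor/class files referenced in `__init__` are in place (`test.jpg` is a placeholder):*

```python
from PIL import Image

yolo = YOLO()                                 # builds the session and output tensors
boxes, scores, names = yolo.detect_image(Image.open('test.jpg'))
for box, score, name in zip(boxes, scores, names):
    print(name, score, box)                   # box is [x, y, w, h] in pixels
yolo.close_session()
```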
1 +"""YOLO_v4 Model Defined in Keras."""
2 +
3 +from functools import wraps
4 +
5 +import numpy as np
6 +import tensorflow as tf
7 +from keras import backend as K
8 +from keras.engine.base_layer import Layer
9 +from keras.layers import Conv2D, Add, ZeroPadding2D, UpSampling2D, Concatenate, MaxPooling2D
10 +from keras.layers.advanced_activations import LeakyReLU
11 +from keras.layers.normalization import BatchNormalization
12 +from keras.models import Model
13 +from keras.regularizers import l2
14 +
15 +from yolo4.utils import compose
16 +
17 +class Mish(Layer):
18 +    """Mish activation layer: x * tanh(softplus(x))."""
19 +
20 + def __init__(self, **kwargs):
21 + super(Mish, self).__init__(**kwargs)
22 + self.supports_masking = True
23 +
24 + def call(self, inputs):
25 + return inputs * K.tanh(K.softplus(inputs))
26 +
27 + def get_config(self):
28 + config = super(Mish, self).get_config()
29 + return config
30 +
31 + def compute_output_shape(self, input_shape):
32 + return input_shape
33 +
34 +
35 +@wraps(Conv2D)
36 +def DarknetConv2D(*args, **kwargs):
37 + """Wrapper to set Darknet parameters for Convolution2D."""
38 + darknet_conv_kwargs = {'kernel_regularizer': l2(5e-4)}
39 + darknet_conv_kwargs['padding'] = 'valid' if kwargs.get('strides')==(2,2) else 'same'
40 + darknet_conv_kwargs.update(kwargs)
41 + return Conv2D(*args, **darknet_conv_kwargs)
42 +
43 +def DarknetConv2D_BN_Leaky(*args, **kwargs):
44 + """Darknet Convolution2D followed by BatchNormalization and LeakyReLU."""
45 + no_bias_kwargs = {'use_bias': False}
46 + no_bias_kwargs.update(kwargs)
47 + return compose(
48 + DarknetConv2D(*args, **no_bias_kwargs),
49 + BatchNormalization(),
50 + LeakyReLU(alpha=0.1))
51 +
52 +def DarknetConv2D_BN_Mish(*args, **kwargs):
53 +    """Darknet Convolution2D followed by BatchNormalization and Mish."""
54 + no_bias_kwargs = {'use_bias': False}
55 + no_bias_kwargs.update(kwargs)
56 + return compose(
57 + DarknetConv2D(*args, **no_bias_kwargs),
58 + BatchNormalization(),
59 + Mish())
60 +
61 +def resblock_body(x, num_filters, num_blocks, all_narrow=True):
62 + '''A series of resblocks starting with a downsampling Convolution2D'''
63 + # Darknet uses left and top padding instead of 'same' mode
64 + preconv1 = ZeroPadding2D(((1,0),(1,0)))(x)
65 + preconv1 = DarknetConv2D_BN_Mish(num_filters, (3,3), strides=(2,2))(preconv1)
66 + shortconv = DarknetConv2D_BN_Mish(num_filters//2 if all_narrow else num_filters, (1,1))(preconv1)
67 + mainconv = DarknetConv2D_BN_Mish(num_filters//2 if all_narrow else num_filters, (1,1))(preconv1)
68 + for i in range(num_blocks):
69 + y = compose(
70 + DarknetConv2D_BN_Mish(num_filters//2, (1,1)),
71 + DarknetConv2D_BN_Mish(num_filters//2 if all_narrow else num_filters, (3,3)))(mainconv)
72 + mainconv = Add()([mainconv,y])
73 + postconv = DarknetConv2D_BN_Mish(num_filters//2 if all_narrow else num_filters, (1,1))(mainconv)
74 + route = Concatenate()([postconv, shortconv])
75 + return DarknetConv2D_BN_Mish(num_filters, (1,1))(route)
76 +
77 +def darknet_body(x):
78 +    '''CSPDarknet53 body: Darknet with cross-stage partial connections'''
79 + x = DarknetConv2D_BN_Mish(32, (3,3))(x)
80 + x = resblock_body(x, 64, 1, False)
81 + x = resblock_body(x, 128, 2)
82 + x = resblock_body(x, 256, 8)
83 + x = resblock_body(x, 512, 8)
84 + x = resblock_body(x, 1024, 4)
85 + return x
86 +
87 +def make_last_layers(x, num_filters, out_filters):
88 + '''6 Conv2D_BN_Leaky layers followed by a Conv2D_linear layer'''
89 + x = compose(
90 + DarknetConv2D_BN_Leaky(num_filters, (1,1)),
91 + DarknetConv2D_BN_Leaky(num_filters*2, (3,3)),
92 + DarknetConv2D_BN_Leaky(num_filters, (1,1)),
93 + DarknetConv2D_BN_Leaky(num_filters*2, (3,3)),
94 + DarknetConv2D_BN_Leaky(num_filters, (1,1)))(x)
95 + y = compose(
96 + DarknetConv2D_BN_Leaky(num_filters*2, (3,3)),
97 + DarknetConv2D(out_filters, (1,1)))(x)
98 + return x, y
99 +
100 +
101 +def yolo4_body(inputs, num_anchors, num_classes):
102 + """Create YOLO_V4 model CNN body in Keras."""
103 + darknet = Model(inputs, darknet_body(inputs))
104 +
105 + #19x19 head
106 + y19 = DarknetConv2D_BN_Leaky(512, (1,1))(darknet.output)
107 + y19 = DarknetConv2D_BN_Leaky(1024, (3,3))(y19)
108 + y19 = DarknetConv2D_BN_Leaky(512, (1,1))(y19)
109 + maxpool1 = MaxPooling2D(pool_size=(13,13), strides=(1,1), padding='same')(y19)
110 + maxpool2 = MaxPooling2D(pool_size=(9,9), strides=(1,1), padding='same')(y19)
111 + maxpool3 = MaxPooling2D(pool_size=(5,5), strides=(1,1), padding='same')(y19)
112 + y19 = Concatenate()([maxpool1, maxpool2, maxpool3, y19])
113 + y19 = DarknetConv2D_BN_Leaky(512, (1,1))(y19)
114 + y19 = DarknetConv2D_BN_Leaky(1024, (3,3))(y19)
115 + y19 = DarknetConv2D_BN_Leaky(512, (1,1))(y19)
116 +
117 + y19_upsample = compose(DarknetConv2D_BN_Leaky(256, (1,1)), UpSampling2D(2))(y19)
118 +
119 + #38x38 head
120 + y38 = DarknetConv2D_BN_Leaky(256, (1,1))(darknet.layers[204].output)
121 + y38 = Concatenate()([y38, y19_upsample])
122 + y38 = DarknetConv2D_BN_Leaky(256, (1,1))(y38)
123 + y38 = DarknetConv2D_BN_Leaky(512, (3,3))(y38)
124 + y38 = DarknetConv2D_BN_Leaky(256, (1,1))(y38)
125 + y38 = DarknetConv2D_BN_Leaky(512, (3,3))(y38)
126 + y38 = DarknetConv2D_BN_Leaky(256, (1,1))(y38)
127 +
128 + y38_upsample = compose(DarknetConv2D_BN_Leaky(128, (1,1)), UpSampling2D(2))(y38)
129 +
130 + #76x76 head
131 + y76 = DarknetConv2D_BN_Leaky(128, (1,1))(darknet.layers[131].output)
132 + y76 = Concatenate()([y76, y38_upsample])
133 + y76 = DarknetConv2D_BN_Leaky(128, (1,1))(y76)
134 + y76 = DarknetConv2D_BN_Leaky(256, (3,3))(y76)
135 + y76 = DarknetConv2D_BN_Leaky(128, (1,1))(y76)
136 + y76 = DarknetConv2D_BN_Leaky(256, (3,3))(y76)
137 + y76 = DarknetConv2D_BN_Leaky(128, (1,1))(y76)
138 +
139 + #76x76 output
140 + y76_output = DarknetConv2D_BN_Leaky(256, (3,3))(y76)
141 + y76_output = DarknetConv2D(num_anchors*(num_classes+5), (1,1))(y76_output)
142 +
143 + #38x38 output
144 + y76_downsample = ZeroPadding2D(((1,0),(1,0)))(y76)
145 + y76_downsample = DarknetConv2D_BN_Leaky(256, (3,3), strides=(2,2))(y76_downsample)
146 + y38 = Concatenate()([y76_downsample, y38])
147 + y38 = DarknetConv2D_BN_Leaky(256, (1,1))(y38)
148 + y38 = DarknetConv2D_BN_Leaky(512, (3,3))(y38)
149 + y38 = DarknetConv2D_BN_Leaky(256, (1,1))(y38)
150 + y38 = DarknetConv2D_BN_Leaky(512, (3,3))(y38)
151 + y38 = DarknetConv2D_BN_Leaky(256, (1,1))(y38)
152 +
153 + y38_output = DarknetConv2D_BN_Leaky(512, (3,3))(y38)
154 + y38_output = DarknetConv2D(num_anchors*(num_classes+5), (1,1))(y38_output)
155 +
156 + #19x19 output
157 + y38_downsample = ZeroPadding2D(((1,0),(1,0)))(y38)
158 + y38_downsample = DarknetConv2D_BN_Leaky(512, (3,3), strides=(2,2))(y38_downsample)
159 + y19 = Concatenate()([y38_downsample, y19])
160 + y19 = DarknetConv2D_BN_Leaky(512, (1,1))(y19)
161 + y19 = DarknetConv2D_BN_Leaky(1024, (3,3))(y19)
162 + y19 = DarknetConv2D_BN_Leaky(512, (1,1))(y19)
163 + y19 = DarknetConv2D_BN_Leaky(1024, (3,3))(y19)
164 + y19 = DarknetConv2D_BN_Leaky(512, (1,1))(y19)
165 +
166 + y19_output = DarknetConv2D_BN_Leaky(1024, (3,3))(y19)
167 + y19_output = DarknetConv2D(num_anchors*(num_classes+5), (1,1))(y19_output)
168 +
169 + yolo4_model = Model(inputs, [y19_output, y38_output, y76_output])
170 +
171 + return yolo4_model
172 +
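
*A sketch of instantiating the body above; 608x608 matches `model_image_size` in yolo.py, and 3 anchors per scale / 80 classes are the defaults implied by the bundled COCO anchor and class files:*

```python
from keras.layers import Input

inputs = Input(shape=(608, 608, 3))
model = yolo4_body(inputs, num_anchors=3, num_classes=80)
model.summary()  # three heads on 19x19, 38x38 and 76x76 grids, 3*(80+5) filters each
```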
173 +
174 +def yolo_head(feats, anchors, num_classes, input_shape, calc_loss=False):
175 + """Convert final layer features to bounding box parameters."""
176 + num_anchors = len(anchors)
177 + # Reshape to batch, height, width, num_anchors, box_params.
178 + anchors_tensor = K.reshape(K.constant(anchors), [1, 1, 1, num_anchors, 2])
179 +
180 + grid_shape = K.shape(feats)[1:3] # height, width
181 + grid_y = K.tile(K.reshape(K.arange(0, stop=grid_shape[0]), [-1, 1, 1, 1]),
182 + [1, grid_shape[1], 1, 1])
183 + grid_x = K.tile(K.reshape(K.arange(0, stop=grid_shape[1]), [1, -1, 1, 1]),
184 + [grid_shape[0], 1, 1, 1])
185 + grid = K.concatenate([grid_x, grid_y])
186 + grid = K.cast(grid, K.dtype(feats))
187 +
188 + feats = K.reshape(
189 + feats, [-1, grid_shape[0], grid_shape[1], num_anchors, num_classes + 5])
190 +
191 +    # Adjust predictions to each spatial grid point and anchor size.
192 + box_xy = (K.sigmoid(feats[..., :2]) + grid) / K.cast(grid_shape[...,::-1], K.dtype(feats))
193 + box_wh = K.exp(feats[..., 2:4]) * anchors_tensor / K.cast(input_shape[...,::-1], K.dtype(feats))
194 + box_confidence = K.sigmoid(feats[..., 4:5])
195 + box_class_probs = K.sigmoid(feats[..., 5:])
196 +
197 +    if calc_loss:
198 + return grid, feats, box_xy, box_wh
199 + return box_xy, box_wh, box_confidence, box_class_probs
200 +
201 +
202 +def yolo_correct_boxes(box_xy, box_wh, input_shape, image_shape):
203 + '''Get corrected boxes'''
204 + box_yx = box_xy[..., ::-1]
205 + box_hw = box_wh[..., ::-1]
206 + input_shape = K.cast(input_shape, K.dtype(box_yx))
207 + image_shape = K.cast(image_shape, K.dtype(box_yx))
208 + new_shape = K.round(image_shape * K.min(input_shape/image_shape))
209 + offset = (input_shape-new_shape)/2./input_shape
210 + scale = input_shape/new_shape
211 + box_yx = (box_yx - offset) * scale
212 + box_hw *= scale
213 +
214 + box_mins = box_yx - (box_hw / 2.)
215 + box_maxes = box_yx + (box_hw / 2.)
216 + boxes = K.concatenate([
217 + box_mins[..., 0:1], # y_min
218 + box_mins[..., 1:2], # x_min
219 + box_maxes[..., 0:1], # y_max
220 + box_maxes[..., 1:2] # x_max
221 + ])
222 +
223 + # Scale boxes back to original image shape.
224 + boxes *= K.concatenate([image_shape, image_shape])
225 + return boxes
226 +
227 +
228 +def yolo_boxes_and_scores(feats, anchors, num_classes, input_shape, image_shape):
229 + '''Process Conv layer output'''
230 + box_xy, box_wh, box_confidence, box_class_probs = yolo_head(feats,
231 + anchors, num_classes, input_shape)
232 + boxes = yolo_correct_boxes(box_xy, box_wh, input_shape, image_shape)
233 + boxes = K.reshape(boxes, [-1, 4])
234 + box_scores = box_confidence * box_class_probs
235 + box_scores = K.reshape(box_scores, [-1, num_classes])
236 + return boxes, box_scores
237 +
238 +
239 +def yolo_eval(yolo_outputs,
240 + anchors,
241 + num_classes,
242 + image_shape,
243 + max_boxes=200,
244 + score_threshold=.5,
245 + iou_threshold=.5):
246 + """Evaluate YOLO model on given input and return filtered boxes."""
247 + num_layers = len(yolo_outputs)
248 + anchor_mask = [[6,7,8], [3,4,5], [0,1,2]]
249 + input_shape = K.shape(yolo_outputs[0])[1:3] * 32
250 + boxes = []
251 + box_scores = []
252 + for l in range(num_layers):
253 + _boxes, _box_scores = yolo_boxes_and_scores(yolo_outputs[l],
254 + anchors[anchor_mask[l]], num_classes, input_shape, image_shape)
255 + boxes.append(_boxes)
256 + box_scores.append(_box_scores)
257 + boxes = K.concatenate(boxes, axis=0)
258 + box_scores = K.concatenate(box_scores, axis=0)
259 +
260 + mask = box_scores >= score_threshold
261 + max_boxes_tensor = K.constant(max_boxes, dtype='int32')
262 + boxes_ = []
263 + scores_ = []
264 + classes_ = []
265 + for c in range(num_classes):
266 + class_boxes = tf.boolean_mask(boxes, mask[:, c])
267 + class_box_scores = tf.boolean_mask(box_scores[:, c], mask[:, c])
268 + nms_index = tf.image.non_max_suppression(
269 + class_boxes, class_box_scores, max_boxes_tensor, iou_threshold=iou_threshold)
270 + class_boxes = K.gather(class_boxes, nms_index)
271 + class_box_scores = K.gather(class_box_scores, nms_index)
272 + classes = K.ones_like(class_box_scores, 'int32') * c
273 + boxes_.append(class_boxes)
274 + scores_.append(class_box_scores)
275 + classes_.append(classes)
276 + boxes_ = K.concatenate(boxes_, axis=0)
277 + scores_ = K.concatenate(scores_, axis=0)
278 + classes_ = K.concatenate(classes_, axis=0)
279 +
280 + return boxes_, scores_, classes_
281 +
282 +
283 +def preprocess_true_boxes(true_boxes, input_shape, anchors, num_classes):
284 + '''Preprocess true boxes to training input format
285 +
286 + Parameters
287 + ----------
288 + true_boxes: array, shape=(m, T, 5)
289 + Absolute x_min, y_min, x_max, y_max, class_id relative to input_shape.
290 + input_shape: array-like, hw, multiples of 32
291 + anchors: array, shape=(N, 2), wh
292 + num_classes: integer
293 +
294 + Returns
295 + -------
296 +    y_true: list of array, shape like yolo_outputs, xywh are relative values
297 +
298 + '''
299 + assert (true_boxes[..., 4]<num_classes).all(), 'class id must be less than num_classes'
300 + num_layers = len(anchors)//3 # default setting
301 + anchor_mask = [[6,7,8], [3,4,5], [0,1,2]] if num_layers==3 else [[3,4,5], [1,2,3]]
302 +
303 + true_boxes = np.array(true_boxes, dtype='float32')
304 + input_shape = np.array(input_shape, dtype='int32')
305 + boxes_xy = (true_boxes[..., 0:2] + true_boxes[..., 2:4]) // 2
306 + boxes_wh = true_boxes[..., 2:4] - true_boxes[..., 0:2]
307 + true_boxes[..., 0:2] = boxes_xy/input_shape[::-1]
308 + true_boxes[..., 2:4] = boxes_wh/input_shape[::-1]
309 +
310 + m = true_boxes.shape[0]
311 + grid_shapes = [input_shape//{0:32, 1:16, 2:8}[l] for l in range(num_layers)]
312 + y_true = [np.zeros((m,grid_shapes[l][0],grid_shapes[l][1],len(anchor_mask[l]),5+num_classes),
313 + dtype='float32') for l in range(num_layers)]
314 +
315 + # Expand dim to apply broadcasting.
316 + anchors = np.expand_dims(anchors, 0)
317 + anchor_maxes = anchors / 2.
318 + anchor_mins = -anchor_maxes
319 + valid_mask = boxes_wh[..., 0]>0
320 +
321 + for b in range(m):
322 + # Discard zero rows.
323 + wh = boxes_wh[b, valid_mask[b]]
324 + if len(wh)==0: continue
325 + # Expand dim to apply broadcasting.
326 + wh = np.expand_dims(wh, -2)
327 + box_maxes = wh / 2.
328 + box_mins = -box_maxes
329 +
330 + intersect_mins = np.maximum(box_mins, anchor_mins)
331 + intersect_maxes = np.minimum(box_maxes, anchor_maxes)
332 + intersect_wh = np.maximum(intersect_maxes - intersect_mins, 0.)
333 + intersect_area = intersect_wh[..., 0] * intersect_wh[..., 1]
334 + box_area = wh[..., 0] * wh[..., 1]
335 + anchor_area = anchors[..., 0] * anchors[..., 1]
336 + iou = intersect_area / (box_area + anchor_area - intersect_area)
337 +
338 + # Find best anchor for each true box
339 + best_anchor = np.argmax(iou, axis=-1)
340 +
341 + for t, n in enumerate(best_anchor):
342 + for l in range(num_layers):
343 + if n in anchor_mask[l]:
344 + i = np.floor(true_boxes[b,t,0]*grid_shapes[l][1]).astype('int32')
345 + j = np.floor(true_boxes[b,t,1]*grid_shapes[l][0]).astype('int32')
346 + k = anchor_mask[l].index(n)
347 + c = true_boxes[b,t, 4].astype('int32')
348 + y_true[l][b, j, i, k, 0:4] = true_boxes[b,t, 0:4]
349 + y_true[l][b, j, i, k, 4] = 1
350 + y_true[l][b, j, i, k, 5+c] = 1
351 +
352 + return y_true
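
*A worked sketch of the target layout produced above, for one image at 416x416 input with the standard 9 COCO anchors (the box values are illustrative):*

```python
import numpy as np

anchors = np.array([[10,13],[16,30],[33,23],[30,61],[62,45],
                    [59,119],[116,90],[156,198],[373,326]])
true_boxes = np.array([[[24, 30, 200, 260, 0]]], dtype='float32')  # (m, T, 5)
y_true = preprocess_true_boxes(true_boxes, (416, 416), anchors, num_classes=80)
print([y.shape for y in y_true])
# [(1, 13, 13, 3, 85), (1, 26, 26, 3, 85), (1, 52, 52, 3, 85)]
```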
353 +
354 +
355 +def softmax_focal_loss(y_true, y_pred, gamma=2.0, alpha=0.25):
356 + """
357 + Compute softmax focal loss.
358 + Reference Paper:
359 + "Focal Loss for Dense Object Detection"
360 + https://arxiv.org/abs/1708.02002
361 +
362 + # Arguments
363 + y_true: Ground truth targets,
364 + tensor of shape (?, num_boxes, num_classes).
365 + y_pred: Predicted logits,
366 + tensor of shape (?, num_boxes, num_classes).
367 + gamma: exponent of the modulating factor (1 - p_t) ^ gamma.
368 + alpha: optional alpha weighting factor to balance positives vs negatives.
369 +
370 + # Returns
371 + softmax_focal_loss: Softmax focal loss, tensor of shape (?, num_boxes).
372 + """
373 +
374 + # Scale predictions so that the class probas of each sample sum to 1
375 + #y_pred /= K.sum(y_pred, axis=-1, keepdims=True)
376 +
377 + # Clip the prediction value to prevent NaN's and Inf's
378 + #epsilon = K.epsilon()
379 + #y_pred = K.clip(y_pred, epsilon, 1. - epsilon)
380 + y_pred = tf.nn.softmax(y_pred)
381 + y_pred = tf.maximum(tf.minimum(y_pred, 1 - 1e-15), 1e-15)
382 +
383 + # Calculate Cross Entropy
384 + cross_entropy = -y_true * tf.math.log(y_pred)
385 +
386 + # Calculate Focal Loss
387 + softmax_focal_loss = alpha * tf.pow(1 - y_pred, gamma) * cross_entropy
388 +
389 + return softmax_focal_loss
390 +
391 +
392 +def sigmoid_focal_loss(y_true, y_pred, gamma=2.0, alpha=0.25):
393 + """
394 + Compute sigmoid focal loss.
395 + Reference Paper:
396 + "Focal Loss for Dense Object Detection"
397 + https://arxiv.org/abs/1708.02002
398 +
399 + # Arguments
400 + y_true: Ground truth targets,
401 + tensor of shape (?, num_boxes, num_classes).
402 + y_pred: Predicted logits,
403 + tensor of shape (?, num_boxes, num_classes).
404 + gamma: exponent of the modulating factor (1 - p_t) ^ gamma.
405 + alpha: optional alpha weighting factor to balance positives vs negatives.
406 +
407 + # Returns
408 + sigmoid_focal_loss: Sigmoid focal loss, tensor of shape (?, num_boxes).
409 + """
410 + sigmoid_loss = K.binary_crossentropy(y_true, y_pred, from_logits=True)
411 +
412 + pred_prob = tf.sigmoid(y_pred)
413 + p_t = ((y_true * pred_prob) + ((1 - y_true) * (1 - pred_prob)))
414 + modulating_factor = tf.pow(1.0 - p_t, gamma)
415 + alpha_weight_factor = (y_true * alpha + (1 - y_true) * (1 - alpha))
416 +
417 + sigmoid_focal_loss = modulating_factor * alpha_weight_factor * sigmoid_loss
418 + #sigmoid_focal_loss = tf.reduce_sum(sigmoid_focal_loss, axis=-1)
419 +
420 + return sigmoid_focal_loss
421 +
422 +
423 +def box_iou(b1, b2):
424 + """
425 + Return iou tensor
426 +
427 + Parameters
428 + ----------
429 + b1: tensor, shape=(i1,...,iN, 4), xywh
430 + b2: tensor, shape=(j, 4), xywh
431 +
432 + Returns
433 + -------
434 + iou: tensor, shape=(i1,...,iN, j)
435 + """
436 + # Expand dim to apply broadcasting.
437 + b1 = K.expand_dims(b1, -2)
438 + b1_xy = b1[..., :2]
439 + b1_wh = b1[..., 2:4]
440 + b1_wh_half = b1_wh/2.
441 + b1_mins = b1_xy - b1_wh_half
442 + b1_maxes = b1_xy + b1_wh_half
443 +
444 + # Expand dim to apply broadcasting.
445 + b2 = K.expand_dims(b2, 0)
446 + b2_xy = b2[..., :2]
447 + b2_wh = b2[..., 2:4]
448 + b2_wh_half = b2_wh/2.
449 + b2_mins = b2_xy - b2_wh_half
450 + b2_maxes = b2_xy + b2_wh_half
451 +
452 + intersect_mins = K.maximum(b1_mins, b2_mins)
453 + intersect_maxes = K.minimum(b1_maxes, b2_maxes)
454 + intersect_wh = K.maximum(intersect_maxes - intersect_mins, 0.)
455 + intersect_area = intersect_wh[..., 0] * intersect_wh[..., 1]
456 + b1_area = b1_wh[..., 0] * b1_wh[..., 1]
457 + b2_area = b2_wh[..., 0] * b2_wh[..., 1]
458 + iou = intersect_area / (b1_area + b2_area - intersect_area)
459 +
460 + return iou
461 +
462 +
463 +def box_giou(b1, b2):
464 + """
465 + Calculate GIoU loss on anchor boxes
466 + Reference Paper:
467 + "Generalized Intersection over Union: A Metric and A Loss for Bounding Box Regression"
468 + https://arxiv.org/abs/1902.09630
469 +
470 + Parameters
471 + ----------
472 + b1: tensor, shape=(batch, feat_w, feat_h, anchor_num, 4), xywh
473 + b2: tensor, shape=(batch, feat_w, feat_h, anchor_num, 4), xywh
474 +
475 + Returns
476 + -------
477 + giou: tensor, shape=(batch, feat_w, feat_h, anchor_num, 1)
478 + """
479 + b1_xy = b1[..., :2]
480 + b1_wh = b1[..., 2:4]
481 + b1_wh_half = b1_wh/2.
482 + b1_mins = b1_xy - b1_wh_half
483 + b1_maxes = b1_xy + b1_wh_half
484 +
485 + b2_xy = b2[..., :2]
486 + b2_wh = b2[..., 2:4]
487 + b2_wh_half = b2_wh/2.
488 + b2_mins = b2_xy - b2_wh_half
489 + b2_maxes = b2_xy + b2_wh_half
490 +
491 + intersect_mins = K.maximum(b1_mins, b2_mins)
492 + intersect_maxes = K.minimum(b1_maxes, b2_maxes)
493 + intersect_wh = K.maximum(intersect_maxes - intersect_mins, 0.)
494 + intersect_area = intersect_wh[..., 0] * intersect_wh[..., 1]
495 + b1_area = b1_wh[..., 0] * b1_wh[..., 1]
496 + b2_area = b2_wh[..., 0] * b2_wh[..., 1]
497 + union_area = b1_area + b2_area - intersect_area
498 + # calculate IoU, add epsilon in denominator to avoid dividing by 0
499 + iou = intersect_area / (union_area + K.epsilon())
500 +
501 + # get enclosed area
502 + enclose_mins = K.minimum(b1_mins, b2_mins)
503 + enclose_maxes = K.maximum(b1_maxes, b2_maxes)
504 + enclose_wh = K.maximum(enclose_maxes - enclose_mins, 0.0)
505 + enclose_area = enclose_wh[..., 0] * enclose_wh[..., 1]
506 + # calculate GIoU, add epsilon in denominator to avoid dividing by 0
507 + giou = iou - 1.0 * (enclose_area - union_area) / (enclose_area + K.epsilon())
508 + giou = K.expand_dims(giou, -1)
509 +
510 + return giou
511 +
512 +
513 +def box_diou(b1, b2):
514 + """
515 + Calculate DIoU loss on anchor boxes
516 + Reference Paper:
517 + "Distance-IoU Loss: Faster and Better Learning for Bounding Box Regression"
518 + https://arxiv.org/abs/1911.08287
519 +
520 + Parameters
521 + ----------
522 + b1: tensor, shape=(batch, feat_w, feat_h, anchor_num, 4), xywh
523 + b2: tensor, shape=(batch, feat_w, feat_h, anchor_num, 4), xywh
524 +
525 + Returns
526 + -------
527 + diou: tensor, shape=(batch, feat_w, feat_h, anchor_num, 1)
528 + """
529 + b1_xy = b1[..., :2]
530 + b1_wh = b1[..., 2:4]
531 + b1_wh_half = b1_wh/2.
532 + b1_mins = b1_xy - b1_wh_half
533 + b1_maxes = b1_xy + b1_wh_half
534 +
535 + b2_xy = b2[..., :2]
536 + b2_wh = b2[..., 2:4]
537 + b2_wh_half = b2_wh/2.
538 + b2_mins = b2_xy - b2_wh_half
539 + b2_maxes = b2_xy + b2_wh_half
540 +
541 + intersect_mins = K.maximum(b1_mins, b2_mins)
542 + intersect_maxes = K.minimum(b1_maxes, b2_maxes)
543 + intersect_wh = K.maximum(intersect_maxes - intersect_mins, 0.)
544 + intersect_area = intersect_wh[..., 0] * intersect_wh[..., 1]
545 + b1_area = b1_wh[..., 0] * b1_wh[..., 1]
546 + b2_area = b2_wh[..., 0] * b2_wh[..., 1]
547 + union_area = b1_area + b2_area - intersect_area
548 + # calculate IoU, add epsilon in denominator to avoid dividing by 0
549 + iou = intersect_area / (union_area + K.epsilon())
550 +
551 + # box center distance
552 + center_distance = K.sum(K.square(b1_xy - b2_xy), axis=-1)
553 + # get enclosed area
554 + enclose_mins = K.minimum(b1_mins, b2_mins)
555 + enclose_maxes = K.maximum(b1_maxes, b2_maxes)
556 + enclose_wh = K.maximum(enclose_maxes - enclose_mins, 0.0)
557 + # get enclosed diagonal distance
558 + enclose_diagonal = K.sum(K.square(enclose_wh), axis=-1)
559 + # calculate DIoU, add epsilon in denominator to avoid dividing by 0
560 + diou = iou - 1.0 * (center_distance) / (enclose_diagonal + K.epsilon())
561 +
562 + # calculate param v and alpha to extend to CIoU
563 + #v = 4*K.square(tf.math.atan2(b1_wh[..., 0], b1_wh[..., 1]) - tf.math.atan2(b2_wh[..., 0], b2_wh[..., 1])) / (math.pi * math.pi)
564 + #alpha = v / (1.0 - iou + v)
565 + #diou = diou - alpha*v
566 +
567 + diou = K.expand_dims(diou, -1)
568 + return diou
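
*In the notation of `box_giou` and `box_diou` above, with $C$ the smallest box enclosing prediction $b$ and target $b^{gt}$, $\rho$ their center distance and $c$ the diagonal of $C$ (the `K.epsilon()` terms in code only guard against division by zero):*

$$\mathrm{GIoU} = \mathrm{IoU} - \frac{|C| - |A \cup B|}{|C|}, \qquad \mathrm{DIoU} = \mathrm{IoU} - \frac{\rho^2(b,\,b^{gt})}{c^2}$$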
569 +
570 +
571 +def _smooth_labels(y_true, label_smoothing):
572 + label_smoothing = K.constant(label_smoothing, dtype=K.floatx())
573 + return y_true * (1.0 - label_smoothing) + 0.5 * label_smoothing
574 +
575 +
576 +def yolo4_loss(args, anchors, num_classes, ignore_thresh=.5, label_smoothing=0, use_focal_loss=False, use_focal_obj_loss=False, use_softmax_loss=False, use_giou_loss=False, use_diou_loss=False):
577 + '''Return yolo4_loss tensor
578 +
579 + Parameters
580 + ----------
581 + yolo_outputs: list of tensor, the output of yolo_body or tiny_yolo_body
582 + y_true: list of array, the output of preprocess_true_boxes
583 + anchors: array, shape=(N, 2), wh
584 + num_classes: integer
585 + ignore_thresh: float, the iou threshold whether to ignore object confidence loss
586 +
587 + Returns
588 + -------
589 + loss: tensor, shape=(1,)
590 +
591 + '''
592 + num_layers = len(anchors)//3 # default setting
593 + yolo_outputs = args[:num_layers]
594 + y_true = args[num_layers:]
595 + anchor_mask = [[6,7,8], [3,4,5], [0,1,2]] if num_layers==3 else [[3,4,5], [0,1,2]]
596 + input_shape = K.cast(K.shape(yolo_outputs[0])[1:3] * 32, K.dtype(y_true[0]))
597 + grid_shapes = [K.cast(K.shape(yolo_outputs[l])[1:3], K.dtype(y_true[0])) for l in range(num_layers)]
598 + loss = 0
599 + total_location_loss = 0
600 + total_confidence_loss = 0
601 + total_class_loss = 0
602 + m = K.shape(yolo_outputs[0])[0] # batch size, tensor
603 + mf = K.cast(m, K.dtype(yolo_outputs[0]))
604 +
605 + for l in range(num_layers):
606 + object_mask = y_true[l][..., 4:5]
607 + true_class_probs = y_true[l][..., 5:]
608 + if label_smoothing:
609 + true_class_probs = _smooth_labels(true_class_probs, label_smoothing)
610 +
611 + grid, raw_pred, pred_xy, pred_wh = yolo_head(yolo_outputs[l],
612 + anchors[anchor_mask[l]], num_classes, input_shape, calc_loss=True)
613 + pred_box = K.concatenate([pred_xy, pred_wh])
614 +
615 + # Darknet raw box to calculate loss.
616 + raw_true_xy = y_true[l][..., :2]*grid_shapes[l][::-1] - grid
617 + raw_true_wh = K.log(y_true[l][..., 2:4] / anchors[anchor_mask[l]] * input_shape[::-1])
618 + raw_true_wh = K.switch(object_mask, raw_true_wh, K.zeros_like(raw_true_wh)) # avoid log(0)=-inf
619 + box_loss_scale = 2 - y_true[l][...,2:3]*y_true[l][...,3:4]
620 +
621 + # Find ignore mask, iterate over each of batch.
622 + ignore_mask = tf.TensorArray(K.dtype(y_true[0]), size=1, dynamic_size=True)
623 + object_mask_bool = K.cast(object_mask, 'bool')
624 + def loop_body(b, ignore_mask):
625 + true_box = tf.boolean_mask(y_true[l][b,...,0:4], object_mask_bool[b,...,0])
626 + iou = box_iou(pred_box[b], true_box)
627 + best_iou = K.max(iou, axis=-1)
628 + ignore_mask = ignore_mask.write(b, K.cast(best_iou<ignore_thresh, K.dtype(true_box)))
629 + return b+1, ignore_mask
630 + _, ignore_mask = tf.while_loop(lambda b,*args: b<m, loop_body, [0, ignore_mask])
631 + ignore_mask = ignore_mask.stack()
632 + ignore_mask = K.expand_dims(ignore_mask, -1)
633 +
634 + if use_focal_obj_loss:
635 + # Focal loss for objectness confidence
636 + confidence_loss = sigmoid_focal_loss(object_mask, raw_pred[...,4:5])
637 + else:
638 + confidence_loss = object_mask * K.binary_crossentropy(object_mask, raw_pred[...,4:5], from_logits=True)+ \
639 + (1-object_mask) * K.binary_crossentropy(object_mask, raw_pred[...,4:5], from_logits=True) * ignore_mask
640 +
641 + if use_focal_loss:
642 + # Focal loss for classification score
643 + if use_softmax_loss:
644 + class_loss = softmax_focal_loss(true_class_probs, raw_pred[...,5:])
645 + else:
646 + class_loss = sigmoid_focal_loss(true_class_probs, raw_pred[...,5:])
647 + else:
648 + if use_softmax_loss:
649 + # use softmax style classification output
650 + class_loss = object_mask * K.expand_dims(K.categorical_crossentropy(true_class_probs, raw_pred[...,5:], from_logits=True), axis=-1)
651 + else:
652 + # use sigmoid style classification output
653 + class_loss = object_mask * K.binary_crossentropy(true_class_probs, raw_pred[...,5:], from_logits=True)
654 +
655 +
656 + if use_giou_loss:
657 + # Calculate GIoU loss as location loss
658 + raw_true_box = y_true[l][...,0:4]
659 + giou = box_giou(pred_box, raw_true_box)
660 + giou_loss = object_mask * box_loss_scale * (1 - giou)
661 + giou_loss = K.sum(giou_loss) / mf
662 + location_loss = giou_loss
663 + elif use_diou_loss:
664 + # Calculate DIoU loss as location loss
665 + raw_true_box = y_true[l][...,0:4]
666 + diou = box_diou(pred_box, raw_true_box)
667 + diou_loss = object_mask * box_loss_scale * (1 - diou)
668 + diou_loss = K.sum(diou_loss) / mf
669 + location_loss = diou_loss
670 + else:
671 + # Standard YOLO location loss
672 + # K.binary_crossentropy is helpful to avoid exp overflow.
673 + xy_loss = object_mask * box_loss_scale * K.binary_crossentropy(raw_true_xy, raw_pred[...,0:2], from_logits=True)
674 + wh_loss = object_mask * box_loss_scale * 0.5 * K.square(raw_true_wh-raw_pred[...,2:4])
675 + xy_loss = K.sum(xy_loss) / mf
676 + wh_loss = K.sum(wh_loss) / mf
677 + location_loss = xy_loss + wh_loss
678 +
679 + confidence_loss = K.sum(confidence_loss) / mf
680 + class_loss = K.sum(class_loss) / mf
681 + loss += location_loss + confidence_loss + class_loss
682 + total_location_loss += location_loss
683 + total_confidence_loss += confidence_loss
684 + total_class_loss += class_loss
685 +
686 + # Fit for tf 2.0.0 loss shape
687 + loss = K.expand_dims(loss, axis=-1)
688 +
689 + return loss #, total_location_loss, total_confidence_loss, total_class_loss
690 +
691 +
692 +def yolo_loss(args, anchors, num_classes, ignore_thresh=.5, print_loss=False):
693 + '''Return yolo_loss tensor
694 +
695 + Parameters
696 + ----------
697 + yolo_outputs: list of tensor, the output of yolo_body or tiny_yolo_body
698 + y_true: list of array, the output of preprocess_true_boxes
699 + anchors: array, shape=(N, 2), wh
700 + num_classes: integer
701 + ignore_thresh: float, the iou threshold whether to ignore object confidence loss
702 +
703 + Returns
704 + -------
705 + loss: tensor, shape=(1,)
706 +
707 + '''
708 + num_layers = len(anchors)//3 # default setting
709 + yolo_outputs = args[:num_layers]
710 + y_true = args[num_layers:]
711 + anchor_mask = [[6,7,8], [3,4,5], [0,1,2]] if num_layers==3 else [[3,4,5], [1,2,3]]
712 + input_shape = K.cast(K.shape(yolo_outputs[0])[1:3] * 32, K.dtype(y_true[0]))
713 + grid_shapes = [K.cast(K.shape(yolo_outputs[l])[1:3], K.dtype(y_true[0])) for l in range(num_layers)]
714 + loss = 0
715 + m = K.shape(yolo_outputs[0])[0] # batch size, tensor
716 + mf = K.cast(m, K.dtype(yolo_outputs[0]))
717 +
718 + for l in range(num_layers):
719 + object_mask = y_true[l][..., 4:5]
720 + true_class_probs = y_true[l][..., 5:]
721 +
722 + grid, raw_pred, pred_xy, pred_wh = yolo_head(yolo_outputs[l],
723 + anchors[anchor_mask[l]], num_classes, input_shape, calc_loss=True)
724 + pred_box = K.concatenate([pred_xy, pred_wh])
725 +
726 + # Darknet raw box to calculate loss.
727 + raw_true_xy = y_true[l][..., :2]*grid_shapes[l][::-1] - grid
728 + raw_true_wh = K.log(y_true[l][..., 2:4] / anchors[anchor_mask[l]] * input_shape[::-1])
729 + raw_true_wh = K.switch(object_mask, raw_true_wh, K.zeros_like(raw_true_wh)) # avoid log(0)=-inf
730 + box_loss_scale = 2 - y_true[l][...,2:3]*y_true[l][...,3:4]
731 +
732 + # Find ignore mask, iterate over each of batch.
733 + ignore_mask = tf.TensorArray(K.dtype(y_true[0]), size=1, dynamic_size=True)
734 + object_mask_bool = K.cast(object_mask, 'bool')
735 + def loop_body(b, ignore_mask):
736 + true_box = tf.boolean_mask(y_true[l][b,...,0:4], object_mask_bool[b,...,0])
737 + iou = box_iou(pred_box[b], true_box)
738 + best_iou = K.max(iou, axis=-1)
739 + ignore_mask = ignore_mask.write(b, K.cast(best_iou<ignore_thresh, K.dtype(true_box)))
740 + return b+1, ignore_mask
741 +        _, ignore_mask = tf.while_loop(lambda b,*args: b<m, loop_body, [0, ignore_mask])
742 + ignore_mask = ignore_mask.stack()
743 + ignore_mask = K.expand_dims(ignore_mask, -1)
744 +
745 + # K.binary_crossentropy is helpful to avoid exp overflow.
746 + xy_loss = object_mask * box_loss_scale * K.binary_crossentropy(raw_true_xy, raw_pred[...,0:2], from_logits=True)
747 + wh_loss = object_mask * box_loss_scale * 0.5 * K.square(raw_true_wh-raw_pred[...,2:4])
748 + confidence_loss = object_mask * K.binary_crossentropy(object_mask, raw_pred[...,4:5], from_logits=True)+ \
749 + (1-object_mask) * K.binary_crossentropy(object_mask, raw_pred[...,4:5], from_logits=True) * ignore_mask
750 + class_loss = object_mask * K.binary_crossentropy(true_class_probs, raw_pred[...,5:], from_logits=True)
751 +
752 + xy_loss = K.sum(xy_loss) / mf
753 + wh_loss = K.sum(wh_loss) / mf
754 + confidence_loss = K.sum(confidence_loss) / mf
755 + class_loss = K.sum(class_loss) / mf
756 + loss += xy_loss + wh_loss + confidence_loss + class_loss
757 + if print_loss:
758 + loss = tf.Print(loss, [loss, xy_loss, wh_loss, confidence_loss, class_loss, K.sum(ignore_mask)], message='loss: ')
759 + return loss
1 +"""Miscellaneous utility functions."""
2 +
3 +from functools import reduce
4 +
5 +from PIL import Image
6 +import numpy as np
7 +from matplotlib.colors import rgb_to_hsv, hsv_to_rgb
8 +
9 +def compose(*funcs):
10 + """Compose arbitrarily many functions, evaluated left to right.
11 +
12 + Reference: https://mathieularose.com/function-composition-in-python/
13 + """
14 + # return lambda x: reduce(lambda v, f: f(v), funcs, x)
15 + if funcs:
16 + return reduce(lambda f, g: lambda *a, **kw: g(f(*a, **kw)), funcs)
17 + else:
18 + raise ValueError('Composition of empty sequence not supported.')
19 +
20 +def letterbox_image(image, size):
21 + '''resize image with unchanged aspect ratio using padding'''
22 + iw, ih = image.size
23 + w, h = size
24 + scale = min(w/iw, h/ih)
25 + nw = int(iw*scale)
26 + nh = int(ih*scale)
27 +
28 + image = image.resize((nw,nh), Image.BICUBIC)
29 + new_image = Image.new('RGB', size, (128,128,128))
30 + new_image.paste(image, ((w-nw)//2, (h-nh)//2))
31 + return new_image
32 +
33 +def rand(a=0, b=1):
34 + return np.random.rand()*(b-a) + a
35 +
36 +def get_random_data(annotation_line, input_shape, random=True, max_boxes=100, jitter=.3, hue=.1, sat=1.5, val=1.5, proc_img=True):
37 + '''random preprocessing for real-time data augmentation'''
38 + line = annotation_line.split()
39 + image = Image.open(line[0])
40 + iw, ih = image.size
41 + h, w = input_shape
42 + box = np.array([np.array(list(map(int,box.split(',')))) for box in line[1:]])
43 +
44 + if not random:
45 + # resize image
46 + scale = min(w/iw, h/ih)
47 + nw = int(iw*scale)
48 + nh = int(ih*scale)
49 + dx = (w-nw)//2
50 + dy = (h-nh)//2
51 + image_data=0
52 + if proc_img:
53 + image = image.resize((nw,nh), Image.BICUBIC)
54 + new_image = Image.new('RGB', (w,h), (128,128,128))
55 + new_image.paste(image, (dx, dy))
56 + image_data = np.array(new_image)/255.
57 +
58 + # correct boxes
59 + box_data = np.zeros((max_boxes,5))
60 + if len(box)>0:
61 + np.random.shuffle(box)
62 + if len(box)>max_boxes: box = box[:max_boxes]
63 + box[:, [0,2]] = box[:, [0,2]]*scale + dx
64 + box[:, [1,3]] = box[:, [1,3]]*scale + dy
65 + box_data[:len(box)] = box
66 +
67 + return image_data, box_data
68 +
69 + # resize image
70 + new_ar = w/h * rand(1-jitter,1+jitter)/rand(1-jitter,1+jitter)
71 + scale = rand(.25, 2)
72 + if new_ar < 1:
73 + nh = int(scale*h)
74 + nw = int(nh*new_ar)
75 + else:
76 + nw = int(scale*w)
77 + nh = int(nw/new_ar)
78 + image = image.resize((nw,nh), Image.BICUBIC)
79 +
80 + # place image
81 + dx = int(rand(0, w-nw))
82 + dy = int(rand(0, h-nh))
83 + new_image = Image.new('RGB', (w,h), (128,128,128))
84 + new_image.paste(image, (dx, dy))
85 + image = new_image
86 +
87 + # flip image or not
88 + flip = rand()<.5
89 + if flip: image = image.transpose(Image.FLIP_LEFT_RIGHT)
90 +
91 + # distort image
92 + hue = rand(-hue, hue)
93 + sat = rand(1, sat) if rand()<.5 else 1/rand(1, sat)
94 + val = rand(1, val) if rand()<.5 else 1/rand(1, val)
95 + x = rgb_to_hsv(np.array(image)/255.)
96 + x[..., 0] += hue
97 + x[..., 0][x[..., 0]>1] -= 1
98 + x[..., 0][x[..., 0]<0] += 1
99 + x[..., 1] *= sat
100 + x[..., 2] *= val
101 + x[x>1] = 1
102 + x[x<0] = 0
103 + image_data = hsv_to_rgb(x) # numpy array, 0 to 1
104 +
105 + # correct boxes
106 + box_data = np.zeros((max_boxes,5))
107 + if len(box)>0:
108 + np.random.shuffle(box)
109 + box[:, [0,2]] = box[:, [0,2]]*nw/iw + dx
110 + box[:, [1,3]] = box[:, [1,3]]*nh/ih + dy
111 + if flip: box[:, [0,2]] = w - box[:, [2,0]]
112 + box[:, 0:2][box[:, 0:2]<0] = 0
113 + box[:, 2][box[:, 2]>w] = w
114 + box[:, 3][box[:, 3]>h] = h
115 + box_w = box[:, 2] - box[:, 0]
116 + box_h = box[:, 3] - box[:, 1]
117 + box = box[np.logical_and(box_w>1, box_h>1)] # discard invalid box
118 + if len(box)>max_boxes: box = box[:max_boxes]
119 + box_data[:len(box)] = box
120 +
121 + return image_data, box_data
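
*A quick sketch of `letterbox_image` above, which the detector uses to reach its fixed 608x608 input without distorting aspect ratio (`test.jpg` is a placeholder):*

```python
from PIL import Image

img = Image.open('test.jpg')              # e.g. 1920x1080
boxed = letterbox_image(img, (608, 608))  # content scaled, gray (128,128,128) padding
print(boxed.size)                         # (608, 608)
```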
1 +absl-py==0.9.0
2 +asgiref==3.2.7
3 +astor==0.8.1
4 +astroid==2.3.3
5 +cachetools==4.1.0
6 +certifi==2020.4.5.1
7 +chardet==3.0.4
8 +colorama==0.4.3
9 +cycler==0.10.0
10 +Django==3.0.5
11 +et-xmlfile==1.0.1
12 +gast==0.2.2
13 +google-auth==1.14.1
14 +google-auth-oauthlib==0.4.1
15 +google-pasta==0.2.0
16 +grpcio==1.28.1
17 +h5py==2.10.0
18 +idna==2.9
19 +image==1.5.31
20 +isort==4.3.21
21 +jdcal==1.4.1
22 +joblib==0.14.1
23 +Keras==2.3.1
24 +Keras-Applications==1.0.8
25 +Keras-Preprocessing==1.1.0
26 +kiwisolver==1.2.0
27 +lazy-object-proxy==1.4.3
28 +Markdown==3.2.1
29 +matplotlib==3.2.1
30 +mccabe==0.6.1
31 +numpy==1.18.3
32 +oauthlib==3.1.0
33 +opencv-python==4.2.0.34
34 +openpyxl==3.0.3
35 +opt-einsum==3.2.1
36 +pandas==1.0.3
37 +Pillow==7.1.1
38 +protobuf==3.11.3
39 +pyasn1==0.4.8
40 +pyasn1-modules==0.2.8
41 +pylint==2.4.4
42 +pyparsing==2.4.7
43 +python-dateutil==2.8.1
44 +pytz==2019.3
45 +PyYAML==5.3.1
46 +requests==2.23.0
47 +requests-oauthlib==1.3.0
48 +rsa==4.0
49 +scikit-learn==0.22.2.post1
50 +scipy==1.4.1
51 +seaborn==0.10.1
52 +six==1.14.0
53 +sklearn==0.0
54 +sqlparse==0.3.1
55 +tensorboard==1.15.0
56 +tensorflow==1.15.2
57 +tensorflow-estimator==1.15.1
58 +tensorflow-gpu==1.15.2
59 +tensorflow-gpu-estimator==2.1.0
60 +termcolor==1.1.0
61 +typed-ast==1.4.1
62 +urllib3==1.25.9
63 +Werkzeug==1.0.1
64 +wrapt==1.11.2
65 +xlrd==1.2.0
...@@ -6,12 +6,14 @@ ...@@ -6,12 +6,14 @@
6 "start": "node ./bin/www" 6 "start": "node ./bin/www"
7 }, 7 },
8 "dependencies": { 8 "dependencies": {
9 + "child_process": "^1.0.2",
9 "cookie-parser": "~1.4.4", 10 "cookie-parser": "~1.4.4",
10 "debug": "~2.6.9", 11 "debug": "~2.6.9",
11 "ejs": "~2.6.1", 12 "ejs": "~2.6.1",
12 "express": "~4.16.1", 13 "express": "~4.16.1",
13 "http-errors": "~1.6.3", 14 "http-errors": "~1.6.3",
14 "morgan": "~1.9.1", 15 "morgan": "~1.9.1",
15 - "multer": "^1.4.2" 16 + "multer": "^1.4.2",
17 + "python-shell": "^2.0.1"
16 } 18 }
17 } 19 }
......
...@@ -2,10 +2,13 @@ var express = require('express'); ...@@ -2,10 +2,13 @@ var express = require('express');
2 var router = express.Router(); 2 var router = express.Router();
3 var multer = require("multer"); 3 var multer = require("multer");
4 var path = require("path"); 4 var path = require("path");
5 +var PythonShell = require('python-shell');
6 +var spawn = require("child_process").spawn;
7 +
5 8
6 var storage = multer.diskStorage({ 9 var storage = multer.diskStorage({
7 destination: function(req, file, callback) { 10 destination: function(req, file, callback) {
8 - callback(null, "upload/") 11 + callback(null, "../deep_sort_yolov4/")
9 }, 12 },
10 filename: function(req, file, callback) { 13 filename: function(req, file, callback) {
11 var extension = path.extname(file.originalname); 14 var extension = path.extname(file.originalname);
...@@ -21,20 +24,33 @@ var upload = multer({ ...@@ -21,20 +24,33 @@ var upload = multer({
21 24
22 // View page route 25 // View page route
23 router.get('/', function(req, res, next) { 26 router.get('/', function(req, res, next) {
24 - res.render("index") 27 + res.render('index', { title: 'Express' });
25 }); 28 });
26 29
27 // 2. Handle the file upload 30 // 2. Handle the file upload
28 -router.post('/create', upload.single("File"), async(req, res) => { 31 +router.post('/create', upload.single("File"), function(req, res) {
29 // 3. Uploaded file object 32 // 3. Uploaded file object
30 - var file = req.file 33 + var file = req.file;
31 - 34 + var options = {
32 - // 4. File info 35 + mode: 'text',
33 - var result = { 36 + pythonPath: __dirname + '/../../venv/Scripts/python.exe',
34 - originalName: file.originalname, 37 + pythonOptions: ['-u'],
35 - size: file.size, 38 + scriptPath: __dirname + '/../../deep_sort_yolov4/',
36 - } 39 + args: [file.originalname]
37 - 40 + };
38 - res.json(result); 41 + var shell1 = new PythonShell.PythonShell('main.py', options);
42 + shell1.end(function(err) {
43 + if (err) throw err;
44 + else {
45 + res.render('result', { file_name: file.originalname.split('.')[0] });
46 + }
47 + });
48 + // PythonShell.PythonShell.run('main.py', options, (err, results) => {
49 + // if (err) throw err;
50 + // PythonShell.PythonShell.run('kmean.py', options2, (err, results) => {
51 + // if (err) throw err;
52 + // res.render('result', { file_name: file.originalname.split('.')[0] });
53 + // });
54 + // });
39 }); 55 });
40 module.exports = router; 56 module.exports = router;
...\ No newline at end of file ...\ No newline at end of file
......
...@@ -32,7 +32,8 @@ ...@@ -32,7 +32,8 @@
32 <div class="sidebar-brand-icon rotate-n-15"> 32 <div class="sidebar-brand-icon rotate-n-15">
33 <i class="fas fa-thumbs-up"></i> 33 <i class="fas fa-thumbs-up"></i>
34 </div> 34 </div>
35 - <div class="sidebar-brand-text mx-3">유동 인구 분석</div> 35 + <div class="sidebar-brand-text mx-3">유동 인구 분석
36 + </div>
36 </a> 37 </a>
37 38
38 <!-- Divider --> 39 <!-- Divider -->
......
...@@ -86,10 +86,10 @@ ...@@ -86,10 +86,10 @@
86 <!-- /.container-fluid --> 86 <!-- /.container-fluid -->
87 <div class="container row"> 87 <div class="container row">
88 <div class="container-video col-md-6"> 88 <div class="container-video col-md-6">
89 - <video src="data/output2.avi" width="400" controls autoplay></video> 89 + <video src="data/<%=file_name%>.mp4" width="400" controls autoplay></video>
90 </div> 90 </div>
91 <div class="container-kmeans col-md-6"> 91 <div class="container-kmeans col-md-6">
92 - <img src="data/test3_kmeans.png" style="width: 100%;" alt="Kmeans Image"> 92 + <img src="data/<%=file_name%>_kmeans.png" style="width: 100%;" alt="Kmeans Image">
93 </div> 93 </div>
94 </div> 94 </div>
95 <div class="container-kibana"></div> 95 <div class="container-kibana"></div>
...@@ -127,9 +127,6 @@ ...@@ -127,9 +127,6 @@
127 <!-- Custom scripts for all pages--> 127 <!-- Custom scripts for all pages-->
128 <script src="javascripts/sb-admin-2.min.js"></script> 128 <script src="javascripts/sb-admin-2.min.js"></script>
129 129
130 - <!-- Page level plugins -->
131 - <script src="vendor/chart.js/Chart.min.js"></script>
132 -
133 <!-- Page level custom scripts --> 130 <!-- Page level custom scripts -->
134 <!-- <script src="javascripts/demo/chart-pie-demo.js"></script> --> 131 <!-- <script src="javascripts/demo/chart-pie-demo.js"></script> -->
135 </body> 132 </body>
......