code/web/package-lock.json
code/web/node_modules/**
/code/venv
/code/deep_sort_yolov4/model_data/yolov4.weights
/code/deep_sort_yolov4/model_data/yolo4_weight.h5
# YOLOv4 + Deep_SORT
<img src="https://github.com/yehengchen/Object-Detection-and-Tracking/blob/master/OneStage/yolo/deep_sort_yolov4/output/comparison.png" width="81%" height="81%"> <img src="https://github.com/yehengchen/video_demo/blob/master/video_demo/output.gif" width="40%" height="40%"> <img src="https://github.com/yehengchen/video_demo/blob/master/video_demo/TownCentreXVID_output.gif" width="40%" height="40%">
__Object Tracking & Counting Demo - [[BiliBili]](https://www.bilibili.com/video/BV1Ug4y1i71w#reply3014975828) [[Chinese Version]](https://blog.csdn.net/weixin_38107271/article/details/96741706)__
## Requirements
__Development Environment: [Deep-Learning-Environment-Setup](https://github.com/yehengchen/Ubuntu-16.04-Deep-Learning-Environment-Setup)__
* OpenCV
* sklearn
* pillow
* numpy 1.15.0
* torch 1.3.0
* tensorflow-gpu 1.13.1
* CUDA 10.0
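The list above maps to a `requirements.txt` roughly like the sketch below (package names are a best guess; CUDA 10.0 is installed at the system level, not through pip):
```
opencv-python
scikit-learn
Pillow
numpy==1.15.0
torch==1.3.0
tensorflow-gpu==1.13.1
```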
***
It uses:
* __Detection__: [YOLOv4](https://github.com/yehengchen/Object-Detection-and-Tracking/tree/master/OneStage/yolo/Train-a-YOLOv4-model) to detect objects in each video frame. - Train a YOLOv4 model on your own data
* __Tracking__: [Deep_SORT](https://github.com/nwojke/deep_sort) to track those objects over different frames.
*This repository contains code for Simple Online and Realtime Tracking with a Deep Association Metric (Deep SORT). We extend the original SORT algorithm to integrate appearance information based on a deep appearance descriptor. See the [arXiv preprint](https://arxiv.org/abs/1703.07402) for more information.*
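Per frame, detection runs first and tracking second. A minimal sketch of that loop (illustrative names, not this repository's exact API; see `main.py` further below for the real version):
```python
# Hypothetical per-frame pipeline: detect, encode appearance, then track.
while True:
    ret, frame = capture.read()
    if not ret:
        break
    boxes = yolo.detect_image(frame)      # YOLOv4 bounding boxes
    features = encoder(frame, boxes)      # deep appearance descriptors
    detections = [Detection(b, 1.0, f) for b, f in zip(boxes, features)]
    tracker.predict()                     # Kalman prediction for each track
    tracker.update(detections)            # cascade matching + track management
```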
## Quick Start
__0. Requirements__
pip install -r requirements.txt
__1. Download the code to your computer.__
git clone https://github.com/yehengchen/Object-Detection-and-Tracking.git
__2. Download [[yolov4.weights]](https://drive.google.com/file/d/1cewMfusmPjYWbrnuJRuKhPMwRe_b9PaT/view) [[Baidu]](https://pan.baidu.com/s/1jRudrrXAS3DRGqT6mL4L3A ) - `mnv6`__ and place it in `deep_sort_yolov4/model_data/`
*Here you can download my trained [[yolo4_weight.h5]](https://pan.baidu.com/s/1JuT4KCUFaE2Gvme0_S37DQ ) - `w17w` weights for detecting person/car/bicycle, etc.*
__3. Convert the Darknet YOLO model to a Keras model:__
```
$ python convert.py model_data/yolov4.cfg model_data/yolov4.weights model_data/yolo.h5
```
__4. Run the YOLO_DEEP_SORT:__
```
$ python main.py -c [CLASS NAME] -i [INPUT VIDEO PATH]
$ python main.py -c person -i ./test_video/testvideo.avi
```
__5. You can change `deep_sort_yolov4/yolo.py` at __Line 100__ to select the object class to track:__
*The pre-trained DeepSORT weights were trained on person re-ID datasets, so appearance matching is reliable only for the `person` class.*
```
if predicted_class != args["class"]:
continue
if predicted_class != 'person' and predicted_class != 'car':
continue
```
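If you want to keep several classes at once, a set makes the filter easier to extend (a hypothetical variant of the same check):
```python
TRACK_CLASSES = {'person', 'car', 'bicycle'}  # illustrative class list
if predicted_class not in TRACK_CLASSES:
    continue
```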
## Train on Market1501 & MARS
*People Re-identification model*
[cosine_metric_learning](https://github.com/nwojke/cosine_metric_learning) for training a metric feature representation to be used with the deep_sort tracker.
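The command below follows the cosine_metric_learning README for Market-1501; verify the exact script name and flags against that repository before running:
```
$ python train_market1501.py \
    --dataset_dir=./Market-1501-v15.09.15/ \
    --loss_mode=cosine-softmax \
    --log_dir=./output/market1501/ \
    --run_id=cosine-softmax
```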
## Citation
### YOLOv4 :
@misc{bochkovskiy2020yolov4,
title={YOLOv4: Optimal Speed and Accuracy of Object Detection},
author={Alexey Bochkovskiy and Chien-Yao Wang and Hong-Yuan Mark Liao},
year={2020},
eprint={2004.10934},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
### Deep_SORT :
@inproceedings{Wojke2017simple,
title={Simple Online and Realtime Tracking with a Deep Association Metric},
author={Wojke, Nicolai and Bewley, Alex and Paulus, Dietrich},
booktitle={2017 IEEE International Conference on Image Processing (ICIP)},
year={2017},
pages={3645--3649},
organization={IEEE},
doi={10.1109/ICIP.2017.8296962}
}
@inproceedings{Wojke2018deep,
title={Deep Cosine Metric Learning for Person Re-identification},
author={Wojke, Nicolai and Bewley, Alex},
booktitle={2018 IEEE Winter Conference on Applications of Computer Vision (WACV)},
year={2018},
pages={748--756},
organization={IEEE},
doi={10.1109/WACV.2018.00087}
}
## Reference
#### Github:deep_sort@[Nicolai Wojke nwojke](https://github.com/nwojke/deep_sort)
#### Github:deep_sort_yolov3@[Qidian213 ](https://github.com/Qidian213/deep_sort_yolov3)
#### Github:Deep-SORT-YOLOv4@[LeonLok](https://github.com/LeonLok/Deep-SORT-YOLOv4)
import os
import colorsys
import numpy as np
from keras import backend as K
from keras.models import load_model
from keras.layers import Input
from yolo4.model import yolo_eval, yolo4_body
from yolo4.utils import letterbox_image
from PIL import Image, ImageFont, ImageDraw
from timeit import default_timer as timer
import matplotlib.pyplot as plt
from operator import itemgetter
from keras.utils import multi_gpu_model  # needed for the gpu_num >= 2 branch below
class Yolo4(object):
def get_class(self):
classes_path = os.path.expanduser(self.classes_path)
with open(classes_path) as f:
class_names = f.readlines()
class_names = [c.strip() for c in class_names]
return class_names
def get_anchors(self):
anchors_path = os.path.expanduser(self.anchors_path)
with open(anchors_path) as f:
anchors = f.readline()
anchors = [float(x) for x in anchors.split(',')]
return np.array(anchors).reshape(-1, 2)
def load_yolo(self):
model_path = os.path.expanduser(self.model_path)
assert model_path.endswith('.h5'), 'Keras model or weights must be a .h5 file.'
self.class_names = self.get_class()
self.anchors = self.get_anchors()
num_anchors = len(self.anchors)
num_classes = len(self.class_names)
# Generate colors for drawing bounding boxes.
hsv_tuples = [(x / len(self.class_names), 1., 1.)
for x in range(len(self.class_names))]
self.colors = list(map(lambda x: colorsys.hsv_to_rgb(*x), hsv_tuples))
self.colors = list(
map(lambda x: (int(x[0] * 255), int(x[1] * 255), int(x[2] * 255)),
self.colors))
self.sess = K.get_session()
# Load model, or construct model and load weights.
self.yolo4_model = yolo4_body(Input(shape=(608, 608, 3)), num_anchors//3, num_classes)
# Read and convert darknet weight
print('Loading weights.')
weights_file = open(self.weights_path, 'rb')
major, minor, revision = np.ndarray(
shape=(3, ), dtype='int32', buffer=weights_file.read(12))
if (major*10+minor)>=2 and major<1000 and minor<1000:
seen = np.ndarray(shape=(1,), dtype='int64', buffer=weights_file.read(8))
else:
seen = np.ndarray(shape=(1,), dtype='int32', buffer=weights_file.read(4))
print('Weights Header: ', major, minor, revision, seen)
convs_to_load = []
bns_to_load = []
for i in range(len(self.yolo4_model.layers)):
layer_name = self.yolo4_model.layers[i].name
if layer_name.startswith('conv2d_'):
convs_to_load.append((int(layer_name[7:]), i))
if layer_name.startswith('batch_normalization_'):
bns_to_load.append((int(layer_name[20:]), i))
convs_sorted = sorted(convs_to_load, key=itemgetter(0))
bns_sorted = sorted(bns_to_load, key=itemgetter(0))
bn_index = 0
for i in range(len(convs_sorted)):
print('Converting ', i)
if i == 93 or i == 101 or i == 109:
#no bn, with bias
weights_shape = self.yolo4_model.layers[convs_sorted[i][1]].get_weights()[0].shape
bias_shape = self.yolo4_model.layers[convs_sorted[i][1]].get_weights()[0].shape[3]
filters = bias_shape
size = weights_shape[0]
darknet_w_shape = (filters, weights_shape[2], size, size)
weights_size = np.product(weights_shape)
conv_bias = np.ndarray(
shape=(filters, ),
dtype='float32',
buffer=weights_file.read(filters * 4))
conv_weights = np.ndarray(
shape=darknet_w_shape,
dtype='float32',
buffer=weights_file.read(weights_size * 4))
conv_weights = np.transpose(conv_weights, [2, 3, 1, 0])
self.yolo4_model.layers[convs_sorted[i][1]].set_weights([conv_weights, conv_bias])
else:
#with bn, no bias
weights_shape = self.yolo4_model.layers[convs_sorted[i][1]].get_weights()[0].shape
size = weights_shape[0]
bn_shape = self.yolo4_model.layers[bns_sorted[bn_index][1]].get_weights()[0].shape
filters = bn_shape[0]
darknet_w_shape = (filters, weights_shape[2], size, size)
weights_size = np.product(weights_shape)
conv_bias = np.ndarray(
shape=(filters, ),
dtype='float32',
buffer=weights_file.read(filters * 4))
bn_weights = np.ndarray(
shape=(3, filters),
dtype='float32',
buffer=weights_file.read(filters * 12))
bn_weight_list = [
bn_weights[0], # scale gamma
conv_bias, # shift beta
bn_weights[1], # running mean
bn_weights[2] # running var
]
self.yolo4_model.layers[bns_sorted[bn_index][1]].set_weights(bn_weight_list)
conv_weights = np.ndarray(
shape=darknet_w_shape,
dtype='float32',
buffer=weights_file.read(weights_size * 4))
conv_weights = np.transpose(conv_weights, [2, 3, 1, 0])
self.yolo4_model.layers[convs_sorted[i][1]].set_weights([conv_weights])
bn_index += 1
weights_file.close()
self.yolo4_model.save(self.model_path)
if self.gpu_num>=2:
self.yolo4_model = multi_gpu_model(self.yolo4_model, gpus=self.gpu_num)
self.input_image_shape = K.placeholder(shape=(2, ))
self.boxes, self.scores, self.classes = yolo_eval(self.yolo4_model.output, self.anchors,
len(self.class_names), self.input_image_shape,
score_threshold=self.score)
def __init__(self, score, iou, anchors_path, classes_path, model_path, weights_path, gpu_num=1):
self.score = score
self.iou = iou
self.anchors_path = anchors_path
self.classes_path = classes_path
self.weights_path = weights_path
self.model_path = model_path
self.gpu_num = gpu_num
self.load_yolo()
def close_session(self):
self.sess.close()
if __name__ == '__main__':
model_path = 'model_data/yolo4_weight.h5'
anchors_path = 'model_data/yolo_anchors.txt'
classes_path = 'model_data/coco_classes.txt'
weights_path = 'model_data/yolov4.weights'
score = 0.5
iou = 0.5
model_image_size = (608, 608)
yolo4_model = Yolo4(score, iou, anchors_path, classes_path, model_path, weights_path)
yolo4_model.close_session()
# vim: expandtab:ts=4:sw=4
import numpy as np
class Detection(object):
"""
This class represents a bounding box detection in a single image.
Parameters
----------
tlwh : array_like
Bounding box in format `(x, y, w, h)`.
confidence : float
Detector confidence score.
feature : array_like
A feature vector that describes the object contained in this image.
Attributes
----------
tlwh : ndarray
Bounding box in format `(top left x, top left y, width, height)`.
confidence : ndarray
Detector confidence score.
feature : ndarray | NoneType
A feature vector that describes the object contained in this image.
"""
def __init__(self, tlwh, confidence, feature):
self.tlwh = np.asarray(tlwh, dtype=np.float)
self.confidence = float(confidence)
self.feature = np.asarray(feature, dtype=np.float32)
def to_tlbr(self):
"""Convert bounding box to format `(min x, min y, max x, max y)`, i.e.,
`(top left, bottom right)`.
"""
ret = self.tlwh.copy()
ret[2:] += ret[:2]
return ret
def to_xyah(self):
"""Convert bounding box to format `(center x, center y, aspect ratio,
height)`, where the aspect ratio is `width / height`.
"""
ret = self.tlwh.copy()
ret[:2] += ret[2:] / 2
ret[2] /= ret[3]
return ret
# vim: expandtab:ts=4:sw=4
import numpy as np
class Detection_YOLO(object):
"""
This class represents a bounding box detection in a single image.
Parameters
----------
tlwh : array_like
Bounding box in format `(x, y, w, h)`.
confidence : float
Detector confidence score.
feature : array_like
A feature vector that describes the object contained in this image.
    Attributes
----------
tlwh : ndarray
Bounding box in format `(top left x, top left y, width, height)`.
confidence : ndarray
Detector confidence score.
feature : ndarray | NoneType
A feature vector that describes the object contained in this image.
"""
def __init__(self, tlwh, confidence, cls):
self.tlwh = np.asarray(tlwh, dtype=np.float)
self.confidence = float(confidence)
self.cls = cls
def to_tlbr(self):
"""Convert bounding box to format `(min x, min y, max x, max y)`, i.e.,
`(top left, bottom right)`.
"""
ret = self.tlwh.copy()
ret[2:] += ret[:2]
return ret
def to_xyah(self):
"""Convert bounding box to format `(center x, center y, aspect ratio,
height)`, where the aspect ratio is `width / height`.
"""
ret = self.tlwh.copy()
ret[:2] += ret[2:] / 2
ret[2] /= ret[3]
return ret
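A quick sanity check of the two box conversions defined by the `Detection` class above, with made-up numbers:
```python
import numpy as np

det = Detection([10., 20., 30., 60.], 0.9, np.zeros(128, dtype=np.float32))
print(det.to_tlbr())  # [10. 20. 40. 80.]  -> (min x, min y, max x, max y)
print(det.to_xyah())  # [25. 50.  0.5 60.] -> (center x, center y, w/h, h)
```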
# vim: expandtab:ts=4:sw=4
from __future__ import absolute_import
import numpy as np
from . import linear_assignment
def iou(bbox, candidates):
"""Computer intersection over union.
Parameters
----------
bbox : ndarray
A bounding box in format `(top left x, top left y, width, height)`.
candidates : ndarray
A matrix of candidate bounding boxes (one per row) in the same format
as `bbox`.
Returns
-------
ndarray
The intersection over union in [0, 1] between the `bbox` and each
candidate. A higher score means a larger fraction of the `bbox` is
occluded by the candidate.
"""
bbox_tl, bbox_br = bbox[:2], bbox[:2] + bbox[2:]
candidates_tl = candidates[:, :2]
candidates_br = candidates[:, :2] + candidates[:, 2:]
tl = np.c_[np.maximum(bbox_tl[0], candidates_tl[:, 0])[:, np.newaxis],
np.maximum(bbox_tl[1], candidates_tl[:, 1])[:, np.newaxis]]
br = np.c_[np.minimum(bbox_br[0], candidates_br[:, 0])[:, np.newaxis],
np.minimum(bbox_br[1], candidates_br[:, 1])[:, np.newaxis]]
wh = np.maximum(0., br - tl)
area_intersection = wh.prod(axis=1)
area_bbox = bbox[2:].prod()
area_candidates = candidates[:, 2:].prod(axis=1)
return area_intersection / (area_bbox + area_candidates - area_intersection)
def iou_cost(tracks, detections, track_indices=None,
detection_indices=None):
"""An intersection over union distance metric.
Parameters
----------
tracks : List[deep_sort.track.Track]
A list of tracks.
detections : List[deep_sort.detection.Detection]
A list of detections.
track_indices : Optional[List[int]]
A list of indices to tracks that should be matched. Defaults to
all `tracks`.
detection_indices : Optional[List[int]]
A list of indices to detections that should be matched. Defaults
to all `detections`.
Returns
-------
ndarray
Returns a cost matrix of shape
len(track_indices), len(detection_indices) where entry (i, j) is
`1 - iou(tracks[track_indices[i]], detections[detection_indices[j]])`.
"""
if track_indices is None:
track_indices = np.arange(len(tracks))
if detection_indices is None:
detection_indices = np.arange(len(detections))
cost_matrix = np.zeros((len(track_indices), len(detection_indices)))
for row, track_idx in enumerate(track_indices):
if tracks[track_idx].time_since_update > 1:
cost_matrix[row, :] = linear_assignment.INFTY_COST
continue
bbox = tracks[track_idx].to_tlwh()
candidates = np.asarray([detections[i].tlwh for i in detection_indices])
cost_matrix[row, :] = 1. - iou(bbox, candidates)
return cost_matrix
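A worked example for `iou` above: a 10x10 box at the origin against a 10x10 candidate shifted by (5, 5) intersects in a 5x5 region, so IoU = 25 / (100 + 100 - 25) ≈ 0.143:
```python
import numpy as np

bbox = np.array([0., 0., 10., 10.])          # (top left x, top left y, w, h)
candidates = np.array([[5., 5., 10., 10.]])
print(iou(bbox, candidates))                 # [0.14285714]
```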
# vim: expandtab:ts=4:sw=4
import numpy as np
import scipy.linalg
"""
Table for the 0.95 quantile of the chi-square distribution with N degrees of
freedom (contains values for N=1, ..., 9). Taken from MATLAB/Octave's chi2inv
function and used as Mahalanobis gating threshold.
"""
chi2inv95 = {
1: 3.8415,
2: 5.9915,
3: 7.8147,
4: 9.4877,
5: 11.070,
6: 12.592,
7: 14.067,
8: 15.507,
9: 16.919}
class KalmanFilter(object):
"""
A simple Kalman filter for tracking bounding boxes in image space.
The 8-dimensional state space
x, y, a, h, vx, vy, va, vh
contains the bounding box center position (x, y), aspect ratio a, height h,
and their respective velocities.
Object motion follows a constant velocity model. The bounding box location
(x, y, a, h) is taken as direct observation of the state space (linear
observation model).
"""
def __init__(self):
ndim, dt = 4, 1.
# Create Kalman filter model matrices.
self._motion_mat = np.eye(2 * ndim, 2 * ndim)
for i in range(ndim):
self._motion_mat[i, ndim + i] = dt
self._update_mat = np.eye(ndim, 2 * ndim)
# Motion and observation uncertainty are chosen relative to the current
# state estimate. These weights control the amount of uncertainty in
# the model. This is a bit hacky.
self._std_weight_position = 1. / 20
self._std_weight_velocity = 1. / 160
def initiate(self, measurement):
"""Create track from unassociated measurement.
Parameters
----------
measurement : ndarray
Bounding box coordinates (x, y, a, h) with center position (x, y),
aspect ratio a, and height h.
Returns
-------
(ndarray, ndarray)
Returns the mean vector (8 dimensional) and covariance matrix (8x8
dimensional) of the new track. Unobserved velocities are initialized
to 0 mean.
"""
mean_pos = measurement
mean_vel = np.zeros_like(mean_pos)
mean = np.r_[mean_pos, mean_vel]
std = [
2 * self._std_weight_position * measurement[3],
2 * self._std_weight_position * measurement[3],
1e-2,
2 * self._std_weight_position * measurement[3],
10 * self._std_weight_velocity * measurement[3],
10 * self._std_weight_velocity * measurement[3],
1e-5,
10 * self._std_weight_velocity * measurement[3]]
covariance = np.diag(np.square(std))
return mean, covariance
def predict(self, mean, covariance):
"""Run Kalman filter prediction step.
Parameters
----------
mean : ndarray
The 8 dimensional mean vector of the object state at the previous
time step.
covariance : ndarray
The 8x8 dimensional covariance matrix of the object state at the
previous time step.
Returns
-------
(ndarray, ndarray)
Returns the mean vector and covariance matrix of the predicted
state. Unobserved velocities are initialized to 0 mean.
"""
std_pos = [
self._std_weight_position * mean[3],
self._std_weight_position * mean[3],
1e-2,
self._std_weight_position * mean[3]]
std_vel = [
self._std_weight_velocity * mean[3],
self._std_weight_velocity * mean[3],
1e-5,
self._std_weight_velocity * mean[3]]
motion_cov = np.diag(np.square(np.r_[std_pos, std_vel]))
mean = np.dot(self._motion_mat, mean)
covariance = np.linalg.multi_dot((
self._motion_mat, covariance, self._motion_mat.T)) + motion_cov
return mean, covariance
def project(self, mean, covariance):
"""Project state distribution to measurement space.
Parameters
----------
mean : ndarray
The state's mean vector (8 dimensional array).
covariance : ndarray
The state's covariance matrix (8x8 dimensional).
Returns
-------
(ndarray, ndarray)
Returns the projected mean and covariance matrix of the given state
estimate.
"""
std = [
self._std_weight_position * mean[3],
self._std_weight_position * mean[3],
1e-1,
self._std_weight_position * mean[3]]
innovation_cov = np.diag(np.square(std))
mean = np.dot(self._update_mat, mean)
covariance = np.linalg.multi_dot((
self._update_mat, covariance, self._update_mat.T))
return mean, covariance + innovation_cov
def update(self, mean, covariance, measurement):
"""Run Kalman filter correction step.
Parameters
----------
mean : ndarray
The predicted state's mean vector (8 dimensional).
covariance : ndarray
The state's covariance matrix (8x8 dimensional).
measurement : ndarray
The 4 dimensional measurement vector (x, y, a, h), where (x, y)
is the center position, a the aspect ratio, and h the height of the
bounding box.
Returns
-------
(ndarray, ndarray)
Returns the measurement-corrected state distribution.
"""
projected_mean, projected_cov = self.project(mean, covariance)
chol_factor, lower = scipy.linalg.cho_factor(
projected_cov, lower=True, check_finite=False)
kalman_gain = scipy.linalg.cho_solve(
(chol_factor, lower), np.dot(covariance, self._update_mat.T).T,
check_finite=False).T
innovation = measurement - projected_mean
new_mean = mean + np.dot(innovation, kalman_gain.T)
new_covariance = covariance - np.linalg.multi_dot((
kalman_gain, projected_cov, kalman_gain.T))
return new_mean, new_covariance
def gating_distance(self, mean, covariance, measurements,
only_position=False):
"""Compute gating distance between state distribution and measurements.
A suitable distance threshold can be obtained from `chi2inv95`. If
`only_position` is False, the chi-square distribution has 4 degrees of
freedom, otherwise 2.
Parameters
----------
mean : ndarray
Mean vector over the state distribution (8 dimensional).
covariance : ndarray
Covariance of the state distribution (8x8 dimensional).
measurements : ndarray
An Nx4 dimensional matrix of N measurements, each in
format (x, y, a, h) where (x, y) is the bounding box center
position, a the aspect ratio, and h the height.
only_position : Optional[bool]
If True, distance computation is done with respect to the bounding
box center position only.
Returns
-------
ndarray
Returns an array of length N, where the i-th element contains the
squared Mahalanobis distance between (mean, covariance) and
`measurements[i]`.
"""
mean, covariance = self.project(mean, covariance)
if only_position:
mean, covariance = mean[:2], covariance[:2, :2]
measurements = measurements[:, :2]
cholesky_factor = np.linalg.cholesky(covariance)
d = measurements - mean
z = scipy.linalg.solve_triangular(
cholesky_factor, d.T, lower=True, check_finite=False,
overwrite_b=True)
squared_maha = np.sum(z * z, axis=0)
return squared_maha
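A minimal usage cycle for the `KalmanFilter` above: initiate a track from one `(x, y, a, h)` measurement, predict one step forward, then correct with the next measurement (the numbers are made up):
```python
import numpy as np

kf = KalmanFilter()
mean, cov = kf.initiate(np.array([320., 240., 0.5, 120.]))            # new track
mean, cov = kf.predict(mean, cov)                                     # constant-velocity step
mean, cov = kf.update(mean, cov, np.array([322., 241., 0.5, 121.]))   # correction
```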
# vim: expandtab:ts=4:sw=4
from __future__ import absolute_import
import numpy as np
from sklearn.utils.linear_assignment_ import linear_assignment
from . import kalman_filter
INFTY_COST = 1e+5
def min_cost_matching(
distance_metric, max_distance, tracks, detections, track_indices=None,
detection_indices=None):
"""Solve linear assignment problem.
Parameters
----------
distance_metric : Callable[List[Track], List[Detection], List[int], List[int]) -> ndarray
The distance metric is given a list of tracks and detections as well as
a list of N track indices and M detection indices. The metric should
return the NxM dimensional cost matrix, where element (i, j) is the
association cost between the i-th track in the given track indices and
the j-th detection in the given detection_indices.
max_distance : float
Gating threshold. Associations with cost larger than this value are
disregarded.
tracks : List[track.Track]
A list of predicted tracks at the current time step.
detections : List[detection.Detection]
A list of detections at the current time step.
track_indices : List[int]
List of track indices that maps rows in `cost_matrix` to tracks in
`tracks` (see description above).
detection_indices : List[int]
List of detection indices that maps columns in `cost_matrix` to
detections in `detections` (see description above).
Returns
-------
(List[(int, int)], List[int], List[int])
Returns a tuple with the following three entries:
* A list of matched track and detection indices.
* A list of unmatched track indices.
* A list of unmatched detection indices.
"""
if track_indices is None:
track_indices = np.arange(len(tracks))
if detection_indices is None:
detection_indices = np.arange(len(detections))
if len(detection_indices) == 0 or len(track_indices) == 0:
return [], track_indices, detection_indices # Nothing to match.
cost_matrix = distance_metric(
tracks, detections, track_indices, detection_indices)
cost_matrix[cost_matrix > max_distance] = max_distance + 1e-5
indices = linear_assignment(cost_matrix)
matches, unmatched_tracks, unmatched_detections = [], [], []
for col, detection_idx in enumerate(detection_indices):
if col not in indices[:, 1]:
unmatched_detections.append(detection_idx)
for row, track_idx in enumerate(track_indices):
if row not in indices[:, 0]:
unmatched_tracks.append(track_idx)
for row, col in indices:
track_idx = track_indices[row]
detection_idx = detection_indices[col]
if cost_matrix[row, col] > max_distance:
unmatched_tracks.append(track_idx)
unmatched_detections.append(detection_idx)
else:
matches.append((track_idx, detection_idx))
return matches, unmatched_tracks, unmatched_detections
def matching_cascade(
distance_metric, max_distance, cascade_depth, tracks, detections,
track_indices=None, detection_indices=None):
"""Run matching cascade.
Parameters
----------
distance_metric : Callable[List[Track], List[Detection], List[int], List[int]) -> ndarray
The distance metric is given a list of tracks and detections as well as
a list of N track indices and M detection indices. The metric should
return the NxM dimensional cost matrix, where element (i, j) is the
association cost between the i-th track in the given track indices and
the j-th detection in the given detection indices.
max_distance : float
Gating threshold. Associations with cost larger than this value are
disregarded.
cascade_depth: int
        The cascade depth, should be set to the maximum track age.
tracks : List[track.Track]
A list of predicted tracks at the current time step.
detections : List[detection.Detection]
A list of detections at the current time step.
track_indices : Optional[List[int]]
List of track indices that maps rows in `cost_matrix` to tracks in
`tracks` (see description above). Defaults to all tracks.
detection_indices : Optional[List[int]]
List of detection indices that maps columns in `cost_matrix` to
detections in `detections` (see description above). Defaults to all
detections.
Returns
-------
(List[(int, int)], List[int], List[int])
Returns a tuple with the following three entries:
* A list of matched track and detection indices.
* A list of unmatched track indices.
* A list of unmatched detection indices.
"""
if track_indices is None:
track_indices = list(range(len(tracks)))
if detection_indices is None:
detection_indices = list(range(len(detections)))
unmatched_detections = detection_indices
matches = []
for level in range(cascade_depth):
if len(unmatched_detections) == 0: # No detections left
break
track_indices_l = [
k for k in track_indices
if tracks[k].time_since_update == 1 + level
]
if len(track_indices_l) == 0: # Nothing to match at this level
continue
matches_l, _, unmatched_detections = \
min_cost_matching(
distance_metric, max_distance, tracks, detections,
track_indices_l, unmatched_detections)
matches += matches_l
unmatched_tracks = list(set(track_indices) - set(k for k, _ in matches))
return matches, unmatched_tracks, unmatched_detections
def gate_cost_matrix(
kf, cost_matrix, tracks, detections, track_indices, detection_indices,
gated_cost=INFTY_COST, only_position=False):
"""Invalidate infeasible entries in cost matrix based on the state
distributions obtained by Kalman filtering.
Parameters
----------
kf : The Kalman filter.
cost_matrix : ndarray
The NxM dimensional cost matrix, where N is the number of track indices
and M is the number of detection indices, such that entry (i, j) is the
association cost between `tracks[track_indices[i]]` and
`detections[detection_indices[j]]`.
tracks : List[track.Track]
A list of predicted tracks at the current time step.
detections : List[detection.Detection]
A list of detections at the current time step.
track_indices : List[int]
List of track indices that maps rows in `cost_matrix` to tracks in
`tracks` (see description above).
detection_indices : List[int]
List of detection indices that maps columns in `cost_matrix` to
detections in `detections` (see description above).
gated_cost : Optional[float]
Entries in the cost matrix corresponding to infeasible associations are
set this value. Defaults to a very large value.
only_position : Optional[bool]
If True, only the x, y position of the state distribution is considered
during gating. Defaults to False.
Returns
-------
ndarray
Returns the modified cost matrix.
"""
gating_dim = 2 if only_position else 4
gating_threshold = kalman_filter.chi2inv95[gating_dim]
measurements = np.asarray(
[detections[i].to_xyah() for i in detection_indices])
for row, track_idx in enumerate(track_indices):
track = tracks[track_idx]
gating_distance = kf.gating_distance(
track.mean, track.covariance, measurements, only_position)
cost_matrix[row, gating_distance > gating_threshold] = gated_cost
return cost_matrix
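Note that `sklearn.utils.linear_assignment_` (imported at the top of this module) was removed in scikit-learn 0.23. On newer installations, a drop-in shim built on SciPy's Hungarian solver (a sketch, not part of this repository) restores the same Nx2 `(row, col)` output:
```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def linear_assignment(cost_matrix):
    # scipy returns two parallel index arrays; stack them into the
    # Nx2 shape that min_cost_matching iterates over.
    rows, cols = linear_sum_assignment(cost_matrix)
    return np.array(list(zip(rows, cols)))
```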
# vim: expandtab:ts=4:sw=4
import numpy as np
def _pdist(a, b):
"""Compute pair-wise squared distance between points in `a` and `b`.
Parameters
----------
a : array_like
An NxM matrix of N samples of dimensionality M.
b : array_like
An LxM matrix of L samples of dimensionality M.
Returns
-------
ndarray
        Returns a matrix of size len(a), len(b) such that element (i, j)
        contains the squared distance between `a[i]` and `b[j]`.
"""
a, b = np.asarray(a), np.asarray(b)
if len(a) == 0 or len(b) == 0:
return np.zeros((len(a), len(b)))
a2, b2 = np.square(a).sum(axis=1), np.square(b).sum(axis=1)
r2 = -2. * np.dot(a, b.T) + a2[:, None] + b2[None, :]
r2 = np.clip(r2, 0., float(np.inf))
return r2
def _cosine_distance(a, b, data_is_normalized=False):
"""Compute pair-wise cosine distance between points in `a` and `b`.
Parameters
----------
a : array_like
An NxM matrix of N samples of dimensionality M.
b : array_like
An LxM matrix of L samples of dimensionality M.
data_is_normalized : Optional[bool]
If True, assumes rows in a and b are unit length vectors.
        Otherwise, a and b are explicitly normalized to length 1.
Returns
-------
ndarray
        Returns a matrix of size len(a), len(b) such that element (i, j)
        contains the cosine distance between `a[i]` and `b[j]`.
"""
if not data_is_normalized:
a = np.asarray(a) / np.linalg.norm(a, axis=1, keepdims=True)
b = np.asarray(b) / np.linalg.norm(b, axis=1, keepdims=True)
return 1. - np.dot(a, b.T)
def _nn_euclidean_distance(x, y):
""" Helper function for nearest neighbor distance metric (Euclidean).
Parameters
----------
x : ndarray
A matrix of N row-vectors (sample points).
y : ndarray
A matrix of M row-vectors (query points).
Returns
-------
ndarray
A vector of length M that contains for each entry in `y` the
smallest Euclidean distance to a sample in `x`.
"""
distances = _pdist(x, y)
return np.maximum(0.0, distances.min(axis=0))
def _nn_cosine_distance(x, y):
""" Helper function for nearest neighbor distance metric (cosine).
Parameters
----------
x : ndarray
A matrix of N row-vectors (sample points).
y : ndarray
A matrix of M row-vectors (query points).
Returns
-------
ndarray
A vector of length M that contains for each entry in `y` the
smallest cosine distance to a sample in `x`.
"""
distances = _cosine_distance(x, y)
return distances.min(axis=0)
class NearestNeighborDistanceMetric(object):
"""
A nearest neighbor distance metric that, for each target, returns
the closest distance to any sample that has been observed so far.
Parameters
----------
metric : str
Either "euclidean" or "cosine".
matching_threshold: float
The matching threshold. Samples with larger distance are considered an
invalid match.
budget : Optional[int]
If not None, fix samples per class to at most this number. Removes
the oldest samples when the budget is reached.
Attributes
----------
samples : Dict[int -> List[ndarray]]
A dictionary that maps from target identities to the list of samples
that have been observed so far.
"""
def __init__(self, metric, matching_threshold, budget=None):
if metric == "euclidean":
self._metric = _nn_euclidean_distance
elif metric == "cosine":
self._metric = _nn_cosine_distance
else:
raise ValueError(
"Invalid metric; must be either 'euclidean' or 'cosine'")
self.matching_threshold = matching_threshold
self.budget = budget
self.samples = {}
def partial_fit(self, features, targets, active_targets):
"""Update the distance metric with new data.
Parameters
----------
features : ndarray
An NxM matrix of N features of dimensionality M.
targets : ndarray
An integer array of associated target identities.
active_targets : List[int]
A list of targets that are currently present in the scene.
"""
for feature, target in zip(features, targets):
self.samples.setdefault(target, []).append(feature)
if self.budget is not None:
self.samples[target] = self.samples[target][-self.budget:]
self.samples = {k: self.samples[k] for k in active_targets}
def distance(self, features, targets):
"""Compute distance between features and targets.
Parameters
----------
features : ndarray
An NxM matrix of N features of dimensionality M.
targets : List[int]
A list of targets to match the given `features` against.
Returns
-------
ndarray
Returns a cost matrix of shape len(targets), len(features), where
element (i, j) contains the closest squared distance between
`targets[i]` and `features[j]`.
"""
cost_matrix = np.zeros((len(targets), len(features)))
for i, target in enumerate(targets):
cost_matrix[i, :] = self._metric(self.samples[target], features)
return cost_matrix
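A small usage sketch for `NearestNeighborDistanceMetric` above (random features, hypothetical target ids):
```python
import numpy as np

metric = NearestNeighborDistanceMetric("cosine", matching_threshold=0.3, budget=100)
feats = np.random.rand(4, 128).astype(np.float32)
metric.partial_fit(feats, targets=np.array([1, 1, 2, 2]), active_targets=[1, 2])
cost = metric.distance(np.random.rand(3, 128).astype(np.float32), [1, 2])
print(cost.shape)  # (2, 3): one row per target, one column per query feature
```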
# vim: expandtab:ts=4:sw=4
import numpy as np
import cv2
def non_max_suppression(boxes, max_bbox_overlap, scores=None):
"""Suppress overlapping detections.
Original code from [1]_ has been adapted to include confidence score.
.. [1] http://www.pyimagesearch.com/2015/02/16/
faster-non-maximum-suppression-python/
Examples
--------
>>> boxes = [d.roi for d in detections]
>>> scores = [d.confidence for d in detections]
>>> indices = non_max_suppression(boxes, max_bbox_overlap, scores)
>>> detections = [detections[i] for i in indices]
Parameters
----------
boxes : ndarray
Array of ROIs (x, y, width, height).
max_bbox_overlap : float
        ROIs that overlap more than this value are suppressed.
scores : Optional[array_like]
Detector confidence score.
Returns
-------
List[int]
Returns indices of detections that have survived non-maxima suppression.
"""
if len(boxes) == 0:
return []
boxes = boxes.astype(np.float)
pick = []
x1 = boxes[:, 0]
y1 = boxes[:, 1]
x2 = boxes[:, 2] + boxes[:, 0]
y2 = boxes[:, 3] + boxes[:, 1]
area = (x2 - x1 + 1) * (y2 - y1 + 1)
if scores is not None:
idxs = np.argsort(scores)
else:
idxs = np.argsort(y2)
while len(idxs) > 0:
last = len(idxs) - 1
i = idxs[last]
pick.append(i)
xx1 = np.maximum(x1[i], x1[idxs[:last]])
yy1 = np.maximum(y1[i], y1[idxs[:last]])
xx2 = np.minimum(x2[i], x2[idxs[:last]])
yy2 = np.minimum(y2[i], y2[idxs[:last]])
w = np.maximum(0, xx2 - xx1 + 1)
h = np.maximum(0, yy2 - yy1 + 1)
overlap = (w * h) / area[idxs[:last]]
idxs = np.delete(
idxs, np.concatenate(
([last], np.where(overlap > max_bbox_overlap)[0])))
return pick
# vim: expandtab:ts=4:sw=4
class TrackState:
"""
Enumeration type for the single target track state. Newly created tracks are
classified as `tentative` until enough evidence has been collected. Then,
the track state is changed to `confirmed`. Tracks that are no longer alive
are classified as `deleted` to mark them for removal from the set of active
tracks.
"""
Tentative = 1
Confirmed = 2
Deleted = 3
class Track:
"""
A single target track with state space `(x, y, a, h)` and associated
velocities, where `(x, y)` is the center of the bounding box, `a` is the
aspect ratio and `h` is the height.
Parameters
----------
mean : ndarray
Mean vector of the initial state distribution.
covariance : ndarray
Covariance matrix of the initial state distribution.
track_id : int
A unique track identifier.
n_init : int
Number of consecutive detections before the track is confirmed. The
track state is set to `Deleted` if a miss occurs within the first
`n_init` frames.
max_age : int
The maximum number of consecutive misses before the track state is
set to `Deleted`.
feature : Optional[ndarray]
Feature vector of the detection this track originates from. If not None,
this feature is added to the `features` cache.
Attributes
----------
mean : ndarray
Mean vector of the initial state distribution.
covariance : ndarray
Covariance matrix of the initial state distribution.
track_id : int
A unique track identifier.
hits : int
Total number of measurement updates.
age : int
        Total number of frames since first occurrence.
time_since_update : int
Total number of frames since last measurement update.
state : TrackState
The current track state.
features : List[ndarray]
A cache of features. On each measurement update, the associated feature
vector is added to this list.
"""
def __init__(self, mean, covariance, track_id, n_init, max_age,
feature=None):
self.mean = mean
self.covariance = covariance
self.track_id = track_id
self.hits = 1
self.age = 1
self.time_since_update = 0
self.state = TrackState.Tentative
self.features = []
if feature is not None:
self.features.append(feature)
self._n_init = n_init
self._max_age = max_age
def to_tlwh(self):
"""Get current position in bounding box format `(top left x, top left y,
width, height)`.
Returns
-------
ndarray
The bounding box.
"""
ret = self.mean[:4].copy()
ret[2] *= ret[3]
ret[:2] -= ret[2:] / 2
return ret
def to_tlbr(self):
"""Get current position in bounding box format `(min x, miny, max x,
max y)`.
Returns
-------
ndarray
The bounding box.
"""
ret = self.to_tlwh()
ret[2:] = ret[:2] + ret[2:]
return ret
def predict(self, kf):
"""Propagate the state distribution to the current time step using a
Kalman filter prediction step.
Parameters
----------
kf : kalman_filter.KalmanFilter
The Kalman filter.
"""
self.mean, self.covariance = kf.predict(self.mean, self.covariance)
self.age += 1
self.time_since_update += 1
def update(self, kf, detection):
"""Perform Kalman filter measurement update step and update the feature
cache.
Parameters
----------
kf : kalman_filter.KalmanFilter
The Kalman filter.
detection : Detection
The associated detection.
"""
self.mean, self.covariance = kf.update(
self.mean, self.covariance, detection.to_xyah())
self.features.append(detection.feature)
self.hits += 1
self.time_since_update = 0
if self.state == TrackState.Tentative and self.hits >= self._n_init:
self.state = TrackState.Confirmed
def mark_missed(self):
"""Mark this track as missed (no association at the current time step).
"""
if self.state == TrackState.Tentative:
self.state = TrackState.Deleted
elif self.time_since_update > self._max_age:
self.state = TrackState.Deleted
def is_tentative(self):
"""Returns True if this track is tentative (unconfirmed).
"""
return self.state == TrackState.Tentative
def is_confirmed(self):
"""Returns True if this track is confirmed."""
return self.state == TrackState.Confirmed
def is_deleted(self):
"""Returns True if this track is dead and should be deleted."""
return self.state == TrackState.Deleted
# vim: expandtab:ts=4:sw=4
from __future__ import absolute_import
import numpy as np
from . import kalman_filter
from . import linear_assignment
from . import iou_matching
from .track import Track
class Tracker:
"""
This is the multi-target tracker.
Parameters
----------
metric : nn_matching.NearestNeighborDistanceMetric
A distance metric for measurement-to-track association.
max_age : int
        Maximum number of consecutive misses before a track is deleted.
n_init : int
Number of consecutive detections before the track is confirmed. The
track state is set to `Deleted` if a miss occurs within the first
`n_init` frames.
Attributes
----------
metric : nn_matching.NearestNeighborDistanceMetric
The distance metric used for measurement to track association.
max_age : int
        Maximum number of consecutive misses before a track is deleted.
n_init : int
Number of frames that a track remains in initialization phase.
kf : kalman_filter.KalmanFilter
A Kalman filter to filter target trajectories in image space.
tracks : List[Track]
The list of active tracks at the current time step.
"""
def __init__(self, metric, max_iou_distance=0.7, max_age=30, n_init=3):
self.metric = metric
self.max_iou_distance = max_iou_distance
self.max_age = max_age
self.n_init = n_init
self.kf = kalman_filter.KalmanFilter()
self.tracks = []
self._next_id = 1
def predict(self):
"""Propagate track state distributions one time step forward.
This function should be called once every time step, before `update`.
"""
for track in self.tracks:
track.predict(self.kf)
def update(self, detections):
"""Perform measurement update and track management.
Parameters
----------
detections : List[deep_sort.detection.Detection]
A list of detections at the current time step.
"""
# Run matching cascade.
matches, unmatched_tracks, unmatched_detections = \
self._match(detections)
# Update track set.
for track_idx, detection_idx in matches:
self.tracks[track_idx].update(
self.kf, detections[detection_idx])
for track_idx in unmatched_tracks:
self.tracks[track_idx].mark_missed()
for detection_idx in unmatched_detections:
self._initiate_track(detections[detection_idx])
self.tracks = [t for t in self.tracks if not t.is_deleted()]
# Update distance metric.
active_targets = [t.track_id for t in self.tracks if t.is_confirmed()]
features, targets = [], []
for track in self.tracks:
if not track.is_confirmed():
continue
features += track.features
targets += [track.track_id for _ in track.features]
track.features = []
self.metric.partial_fit(
np.asarray(features), np.asarray(targets), active_targets)
def _match(self, detections):
def gated_metric(tracks, dets, track_indices, detection_indices):
features = np.array([dets[i].feature for i in detection_indices])
targets = np.array([tracks[i].track_id for i in track_indices])
cost_matrix = self.metric.distance(features, targets)
cost_matrix = linear_assignment.gate_cost_matrix(
self.kf, cost_matrix, tracks, dets, track_indices,
detection_indices)
return cost_matrix
# Split track set into confirmed and unconfirmed tracks.
confirmed_tracks = [
i for i, t in enumerate(self.tracks) if t.is_confirmed()]
unconfirmed_tracks = [
i for i, t in enumerate(self.tracks) if not t.is_confirmed()]
# Associate confirmed tracks using appearance features.
matches_a, unmatched_tracks_a, unmatched_detections = \
linear_assignment.matching_cascade(
gated_metric, self.metric.matching_threshold, self.max_age,
self.tracks, detections, confirmed_tracks)
# Associate remaining tracks together with unconfirmed tracks using IOU.
iou_track_candidates = unconfirmed_tracks + [
k for k in unmatched_tracks_a if
self.tracks[k].time_since_update == 1]
unmatched_tracks_a = [
k for k in unmatched_tracks_a if
self.tracks[k].time_since_update != 1]
matches_b, unmatched_tracks_b, unmatched_detections = \
linear_assignment.min_cost_matching(
iou_matching.iou_cost, self.max_iou_distance, self.tracks,
detections, iou_track_candidates, unmatched_detections)
matches = matches_a + matches_b
unmatched_tracks = list(set(unmatched_tracks_a + unmatched_tracks_b))
return matches, unmatched_tracks, unmatched_detections
def _initiate_track(self, detection):
mean, covariance = self.kf.initiate(detection.to_xyah())
self.tracks.append(Track(
mean, covariance, self._next_id, self.n_init, self.max_age,
detection.feature))
self._next_id += 1
#! /usr/bin/env python
# -*- coding: utf-8 -*-
from __future__ import division, print_function, absolute_import
import sys
#sys.path.remove('/opt/ros/kinetic/lib/python2.7/dist-packages')
import os
import datetime
from timeit import time
import warnings
import cv2
import numpy as np
import argparse
from sklearn.cluster import KMeans
import matplotlib.pyplot as plt
from PIL import Image
from yolo import YOLO
import matplotlib.cbook as cbook
from deep_sort import preprocessing
from deep_sort import nn_matching
from deep_sort.detection import Detection
from deep_sort.tracker import Tracker
from tools import generate_detections as gdet
from deep_sort.detection import Detection as ddet
from collections import deque
from keras import backend
import tensorflow as tf
from tensorflow.compat.v1 import InteractiveSession
import pandas as pd
import json
def createKmeans(file_name):
data = json.load(open(os.getcwd() + '/../deep_sort_yolov4/output/'+ file_name +'_xy.json'))
data = pd.DataFrame(data["data"])
xy = data[['x','y']]
km = KMeans(n_clusters=5)
km.fit(xy)
predict = pd.DataFrame(km.predict(xy))
predict.columns=['predict']
r = pd.concat([xy,predict],axis=1)
plt.scatter(r['x'],r['y'],c=r['predict'],alpha=0.3, s=200)
imageFile = cbook.get_sample_data(os.getcwd() + '/../deep_sort_yolov4/output/'+ file_name + '_img.png')
image = plt.imread(imageFile)
plt.imshow(image)
plt.savefig(os.getcwd() + '/public/data/'+file_name+'_kmeans.png')
def draw_border(img, pt1, pt2, color, thickness, r, d):
x1,y1 = pt1
x2,y2 = pt2
# Top left
cv2.line(img, (x1 + r, y1), (x1 + r + d, y1), color, thickness)
cv2.line(img, (x1, y1 + r), (x1, y1 + r + d), color, thickness)
cv2.ellipse(img, (x1 + r, y1 + r), (r, r), 180, 0, 90, color, thickness)
# Top right
cv2.line(img, (x2 - r, y1), (x2 - r - d, y1), color, thickness)
cv2.line(img, (x2, y1 + r), (x2, y1 + r + d), color, thickness)
cv2.ellipse(img, (x2 - r, y1 + r), (r, r), 270, 0, 90, color, thickness)
# Bottom left
cv2.line(img, (x1 + r, y2), (x1 + r + d, y2), color, thickness)
cv2.line(img, (x1, y2 - r), (x1, y2 - r - d), color, thickness)
cv2.ellipse(img, (x1 + r, y2 - r), (r, r), 90, 0, 90, color, thickness)
# Bottom right
cv2.line(img, (x2 - r, y2), (x2 - r - d, y2), color, thickness)
cv2.line(img, (x2, y2 - r), (x2, y2 - r - d), color, thickness)
cv2.ellipse(img, (x2 - r, y2 - r), (r, r), 0, 0, 90, color, thickness)
config = tf.ConfigProto()
config.gpu_options.allow_growth = True
session = InteractiveSession(config=config)
#ap = argparse.ArgumentParser()
#ap.add_argument("-i", "--input",help="path to input video", default = "./test_video/test3.mp4")
#ap.add_argument("-c", "--class",help="name of class", default = "person")
#args = vars(ap.parse_args())
pts = [deque(maxlen=30) for _ in range(9999)]
warnings.filterwarnings('ignore')
# initialize a list of colors to represent each possible class label
np.random.seed(100)
COLORS = np.random.randint(0, 255, size=(200, 3),
dtype="uint8")
#list = [[] for _ in range(100)]
def main(yolo):
df1 = pd.DataFrame(columns=['x','y','id','time','s'])
df2 = pd.DataFrame(columns=['total','now','time','s'])
start = datetime.datetime.now()
max_cosine_distance = 0.3
nn_budget = None
nms_max_overlap = 1.0
counter = []
#deep_sort
model_filename = os.getcwd() + '/../deep_sort_yolov4/model_data/market1501.pb'
encoder = gdet.create_box_encoder(model_filename,batch_size=1)
find_objects = ['person']
metric = nn_matching.NearestNeighborDistanceMetric("cosine", max_cosine_distance, nn_budget)
tracker = Tracker(metric)
writeVideo_flag = True
video_capture = cv2.VideoCapture(os.getcwd() + '/../deep_sort_yolov4/' + sys.argv[1])
file_name = sys.argv[1].split('.')[0]
if writeVideo_flag:
# Define the codec and create VideoWriter object
w = int(video_capture.get(3))
h = int(video_capture.get(4))
total_frame = int(video_capture.get(cv2.CAP_PROP_FRAME_COUNT))
fourcc = cv2.VideoWriter_fourcc(*'MP4V')
out = cv2.VideoWriter(os.getcwd() + '/public/data/'+file_name+'.mp4', fourcc, 15, (w, h))
list_file = open(os.getcwd() + '/../deep_sort_yolov4/detection_rslt.txt', 'w')
frame_index = -1
fps = 0.0
test = 1
while True:
ret, frame = video_capture.read() # frame shape 640*480*3
        if not ret:
break
cFrame = int(video_capture.get(cv2.CAP_PROP_POS_MSEC)/76)
#temp = pd.DataFrame({'num':cFrame},index = [0])
#temp.to_csv(os.getcwd() + '/../deep_sort_yolov4/output/temp.csv')
t1 = time.time()
if(test == 1 ):
cv2.imwrite(os.getcwd() + '/../deep_sort_yolov4/output/' + file_name + '_img.png',frame)
test +=1
#image = Image.fromarray(frame)
image = Image.fromarray(frame[...,::-1]) #bgr to rgb
boxs, confidence, class_names = yolo.detect_image(image)
features = encoder(frame,boxs)
        # The detector confidence score is not propagated; every detection gets score 1.0 here.
detections = [Detection(bbox, 1.0, feature) for bbox, feature in zip(boxs, features)]
# Run non-maxima suppression.
boxes = np.array([d.tlwh for d in detections])
scores = np.array([d.confidence for d in detections])
indices = preprocessing.non_max_suppression(boxes, nms_max_overlap, scores)
detections = [detections[i] for i in indices]
cs = float(59)*int(cFrame)/total_frame
t = start + datetime.timedelta(milliseconds=cs*1000)
time_format = "%Y-%m-%d %H:%M:%S.%f"
time_str = t.strftime(time_format)
cTime = time_str
# Call the tracker
tracker.predict()
tracker.update(detections)
i = int(0)
indexIDs = []
c = []
boxes = []
for track in tracker.tracks:
if not track.is_confirmed() or track.time_since_update > 1:
continue
#boxes.append([track[0], track[1], track[2], track[3]])
indexIDs.append(int(track.track_id))
counter.append(int(track.track_id))
bbox = track.to_tlbr()
color = [int(c) for c in COLORS[indexIDs[i] % len(COLORS)]]
#print(frame_index)
list_file.write(str(frame_index)+',')
list_file.write(str(track.track_id)+',')
center = (int(((bbox[0])+(bbox[2]))/2),int(((bbox[1])+(bbox[3]))/2))
r = (int(bbox[2]) - int(bbox[0]))/2
draw_border(frame,(int(((bbox[0])+(bbox[2]))/2) - int(r), int(((bbox[1])+(bbox[3]))/2) - int(r)), (int(((bbox[0])+(bbox[2]))/2) + int(r), int(((bbox[1])+(bbox[3]))/2) + int(r)),(color),4,int(r/5),int(r/5))
#cv2.rectangle(frame, (int(bbox[0]), int(bbox[1])), (int(bbox[2]), int(bbox[3])),(color), 3)
b0 = str(bbox[0])#.split('.')[0] + '.' + str(bbox[0]).split('.')[0][:1]
b1 = str(bbox[1])#.split('.')[0] + '.' + str(bbox[1]).split('.')[0][:1]
b2 = str(bbox[2]-bbox[0])#.split('.')[0] + '.' + str(bbox[3]).split('.')[0][:1]
b3 = str(bbox[3]-bbox[1])
list_file.write(str(b0) + ','+str(b1) + ','+str(b2) + ','+str(b3))
#print(str(track.track_id))
list_file.write('\n')
#list_file.write(str(track.track_id)+',')
#cv2.putText(frame,str(track.track_id),(int(bbox[0]), int(bbox[1] + 30)),0, 5e-3 * 150, (color),2)
cv2.putText(frame,str(track.track_id),(int(((bbox[0])+(bbox[2]))/2), int(((bbox[1])+(bbox[3]))/2 + 30)),0, 5e-3 * 150, (color),2)
if len(class_names) > 0:
class_name = class_names[0]
cv2.putText(frame, str(class_names[0]),(int(((bbox[0])+(bbox[2]))/2), int(((bbox[1])+(bbox[3]))/2 + 10)),0, 5e-3 * 150, (color),2)
i += 1
#bbox_center_point(x,y)
#track_id[center]
x,y = int(((bbox[0])+(bbox[2]))/2),int(((bbox[1])+(bbox[3]))/2)
# x, y
if (cFrame%10==0):
df1 = df1.append({'x':x,'y':y,'id':int(track.track_id),'time':cTime,'s': cs},ignore_index=True)
pts[track.track_id].append(center)
thickness = 5
#center point
#cv2.circle(frame, (center), 1, color, thickness)
# draw motion path
# for j in range(1, len(pts[track.track_id])):
# if pts[track.track_id][j - 1] is None or pts[track.track_id][j] is None:
# continue
# thickness = int(np.sqrt(64 / float(j + 1)) * 2)
#cv2.line(frame,(pts[track.track_id][j-1]), (pts[track.track_id][j]),(color),thickness)
#cv2.putText(frame, str(class_names[j]),(int(bbox[0]), int(bbox[1] -20)),0, 5e-3 * 150, (255,255,255),2)
count = len(set(counter))
if (cFrame%10==0):
df2 = df2.append({'total':count,'now':i,'time':cTime,'s': cs},ignore_index=True)
cv2.putText(frame, "Total Pedestrian Counter: "+str(count),(int(30), int(105)),0, 5e-3 * 150, (0,255,0),2)
cv2.putText(frame, "Current Pedestrian Counter: "+str(i),(int(30), int(80)),0, 5e-3 * 150, (0,255,0),2)
#cv2.putText(frame, "FPS: %f"%(fps),(int(20), int(40)),0, 5e-3 * 200, (0,255,0),3)
cv2.putText(frame, "Time: " + str(cTime),(int(30), int(55)),0, 5e-3 * 150, (0,255,0),3)
        cv2.namedWindow("YOLO4_Deep_SORT", 0)
        cv2.resizeWindow('YOLO4_Deep_SORT', 1024, 768)
cv2.imshow('YOLO4_Deep_SORT', frame)
        if writeVideo_flag:
            # save a frame (write once per loop iteration)
            out.write(frame)
            frame_index = frame_index + 1
        fps = (fps + (1. / (time.time() - t1))) / 2
# Press Q to stop!
if cv2.waitKey(1) & 0xFF == ord('q'):
break
df1.to_json(os.getcwd() + '/../deep_sort_yolov4/output/' + file_name + '_xy.json',orient='table')
df2.to_json(os.getcwd() + '/../deep_sort_yolov4/output/' + file_name + '_count.json',orient='table')
createKmeans(file_name)
print(" ")
print("[Finish]")
end = time.time()
#print("[INFO]: model_image_size = (960, 960)")
video_capture.release()
if writeVideo_flag:
out.release()
list_file.close()
cv2.destroyAllWindows()
if __name__ == '__main__':
main(YOLO())
person
bicycle
car
motorbike
aeroplane
bus
train
truck
boat
traffic light
fire hydrant
stop sign
parking meter
bench
bird
cat
dog
horse
sheep
cow
elephant
bear
zebra
giraffe
backpack
umbrella
handbag
tie
suitcase
frisbee
skis
snowboard
sports ball
kite
baseball bat
baseball glove
skateboard
surfboard
tennis racket
bottle
wine glass
cup
fork
knife
spoon
bowl
banana
apple
sandwich
orange
broccoli
carrot
hot dog
pizza
donut
cake
chair
sofa
pottedplant
bed
diningtable
toilet
tvmonitor
laptop
mouse
remote
keyboard
cell phone
microwave
oven
toaster
sink
refrigerator
book
clock
vase
scissors
teddy bear
hair drier
toothbrush
12, 16, 19, 36, 40, 28, 36, 75, 76, 55, 72, 146, 142, 110, 192, 243, 459, 401
{"schema":{"fields":[{"name":"index","type":"integer"},{"name":"total","type":"string"},{"name":"now","type":"string"},{"name":"time","type":"string"},{"name":"s","type":"number"}],"primaryKey":["index"],"pandas_version":"0.20.0"},"data":[{"index":0,"total":0,"now":0,"time":"2020-06-17 03:27:10.288851","s":0.0},{"index":1,"total":11,"now":7,"time":"2020-06-17 03:27:11.051125","s":0.7622739018},{"index":2,"total":13,"now":8,"time":"2020-06-17 03:27:11.813399","s":1.5245478036},{"index":3,"total":15,"now":10,"time":"2020-06-17 03:27:12.575673","s":2.2868217054},{"index":4,"total":21,"now":13,"time":"2020-06-17 03:27:13.337947","s":3.0490956072},{"index":5,"total":22,"now":13,"time":"2020-06-17 03:27:14.100221","s":3.811369509},{"index":6,"total":23,"now":12,"time":"2020-06-17 03:27:14.862494","s":4.5736434109},{"index":7,"total":24,"now":11,"time":"2020-06-17 03:27:15.624768","s":5.3359173127},{"index":8,"total":24,"now":7,"time":"2020-06-17 03:27:16.387042","s":6.0981912145},{"index":9,"total":25,"now":8,"time":"2020-06-17 03:27:17.149316","s":6.8604651163},{"index":10,"total":26,"now":9,"time":"2020-06-17 03:27:17.911590","s":7.6227390181},{"index":11,"total":26,"now":10,"time":"2020-06-17 03:27:18.673864","s":8.3850129199},{"index":12,"total":26,"now":7,"time":"2020-06-17 03:27:19.436138","s":9.1472868217},{"index":13,"total":29,"now":8,"time":"2020-06-17 03:27:20.198412","s":9.9095607235},{"index":14,"total":31,"now":11,"time":"2020-06-17 03:27:20.960686","s":10.6718346253},{"index":15,"total":31,"now":10,"time":"2020-06-17 03:27:21.722960","s":11.4341085271},{"index":16,"total":32,"now":13,"time":"2020-06-17 03:27:22.485233","s":12.1963824289},{"index":17,"total":33,"now":11,"time":"2020-06-17 03:27:23.247507","s":12.9586563307},{"index":18,"total":33,"now":11,"time":"2020-06-17 03:27:24.009781","s":13.7209302326},{"index":19,"total":35,"now":11,"time":"2020-06-17 03:27:24.772055","s":14.4832041344},{"index":20,"total":37,"now":9,"time":"2020-06-17 03:27:25.534329","s":15.2454780362},{"index":21,"total":38,"now":10,"time":"2020-06-17 03:27:26.296603","s":16.007751938},{"index":22,"total":39,"now":11,"time":"2020-06-17 03:27:27.058877","s":16.7700258398},{"index":23,"total":41,"now":12,"time":"2020-06-17 03:27:27.821151","s":17.5322997416},{"index":24,"total":42,"now":11,"time":"2020-06-17 03:27:28.583425","s":18.2945736434},{"index":25,"total":42,"now":12,"time":"2020-06-17 03:27:29.345699","s":19.0568475452},{"index":26,"total":42,"now":10,"time":"2020-06-17 03:27:30.107972","s":19.819121447},{"index":27,"total":42,"now":8,"time":"2020-06-17 03:27:30.870246","s":20.5813953488},{"index":28,"total":43,"now":9,"time":"2020-06-17 03:27:31.632520","s":21.3436692506},{"index":29,"total":45,"now":11,"time":"2020-06-17 03:27:32.394794","s":22.1059431525},{"index":30,"total":45,"now":9,"time":"2020-06-17 03:27:33.157068","s":22.8682170543},{"index":31,"total":46,"now":11,"time":"2020-06-17 03:27:33.919342","s":23.6304909561},{"index":32,"total":46,"now":11,"time":"2020-06-17 03:27:34.681616","s":24.3927648579},{"index":33,"total":46,"now":11,"time":"2020-06-17 03:27:35.443890","s":25.1550387597},{"index":34,"total":47,"now":10,"time":"2020-06-17 03:27:36.206164","s":25.9173126615},{"index":35,"total":47,"now":9,"time":"2020-06-17 03:27:36.968438","s":26.6795865633},{"index":36,"total":48,"now":10,"time":"2020-06-17 03:27:37.730711","s":27.4418604651},{"index":37,"total":48,"now":10,"time":"2020-06-17 03:27:38.492985","s":28.2041343669},{"index":38,"total":50,"now":10,"time":"2020-06-17 
03:27:39.255259","s":28.9664082687},{"index":39,"total":51,"now":10,"time":"2020-06-17 03:27:40.017533","s":29.7286821705},{"index":40,"total":52,"now":9,"time":"2020-06-17 03:27:40.779807","s":30.4909560724},{"index":41,"total":52,"now":8,"time":"2020-06-17 03:27:41.542081","s":31.2532299742},{"index":42,"total":53,"now":7,"time":"2020-06-17 03:27:42.304355","s":32.015503876},{"index":43,"total":55,"now":8,"time":"2020-06-17 03:27:43.066629","s":32.7777777778},{"index":44,"total":56,"now":8,"time":"2020-06-17 03:27:43.828903","s":33.5400516796},{"index":45,"total":56,"now":10,"time":"2020-06-17 03:27:44.591177","s":34.3023255814},{"index":46,"total":56,"now":9,"time":"2020-06-17 03:27:45.353450","s":35.0645994832},{"index":47,"total":57,"now":10,"time":"2020-06-17 03:27:46.115724","s":35.826873385},{"index":48,"total":57,"now":8,"time":"2020-06-17 03:27:46.877998","s":36.5891472868},{"index":49,"total":58,"now":10,"time":"2020-06-17 03:27:47.640272","s":37.3514211886},{"index":50,"total":58,"now":8,"time":"2020-06-17 03:27:48.402546","s":38.1136950904},{"index":51,"total":60,"now":8,"time":"2020-06-17 03:27:49.164820","s":38.8759689922},{"index":52,"total":63,"now":12,"time":"2020-06-17 03:27:49.927094","s":39.6382428941},{"index":53,"total":63,"now":12,"time":"2020-06-17 03:27:50.689368","s":40.4005167959},{"index":54,"total":63,"now":8,"time":"2020-06-17 03:27:51.451642","s":41.1627906977},{"index":55,"total":63,"now":10,"time":"2020-06-17 03:27:52.213916","s":41.9250645995},{"index":56,"total":64,"now":11,"time":"2020-06-17 03:27:52.976190","s":42.6873385013},{"index":57,"total":64,"now":9,"time":"2020-06-17 03:27:53.738463","s":43.4496124031},{"index":58,"total":64,"now":10,"time":"2020-06-17 03:27:54.500737","s":44.2118863049},{"index":59,"total":64,"now":9,"time":"2020-06-17 03:27:55.263011","s":44.9741602067},{"index":60,"total":64,"now":8,"time":"2020-06-17 03:27:56.025285","s":45.7364341085},{"index":61,"total":64,"now":9,"time":"2020-06-17 03:27:56.787559","s":46.4987080103},{"index":62,"total":65,"now":8,"time":"2020-06-17 03:27:57.549833","s":47.2609819121},{"index":63,"total":66,"now":9,"time":"2020-06-17 03:27:58.312107","s":48.023255814},{"index":64,"total":67,"now":10,"time":"2020-06-17 03:27:59.074381","s":48.7855297158},{"index":65,"total":68,"now":10,"time":"2020-06-17 03:27:59.836655","s":49.5478036176},{"index":66,"total":68,"now":9,"time":"2020-06-17 03:28:00.598929","s":50.3100775194},{"index":67,"total":69,"now":9,"time":"2020-06-17 03:28:01.361202","s":51.0723514212},{"index":68,"total":69,"now":8,"time":"2020-06-17 03:28:02.123476","s":51.834625323},{"index":69,"total":69,"now":10,"time":"2020-06-17 03:28:02.885750","s":52.5968992248},{"index":70,"total":70,"now":8,"time":"2020-06-17 03:28:03.648024","s":53.3591731266},{"index":71,"total":71,"now":9,"time":"2020-06-17 03:28:04.410298","s":54.1214470284},{"index":72,"total":73,"now":11,"time":"2020-06-17 03:28:05.172572","s":54.8837209302},{"index":73,"total":73,"now":10,"time":"2020-06-17 03:28:05.934846","s":55.645994832},{"index":74,"total":74,"now":9,"time":"2020-06-17 03:28:06.697120","s":56.4082687339},{"index":75,"total":75,"now":10,"time":"2020-06-17 03:28:07.459394","s":57.1705426357},{"index":76,"total":75,"now":7,"time":"2020-06-17 03:28:08.221668","s":57.9328165375},{"index":77,"total":77,"now":10,"time":"2020-06-17 03:28:08.983941","s":58.6950904393}]}
\ No newline at end of file
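The file above is pandas' `to_json(orient="table")` format, so it round-trips straight back into a DataFrame. A minimal sketch of reading the counting log, assuming it is saved as `total_count.json` (the real path is not shown in this hunk):

```
import pandas as pd

# Load the table-schema JSON; orient="table" reads the embedded schema
# and uses "index" (the declared primaryKey) as the DataFrame index.
df = pd.read_json("total_count.json", orient="table")  # assumed filename

# "total" and "now" are declared as strings in the schema; cast for math.
df[["total", "now"]] = df[["total", "now"]].astype(int)

print(df["total"].iloc[-1])   # cumulative objects counted: 77
print(df["now"].max())        # peak simultaneous objects in frame: 13
```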
{"schema":{"fields":[{"name":"index","type":"integer"},{"name":"x","type":"string"},{"name":"y","type":"string"},{"name":"id","type":"string"},{"name":"time","type":"string"},{"name":"s","type":"number"}],"primaryKey":["index"],"pandas_version":"0.20.0"},"data":[{"index":0,"x":902,"y":316,"id":1,"time":"2020-06-17 03:27:11.051125","s":0.7622739018},{"index":1,"x":747,"y":541,"id":2,"time":"2020-06-17 03:27:11.051125","s":0.7622739018},{"index":2,"x":75,"y":354,"id":3,"time":"2020-06-17 03:27:11.051125","s":0.7622739018},{"index":3,"x":918,"y":559,"id":4,"time":"2020-06-17 03:27:11.051125","s":0.7622739018},{"index":4,"x":652,"y":104,"id":5,"time":"2020-06-17 03:27:11.051125","s":0.7622739018},{"index":5,"x":288,"y":205,"id":7,"time":"2020-06-17 03:27:11.051125","s":0.7622739018},{"index":6,"x":290,"y":110,"id":11,"time":"2020-06-17 03:27:11.051125","s":0.7622739018},{"index":7,"x":919,"y":296,"id":1,"time":"2020-06-17 03:27:11.813399","s":1.5245478036},{"index":8,"x":766,"y":470,"id":2,"time":"2020-06-17 03:27:11.813399","s":1.5245478036},{"index":9,"x":75,"y":352,"id":3,"time":"2020-06-17 03:27:11.813399","s":1.5245478036},{"index":10,"x":927,"y":551,"id":4,"time":"2020-06-17 03:27:11.813399","s":1.5245478036},{"index":11,"x":655,"y":104,"id":5,"time":"2020-06-17 03:27:11.813399","s":1.5245478036},{"index":12,"x":285,"y":202,"id":7,"time":"2020-06-17 03:27:11.813399","s":1.5245478036},{"index":13,"x":364,"y":188,"id":8,"time":"2020-06-17 03:27:11.813399","s":1.5245478036},{"index":14,"x":112,"y":250,"id":14,"time":"2020-06-17 03:27:11.813399","s":1.5245478036},{"index":15,"x":903,"y":281,"id":1,"time":"2020-06-17 03:27:12.575673","s":2.2868217054},{"index":16,"x":827,"y":435,"id":2,"time":"2020-06-17 03:27:12.575673","s":2.2868217054},{"index":17,"x":73,"y":351,"id":3,"time":"2020-06-17 03:27:12.575673","s":2.2868217054},{"index":18,"x":926,"y":569,"id":4,"time":"2020-06-17 03:27:12.575673","s":2.2868217054},{"index":19,"x":653,"y":104,"id":5,"time":"2020-06-17 03:27:12.575673","s":2.2868217054},{"index":20,"x":273,"y":188,"id":7,"time":"2020-06-17 03:27:12.575673","s":2.2868217054},{"index":21,"x":438,"y":205,"id":8,"time":"2020-06-17 03:27:12.575673","s":2.2868217054},{"index":22,"x":139,"y":248,"id":14,"time":"2020-06-17 03:27:12.575673","s":2.2868217054},{"index":23,"x":307,"y":143,"id":20,"time":"2020-06-17 03:27:12.575673","s":2.2868217054},{"index":24,"x":1051,"y":567,"id":24,"time":"2020-06-17 03:27:12.575673","s":2.2868217054},{"index":25,"x":901,"y":227,"id":1,"time":"2020-06-17 03:27:13.337947","s":3.0490956072},{"index":26,"x":81,"y":349,"id":3,"time":"2020-06-17 03:27:13.337947","s":3.0490956072},{"index":27,"x":867,"y":575,"id":4,"time":"2020-06-17 03:27:13.337947","s":3.0490956072},{"index":28,"x":650,"y":104,"id":5,"time":"2020-06-17 03:27:13.337947","s":3.0490956072},{"index":29,"x":275,"y":193,"id":7,"time":"2020-06-17 03:27:13.337947","s":3.0490956072},{"index":30,"x":446,"y":206,"id":8,"time":"2020-06-17 03:27:13.337947","s":3.0490956072},{"index":31,"x":141,"y":248,"id":14,"time":"2020-06-17 03:27:13.337947","s":3.0490956072},{"index":32,"x":293,"y":116,"id":20,"time":"2020-06-17 03:27:13.337947","s":3.0490956072},{"index":33,"x":390,"y":180,"id":26,"time":"2020-06-17 03:27:13.337947","s":3.0490956072},{"index":34,"x":836,"y":406,"id":27,"time":"2020-06-17 03:27:13.337947","s":3.0490956072},{"index":35,"x":394,"y":53,"id":29,"time":"2020-06-17 03:27:13.337947","s":3.0490956072},{"index":36,"x":1014,"y":582,"id":30,"time":"2020-06-17 
03:27:13.337947","s":3.0490956072},{"index":37,"x":947,"y":337,"id":32,"time":"2020-06-17 03:27:13.337947","s":3.0490956072},{"index":38,"x":900,"y":274,"id":1,"time":"2020-06-17 03:27:14.100221","s":3.811369509},{"index":39,"x":1013,"y":375,"id":2,"time":"2020-06-17 03:27:14.100221","s":3.811369509},{"index":40,"x":88,"y":341,"id":3,"time":"2020-06-17 03:27:14.100221","s":3.811369509},{"index":41,"x":794,"y":560,"id":4,"time":"2020-06-17 03:27:14.100221","s":3.811369509},{"index":42,"x":654,"y":105,"id":5,"time":"2020-06-17 03:27:14.100221","s":3.811369509},{"index":43,"x":295,"y":236,"id":7,"time":"2020-06-17 03:27:14.100221","s":3.811369509},{"index":44,"x":447,"y":205,"id":8,"time":"2020-06-17 03:27:14.100221","s":3.811369509},{"index":45,"x":111,"y":202,"id":14,"time":"2020-06-17 03:27:14.100221","s":3.811369509},{"index":46,"x":271,"y":97,"id":20,"time":"2020-06-17 03:27:14.100221","s":3.811369509},{"index":47,"x":387,"y":149,"id":26,"time":"2020-06-17 03:27:14.100221","s":3.811369509},{"index":48,"x":917,"y":385,"id":27,"time":"2020-06-17 03:27:14.100221","s":3.811369509},{"index":49,"x":1006,"y":569,"id":30,"time":"2020-06-17 03:27:14.100221","s":3.811369509},{"index":50,"x":340,"y":117,"id":34,"time":"2020-06-17 03:27:14.100221","s":3.811369509},{"index":51,"x":902,"y":294,"id":1,"time":"2020-06-17 03:27:14.862494","s":4.5736434109},{"index":52,"x":92,"y":298,"id":3,"time":"2020-06-17 03:27:14.862494","s":4.5736434109},{"index":53,"x":746,"y":566,"id":4,"time":"2020-06-17 03:27:14.862494","s":4.5736434109},{"index":54,"x":653,"y":107,"id":5,"time":"2020-06-17 03:27:14.862494","s":4.5736434109},{"index":55,"x":296,"y":232,"id":7,"time":"2020-06-17 03:27:14.862494","s":4.5736434109},{"index":56,"x":394,"y":198,"id":8,"time":"2020-06-17 03:27:14.862494","s":4.5736434109},{"index":57,"x":105,"y":204,"id":14,"time":"2020-06-17 03:27:14.862494","s":4.5736434109},{"index":58,"x":271,"y":105,"id":20,"time":"2020-06-17 03:27:14.862494","s":4.5736434109},{"index":59,"x":972,"y":351,"id":27,"time":"2020-06-17 03:27:14.862494","s":4.5736434109},{"index":60,"x":1015,"y":506,"id":30,"time":"2020-06-17 03:27:14.862494","s":4.5736434109},{"index":61,"x":336,"y":116,"id":34,"time":"2020-06-17 03:27:14.862494","s":4.5736434109},{"index":62,"x":223,"y":186,"id":36,"time":"2020-06-17 03:27:14.862494","s":4.5736434109},{"index":63,"x":904,"y":296,"id":1,"time":"2020-06-17 03:27:15.624768","s":5.3359173127},{"index":64,"x":96,"y":322,"id":3,"time":"2020-06-17 03:27:15.624768","s":5.3359173127},{"index":65,"x":741,"y":567,"id":4,"time":"2020-06-17 03:27:15.624768","s":5.3359173127},{"index":66,"x":651,"y":105,"id":5,"time":"2020-06-17 03:27:15.624768","s":5.3359173127},{"index":67,"x":300,"y":231,"id":7,"time":"2020-06-17 03:27:15.624768","s":5.3359173127},{"index":68,"x":387,"y":173,"id":8,"time":"2020-06-17 03:27:15.624768","s":5.3359173127},{"index":69,"x":108,"y":202,"id":14,"time":"2020-06-17 03:27:15.624768","s":5.3359173127},{"index":70,"x":268,"y":89,"id":20,"time":"2020-06-17 03:27:15.624768","s":5.3359173127},{"index":71,"x":1052,"y":446,"id":30,"time":"2020-06-17 03:27:15.624768","s":5.3359173127},{"index":72,"x":225,"y":187,"id":36,"time":"2020-06-17 03:27:15.624768","s":5.3359173127},{"index":73,"x":476,"y":70,"id":41,"time":"2020-06-17 03:27:15.624768","s":5.3359173127},{"index":74,"x":904,"y":296,"id":1,"time":"2020-06-17 03:27:16.387042","s":6.0981912145},{"index":75,"x":112,"y":319,"id":3,"time":"2020-06-17 
03:27:16.387042","s":6.0981912145},{"index":76,"x":748,"y":568,"id":4,"time":"2020-06-17 03:27:16.387042","s":6.0981912145},{"index":77,"x":632,"y":100,"id":5,"time":"2020-06-17 03:27:16.387042","s":6.0981912145},{"index":78,"x":263,"y":237,"id":7,"time":"2020-06-17 03:27:16.387042","s":6.0981912145},{"index":79,"x":1079,"y":420,"id":30,"time":"2020-06-17 03:27:16.387042","s":6.0981912145},{"index":80,"x":276,"y":97,"id":34,"time":"2020-06-17 03:27:16.387042","s":6.0981912145},{"index":81,"x":897,"y":284,"id":1,"time":"2020-06-17 03:27:17.149316","s":6.8604651163},{"index":82,"x":118,"y":319,"id":3,"time":"2020-06-17 03:27:17.149316","s":6.8604651163},{"index":83,"x":723,"y":572,"id":4,"time":"2020-06-17 03:27:17.149316","s":6.8604651163},{"index":84,"x":637,"y":102,"id":5,"time":"2020-06-17 03:27:17.149316","s":6.8604651163},{"index":85,"x":251,"y":236,"id":7,"time":"2020-06-17 03:27:17.149316","s":6.8604651163},{"index":86,"x":325,"y":148,"id":8,"time":"2020-06-17 03:27:17.149316","s":6.8604651163},{"index":87,"x":1087,"y":416,"id":30,"time":"2020-06-17 03:27:17.149316","s":6.8604651163},{"index":88,"x":276,"y":110,"id":34,"time":"2020-06-17 03:27:17.149316","s":6.8604651163},{"index":89,"x":915,"y":286,"id":1,"time":"2020-06-17 03:27:17.911590","s":7.6227390181},{"index":90,"x":120,"y":318,"id":3,"time":"2020-06-17 03:27:17.911590","s":7.6227390181},{"index":91,"x":740,"y":572,"id":4,"time":"2020-06-17 03:27:17.911590","s":7.6227390181},{"index":92,"x":614,"y":101,"id":5,"time":"2020-06-17 03:27:17.911590","s":7.6227390181},{"index":93,"x":255,"y":202,"id":7,"time":"2020-06-17 03:27:17.911590","s":7.6227390181},{"index":94,"x":314,"y":143,"id":8,"time":"2020-06-17 03:27:17.911590","s":7.6227390181},{"index":95,"x":1058,"y":406,"id":30,"time":"2020-06-17 03:27:17.911590","s":7.6227390181},{"index":96,"x":383,"y":104,"id":46,"time":"2020-06-17 03:27:17.911590","s":7.6227390181},{"index":97,"x":326,"y":112,"id":49,"time":"2020-06-17 03:27:17.911590","s":7.6227390181},{"index":98,"x":904,"y":274,"id":1,"time":"2020-06-17 03:27:18.673864","s":8.3850129199},{"index":99,"x":120,"y":318,"id":3,"time":"2020-06-17 03:27:18.673864","s":8.3850129199},{"index":100,"x":725,"y":576,"id":4,"time":"2020-06-17 03:27:18.673864","s":8.3850129199},{"index":101,"x":640,"y":109,"id":5,"time":"2020-06-17 03:27:18.673864","s":8.3850129199},{"index":102,"x":259,"y":204,"id":7,"time":"2020-06-17 03:27:18.673864","s":8.3850129199},{"index":103,"x":760,"y":183,"id":27,"time":"2020-06-17 03:27:18.673864","s":8.3850129199},{"index":104,"x":1042,"y":371,"id":30,"time":"2020-06-17 03:27:18.673864","s":8.3850129199},{"index":105,"x":284,"y":119,"id":34,"time":"2020-06-17 03:27:18.673864","s":8.3850129199},{"index":106,"x":382,"y":105,"id":46,"time":"2020-06-17 03:27:18.673864","s":8.3850129199},{"index":107,"x":324,"y":119,"id":49,"time":"2020-06-17 03:27:18.673864","s":8.3850129199},{"index":108,"x":872,"y":247,"id":1,"time":"2020-06-17 03:27:19.436138","s":9.1472868217},{"index":109,"x":120,"y":317,"id":3,"time":"2020-06-17 03:27:19.436138","s":9.1472868217},{"index":110,"x":726,"y":574,"id":4,"time":"2020-06-17 03:27:19.436138","s":9.1472868217},{"index":111,"x":693,"y":114,"id":5,"time":"2020-06-17 03:27:19.436138","s":9.1472868217},{"index":112,"x":278,"y":184,"id":7,"time":"2020-06-17 03:27:19.436138","s":9.1472868217},{"index":113,"x":1006,"y":339,"id":30,"time":"2020-06-17 03:27:19.436138","s":9.1472868217},{"index":114,"x":381,"y":107,"id":46,"time":"2020-06-17 
03:27:19.436138","s":9.1472868217},{"index":115,"x":845,"y":190,"id":1,"time":"2020-06-17 03:27:20.198412","s":9.9095607235},{"index":116,"x":116,"y":314,"id":3,"time":"2020-06-17 03:27:20.198412","s":9.9095607235},{"index":117,"x":719,"y":570,"id":4,"time":"2020-06-17 03:27:20.198412","s":9.9095607235},{"index":118,"x":750,"y":132,"id":5,"time":"2020-06-17 03:27:20.198412","s":9.9095607235},{"index":119,"x":958,"y":310,"id":30,"time":"2020-06-17 03:27:20.198412","s":9.9095607235},{"index":120,"x":272,"y":96,"id":34,"time":"2020-06-17 03:27:20.198412","s":9.9095607235},{"index":121,"x":381,"y":116,"id":46,"time":"2020-06-17 03:27:20.198412","s":9.9095607235},{"index":122,"x":675,"y":120,"id":60,"time":"2020-06-17 03:27:20.198412","s":9.9095607235},{"index":123,"x":850,"y":181,"id":1,"time":"2020-06-17 03:27:20.960686","s":10.6718346253},{"index":124,"x":116,"y":326,"id":3,"time":"2020-06-17 03:27:20.960686","s":10.6718346253},{"index":125,"x":715,"y":569,"id":4,"time":"2020-06-17 03:27:20.960686","s":10.6718346253},{"index":126,"x":764,"y":163,"id":5,"time":"2020-06-17 03:27:20.960686","s":10.6718346253},{"index":127,"x":277,"y":184,"id":7,"time":"2020-06-17 03:27:20.960686","s":10.6718346253},{"index":128,"x":955,"y":304,"id":30,"time":"2020-06-17 03:27:20.960686","s":10.6718346253},{"index":129,"x":275,"y":91,"id":34,"time":"2020-06-17 03:27:20.960686","s":10.6718346253},{"index":130,"x":381,"y":117,"id":46,"time":"2020-06-17 03:27:20.960686","s":10.6718346253},{"index":131,"x":328,"y":122,"id":49,"time":"2020-06-17 03:27:20.960686","s":10.6718346253},{"index":132,"x":586,"y":62,"id":57,"time":"2020-06-17 03:27:20.960686","s":10.6718346253},{"index":133,"x":280,"y":228,"id":58,"time":"2020-06-17 03:27:20.960686","s":10.6718346253},{"index":134,"x":828,"y":169,"id":1,"time":"2020-06-17 03:27:21.722960","s":11.4341085271},{"index":135,"x":117,"y":317,"id":3,"time":"2020-06-17 03:27:21.722960","s":11.4341085271},{"index":136,"x":720,"y":569,"id":4,"time":"2020-06-17 03:27:21.722960","s":11.4341085271},{"index":137,"x":857,"y":236,"id":5,"time":"2020-06-17 03:27:21.722960","s":11.4341085271},{"index":138,"x":958,"y":307,"id":30,"time":"2020-06-17 03:27:21.722960","s":11.4341085271},{"index":139,"x":275,"y":95,"id":34,"time":"2020-06-17 03:27:21.722960","s":11.4341085271},{"index":140,"x":379,"y":117,"id":46,"time":"2020-06-17 03:27:21.722960","s":11.4341085271},{"index":141,"x":328,"y":115,"id":49,"time":"2020-06-17 03:27:21.722960","s":11.4341085271},{"index":142,"x":650,"y":84,"id":57,"time":"2020-06-17 03:27:21.722960","s":11.4341085271},{"index":143,"x":280,"y":195,"id":58,"time":"2020-06-17 03:27:21.722960","s":11.4341085271},{"index":144,"x":823,"y":171,"id":1,"time":"2020-06-17 03:27:22.485233","s":12.1963824289},{"index":145,"x":115,"y":332,"id":3,"time":"2020-06-17 03:27:22.485233","s":12.1963824289},{"index":146,"x":731,"y":577,"id":4,"time":"2020-06-17 03:27:22.485233","s":12.1963824289},{"index":147,"x":913,"y":316,"id":5,"time":"2020-06-17 03:27:22.485233","s":12.1963824289},{"index":148,"x":976,"y":307,"id":30,"time":"2020-06-17 03:27:22.485233","s":12.1963824289},{"index":149,"x":275,"y":85,"id":34,"time":"2020-06-17 03:27:22.485233","s":12.1963824289},{"index":150,"x":378,"y":130,"id":46,"time":"2020-06-17 03:27:22.485233","s":12.1963824289},{"index":151,"x":326,"y":114,"id":49,"time":"2020-06-17 03:27:22.485233","s":12.1963824289},{"index":152,"x":725,"y":129,"id":57,"time":"2020-06-17 
03:27:22.485233","s":12.1963824289},{"index":153,"x":282,"y":191,"id":58,"time":"2020-06-17 03:27:22.485233","s":12.1963824289},{"index":154,"x":661,"y":131,"id":60,"time":"2020-06-17 03:27:22.485233","s":12.1963824289},{"index":155,"x":216,"y":178,"id":62,"time":"2020-06-17 03:27:22.485233","s":12.1963824289},{"index":156,"x":1090,"y":569,"id":65,"time":"2020-06-17 03:27:22.485233","s":12.1963824289},{"index":157,"x":116,"y":317,"id":3,"time":"2020-06-17 03:27:23.247507","s":12.9586563307},{"index":158,"x":792,"y":553,"id":4,"time":"2020-06-17 03:27:23.247507","s":12.9586563307},{"index":159,"x":977,"y":403,"id":5,"time":"2020-06-17 03:27:23.247507","s":12.9586563307},{"index":160,"x":283,"y":231,"id":7,"time":"2020-06-17 03:27:23.247507","s":12.9586563307},{"index":161,"x":278,"y":96,"id":34,"time":"2020-06-17 03:27:23.247507","s":12.9586563307},{"index":162,"x":373,"y":120,"id":46,"time":"2020-06-17 03:27:23.247507","s":12.9586563307},{"index":163,"x":326,"y":132,"id":49,"time":"2020-06-17 03:27:23.247507","s":12.9586563307},{"index":164,"x":845,"y":198,"id":57,"time":"2020-06-17 03:27:23.247507","s":12.9586563307},{"index":165,"x":677,"y":117,"id":60,"time":"2020-06-17 03:27:23.247507","s":12.9586563307},{"index":166,"x":1027,"y":557,"id":65,"time":"2020-06-17 03:27:23.247507","s":12.9586563307},{"index":167,"x":85,"y":186,"id":67,"time":"2020-06-17 03:27:23.247507","s":12.9586563307},{"index":168,"x":766,"y":148,"id":1,"time":"2020-06-17 03:27:24.009781","s":13.7209302326},{"index":169,"x":112,"y":320,"id":3,"time":"2020-06-17 03:27:24.009781","s":13.7209302326},{"index":170,"x":808,"y":511,"id":4,"time":"2020-06-17 03:27:24.009781","s":13.7209302326},{"index":171,"x":1172,"y":458,"id":5,"time":"2020-06-17 03:27:24.009781","s":13.7209302326},{"index":172,"x":274,"y":230,"id":7,"time":"2020-06-17 03:27:24.009781","s":13.7209302326},{"index":173,"x":276,"y":85,"id":34,"time":"2020-06-17 03:27:24.009781","s":13.7209302326},{"index":174,"x":373,"y":121,"id":46,"time":"2020-06-17 03:27:24.009781","s":13.7209302326},{"index":175,"x":328,"y":117,"id":49,"time":"2020-06-17 03:27:24.009781","s":13.7209302326},{"index":176,"x":904,"y":272,"id":57,"time":"2020-06-17 03:27:24.009781","s":13.7209302326},{"index":177,"x":676,"y":116,"id":60,"time":"2020-06-17 03:27:24.009781","s":13.7209302326},{"index":178,"x":1083,"y":571,"id":65,"time":"2020-06-17 03:27:24.009781","s":13.7209302326},{"index":179,"x":772,"y":141,"id":1,"time":"2020-06-17 03:27:24.772055","s":14.4832041344},{"index":180,"x":95,"y":328,"id":3,"time":"2020-06-17 03:27:24.772055","s":14.4832041344},{"index":181,"x":812,"y":468,"id":4,"time":"2020-06-17 03:27:24.772055","s":14.4832041344},{"index":182,"x":275,"y":234,"id":7,"time":"2020-06-17 03:27:24.772055","s":14.4832041344},{"index":183,"x":281,"y":100,"id":34,"time":"2020-06-17 03:27:24.772055","s":14.4832041344},{"index":184,"x":371,"y":121,"id":46,"time":"2020-06-17 03:27:24.772055","s":14.4832041344},{"index":185,"x":322,"y":134,"id":49,"time":"2020-06-17 03:27:24.772055","s":14.4832041344},{"index":186,"x":882,"y":246,"id":57,"time":"2020-06-17 03:27:24.772055","s":14.4832041344},{"index":187,"x":661,"y":115,"id":60,"time":"2020-06-17 03:27:24.772055","s":14.4832041344},{"index":188,"x":902,"y":586,"id":65,"time":"2020-06-17 03:27:24.772055","s":14.4832041344},{"index":189,"x":93,"y":199,"id":67,"time":"2020-06-17 03:27:24.772055","s":14.4832041344},{"index":190,"x":778,"y":134,"id":1,"time":"2020-06-17 
03:27:25.534329","s":15.2454780362},{"index":191,"x":838,"y":427,"id":4,"time":"2020-06-17 03:27:25.534329","s":15.2454780362},{"index":192,"x":277,"y":99,"id":34,"time":"2020-06-17 03:27:25.534329","s":15.2454780362},{"index":193,"x":369,"y":121,"id":46,"time":"2020-06-17 03:27:25.534329","s":15.2454780362},{"index":194,"x":319,"y":114,"id":49,"time":"2020-06-17 03:27:25.534329","s":15.2454780362},{"index":195,"x":704,"y":338,"id":57,"time":"2020-06-17 03:27:25.534329","s":15.2454780362},{"index":196,"x":642,"y":116,"id":60,"time":"2020-06-17 03:27:25.534329","s":15.2454780362},{"index":197,"x":815,"y":642,"id":65,"time":"2020-06-17 03:27:25.534329","s":15.2454780362},{"index":198,"x":850,"y":185,"id":79,"time":"2020-06-17 03:27:25.534329","s":15.2454780362},{"index":199,"x":775,"y":138,"id":1,"time":"2020-06-17 03:27:26.296603","s":16.007751938},{"index":200,"x":77,"y":321,"id":3,"time":"2020-06-17 03:27:26.296603","s":16.007751938},{"index":201,"x":843,"y":398,"id":4,"time":"2020-06-17 03:27:26.296603","s":16.007751938},{"index":202,"x":271,"y":184,"id":7,"time":"2020-06-17 03:27:26.296603","s":16.007751938},{"index":203,"x":278,"y":103,"id":34,"time":"2020-06-17 03:27:26.296603","s":16.007751938},{"index":204,"x":370,"y":120,"id":46,"time":"2020-06-17 03:27:26.296603","s":16.007751938},{"index":205,"x":649,"y":117,"id":60,"time":"2020-06-17 03:27:26.296603","s":16.007751938},{"index":206,"x":677,"y":636,"id":65,"time":"2020-06-17 03:27:26.296603","s":16.007751938},{"index":207,"x":850,"y":170,"id":79,"time":"2020-06-17 03:27:26.296603","s":16.007751938},{"index":208,"x":213,"y":149,"id":81,"time":"2020-06-17 03:27:26.296603","s":16.007751938},{"index":209,"x":762,"y":140,"id":1,"time":"2020-06-17 03:27:27.058877","s":16.7700258398},{"index":210,"x":80,"y":327,"id":3,"time":"2020-06-17 03:27:27.058877","s":16.7700258398},{"index":211,"x":892,"y":351,"id":4,"time":"2020-06-17 03:27:27.058877","s":16.7700258398},{"index":212,"x":277,"y":190,"id":7,"time":"2020-06-17 03:27:27.058877","s":16.7700258398},{"index":213,"x":371,"y":128,"id":46,"time":"2020-06-17 03:27:27.058877","s":16.7700258398},{"index":214,"x":653,"y":116,"id":60,"time":"2020-06-17 03:27:27.058877","s":16.7700258398},{"index":215,"x":653,"y":656,"id":65,"time":"2020-06-17 03:27:27.058877","s":16.7700258398},{"index":216,"x":93,"y":200,"id":67,"time":"2020-06-17 03:27:27.058877","s":16.7700258398},{"index":217,"x":855,"y":203,"id":79,"time":"2020-06-17 03:27:27.058877","s":16.7700258398},{"index":218,"x":205,"y":153,"id":81,"time":"2020-06-17 03:27:27.058877","s":16.7700258398},{"index":219,"x":269,"y":451,"id":82,"time":"2020-06-17 03:27:27.058877","s":16.7700258398},{"index":220,"x":759,"y":145,"id":1,"time":"2020-06-17 03:27:27.821151","s":17.5322997416},{"index":221,"x":73,"y":294,"id":3,"time":"2020-06-17 03:27:27.821151","s":17.5322997416},{"index":222,"x":886,"y":302,"id":4,"time":"2020-06-17 03:27:27.821151","s":17.5322997416},{"index":223,"x":271,"y":182,"id":7,"time":"2020-06-17 03:27:27.821151","s":17.5322997416},{"index":224,"x":322,"y":141,"id":34,"time":"2020-06-17 03:27:27.821151","s":17.5322997416},{"index":225,"x":371,"y":120,"id":46,"time":"2020-06-17 03:27:27.821151","s":17.5322997416},{"index":226,"x":661,"y":116,"id":60,"time":"2020-06-17 03:27:27.821151","s":17.5322997416},{"index":227,"x":88,"y":185,"id":67,"time":"2020-06-17 03:27:27.821151","s":17.5322997416},{"index":228,"x":850,"y":179,"id":79,"time":"2020-06-17 
03:27:27.821151","s":17.5322997416},{"index":229,"x":184,"y":489,"id":82,"time":"2020-06-17 03:27:27.821151","s":17.5322997416},{"index":230,"x":693,"y":671,"id":84,"time":"2020-06-17 03:27:27.821151","s":17.5322997416},{"index":231,"x":279,"y":109,"id":86,"time":"2020-06-17 03:27:27.821151","s":17.5322997416},{"index":232,"x":760,"y":146,"id":1,"time":"2020-06-17 03:27:28.583425","s":18.2945736434},{"index":233,"x":69,"y":321,"id":3,"time":"2020-06-17 03:27:28.583425","s":18.2945736434},{"index":234,"x":893,"y":275,"id":4,"time":"2020-06-17 03:27:28.583425","s":18.2945736434},{"index":235,"x":373,"y":120,"id":46,"time":"2020-06-17 03:27:28.583425","s":18.2945736434},{"index":236,"x":102,"y":197,"id":67,"time":"2020-06-17 03:27:28.583425","s":18.2945736434},{"index":237,"x":689,"y":145,"id":76,"time":"2020-06-17 03:27:28.583425","s":18.2945736434},{"index":238,"x":830,"y":196,"id":79,"time":"2020-06-17 03:27:28.583425","s":18.2945736434},{"index":239,"x":209,"y":538,"id":82,"time":"2020-06-17 03:27:28.583425","s":18.2945736434},{"index":240,"x":652,"y":670,"id":84,"time":"2020-06-17 03:27:28.583425","s":18.2945736434},{"index":241,"x":280,"y":107,"id":86,"time":"2020-06-17 03:27:28.583425","s":18.2945736434},{"index":242,"x":252,"y":143,"id":87,"time":"2020-06-17 03:27:28.583425","s":18.2945736434},{"index":243,"x":757,"y":148,"id":1,"time":"2020-06-17 03:27:29.345699","s":19.0568475452},{"index":244,"x":76,"y":349,"id":3,"time":"2020-06-17 03:27:29.345699","s":19.0568475452},{"index":245,"x":844,"y":203,"id":4,"time":"2020-06-17 03:27:29.345699","s":19.0568475452},{"index":246,"x":253,"y":186,"id":7,"time":"2020-06-17 03:27:29.345699","s":19.0568475452},{"index":247,"x":319,"y":114,"id":34,"time":"2020-06-17 03:27:29.345699","s":19.0568475452},{"index":248,"x":373,"y":119,"id":46,"time":"2020-06-17 03:27:29.345699","s":19.0568475452},{"index":249,"x":105,"y":213,"id":67,"time":"2020-06-17 03:27:29.345699","s":19.0568475452},{"index":250,"x":694,"y":146,"id":76,"time":"2020-06-17 03:27:29.345699","s":19.0568475452},{"index":251,"x":214,"y":599,"id":82,"time":"2020-06-17 03:27:29.345699","s":19.0568475452},{"index":252,"x":654,"y":672,"id":84,"time":"2020-06-17 03:27:29.345699","s":19.0568475452},{"index":253,"x":278,"y":106,"id":86,"time":"2020-06-17 03:27:29.345699","s":19.0568475452},{"index":254,"x":246,"y":145,"id":87,"time":"2020-06-17 03:27:29.345699","s":19.0568475452},{"index":255,"x":749,"y":149,"id":1,"time":"2020-06-17 03:27:30.107972","s":19.819121447},{"index":256,"x":79,"y":354,"id":3,"time":"2020-06-17 03:27:30.107972","s":19.819121447},{"index":257,"x":820,"y":178,"id":4,"time":"2020-06-17 03:27:30.107972","s":19.819121447},{"index":258,"x":253,"y":198,"id":7,"time":"2020-06-17 03:27:30.107972","s":19.819121447},{"index":259,"x":273,"y":105,"id":34,"time":"2020-06-17 03:27:30.107972","s":19.819121447},{"index":260,"x":372,"y":121,"id":46,"time":"2020-06-17 03:27:30.107972","s":19.819121447},{"index":261,"x":108,"y":208,"id":67,"time":"2020-06-17 03:27:30.107972","s":19.819121447},{"index":262,"x":212,"y":592,"id":82,"time":"2020-06-17 03:27:30.107972","s":19.819121447},{"index":263,"x":285,"y":138,"id":86,"time":"2020-06-17 03:27:30.107972","s":19.819121447},{"index":264,"x":245,"y":145,"id":87,"time":"2020-06-17 03:27:30.107972","s":19.819121447},{"index":265,"x":64,"y":347,"id":3,"time":"2020-06-17 03:27:30.870246","s":20.5813953488},{"index":266,"x":788,"y":173,"id":4,"time":"2020-06-17 
03:27:30.870246","s":20.5813953488},{"index":267,"x":253,"y":223,"id":7,"time":"2020-06-17 03:27:30.870246","s":20.5813953488},{"index":268,"x":372,"y":122,"id":46,"time":"2020-06-17 03:27:30.870246","s":20.5813953488},{"index":269,"x":105,"y":221,"id":67,"time":"2020-06-17 03:27:30.870246","s":20.5813953488},{"index":270,"x":219,"y":493,"id":82,"time":"2020-06-17 03:27:30.870246","s":20.5813953488},{"index":271,"x":286,"y":137,"id":86,"time":"2020-06-17 03:27:30.870246","s":20.5813953488},{"index":272,"x":248,"y":135,"id":87,"time":"2020-06-17 03:27:30.870246","s":20.5813953488},{"index":273,"x":69,"y":280,"id":3,"time":"2020-06-17 03:27:31.632520","s":21.3436692506},{"index":274,"x":804,"y":175,"id":4,"time":"2020-06-17 03:27:31.632520","s":21.3436692506},{"index":275,"x":275,"y":180,"id":7,"time":"2020-06-17 03:27:31.632520","s":21.3436692506},{"index":276,"x":277,"y":102,"id":34,"time":"2020-06-17 03:27:31.632520","s":21.3436692506},{"index":277,"x":372,"y":121,"id":46,"time":"2020-06-17 03:27:31.632520","s":21.3436692506},{"index":278,"x":108,"y":203,"id":67,"time":"2020-06-17 03:27:31.632520","s":21.3436692506},{"index":279,"x":193,"y":489,"id":82,"time":"2020-06-17 03:27:31.632520","s":21.3436692506},{"index":280,"x":680,"y":676,"id":84,"time":"2020-06-17 03:27:31.632520","s":21.3436692506},{"index":281,"x":287,"y":243,"id":91,"time":"2020-06-17 03:27:31.632520","s":21.3436692506},{"index":282,"x":750,"y":107,"id":1,"time":"2020-06-17 03:27:32.394794","s":22.1059431525},{"index":283,"x":78,"y":334,"id":3,"time":"2020-06-17 03:27:32.394794","s":22.1059431525},{"index":284,"x":793,"y":172,"id":4,"time":"2020-06-17 03:27:32.394794","s":22.1059431525},{"index":285,"x":318,"y":110,"id":34,"time":"2020-06-17 03:27:32.394794","s":22.1059431525},{"index":286,"x":372,"y":121,"id":46,"time":"2020-06-17 03:27:32.394794","s":22.1059431525},{"index":287,"x":122,"y":209,"id":67,"time":"2020-06-17 03:27:32.394794","s":22.1059431525},{"index":288,"x":228,"y":471,"id":82,"time":"2020-06-17 03:27:32.394794","s":22.1059431525},{"index":289,"x":272,"y":85,"id":86,"time":"2020-06-17 03:27:32.394794","s":22.1059431525},{"index":290,"x":300,"y":240,"id":91,"time":"2020-06-17 03:27:32.394794","s":22.1059431525},{"index":291,"x":647,"y":111,"id":92,"time":"2020-06-17 03:27:32.394794","s":22.1059431525},{"index":292,"x":227,"y":149,"id":93,"time":"2020-06-17 03:27:32.394794","s":22.1059431525},{"index":293,"x":84,"y":343,"id":3,"time":"2020-06-17 03:27:33.157068","s":22.8682170543},{"index":294,"x":791,"y":175,"id":4,"time":"2020-06-17 03:27:33.157068","s":22.8682170543},{"index":295,"x":304,"y":205,"id":7,"time":"2020-06-17 03:27:33.157068","s":22.8682170543},{"index":296,"x":318,"y":114,"id":34,"time":"2020-06-17 03:27:33.157068","s":22.8682170543},{"index":297,"x":372,"y":120,"id":46,"time":"2020-06-17 03:27:33.157068","s":22.8682170543},{"index":298,"x":94,"y":207,"id":67,"time":"2020-06-17 03:27:33.157068","s":22.8682170543},{"index":299,"x":292,"y":442,"id":82,"time":"2020-06-17 03:27:33.157068","s":22.8682170543},{"index":300,"x":273,"y":87,"id":86,"time":"2020-06-17 03:27:33.157068","s":22.8682170543},{"index":301,"x":217,"y":168,"id":93,"time":"2020-06-17 03:27:33.157068","s":22.8682170543},{"index":302,"x":744,"y":101,"id":1,"time":"2020-06-17 03:27:33.919342","s":23.6304909561},{"index":303,"x":85,"y":342,"id":3,"time":"2020-06-17 03:27:33.919342","s":23.6304909561},{"index":304,"x":779,"y":179,"id":4,"time":"2020-06-17 
03:27:33.919342","s":23.6304909561},{"index":305,"x":319,"y":115,"id":34,"time":"2020-06-17 03:27:33.919342","s":23.6304909561},{"index":306,"x":373,"y":113,"id":46,"time":"2020-06-17 03:27:33.919342","s":23.6304909561},{"index":307,"x":107,"y":211,"id":67,"time":"2020-06-17 03:27:33.919342","s":23.6304909561},{"index":308,"x":401,"y":356,"id":82,"time":"2020-06-17 03:27:33.919342","s":23.6304909561},{"index":309,"x":272,"y":90,"id":86,"time":"2020-06-17 03:27:33.919342","s":23.6304909561},{"index":310,"x":289,"y":212,"id":91,"time":"2020-06-17 03:27:33.919342","s":23.6304909561},{"index":311,"x":214,"y":170,"id":93,"time":"2020-06-17 03:27:33.919342","s":23.6304909561},{"index":312,"x":688,"y":143,"id":94,"time":"2020-06-17 03:27:33.919342","s":23.6304909561},{"index":313,"x":746,"y":101,"id":1,"time":"2020-06-17 03:27:34.681616","s":24.3927648579},{"index":314,"x":100,"y":331,"id":3,"time":"2020-06-17 03:27:34.681616","s":24.3927648579},{"index":315,"x":778,"y":175,"id":4,"time":"2020-06-17 03:27:34.681616","s":24.3927648579},{"index":316,"x":320,"y":112,"id":34,"time":"2020-06-17 03:27:34.681616","s":24.3927648579},{"index":317,"x":373,"y":121,"id":46,"time":"2020-06-17 03:27:34.681616","s":24.3927648579},{"index":318,"x":436,"y":299,"id":82,"time":"2020-06-17 03:27:34.681616","s":24.3927648579},{"index":319,"x":271,"y":99,"id":86,"time":"2020-06-17 03:27:34.681616","s":24.3927648579},{"index":320,"x":293,"y":244,"id":91,"time":"2020-06-17 03:27:34.681616","s":24.3927648579},{"index":321,"x":656,"y":105,"id":92,"time":"2020-06-17 03:27:34.681616","s":24.3927648579},{"index":322,"x":215,"y":166,"id":93,"time":"2020-06-17 03:27:34.681616","s":24.3927648579},{"index":323,"x":689,"y":144,"id":94,"time":"2020-06-17 03:27:34.681616","s":24.3927648579},{"index":324,"x":747,"y":105,"id":1,"time":"2020-06-17 03:27:35.443890","s":25.1550387597},{"index":325,"x":106,"y":317,"id":3,"time":"2020-06-17 03:27:35.443890","s":25.1550387597},{"index":326,"x":772,"y":167,"id":4,"time":"2020-06-17 03:27:35.443890","s":25.1550387597},{"index":327,"x":321,"y":109,"id":34,"time":"2020-06-17 03:27:35.443890","s":25.1550387597},{"index":328,"x":372,"y":120,"id":46,"time":"2020-06-17 03:27:35.443890","s":25.1550387597},{"index":329,"x":116,"y":201,"id":67,"time":"2020-06-17 03:27:35.443890","s":25.1550387597},{"index":330,"x":460,"y":234,"id":82,"time":"2020-06-17 03:27:35.443890","s":25.1550387597},{"index":331,"x":274,"y":85,"id":86,"time":"2020-06-17 03:27:35.443890","s":25.1550387597},{"index":332,"x":301,"y":239,"id":91,"time":"2020-06-17 03:27:35.443890","s":25.1550387597},{"index":333,"x":639,"y":106,"id":92,"time":"2020-06-17 03:27:35.443890","s":25.1550387597},{"index":334,"x":229,"y":166,"id":93,"time":"2020-06-17 03:27:35.443890","s":25.1550387597},{"index":335,"x":100,"y":319,"id":3,"time":"2020-06-17 03:27:36.206164","s":25.9173126615},{"index":336,"x":761,"y":167,"id":4,"time":"2020-06-17 03:27:36.206164","s":25.9173126615},{"index":337,"x":312,"y":116,"id":34,"time":"2020-06-17 03:27:36.206164","s":25.9173126615},{"index":338,"x":371,"y":121,"id":46,"time":"2020-06-17 03:27:36.206164","s":25.9173126615},{"index":339,"x":122,"y":204,"id":67,"time":"2020-06-17 03:27:36.206164","s":25.9173126615},{"index":340,"x":431,"y":216,"id":82,"time":"2020-06-17 03:27:36.206164","s":25.9173126615},{"index":341,"x":255,"y":85,"id":86,"time":"2020-06-17 03:27:36.206164","s":25.9173126615},{"index":342,"x":280,"y":236,"id":91,"time":"2020-06-17 
03:27:36.206164","s":25.9173126615},{"index":343,"x":630,"y":108,"id":92,"time":"2020-06-17 03:27:36.206164","s":25.9173126615},{"index":344,"x":826,"y":182,"id":96,"time":"2020-06-17 03:27:36.206164","s":25.9173126615},{"index":345,"x":97,"y":326,"id":3,"time":"2020-06-17 03:27:36.968438","s":26.6795865633},{"index":346,"x":722,"y":150,"id":4,"time":"2020-06-17 03:27:36.968438","s":26.6795865633},{"index":347,"x":315,"y":110,"id":34,"time":"2020-06-17 03:27:36.968438","s":26.6795865633},{"index":348,"x":119,"y":210,"id":67,"time":"2020-06-17 03:27:36.968438","s":26.6795865633},{"index":349,"x":382,"y":174,"id":82,"time":"2020-06-17 03:27:36.968438","s":26.6795865633},{"index":350,"x":271,"y":83,"id":86,"time":"2020-06-17 03:27:36.968438","s":26.6795865633},{"index":351,"x":288,"y":242,"id":91,"time":"2020-06-17 03:27:36.968438","s":26.6795865633},{"index":352,"x":653,"y":109,"id":92,"time":"2020-06-17 03:27:36.968438","s":26.6795865633},{"index":353,"x":829,"y":194,"id":96,"time":"2020-06-17 03:27:36.968438","s":26.6795865633},{"index":354,"x":692,"y":122,"id":1,"time":"2020-06-17 03:27:37.730711","s":27.4418604651},{"index":355,"x":102,"y":318,"id":3,"time":"2020-06-17 03:27:37.730711","s":27.4418604651},{"index":356,"x":752,"y":157,"id":4,"time":"2020-06-17 03:27:37.730711","s":27.4418604651},{"index":357,"x":310,"y":99,"id":34,"time":"2020-06-17 03:27:37.730711","s":27.4418604651},{"index":358,"x":120,"y":211,"id":67,"time":"2020-06-17 03:27:37.730711","s":27.4418604651},{"index":359,"x":369,"y":147,"id":82,"time":"2020-06-17 03:27:37.730711","s":27.4418604651},{"index":360,"x":258,"y":85,"id":86,"time":"2020-06-17 03:27:37.730711","s":27.4418604651},{"index":361,"x":315,"y":247,"id":91,"time":"2020-06-17 03:27:37.730711","s":27.4418604651},{"index":362,"x":825,"y":186,"id":96,"time":"2020-06-17 03:27:37.730711","s":27.4418604651},{"index":363,"x":242,"y":169,"id":99,"time":"2020-06-17 03:27:37.730711","s":27.4418604651},{"index":364,"x":705,"y":131,"id":1,"time":"2020-06-17 03:27:38.492985","s":28.2041343669},{"index":365,"x":101,"y":318,"id":3,"time":"2020-06-17 03:27:38.492985","s":28.2041343669},{"index":366,"x":745,"y":157,"id":4,"time":"2020-06-17 03:27:38.492985","s":28.2041343669},{"index":367,"x":305,"y":86,"id":34,"time":"2020-06-17 03:27:38.492985","s":28.2041343669},{"index":368,"x":123,"y":205,"id":67,"time":"2020-06-17 03:27:38.492985","s":28.2041343669},{"index":369,"x":384,"y":134,"id":82,"time":"2020-06-17 03:27:38.492985","s":28.2041343669},{"index":370,"x":255,"y":85,"id":86,"time":"2020-06-17 03:27:38.492985","s":28.2041343669},{"index":371,"x":314,"y":246,"id":91,"time":"2020-06-17 03:27:38.492985","s":28.2041343669},{"index":372,"x":820,"y":191,"id":96,"time":"2020-06-17 03:27:38.492985","s":28.2041343669},{"index":373,"x":243,"y":151,"id":99,"time":"2020-06-17 03:27:38.492985","s":28.2041343669},{"index":374,"x":104,"y":318,"id":3,"time":"2020-06-17 03:27:39.255259","s":28.9664082687},{"index":375,"x":713,"y":147,"id":4,"time":"2020-06-17 03:27:39.255259","s":28.9664082687},{"index":376,"x":294,"y":84,"id":34,"time":"2020-06-17 03:27:39.255259","s":28.9664082687},{"index":377,"x":125,"y":207,"id":67,"time":"2020-06-17 03:27:39.255259","s":28.9664082687},{"index":378,"x":253,"y":84,"id":86,"time":"2020-06-17 03:27:39.255259","s":28.9664082687},{"index":379,"x":319,"y":246,"id":91,"time":"2020-06-17 03:27:39.255259","s":28.9664082687},{"index":380,"x":819,"y":197,"id":96,"time":"2020-06-17 
03:27:39.255259","s":28.9664082687},{"index":381,"x":244,"y":174,"id":99,"time":"2020-06-17 03:27:39.255259","s":28.9664082687},{"index":382,"x":344,"y":101,"id":101,"time":"2020-06-17 03:27:39.255259","s":28.9664082687},{"index":383,"x":471,"y":68,"id":103,"time":"2020-06-17 03:27:39.255259","s":28.9664082687},{"index":384,"x":100,"y":319,"id":3,"time":"2020-06-17 03:27:40.017533","s":29.7286821705},{"index":385,"x":711,"y":149,"id":4,"time":"2020-06-17 03:27:40.017533","s":29.7286821705},{"index":386,"x":126,"y":213,"id":67,"time":"2020-06-17 03:27:40.017533","s":29.7286821705},{"index":387,"x":250,"y":78,"id":86,"time":"2020-06-17 03:27:40.017533","s":29.7286821705},{"index":388,"x":292,"y":243,"id":91,"time":"2020-06-17 03:27:40.017533","s":29.7286821705},{"index":389,"x":818,"y":198,"id":96,"time":"2020-06-17 03:27:40.017533","s":29.7286821705},{"index":390,"x":242,"y":133,"id":99,"time":"2020-06-17 03:27:40.017533","s":29.7286821705},{"index":391,"x":348,"y":132,"id":101,"time":"2020-06-17 03:27:40.017533","s":29.7286821705},{"index":392,"x":512,"y":56,"id":103,"time":"2020-06-17 03:27:40.017533","s":29.7286821705},{"index":393,"x":661,"y":113,"id":104,"time":"2020-06-17 03:27:40.017533","s":29.7286821705},{"index":394,"x":101,"y":317,"id":3,"time":"2020-06-17 03:27:40.779807","s":30.4909560724},{"index":395,"x":713,"y":143,"id":4,"time":"2020-06-17 03:27:40.779807","s":30.4909560724},{"index":396,"x":120,"y":206,"id":67,"time":"2020-06-17 03:27:40.779807","s":30.4909560724},{"index":397,"x":296,"y":249,"id":91,"time":"2020-06-17 03:27:40.779807","s":30.4909560724},{"index":398,"x":820,"y":198,"id":96,"time":"2020-06-17 03:27:40.779807","s":30.4909560724},{"index":399,"x":349,"y":132,"id":101,"time":"2020-06-17 03:27:40.779807","s":30.4909560724},{"index":400,"x":488,"y":48,"id":103,"time":"2020-06-17 03:27:40.779807","s":30.4909560724},{"index":401,"x":659,"y":108,"id":104,"time":"2020-06-17 03:27:40.779807","s":30.4909560724},{"index":402,"x":305,"y":101,"id":105,"time":"2020-06-17 03:27:40.779807","s":30.4909560724},{"index":403,"x":88,"y":318,"id":3,"time":"2020-06-17 03:27:41.542081","s":31.2532299742},{"index":404,"x":715,"y":153,"id":4,"time":"2020-06-17 03:27:41.542081","s":31.2532299742},{"index":405,"x":306,"y":87,"id":34,"time":"2020-06-17 03:27:41.542081","s":31.2532299742},{"index":406,"x":258,"y":81,"id":86,"time":"2020-06-17 03:27:41.542081","s":31.2532299742},{"index":407,"x":308,"y":247,"id":91,"time":"2020-06-17 03:27:41.542081","s":31.2532299742},{"index":408,"x":815,"y":198,"id":96,"time":"2020-06-17 03:27:41.542081","s":31.2532299742},{"index":409,"x":256,"y":140,"id":99,"time":"2020-06-17 03:27:41.542081","s":31.2532299742},{"index":410,"x":351,"y":97,"id":101,"time":"2020-06-17 03:27:41.542081","s":31.2532299742},{"index":411,"x":88,"y":314,"id":3,"time":"2020-06-17 03:27:42.304355","s":32.015503876},{"index":412,"x":306,"y":85,"id":34,"time":"2020-06-17 03:27:42.304355","s":32.015503876},{"index":413,"x":257,"y":78,"id":86,"time":"2020-06-17 03:27:42.304355","s":32.015503876},{"index":414,"x":310,"y":250,"id":91,"time":"2020-06-17 03:27:42.304355","s":32.015503876},{"index":415,"x":823,"y":196,"id":96,"time":"2020-06-17 03:27:42.304355","s":32.015503876},{"index":416,"x":250,"y":138,"id":99,"time":"2020-06-17 03:27:42.304355","s":32.015503876},{"index":417,"x":349,"y":99,"id":101,"time":"2020-06-17 03:27:42.304355","s":32.015503876},{"index":418,"x":92,"y":308,"id":3,"time":"2020-06-17 
03:27:43.066629","s":32.7777777778},{"index":419,"x":367,"y":267,"id":91,"time":"2020-06-17 03:27:43.066629","s":32.7777777778},{"index":420,"x":823,"y":197,"id":96,"time":"2020-06-17 03:27:43.066629","s":32.7777777778},{"index":421,"x":261,"y":197,"id":99,"time":"2020-06-17 03:27:43.066629","s":32.7777777778},{"index":422,"x":351,"y":87,"id":101,"time":"2020-06-17 03:27:43.066629","s":32.7777777778},{"index":423,"x":306,"y":113,"id":105,"time":"2020-06-17 03:27:43.066629","s":32.7777777778},{"index":424,"x":657,"y":671,"id":112,"time":"2020-06-17 03:27:43.066629","s":32.7777777778},{"index":425,"x":124,"y":266,"id":113,"time":"2020-06-17 03:27:43.066629","s":32.7777777778},{"index":426,"x":94,"y":316,"id":3,"time":"2020-06-17 03:27:43.828903","s":33.5400516796},{"index":427,"x":105,"y":207,"id":67,"time":"2020-06-17 03:27:43.828903","s":33.5400516796},{"index":428,"x":410,"y":276,"id":91,"time":"2020-06-17 03:27:43.828903","s":33.5400516796},{"index":429,"x":823,"y":196,"id":96,"time":"2020-06-17 03:27:43.828903","s":33.5400516796},{"index":430,"x":261,"y":205,"id":99,"time":"2020-06-17 03:27:43.828903","s":33.5400516796},{"index":431,"x":309,"y":141,"id":105,"time":"2020-06-17 03:27:43.828903","s":33.5400516796},{"index":432,"x":656,"y":671,"id":112,"time":"2020-06-17 03:27:43.828903","s":33.5400516796},{"index":433,"x":348,"y":139,"id":114,"time":"2020-06-17 03:27:43.828903","s":33.5400516796},{"index":434,"x":96,"y":314,"id":3,"time":"2020-06-17 03:27:44.591177","s":34.3023255814},{"index":435,"x":731,"y":147,"id":4,"time":"2020-06-17 03:27:44.591177","s":34.3023255814},{"index":436,"x":94,"y":204,"id":67,"time":"2020-06-17 03:27:44.591177","s":34.3023255814},{"index":437,"x":423,"y":268,"id":91,"time":"2020-06-17 03:27:44.591177","s":34.3023255814},{"index":438,"x":822,"y":198,"id":96,"time":"2020-06-17 03:27:44.591177","s":34.3023255814},{"index":439,"x":271,"y":204,"id":99,"time":"2020-06-17 03:27:44.591177","s":34.3023255814},{"index":440,"x":309,"y":139,"id":105,"time":"2020-06-17 03:27:44.591177","s":34.3023255814},{"index":441,"x":671,"y":99,"id":108,"time":"2020-06-17 03:27:44.591177","s":34.3023255814},{"index":442,"x":657,"y":671,"id":112,"time":"2020-06-17 03:27:44.591177","s":34.3023255814},{"index":443,"x":347,"y":136,"id":114,"time":"2020-06-17 03:27:44.591177","s":34.3023255814},{"index":444,"x":97,"y":311,"id":3,"time":"2020-06-17 03:27:45.353450","s":35.0645994832},{"index":445,"x":738,"y":143,"id":4,"time":"2020-06-17 03:27:45.353450","s":35.0645994832},{"index":446,"x":95,"y":207,"id":67,"time":"2020-06-17 03:27:45.353450","s":35.0645994832},{"index":447,"x":476,"y":281,"id":91,"time":"2020-06-17 03:27:45.353450","s":35.0645994832},{"index":448,"x":822,"y":197,"id":96,"time":"2020-06-17 03:27:45.353450","s":35.0645994832},{"index":449,"x":243,"y":168,"id":99,"time":"2020-06-17 03:27:45.353450","s":35.0645994832},{"index":450,"x":667,"y":101,"id":108,"time":"2020-06-17 03:27:45.353450","s":35.0645994832},{"index":451,"x":661,"y":672,"id":112,"time":"2020-06-17 03:27:45.353450","s":35.0645994832},{"index":452,"x":349,"y":134,"id":114,"time":"2020-06-17 03:27:45.353450","s":35.0645994832},{"index":453,"x":95,"y":289,"id":3,"time":"2020-06-17 03:27:46.115724","s":35.826873385},{"index":454,"x":738,"y":140,"id":4,"time":"2020-06-17 03:27:46.115724","s":35.826873385},{"index":455,"x":96,"y":212,"id":67,"time":"2020-06-17 03:27:46.115724","s":35.826873385},{"index":456,"x":824,"y":196,"id":96,"time":"2020-06-17 
03:27:46.115724","s":35.826873385},{"index":457,"x":252,"y":172,"id":99,"time":"2020-06-17 03:27:46.115724","s":35.826873385},{"index":458,"x":261,"y":101,"id":105,"time":"2020-06-17 03:27:46.115724","s":35.826873385},{"index":459,"x":662,"y":101,"id":108,"time":"2020-06-17 03:27:46.115724","s":35.826873385},{"index":460,"x":661,"y":672,"id":112,"time":"2020-06-17 03:27:46.115724","s":35.826873385},{"index":461,"x":349,"y":134,"id":114,"time":"2020-06-17 03:27:46.115724","s":35.826873385},{"index":462,"x":632,"y":364,"id":115,"time":"2020-06-17 03:27:46.115724","s":35.826873385},{"index":463,"x":83,"y":327,"id":3,"time":"2020-06-17 03:27:46.877998","s":36.5891472868},{"index":464,"x":739,"y":149,"id":4,"time":"2020-06-17 03:27:46.877998","s":36.5891472868},{"index":465,"x":822,"y":197,"id":96,"time":"2020-06-17 03:27:46.877998","s":36.5891472868},{"index":466,"x":247,"y":184,"id":99,"time":"2020-06-17 03:27:46.877998","s":36.5891472868},{"index":467,"x":643,"y":108,"id":108,"time":"2020-06-17 03:27:46.877998","s":36.5891472868},{"index":468,"x":654,"y":666,"id":112,"time":"2020-06-17 03:27:46.877998","s":36.5891472868},{"index":469,"x":345,"y":134,"id":114,"time":"2020-06-17 03:27:46.877998","s":36.5891472868},{"index":470,"x":811,"y":551,"id":115,"time":"2020-06-17 03:27:46.877998","s":36.5891472868},{"index":471,"x":76,"y":336,"id":3,"time":"2020-06-17 03:27:47.640272","s":37.3514211886},{"index":472,"x":738,"y":146,"id":4,"time":"2020-06-17 03:27:47.640272","s":37.3514211886},{"index":473,"x":113,"y":217,"id":67,"time":"2020-06-17 03:27:47.640272","s":37.3514211886},{"index":474,"x":824,"y":196,"id":96,"time":"2020-06-17 03:27:47.640272","s":37.3514211886},{"index":475,"x":239,"y":168,"id":99,"time":"2020-06-17 03:27:47.640272","s":37.3514211886},{"index":476,"x":648,"y":104,"id":108,"time":"2020-06-17 03:27:47.640272","s":37.3514211886},{"index":477,"x":612,"y":639,"id":112,"time":"2020-06-17 03:27:47.640272","s":37.3514211886},{"index":478,"x":347,"y":134,"id":114,"time":"2020-06-17 03:27:47.640272","s":37.3514211886},{"index":479,"x":943,"y":642,"id":115,"time":"2020-06-17 03:27:47.640272","s":37.3514211886},{"index":480,"x":112,"y":240,"id":116,"time":"2020-06-17 03:27:47.640272","s":37.3514211886},{"index":481,"x":105,"y":331,"id":3,"time":"2020-06-17 03:27:48.402546","s":38.1136950904},{"index":482,"x":742,"y":147,"id":4,"time":"2020-06-17 03:27:48.402546","s":38.1136950904},{"index":483,"x":823,"y":197,"id":96,"time":"2020-06-17 03:27:48.402546","s":38.1136950904},{"index":484,"x":245,"y":166,"id":99,"time":"2020-06-17 03:27:48.402546","s":38.1136950904},{"index":485,"x":261,"y":83,"id":105,"time":"2020-06-17 03:27:48.402546","s":38.1136950904},{"index":486,"x":648,"y":119,"id":108,"time":"2020-06-17 03:27:48.402546","s":38.1136950904},{"index":487,"x":532,"y":674,"id":112,"time":"2020-06-17 03:27:48.402546","s":38.1136950904},{"index":488,"x":347,"y":134,"id":114,"time":"2020-06-17 03:27:48.402546","s":38.1136950904},{"index":489,"x":104,"y":332,"id":3,"time":"2020-06-17 03:27:49.164820","s":38.8759689922},{"index":490,"x":717,"y":142,"id":4,"time":"2020-06-17 03:27:49.164820","s":38.8759689922},{"index":491,"x":821,"y":197,"id":96,"time":"2020-06-17 03:27:49.164820","s":38.8759689922},{"index":492,"x":296,"y":171,"id":99,"time":"2020-06-17 03:27:49.164820","s":38.8759689922},{"index":493,"x":646,"y":104,"id":108,"time":"2020-06-17 03:27:49.164820","s":38.8759689922},{"index":494,"x":344,"y":133,"id":114,"time":"2020-06-17 
03:27:49.164820","s":38.8759689922},{"index":495,"x":247,"y":143,"id":120,"time":"2020-06-17 03:27:49.164820","s":38.8759689922},{"index":496,"x":218,"y":185,"id":121,"time":"2020-06-17 03:27:49.164820","s":38.8759689922},{"index":497,"x":78,"y":333,"id":3,"time":"2020-06-17 03:27:49.927094","s":39.6382428941},{"index":498,"x":740,"y":148,"id":4,"time":"2020-06-17 03:27:49.927094","s":39.6382428941},{"index":499,"x":819,"y":198,"id":96,"time":"2020-06-17 03:27:49.927094","s":39.6382428941},{"index":500,"x":295,"y":179,"id":99,"time":"2020-06-17 03:27:49.927094","s":39.6382428941},{"index":501,"x":648,"y":109,"id":108,"time":"2020-06-17 03:27:49.927094","s":39.6382428941},{"index":502,"x":345,"y":135,"id":114,"time":"2020-06-17 03:27:49.927094","s":39.6382428941},{"index":503,"x":109,"y":236,"id":116,"time":"2020-06-17 03:27:49.927094","s":39.6382428941},{"index":504,"x":243,"y":143,"id":120,"time":"2020-06-17 03:27:49.927094","s":39.6382428941},{"index":505,"x":220,"y":186,"id":121,"time":"2020-06-17 03:27:49.927094","s":39.6382428941},{"index":506,"x":491,"y":676,"id":123,"time":"2020-06-17 03:27:49.927094","s":39.6382428941},{"index":507,"x":948,"y":607,"id":124,"time":"2020-06-17 03:27:49.927094","s":39.6382428941},{"index":508,"x":323,"y":264,"id":125,"time":"2020-06-17 03:27:49.927094","s":39.6382428941},{"index":509,"x":74,"y":338,"id":3,"time":"2020-06-17 03:27:50.689368","s":40.4005167959},{"index":510,"x":719,"y":149,"id":4,"time":"2020-06-17 03:27:50.689368","s":40.4005167959},{"index":511,"x":820,"y":197,"id":96,"time":"2020-06-17 03:27:50.689368","s":40.4005167959},{"index":512,"x":295,"y":204,"id":99,"time":"2020-06-17 03:27:50.689368","s":40.4005167959},{"index":513,"x":647,"y":108,"id":108,"time":"2020-06-17 03:27:50.689368","s":40.4005167959},{"index":514,"x":346,"y":135,"id":114,"time":"2020-06-17 03:27:50.689368","s":40.4005167959},{"index":515,"x":104,"y":240,"id":116,"time":"2020-06-17 03:27:50.689368","s":40.4005167959},{"index":516,"x":247,"y":147,"id":120,"time":"2020-06-17 03:27:50.689368","s":40.4005167959},{"index":517,"x":219,"y":186,"id":121,"time":"2020-06-17 03:27:50.689368","s":40.4005167959},{"index":518,"x":522,"y":665,"id":123,"time":"2020-06-17 03:27:50.689368","s":40.4005167959},{"index":519,"x":965,"y":577,"id":124,"time":"2020-06-17 03:27:50.689368","s":40.4005167959},{"index":520,"x":342,"y":267,"id":125,"time":"2020-06-17 03:27:50.689368","s":40.4005167959},{"index":521,"x":76,"y":331,"id":3,"time":"2020-06-17 03:27:51.451642","s":41.1627906977},{"index":522,"x":709,"y":147,"id":4,"time":"2020-06-17 03:27:51.451642","s":41.1627906977},{"index":523,"x":821,"y":197,"id":96,"time":"2020-06-17 03:27:51.451642","s":41.1627906977},{"index":524,"x":270,"y":207,"id":99,"time":"2020-06-17 03:27:51.451642","s":41.1627906977},{"index":525,"x":640,"y":107,"id":108,"time":"2020-06-17 03:27:51.451642","s":41.1627906977},{"index":526,"x":99,"y":234,"id":116,"time":"2020-06-17 03:27:51.451642","s":41.1627906977},{"index":527,"x":520,"y":663,"id":123,"time":"2020-06-17 03:27:51.451642","s":41.1627906977},{"index":528,"x":898,"y":523,"id":124,"time":"2020-06-17 03:27:51.451642","s":41.1627906977},{"index":529,"x":76,"y":330,"id":3,"time":"2020-06-17 03:27:52.213916","s":41.9250645995},{"index":530,"x":716,"y":132,"id":4,"time":"2020-06-17 03:27:52.213916","s":41.9250645995},{"index":531,"x":819,"y":198,"id":96,"time":"2020-06-17 03:27:52.213916","s":41.9250645995},{"index":532,"x":259,"y":222,"id":99,"time":"2020-06-17 
03:27:52.213916","s":41.9250645995},{"index":533,"x":639,"y":115,"id":108,"time":"2020-06-17 03:27:52.213916","s":41.9250645995},{"index":534,"x":348,"y":135,"id":114,"time":"2020-06-17 03:27:52.213916","s":41.9250645995},{"index":535,"x":91,"y":233,"id":116,"time":"2020-06-17 03:27:52.213916","s":41.9250645995},{"index":536,"x":515,"y":653,"id":123,"time":"2020-06-17 03:27:52.213916","s":41.9250645995},{"index":537,"x":738,"y":418,"id":124,"time":"2020-06-17 03:27:52.213916","s":41.9250645995},{"index":538,"x":340,"y":270,"id":125,"time":"2020-06-17 03:27:52.213916","s":41.9250645995},{"index":539,"x":61,"y":346,"id":3,"time":"2020-06-17 03:27:52.976190","s":42.6873385013},{"index":540,"x":729,"y":141,"id":4,"time":"2020-06-17 03:27:52.976190","s":42.6873385013},{"index":541,"x":819,"y":198,"id":96,"time":"2020-06-17 03:27:52.976190","s":42.6873385013},{"index":542,"x":293,"y":235,"id":99,"time":"2020-06-17 03:27:52.976190","s":42.6873385013},{"index":543,"x":646,"y":117,"id":108,"time":"2020-06-17 03:27:52.976190","s":42.6873385013},{"index":544,"x":347,"y":135,"id":114,"time":"2020-06-17 03:27:52.976190","s":42.6873385013},{"index":545,"x":92,"y":234,"id":116,"time":"2020-06-17 03:27:52.976190","s":42.6873385013},{"index":546,"x":224,"y":187,"id":121,"time":"2020-06-17 03:27:52.976190","s":42.6873385013},{"index":547,"x":511,"y":650,"id":123,"time":"2020-06-17 03:27:52.976190","s":42.6873385013},{"index":548,"x":588,"y":288,"id":124,"time":"2020-06-17 03:27:52.976190","s":42.6873385013},{"index":549,"x":342,"y":269,"id":125,"time":"2020-06-17 03:27:52.976190","s":42.6873385013},{"index":550,"x":72,"y":349,"id":3,"time":"2020-06-17 03:27:53.738463","s":43.4496124031},{"index":551,"x":713,"y":147,"id":4,"time":"2020-06-17 03:27:53.738463","s":43.4496124031},{"index":552,"x":819,"y":197,"id":96,"time":"2020-06-17 03:27:53.738463","s":43.4496124031},{"index":553,"x":631,"y":111,"id":108,"time":"2020-06-17 03:27:53.738463","s":43.4496124031},{"index":554,"x":346,"y":134,"id":114,"time":"2020-06-17 03:27:53.738463","s":43.4496124031},{"index":555,"x":92,"y":235,"id":116,"time":"2020-06-17 03:27:53.738463","s":43.4496124031},{"index":556,"x":506,"y":647,"id":123,"time":"2020-06-17 03:27:53.738463","s":43.4496124031},{"index":557,"x":457,"y":264,"id":124,"time":"2020-06-17 03:27:53.738463","s":43.4496124031},{"index":558,"x":340,"y":266,"id":125,"time":"2020-06-17 03:27:53.738463","s":43.4496124031},{"index":559,"x":78,"y":372,"id":3,"time":"2020-06-17 03:27:54.500737","s":44.2118863049},{"index":560,"x":708,"y":143,"id":4,"time":"2020-06-17 03:27:54.500737","s":44.2118863049},{"index":561,"x":821,"y":198,"id":96,"time":"2020-06-17 03:27:54.500737","s":44.2118863049},{"index":562,"x":286,"y":233,"id":99,"time":"2020-06-17 03:27:54.500737","s":44.2118863049},{"index":563,"x":627,"y":109,"id":108,"time":"2020-06-17 03:27:54.500737","s":44.2118863049},{"index":564,"x":347,"y":113,"id":114,"time":"2020-06-17 03:27:54.500737","s":44.2118863049},{"index":565,"x":98,"y":241,"id":116,"time":"2020-06-17 03:27:54.500737","s":44.2118863049},{"index":566,"x":513,"y":651,"id":123,"time":"2020-06-17 03:27:54.500737","s":44.2118863049},{"index":567,"x":406,"y":248,"id":124,"time":"2020-06-17 03:27:54.500737","s":44.2118863049},{"index":568,"x":256,"y":114,"id":130,"time":"2020-06-17 03:27:54.500737","s":44.2118863049},{"index":569,"x":82,"y":364,"id":3,"time":"2020-06-17 03:27:55.263011","s":44.9741602067},{"index":570,"x":719,"y":142,"id":4,"time":"2020-06-17 
03:27:55.263011","s":44.9741602067},{"index":571,"x":821,"y":197,"id":96,"time":"2020-06-17 03:27:55.263011","s":44.9741602067},{"index":572,"x":286,"y":213,"id":99,"time":"2020-06-17 03:27:55.263011","s":44.9741602067},{"index":573,"x":629,"y":110,"id":108,"time":"2020-06-17 03:27:55.263011","s":44.9741602067},{"index":574,"x":110,"y":250,"id":116,"time":"2020-06-17 03:27:55.263011","s":44.9741602067},{"index":575,"x":524,"y":654,"id":123,"time":"2020-06-17 03:27:55.263011","s":44.9741602067},{"index":576,"x":380,"y":236,"id":124,"time":"2020-06-17 03:27:55.263011","s":44.9741602067},{"index":577,"x":253,"y":112,"id":130,"time":"2020-06-17 03:27:55.263011","s":44.9741602067},{"index":578,"x":81,"y":348,"id":3,"time":"2020-06-17 03:27:56.025285","s":45.7364341085},{"index":579,"x":820,"y":198,"id":96,"time":"2020-06-17 03:27:56.025285","s":45.7364341085},{"index":580,"x":260,"y":211,"id":99,"time":"2020-06-17 03:27:56.025285","s":45.7364341085},{"index":581,"x":637,"y":112,"id":108,"time":"2020-06-17 03:27:56.025285","s":45.7364341085},{"index":582,"x":348,"y":111,"id":114,"time":"2020-06-17 03:27:56.025285","s":45.7364341085},{"index":583,"x":120,"y":244,"id":116,"time":"2020-06-17 03:27:56.025285","s":45.7364341085},{"index":584,"x":544,"y":647,"id":123,"time":"2020-06-17 03:27:56.025285","s":45.7364341085},{"index":585,"x":392,"y":244,"id":124,"time":"2020-06-17 03:27:56.025285","s":45.7364341085},{"index":586,"x":74,"y":379,"id":3,"time":"2020-06-17 03:27:56.787559","s":46.4987080103},{"index":587,"x":805,"y":174,"id":4,"time":"2020-06-17 03:27:56.787559","s":46.4987080103},{"index":588,"x":259,"y":213,"id":99,"time":"2020-06-17 03:27:56.787559","s":46.4987080103},{"index":589,"x":638,"y":111,"id":108,"time":"2020-06-17 03:27:56.787559","s":46.4987080103},{"index":590,"x":347,"y":130,"id":114,"time":"2020-06-17 03:27:56.787559","s":46.4987080103},{"index":591,"x":106,"y":246,"id":116,"time":"2020-06-17 03:27:56.787559","s":46.4987080103},{"index":592,"x":547,"y":634,"id":123,"time":"2020-06-17 03:27:56.787559","s":46.4987080103},{"index":593,"x":440,"y":272,"id":124,"time":"2020-06-17 03:27:56.787559","s":46.4987080103},{"index":594,"x":343,"y":267,"id":125,"time":"2020-06-17 03:27:56.787559","s":46.4987080103},{"index":595,"x":139,"y":429,"id":3,"time":"2020-06-17 03:27:57.549833","s":47.2609819121},{"index":596,"x":854,"y":219,"id":4,"time":"2020-06-17 03:27:57.549833","s":47.2609819121},{"index":597,"x":257,"y":216,"id":99,"time":"2020-06-17 03:27:57.549833","s":47.2609819121},{"index":598,"x":629,"y":110,"id":108,"time":"2020-06-17 03:27:57.549833","s":47.2609819121},{"index":599,"x":347,"y":135,"id":114,"time":"2020-06-17 03:27:57.549833","s":47.2609819121},{"index":600,"x":549,"y":616,"id":123,"time":"2020-06-17 03:27:57.549833","s":47.2609819121},{"index":601,"x":322,"y":276,"id":125,"time":"2020-06-17 03:27:57.549833","s":47.2609819121},{"index":602,"x":707,"y":131,"id":135,"time":"2020-06-17 03:27:57.549833","s":47.2609819121},{"index":603,"x":179,"y":448,"id":3,"time":"2020-06-17 03:27:58.312107","s":48.023255814},{"index":604,"x":915,"y":292,"id":4,"time":"2020-06-17 03:27:58.312107","s":48.023255814},{"index":605,"x":811,"y":195,"id":96,"time":"2020-06-17 03:27:58.312107","s":48.023255814},{"index":606,"x":260,"y":201,"id":99,"time":"2020-06-17 03:27:58.312107","s":48.023255814},{"index":607,"x":639,"y":112,"id":108,"time":"2020-06-17 03:27:58.312107","s":48.023255814},{"index":608,"x":539,"y":613,"id":123,"time":"2020-06-17 
03:27:58.312107","s":48.023255814},{"index":609,"x":451,"y":347,"id":124,"time":"2020-06-17 03:27:58.312107","s":48.023255814},{"index":610,"x":319,"y":289,"id":125,"time":"2020-06-17 03:27:58.312107","s":48.023255814},{"index":611,"x":94,"y":234,"id":137,"time":"2020-06-17 03:27:58.312107","s":48.023255814},{"index":612,"x":249,"y":473,"id":3,"time":"2020-06-17 03:27:59.074381","s":48.7855297158},{"index":613,"x":908,"y":342,"id":4,"time":"2020-06-17 03:27:59.074381","s":48.7855297158},{"index":614,"x":793,"y":188,"id":96,"time":"2020-06-17 03:27:59.074381","s":48.7855297158},{"index":615,"x":243,"y":211,"id":99,"time":"2020-06-17 03:27:59.074381","s":48.7855297158},{"index":616,"x":640,"y":111,"id":108,"time":"2020-06-17 03:27:59.074381","s":48.7855297158},{"index":617,"x":523,"y":614,"id":123,"time":"2020-06-17 03:27:59.074381","s":48.7855297158},{"index":618,"x":469,"y":296,"id":124,"time":"2020-06-17 03:27:59.074381","s":48.7855297158},{"index":619,"x":322,"y":286,"id":125,"time":"2020-06-17 03:27:59.074381","s":48.7855297158},{"index":620,"x":106,"y":245,"id":137,"time":"2020-06-17 03:27:59.074381","s":48.7855297158},{"index":621,"x":718,"y":152,"id":141,"time":"2020-06-17 03:27:59.074381","s":48.7855297158},{"index":622,"x":348,"y":428,"id":3,"time":"2020-06-17 03:27:59.836655","s":49.5478036176},{"index":623,"x":791,"y":374,"id":4,"time":"2020-06-17 03:27:59.836655","s":49.5478036176},{"index":624,"x":789,"y":183,"id":96,"time":"2020-06-17 03:27:59.836655","s":49.5478036176},{"index":625,"x":255,"y":183,"id":99,"time":"2020-06-17 03:27:59.836655","s":49.5478036176},{"index":626,"x":635,"y":112,"id":108,"time":"2020-06-17 03:27:59.836655","s":49.5478036176},{"index":627,"x":348,"y":133,"id":114,"time":"2020-06-17 03:27:59.836655","s":49.5478036176},{"index":628,"x":545,"y":620,"id":123,"time":"2020-06-17 03:27:59.836655","s":49.5478036176},{"index":629,"x":512,"y":280,"id":124,"time":"2020-06-17 03:27:59.836655","s":49.5478036176},{"index":630,"x":103,"y":243,"id":137,"time":"2020-06-17 03:27:59.836655","s":49.5478036176},{"index":631,"x":254,"y":97,"id":143,"time":"2020-06-17 03:27:59.836655","s":49.5478036176},{"index":632,"x":374,"y":433,"id":3,"time":"2020-06-17 03:28:00.598929","s":50.3100775194},{"index":633,"x":626,"y":300,"id":4,"time":"2020-06-17 03:28:00.598929","s":50.3100775194},{"index":634,"x":254,"y":209,"id":99,"time":"2020-06-17 03:28:00.598929","s":50.3100775194},{"index":635,"x":665,"y":107,"id":108,"time":"2020-06-17 03:28:00.598929","s":50.3100775194},{"index":636,"x":348,"y":134,"id":114,"time":"2020-06-17 03:28:00.598929","s":50.3100775194},{"index":637,"x":539,"y":623,"id":123,"time":"2020-06-17 03:28:00.598929","s":50.3100775194},{"index":638,"x":492,"y":283,"id":124,"time":"2020-06-17 03:28:00.598929","s":50.3100775194},{"index":639,"x":107,"y":246,"id":137,"time":"2020-06-17 03:28:00.598929","s":50.3100775194},{"index":640,"x":256,"y":87,"id":143,"time":"2020-06-17 03:28:00.598929","s":50.3100775194},{"index":641,"x":371,"y":427,"id":3,"time":"2020-06-17 03:28:01.361202","s":51.0723514212},{"index":642,"x":584,"y":233,"id":4,"time":"2020-06-17 03:28:01.361202","s":51.0723514212},{"index":643,"x":274,"y":206,"id":99,"time":"2020-06-17 03:28:01.361202","s":51.0723514212},{"index":644,"x":663,"y":114,"id":108,"time":"2020-06-17 03:28:01.361202","s":51.0723514212},{"index":645,"x":350,"y":132,"id":114,"time":"2020-06-17 03:28:01.361202","s":51.0723514212},{"index":646,"x":522,"y":619,"id":123,"time":"2020-06-17 
03:28:01.361202","s":51.0723514212},{"index":647,"x":487,"y":284,"id":124,"time":"2020-06-17 03:28:01.361202","s":51.0723514212},{"index":648,"x":95,"y":240,"id":137,"time":"2020-06-17 03:28:01.361202","s":51.0723514212},{"index":649,"x":254,"y":84,"id":143,"time":"2020-06-17 03:28:01.361202","s":51.0723514212},{"index":650,"x":371,"y":422,"id":3,"time":"2020-06-17 03:28:02.123476","s":51.834625323},{"index":651,"x":488,"y":153,"id":4,"time":"2020-06-17 03:28:02.123476","s":51.834625323},{"index":652,"x":280,"y":211,"id":99,"time":"2020-06-17 03:28:02.123476","s":51.834625323},{"index":653,"x":659,"y":112,"id":108,"time":"2020-06-17 03:28:02.123476","s":51.834625323},{"index":654,"x":521,"y":622,"id":123,"time":"2020-06-17 03:28:02.123476","s":51.834625323},{"index":655,"x":483,"y":281,"id":124,"time":"2020-06-17 03:28:02.123476","s":51.834625323},{"index":656,"x":87,"y":252,"id":137,"time":"2020-06-17 03:28:02.123476","s":51.834625323},{"index":657,"x":772,"y":145,"id":150,"time":"2020-06-17 03:28:02.123476","s":51.834625323},{"index":658,"x":367,"y":423,"id":3,"time":"2020-06-17 03:28:02.885750","s":52.5968992248},{"index":659,"x":433,"y":171,"id":4,"time":"2020-06-17 03:28:02.885750","s":52.5968992248},{"index":660,"x":290,"y":184,"id":99,"time":"2020-06-17 03:28:02.885750","s":52.5968992248},{"index":661,"x":659,"y":113,"id":108,"time":"2020-06-17 03:28:02.885750","s":52.5968992248},{"index":662,"x":350,"y":132,"id":114,"time":"2020-06-17 03:28:02.885750","s":52.5968992248},{"index":663,"x":517,"y":621,"id":123,"time":"2020-06-17 03:28:02.885750","s":52.5968992248},{"index":664,"x":490,"y":283,"id":124,"time":"2020-06-17 03:28:02.885750","s":52.5968992248},{"index":665,"x":95,"y":252,"id":137,"time":"2020-06-17 03:28:02.885750","s":52.5968992248},{"index":666,"x":253,"y":99,"id":143,"time":"2020-06-17 03:28:02.885750","s":52.5968992248},{"index":667,"x":765,"y":143,"id":150,"time":"2020-06-17 03:28:02.885750","s":52.5968992248},{"index":668,"x":359,"y":421,"id":3,"time":"2020-06-17 03:28:03.648024","s":53.3591731266},{"index":669,"x":404,"y":174,"id":4,"time":"2020-06-17 03:28:03.648024","s":53.3591731266},{"index":670,"x":284,"y":197,"id":99,"time":"2020-06-17 03:28:03.648024","s":53.3591731266},{"index":671,"x":657,"y":113,"id":108,"time":"2020-06-17 03:28:03.648024","s":53.3591731266},{"index":672,"x":490,"y":631,"id":123,"time":"2020-06-17 03:28:03.648024","s":53.3591731266},{"index":673,"x":495,"y":282,"id":124,"time":"2020-06-17 03:28:03.648024","s":53.3591731266},{"index":674,"x":257,"y":110,"id":143,"time":"2020-06-17 03:28:03.648024","s":53.3591731266},{"index":675,"x":1132,"y":577,"id":152,"time":"2020-06-17 03:28:03.648024","s":53.3591731266},{"index":676,"x":372,"y":429,"id":3,"time":"2020-06-17 03:28:04.410298","s":54.1214470284},{"index":677,"x":383,"y":157,"id":4,"time":"2020-06-17 03:28:04.410298","s":54.1214470284},{"index":678,"x":311,"y":201,"id":99,"time":"2020-06-17 03:28:04.410298","s":54.1214470284},{"index":679,"x":658,"y":114,"id":108,"time":"2020-06-17 03:28:04.410298","s":54.1214470284},{"index":680,"x":492,"y":286,"id":124,"time":"2020-06-17 03:28:04.410298","s":54.1214470284},{"index":681,"x":123,"y":250,"id":137,"time":"2020-06-17 03:28:04.410298","s":54.1214470284},{"index":682,"x":269,"y":139,"id":143,"time":"2020-06-17 03:28:04.410298","s":54.1214470284},{"index":683,"x":1024,"y":543,"id":152,"time":"2020-06-17 03:28:04.410298","s":54.1214470284},{"index":684,"x":237,"y":203,"id":155,"time":"2020-06-17 
03:28:04.410298","s":54.1214470284},{"index":685,"x":368,"y":438,"id":3,"time":"2020-06-17 03:28:05.172572","s":54.8837209302},{"index":686,"x":382,"y":110,"id":4,"time":"2020-06-17 03:28:05.172572","s":54.8837209302},{"index":687,"x":365,"y":209,"id":99,"time":"2020-06-17 03:28:05.172572","s":54.8837209302},{"index":688,"x":660,"y":114,"id":108,"time":"2020-06-17 03:28:05.172572","s":54.8837209302},{"index":689,"x":490,"y":287,"id":124,"time":"2020-06-17 03:28:05.172572","s":54.8837209302},{"index":690,"x":124,"y":251,"id":137,"time":"2020-06-17 03:28:05.172572","s":54.8837209302},{"index":691,"x":280,"y":187,"id":143,"time":"2020-06-17 03:28:05.172572","s":54.8837209302},{"index":692,"x":771,"y":141,"id":150,"time":"2020-06-17 03:28:05.172572","s":54.8837209302},{"index":693,"x":1014,"y":527,"id":152,"time":"2020-06-17 03:28:05.172572","s":54.8837209302},{"index":694,"x":621,"y":100,"id":157,"time":"2020-06-17 03:28:05.172572","s":54.8837209302},{"index":695,"x":580,"y":664,"id":159,"time":"2020-06-17 03:28:05.172572","s":54.8837209302},{"index":696,"x":337,"y":469,"id":3,"time":"2020-06-17 03:28:05.934846","s":55.645994832},{"index":697,"x":385,"y":251,"id":99,"time":"2020-06-17 03:28:05.934846","s":55.645994832},{"index":698,"x":658,"y":114,"id":108,"time":"2020-06-17 03:28:05.934846","s":55.645994832},{"index":699,"x":490,"y":361,"id":124,"time":"2020-06-17 03:28:05.934846","s":55.645994832},{"index":700,"x":115,"y":252,"id":137,"time":"2020-06-17 03:28:05.934846","s":55.645994832},{"index":701,"x":300,"y":176,"id":143,"time":"2020-06-17 03:28:05.934846","s":55.645994832},{"index":702,"x":1011,"y":555,"id":152,"time":"2020-06-17 03:28:05.934846","s":55.645994832},{"index":703,"x":240,"y":167,"id":155,"time":"2020-06-17 03:28:05.934846","s":55.645994832},{"index":704,"x":620,"y":101,"id":157,"time":"2020-06-17 03:28:05.934846","s":55.645994832},{"index":705,"x":625,"y":582,"id":159,"time":"2020-06-17 03:28:05.934846","s":55.645994832},{"index":706,"x":283,"y":503,"id":3,"time":"2020-06-17 03:28:06.697120","s":56.4082687339},{"index":707,"x":640,"y":108,"id":108,"time":"2020-06-17 03:28:06.697120","s":56.4082687339},{"index":708,"x":417,"y":333,"id":124,"time":"2020-06-17 03:28:06.697120","s":56.4082687339},{"index":709,"x":105,"y":254,"id":137,"time":"2020-06-17 03:28:06.697120","s":56.4082687339},{"index":710,"x":314,"y":242,"id":143,"time":"2020-06-17 03:28:06.697120","s":56.4082687339},{"index":711,"x":763,"y":144,"id":150,"time":"2020-06-17 03:28:06.697120","s":56.4082687339},{"index":712,"x":968,"y":615,"id":152,"time":"2020-06-17 03:28:06.697120","s":56.4082687339},{"index":713,"x":724,"y":540,"id":159,"time":"2020-06-17 03:28:06.697120","s":56.4082687339},{"index":714,"x":437,"y":177,"id":162,"time":"2020-06-17 03:28:06.697120","s":56.4082687339},{"index":715,"x":253,"y":511,"id":3,"time":"2020-06-17 03:28:07.459394","s":57.1705426357},{"index":716,"x":472,"y":242,"id":99,"time":"2020-06-17 03:28:07.459394","s":57.1705426357},{"index":717,"x":634,"y":107,"id":108,"time":"2020-06-17 03:28:07.459394","s":57.1705426357},{"index":718,"x":370,"y":351,"id":124,"time":"2020-06-17 03:28:07.459394","s":57.1705426357},{"index":719,"x":107,"y":251,"id":137,"time":"2020-06-17 03:28:07.459394","s":57.1705426357},{"index":720,"x":316,"y":175,"id":143,"time":"2020-06-17 03:28:07.459394","s":57.1705426357},{"index":721,"x":764,"y":146,"id":150,"time":"2020-06-17 03:28:07.459394","s":57.1705426357},{"index":722,"x":239,"y":152,"id":155,"time":"2020-06-17 
03:28:07.459394","s":57.1705426357},{"index":723,"x":722,"y":535,"id":159,"time":"2020-06-17 03:28:07.459394","s":57.1705426357},{"index":724,"x":712,"y":134,"id":165,"time":"2020-06-17 03:28:07.459394","s":57.1705426357},{"index":725,"x":272,"y":528,"id":3,"time":"2020-06-17 03:28:08.221668","s":57.9328165375},{"index":726,"x":537,"y":251,"id":99,"time":"2020-06-17 03:28:08.221668","s":57.9328165375},{"index":727,"x":639,"y":106,"id":108,"time":"2020-06-17 03:28:08.221668","s":57.9328165375},{"index":728,"x":361,"y":401,"id":124,"time":"2020-06-17 03:28:08.221668","s":57.9328165375},{"index":729,"x":315,"y":155,"id":143,"time":"2020-06-17 03:28:08.221668","s":57.9328165375},{"index":730,"x":764,"y":150,"id":150,"time":"2020-06-17 03:28:08.221668","s":57.9328165375},{"index":731,"x":762,"y":530,"id":159,"time":"2020-06-17 03:28:08.221668","s":57.9328165375},{"index":732,"x":288,"y":555,"id":3,"time":"2020-06-17 03:28:08.983941","s":58.6950904393},{"index":733,"x":584,"y":274,"id":99,"time":"2020-06-17 03:28:08.983941","s":58.6950904393},{"index":734,"x":609,"y":101,"id":108,"time":"2020-06-17 03:28:08.983941","s":58.6950904393},{"index":735,"x":365,"y":412,"id":124,"time":"2020-06-17 03:28:08.983941","s":58.6950904393},{"index":736,"x":108,"y":251,"id":137,"time":"2020-06-17 03:28:08.983941","s":58.6950904393},{"index":737,"x":314,"y":147,"id":143,"time":"2020-06-17 03:28:08.983941","s":58.6950904393},{"index":738,"x":764,"y":152,"id":150,"time":"2020-06-17 03:28:08.983941","s":58.6950904393},{"index":739,"x":780,"y":517,"id":159,"time":"2020-06-17 03:28:08.983941","s":58.6950904393},{"index":740,"x":694,"y":434,"id":168,"time":"2020-06-17 03:28:08.983941","s":58.6950904393},{"index":741,"x":709,"y":147,"id":169,"time":"2020-06-17 03:28:08.983941","s":58.6950904393}]}
\ No newline at end of file
{"schema":{"fields":[{"name":"index","type":"string"},{"name":"total","type":"string"},{"name":"now","type":"string"},{"name":"time","type":"string"},{"name":"s","type":"string"}],"primaryKey":["index"],"pandas_version":"0.20.0"},"data":[]}
\ No newline at end of file
{"schema":{"fields":[{"name":"index","type":"string"},{"name":"x","type":"string"},{"name":"y","type":"string"},{"name":"id","type":"string"},{"name":"time","type":"string"},{"name":"s","type":"string"}],"primaryKey":["index"],"pandas_version":"0.20.0"},"data":[]}
\ No newline at end of file
Keras==2.3.1
tensorflow-gpu==1.13.1
numpy==1.15.0
opencv-python==4.2.0.34
scikit-learn==0.23.1
scipy==1.4.1
Pillow
torch==1.4.0
torchvision==0.5.0
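These pins target the TensorFlow 1.x graph-mode stack used by all of the scripts below. As a minimal sanity check, assuming the packages above are installed, the pinned versions can be verified before running anything heavier:

```
# Illustrative check only: fail fast if the environment drifted from the pins above.
import keras
import numpy as np
import tensorflow as tf

assert tf.__version__.startswith("1.13"), tf.__version__
assert keras.__version__ == "2.3.1", keras.__version__
assert np.__version__ == "1.15.0", np.__version__
print("pinned environment OK")
```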
import cv2
import os

video = "00046"
# input images
im_dir = '/home/cai/Desktop/yolo_dataset/t1_video/t1_video_' + video + '/'
# output video
video_dir = '/home/cai/Desktop/yolo_dataset/t1_video/test_video/det_t1_video_' + video + '_test_q.avi'
fps = 50                  # output frame rate
num = 310                 # number of input images
img_size = (1920, 1080)   # (width, height); must match the input frames
fourcc = cv2.VideoWriter_fourcc('M', 'J', 'P', 'G')  # OpenCV 3.x API
videoWriter = cv2.VideoWriter(video_dir, fourcc, fps, img_size)
for i in range(num):
    im_name = os.path.join(im_dir, 't1_video_' + video + '_' + "%05d" % i + '.jpg')
    frame = cv2.imread(im_name)
    if frame is None:
        # skip missing frames; writing None would crash the VideoWriter
        print('missing frame: ' + im_name)
        continue
    videoWriter.write(frame)
    print(im_name)
videoWriter.release()
print('finish')
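If `img_size` does not match the actual frames, `cv2.VideoWriter` tends to produce an unreadable file without raising an error. A hedged alternative, reusing the variables from the script above, is to derive the size from the first frame on disk:

```
# Optional sketch: infer (width, height) from the first image instead of hard-coding it.
first = cv2.imread(os.path.join(im_dir, 't1_video_' + video + '_' + "%05d" % 0 + '.jpg'))
if first is not None:
    img_size = (first.shape[1], first.shape[0])  # cv2 arrays are (height, width, channels)
```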
import argparse
import tensorflow as tf
import tensorflow.contrib.slim as slim
def _batch_norm_fn(x, scope=None):
if scope is None:
scope = tf.get_variable_scope().name + "/bn"
return slim.batch_norm(x, scope=scope)
def create_link(
incoming, network_builder, scope, nonlinearity=tf.nn.elu,
weights_initializer=tf.truncated_normal_initializer(stddev=1e-3),
regularizer=None, is_first=False, summarize_activations=True):
if is_first:
network = incoming
else:
network = _batch_norm_fn(incoming, scope=scope + "/bn")
network = nonlinearity(network)
if summarize_activations:
tf.summary.histogram(scope+"/activations", network)
pre_block_network = network
post_block_network = network_builder(pre_block_network, scope)
incoming_dim = pre_block_network.get_shape().as_list()[-1]
outgoing_dim = post_block_network.get_shape().as_list()[-1]
if incoming_dim != outgoing_dim:
assert outgoing_dim == 2 * incoming_dim, \
"%d != %d" % (outgoing_dim, 2 * incoming)
projection = slim.conv2d(
incoming, outgoing_dim, 1, 2, padding="SAME", activation_fn=None,
scope=scope+"/projection", weights_initializer=weights_initializer,
biases_initializer=None, weights_regularizer=regularizer)
network = projection + post_block_network
else:
network = incoming + post_block_network
return network
def create_inner_block(
incoming, scope, nonlinearity=tf.nn.elu,
weights_initializer=tf.truncated_normal_initializer(1e-3),
bias_initializer=tf.zeros_initializer(), regularizer=None,
increase_dim=False, summarize_activations=True):
n = incoming.get_shape().as_list()[-1]
stride = 1
if increase_dim:
n *= 2
stride = 2
incoming = slim.conv2d(
incoming, n, [3, 3], stride, activation_fn=nonlinearity, padding="SAME",
normalizer_fn=_batch_norm_fn, weights_initializer=weights_initializer,
biases_initializer=bias_initializer, weights_regularizer=regularizer,
scope=scope + "/1")
if summarize_activations:
tf.summary.histogram(incoming.name + "/activations", incoming)
incoming = slim.dropout(incoming, keep_prob=0.6)
incoming = slim.conv2d(
incoming, n, [3, 3], 1, activation_fn=None, padding="SAME",
normalizer_fn=None, weights_initializer=weights_initializer,
biases_initializer=bias_initializer, weights_regularizer=regularizer,
scope=scope + "/2")
return incoming
def residual_block(incoming, scope, nonlinearity=tf.nn.elu,
                   weights_initializer=tf.truncated_normal_initializer(1e-3),
bias_initializer=tf.zeros_initializer(), regularizer=None,
increase_dim=False, is_first=False,
summarize_activations=True):
def network_builder(x, s):
return create_inner_block(
x, s, nonlinearity, weights_initializer, bias_initializer,
regularizer, increase_dim, summarize_activations)
return create_link(
incoming, network_builder, scope, nonlinearity, weights_initializer,
regularizer, is_first, summarize_activations)
def _create_network(incoming, reuse=None, weight_decay=1e-8):
nonlinearity = tf.nn.elu
conv_weight_init = tf.truncated_normal_initializer(stddev=1e-3)
conv_bias_init = tf.zeros_initializer()
conv_regularizer = slim.l2_regularizer(weight_decay)
fc_weight_init = tf.truncated_normal_initializer(stddev=1e-3)
fc_bias_init = tf.zeros_initializer()
fc_regularizer = slim.l2_regularizer(weight_decay)
def batch_norm_fn(x):
return slim.batch_norm(x, scope=tf.get_variable_scope().name + "/bn")
network = incoming
network = slim.conv2d(
network, 32, [3, 3], stride=1, activation_fn=nonlinearity,
padding="SAME", normalizer_fn=batch_norm_fn, scope="conv1_1",
weights_initializer=conv_weight_init, biases_initializer=conv_bias_init,
weights_regularizer=conv_regularizer)
network = slim.conv2d(
network, 32, [3, 3], stride=1, activation_fn=nonlinearity,
padding="SAME", normalizer_fn=batch_norm_fn, scope="conv1_2",
weights_initializer=conv_weight_init, biases_initializer=conv_bias_init,
weights_regularizer=conv_regularizer)
# NOTE(nwojke): This is missing a padding="SAME" to match the CNN
# architecture in Table 1 of the paper. Information on how this affects
# performance on MOT 16 training sequences can be found in
# issue 10 https://github.com/nwojke/deep_sort/issues/10
network = slim.max_pool2d(network, [3, 3], [2, 2], scope="pool1")
network = residual_block(
network, "conv2_1", nonlinearity, conv_weight_init, conv_bias_init,
conv_regularizer, increase_dim=False, is_first=True)
network = residual_block(
network, "conv2_3", nonlinearity, conv_weight_init, conv_bias_init,
conv_regularizer, increase_dim=False)
network = residual_block(
network, "conv3_1", nonlinearity, conv_weight_init, conv_bias_init,
conv_regularizer, increase_dim=True)
network = residual_block(
network, "conv3_3", nonlinearity, conv_weight_init, conv_bias_init,
conv_regularizer, increase_dim=False)
network = residual_block(
network, "conv4_1", nonlinearity, conv_weight_init, conv_bias_init,
conv_regularizer, increase_dim=True)
network = residual_block(
network, "conv4_3", nonlinearity, conv_weight_init, conv_bias_init,
conv_regularizer, increase_dim=False)
feature_dim = network.get_shape().as_list()[-1]
network = slim.flatten(network)
network = slim.dropout(network, keep_prob=0.6)
network = slim.fully_connected(
network, feature_dim, activation_fn=nonlinearity,
normalizer_fn=batch_norm_fn, weights_regularizer=fc_regularizer,
scope="fc1", weights_initializer=fc_weight_init,
biases_initializer=fc_bias_init)
features = network
# Features in rows, normalize axis 1.
features = slim.batch_norm(features, scope="ball", reuse=reuse)
feature_norm = tf.sqrt(
tf.constant(1e-8, tf.float32) +
tf.reduce_sum(tf.square(features), [1], keepdims=True))
features = features / feature_norm
return features, None
def _network_factory(weight_decay=1e-8):
def factory_fn(image, reuse):
with slim.arg_scope([slim.batch_norm, slim.dropout],
is_training=False):
with slim.arg_scope([slim.conv2d, slim.fully_connected,
slim.batch_norm, slim.layer_norm],
reuse=reuse):
features, logits = _create_network(
image, reuse=reuse, weight_decay=weight_decay)
return features, logits
return factory_fn
def _preprocess(image):
image = image[:, :, ::-1] # BGR to RGB
return image
def parse_args():
"""Parse command line arguments.
"""
parser = argparse.ArgumentParser(description="Freeze old model")
parser.add_argument(
"--checkpoint_in",
default="resources/networks/mars-small128.ckpt-68577",
help="Path to checkpoint file")
parser.add_argument(
"--graphdef_out",
default="resources/networks/mars-small128.pb")
return parser.parse_args()
def main():
args = parse_args()
with tf.Session(graph=tf.Graph()) as session:
input_var = tf.placeholder(
tf.uint8, (None, 128, 64, 3), name="images")
image_var = tf.map_fn(
lambda x: _preprocess(x), tf.cast(input_var, tf.float32),
back_prop=False)
factory_fn = _network_factory()
features, _ = factory_fn(image_var, reuse=None)
features = tf.identity(features, name="features")
        saver = tf.train.Saver(slim.get_variables_to_restore())
        saver.restore(session, args.checkpoint_in)
        output_graph_def = tf.graph_util.convert_variables_to_constants(
            session, tf.get_default_graph().as_graph_def(),
            [features.name.split(":")[0]])
        with tf.gfile.GFile(args.graphdef_out, "wb") as file_handle:
            file_handle.write(output_graph_def.SerializeToString())


if __name__ == "__main__":
    main()
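A hedged end-to-end sketch of how this freezing script is typically used: run it once (the filename `freeze_model.py` and the default paths are assumptions; adjust to your checkout), then load the resulting protobuf and query the `images`/`features` tensors named above:

```
# Illustrative only: load the frozen graph and run it on a dummy batch.
#   $ python freeze_model.py   # writes resources/networks/mars-small128.pb by default
import numpy as np
import tensorflow as tf

with tf.gfile.GFile("resources/networks/mars-small128.pb", "rb") as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())
tf.import_graph_def(graph_def, name="net")

with tf.Session() as sess:
    images = tf.get_default_graph().get_tensor_by_name("net/images:0")
    features = tf.get_default_graph().get_tensor_by_name("net/features:0")
    batch = np.zeros((4, 128, 64, 3), np.uint8)   # (N, height, width, BGR channels)
    print(sess.run(features, feed_dict={images: batch}).shape)  # (4, feature_dim)
```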
import os
import errno
import argparse
import numpy as np
import cv2
import tensorflow as tf
def _run_in_batches(f, data_dict, out, batch_size):
data_len = len(out)
num_batches = int(data_len / batch_size)
s, e = 0, 0
for i in range(num_batches):
s, e = i * batch_size, (i + 1) * batch_size
batch_data_dict = {k: v[s:e] for k, v in data_dict.items()}
out[s:e] = f(batch_data_dict)
if e < len(out):
batch_data_dict = {k: v[e:] for k, v in data_dict.items()}
out[e:] = f(batch_data_dict)
def extract_image_patch(image, bbox, patch_shape):
"""Extract image patch from bounding box.
Parameters
----------
image : ndarray
The full image.
bbox : array_like
The bounding box in format (x, y, width, height).
patch_shape : Optional[array_like]
This parameter can be used to enforce a desired patch shape
(height, width). First, the `bbox` is adapted to the aspect ratio
of the patch shape, then it is clipped at the image boundaries.
If None, the shape is computed from :arg:`bbox`.
Returns
-------
ndarray | NoneType
An image patch showing the :arg:`bbox`, optionally reshaped to
:arg:`patch_shape`.
Returns None if the bounding box is empty or fully outside of the image
boundaries.
"""
bbox = np.array(bbox)
if patch_shape is not None:
# correct aspect ratio to patch shape
target_aspect = float(patch_shape[1]) / patch_shape[0]
new_width = target_aspect * bbox[3]
bbox[0] -= (new_width - bbox[2]) / 2
bbox[2] = new_width
# convert to top left, bottom right
bbox[2:] += bbox[:2]
bbox = bbox.astype(np.int)
# clip at image boundaries
bbox[:2] = np.maximum(0, bbox[:2])
bbox[2:] = np.minimum(np.asarray(image.shape[:2][::-1]) - 1, bbox[2:])
if np.any(bbox[:2] >= bbox[2:]):
return None
sx, sy, ex, ey = bbox
image = image[sy:ey, sx:ex]
image = cv2.resize(image, tuple(patch_shape[::-1]))
return image
class ImageEncoder(object):
def __init__(self, checkpoint_filename, input_name="images",
output_name="features"):
self.session = tf.Session()
with tf.gfile.GFile(checkpoint_filename, "rb") as file_handle:
graph_def = tf.GraphDef()
graph_def.ParseFromString(file_handle.read())
tf.import_graph_def(graph_def, name="net")
self.input_var = tf.get_default_graph().get_tensor_by_name(
"net/%s:0" % input_name)
self.output_var = tf.get_default_graph().get_tensor_by_name(
"net/%s:0" % output_name)
assert len(self.output_var.get_shape()) == 2
assert len(self.input_var.get_shape()) == 4
self.feature_dim = self.output_var.get_shape().as_list()[-1]
self.image_shape = self.input_var.get_shape().as_list()[1:]
def __call__(self, data_x, batch_size=32):
out = np.zeros((len(data_x), self.feature_dim), np.float32)
_run_in_batches(
lambda x: self.session.run(self.output_var, feed_dict=x),
{self.input_var: data_x}, out, batch_size)
return out
def create_box_encoder(model_filename, input_name="images",
output_name="features", batch_size=32):
image_encoder = ImageEncoder(model_filename, input_name, output_name)
image_shape = image_encoder.image_shape
def encoder(image, boxes):
image_patches = []
for box in boxes:
patch = extract_image_patch(image, box, image_shape[:2])
if patch is None:
print("WARNING: Failed to extract image patch: %s." % str(box))
patch = np.random.uniform(
0., 255., image_shape).astype(np.uint8)
image_patches.append(patch)
image_patches = np.asarray(image_patches)
return image_encoder(image_patches, batch_size)
return encoder
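Because `encoder` closes over the loaded graph, it can be called once per frame with all of that frame's boxes. A hedged usage sketch (the image path and boxes are hypothetical):

```
# Illustrative use of create_box_encoder; the image path and boxes are made up.
import cv2
encoder = create_box_encoder("resources/networks/mars-small128.pb", batch_size=32)
frame = cv2.imread("000001.jpg", cv2.IMREAD_COLOR)  # BGR frame
boxes = [[10, 20, 64, 128], [200, 50, 64, 128]]     # (x, y, w, h) per detection
features = encoder(frame, boxes)                    # one feature row per box
```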
def generate_detections(encoder, mot_dir, output_dir, detection_dir=None):
"""Generate detections with features.
Parameters
----------
encoder : Callable[image, ndarray] -> ndarray
The encoder function takes as input a BGR color image and a matrix of
bounding boxes in format `(x, y, w, h)` and returns a matrix of
corresponding feature vectors.
mot_dir : str
Path to the MOTChallenge directory (can be either train or test).
output_dir
Path to the output directory. Will be created if it does not exist.
detection_dir
Path to custom detections. The directory structure should be the default
MOTChallenge structure: `[sequence]/det/det.txt`. If None, uses the
standard MOTChallenge detections.
"""
if detection_dir is None:
detection_dir = mot_dir
try:
os.makedirs(output_dir)
except OSError as exception:
if exception.errno == errno.EEXIST and os.path.isdir(output_dir):
pass
else:
raise ValueError(
"Failed to created output directory '%s'" % output_dir)
for sequence in os.listdir(mot_dir):
print("Processing %s" % sequence)
sequence_dir = os.path.join(mot_dir, sequence)
image_dir = os.path.join(sequence_dir, "img1")
image_filenames = {
int(os.path.splitext(f)[0]): os.path.join(image_dir, f)
for f in os.listdir(image_dir)}
detection_file = os.path.join(
detection_dir, sequence, "det/det.txt")
detections_in = np.loadtxt(detection_file, delimiter=',')
detections_out = []
frame_indices = detections_in[:, 0].astype(np.int)
min_frame_idx = frame_indices.astype(np.int).min()
max_frame_idx = frame_indices.astype(np.int).max()
for frame_idx in range(min_frame_idx, max_frame_idx + 1):
print("Frame %05d/%05d" % (frame_idx, max_frame_idx))
mask = frame_indices == frame_idx
rows = detections_in[mask]
if frame_idx not in image_filenames:
print("WARNING could not find image for frame %d" % frame_idx)
continue
bgr_image = cv2.imread(
image_filenames[frame_idx], cv2.IMREAD_COLOR)
features = encoder(bgr_image, rows[:, 2:6].copy())
detections_out += [np.r_[(row, feature)] for row, feature
in zip(rows, features)]
output_filename = os.path.join(output_dir, "%s.npy" % sequence)
np.save(
output_filename, np.asarray(detections_out), allow_pickle=False)
def parse_args():
"""Parse command line arguments.
"""
parser = argparse.ArgumentParser(description="Re-ID feature extractor")
parser.add_argument(
"--model",
default="resources/networks/mars-small128.pb",
help="Path to freezed inference graph protobuf.")
parser.add_argument(
"--mot_dir", help="Path to MOTChallenge directory (train or test)",
required=True)
parser.add_argument(
"--detection_dir", help="Path to custom detections. Defaults to "
"standard MOT detections Directory structure should be the default "
"MOTChallenge structure: [sequence]/det/det.txt", default=None)
parser.add_argument(
"--output_dir", help="Output directory. Will be created if it does not"
" exist.", default="detections")
return parser.parse_args()
def main():
args = parse_args()
encoder = create_box_encoder(args.model, batch_size=32)
generate_detections(encoder, args.mot_dir, args.output_dir,
args.detection_dir)
if __name__ == "__main__":
main()
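For reference, a hedged invocation matching the arguments defined in `parse_args` (the script filename and directory paths are assumptions; the layout follows the MOTChallenge convention described in the docstring):

```
$ python generate_detections.py \
    --model resources/networks/mars-small128.pb \
    --mot_dir ./MOT16/train \
    --output_dir ./detections
```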
import cv2

image_folder = './mask_face'   # output directory for the extracted frames
video_name = './cut_test.m4v'
vc = cv2.VideoCapture(video_name)
c = 1
rval = vc.isOpened()
while rval:
    rval, frame = vc.read()
    if not rval:
        # the capture is exhausted; the last read returns no frame
        break
    cv2.imwrite(image_folder + '/IMG_' + str(c) + '.jpg', frame)
    c = c + 1
vc.release()
import colorsys
import numpy as np
from keras import backend as K
from keras.models import load_model
from yolo4.model import yolo_eval, Mish
from yolo4.utils import letterbox_image
import os
from keras.utils import multi_gpu_model
class YOLO(object):
def __init__(self):
self.model_path = os.getcwd() + '/../deep_sort_yolov4/model_data/yolo4_weight.h5'
self.anchors_path = os.getcwd() + '/../deep_sort_yolov4/model_data/yolo_anchors.txt'
self.classes_path = os.getcwd() + '/../deep_sort_yolov4/model_data/coco_classes.txt'
self.gpu_num = 1
self.score = 0.5
self.iou = 0.5
self.class_names = self._get_class()
self.anchors = self._get_anchors()
self.sess = K.get_session()
self.model_image_size = (608, 608) # fixed size or (None, None)
self.is_fixed_size = self.model_image_size != (None, None)
self.boxes, self.scores, self.classes = self.generate()
def _get_class(self):
classes_path = os.path.expanduser(self.classes_path)
with open(classes_path) as f:
class_names = f.readlines()
class_names = [c.strip() for c in class_names]
return class_names
def _get_anchors(self):
anchors_path = os.path.expanduser(self.anchors_path)
with open(anchors_path) as f:
anchors = f.readline()
anchors = [float(x) for x in anchors.split(',')]
anchors = np.array(anchors).reshape(-1, 2)
return anchors
def generate(self):
model_path = os.path.expanduser(self.model_path)
assert model_path.endswith('.h5'), 'Keras model or weights must be a .h5 file.'
self.yolo_model = load_model(model_path, custom_objects={'Mish': Mish}, compile=False)
print('{} model, anchors, and classes loaded.'.format(model_path))
# Generate colors for drawing bounding boxes.
hsv_tuples = [(x / len(self.class_names), 1., 1.)
for x in range(len(self.class_names))]
self.colors = list(map(lambda x: colorsys.hsv_to_rgb(*x), hsv_tuples))
self.colors = list(
map(lambda x: (int(x[0] * 255), int(x[1] * 255), int(x[2] * 255)),
self.colors))
np.random.seed(10101) # Fixed seed for consistent colors across runs.
np.random.shuffle(self.colors) # Shuffle colors to decorrelate adjacent classes.
np.random.seed(None) # Reset seed to default.
# Generate output tensor targets for filtered bounding boxes.
self.input_image_shape = K.placeholder(shape=(2, ))
if self.gpu_num>=2:
self.yolo_model = multi_gpu_model(self.yolo_model, gpus=self.gpu_num)
boxes, scores, classes = yolo_eval(self.yolo_model.output, self.anchors,
len(self.class_names), self.input_image_shape,
score_threshold=self.score, iou_threshold=self.iou)
return boxes, scores, classes
def detect_image(self, image):
if self.is_fixed_size:
assert self.model_image_size[0]%32 == 0, 'Multiples of 32 required'
assert self.model_image_size[1]%32 == 0, 'Multiples of 32 required'
boxed_image = letterbox_image(image, tuple(reversed(self.model_image_size)))
else:
new_image_size = (image.width - (image.width % 32),
image.height - (image.height % 32))
boxed_image = letterbox_image(image, new_image_size)
image_data = np.array(boxed_image, dtype='float32')
# print(image_data.shape)
image_data /= 255.
image_data = np.expand_dims(image_data, 0) # Add batch dimension.
out_boxes, out_scores, out_classes = self.sess.run(
[self.boxes, self.scores, self.classes],
feed_dict={
self.yolo_model.input: image_data,
self.input_image_shape: [image.size[1], image.size[0]],
K.learning_phase(): 0
})
return_boxes = []
return_scores = []
return_class_names = []
for i, c in reversed(list(enumerate(out_classes))):
predicted_class = self.class_names[c]
if predicted_class != 'person': # Modify to detect other classes.
continue
box = out_boxes[i]
score = out_scores[i]
x = int(box[1])
y = int(box[0])
w = int(box[3] - box[1])
h = int(box[2] - box[0])
if x < 0:
w = w + x
x = 0
if y < 0:
h = h + y
y = 0
return_boxes.append([x, y, w, h])
return_scores.append(score)
return_class_names.append(predicted_class)
return return_boxes, return_scores, return_class_names
def close_session(self):
self.sess.close()
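A hedged sketch of driving this class directly (the image path is hypothetical; the `model_data` files configured in `__init__` must exist):

```
# Illustrative only; needs yolo4_weight.h5, yolo_anchors.txt and coco_classes.txt.
from PIL import Image

yolo = YOLO()
image = Image.open("demo.jpg")
boxes, scores, class_names = yolo.detect_image(image)
for (x, y, w, h), score, name in zip(boxes, scores, class_names):
    print(name, score, (x, y, w, h))  # top-left corner plus width/height, in pixels
yolo.close_session()
```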
"""YOLO_v4 Model Defined in Keras."""
from functools import wraps
import numpy as np
import tensorflow as tf
from keras import backend as K
from keras.engine.base_layer import Layer
from keras.layers import Conv2D, Add, ZeroPadding2D, UpSampling2D, Concatenate, MaxPooling2D
from keras.layers.advanced_activations import LeakyReLU
from keras.layers.normalization import BatchNormalization
from keras.models import Model
from keras.regularizers import l2
from yolo4.utils import compose
class Mish(Layer):
def __init__(self, **kwargs):
super(Mish, self).__init__(**kwargs)
self.supports_masking = True
def call(self, inputs):
return inputs * K.tanh(K.softplus(inputs))
def get_config(self):
config = super(Mish, self).get_config()
return config
def compute_output_shape(self, input_shape):
return input_shape
@wraps(Conv2D)
def DarknetConv2D(*args, **kwargs):
"""Wrapper to set Darknet parameters for Convolution2D."""
darknet_conv_kwargs = {'kernel_regularizer': l2(5e-4)}
darknet_conv_kwargs['padding'] = 'valid' if kwargs.get('strides')==(2,2) else 'same'
darknet_conv_kwargs.update(kwargs)
return Conv2D(*args, **darknet_conv_kwargs)
def DarknetConv2D_BN_Leaky(*args, **kwargs):
"""Darknet Convolution2D followed by BatchNormalization and LeakyReLU."""
no_bias_kwargs = {'use_bias': False}
no_bias_kwargs.update(kwargs)
return compose(
DarknetConv2D(*args, **no_bias_kwargs),
BatchNormalization(),
LeakyReLU(alpha=0.1))
def DarknetConv2D_BN_Mish(*args, **kwargs):
"""Darknet Convolution2D followed by BatchNormalization and LeakyReLU."""
no_bias_kwargs = {'use_bias': False}
no_bias_kwargs.update(kwargs)
return compose(
DarknetConv2D(*args, **no_bias_kwargs),
BatchNormalization(),
Mish())
def resblock_body(x, num_filters, num_blocks, all_narrow=True):
'''A series of resblocks starting with a downsampling Convolution2D'''
# Darknet uses left and top padding instead of 'same' mode
preconv1 = ZeroPadding2D(((1,0),(1,0)))(x)
preconv1 = DarknetConv2D_BN_Mish(num_filters, (3,3), strides=(2,2))(preconv1)
shortconv = DarknetConv2D_BN_Mish(num_filters//2 if all_narrow else num_filters, (1,1))(preconv1)
mainconv = DarknetConv2D_BN_Mish(num_filters//2 if all_narrow else num_filters, (1,1))(preconv1)
for i in range(num_blocks):
y = compose(
DarknetConv2D_BN_Mish(num_filters//2, (1,1)),
DarknetConv2D_BN_Mish(num_filters//2 if all_narrow else num_filters, (3,3)))(mainconv)
mainconv = Add()([mainconv,y])
postconv = DarknetConv2D_BN_Mish(num_filters//2 if all_narrow else num_filters, (1,1))(mainconv)
route = Concatenate()([postconv, shortconv])
return DarknetConv2D_BN_Mish(num_filters, (1,1))(route)
def darknet_body(x):
    '''Darknet body having 52 Convolution2D layers'''
x = DarknetConv2D_BN_Mish(32, (3,3))(x)
x = resblock_body(x, 64, 1, False)
x = resblock_body(x, 128, 2)
x = resblock_body(x, 256, 8)
x = resblock_body(x, 512, 8)
x = resblock_body(x, 1024, 4)
return x
def make_last_layers(x, num_filters, out_filters):
'''6 Conv2D_BN_Leaky layers followed by a Conv2D_linear layer'''
x = compose(
DarknetConv2D_BN_Leaky(num_filters, (1,1)),
DarknetConv2D_BN_Leaky(num_filters*2, (3,3)),
DarknetConv2D_BN_Leaky(num_filters, (1,1)),
DarknetConv2D_BN_Leaky(num_filters*2, (3,3)),
DarknetConv2D_BN_Leaky(num_filters, (1,1)))(x)
y = compose(
DarknetConv2D_BN_Leaky(num_filters*2, (3,3)),
DarknetConv2D(out_filters, (1,1)))(x)
return x, y
def yolo4_body(inputs, num_anchors, num_classes):
"""Create YOLO_V4 model CNN body in Keras."""
darknet = Model(inputs, darknet_body(inputs))
#19x19 head
y19 = DarknetConv2D_BN_Leaky(512, (1,1))(darknet.output)
y19 = DarknetConv2D_BN_Leaky(1024, (3,3))(y19)
y19 = DarknetConv2D_BN_Leaky(512, (1,1))(y19)
maxpool1 = MaxPooling2D(pool_size=(13,13), strides=(1,1), padding='same')(y19)
maxpool2 = MaxPooling2D(pool_size=(9,9), strides=(1,1), padding='same')(y19)
maxpool3 = MaxPooling2D(pool_size=(5,5), strides=(1,1), padding='same')(y19)
y19 = Concatenate()([maxpool1, maxpool2, maxpool3, y19])
y19 = DarknetConv2D_BN_Leaky(512, (1,1))(y19)
y19 = DarknetConv2D_BN_Leaky(1024, (3,3))(y19)
y19 = DarknetConv2D_BN_Leaky(512, (1,1))(y19)
y19_upsample = compose(DarknetConv2D_BN_Leaky(256, (1,1)), UpSampling2D(2))(y19)
#38x38 head
y38 = DarknetConv2D_BN_Leaky(256, (1,1))(darknet.layers[204].output)
y38 = Concatenate()([y38, y19_upsample])
y38 = DarknetConv2D_BN_Leaky(256, (1,1))(y38)
y38 = DarknetConv2D_BN_Leaky(512, (3,3))(y38)
y38 = DarknetConv2D_BN_Leaky(256, (1,1))(y38)
y38 = DarknetConv2D_BN_Leaky(512, (3,3))(y38)
y38 = DarknetConv2D_BN_Leaky(256, (1,1))(y38)
y38_upsample = compose(DarknetConv2D_BN_Leaky(128, (1,1)), UpSampling2D(2))(y38)
#76x76 head
y76 = DarknetConv2D_BN_Leaky(128, (1,1))(darknet.layers[131].output)
y76 = Concatenate()([y76, y38_upsample])
y76 = DarknetConv2D_BN_Leaky(128, (1,1))(y76)
y76 = DarknetConv2D_BN_Leaky(256, (3,3))(y76)
y76 = DarknetConv2D_BN_Leaky(128, (1,1))(y76)
y76 = DarknetConv2D_BN_Leaky(256, (3,3))(y76)
y76 = DarknetConv2D_BN_Leaky(128, (1,1))(y76)
#76x76 output
y76_output = DarknetConv2D_BN_Leaky(256, (3,3))(y76)
y76_output = DarknetConv2D(num_anchors*(num_classes+5), (1,1))(y76_output)
#38x38 output
y76_downsample = ZeroPadding2D(((1,0),(1,0)))(y76)
y76_downsample = DarknetConv2D_BN_Leaky(256, (3,3), strides=(2,2))(y76_downsample)
y38 = Concatenate()([y76_downsample, y38])
y38 = DarknetConv2D_BN_Leaky(256, (1,1))(y38)
y38 = DarknetConv2D_BN_Leaky(512, (3,3))(y38)
y38 = DarknetConv2D_BN_Leaky(256, (1,1))(y38)
y38 = DarknetConv2D_BN_Leaky(512, (3,3))(y38)
y38 = DarknetConv2D_BN_Leaky(256, (1,1))(y38)
y38_output = DarknetConv2D_BN_Leaky(512, (3,3))(y38)
y38_output = DarknetConv2D(num_anchors*(num_classes+5), (1,1))(y38_output)
#19x19 output
y38_downsample = ZeroPadding2D(((1,0),(1,0)))(y38)
y38_downsample = DarknetConv2D_BN_Leaky(512, (3,3), strides=(2,2))(y38_downsample)
y19 = Concatenate()([y38_downsample, y19])
y19 = DarknetConv2D_BN_Leaky(512, (1,1))(y19)
y19 = DarknetConv2D_BN_Leaky(1024, (3,3))(y19)
y19 = DarknetConv2D_BN_Leaky(512, (1,1))(y19)
y19 = DarknetConv2D_BN_Leaky(1024, (3,3))(y19)
y19 = DarknetConv2D_BN_Leaky(512, (1,1))(y19)
y19_output = DarknetConv2D_BN_Leaky(1024, (3,3))(y19)
y19_output = DarknetConv2D(num_anchors*(num_classes+5), (1,1))(y19_output)
yolo4_model = Model(inputs, [y19_output, y38_output, y76_output])
return yolo4_model
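The three heads predict at strides 32, 16, and 8. A hedged sketch of instantiating the body for COCO-style training (608x608 input to match the detector configuration above):

```
# Illustrative: build the body for 80 classes with 3 anchors per scale.
from keras.layers import Input

inputs = Input(shape=(608, 608, 3))
model = yolo4_body(inputs, num_anchors=3, num_classes=80)
# Head shapes: (19, 19, 255), (38, 38, 255), (76, 76, 255); 255 = 3 * (80 + 5).
```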
def yolo_head(feats, anchors, num_classes, input_shape, calc_loss=False):
"""Convert final layer features to bounding box parameters."""
num_anchors = len(anchors)
# Reshape to batch, height, width, num_anchors, box_params.
anchors_tensor = K.reshape(K.constant(anchors), [1, 1, 1, num_anchors, 2])
grid_shape = K.shape(feats)[1:3] # height, width
grid_y = K.tile(K.reshape(K.arange(0, stop=grid_shape[0]), [-1, 1, 1, 1]),
[1, grid_shape[1], 1, 1])
grid_x = K.tile(K.reshape(K.arange(0, stop=grid_shape[1]), [1, -1, 1, 1]),
[grid_shape[0], 1, 1, 1])
grid = K.concatenate([grid_x, grid_y])
grid = K.cast(grid, K.dtype(feats))
feats = K.reshape(
feats, [-1, grid_shape[0], grid_shape[1], num_anchors, num_classes + 5])
    # Adjust predictions to each spatial grid point and anchor size.
box_xy = (K.sigmoid(feats[..., :2]) + grid) / K.cast(grid_shape[...,::-1], K.dtype(feats))
box_wh = K.exp(feats[..., 2:4]) * anchors_tensor / K.cast(input_shape[...,::-1], K.dtype(feats))
box_confidence = K.sigmoid(feats[..., 4:5])
box_class_probs = K.sigmoid(feats[..., 5:])
    if calc_loss:
return grid, feats, box_xy, box_wh
return box_xy, box_wh, box_confidence, box_class_probs
def yolo_correct_boxes(box_xy, box_wh, input_shape, image_shape):
'''Get corrected boxes'''
box_yx = box_xy[..., ::-1]
box_hw = box_wh[..., ::-1]
input_shape = K.cast(input_shape, K.dtype(box_yx))
image_shape = K.cast(image_shape, K.dtype(box_yx))
new_shape = K.round(image_shape * K.min(input_shape/image_shape))
offset = (input_shape-new_shape)/2./input_shape
scale = input_shape/new_shape
box_yx = (box_yx - offset) * scale
box_hw *= scale
box_mins = box_yx - (box_hw / 2.)
box_maxes = box_yx + (box_hw / 2.)
boxes = K.concatenate([
box_mins[..., 0:1], # y_min
box_mins[..., 1:2], # x_min
box_maxes[..., 0:1], # y_max
box_maxes[..., 1:2] # x_max
])
# Scale boxes back to original image shape.
boxes *= K.concatenate([image_shape, image_shape])
return boxes
def yolo_boxes_and_scores(feats, anchors, num_classes, input_shape, image_shape):
'''Process Conv layer output'''
box_xy, box_wh, box_confidence, box_class_probs = yolo_head(feats,
anchors, num_classes, input_shape)
boxes = yolo_correct_boxes(box_xy, box_wh, input_shape, image_shape)
boxes = K.reshape(boxes, [-1, 4])
box_scores = box_confidence * box_class_probs
box_scores = K.reshape(box_scores, [-1, num_classes])
return boxes, box_scores
def yolo_eval(yolo_outputs,
anchors,
num_classes,
image_shape,
max_boxes=200,
score_threshold=.5,
iou_threshold=.5):
"""Evaluate YOLO model on given input and return filtered boxes."""
num_layers = len(yolo_outputs)
anchor_mask = [[6,7,8], [3,4,5], [0,1,2]]
input_shape = K.shape(yolo_outputs[0])[1:3] * 32
boxes = []
box_scores = []
for l in range(num_layers):
_boxes, _box_scores = yolo_boxes_and_scores(yolo_outputs[l],
anchors[anchor_mask[l]], num_classes, input_shape, image_shape)
boxes.append(_boxes)
box_scores.append(_box_scores)
boxes = K.concatenate(boxes, axis=0)
box_scores = K.concatenate(box_scores, axis=0)
mask = box_scores >= score_threshold
max_boxes_tensor = K.constant(max_boxes, dtype='int32')
boxes_ = []
scores_ = []
classes_ = []
for c in range(num_classes):
class_boxes = tf.boolean_mask(boxes, mask[:, c])
class_box_scores = tf.boolean_mask(box_scores[:, c], mask[:, c])
nms_index = tf.image.non_max_suppression(
class_boxes, class_box_scores, max_boxes_tensor, iou_threshold=iou_threshold)
class_boxes = K.gather(class_boxes, nms_index)
class_box_scores = K.gather(class_box_scores, nms_index)
classes = K.ones_like(class_box_scores, 'int32') * c
boxes_.append(class_boxes)
scores_.append(class_box_scores)
classes_.append(classes)
boxes_ = K.concatenate(boxes_, axis=0)
scores_ = K.concatenate(scores_, axis=0)
classes_ = K.concatenate(classes_, axis=0)
return boxes_, scores_, classes_
def preprocess_true_boxes(true_boxes, input_shape, anchors, num_classes):
'''Preprocess true boxes to training input format
Parameters
----------
true_boxes: array, shape=(m, T, 5)
Absolute x_min, y_min, x_max, y_max, class_id relative to input_shape.
input_shape: array-like, hw, multiples of 32
anchors: array, shape=(N, 2), wh
num_classes: integer
Returns
-------
    y_true: list of array, shape like yolo_outputs, xywh are relative values
'''
assert (true_boxes[..., 4]<num_classes).all(), 'class id must be less than num_classes'
num_layers = len(anchors)//3 # default setting
anchor_mask = [[6,7,8], [3,4,5], [0,1,2]] if num_layers==3 else [[3,4,5], [1,2,3]]
true_boxes = np.array(true_boxes, dtype='float32')
input_shape = np.array(input_shape, dtype='int32')
boxes_xy = (true_boxes[..., 0:2] + true_boxes[..., 2:4]) // 2
boxes_wh = true_boxes[..., 2:4] - true_boxes[..., 0:2]
true_boxes[..., 0:2] = boxes_xy/input_shape[::-1]
true_boxes[..., 2:4] = boxes_wh/input_shape[::-1]
m = true_boxes.shape[0]
grid_shapes = [input_shape//{0:32, 1:16, 2:8}[l] for l in range(num_layers)]
y_true = [np.zeros((m,grid_shapes[l][0],grid_shapes[l][1],len(anchor_mask[l]),5+num_classes),
dtype='float32') for l in range(num_layers)]
# Expand dim to apply broadcasting.
anchors = np.expand_dims(anchors, 0)
anchor_maxes = anchors / 2.
anchor_mins = -anchor_maxes
valid_mask = boxes_wh[..., 0]>0
for b in range(m):
# Discard zero rows.
wh = boxes_wh[b, valid_mask[b]]
if len(wh)==0: continue
# Expand dim to apply broadcasting.
wh = np.expand_dims(wh, -2)
box_maxes = wh / 2.
box_mins = -box_maxes
intersect_mins = np.maximum(box_mins, anchor_mins)
intersect_maxes = np.minimum(box_maxes, anchor_maxes)
intersect_wh = np.maximum(intersect_maxes - intersect_mins, 0.)
intersect_area = intersect_wh[..., 0] * intersect_wh[..., 1]
box_area = wh[..., 0] * wh[..., 1]
anchor_area = anchors[..., 0] * anchors[..., 1]
iou = intersect_area / (box_area + anchor_area - intersect_area)
# Find best anchor for each true box
best_anchor = np.argmax(iou, axis=-1)
for t, n in enumerate(best_anchor):
for l in range(num_layers):
if n in anchor_mask[l]:
i = np.floor(true_boxes[b,t,0]*grid_shapes[l][1]).astype('int32')
j = np.floor(true_boxes[b,t,1]*grid_shapes[l][0]).astype('int32')
k = anchor_mask[l].index(n)
c = true_boxes[b,t, 4].astype('int32')
y_true[l][b, j, i, k, 0:4] = true_boxes[b,t, 0:4]
y_true[l][b, j, i, k, 4] = 1
y_true[l][b, j, i, k, 5+c] = 1
return y_true
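A hedged worked example of the targets this produces, assuming the standard nine COCO anchors and a single ground-truth box:

```
# Illustrative: one image, one box given as (x_min, y_min, x_max, y_max, class_id).
import numpy as np

anchors = np.array([[10, 13], [16, 30], [33, 23], [30, 61], [62, 45],
                    [59, 119], [116, 90], [156, 198], [373, 326]])
true_boxes = np.array([[[100., 150., 200., 350., 0.]]], dtype='float32')
y_true = preprocess_true_boxes(true_boxes, (416, 416), anchors, num_classes=80)
print([y.shape for y in y_true])
# [(1, 13, 13, 3, 85), (1, 26, 26, 3, 85), (1, 52, 52, 3, 85)]
```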
def softmax_focal_loss(y_true, y_pred, gamma=2.0, alpha=0.25):
"""
Compute softmax focal loss.
Reference Paper:
"Focal Loss for Dense Object Detection"
https://arxiv.org/abs/1708.02002
# Arguments
y_true: Ground truth targets,
tensor of shape (?, num_boxes, num_classes).
y_pred: Predicted logits,
tensor of shape (?, num_boxes, num_classes).
gamma: exponent of the modulating factor (1 - p_t) ^ gamma.
alpha: optional alpha weighting factor to balance positives vs negatives.
# Returns
softmax_focal_loss: Softmax focal loss, tensor of shape (?, num_boxes).
"""
# Scale predictions so that the class probas of each sample sum to 1
#y_pred /= K.sum(y_pred, axis=-1, keepdims=True)
# Clip the prediction value to prevent NaN's and Inf's
#epsilon = K.epsilon()
#y_pred = K.clip(y_pred, epsilon, 1. - epsilon)
y_pred = tf.nn.softmax(y_pred)
y_pred = tf.maximum(tf.minimum(y_pred, 1 - 1e-15), 1e-15)
# Calculate Cross Entropy
cross_entropy = -y_true * tf.math.log(y_pred)
# Calculate Focal Loss
softmax_focal_loss = alpha * tf.pow(1 - y_pred, gamma) * cross_entropy
return softmax_focal_loss
def sigmoid_focal_loss(y_true, y_pred, gamma=2.0, alpha=0.25):
"""
Compute sigmoid focal loss.
Reference Paper:
"Focal Loss for Dense Object Detection"
https://arxiv.org/abs/1708.02002
# Arguments
y_true: Ground truth targets,
tensor of shape (?, num_boxes, num_classes).
y_pred: Predicted logits,
tensor of shape (?, num_boxes, num_classes).
gamma: exponent of the modulating factor (1 - p_t) ^ gamma.
alpha: optional alpha weighting factor to balance positives vs negatives.
# Returns
sigmoid_focal_loss: Sigmoid focal loss, tensor of shape (?, num_boxes).
"""
sigmoid_loss = K.binary_crossentropy(y_true, y_pred, from_logits=True)
pred_prob = tf.sigmoid(y_pred)
p_t = ((y_true * pred_prob) + ((1 - y_true) * (1 - pred_prob)))
modulating_factor = tf.pow(1.0 - p_t, gamma)
alpha_weight_factor = (y_true * alpha + (1 - y_true) * (1 - alpha))
sigmoid_focal_loss = modulating_factor * alpha_weight_factor * sigmoid_loss
#sigmoid_focal_loss = tf.reduce_sum(sigmoid_focal_loss, axis=-1)
return sigmoid_focal_loss
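# Reference-only helper (a sketch, not used by the losses above): the focal
# modulating factor as plain Python, handy for sanity checks.
def _focal_modulating_factor(p_t, gamma=2.0):
    """(1 - p_t)**gamma from the focal loss paper; p_t is the probability
    assigned to the true label (p for positives, 1 - p for negatives)."""
    return (1.0 - p_t) ** gamma
# e.g. _focal_modulating_factor(0.9) is ~0.01: well-classified examples
# contribute ~100x less than with unweighted cross-entropy.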
def box_iou(b1, b2):
"""
Return iou tensor
Parameters
----------
b1: tensor, shape=(i1,...,iN, 4), xywh
b2: tensor, shape=(j, 4), xywh
Returns
-------
iou: tensor, shape=(i1,...,iN, j)
"""
# Expand dim to apply broadcasting.
b1 = K.expand_dims(b1, -2)
b1_xy = b1[..., :2]
b1_wh = b1[..., 2:4]
b1_wh_half = b1_wh/2.
b1_mins = b1_xy - b1_wh_half
b1_maxes = b1_xy + b1_wh_half
# Expand dim to apply broadcasting.
b2 = K.expand_dims(b2, 0)
b2_xy = b2[..., :2]
b2_wh = b2[..., 2:4]
b2_wh_half = b2_wh/2.
b2_mins = b2_xy - b2_wh_half
b2_maxes = b2_xy + b2_wh_half
intersect_mins = K.maximum(b1_mins, b2_mins)
intersect_maxes = K.minimum(b1_maxes, b2_maxes)
intersect_wh = K.maximum(intersect_maxes - intersect_mins, 0.)
intersect_area = intersect_wh[..., 0] * intersect_wh[..., 1]
b1_area = b1_wh[..., 0] * b1_wh[..., 1]
b2_area = b2_wh[..., 0] * b2_wh[..., 1]
iou = intersect_area / (b1_area + b2_area - intersect_area)
return iou
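# Shape sketch: with predicted boxes of shape (13, 13, 3, 4) and T ground-truth
# boxes of shape (T, 4), the broadcasted IoU tensor has shape (13, 13, 3, T);
# K.max(iou, axis=-1) then yields each prediction's best overlap, which is how
# the ignore-mask loops in the loss functions below consume this helper.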
def box_giou(b1, b2):
"""
    Calculate GIoU of predicted and ground-truth boxes (the caller forms the loss as 1 - GIoU)
Reference Paper:
"Generalized Intersection over Union: A Metric and A Loss for Bounding Box Regression"
https://arxiv.org/abs/1902.09630
Parameters
----------
b1: tensor, shape=(batch, feat_w, feat_h, anchor_num, 4), xywh
b2: tensor, shape=(batch, feat_w, feat_h, anchor_num, 4), xywh
Returns
-------
giou: tensor, shape=(batch, feat_w, feat_h, anchor_num, 1)
"""
b1_xy = b1[..., :2]
b1_wh = b1[..., 2:4]
b1_wh_half = b1_wh/2.
b1_mins = b1_xy - b1_wh_half
b1_maxes = b1_xy + b1_wh_half
b2_xy = b2[..., :2]
b2_wh = b2[..., 2:4]
b2_wh_half = b2_wh/2.
b2_mins = b2_xy - b2_wh_half
b2_maxes = b2_xy + b2_wh_half
intersect_mins = K.maximum(b1_mins, b2_mins)
intersect_maxes = K.minimum(b1_maxes, b2_maxes)
intersect_wh = K.maximum(intersect_maxes - intersect_mins, 0.)
intersect_area = intersect_wh[..., 0] * intersect_wh[..., 1]
b1_area = b1_wh[..., 0] * b1_wh[..., 1]
b2_area = b2_wh[..., 0] * b2_wh[..., 1]
union_area = b1_area + b2_area - intersect_area
# calculate IoU, add epsilon in denominator to avoid dividing by 0
iou = intersect_area / (union_area + K.epsilon())
# get enclosed area
enclose_mins = K.minimum(b1_mins, b2_mins)
enclose_maxes = K.maximum(b1_maxes, b2_maxes)
enclose_wh = K.maximum(enclose_maxes - enclose_mins, 0.0)
enclose_area = enclose_wh[..., 0] * enclose_wh[..., 1]
# calculate GIoU, add epsilon in denominator to avoid dividing by 0
giou = iou - 1.0 * (enclose_area - union_area) / (enclose_area + K.epsilon())
giou = K.expand_dims(giou, -1)
return giou
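# NumPy reference for two single boxes (a sketch mirroring box_giou above,
# assuming np is the numpy module imported by this file):
def _giou_np(b1, b2):
    """b1, b2: [cx, cy, w, h] arrays. Returns scalar GIoU."""
    b1, b2 = np.asarray(b1, dtype='float64'), np.asarray(b2, dtype='float64')
    inter_mins = np.maximum(b1[:2] - b1[2:] / 2., b2[:2] - b2[2:] / 2.)
    inter_maxes = np.minimum(b1[:2] + b1[2:] / 2., b2[:2] + b2[2:] / 2.)
    inter = np.prod(np.maximum(inter_maxes - inter_mins, 0.))
    union = np.prod(b1[2:]) + np.prod(b2[2:]) - inter
    enc_wh = np.maximum(b1[:2] + b1[2:] / 2., b2[:2] + b2[2:] / 2.) \
             - np.minimum(b1[:2] - b1[2:] / 2., b2[:2] - b2[2:] / 2.)
    enclose = np.prod(enc_wh)
    return inter / union - (enclose - union) / enclose
# Two disjoint unit boxes centered at (0,0) and (3,0): IoU is 0, but GIoU is
# -0.5, so the loss 1 - GIoU still provides a gradient for non-overlapping boxes.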
def box_diou(b1, b2):
"""
    Calculate DIoU of predicted and ground-truth boxes (the caller forms the loss as 1 - DIoU)
Reference Paper:
"Distance-IoU Loss: Faster and Better Learning for Bounding Box Regression"
https://arxiv.org/abs/1911.08287
Parameters
----------
b1: tensor, shape=(batch, feat_w, feat_h, anchor_num, 4), xywh
b2: tensor, shape=(batch, feat_w, feat_h, anchor_num, 4), xywh
Returns
-------
diou: tensor, shape=(batch, feat_w, feat_h, anchor_num, 1)
"""
b1_xy = b1[..., :2]
b1_wh = b1[..., 2:4]
b1_wh_half = b1_wh/2.
b1_mins = b1_xy - b1_wh_half
b1_maxes = b1_xy + b1_wh_half
b2_xy = b2[..., :2]
b2_wh = b2[..., 2:4]
b2_wh_half = b2_wh/2.
b2_mins = b2_xy - b2_wh_half
b2_maxes = b2_xy + b2_wh_half
intersect_mins = K.maximum(b1_mins, b2_mins)
intersect_maxes = K.minimum(b1_maxes, b2_maxes)
intersect_wh = K.maximum(intersect_maxes - intersect_mins, 0.)
intersect_area = intersect_wh[..., 0] * intersect_wh[..., 1]
b1_area = b1_wh[..., 0] * b1_wh[..., 1]
b2_area = b2_wh[..., 0] * b2_wh[..., 1]
union_area = b1_area + b2_area - intersect_area
# calculate IoU, add epsilon in denominator to avoid dividing by 0
iou = intersect_area / (union_area + K.epsilon())
# box center distance
center_distance = K.sum(K.square(b1_xy - b2_xy), axis=-1)
# get enclosed area
enclose_mins = K.minimum(b1_mins, b2_mins)
enclose_maxes = K.maximum(b1_maxes, b2_maxes)
enclose_wh = K.maximum(enclose_maxes - enclose_mins, 0.0)
# get enclosed diagonal distance
enclose_diagonal = K.sum(K.square(enclose_wh), axis=-1)
# calculate DIoU, add epsilon in denominator to avoid dividing by 0
diou = iou - 1.0 * (center_distance) / (enclose_diagonal + K.epsilon())
# calculate param v and alpha to extend to CIoU
#v = 4*K.square(tf.math.atan2(b1_wh[..., 0], b1_wh[..., 1]) - tf.math.atan2(b2_wh[..., 0], b2_wh[..., 1])) / (math.pi * math.pi)
#alpha = v / (1.0 - iou + v)
#diou = diou - alpha*v
diou = K.expand_dims(diou, -1)
return diou
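# Intuition sketch: DIoU = IoU - d^2 / c^2, where d is the center distance and
# c the diagonal of the smallest enclosing box (center_distance and
# enclose_diagonal above are already squared). Two box pairs with identical IoU
# but farther-apart centers therefore get a lower DIoU; uncommenting the
# v/alpha lines above extends this to CIoU by also penalizing aspect-ratio
# mismatch.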
def _smooth_labels(y_true, label_smoothing):
label_smoothing = K.constant(label_smoothing, dtype=K.floatx())
return y_true * (1.0 - label_smoothing) + 0.5 * label_smoothing
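# e.g. with label_smoothing=0.1, targets map as y*0.9 + 0.05, so a one-hot
# (1, 0) becomes (0.95, 0.05).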
def yolo4_loss(args, anchors, num_classes, ignore_thresh=.5, label_smoothing=0, use_focal_loss=False, use_focal_obj_loss=False, use_softmax_loss=False, use_giou_loss=False, use_diou_loss=False):
'''Return yolo4_loss tensor
Parameters
----------
yolo_outputs: list of tensor, the output of yolo_body or tiny_yolo_body
y_true: list of array, the output of preprocess_true_boxes
anchors: array, shape=(N, 2), wh
num_classes: integer
    ignore_thresh: float, IoU threshold; predictions whose best IoU with any ground-truth box exceeds it are excluded from the no-object confidence loss
Returns
-------
loss: tensor, shape=(1,)
'''
num_layers = len(anchors)//3 # default setting
yolo_outputs = args[:num_layers]
y_true = args[num_layers:]
anchor_mask = [[6,7,8], [3,4,5], [0,1,2]] if num_layers==3 else [[3,4,5], [0,1,2]]
input_shape = K.cast(K.shape(yolo_outputs[0])[1:3] * 32, K.dtype(y_true[0]))
grid_shapes = [K.cast(K.shape(yolo_outputs[l])[1:3], K.dtype(y_true[0])) for l in range(num_layers)]
loss = 0
total_location_loss = 0
total_confidence_loss = 0
total_class_loss = 0
m = K.shape(yolo_outputs[0])[0] # batch size, tensor
mf = K.cast(m, K.dtype(yolo_outputs[0]))
for l in range(num_layers):
object_mask = y_true[l][..., 4:5]
true_class_probs = y_true[l][..., 5:]
if label_smoothing:
true_class_probs = _smooth_labels(true_class_probs, label_smoothing)
grid, raw_pred, pred_xy, pred_wh = yolo_head(yolo_outputs[l],
anchors[anchor_mask[l]], num_classes, input_shape, calc_loss=True)
pred_box = K.concatenate([pred_xy, pred_wh])
# Darknet raw box to calculate loss.
raw_true_xy = y_true[l][..., :2]*grid_shapes[l][::-1] - grid
raw_true_wh = K.log(y_true[l][..., 2:4] / anchors[anchor_mask[l]] * input_shape[::-1])
raw_true_wh = K.switch(object_mask, raw_true_wh, K.zeros_like(raw_true_wh)) # avoid log(0)=-inf
box_loss_scale = 2 - y_true[l][...,2:3]*y_true[l][...,3:4]
# Find ignore mask, iterate over each of batch.
ignore_mask = tf.TensorArray(K.dtype(y_true[0]), size=1, dynamic_size=True)
object_mask_bool = K.cast(object_mask, 'bool')
def loop_body(b, ignore_mask):
true_box = tf.boolean_mask(y_true[l][b,...,0:4], object_mask_bool[b,...,0])
iou = box_iou(pred_box[b], true_box)
best_iou = K.max(iou, axis=-1)
ignore_mask = ignore_mask.write(b, K.cast(best_iou<ignore_thresh, K.dtype(true_box)))
return b+1, ignore_mask
_, ignore_mask = tf.while_loop(lambda b,*args: b<m, loop_body, [0, ignore_mask])
ignore_mask = ignore_mask.stack()
ignore_mask = K.expand_dims(ignore_mask, -1)
if use_focal_obj_loss:
# Focal loss for objectness confidence
confidence_loss = sigmoid_focal_loss(object_mask, raw_pred[...,4:5])
else:
confidence_loss = object_mask * K.binary_crossentropy(object_mask, raw_pred[...,4:5], from_logits=True)+ \
(1-object_mask) * K.binary_crossentropy(object_mask, raw_pred[...,4:5], from_logits=True) * ignore_mask
if use_focal_loss:
# Focal loss for classification score
if use_softmax_loss:
class_loss = softmax_focal_loss(true_class_probs, raw_pred[...,5:])
else:
class_loss = sigmoid_focal_loss(true_class_probs, raw_pred[...,5:])
else:
if use_softmax_loss:
# use softmax style classification output
class_loss = object_mask * K.expand_dims(K.categorical_crossentropy(true_class_probs, raw_pred[...,5:], from_logits=True), axis=-1)
else:
# use sigmoid style classification output
class_loss = object_mask * K.binary_crossentropy(true_class_probs, raw_pred[...,5:], from_logits=True)
if use_giou_loss:
# Calculate GIoU loss as location loss
raw_true_box = y_true[l][...,0:4]
giou = box_giou(pred_box, raw_true_box)
giou_loss = object_mask * box_loss_scale * (1 - giou)
giou_loss = K.sum(giou_loss) / mf
location_loss = giou_loss
elif use_diou_loss:
# Calculate DIoU loss as location loss
raw_true_box = y_true[l][...,0:4]
diou = box_diou(pred_box, raw_true_box)
diou_loss = object_mask * box_loss_scale * (1 - diou)
diou_loss = K.sum(diou_loss) / mf
location_loss = diou_loss
else:
# Standard YOLO location loss
# K.binary_crossentropy is helpful to avoid exp overflow.
xy_loss = object_mask * box_loss_scale * K.binary_crossentropy(raw_true_xy, raw_pred[...,0:2], from_logits=True)
wh_loss = object_mask * box_loss_scale * 0.5 * K.square(raw_true_wh-raw_pred[...,2:4])
xy_loss = K.sum(xy_loss) / mf
wh_loss = K.sum(wh_loss) / mf
location_loss = xy_loss + wh_loss
confidence_loss = K.sum(confidence_loss) / mf
class_loss = K.sum(class_loss) / mf
loss += location_loss + confidence_loss + class_loss
total_location_loss += location_loss
total_confidence_loss += confidence_loss
total_class_loss += class_loss
# Fit for tf 2.0.0 loss shape
loss = K.expand_dims(loss, axis=-1)
return loss #, total_location_loss, total_confidence_loss, total_class_loss
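# --- Wiring sketch (an assumption about the surrounding training script, in
# the usual keras-yolo style; model_body and y_true_inputs are illustrative
# names, not defined in this file) ---
#   model_loss = Lambda(yolo4_loss, output_shape=(1,), name='yolo_loss',
#                       arguments={'anchors': anchors, 'num_classes': num_classes,
#                                  'ignore_thresh': 0.5})([*model_body.output, *y_true_inputs])
#   model = Model([model_body.input, *y_true_inputs], model_loss)
#   model.compile(optimizer=Adam(lr=1e-3),
#                 loss={'yolo_loss': lambda y_true, y_pred: y_pred})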
def yolo_loss(args, anchors, num_classes, ignore_thresh=.5, print_loss=False):
'''Return yolo_loss tensor
Parameters
----------
yolo_outputs: list of tensor, the output of yolo_body or tiny_yolo_body
y_true: list of array, the output of preprocess_true_boxes
anchors: array, shape=(N, 2), wh
num_classes: integer
    ignore_thresh: float, IoU threshold; predictions whose best IoU with any ground-truth box exceeds it are excluded from the no-object confidence loss
Returns
-------
loss: tensor, shape=(1,)
'''
num_layers = len(anchors)//3 # default setting
yolo_outputs = args[:num_layers]
y_true = args[num_layers:]
    anchor_mask = [[6,7,8], [3,4,5], [0,1,2]] if num_layers==3 else [[3,4,5], [0,1,2]]
input_shape = K.cast(K.shape(yolo_outputs[0])[1:3] * 32, K.dtype(y_true[0]))
grid_shapes = [K.cast(K.shape(yolo_outputs[l])[1:3], K.dtype(y_true[0])) for l in range(num_layers)]
loss = 0
m = K.shape(yolo_outputs[0])[0] # batch size, tensor
mf = K.cast(m, K.dtype(yolo_outputs[0]))
for l in range(num_layers):
object_mask = y_true[l][..., 4:5]
true_class_probs = y_true[l][..., 5:]
grid, raw_pred, pred_xy, pred_wh = yolo_head(yolo_outputs[l],
anchors[anchor_mask[l]], num_classes, input_shape, calc_loss=True)
pred_box = K.concatenate([pred_xy, pred_wh])
# Darknet raw box to calculate loss.
raw_true_xy = y_true[l][..., :2]*grid_shapes[l][::-1] - grid
raw_true_wh = K.log(y_true[l][..., 2:4] / anchors[anchor_mask[l]] * input_shape[::-1])
raw_true_wh = K.switch(object_mask, raw_true_wh, K.zeros_like(raw_true_wh)) # avoid log(0)=-inf
box_loss_scale = 2 - y_true[l][...,2:3]*y_true[l][...,3:4]
# Find ignore mask, iterate over each of batch.
ignore_mask = tf.TensorArray(K.dtype(y_true[0]), size=1, dynamic_size=True)
object_mask_bool = K.cast(object_mask, 'bool')
def loop_body(b, ignore_mask):
true_box = tf.boolean_mask(y_true[l][b,...,0:4], object_mask_bool[b,...,0])
iou = box_iou(pred_box[b], true_box)
best_iou = K.max(iou, axis=-1)
ignore_mask = ignore_mask.write(b, K.cast(best_iou<ignore_thresh, K.dtype(true_box)))
return b+1, ignore_mask
        _, ignore_mask = tf.while_loop(lambda b,*args: b<m, loop_body, [0, ignore_mask])
ignore_mask = ignore_mask.stack()
ignore_mask = K.expand_dims(ignore_mask, -1)
# K.binary_crossentropy is helpful to avoid exp overflow.
xy_loss = object_mask * box_loss_scale * K.binary_crossentropy(raw_true_xy, raw_pred[...,0:2], from_logits=True)
wh_loss = object_mask * box_loss_scale * 0.5 * K.square(raw_true_wh-raw_pred[...,2:4])
confidence_loss = object_mask * K.binary_crossentropy(object_mask, raw_pred[...,4:5], from_logits=True)+ \
(1-object_mask) * K.binary_crossentropy(object_mask, raw_pred[...,4:5], from_logits=True) * ignore_mask
class_loss = object_mask * K.binary_crossentropy(true_class_probs, raw_pred[...,5:], from_logits=True)
xy_loss = K.sum(xy_loss) / mf
wh_loss = K.sum(wh_loss) / mf
confidence_loss = K.sum(confidence_loss) / mf
class_loss = K.sum(class_loss) / mf
loss += xy_loss + wh_loss + confidence_loss + class_loss
if print_loss:
loss = tf.Print(loss, [loss, xy_loss, wh_loss, confidence_loss, class_loss, K.sum(ignore_mask)], message='loss: ')
return loss
"""Miscellaneous utility functions."""
from functools import reduce
from PIL import Image
import numpy as np
from matplotlib.colors import rgb_to_hsv, hsv_to_rgb
def compose(*funcs):
"""Compose arbitrarily many functions, evaluated left to right.
Reference: https://mathieularose.com/function-composition-in-python/
"""
# return lambda x: reduce(lambda v, f: f(v), funcs, x)
if funcs:
return reduce(lambda f, g: lambda *a, **kw: g(f(*a, **kw)), funcs)
else:
raise ValueError('Composition of empty sequence not supported.')
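# e.g. compose(f, g)(x) == g(f(x)): with f = lambda x: x + 1 and
# g = lambda x: 2 * x, compose(f, g)(3) returns 8.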
def letterbox_image(image, size):
'''resize image with unchanged aspect ratio using padding'''
iw, ih = image.size
w, h = size
scale = min(w/iw, h/ih)
nw = int(iw*scale)
nh = int(ih*scale)
image = image.resize((nw,nh), Image.BICUBIC)
new_image = Image.new('RGB', size, (128,128,128))
new_image.paste(image, ((w-nw)//2, (h-nh)//2))
return new_image
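# e.g. letterboxing a 1280x720 frame to (416, 416): scale = min(416/1280,
# 416/720) = 0.325, so the image resizes to 416x234 and is pasted at (0, 91)
# on a gray 416x416 canvas, preserving aspect ratio.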
def rand(a=0, b=1):
return np.random.rand()*(b-a) + a
def get_random_data(annotation_line, input_shape, random=True, max_boxes=100, jitter=.3, hue=.1, sat=1.5, val=1.5, proc_img=True):
'''random preprocessing for real-time data augmentation'''
line = annotation_line.split()
image = Image.open(line[0])
iw, ih = image.size
h, w = input_shape
box = np.array([np.array(list(map(int,box.split(',')))) for box in line[1:]])
if not random:
# resize image
scale = min(w/iw, h/ih)
nw = int(iw*scale)
nh = int(ih*scale)
dx = (w-nw)//2
dy = (h-nh)//2
image_data=0
if proc_img:
image = image.resize((nw,nh), Image.BICUBIC)
new_image = Image.new('RGB', (w,h), (128,128,128))
new_image.paste(image, (dx, dy))
image_data = np.array(new_image)/255.
# correct boxes
box_data = np.zeros((max_boxes,5))
if len(box)>0:
np.random.shuffle(box)
if len(box)>max_boxes: box = box[:max_boxes]
box[:, [0,2]] = box[:, [0,2]]*scale + dx
box[:, [1,3]] = box[:, [1,3]]*scale + dy
box_data[:len(box)] = box
return image_data, box_data
# resize image
new_ar = w/h * rand(1-jitter,1+jitter)/rand(1-jitter,1+jitter)
scale = rand(.25, 2)
if new_ar < 1:
nh = int(scale*h)
nw = int(nh*new_ar)
else:
nw = int(scale*w)
nh = int(nw/new_ar)
image = image.resize((nw,nh), Image.BICUBIC)
# place image
dx = int(rand(0, w-nw))
dy = int(rand(0, h-nh))
new_image = Image.new('RGB', (w,h), (128,128,128))
new_image.paste(image, (dx, dy))
image = new_image
# flip image or not
flip = rand()<.5
if flip: image = image.transpose(Image.FLIP_LEFT_RIGHT)
# distort image
hue = rand(-hue, hue)
sat = rand(1, sat) if rand()<.5 else 1/rand(1, sat)
val = rand(1, val) if rand()<.5 else 1/rand(1, val)
x = rgb_to_hsv(np.array(image)/255.)
x[..., 0] += hue
x[..., 0][x[..., 0]>1] -= 1
x[..., 0][x[..., 0]<0] += 1
x[..., 1] *= sat
x[..., 2] *= val
x[x>1] = 1
x[x<0] = 0
image_data = hsv_to_rgb(x) # numpy array, 0 to 1
# correct boxes
box_data = np.zeros((max_boxes,5))
if len(box)>0:
np.random.shuffle(box)
box[:, [0,2]] = box[:, [0,2]]*nw/iw + dx
box[:, [1,3]] = box[:, [1,3]]*nh/ih + dy
if flip: box[:, [0,2]] = w - box[:, [2,0]]
box[:, 0:2][box[:, 0:2]<0] = 0
box[:, 2][box[:, 2]>w] = w
box[:, 3][box[:, 3]>h] = h
box_w = box[:, 2] - box[:, 0]
box_h = box[:, 3] - box[:, 1]
box = box[np.logical_and(box_w>1, box_h>1)] # discard invalid box
if len(box)>max_boxes: box = box[:max_boxes]
box_data[:len(box)] = box
return image_data, box_data
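# --- Usage sketch (illustrative; the annotation format is the keras-yolo one:
# "path/img.jpg x1,y1,x2,y2,cls x1,y1,x2,y2,cls ...") ---
#   line = 'images/001.jpg 100,120,200,260,0 40,50,90,180,2'
#   image_data, box_data = get_random_data(line, (416, 416))
#   # image_data: float array in [0, 1], shape (416, 416, 3) after jitter,
#   # flip, and HSV distortion; box_data: (100, 5) with boxes remapped into
#   # the augmented frame and zero-padded rows beyond the real boxes.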
......
absl-py==0.9.0
asgiref==3.2.7
astor==0.8.1
astroid==2.3.3
cachetools==4.1.0
certifi==2020.4.5.1
chardet==3.0.4
colorama==0.4.3
cycler==0.10.0
Django==3.0.5
et-xmlfile==1.0.1
gast==0.2.2
google-auth==1.14.1
google-auth-oauthlib==0.4.1
google-pasta==0.2.0
grpcio==1.28.1
h5py==2.10.0
idna==2.9
image==1.5.31
isort==4.3.21
jdcal==1.4.1
joblib==0.14.1
Keras==2.3.1
Keras-Applications==1.0.8
Keras-Preprocessing==1.1.0
kiwisolver==1.2.0
lazy-object-proxy==1.4.3
Markdown==3.2.1
matplotlib==3.2.1
mccabe==0.6.1
numpy==1.18.3
oauthlib==3.1.0
opencv-python==4.2.0.34
openpyxl==3.0.3
opt-einsum==3.2.1
pandas==1.0.3
Pillow==7.1.1
protobuf==3.11.3
pyasn1==0.4.8
pyasn1-modules==0.2.8
pylint==2.4.4
pyparsing==2.4.7
python-dateutil==2.8.1
pytz==2019.3
PyYAML==5.3.1
requests==2.23.0
requests-oauthlib==1.3.0
rsa==4.0
scikit-learn==0.22.2.post1
scipy==1.4.1
seaborn==0.10.1
six==1.14.0
sklearn==0.0
sqlparse==0.3.1
tensorboard==1.15.0
tensorflow==1.15.2
tensorflow-estimator==1.15.1
tensorflow-gpu==1.15.2
tensorflow-gpu-estimator==2.1.0
termcolor==1.1.0
typed-ast==1.4.1
urllib3==1.25.9
Werkzeug==1.0.1
wrapt==1.11.2
xlrd==1.2.0
......@@ -6,12 +6,14 @@
"start": "node ./bin/www"
},
"dependencies": {
"child_process": "^1.0.2",
"cookie-parser": "~1.4.4",
"debug": "~2.6.9",
"ejs": "~2.6.1",
"express": "~4.16.1",
"http-errors": "~1.6.3",
"morgan": "~1.9.1",
"multer": "^1.4.2"
"multer": "^1.4.2",
"python-shell": "^2.0.1"
}
}
......
......@@ -2,10 +2,13 @@ var express = require('express');
var router = express.Router();
var multer = require("multer");
var path = require("path");
var PythonShell = require('python-shell');
var spawn = require("child_process").spawn;
var storage = multer.diskStorage({
destination: function(req, file, callback) {
callback(null, "upload/")
callback(null, "../deep_sort_yolov4/")
},
filename: function(req, file, callback) {
var extension = path.extname(file.originalname);
......@@ -21,20 +24,33 @@ var upload = multer({
// View page route
router.get('/', function(req, res, next) {
res.render("index")
res.render('index', { title: 'Express' });
});
// 2. Handle the file upload
router.post('/create', upload.single("File"), async(req, res) => {
router.post('/create', upload.single("File"), function(req, res) {
    // 3. File object
    var file = req.file;
var options = {
mode: 'text',
pythonPath: __dirname + '/../../venv/Scripts/python.exe',
pythonOptions: ['-u'],
scriptPath: __dirname + '/../../deep_sort_yolov4/',
args: [file.originalname]
};
var shell1 = new PythonShell.PythonShell('main.py', options);
shell1.end(function(err) {
if (err) throw err;
else {
res.render('result', { file_name: file.originalname.split('.')[0] });
}
});
// PythonShell.PythonShell.run('main.py', options, (err, results) => {
// if (err) throw err;
// PythonShell.PythonShell.run('kmean.py', options2, (err, results) => {
// if (err) throw err;
// res.render('result', { file_name: file.originalname.split('.')[0] });
// });
// });
});
module.exports = router;
\ No newline at end of file
......
......@@ -32,7 +32,8 @@
<div class="sidebar-brand-icon rotate-n-15">
<i class="fas fa-thumbs-up"></i>
</div>
<div class="sidebar-brand-text mx-3">유동 인구 분석</div>
<div class="sidebar-brand-text mx-3">유동 인구 분석
</div>
</a>
<!-- Divider -->
......
......@@ -86,10 +86,10 @@
<!-- /.container-fluid -->
<div class="container row">
<div class="container-video col-md-6">
<video src="data/output2.avi" width="400" controls autoplay></video>
<video src="data/<%=file_name%>.mp4" width="400" controls autoplay></video>
</div>
<div class="container-kmeans col-md-6">
<img src="data/test3_kmeans.png" style="width: 100%;" alt="Kmeans Image">
<img src="data/<%=file_name%>_kmeans.png" style="width: 100%;" alt="Kmeans Image">
</div>
</div>
<div class="container-kibana"></div>
......@@ -127,9 +127,6 @@
<!-- Custom scripts for all pages-->
<script src="javascripts/sb-admin-2.min.js"></script>
<!-- Page level plugins -->
<script src="vendor/chart.js/Chart.min.js"></script>
<!-- Page level custom scripts -->
<!-- <script src="javascripts/demo/chart-pie-demo.js"></script> -->
</body>
......