Annolid on Detectron2 Tutorial 2: Train a Model#

This notebook is modified from the official Detectron2 Colab tutorial. Here, we will

  • Train (fine-tune) a detectron2 model on our dataset.

You can make a copy of this tutorial via “File -> Open in playground mode” and experiment with it yourself. DO NOT request access to this tutorial.

# Are we running in Colab or in a local Jupyter notebook?
try:
  import google.colab
  IN_COLAB = True
except ImportError:
  IN_COLAB = False
# install dependencies: 
!pip install pyyaml==5.3
import torch, torchvision
TORCH_VERSION = ".".join(torch.__version__.split(".")[:2])
CUDA_VERSION = torch.__version__.split("+")[-1]
print("torch: ", TORCH_VERSION, "; cuda: ", CUDA_VERSION)
# Install detectron2 that matches the above pytorch version
# See https://detectron2.readthedocs.io/tutorials/install.html for instructions
!pip install detectron2 -f https://dl.fbaipublicfiles.com/detectron2/wheels/$CUDA_VERSION/torch$TORCH_VERSION/index.html
# If there is not yet a detectron2 release that matches the given torch + CUDA version, you need to install a different pytorch.

# exit(0)  # After installation, you may need to "restart runtime" in Colab. This line can also restart runtime
Requirement already satisfied: pyyaml==5.3 in /home/jeremy/anaconda3/lib/python3.9/site-packages (5.3)
torch:  1.10 ; cuda:  cu102
Looking in links: https://dl.fbaipublicfiles.com/detectron2/wheels/cu102/torch1.10/index.html
Requirement already satisfied: detectron2 in /home/jeremy/anaconda3/lib/python3.9/site-packages (0.6+cu102)
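As an aside, the wheel index URL above is assembled from `TORCH_VERSION` and `CUDA_VERSION`. A small, hypothetical helper showing how those two pieces are derived from a torch version string (pure string handling, so it runs even where torch is not installed):

```python
def parse_torch_build(version: str) -> tuple:
    """Split a version string like '1.10.0+cu102' into (torch major.minor, CUDA tag).

    CPU-only builds have no '+cuXXX' suffix; report them as 'cpu'.
    """
    torch_version = ".".join(version.split("+")[0].split(".")[:2])
    cuda_version = version.split("+")[-1] if "+" in version else "cpu"
    return torch_version, cuda_version

print(parse_torch_build("1.10.0+cu102"))  # -> ('1.10', 'cu102')
```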
# import some common libraries
import json
import os
import cv2
import random
import glob
import numpy as np
if IN_COLAB:
  from google.colab.patches import cv2_imshow
import matplotlib.pyplot as plt
%matplotlib inline
# Setup detectron2 logger
import detectron2
from detectron2.utils.logger import setup_logger
setup_logger()

# import some common detectron2 utilities
from detectron2 import model_zoo
from detectron2.engine import DefaultPredictor
from detectron2.config import get_cfg
from detectron2.utils.visualizer import Visualizer
from detectron2.data import MetadataCatalog, DatasetCatalog
# Check whether a GPU is available
if torch.cuda.is_available():
    GPU = True
    print('gpu available')
else:
    GPU = False
    print('no gpu')
gpu available
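The GPU check matters later when configuring training: Detectron2's config defaults to CUDA, so on a CPU-only machine you would set `cfg.MODEL.DEVICE` to `'cpu'`. A minimal sketch (the detectron2 import is guarded so it also runs where detectron2 is not installed):

```python
def pick_device(gpu_available: bool) -> str:
    """Device string suitable for cfg.MODEL.DEVICE."""
    return "cuda" if gpu_available else "cpu"

try:
    from detectron2.config import get_cfg
    cfg = get_cfg()
    cfg.MODEL.DEVICE = pick_device(GPU)  # GPU was set in the cell above
except (ImportError, NameError):
    pass  # detectron2 (or the GPU flag) unavailable; the helper alone still works
```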

Train on a custom dataset#

In this section, we show how to train an existing detectron2 model on a custom dataset in COCO format.

Prepare the dataset#

Upload a labeled dataset.#

The following code expects the COCO-format dataset to be packaged as a .zip file, for example sample_dataset.zip.
Note: if you encounter file-not-found issues, make sure there is no whitespace in your file path.

!pip install gdown 
!gdown --id 1fUXCLnoJ5SwXg54mj0NBKGzidsV8ALVR
Requirement already satisfied: gdown in /home/jeremy/anaconda3/lib/python3.9/site-packages (4.2.0)
Downloading...
From: https://drive.google.com/uc?id=1fUXCLnoJ5SwXg54mj0NBKGzidsV8ALVR
To: /home/jeremy/Documents/annolid/book/tutorials/novelctrlk6_8_coco_dataset.zip
100%|██████████████████████████████████████| 10.3M/10.3M [00:00<00:00, 95.1MB/s]
if IN_COLAB:
    dataset = '/content/novelctrlk6_8_coco_dataset.zip'
else:
    dataset = 'novelctrlk6_8_coco_dataset.zip'
if IN_COLAB:
    !unzip $dataset -d /content/
else:
    #TODO generalize this
    !unzip -o $dataset -d .
Archive:  novelctrlk6_8_coco_dataset.zip
  inflating: ./novelctrlk6_8_coco_dataset/valid/JPEGImages/00001416_41.jpg  
  inflating: ./novelctrlk6_8_coco_dataset/valid/JPEGImages/00004233_81.jpg  
  ...
  inflating: ./novelctrlk6_8_coco_dataset/valid/annotations.json  
  inflating: ./novelctrlk6_8_coco_dataset/data.yaml  
  inflating: ./novelctrlk6_8_coco_dataset/train/JPEGImages/00001466_29.jpg  
  ...
  inflating: ./novelctrlk6_8_coco_dataset/train/annotations.json  
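The shell `unzip` call works in Colab and on Linux/macOS, but a pure-Python alternative using the standard `zipfile` module is more portable (e.g. on Windows). A sketch, assuming — as is the case here — that the archive's top-level folder matches its file name:

```python
import os
import zipfile

def extract_dataset(zip_path: str, dest_dir: str = ".") -> str:
    """Extract a COCO dataset zip and return the extracted top-level directory."""
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(dest_dir)
    # By convention the archive name (minus .zip) matches the top-level folder.
    return os.path.join(dest_dir, os.path.basename(zip_path).replace(".zip", ""))
```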
DATASET_NAME = DATASET_DIR = dataset.replace('.zip', '')
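Before registering the dataset, it can help to sanity-check the annotations file. A minimal sketch; the toy dict below stands in for the real `annotations.json` (which here contains 118 training images and 7 categories):

```python
import json

def summarize_coco(coco: dict) -> dict:
    """Return basic counts from a COCO-format annotation dict."""
    return {
        "images": len(coco.get("images", [])),
        "annotations": len(coco.get("annotations", [])),
        "categories": [c["name"] for c in coco.get("categories", [])],
    }

# Toy stand-in for: json.load(open(f"{DATASET_DIR}/train/annotations.json"))
toy = {
    "images": [{"id": 1, "file_name": "00001466_29.jpg"}],
    "annotations": [{"id": 1, "image_id": 1, "category_id": 1}],
    "categories": [{"id": 0, "name": "_background_"}, {"id": 1, "name": "nose"}],
}
print(summarize_coco(toy))
```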

Register the custom dataset with Detectron2, following the Detectron2 custom dataset tutorial. Because our dataset is in COCO format, we can register it with Detectron2's built-in COCO support. If your dataset is in a custom format, you need to write your own loader function and register it yourself; see the tutorial for details.

from detectron2.data.datasets import register_coco_instances
from detectron2.data import get_detection_dataset_dicts
from detectron2.data.datasets import builtin_meta
register_coco_instances(f"{DATASET_NAME}_train", {}, f"{DATASET_DIR}/train/annotations.json", f"{DATASET_DIR}/train/")
register_coco_instances(f"{DATASET_NAME}_valid", {}, f"{DATASET_DIR}/valid/annotations.json", f"{DATASET_DIR}/valid/")
dataset_dicts = get_detection_dataset_dicts([f"{DATASET_NAME}_train"])
[01/24 12:59:37 d2.data.datasets.coco]: Loaded 118 images in COCO format from novelctrlk6_8_coco_dataset/train/annotations.json
[01/24 12:59:37 d2.data.build]: Removed 0 images with no usable annotations. 118 images left.
[01/24 12:59:37 d2.data.build]: Distribution of instances among all 7 categories:
|   category   | #instances   |  category  | #instances   |  category  | #instances   |
|:------------:|:-------------|:----------:|:-------------|:----------:|:-------------|
| _background_ | 0            |    nose    | 118          |  left_ear  | 118          |
|  right_ear   | 117          | tail_base  | 119          |   mouse    | 118          |
|   centroid   | 1            |            |              |            |              |
|    total     | 591          |            |              |            |              |
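For a dataset that is not in COCO format, the loader-plus-registration pattern mentioned above might look like the following sketch. The `my_custom_dataset` name, image path, and annotation values are hypothetical, and the detectron2 import is guarded so the loader alone runs anywhere:

```python
def get_custom_dicts(split: str):
    """Hypothetical loader returning Detectron2's standard dataset dicts.

    A real loader would parse your own annotation files here.
    """
    record = {
        "file_name": f"{split}/JPEGImages/00000000_0.jpg",  # hypothetical image path
        "image_id": 0,
        "height": 480,
        "width": 640,
        "annotations": [
            {
                "bbox": [10.0, 20.0, 110.0, 220.0],
                "bbox_mode": 0,  # BoxMode.XYXY_ABS
                "segmentation": [[10, 20, 110, 20, 110, 220, 10, 220]],
                "category_id": 5,  # e.g. "mouse"
            }
        ],
    }
    return [record]

try:
    from detectron2.data import DatasetCatalog, MetadataCatalog

    for split in ("train", "valid"):
        DatasetCatalog.register(f"my_custom_dataset_{split}",
                                lambda s=split: get_custom_dicts(s))
    MetadataCatalog.get("my_custom_dataset_train").thing_classes = [
        "_background_", "nose", "left_ear", "right_ear",
        "tail_base", "mouse", "centroid",
    ]
except ImportError:
    pass  # detectron2 not installed; the loader alone still works
```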
_dataset_metadata = MetadataCatalog.get(f"{DATASET_NAME}_train")
_dataset_metadata.thing_colors = [cc['color'] for cc in builtin_meta.COCO_CATEGORIES]
_dataset_metadata
namespace(name='novelctrlk6_8_coco_dataset_train',
          json_file='novelctrlk6_8_coco_dataset/train/annotations.json',
          image_root='novelctrlk6_8_coco_dataset/train/',
          evaluator_type='coco',
          thing_classes=['_background_',
                         'nose',
                         'left_ear',
                         'right_ear',
                         'tail_base',
                         'mouse',
                         'centroid'],
          thing_dataset_id_to_contiguous_id={0: 0,
                                             1: 1,
                                             2: 2,
                                             3: 3,
                                             4: 4,
                                             5: 5,
                                             6: 6},
          thing_colors=[[220, 20, 60],
                        [119, 11, 32],
                        [0, 0, 142],
                        [0, 0, 230],
                        [106, 0, 228],
                        [0, 60, 100],
                        [0, 80, 100],
                        [0, 0, 70],
                        [0, 0, 192],
                        [250, 170, 30],
                        [100, 170, 30],
                        [220, 220, 0],
                        [175, 116, 175],
                        [250, 0, 30],
                        [165, 42, 42],
                        [255, 77, 255],
                        [0, 226, 252],
                        [182, 182, 255],
                        [0, 82, 0],
                        [120, 166, 157],
                        [110, 76, 0],
                        [174, 57, 255],
                        [199, 100, 0],
                        [72, 0, 118],
                        [255, 179, 240],
                        [0, 125, 92],
                        [209, 0, 151],
                        [188, 208, 182],
                        [0, 220, 176],
                        [255, 99, 164],
                        [92, 0, 73],
                        [133, 129, 255],
                        [78, 180, 255],
                        [0, 228, 0],
                        [174, 255, 243],
                        [45, 89, 255],
                        [134, 134, 103],
                        [145, 148, 174],
                        [255, 208, 186],
                        [197, 226, 255],
                        [171, 134, 1],
                        [109, 63, 54],
                        [207, 138, 255],
                        [151, 0, 95],
                        [9, 80, 61],
                        [84, 105, 51],
                        [74, 65, 105],
                        [166, 196, 102],
                        [208, 195, 210],
                        [255, 109, 65],
                        [0, 143, 149],
                        [179, 0, 194],
                        [209, 99, 106],
                        [5, 121, 0],
                        [227, 255, 205],
                        [147, 186, 208],
                        [153, 69, 1],
                        [3, 95, 161],
                        [163, 255, 0],
                        [119, 0, 170],
                        [0, 182, 199],
                        [0, 165, 120],
                        [183, 130, 88],
                        [95, 32, 0],
                        [130, 114, 135],
                        [110, 129, 133],
                        [166, 74, 118],
                        [219, 142, 185],
                        [79, 210, 114],
                        [178, 90, 62],
                        [65, 70, 15],
                        [127, 167, 115],
                        [59, 105, 106],
                        [142, 108, 45],
                        [196, 172, 0],
                        [95, 54, 80],
                        [128, 76, 255],
                        [201, 57, 1],
                        [246, 0, 122],
                        [191, 162, 208],
                        [255, 255, 128],
                        [147, 211, 203],
                        [150, 100, 100],
                        [168, 171, 172],
                        [146, 112, 198],
                        [210, 170, 100],
                        [92, 136, 89],
                        [218, 88, 184],
                        [241, 129, 0],
                        [217, 17, 255],
                        [124, 74, 181],
                        [70, 70, 70],
                        [255, 228, 255],
                        [154, 208, 0],
                        [193, 0, 92],
                        [76, 91, 113],
                        [255, 180, 195],
                        [106, 154, 176],
                        [230, 150, 140],
                        [60, 143, 255],
                        [128, 64, 128],
                        [92, 82, 55],
                        [254, 212, 124],
                        [73, 77, 174],
                        [255, 160, 98],
                        [255, 255, 255],
                        [104, 84, 109],
                        [169, 164, 131],
                        [225, 199, 255],
                        [137, 54, 74],
                        [135, 158, 223],
                        [7, 246, 231],
                        [107, 255, 200],
                        [58, 41, 149],
                        [183, 121, 142],
                        [255, 73, 97],
                        [107, 142, 35],
                        [190, 153, 153],
                        [146, 139, 141],
                        [70, 130, 180],
                        [134, 199, 156],
                        [209, 226, 140],
                        [96, 36, 108],
                        [96, 96, 96],
                        [64, 170, 64],
                        [152, 251, 152],
                        [208, 229, 228],
                        [206, 186, 171],
                        [152, 161, 64],
                        [116, 112, 0],
                        [0, 114, 143],
                        [102, 102, 156],
                        [250, 141, 255]])
NUM_CLASSES = len(_dataset_metadata.thing_classes)
print(f"Number of classes in the dataset: {NUM_CLASSES}")
Number of classes in the dataset: 7

To verify the data loading is correct, let’s visualize the annotations of a randomly selected sample in the training set:

for d in random.sample(dataset_dicts, 2):
    # Normalize Windows-style path separators
    d['file_name'] = d['file_name'].replace('\\', '/')
    img = cv2.imread(d["file_name"])
    # Visualizer expects RGB, so reverse OpenCV's BGR channel order
    visualizer = Visualizer(img[:, :, ::-1], metadata=_dataset_metadata, scale=0.5)
    out = visualizer.draw_dataset_dict(d)
    if IN_COLAB:
        # cv2_imshow expects BGR, so convert back
        cv2_imshow(out.get_image()[:, :, ::-1])
    else:
        # matplotlib expects RGB, which is what get_image() returns
        plt.imshow(out.get_image())
        plt.show()
        
[Output: a randomly sampled training frame with its ground-truth masks and labels drawn by Visualizer]
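The `[:, :, ::-1]` slices in the cell above flip between OpenCV's BGR channel order and the RGB order that Visualizer expects. A minimal NumPy illustration of what reversing the last axis does to a pixel:

```python
import numpy as np

# A 1x1 "image" whose single pixel is pure red in RGB order.
rgb = np.array([[[255, 0, 0]]], dtype=np.uint8)

# Reversing the last axis swaps the channel order (RGB <-> BGR).
bgr = rgb[:, :, ::-1]

print(bgr[0, 0].tolist())              # [0, 0, 255]
print(bgr[:, :, ::-1][0, 0].tolist())  # reversing again restores [255, 0, 0]
```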

Train!#

Now, let’s fine-tune the COCO-pretrained R50-FPN Mask R-CNN model on our custom dataset. Training 3000 iterations takes ~2 hours on Colab’s K80 GPU, or ~1.5 hours on a P100 GPU.

if GPU:
    !nvidia-smi
Mon Jan 24 12:59:37 2022       
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 470.86       Driver Version: 470.86       CUDA Version: 11.4     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  NVIDIA GeForce ...  Off  | 00000000:C1:00.0 Off |                  N/A |
| 24%   34C    P8    20W / 250W |   2150MiB /  7979MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
                                                                               
+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|    0   N/A  N/A      1405      G   /usr/lib/xorg/Xorg                132MiB |
|    0   N/A  N/A     64929      G   /usr/lib/xorg/Xorg                265MiB |
|    0   N/A  N/A     65055      G   /usr/bin/gnome-shell               10MiB |
|    0   N/A  N/A     65407      G   ...AAAAAAAAA= --shared-files       10MiB |
|    0   N/A  N/A    204260      C   ...vs/annolid-env/bin/python     1715MiB |
+-----------------------------------------------------------------------------+
from detectron2.engine import DefaultTrainer
cfg = get_cfg()
if not GPU:
    cfg.MODEL.DEVICE = 'cpu'
cfg.merge_from_file(model_zoo.get_config_file("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml"))
cfg.DATASETS.TRAIN = (f"{DATASET_NAME}_train",)
cfg.DATASETS.TEST = ()
cfg.DATALOADER.NUM_WORKERS = 2 #@param
# Oversample images containing rare categories
cfg.DATALOADER.SAMPLER_TRAIN = "RepeatFactorTrainingSampler"
cfg.DATALOADER.REPEAT_THRESHOLD = 0.3
cfg.MODEL.WEIGHTS = model_zoo.get_checkpoint_url("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml")  # Let training initialize from the model zoo
cfg.SOLVER.IMS_PER_BATCH = 4 #@param
cfg.SOLVER.BASE_LR = 0.0025 #@param  # pick a good learning rate
cfg.SOLVER.MAX_ITER = 3000 #@param  # enough for this ~100-frame dataset; train longer for a practical dataset
cfg.SOLVER.CHECKPOINT_PERIOD = 1000 #@param
cfg.MODEL.ROI_HEADS.BATCH_SIZE_PER_IMAGE = 32 #@param  # faster, and good enough for this toy dataset (default: 512)
cfg.MODEL.ROI_HEADS.NUM_CLASSES = NUM_CLASSES  # see https://detectron2.readthedocs.io/tutorials/datasets.html#update-the-config-for-new-datasets
os.makedirs(cfg.OUTPUT_DIR, exist_ok=True)
trainer = DefaultTrainer(cfg)
trainer.resume_or_load(resume=False)
[01/24 12:59:40 d2.engine.defaults]: Model:
GeneralizedRCNN(
  (backbone): FPN(
    (fpn_lateral2): Conv2d(256, 256, kernel_size=(1, 1), stride=(1, 1))
    (fpn_output2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    (fpn_lateral3): Conv2d(512, 256, kernel_size=(1, 1), stride=(1, 1))
    (fpn_output3): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    (fpn_lateral4): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1))
    (fpn_output4): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    (fpn_lateral5): Conv2d(2048, 256, kernel_size=(1, 1), stride=(1, 1))
    (fpn_output5): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    (top_block): LastLevelMaxPool()
    (bottom_up): ResNet(
      (stem): BasicStem(
        (conv1): Conv2d(
          3, 64, kernel_size=(7, 7), stride=(2, 2), padding=(3, 3), bias=False
          (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
        )
      )
      (res2): Sequential(
        (0): BottleneckBlock(
          (shortcut): Conv2d(
            64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv1): Conv2d(
            64, 64, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
          )
          (conv2): Conv2d(
            64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
          )
          (conv3): Conv2d(
            64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
        )
        (1): BottleneckBlock(
          (conv1): Conv2d(
            256, 64, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
          )
          (conv2): Conv2d(
            64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
          )
          (conv3): Conv2d(
            64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
        )
        (2): BottleneckBlock(
          (conv1): Conv2d(
            256, 64, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
          )
          (conv2): Conv2d(
            64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
          )
          (conv3): Conv2d(
            64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
        )
      )
      (res3): Sequential(
        (0): BottleneckBlock(
          (shortcut): Conv2d(
            256, 512, kernel_size=(1, 1), stride=(2, 2), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
          (conv1): Conv2d(
            256, 128, kernel_size=(1, 1), stride=(2, 2), bias=False
            (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
          )
          (conv2): Conv2d(
            128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
          )
          (conv3): Conv2d(
            128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
        )
        (1): BottleneckBlock(
          (conv1): Conv2d(
            512, 128, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
          )
          (conv2): Conv2d(
            128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
          )
          (conv3): Conv2d(
            128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
        )
        (2): BottleneckBlock(
          (conv1): Conv2d(
            512, 128, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
          )
          (conv2): Conv2d(
            128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
          )
          (conv3): Conv2d(
            128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
        )
        (3): BottleneckBlock(
          (conv1): Conv2d(
            512, 128, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
          )
          (conv2): Conv2d(
            128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
          )
          (conv3): Conv2d(
            128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
        )
      )
      (res4): Sequential(
        (0): BottleneckBlock(
          (shortcut): Conv2d(
            512, 1024, kernel_size=(1, 1), stride=(2, 2), bias=False
            (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
          )
          (conv1): Conv2d(
            512, 256, kernel_size=(1, 1), stride=(2, 2), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv2): Conv2d(
            256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv3): Conv2d(
            256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
          )
        )
        (1): BottleneckBlock(
          (conv1): Conv2d(
            1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv2): Conv2d(
            256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv3): Conv2d(
            256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
          )
        )
        (2): BottleneckBlock(
          (conv1): Conv2d(
            1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv2): Conv2d(
            256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv3): Conv2d(
            256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
          )
        )
        (3): BottleneckBlock(
          (conv1): Conv2d(
            1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv2): Conv2d(
            256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv3): Conv2d(
            256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
          )
        )
        (4): BottleneckBlock(
          (conv1): Conv2d(
            1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv2): Conv2d(
            256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv3): Conv2d(
            256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
          )
        )
        (5): BottleneckBlock(
          (conv1): Conv2d(
            1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv2): Conv2d(
            256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv3): Conv2d(
            256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
          )
        )
      )
      (res5): Sequential(
        (0): BottleneckBlock(
          (shortcut): Conv2d(
            1024, 2048, kernel_size=(1, 1), stride=(2, 2), bias=False
            (norm): FrozenBatchNorm2d(num_features=2048, eps=1e-05)
          )
          (conv1): Conv2d(
            1024, 512, kernel_size=(1, 1), stride=(2, 2), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
          (conv2): Conv2d(
            512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
          (conv3): Conv2d(
            512, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=2048, eps=1e-05)
          )
        )
        (1): BottleneckBlock(
          (conv1): Conv2d(
            2048, 512, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
          (conv2): Conv2d(
            512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
          (conv3): Conv2d(
            512, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=2048, eps=1e-05)
          )
        )
        (2): BottleneckBlock(
          (conv1): Conv2d(
            2048, 512, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
          (conv2): Conv2d(
            512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
          (conv3): Conv2d(
            512, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=2048, eps=1e-05)
          )
        )
      )
    )
  )
  (proposal_generator): RPN(
    (rpn_head): StandardRPNHead(
      (conv): Conv2d(
        256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1)
        (activation): ReLU()
      )
      (objectness_logits): Conv2d(256, 3, kernel_size=(1, 1), stride=(1, 1))
      (anchor_deltas): Conv2d(256, 12, kernel_size=(1, 1), stride=(1, 1))
    )
    (anchor_generator): DefaultAnchorGenerator(
      (cell_anchors): BufferList()
    )
  )
  (roi_heads): StandardROIHeads(
    (box_pooler): ROIPooler(
      (level_poolers): ModuleList(
        (0): ROIAlign(output_size=(7, 7), spatial_scale=0.25, sampling_ratio=0, aligned=True)
        (1): ROIAlign(output_size=(7, 7), spatial_scale=0.125, sampling_ratio=0, aligned=True)
        (2): ROIAlign(output_size=(7, 7), spatial_scale=0.0625, sampling_ratio=0, aligned=True)
        (3): ROIAlign(output_size=(7, 7), spatial_scale=0.03125, sampling_ratio=0, aligned=True)
      )
    )
    (box_head): FastRCNNConvFCHead(
      (flatten): Flatten(start_dim=1, end_dim=-1)
      (fc1): Linear(in_features=12544, out_features=1024, bias=True)
      (fc_relu1): ReLU()
      (fc2): Linear(in_features=1024, out_features=1024, bias=True)
      (fc_relu2): ReLU()
    )
    (box_predictor): FastRCNNOutputLayers(
      (cls_score): Linear(in_features=1024, out_features=8, bias=True)
      (bbox_pred): Linear(in_features=1024, out_features=28, bias=True)
    )
    (mask_pooler): ROIPooler(
      (level_poolers): ModuleList(
        (0): ROIAlign(output_size=(14, 14), spatial_scale=0.25, sampling_ratio=0, aligned=True)
        (1): ROIAlign(output_size=(14, 14), spatial_scale=0.125, sampling_ratio=0, aligned=True)
        (2): ROIAlign(output_size=(14, 14), spatial_scale=0.0625, sampling_ratio=0, aligned=True)
        (3): ROIAlign(output_size=(14, 14), spatial_scale=0.03125, sampling_ratio=0, aligned=True)
      )
    )
    (mask_head): MaskRCNNConvUpsampleHead(
      (mask_fcn1): Conv2d(
        256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1)
        (activation): ReLU()
      )
      (mask_fcn2): Conv2d(
        256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1)
        (activation): ReLU()
      )
      (mask_fcn3): Conv2d(
        256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1)
        (activation): ReLU()
      )
      (mask_fcn4): Conv2d(
        256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1)
        (activation): ReLU()
      )
      (deconv): ConvTranspose2d(256, 256, kernel_size=(2, 2), stride=(2, 2))
      (deconv_relu): ReLU()
      (predictor): Conv2d(256, 7, kernel_size=(1, 1), stride=(1, 1))
    )
  )
)
[01/24 12:59:40 d2.data.datasets.coco]: Loaded 118 images in COCO format from novelctrlk6_8_coco_dataset/train/annotations.json
[01/24 12:59:40 d2.data.build]: Removed 0 images with no usable annotations. 118 images left.
[01/24 12:59:40 d2.data.dataset_mapper]: [DatasetMapper] Augmentations used in training: [ResizeShortestEdge(short_edge_length=(640, 672, 704, 736, 768, 800), max_size=1333, sample_style='choice'), RandomFlip()]
[01/24 12:59:40 d2.data.build]: Using training sampler RepeatFactorTrainingSampler
[01/24 12:59:40 d2.data.common]: Serializing 118 elements to byte tensors and concatenating them all ...
[01/24 12:59:40 d2.data.common]: Serialized dataset takes 0.31 MiB
WARNING [01/24 12:59:40 d2.solver.build]: SOLVER.STEPS contains values larger than SOLVER.MAX_ITER. These values will be ignored.
Skip loading parameter 'roi_heads.box_predictor.cls_score.weight' to the model due to incompatible shapes: (81, 1024) in the checkpoint but (8, 1024) in the model! You might want to double check if this is expected.
Skip loading parameter 'roi_heads.box_predictor.cls_score.bias' to the model due to incompatible shapes: (81,) in the checkpoint but (8,) in the model! You might want to double check if this is expected.
Skip loading parameter 'roi_heads.box_predictor.bbox_pred.weight' to the model due to incompatible shapes: (320, 1024) in the checkpoint but (28, 1024) in the model! You might want to double check if this is expected.
Skip loading parameter 'roi_heads.box_predictor.bbox_pred.bias' to the model due to incompatible shapes: (320,) in the checkpoint but (28,) in the model! You might want to double check if this is expected.
Skip loading parameter 'roi_heads.mask_head.predictor.weight' to the model due to incompatible shapes: (80, 256, 1, 1) in the checkpoint but (7, 256, 1, 1) in the model! You might want to double check if this is expected.
Skip loading parameter 'roi_heads.mask_head.predictor.bias' to the model due to incompatible shapes: (80,) in the checkpoint but (7,) in the model! You might want to double check if this is expected.
Some model parameters or buffers are not found in the checkpoint:
roi_heads.box_predictor.bbox_pred.{bias, weight}
roi_heads.box_predictor.cls_score.{bias, weight}
roi_heads.mask_head.predictor.{bias, weight}
# Look at training curves in tensorboard:
%load_ext tensorboard
%tensorboard --logdir output
Reusing TensorBoard on port 6007 (pid 153647), started 3 days, 1:03:08 ago. (Use '!kill 153647' to kill it.)
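For intuition on training length: with `IMS_PER_BATCH = 4` and the 118 training images reported by the loader above, 3000 iterations corresponds to roughly 100 passes over the dataset (the repeat-factor sampler changes the exact mix, so this is only a ballpark):

```python
ims_per_batch = 4   # cfg.SOLVER.IMS_PER_BATCH above
max_iter = 3000     # cfg.SOLVER.MAX_ITER above
num_images = 118    # reported by the dataset loader above

approx_epochs = max_iter * ims_per_batch / num_images
print(f"~{approx_epochs:.0f} epochs")  # ~102 epochs
```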
trainer.train()
[01/24 12:59:41 d2.engine.train_loop]: Starting training from iteration 0
/home/jeremy/anaconda3/envs/annolid-env/lib/python3.7/site-packages/detectron2/structures/image_list.py:88: UserWarning: __floordiv__ is deprecated, and its behavior will change in a future version of pytorch. It currently rounds toward 0 (like the 'trunc' function NOT 'floor'). This results in incorrect rounding for negative values. To keep the current behavior, use torch.div(a, b, rounding_mode='trunc'), or for actual floor division, use torch.div(a, b, rounding_mode='floor').
  max_size = (max_size + (stride - 1)) // stride * stride
/home/jeremy/anaconda3/envs/annolid-env/lib/python3.7/site-packages/torch/functional.py:445: UserWarning: torch.meshgrid: in an upcoming release, it will be required to pass the indexing argument. (Triggered internally at  ../aten/src/ATen/native/TensorShape.cpp:2157.)
  return _VF.meshgrid(tensors, **kwargs)  # type: ignore[attr-defined]
[01/24 12:59:49 d2.utils.events]:  eta: 0:21:30  iter: 19  total_loss: 5.746  loss_cls: 2.05  loss_box_reg: 0.7793  loss_mask: 0.6885  loss_rpn_cls: 1.848  loss_rpn_loc: 0.3801  time: 0.4068  data_time: 0.0162  lr: 4.9952e-05  max_mem: 3599M
[01/24 12:59:56 d2.utils.events]:  eta: 0:18:17  iter: 39  total_loss: 3.158  loss_cls: 1.374  loss_box_reg: 0.8234  loss_mask: 0.6686  loss_rpn_cls: 0.06725  loss_rpn_loc: 0.2583  time: 0.3941  data_time: 0.0078  lr: 9.9902e-05  max_mem: 3599M
[01/24 13:00:04 d2.utils.events]:  eta: 0:18:10  iter: 59  total_loss: 2.673  loss_cls: 0.9473  loss_box_reg: 0.836  loss_mask: 0.6551  loss_rpn_cls: 0.04344  loss_rpn_loc: 0.1974  time: 0.3931  data_time: 0.0070  lr: 0.00014985  max_mem: 3599M
[01/24 13:00:13 d2.utils.events]:  eta: 0:21:04  iter: 79  total_loss: 2.518  loss_cls: 0.8533  loss_box_reg: 0.8565  loss_mask: 0.6405  loss_rpn_cls: 0.03807  loss_rpn_loc: 0.1586  time: 0.4006  data_time: 0.0078  lr: 0.0001998  max_mem: 3599M
[01/24 13:00:21 d2.utils.events]:  eta: 0:21:00  iter: 99  total_loss: 2.406  loss_cls: 0.78  loss_box_reg: 0.8517  loss_mask: 0.6017  loss_rpn_cls: 0.02938  loss_rpn_loc: 0.1399  time: 0.4034  data_time: 0.0075  lr: 0.00024975  max_mem: 3599M
[01/24 13:00:29 d2.utils.events]:  eta: 0:20:54  iter: 119  total_loss: 2.241  loss_cls: 0.6895  loss_box_reg: 0.8214  loss_mask: 0.5775  loss_rpn_cls: 0.02272  loss_rpn_loc: 0.1236  time: 0.4042  data_time: 0.0079  lr: 0.0002997  max_mem: 3599M
[01/24 13:00:37 d2.utils.events]:  eta: 0:20:46  iter: 139  total_loss: 2.12  loss_cls: 0.5678  loss_box_reg: 0.8671  loss_mask: 0.5229  loss_rpn_cls: 0.01866  loss_rpn_loc: 0.1212  time: 0.4030  data_time: 0.0076  lr: 0.00034965  max_mem: 3599M
[01/24 13:00:45 d2.utils.events]:  eta: 0:20:38  iter: 159  total_loss: 2.016  loss_cls: 0.4978  loss_box_reg: 0.8265  loss_mask: 0.4958  loss_rpn_cls: 0.01693  loss_rpn_loc: 0.125  time: 0.4043  data_time: 0.0076  lr: 0.0003996  max_mem: 3599M
[01/24 13:00:54 d2.utils.events]:  eta: 0:20:32  iter: 179  total_loss: 1.806  loss_cls: 0.4218  loss_box_reg: 0.7675  loss_mask: 0.4541  loss_rpn_cls: 0.01643  loss_rpn_loc: 0.1312  time: 0.4061  data_time: 0.0080  lr: 0.00044955  max_mem: 3599M
[01/24 13:01:02 d2.utils.events]:  eta: 0:20:23  iter: 199  total_loss: 1.665  loss_cls: 0.3805  loss_box_reg: 0.6936  loss_mask: 0.4368  loss_rpn_cls: 0.01149  loss_rpn_loc: 0.1111  time: 0.4059  data_time: 0.0075  lr: 0.0004995  max_mem: 3599M
[01/24 13:01:10 d2.utils.events]:  eta: 0:20:15  iter: 219  total_loss: 1.588  loss_cls: 0.3465  loss_box_reg: 0.6821  loss_mask: 0.4154  loss_rpn_cls: 0.01313  loss_rpn_loc: 0.1095  time: 0.4064  data_time: 0.0079  lr: 0.00054945  max_mem: 3599M
[01/24 13:01:18 d2.utils.events]:  eta: 0:20:06  iter: 239  total_loss: 1.587  loss_cls: 0.3155  loss_box_reg: 0.6624  loss_mask: 0.4086  loss_rpn_cls: 0.0119  loss_rpn_loc: 0.1579  time: 0.4049  data_time: 0.0075  lr: 0.0005994  max_mem: 3599M
[01/24 13:01:26 d2.utils.events]:  eta: 0:19:57  iter: 259  total_loss: 1.376  loss_cls: 0.2926  loss_box_reg: 0.5809  loss_mask: 0.382  loss_rpn_cls: 0.01482  loss_rpn_loc: 0.1179  time: 0.4053  data_time: 0.0073  lr: 0.00064935  max_mem: 3599M
[01/24 13:01:34 d2.utils.events]:  eta: 0:19:49  iter: 279  total_loss: 1.526  loss_cls: 0.3108  loss_box_reg: 0.6488  loss_mask: 0.3984  loss_rpn_cls: 0.01381  loss_rpn_loc: 0.1297  time: 0.4059  data_time: 0.0078  lr: 0.0006993  max_mem: 3599M
[01/24 13:01:43 d2.utils.events]:  eta: 0:19:41  iter: 299  total_loss: 1.441  loss_cls: 0.2848  loss_box_reg: 0.6111  loss_mask: 0.4023  loss_rpn_cls: 0.00954  loss_rpn_loc: 0.1277  time: 0.4060  data_time: 0.0077  lr: 0.00074925  max_mem: 3599M
[01/24 13:01:50 d2.utils.events]:  eta: 0:19:31  iter: 319  total_loss: 1.365  loss_cls: 0.2394  loss_box_reg: 0.5746  loss_mask: 0.3674  loss_rpn_cls: 0.0158  loss_rpn_loc: 0.1355  time: 0.4048  data_time: 0.0086  lr: 0.0007992  max_mem: 3599M
[01/24 13:01:58 d2.utils.events]:  eta: 0:19:22  iter: 339  total_loss: 1.339  loss_cls: 0.2419  loss_box_reg: 0.5722  loss_mask: 0.3764  loss_rpn_cls: 0.01139  loss_rpn_loc: 0.1346  time: 0.4044  data_time: 0.0074  lr: 0.00084915  max_mem: 3599M
[01/24 13:02:06 d2.utils.events]:  eta: 0:19:13  iter: 359  total_loss: 1.361  loss_cls: 0.2535  loss_box_reg: 0.5578  loss_mask: 0.3782  loss_rpn_cls: 0.01327  loss_rpn_loc: 0.1219  time: 0.4042  data_time: 0.0079  lr: 0.0008991  max_mem: 3599M
[01/24 13:02:15 d2.utils.events]:  eta: 0:19:05  iter: 379  total_loss: 1.316  loss_cls: 0.2279  loss_box_reg: 0.5624  loss_mask: 0.3795  loss_rpn_cls: 0.009875  loss_rpn_loc: 0.1138  time: 0.4048  data_time: 0.0089  lr: 0.00094905  max_mem: 3599M
[01/24 13:02:23 d2.utils.events]:  eta: 0:18:56  iter: 399  total_loss: 1.303  loss_cls: 0.2409  loss_box_reg: 0.5437  loss_mask: 0.3664  loss_rpn_cls: 0.01186  loss_rpn_loc: 0.109  time: 0.4047  data_time: 0.0076  lr: 0.000999  max_mem: 3599M
[01/24 13:02:31 d2.utils.events]:  eta: 0:18:47  iter: 419  total_loss: 1.251  loss_cls: 0.2311  loss_box_reg: 0.5163  loss_mask: 0.3879  loss_rpn_cls: 0.009109  loss_rpn_loc: 0.09089  time: 0.4043  data_time: 0.0075  lr: 0.001049  max_mem: 3599M
[01/24 13:02:39 d2.utils.events]:  eta: 0:18:38  iter: 439  total_loss: 1.287  loss_cls: 0.2367  loss_box_reg: 0.5588  loss_mask: 0.3844  loss_rpn_cls: 0.008798  loss_rpn_loc: 0.1366  time: 0.4040  data_time: 0.0079  lr: 0.0010989  max_mem: 3599M
[01/24 13:02:47 d2.utils.events]:  eta: 0:18:30  iter: 459  total_loss: 1.248  loss_cls: 0.2094  loss_box_reg: 0.5386  loss_mask: 0.3669  loss_rpn_cls: 0.01216  loss_rpn_loc: 0.0985  time: 0.4044  data_time: 0.0081  lr: 0.0011489  max_mem: 3599M
[01/24 13:02:55 d2.utils.events]:  eta: 0:18:21  iter: 479  total_loss: 1.26  loss_cls: 0.2303  loss_box_reg: 0.5218  loss_mask: 0.3885  loss_rpn_cls: 0.008737  loss_rpn_loc: 0.09136  time: 0.4042  data_time: 0.0077  lr: 0.0011988  max_mem: 3599M
[01/24 13:03:03 d2.utils.events]:  eta: 0:18:11  iter: 499  total_loss: 1.278  loss_cls: 0.2205  loss_box_reg: 0.5256  loss_mask: 0.3842  loss_rpn_cls: 0.008998  loss_rpn_loc: 0.1466  time: 0.4039  data_time: 0.0069  lr: 0.0012488  max_mem: 3599M
[01/24 13:03:11 d2.utils.events]:  eta: 0:18:01  iter: 519  total_loss: 1.235  loss_cls: 0.1978  loss_box_reg: 0.5216  loss_mask: 0.3535  loss_rpn_cls: 0.009164  loss_rpn_loc: 0.1052  time: 0.4037  data_time: 0.0074  lr: 0.0012987  max_mem: 3599M
[01/24 13:03:19 d2.utils.events]:  eta: 0:17:52  iter: 539  total_loss: 1.294  loss_cls: 0.2079  loss_box_reg: 0.5372  loss_mask: 0.3774  loss_rpn_cls: 0.008165  loss_rpn_loc: 0.127  time: 0.4037  data_time: 0.0072  lr: 0.0013487  max_mem: 3599M
[01/24 13:03:27 d2.utils.events]:  eta: 0:17:44  iter: 559  total_loss: 1.206  loss_cls: 0.2037  loss_box_reg: 0.5117  loss_mask: 0.3873  loss_rpn_cls: 0.00716  loss_rpn_loc: 0.08379  time: 0.4041  data_time: 0.0124  lr: 0.0013986  max_mem: 3599M
[01/24 13:03:36 d2.utils.events]:  eta: 0:17:36  iter: 579  total_loss: 1.226  loss_cls: 0.2023  loss_box_reg: 0.5093  loss_mask: 0.3575  loss_rpn_cls: 0.007724  loss_rpn_loc: 0.1302  time: 0.4042  data_time: 0.0081  lr: 0.0014486  max_mem: 3599M
[01/24 13:03:43 d2.utils.events]:  eta: 0:17:26  iter: 599  total_loss: 1.218  loss_cls: 0.2002  loss_box_reg: 0.5103  loss_mask: 0.3751  loss_rpn_cls: 0.008259  loss_rpn_loc: 0.08441  time: 0.4040  data_time: 0.0077  lr: 0.0014985  max_mem: 3599M
[01/24 13:03:51 d2.utils.events]:  eta: 0:17:16  iter: 619  total_loss: 1.236  loss_cls: 0.2084  loss_box_reg: 0.4961  loss_mask: 0.3602  loss_rpn_cls: 0.008962  loss_rpn_loc: 0.1545  time: 0.4037  data_time: 0.0077  lr: 0.0015485  max_mem: 3599M
[01/24 13:04:00 d2.utils.events]:  eta: 0:17:07  iter: 639  total_loss: 1.165  loss_cls: 0.1905  loss_box_reg: 0.4875  loss_mask: 0.3654  loss_rpn_cls: 0.006638  loss_rpn_loc: 0.08714  time: 0.4039  data_time: 0.0077  lr: 0.0015984  max_mem: 3599M
[01/24 13:04:08 d2.utils.events]:  eta: 0:16:59  iter: 659  total_loss: 1.17  loss_cls: 0.1785  loss_box_reg: 0.4771  loss_mask: 0.3947  loss_rpn_cls: 0.008973  loss_rpn_loc: 0.08612  time: 0.4041  data_time: 0.0080  lr: 0.0016484  max_mem: 3599M
[01/24 13:04:16 d2.utils.events]:  eta: 0:16:48  iter: 679  total_loss: 1.246  loss_cls: 0.2001  loss_box_reg: 0.5117  loss_mask: 0.3799  loss_rpn_cls: 0.006953  loss_rpn_loc: 0.1028  time: 0.4039  data_time: 0.0072  lr: 0.0016983  max_mem: 3599M
[01/24 13:04:24 d2.utils.events]:  eta: 0:16:37  iter: 699  total_loss: 1.182  loss_cls: 0.1981  loss_box_reg: 0.5156  loss_mask: 0.3698  loss_rpn_cls: 0.005824  loss_rpn_loc: 0.08289  time: 0.4038  data_time: 0.0079  lr: 0.0017483  max_mem: 3599M
[01/24 13:04:32 d2.utils.events]:  eta: 0:16:31  iter: 719  total_loss: 1.16  loss_cls: 0.184  loss_box_reg: 0.477  loss_mask: 0.3763  loss_rpn_cls: 0.009227  loss_rpn_loc: 0.09665  time: 0.4039  data_time: 0.0077  lr: 0.0017982  max_mem: 3599M
[01/24 13:04:40 d2.utils.events]:  eta: 0:16:23  iter: 739  total_loss: 1.207  loss_cls: 0.1747  loss_box_reg: 0.4921  loss_mask: 0.3792  loss_rpn_cls: 0.007379  loss_rpn_loc: 0.1228  time: 0.4040  data_time: 0.0080  lr: 0.0018482  max_mem: 3599M
[01/24 13:04:48 d2.utils.events]:  eta: 0:16:13  iter: 759  total_loss: 1.13  loss_cls: 0.1746  loss_box_reg: 0.4666  loss_mask: 0.3685  loss_rpn_cls: 0.007878  loss_rpn_loc: 0.1208  time: 0.4041  data_time: 0.0078  lr: 0.0018981  max_mem: 3599M
[01/24 13:04:57 d2.utils.events]:  eta: 0:16:05  iter: 779  total_loss: 1.109  loss_cls: 0.1631  loss_box_reg: 0.454  loss_mask: 0.3618  loss_rpn_cls: 0.006868  loss_rpn_loc: 0.07819  time: 0.4042  data_time: 0.0077  lr: 0.0019481  max_mem: 3599M
[01/24 13:05:05 d2.utils.events]:  eta: 0:15:58  iter: 799  total_loss: 1.154  loss_cls: 0.1682  loss_box_reg: 0.4597  loss_mask: 0.3658  loss_rpn_cls: 0.006494  loss_rpn_loc: 0.1422  time: 0.4046  data_time: 0.0075  lr: 0.001998  max_mem: 3599M
[01/24 13:05:13 d2.utils.events]:  eta: 0:15:51  iter: 819  total_loss: 1.123  loss_cls: 0.1581  loss_box_reg: 0.4616  loss_mask: 0.3871  loss_rpn_cls: 0.008865  loss_rpn_loc: 0.07634  time: 0.4049  data_time: 0.0073  lr: 0.002048  max_mem: 3599M
[01/24 13:05:21 d2.utils.events]:  eta: 0:15:42  iter: 839  total_loss: 1.055  loss_cls: 0.1515  loss_box_reg: 0.441  loss_mask: 0.3505  loss_rpn_cls: 0.01104  loss_rpn_loc: 0.07654  time: 0.4049  data_time: 0.0080  lr: 0.0020979  max_mem: 3599M
[01/24 13:05:30 d2.utils.events]:  eta: 0:15:33  iter: 859  total_loss: 1.11  loss_cls: 0.1592  loss_box_reg: 0.4547  loss_mask: 0.3688  loss_rpn_cls: 0.008113  loss_rpn_loc: 0.09421  time: 0.4050  data_time: 0.0074  lr: 0.0021479  max_mem: 3599M
[01/24 13:05:38 d2.utils.events]:  eta: 0:15:26  iter: 879  total_loss: 1.033  loss_cls: 0.1423  loss_box_reg: 0.4381  loss_mask: 0.3383  loss_rpn_cls: 0.008075  loss_rpn_loc: 0.08529  time: 0.4053  data_time: 0.0079  lr: 0.0021978  max_mem: 3599M
[01/24 13:05:46 d2.utils.events]:  eta: 0:15:17  iter: 899  total_loss: 1.095  loss_cls: 0.1465  loss_box_reg: 0.4649  loss_mask: 0.3577  loss_rpn_cls: 0.008079  loss_rpn_loc: 0.1055  time: 0.4057  data_time: 0.0080  lr: 0.0022478  max_mem: 3599M
[01/24 13:05:55 d2.utils.events]:  eta: 0:15:09  iter: 919  total_loss: 1.124  loss_cls: 0.1714  loss_box_reg: 0.4702  loss_mask: 0.37  loss_rpn_cls: 0.007444  loss_rpn_loc: 0.09555  time: 0.4061  data_time: 0.0078  lr: 0.0022977  max_mem: 3599M
[01/24 13:06:03 d2.utils.events]:  eta: 0:15:00  iter: 939  total_loss: 1.067  loss_cls: 0.1541  loss_box_reg: 0.4402  loss_mask: 0.3522  loss_rpn_cls: 0.006906  loss_rpn_loc: 0.08798  time: 0.4059  data_time: 0.0075  lr: 0.0023477  max_mem: 3599M
[01/24 13:06:11 d2.utils.events]:  eta: 0:14:52  iter: 959  total_loss: 1.121  loss_cls: 0.1728  loss_box_reg: 0.4442  loss_mask: 0.3732  loss_rpn_cls: 0.006422  loss_rpn_loc: 0.1111  time: 0.4060  data_time: 0.0077  lr: 0.0023976  max_mem: 3599M
[01/24 13:06:19 d2.utils.events]:  eta: 0:14:44  iter: 979  total_loss: 1.068  loss_cls: 0.1658  loss_box_reg: 0.4307  loss_mask: 0.3403  loss_rpn_cls: 0.005036  loss_rpn_loc: 0.1029  time: 0.4061  data_time: 0.0077  lr: 0.0024476  max_mem: 3599M
[01/24 13:06:28 d2.utils.events]:  eta: 0:14:34  iter: 999  total_loss: 1.071  loss_cls: 0.1445  loss_box_reg: 0.4257  loss_mask: 0.3737  loss_rpn_cls: 0.007741  loss_rpn_loc: 0.08348  time: 0.4060  data_time: 0.0079  lr: 0.0024975  max_mem: 3599M
[01/24 13:06:36 d2.utils.events]:  eta: 0:14:29  iter: 1019  total_loss: 1.043  loss_cls: 0.1412  loss_box_reg: 0.4311  loss_mask: 0.3415  loss_rpn_cls: 0.005507  loss_rpn_loc: 0.08056  time: 0.4062  data_time: 0.0081  lr: 0.0025  max_mem: 3599M
[01/24 13:06:44 d2.utils.events]:  eta: 0:14:21  iter: 1039  total_loss: 1.085  loss_cls: 0.1345  loss_box_reg: 0.4438  loss_mask: 0.358  loss_rpn_cls: 0.008674  loss_rpn_loc: 0.103  time: 0.4061  data_time: 0.0074  lr: 0.0025  max_mem: 3599M
[01/24 13:06:52 d2.utils.events]:  eta: 0:14:14  iter: 1059  total_loss: 1.094  loss_cls: 0.1406  loss_box_reg: 0.4456  loss_mask: 0.3678  loss_rpn_cls: 0.007375  loss_rpn_loc: 0.09417  time: 0.4061  data_time: 0.0078  lr: 0.0025  max_mem: 3599M
[01/24 13:07:00 d2.utils.events]:  eta: 0:14:07  iter: 1079  total_loss: 1.071  loss_cls: 0.1673  loss_box_reg: 0.4388  loss_mask: 0.3491  loss_rpn_cls: 0.007313  loss_rpn_loc: 0.09513  time: 0.4062  data_time: 0.0075  lr: 0.0025  max_mem: 3599M
[01/24 13:07:09 d2.utils.events]:  eta: 0:13:59  iter: 1099  total_loss: 0.988  loss_cls: 0.1296  loss_box_reg: 0.4077  loss_mask: 0.3368  loss_rpn_cls: 0.006058  loss_rpn_loc: 0.09165  time: 0.4063  data_time: 0.0079  lr: 0.0025  max_mem: 3599M
[01/24 13:07:17 d2.utils.events]:  eta: 0:13:51  iter: 1119  total_loss: 1.065  loss_cls: 0.1404  loss_box_reg: 0.4233  loss_mask: 0.3584  loss_rpn_cls: 0.008979  loss_rpn_loc: 0.07634  time: 0.4065  data_time: 0.0075  lr: 0.0025  max_mem: 3599M
[01/24 13:07:25 d2.utils.events]:  eta: 0:13:42  iter: 1139  total_loss: 0.9815  loss_cls: 0.1392  loss_box_reg: 0.3999  loss_mask: 0.3436  loss_rpn_cls: 0.0072  loss_rpn_loc: 0.0684  time: 0.4065  data_time: 0.0079  lr: 0.0025  max_mem: 3599M
[01/24 13:07:33 d2.utils.events]:  eta: 0:13:33  iter: 1159  total_loss: 1.001  loss_cls: 0.1146  loss_box_reg: 0.4069  loss_mask: 0.3454  loss_rpn_cls: 0.01099  loss_rpn_loc: 0.1088  time: 0.4062  data_time: 0.0073  lr: 0.0025  max_mem: 3599M
[01/24 13:07:41 d2.utils.events]:  eta: 0:13:24  iter: 1179  total_loss: 0.9924  loss_cls: 0.1227  loss_box_reg: 0.4145  loss_mask: 0.3627  loss_rpn_cls: 0.004273  loss_rpn_loc: 0.08362  time: 0.4064  data_time: 0.0079  lr: 0.0025  max_mem: 3599M
[01/24 13:07:49 d2.utils.events]:  eta: 0:13:16  iter: 1199  total_loss: 0.9583  loss_cls: 0.131  loss_box_reg: 0.3828  loss_mask: 0.3551  loss_rpn_cls: 0.006091  loss_rpn_loc: 0.07373  time: 0.4063  data_time: 0.0074  lr: 0.0025  max_mem: 3599M
[01/24 13:07:57 d2.utils.events]:  eta: 0:13:08  iter: 1219  total_loss: 0.9978  loss_cls: 0.1256  loss_box_reg: 0.3876  loss_mask: 0.3547  loss_rpn_cls: 0.009088  loss_rpn_loc: 0.1077  time: 0.4063  data_time: 0.0073  lr: 0.0025  max_mem: 3599M
[01/24 13:08:05 d2.utils.events]:  eta: 0:12:59  iter: 1239  total_loss: 0.9762  loss_cls: 0.1284  loss_box_reg: 0.3908  loss_mask: 0.3467  loss_rpn_cls: 0.006199  loss_rpn_loc: 0.07743  time: 0.4060  data_time: 0.0073  lr: 0.0025  max_mem: 3599M
[01/24 13:08:14 d2.utils.events]:  eta: 0:12:50  iter: 1259  total_loss: 0.9489  loss_cls: 0.118  loss_box_reg: 0.3868  loss_mask: 0.3566  loss_rpn_cls: 0.00547  loss_rpn_loc: 0.0862  time: 0.4062  data_time: 0.0081  lr: 0.0025  max_mem: 3599M
[01/24 13:08:22 d2.utils.events]:  eta: 0:12:42  iter: 1279  total_loss: 0.9863  loss_cls: 0.1137  loss_box_reg: 0.4087  loss_mask: 0.3569  loss_rpn_cls: 0.008421  loss_rpn_loc: 0.07432  time: 0.4064  data_time: 0.0076  lr: 0.0025  max_mem: 3599M
[01/24 13:08:30 d2.utils.events]:  eta: 0:12:33  iter: 1299  total_loss: 0.9329  loss_cls: 0.1131  loss_box_reg: 0.377  loss_mask: 0.3447  loss_rpn_cls: 0.008236  loss_rpn_loc: 0.07238  time: 0.4065  data_time: 0.0070  lr: 0.0025  max_mem: 3599M
[01/24 13:08:39 d2.utils.events]:  eta: 0:12:25  iter: 1319  total_loss: 0.9631  loss_cls: 0.1129  loss_box_reg: 0.3778  loss_mask: 0.3393  loss_rpn_cls: 0.007597  loss_rpn_loc: 0.1198  time: 0.4067  data_time: 0.0080  lr: 0.0025  max_mem: 3599M
[01/24 13:08:47 d2.utils.events]:  eta: 0:12:17  iter: 1339  total_loss: 0.9168  loss_cls: 0.1201  loss_box_reg: 0.3791  loss_mask: 0.3526  loss_rpn_cls: 0.007066  loss_rpn_loc: 0.06937  time: 0.4069  data_time: 0.0076  lr: 0.0025  max_mem: 3599M
[01/24 13:08:55 d2.utils.events]:  eta: 0:12:08  iter: 1359  total_loss: 0.9252  loss_cls: 0.1088  loss_box_reg: 0.3694  loss_mask: 0.3321  loss_rpn_cls: 0.007251  loss_rpn_loc: 0.07654  time: 0.4067  data_time: 0.0074  lr: 0.0025  max_mem: 3599M
[01/24 13:09:03 d2.utils.events]:  eta: 0:11:59  iter: 1379  total_loss: 0.9791  loss_cls: 0.1147  loss_box_reg: 0.3836  loss_mask: 0.3581  loss_rpn_cls: 0.007708  loss_rpn_loc: 0.08671  time: 0.4068  data_time: 0.0080  lr: 0.0025  max_mem: 3599M
[01/24 13:09:12 d2.utils.events]:  eta: 0:11:51  iter: 1399  total_loss: 0.899  loss_cls: 0.1194  loss_box_reg: 0.3629  loss_mask: 0.3435  loss_rpn_cls: 0.005933  loss_rpn_loc: 0.1211  time: 0.4069  data_time: 0.0078  lr: 0.0025  max_mem: 3599M
[01/24 13:09:20 d2.utils.events]:  eta: 0:11:42  iter: 1419  total_loss: 0.9121  loss_cls: 0.09668  loss_box_reg: 0.3527  loss_mask: 0.3444  loss_rpn_cls: 0.006468  loss_rpn_loc: 0.07001  time: 0.4072  data_time: 0.0077  lr: 0.0025  max_mem: 3599M
[01/24 13:09:28 d2.utils.events]:  eta: 0:11:33  iter: 1439  total_loss: 0.9734  loss_cls: 0.104  loss_box_reg: 0.3967  loss_mask: 0.3419  loss_rpn_cls: 0.004942  loss_rpn_loc: 0.09219  time: 0.4072  data_time: 0.0079  lr: 0.0025  max_mem: 3599M
[01/24 13:09:37 d2.utils.events]:  eta: 0:11:25  iter: 1459  total_loss: 0.9322  loss_cls: 0.119  loss_box_reg: 0.3553  loss_mask: 0.3312  loss_rpn_cls: 0.00727  loss_rpn_loc: 0.06361  time: 0.4073  data_time: 0.0085  lr: 0.0025  max_mem: 3599M
[01/24 13:09:45 d2.utils.events]:  eta: 0:11:16  iter: 1479  total_loss: 0.9113  loss_cls: 0.09831  loss_box_reg: 0.3612  loss_mask: 0.3404  loss_rpn_cls: 0.007151  loss_rpn_loc: 0.08264  time: 0.4075  data_time: 0.0077  lr: 0.0025  max_mem: 3599M
[01/24 13:09:53 d2.utils.events]:  eta: 0:11:07  iter: 1499  total_loss: 0.9247  loss_cls: 0.1038  loss_box_reg: 0.358  loss_mask: 0.3469  loss_rpn_cls: 0.006618  loss_rpn_loc: 0.06793  time: 0.4075  data_time: 0.0079  lr: 0.0025  max_mem: 3599M
[01/24 13:10:01 d2.utils.events]:  eta: 0:10:59  iter: 1519  total_loss: 0.9105  loss_cls: 0.1081  loss_box_reg: 0.3508  loss_mask: 0.3367  loss_rpn_cls: 0.006828  loss_rpn_loc: 0.1114  time: 0.4075  data_time: 0.0079  lr: 0.0025  max_mem: 3599M
[01/24 13:10:09 d2.utils.events]:  eta: 0:10:50  iter: 1539  total_loss: 0.9023  loss_cls: 0.08591  loss_box_reg: 0.344  loss_mask: 0.339  loss_rpn_cls: 0.00864  loss_rpn_loc: 0.1366  time: 0.4074  data_time: 0.0076  lr: 0.0025  max_mem: 3599M
[01/24 13:10:17 d2.utils.events]:  eta: 0:10:41  iter: 1559  total_loss: 0.8922  loss_cls: 0.1031  loss_box_reg: 0.3554  loss_mask: 0.3537  loss_rpn_cls: 0.007561  loss_rpn_loc: 0.08405  time: 0.4073  data_time: 0.0074  lr: 0.0025  max_mem: 3599M
[01/24 13:10:26 d2.utils.events]:  eta: 0:10:32  iter: 1579  total_loss: 0.881  loss_cls: 0.09391  loss_box_reg: 0.3351  loss_mask: 0.3258  loss_rpn_cls: 0.005898  loss_rpn_loc: 0.08909  time: 0.4073  data_time: 0.0076  lr: 0.0025  max_mem: 3599M
[01/24 13:10:34 d2.utils.events]:  eta: 0:10:23  iter: 1599  total_loss: 0.9362  loss_cls: 0.09237  loss_box_reg: 0.3629  loss_mask: 0.3276  loss_rpn_cls: 0.007497  loss_rpn_loc: 0.1479  time: 0.4074  data_time: 0.0075  lr: 0.0025  max_mem: 3599M
[01/24 13:10:42 d2.utils.events]:  eta: 0:10:14  iter: 1619  total_loss: 0.8826  loss_cls: 0.08761  loss_box_reg: 0.3585  loss_mask: 0.3305  loss_rpn_cls: 0.005354  loss_rpn_loc: 0.07594  time: 0.4076  data_time: 0.0074  lr: 0.0025  max_mem: 3599M
[01/24 13:10:51 d2.utils.events]:  eta: 0:10:05  iter: 1639  total_loss: 0.8617  loss_cls: 0.1011  loss_box_reg: 0.3362  loss_mask: 0.333  loss_rpn_cls: 0.005574  loss_rpn_loc: 0.07444  time: 0.4076  data_time: 0.0078  lr: 0.0025  max_mem: 3599M
[01/24 13:10:58 d2.utils.events]:  eta: 0:09:56  iter: 1659  total_loss: 0.8981  loss_cls: 0.07631  loss_box_reg: 0.3364  loss_mask: 0.3218  loss_rpn_cls: 0.006208  loss_rpn_loc: 0.1339  time: 0.4074  data_time: 0.0078  lr: 0.0025  max_mem: 3599M
[01/24 13:11:07 d2.utils.events]:  eta: 0:09:48  iter: 1679  total_loss: 0.8744  loss_cls: 0.09407  loss_box_reg: 0.3541  loss_mask: 0.3304  loss_rpn_cls: 0.005485  loss_rpn_loc: 0.08007  time: 0.4074  data_time: 0.0077  lr: 0.0025  max_mem: 3599M
[01/24 13:11:15 d2.utils.events]:  eta: 0:09:39  iter: 1699  total_loss: 0.8851  loss_cls: 0.07669  loss_box_reg: 0.3345  loss_mask: 0.3331  loss_rpn_cls: 0.005673  loss_rpn_loc: 0.08636  time: 0.4075  data_time: 0.0075  lr: 0.0025  max_mem: 3599M
[01/24 13:11:23 d2.utils.events]:  eta: 0:09:30  iter: 1719  total_loss: 0.8533  loss_cls: 0.1068  loss_box_reg: 0.336  loss_mask: 0.3344  loss_rpn_cls: 0.006851  loss_rpn_loc: 0.07491  time: 0.4075  data_time: 0.0074  lr: 0.0025  max_mem: 3599M
[01/24 13:11:31 d2.utils.events]:  eta: 0:09:21  iter: 1739  total_loss: 0.8465  loss_cls: 0.0938  loss_box_reg: 0.3012  loss_mask: 0.3283  loss_rpn_cls: 0.005964  loss_rpn_loc: 0.06621  time: 0.4073  data_time: 0.0071  lr: 0.0025  max_mem: 3599M
[01/24 13:11:39 d2.utils.events]:  eta: 0:09:12  iter: 1759  total_loss: 0.8577  loss_cls: 0.08192  loss_box_reg: 0.3205  loss_mask: 0.3321  loss_rpn_cls: 0.005296  loss_rpn_loc: 0.0607  time: 0.4071  data_time: 0.0077  lr: 0.0025  max_mem: 3599M
[01/24 13:11:47 d2.utils.events]:  eta: 0:09:03  iter: 1779  total_loss: 0.8663  loss_cls: 0.09168  loss_box_reg: 0.3352  loss_mask: 0.3335  loss_rpn_cls: 0.004898  loss_rpn_loc: 0.1002  time: 0.4073  data_time: 0.0075  lr: 0.0025  max_mem: 3599M
[01/24 13:11:55 d2.utils.events]:  eta: 0:08:54  iter: 1799  total_loss: 0.7885  loss_cls: 0.07729  loss_box_reg: 0.3191  loss_mask: 0.3147  loss_rpn_cls: 0.003958  loss_rpn_loc: 0.06303  time: 0.4073  data_time: 0.0078  lr: 0.0025  max_mem: 3599M
[01/24 13:12:03 d2.utils.events]:  eta: 0:08:45  iter: 1819  total_loss: 0.872  loss_cls: 0.07849  loss_box_reg: 0.326  loss_mask: 0.3327  loss_rpn_cls: 0.004781  loss_rpn_loc: 0.08725  time: 0.4072  data_time: 0.0079  lr: 0.0025  max_mem: 3599M
[01/24 13:12:11 d2.utils.events]:  eta: 0:08:36  iter: 1839  total_loss: 0.8682  loss_cls: 0.08067  loss_box_reg: 0.3302  loss_mask: 0.3126  loss_rpn_cls: 0.005218  loss_rpn_loc: 0.11  time: 0.4070  data_time: 0.0078  lr: 0.0025  max_mem: 3599M
[01/24 13:12:19 d2.utils.events]:  eta: 0:08:27  iter: 1859  total_loss: 0.848  loss_cls: 0.08207  loss_box_reg: 0.3161  loss_mask: 0.3266  loss_rpn_cls: 0.006865  loss_rpn_loc: 0.08856  time: 0.4069  data_time: 0.0076  lr: 0.0025  max_mem: 3599M
[01/24 13:12:27 d2.utils.events]:  eta: 0:08:18  iter: 1879  total_loss: 0.8464  loss_cls: 0.09783  loss_box_reg: 0.324  loss_mask: 0.3325  loss_rpn_cls: 0.004321  loss_rpn_loc: 0.08057  time: 0.4067  data_time: 0.0077  lr: 0.0025  max_mem: 3599M
[01/24 13:12:35 d2.utils.events]:  eta: 0:08:09  iter: 1899  total_loss: 0.8308  loss_cls: 0.09051  loss_box_reg: 0.3189  loss_mask: 0.3282  loss_rpn_cls: 0.004824  loss_rpn_loc: 0.09344  time: 0.4066  data_time: 0.0079  lr: 0.0025  max_mem: 3599M
[01/24 13:12:43 d2.utils.events]:  eta: 0:08:00  iter: 1919  total_loss: 0.7939  loss_cls: 0.08784  loss_box_reg: 0.2963  loss_mask: 0.325  loss_rpn_cls: 0.005509  loss_rpn_loc: 0.06425  time: 0.4068  data_time: 0.0081  lr: 0.0025  max_mem: 3599M
[01/24 13:12:52 d2.utils.events]:  eta: 0:07:51  iter: 1939  total_loss: 0.7955  loss_cls: 0.08848  loss_box_reg: 0.286  loss_mask: 0.3058  loss_rpn_cls: 0.007493  loss_rpn_loc: 0.09741  time: 0.4068  data_time: 0.0078  lr: 0.0025  max_mem: 3599M
[01/24 13:13:00 d2.utils.events]:  eta: 0:07:42  iter: 1959  total_loss: 0.8473  loss_cls: 0.08336  loss_box_reg: 0.3061  loss_mask: 0.3191  loss_rpn_cls: 0.005327  loss_rpn_loc: 0.09746  time: 0.4067  data_time: 0.0073  lr: 0.0025  max_mem: 3599M
[01/24 13:13:08 d2.utils.events]:  eta: 0:07:33  iter: 1979  total_loss: 0.8539  loss_cls: 0.08872  loss_box_reg: 0.3069  loss_mask: 0.3383  loss_rpn_cls: 0.008127  loss_rpn_loc: 0.1077  time: 0.4066  data_time: 0.0074  lr: 0.0025  max_mem: 3599M
[01/24 13:13:16 d2.utils.events]:  eta: 0:07:25  iter: 1999  total_loss: 0.8048  loss_cls: 0.07482  loss_box_reg: 0.3065  loss_mask: 0.3102  loss_rpn_cls: 0.004048  loss_rpn_loc: 0.09146  time: 0.4066  data_time: 0.0078  lr: 0.0025  max_mem: 3599M
[01/24 13:13:24 d2.utils.events]:  eta: 0:07:16  iter: 2019  total_loss: 0.7888  loss_cls: 0.06711  loss_box_reg: 0.297  loss_mask: 0.3241  loss_rpn_cls: 0.004265  loss_rpn_loc: 0.08338  time: 0.4067  data_time: 0.0082  lr: 0.0025  max_mem: 3599M
[01/24 13:13:32 d2.utils.events]:  eta: 0:07:07  iter: 2039  total_loss: 0.8322  loss_cls: 0.09426  loss_box_reg: 0.3311  loss_mask: 0.3144  loss_rpn_cls: 0.004307  loss_rpn_loc: 0.05577  time: 0.4067  data_time: 0.0078  lr: 0.0025  max_mem: 3599M
[01/24 13:13:41 d2.utils.events]:  eta: 0:06:58  iter: 2059  total_loss: 0.7891  loss_cls: 0.08497  loss_box_reg: 0.301  loss_mask: 0.3115  loss_rpn_cls: 0.005715  loss_rpn_loc: 0.05674  time: 0.4068  data_time: 0.0092  lr: 0.0025  max_mem: 3599M
[01/24 13:13:49 d2.utils.events]:  eta: 0:06:49  iter: 2079  total_loss: 0.7805  loss_cls: 0.07294  loss_box_reg: 0.2847  loss_mask: 0.3003  loss_rpn_cls: 0.005284  loss_rpn_loc: 0.07338  time: 0.4068  data_time: 0.0079  lr: 0.0025  max_mem: 3599M
[01/24 13:13:58 d2.utils.events]:  eta: 0:06:40  iter: 2099  total_loss: 0.7829  loss_cls: 0.08096  loss_box_reg: 0.2854  loss_mask: 0.3064  loss_rpn_cls: 0.006051  loss_rpn_loc: 0.1192  time: 0.4070  data_time: 0.0078  lr: 0.0025  max_mem: 3599M
[01/24 13:14:06 d2.utils.events]:  eta: 0:06:31  iter: 2119  total_loss: 0.7601  loss_cls: 0.07892  loss_box_reg: 0.2936  loss_mask: 0.3189  loss_rpn_cls: 0.004799  loss_rpn_loc: 0.05819  time: 0.4070  data_time: 0.0078  lr: 0.0025  max_mem: 3599M
[01/24 13:14:14 d2.utils.events]:  eta: 0:06:22  iter: 2139  total_loss: 0.7907  loss_cls: 0.08108  loss_box_reg: 0.2949  loss_mask: 0.3223  loss_rpn_cls: 0.004407  loss_rpn_loc: 0.06553  time: 0.4070  data_time: 0.0081  lr: 0.0025  max_mem: 3599M
[01/24 13:14:22 d2.utils.events]:  eta: 0:06:13  iter: 2159  total_loss: 0.7618  loss_cls: 0.06952  loss_box_reg: 0.2891  loss_mask: 0.3121  loss_rpn_cls: 0.005302  loss_rpn_loc: 0.06034  time: 0.4069  data_time: 0.0081  lr: 0.0025  max_mem: 3599M
[01/24 13:14:30 d2.utils.events]:  eta: 0:06:05  iter: 2179  total_loss: 0.8223  loss_cls: 0.06911  loss_box_reg: 0.3076  loss_mask: 0.3309  loss_rpn_cls: 0.005022  loss_rpn_loc: 0.0704  time: 0.4069  data_time: 0.0079  lr: 0.0025  max_mem: 3599M
[01/24 13:14:39 d2.utils.events]:  eta: 0:05:56  iter: 2199  total_loss: 0.797  loss_cls: 0.0712  loss_box_reg: 0.3048  loss_mask: 0.3177  loss_rpn_cls: 0.003261  loss_rpn_loc: 0.06827  time: 0.4071  data_time: 0.0077  lr: 0.0025  max_mem: 3599M
[01/24 13:14:47 d2.utils.events]:  eta: 0:05:47  iter: 2219  total_loss: 0.802  loss_cls: 0.06918  loss_box_reg: 0.2915  loss_mask: 0.3272  loss_rpn_cls: 0.005211  loss_rpn_loc: 0.08944  time: 0.4071  data_time: 0.0079  lr: 0.0025  max_mem: 3599M
[01/24 13:14:55 d2.utils.events]:  eta: 0:05:38  iter: 2239  total_loss: 0.7654  loss_cls: 0.07413  loss_box_reg: 0.2873  loss_mask: 0.3198  loss_rpn_cls: 0.004251  loss_rpn_loc: 0.07268  time: 0.4071  data_time: 0.0075  lr: 0.0025  max_mem: 3599M
[01/24 13:15:03 d2.utils.events]:  eta: 0:05:29  iter: 2259  total_loss: 0.7589  loss_cls: 0.06129  loss_box_reg: 0.2713  loss_mask: 0.3025  loss_rpn_cls: 0.004542  loss_rpn_loc: 0.09818  time: 0.4072  data_time: 0.0073  lr: 0.0025  max_mem: 3599M
[01/24 13:15:11 d2.utils.events]:  eta: 0:05:20  iter: 2279  total_loss: 0.7193  loss_cls: 0.06614  loss_box_reg: 0.2725  loss_mask: 0.3018  loss_rpn_cls: 0.002605  loss_rpn_loc: 0.06126  time: 0.4071  data_time: 0.0076  lr: 0.0025  max_mem: 3599M
[01/24 13:15:19 d2.utils.events]:  eta: 0:05:11  iter: 2299  total_loss: 0.7637  loss_cls: 0.0649  loss_box_reg: 0.2781  loss_mask: 0.3133  loss_rpn_cls: 0.004004  loss_rpn_loc: 0.07071  time: 0.4071  data_time: 0.0077  lr: 0.0025  max_mem: 3599M
[01/24 13:15:28 d2.utils.events]:  eta: 0:05:02  iter: 2319  total_loss: 0.7424  loss_cls: 0.06156  loss_box_reg: 0.2702  loss_mask: 0.3141  loss_rpn_cls: 0.005257  loss_rpn_loc: 0.05424  time: 0.4073  data_time: 0.0078  lr: 0.0025  max_mem: 3599M
[01/24 13:15:36 d2.utils.events]:  eta: 0:04:53  iter: 2339  total_loss: 0.7443  loss_cls: 0.0756  loss_box_reg: 0.2788  loss_mask: 0.3156  loss_rpn_cls: 0.006159  loss_rpn_loc: 0.06478  time: 0.4072  data_time: 0.0077  lr: 0.0025  max_mem: 3599M
[01/24 13:15:44 d2.utils.events]:  eta: 0:04:45  iter: 2359  total_loss: 0.7533  loss_cls: 0.06184  loss_box_reg: 0.283  loss_mask: 0.3165  loss_rpn_cls: 0.006892  loss_rpn_loc: 0.06055  time: 0.4072  data_time: 0.0074  lr: 0.0025  max_mem: 3599M
[01/24 13:15:52 d2.utils.events]:  eta: 0:04:36  iter: 2379  total_loss: 0.73  loss_cls: 0.06937  loss_box_reg: 0.2518  loss_mask: 0.3008  loss_rpn_cls: 0.005094  loss_rpn_loc: 0.08697  time: 0.4073  data_time: 0.0083  lr: 0.0025  max_mem: 3599M
[01/24 13:16:01 d2.utils.events]:  eta: 0:04:27  iter: 2399  total_loss: 0.7532  loss_cls: 0.08279  loss_box_reg: 0.2892  loss_mask: 0.2957  loss_rpn_cls: 0.005467  loss_rpn_loc: 0.06526  time: 0.4073  data_time: 0.0081  lr: 0.0025  max_mem: 3599M
[01/24 13:16:08 d2.utils.events]:  eta: 0:04:18  iter: 2419  total_loss: 0.7571  loss_cls: 0.07034  loss_box_reg: 0.2869  loss_mask: 0.3187  loss_rpn_cls: 0.005902  loss_rpn_loc: 0.06619  time: 0.4072  data_time: 0.0075  lr: 0.0025  max_mem: 3599M
[01/24 13:16:17 d2.utils.events]:  eta: 0:04:09  iter: 2439  total_loss: 0.7745  loss_cls: 0.07982  loss_box_reg: 0.2773  loss_mask: 0.3125  loss_rpn_cls: 0.006192  loss_rpn_loc: 0.08791  time: 0.4072  data_time: 0.0080  lr: 0.0025  max_mem: 3599M
[01/24 13:16:25 d2.utils.events]:  eta: 0:04:00  iter: 2459  total_loss: 0.7548  loss_cls: 0.06145  loss_box_reg: 0.2984  loss_mask: 0.3143  loss_rpn_cls: 0.00386  loss_rpn_loc: 0.06951  time: 0.4071  data_time: 0.0077  lr: 0.0025  max_mem: 3599M
[01/24 13:16:33 d2.utils.events]:  eta: 0:03:51  iter: 2479  total_loss: 0.7363  loss_cls: 0.06576  loss_box_reg: 0.2685  loss_mask: 0.3058  loss_rpn_cls: 0.003771  loss_rpn_loc: 0.07585  time: 0.4071  data_time: 0.0077  lr: 0.0025  max_mem: 3599M
[01/24 13:16:41 d2.utils.events]:  eta: 0:03:11  iter: 2499  total_loss: 0.7719  loss_cls: 0.06745  loss_box_reg: 0.28  loss_mask: 0.3067  loss_rpn_cls: 0.004944  loss_rpn_loc: 0.07438  time: 0.4070  data_time: 0.0075  lr: 0.0025  max_mem: 3599M
[01/24 13:16:49 d2.utils.events]:  eta: 0:03:03  iter: 2519  total_loss: 0.7665  loss_cls: 0.06719  loss_box_reg: 0.2701  loss_mask: 0.3091  loss_rpn_cls: 0.003799  loss_rpn_loc: 0.05953  time: 0.4070  data_time: 0.0082  lr: 0.0025  max_mem: 3599M
[01/24 13:16:57 d2.utils.events]:  eta: 0:02:56  iter: 2539  total_loss: 0.7577  loss_cls: 0.07123  loss_box_reg: 0.2673  loss_mask: 0.3058  loss_rpn_cls: 0.006099  loss_rpn_loc: 0.09618  time: 0.4071  data_time: 0.0075  lr: 0.0025  max_mem: 3599M
[01/24 13:17:05 d2.utils.events]:  eta: 0:03:15  iter: 2559  total_loss: 0.7365  loss_cls: 0.06262  loss_box_reg: 0.2806  loss_mask: 0.3122  loss_rpn_cls: 0.004006  loss_rpn_loc: 0.05422  time: 0.4072  data_time: 0.0076  lr: 0.0025  max_mem: 3599M
[01/24 13:17:14 d2.utils.events]:  eta: 0:03:06  iter: 2579  total_loss: 0.7619  loss_cls: 0.06285  loss_box_reg: 0.2812  loss_mask: 0.3219  loss_rpn_cls: 0.006236  loss_rpn_loc: 0.08666  time: 0.4072  data_time: 0.0072  lr: 0.0025  max_mem: 3599M
[01/24 13:17:22 d2.utils.events]:  eta: 0:02:57  iter: 2599  total_loss: 0.7731  loss_cls: 0.07393  loss_box_reg: 0.2865  loss_mask: 0.3332  loss_rpn_cls: 0.004507  loss_rpn_loc: 0.061  time: 0.4072  data_time: 0.0079  lr: 0.0025  max_mem: 3599M
[01/24 13:17:30 d2.utils.events]:  eta: 0:02:25  iter: 2619  total_loss: 0.7453  loss_cls: 0.07436  loss_box_reg: 0.278  loss_mask: 0.3081  loss_rpn_cls: 0.004401  loss_rpn_loc: 0.06051  time: 0.4072  data_time: 0.0079  lr: 0.0025  max_mem: 3599M
[01/24 13:17:38 d2.utils.events]:  eta: 0:02:17  iter: 2639  total_loss: 0.7367  loss_cls: 0.06714  loss_box_reg: 0.2739  loss_mask: 0.3024  loss_rpn_cls: 0.005877  loss_rpn_loc: 0.06885  time: 0.4071  data_time: 0.0081  lr: 0.0025  max_mem: 3599M
[01/24 13:17:46 d2.utils.events]:  eta: 0:02:20  iter: 2659  total_loss: 0.7465  loss_cls: 0.05305  loss_box_reg: 0.2743  loss_mask: 0.3195  loss_rpn_cls: 0.004345  loss_rpn_loc: 0.08189  time: 0.4072  data_time: 0.0079  lr: 0.0025  max_mem: 3599M
[01/24 13:17:55 d2.utils.events]:  eta: 0:02:22  iter: 2679  total_loss: 0.736  loss_cls: 0.05752  loss_box_reg: 0.2565  loss_mask: 0.3142  loss_rpn_cls: 0.005848  loss_rpn_loc: 0.07894  time: 0.4073  data_time: 0.0080  lr: 0.0025  max_mem: 3599M
[01/24 13:18:03 d2.utils.events]:  eta: 0:01:55  iter: 2699  total_loss: 0.7671  loss_cls: 0.0624  loss_box_reg: 0.2683  loss_mask: 0.3052  loss_rpn_cls: 0.004146  loss_rpn_loc: 0.07695  time: 0.4072  data_time: 0.0079  lr: 0.0025  max_mem: 3599M
[01/24 13:18:11 d2.utils.events]:  eta: 0:01:47  iter: 2719  total_loss: 0.7953  loss_cls: 0.0692  loss_box_reg: 0.2772  loss_mask: 0.3138  loss_rpn_cls: 0.004813  loss_rpn_loc: 0.07641  time: 0.4072  data_time: 0.0077  lr: 0.0025  max_mem: 3599M
[01/24 13:18:19 d2.utils.events]:  eta: 0:01:47  iter: 2739  total_loss: 0.7134  loss_cls: 0.05796  loss_box_reg: 0.2701  loss_mask: 0.2858  loss_rpn_cls: 0.004703  loss_rpn_loc: 0.06998  time: 0.4071  data_time: 0.0080  lr: 0.0025  max_mem: 3599M
[01/24 13:18:27 d2.utils.events]:  eta: 0:01:46  iter: 2759  total_loss: 0.7071  loss_cls: 0.06649  loss_box_reg: 0.2728  loss_mask: 0.2973  loss_rpn_cls: 0.00605  loss_rpn_loc: 0.05202  time: 0.4072  data_time: 0.0080  lr: 0.0025  max_mem: 3599M
[01/24 13:18:35 d2.utils.events]:  eta: 0:01:37  iter: 2779  total_loss: 0.7213  loss_cls: 0.06138  loss_box_reg: 0.2577  loss_mask: 0.306  loss_rpn_cls: 0.004518  loss_rpn_loc: 0.07574  time: 0.4071  data_time: 0.0077  lr: 0.0025  max_mem: 3599M
[01/24 13:18:43 d2.utils.events]:  eta: 0:01:28  iter: 2799  total_loss: 0.7137  loss_cls: 0.0788  loss_box_reg: 0.2527  loss_mask: 0.3145  loss_rpn_cls: 0.00386  loss_rpn_loc: 0.062  time: 0.4072  data_time: 0.0076  lr: 0.0025  max_mem: 3599M
[01/24 13:18:52 d2.utils.events]:  eta: 0:01:20  iter: 2819  total_loss: 0.746  loss_cls: 0.07518  loss_box_reg: 0.26  loss_mask: 0.3076  loss_rpn_cls: 0.004785  loss_rpn_loc: 0.06857  time: 0.4072  data_time: 0.0080  lr: 0.0025  max_mem: 3599M
[01/24 13:19:00 d2.utils.events]:  eta: 0:01:11  iter: 2839  total_loss: 0.6707  loss_cls: 0.06073  loss_box_reg: 0.2389  loss_mask: 0.2921  loss_rpn_cls: 0.004386  loss_rpn_loc: 0.05147  time: 0.4072  data_time: 0.0090  lr: 0.0025  max_mem: 3599M
[01/24 13:19:08 d2.utils.events]:  eta: 0:01:02  iter: 2859  total_loss: 0.6936  loss_cls: 0.0542  loss_box_reg: 0.2383  loss_mask: 0.2956  loss_rpn_cls: 0.005314  loss_rpn_loc: 0.07623  time: 0.4071  data_time: 0.0088  lr: 0.0025  max_mem: 3599M
[01/24 13:19:16 d2.utils.events]:  eta: 0:00:53  iter: 2879  total_loss: 0.6797  loss_cls: 0.06399  loss_box_reg: 0.2539  loss_mask: 0.306  loss_rpn_cls: 0.004842  loss_rpn_loc: 0.05527  time: 0.4070  data_time: 0.0075  lr: 0.0025  max_mem: 3599M
[01/24 13:19:23 d2.utils.events]:  eta: 0:00:44  iter: 2899  total_loss: 0.719  loss_cls: 0.06619  loss_box_reg: 0.257  loss_mask: 0.3058  loss_rpn_cls: 0.004704  loss_rpn_loc: 0.06113  time: 0.4069  data_time: 0.0078  lr: 0.0025  max_mem: 3599M
[01/24 13:19:32 d2.utils.events]:  eta: 0:00:35  iter: 2919  total_loss: 0.7184  loss_cls: 0.06804  loss_box_reg: 0.252  loss_mask: 0.2889  loss_rpn_cls: 0.003962  loss_rpn_loc: 0.07091  time: 0.4069  data_time: 0.0077  lr: 0.0025  max_mem: 3599M
[01/24 13:19:40 d2.utils.events]:  eta: 0:00:26  iter: 2939  total_loss: 0.681  loss_cls: 0.05947  loss_box_reg: 0.2415  loss_mask: 0.3007  loss_rpn_cls: 0.004218  loss_rpn_loc: 0.07667  time: 0.4070  data_time: 0.0079  lr: 0.0025  max_mem: 3599M
[01/24 13:19:48 d2.utils.events]:  eta: 0:00:17  iter: 2959  total_loss: 0.7122  loss_cls: 0.05969  loss_box_reg: 0.2504  loss_mask: 0.3078  loss_rpn_cls: 0.00646  loss_rpn_loc: 0.08837  time: 0.4070  data_time: 0.0081  lr: 0.0025  max_mem: 3599M
[01/24 13:19:56 d2.utils.events]:  eta: 0:00:08  iter: 2979  total_loss: 0.6653  loss_cls: 0.05044  loss_box_reg: 0.2434  loss_mask: 0.2963  loss_rpn_cls: 0.003653  loss_rpn_loc: 0.05362  time: 0.4069  data_time: 0.0078  lr: 0.0025  max_mem: 3599M
[01/24 13:20:05 d2.utils.events]:  eta: 0:00:00  iter: 2999  total_loss: 0.6853  loss_cls: 0.06781  loss_box_reg: 0.2578  loss_mask: 0.3007  loss_rpn_cls: 0.003891  loss_rpn_loc: 0.05239  time: 0.4069  data_time: 0.0080  lr: 0.0025  max_mem: 3599M
[01/24 13:20:05 d2.engine.hooks]: Overall training speed: 2998 iterations in 0:20:20 (0.4070 s / it)
[01/24 13:20:05 d2.engine.hooks]: Total training time: 0:20:23 (0:00:03 on hooks)
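Note how the `lr` column in the log climbs linearly from near zero up to the base rate of 0.0025 around iteration 1000 and stays flat afterwards. This matches detectron2's default linear warm-up schedule (`SOLVER.WARMUP_ITERS = 1000`, `SOLVER.WARMUP_FACTOR = 1/1000`). The helper `warmup_lr` below is an illustrative re-implementation of that schedule, not detectron2's own code, but it reproduces the logged values:

```python
def warmup_lr(iteration, base_lr=0.0025, warmup_iters=1000, warmup_factor=1.0 / 1000):
    """Linear warm-up: ramp from base_lr * warmup_factor up to base_lr,
    then hold base_lr once warm-up is over."""
    if iteration >= warmup_iters:
        return base_lr
    alpha = iteration / warmup_iters
    return base_lr * (warmup_factor * (1 - alpha) + alpha)

# These agree with the `lr` column in the log above:
print(round(warmup_lr(479), 7))   # 0.0011988  (the iter 479 line)
print(round(warmup_lr(499), 7))   # 0.0012488  (the iter 499 line)
print(warmup_lr(1019))            # 0.0025     (warm-up finished)
```

If training diverges early, lengthening this warm-up (or lowering `SOLVER.BASE_LR`) is usually the first knob to try.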

Save and download the trained model weights#

import os

final_model_file = os.path.join(cfg.OUTPUT_DIR, 'model_final.pth')
if IN_COLAB:
    # In Colab, trigger a browser download of the checkpoint
    from google.colab import files
    files.download(final_model_file)
else:
    # In a local Jupyter notebook, render a download link instead
    from IPython.display import FileLink, display
    display(FileLink(final_model_file))