October 28, 2019


lxtGH/OctaveConv_pytorch

PyTorch implementation of recently proposed convolution operators

repo name lxtGH/OctaveConv_pytorch
repo link https://github.com/lxtGH/OctaveConv_pytorch
homepage
language Python
size (curr.) 3216 kB
stars (curr.) 506
created 2019-04-16
license MIT License

Beyond Convolution

OctaveConv_pytorch

PyTorch implementation of recent operators

This is a third-party (unofficial) implementation of the following papers.

  1. Drop an Octave: Reducing Spatial Redundancy in Convolutional Neural Networks with Octave Convolution (ICCV 2019). paper (see the sketch after this list)
  2. Adaptively Connected Neural Networks (CVPR 2019). paper
  3. Res2Net: A New Multi-scale Backbone Architecture (PAMI 2019). paper
  4. ScaleNet: Data-Driven Neuron Allocation for Scale Aggregation Networks (CVPR 2019). paper
  5. SRM: A Style-based Recalibration Module for Convolutional Neural Networks. paper
  6. SENet: Squeeze-and-Excitation Networks (CVPR 2018). paper
  7. GENet: Exploiting Feature Context in Convolutional Neural Networks (NIPS 2018). paper
  8. ECA-Net: Efficient Channel Attention for Deep Convolutional Neural Networks. paper
  9. SK-Net: Selective Kernel Networks (CVPR 2019). paper
  10. More networks will be added.
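
The following is a minimal sketch of the Octave Convolution idea from paper 1, assuming the feature map is already split into a high-frequency tensor x_h and a low-frequency tensor x_l kept at half resolution. The class name, the alpha split ratio, and the nearest-neighbour upsampling are illustrative assumptions, not the repo's actual implementation (see the OCtaveResnet module under lib/nn for that).

import torch
import torch.nn as nn
import torch.nn.functional as F

class OctaveConvSketch(nn.Module):
    def __init__(self, in_ch, out_ch, kernel_size=3, alpha=0.5):
        super().__init__()
        # split channels into low-frequency and high-frequency groups
        in_l, out_l = int(alpha * in_ch), int(alpha * out_ch)
        in_h, out_h = in_ch - in_l, out_ch - out_l
        pad = kernel_size // 2
        # four convolution paths: high->high, high->low, low->low, low->high
        self.h2h = nn.Conv2d(in_h, out_h, kernel_size, padding=pad)
        self.h2l = nn.Conv2d(in_h, out_l, kernel_size, padding=pad)
        self.l2l = nn.Conv2d(in_l, out_l, kernel_size, padding=pad)
        self.l2h = nn.Conv2d(in_l, out_h, kernel_size, padding=pad)

    def forward(self, x_h, x_l):
        # high-frequency output: h->h plus upsampled l->h
        y_h = self.h2h(x_h) + F.interpolate(self.l2h(x_l), scale_factor=2, mode="nearest")
        # low-frequency output: l->l plus pooled h->l (stays at half resolution)
        y_l = self.l2l(x_l) + self.h2l(F.avg_pool2d(x_h, 2))
        return y_h, y_l

# example with 64 channels split 50/50: x_h at 56x56, x_l at 28x28
y_h, y_l = OctaveConvSketch(64, 64)(torch.randn(1, 32, 56, 56), torch.randn(1, 32, 28, 28))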

Plan

  1. Add a Res2Net block with SE layer. (done)
  2. Add Adaptive Convolution: both pixel-aware and dataset-aware. (done)
  3. Add training code for ImageNet. (done)
  4. Add SE-like models. (done; a minimal SE-layer sketch follows this list)
  5. Keep tracking newly proposed operators. (ongoing)
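
Since plan items 1 and 4 revolve around SE-style layers, here is a minimal sketch of a Squeeze-and-Excitation layer written from the SE-Net paper rather than copied from this repo; the class name is an assumption and the reduction ratio of 16 is the paper's default.

import torch.nn as nn

class SELayer(nn.Module):
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        # squeeze: global average pool each channel to a single descriptor
        s = x.mean(dim=(2, 3))
        # excite: per-channel gates in (0, 1), broadcast back over H x W
        w = self.fc(s).view(b, c, 1, 1)
        return x * w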

Usage

Check the model files under the lib/nn folder.

from lib.nn.OCtaveResnet import resnet50
from lib.nn.res2net import se_resnet50
from lib.nn.AdaptiveConvResnet import PixelAwareResnet50, DataSetAwareResnet50

# pick one of the backbones below; each constructor builds a full network
model = resnet50().cuda()
model = se_resnet50().cuda()
model = PixelAwareResnet50().cuda()
model = DataSetAwareResnet50().cuda()
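
Each constructor above builds a complete backbone, so only one of the assignments is needed at a time. A quick smoke test might look like the following; the 224x224 input size and 1000-way output are assumptions based on standard ImageNet settings, not something stated in the repo.

import torch
from lib.nn.OCtaveResnet import resnet50

model = resnet50().cuda()
model.eval()

# one dummy ImageNet-sized batch
x = torch.randn(1, 3, 224, 224).cuda()
with torch.no_grad():
    logits = model(x)
print(logits.shape)  # expected: torch.Size([1, 1000])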

Training

See the exp folder for detailed information.
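
The exp folder itself is not reproduced here; as a rough orientation, a single supervised training step with one of the models above could look like the sketch below. The SGD hyper-parameters are common ImageNet defaults and an assumption on my part, not the values used in exp.

import torch
import torch.nn as nn
from lib.nn.res2net import se_resnet50

model = se_resnet50().cuda()
criterion = nn.CrossEntropyLoss().cuda()
# assumed hyper-parameters; the scripts under exp define the real ones
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9, weight_decay=1e-4)

def train_step(images, targets):
    # one optimisation step on a batch of images and integer class labels
    model.train()
    images, targets = images.cuda(), targets.cuda()
    optimizer.zero_grad()
    loss = criterion(model(images), targets)
    loss.backward()
    optimizer.step()
    return loss.item()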

CheckPoint

Reference and Citation:

  1. OctaveConv: MXNet implementation here
  2. AdaptiveConv: Official TensorFlow implementation here
  3. ScaleNet: here
  4. SGENet: here

Please consider citing the authors' papers when using this code in your research.

License

MIT License