Up-sampling with Transposed Convolution

Naoki
6 min read · Nov 13, 2017

If you’ve heard about the transposed convolution and are confused about what it actually means, this article is written for you.

The content of this article is as follows:

  • The Need for Up-sampling
  • Why Transposed Convolution?
  • Convolution Operation
  • Going Backward
  • Convolution Matrix
  • Transposed Convolution Matrix
  • Summary

The notebook is available on my GitHub.

The Need for Up-sampling

When we use neural networks to generate images, the process usually involves up-sampling from a low resolution to a high resolution.

There are various methods to conduct up-sampling operations:

  • Nearest neighbor interpolation
  • Bi-linear interpolation
  • Bi-cubic interpolation

All of these methods use an interpolation rule that we must fix when deciding on the network architecture. It is like manual feature engineering: the up-sampling behavior is predetermined, so there is nothing for the network to learn, as the sketch below illustrates.
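To make this concrete, here is a minimal sketch (not part of the original notebook) of one such fixed method: 2x nearest-neighbor up-sampling in plain NumPy. The up-sampling rule is hard-coded by hand, so there are no parameters the network could learn.

```python
import numpy as np

def nearest_neighbor_upsample(x, scale=2):
    # Copy every pixel into a scale-by-scale block.
    # The rule is fixed in advance; nothing here is learnable.
    return np.repeat(np.repeat(x, scale, axis=0), scale, axis=1)

low_res = np.array([[1, 2],
                    [3, 4]])

print(nearest_neighbor_upsample(low_res))
# [[1 1 2 2]
#  [1 1 2 2]
#  [3 3 4 4]
#  [3 3 4 4]]
```

Bilinear and bi-cubic interpolation differ only in how the new pixel values are computed; in all three cases the interpolation scheme is chosen by the designer, not learned from data.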

Why Transposed Convolution?
