All convolutions within a dense block are ReLU-activated and use batch normalization. Channel-wise concatenation is only possible if the height and width dimensions of the feature maps remain unchanged, so all convolutions in a dense block have stride 1. Pooling layers are inserted between dense blocks for further dimensionality reduction.
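
To make this concrete, here is a minimal sketch of a dense block in PyTorch (the framework, growth rate, layer count, and the use of average pooling between blocks are illustrative assumptions, not details from the text): each layer applies batch normalization, ReLU, and a stride-1 convolution, then concatenates its output onto the running feature maps along the channel dimension.

```python
import torch
import torch.nn as nn


class DenseBlock(nn.Module):
    """Each layer receives the concatenation of all earlier feature maps."""

    def __init__(self, in_channels: int, growth_rate: int, num_layers: int):
        super().__init__()
        self.layers = nn.ModuleList()
        for i in range(num_layers):
            channels = in_channels + i * growth_rate
            self.layers.append(
                nn.Sequential(
                    nn.BatchNorm2d(channels),   # batch normalization
                    nn.ReLU(inplace=True),      # ReLU activation
                    # stride 1 with padding 1 keeps height and width
                    # unchanged, which is what makes channel-wise
                    # concatenation valid
                    nn.Conv2d(channels, growth_rate, kernel_size=3,
                              stride=1, padding=1, bias=False),
                )
            )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for layer in self.layers:
            new_features = layer(x)
            # concatenate along the channel dimension (dim=1)
            x = torch.cat([x, new_features], dim=1)
        return x


# Spatial downsampling happens between dense blocks, not inside them;
# average pooling is one plausible choice for that step.
block = DenseBlock(in_channels=64, growth_rate=32, num_layers=4)
pool = nn.AvgPool2d(kernel_size=2, stride=2)

x = torch.randn(1, 64, 32, 32)
out = pool(block(x))
print(out.shape)  # torch.Size([1, 192, 16, 16]): 64 + 4*32 = 192 channels
```

Note how the channel count grows by the growth rate at every layer while the 32x32 spatial size stays fixed inside the block; only the pooling layer between blocks halves the spatial dimensions.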