The output of a convolutional layer is typically passed through the ReLU activation function to introduce non-linearity into the model. ReLU takes the feature map and replaces every negative value with zero, leaving positive values unchanged. It has also been observed that as network depth increases, accuracy can saturate.
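To make the ReLU step concrete, here is a minimal sketch in NumPy, using a small made-up feature map purely for illustration:

```python
import numpy as np

# Hypothetical 3x3 feature map produced by a convolutional layer.
feature_map = np.array([[ 1.5, -0.3,  2.0],
                        [-1.2,  0.0,  0.7],
                        [ 3.1, -2.4, -0.1]])

# ReLU: replace every negative value with zero, keep positives unchanged.
activated = np.maximum(feature_map, 0)

print(activated)
# [[1.5 0.  2. ]
#  [0.  0.  0.7]
#  [3.1 0.  0. ]]
```

Deep learning frameworks provide the same operation as a layer (for example, `torch.nn.ReLU` in PyTorch), but the element-wise behavior is exactly what the sketch above shows.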