Can anyone help? I'm having some trouble with this problem:

A signal extending from 0 to 15.0 kHz is sampled at a rate 20% greater than the minimum sampling rate given by the sampling theorem. Each sample is coded as an 8-bit binary word and transmitted as a sequence of eight binary symbols.

1. How many bits per second are generated by this process?
2. What is the minimum bandwidth required to transmit bits at this rate over a noise-free channel?
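In case it helps to check the numbers, here is a minimal sketch of the arithmetic as I understand it, assuming the minimum sampling rate is the Nyquist rate 2·f_max and that binary (two-level) signaling over a noise-free channel needs a bandwidth of half the bit rate (Nyquist signaling); the variable names are just mine.

```python
# Sketch of the arithmetic, assuming Nyquist minimum sampling rate (2 * f_max)
# and binary signaling, for which minimum bandwidth = bit_rate / 2.

f_max = 15_000          # highest signal frequency in Hz (given)
bits_per_sample = 8     # each sample coded as an 8-bit word (given)

nyquist_rate = 2 * f_max            # minimum sampling rate: 30 kHz
sampling_rate = 1.2 * nyquist_rate  # 20% above the minimum: 36 kHz

bit_rate = sampling_rate * bits_per_sample  # bits per second generated
min_bandwidth = bit_rate / 2                # two-level Nyquist signaling: R = 2B

print(f"Bit rate: {bit_rate / 1e3:.0f} kbit/s")          # 288 kbit/s
print(f"Minimum bandwidth: {min_bandwidth / 1e3:.0f} kHz")  # 144 kHz
```

With those assumptions I get 36,000 samples/s × 8 bits = 288 kbit/s for part 1, and 288/2 = 144 kHz for part 2; happy to be corrected if the intended approach is different.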