Using the formalism of information theory, we analyze the mechanism of information transduction in a simple one-step signaling cascade S→X representing a gene regulatory network. Approximating the signaling channel as Gaussian, we describe the dynamics using Langevin equations. Upon discretization, we calculate the associated second moments for linear and nonlinear regulation of the output by the input, which follows a birth-death process. While the mutual information between the input and the output characterizes the channel capacity, the Fano factor of the output shows how internal and external fluctuations combine at the output level. To quantify the contribution of the present state of the input to predicting the future output, we compute the transfer entropy. We find that higher transfer entropy is accompanied by greater propagation of external fluctuations (quantified by the Fano factor of the output) from the input to the output. We also observe that a low input population of signaling molecules S, which fluctuates relatively slowly compared with its downstream (target) species X, is maximally able to predict (as quantified by the transfer entropy) the future state of the output. Our computations further reveal that as the input-output interaction becomes more linear, all three metrics (mutual information, Fano factor, and transfer entropy) attain relatively larger magnitudes. © 2018 American Physical Society.
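The quantities discussed above can be estimated numerically. Below is a minimal sketch that simulates the one-step S→X cascade with an exact stochastic (Gillespie) algorithm and, from the sampled steady state, computes the output Fano factor and a Gaussian-approximation estimate of the input-output mutual information. The rate constants, initial conditions, and the use of a Gillespie simulation (rather than the paper's Langevin treatment) are all illustrative assumptions, not the paper's actual parameters or method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical rate constants (illustrative only, not from the paper):
# S is produced at rate k_s and degraded at g_s*S (slow input),
# X is produced at rate k_x*S and degraded at g_x*X (fast output).
k_s, g_s = 2.0, 0.2
k_x, g_x = 5.0, 1.0

def gillespie(t_end=500.0, burn_in=100.0, sample_dt=1.0):
    """Exact stochastic simulation of the one-step cascade S -> X."""
    t, S, X = 0.0, 10, 50
    samples, next_sample = [], burn_in
    while t < t_end:
        r1, r2, r3, r4 = k_s, g_s * S, k_x * S, g_x * X
        total = r1 + r2 + r3 + r4
        t += rng.exponential(1.0 / total)
        # Record the pre-reaction state at every sampling time passed.
        while next_sample <= t and next_sample < t_end:
            samples.append((S, X))
            next_sample += sample_dt
        u = rng.random() * total
        if u < r1:
            S += 1
        elif u < r1 + r2:
            S -= 1
        elif u < r1 + r2 + r3:
            X += 1
        else:
            X -= 1
    return np.array(samples, dtype=float)

traj = gillespie()
S_t, X_t = traj[:, 0], traj[:, 1]

# Fano factor of the output: variance over mean at steady state.
fano_x = X_t.var() / X_t.mean()

# Gaussian-channel mutual information from the S-X correlation (in nats).
rho = np.corrcoef(S_t, X_t)[0, 1]
mi = -0.5 * np.log(1.0 - rho ** 2)
print(f"Fano(X) = {fano_x:.2f}, I(S;X) ~ {mi:.3f} nats")
```

With these illustrative rates, noise propagated from the slow input makes the output super-Poissonian (Fano factor above 1), consistent with the role the Fano factor plays above as a readout of external-fluctuation propagation.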