Component crosstalk causes severe performance degradation in optical networks. In this paper, the influence of component crosstalk, which gives rise to signal-crosstalk beat noise, on the performance of a preamplified WDM receiver is investigated assuming both Gaussian and non-Gaussian models. The analysis shows that for a finite number of interfering channels the non-Gaussian model differs significantly from the Gaussian model in the predicted BER and in the optimum detection threshold that minimizes the BER, but as the number of interfering channels grows large the differences diminish. The effects on BER and on the optimum detection threshold for minimum BER are compared, summarized, and analyzed in both graphical and tabular form for the two models. © 2013 IEEE.
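As an illustration of the Gaussian-model side of the comparison, the sketch below evaluates the BER of a receiver via the Q-factor, treating the signal-crosstalk beat noise as an additional Gaussian variance term. The variance formula (worst-case polarization alignment, `N` equal-power interferers, each with crosstalk ratio `eps`) and all function names here are assumptions for illustration, not the paper's actual derivation:

```python
import math

def q_function(x):
    # Gaussian tail probability: Q(x) = 0.5 * erfc(x / sqrt(2))
    return 0.5 * math.erfc(x / math.sqrt(2))

def crosstalk_beat_variance(i_sig, eps, n_interferers):
    # Assumed signal-crosstalk beat-noise variance on a mark, for
    # n_interferers equal-power interferers with crosstalk ratio eps,
    # worst-case polarization: sigma_x^2 = 2 * N * eps * I_sig^2
    return 2.0 * n_interferers * eps * i_sig ** 2

def ber_gaussian(i1, i0, sigma1, sigma0):
    # Gaussian-approximation BER at the optimum threshold:
    # Q-factor = (I1 - I0) / (sigma1 + sigma0), BER = Q(Q-factor)
    q = (i1 - i0) / (sigma1 + sigma0)
    return q_function(q)

# Example: add crosstalk beat noise to a baseline mark-level variance.
i1, i0 = 1.0, 0.05          # normalized mark/space photocurrents
sigma0 = 0.03               # space-level noise (signal-independent terms)
sigma1_base = 0.06          # mark-level noise without crosstalk
sigma_x2 = crosstalk_beat_variance(i1, eps=1e-4, n_interferers=8)
sigma1 = math.sqrt(sigma1_base ** 2 + sigma_x2)
print(ber_gaussian(i1, i0, sigma1, sigma0))
```

Under this Gaussian approximation the beat-noise variance simply adds to the mark-level variance, which is what makes the model tractable; the paper's point is that for few interferers this approximation misestimates both the BER and the optimum threshold relative to the non-Gaussian treatment.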