The Benefit of Split Nonlinearity Compensation for Optical Fiber Communications
November 12, 2015 · Declared Dead · International Conference on Intelligent Pervasive Computing
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Domanic Lavery, David Ives, Gabriele Liga, Alex Alvarado, Seb J. Savory, Polina Bayvel
arXiv ID
1511.04028
Category
physics.optics
Cross-listed
cs.IT
Citations
50
Venue
International Conference on Intelligent Pervasive Computing
Last Checked
1 month ago
Abstract
In this Letter we analyze the benefit of digital compensation of fiber nonlinearity, where the digital signal processing is divided between the transmitter and receiver. The application of the Gaussian noise model indicates that, where there are two or more spans, it is always beneficial to split the nonlinearity compensation. The theory is verified via numerical simulations, investigating transmission of single channel 50 GBd polarization division multiplexed 256-ary quadrature amplitude modulation over 100 km standard single mode fiber spans, using lumped amplification. For this case, the additional increase in mutual information achieved over transmitter- or receiver-side nonlinearity compensation is approximately 1 bit for distances greater than 2000 km. Further, it is shown, theoretically, that the SNR gain for long distances and high bandwidth transmission is 1.5 dB versus transmitter- or receiver-based nonlinearity compensation.
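The scaling behind the reported 1.5 dB figure can be illustrated with a toy Gaussian-noise (GN) model calculation. This is our own sketch, not the authors' code: the parameter values below are hypothetical, and the factor-of-~2.8 reduction in the residual nonlinear coefficient is inferred from the peak-SNR scaling, not taken from the paper.

```python
import math

# Toy GN-model illustration (hypothetical parameters, NOT the paper's setup).
# In the GN model the received SNR at launch power P is
#   SNR(P) = P / (P_ase + eta * P**3),
# where P_ase is the accumulated ASE noise power and eta the nonlinear
# interference coefficient. Maximizing over P gives
#   P_opt = (P_ase / (2 * eta)) ** (1/3),
# so the peak SNR scales as eta ** (-1/3).

def peak_snr_db(eta, p_ase):
    """Peak GN-model SNR (dB) over launch power, for nonlinear coefficient eta."""
    p_opt = (p_ase / (2.0 * eta)) ** (1.0 / 3.0)
    snr = p_opt / (p_ase + eta * p_opt ** 3)
    return 10.0 * math.log10(snr)

# Hypothetical numbers in arbitrary but consistent units (mW):
p_ase = 1e-2
eta_one_sided = 1e-3  # residual nonlinear coefficient with Tx- or Rx-only NLC

# Via the eta**(-1/3) scaling, a 1.5 dB peak-SNR gain corresponds to reducing
# the residual nonlinear coefficient by a factor of roughly 2.8:
gain_db = peak_snr_db(eta_one_sided / 2.8, p_ase) - peak_snr_db(eta_one_sided, p_ase)
print(round(gain_db, 2))  # -> 1.49
```

Note the gain depends only on the ratio of the two coefficients, not on the absolute `p_ase` or `eta` values chosen, since peak SNR is proportional to `eta ** (-1/3)` throughout.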
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt · physics.optics
Training of photonic neural networks through in situ backpropagation · 👻 Ghosted
Experimental robustness of Fourier Ptychography phase retrieval algorithms · 👻 Ghosted
The physics of optical computing · 👻 Ghosted
Freeform Diffractive Metagrating Design Based on Generative Adversarial Networks · 👻 Ghosted
Scalable Optical Learning Operator
Died the same way · 👻 Ghosted
Language Models are Few-Shot Learners · 👻 Ghosted
PyTorch: An Imperative Style, High-Performance Deep Learning Library · 👻 Ghosted
XGBoost: A Scalable Tree Boosting System · 👻 Ghosted