Symbol error rate minimization using deep learning approaches for short-reach optical communication networks
Potì Luca
2024-01-01
Abstract
Short-reach optical communication networks have various applications in areas where high-speed connectivity is needed, for example, inter- and intra-data-center links, optical access networks, and indoor and in-building communication systems. Machine learning (ML) approaches provide a key solution for numerous challenging situations due to their robust decision-making, problem-solving, and pattern-recognition abilities. In this work, our focus is on utilizing deep learning models to minimize symbol error rates in short-reach optical communication setups. Various channel impairments, such as nonlinearity, chromatic dispersion (CD), and attenuation, are accurately modeled. Initially, we address the challenge of modeling a nonlinear channel. Subsequently, we harness a class of deep learning models called autoencoders (AEs) to facilitate communication over nonlinear channels. Furthermore, we investigate how the inclusion of a nonlinear channel within an autoencoder influences the received constellation as the optical fiber length increases. Another facet of our work involves the deployment of a deep neural network-based receiver operating over a channel impaired by chromatic dispersion. By gradually extending the fiber length, we explore its impact on the received constellation and, consequently, on the symbol error rate. Finally, we incorporate the split-step Fourier method (SSFM) to emulate the combined effects of nonlinearity, chromatic dispersion, and attenuation in the optical channel; detection over this channel is again performed by a neural network-based receiver. The outcome of this work is an evaluation and reduction of the symbol error rate as the length of the optical fiber is varied. Copyright © 2024 Iqbal, Ghafoor, Ahmad, Aljohani, Mirza, Aziz and Poti.
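The split-step Fourier method named in the abstract propagates the optical field by alternating a linear step (chromatic dispersion and attenuation, applied in the frequency domain) with a nonlinear step (Kerr phase rotation, applied in the time domain). The following is a minimal NumPy sketch of that idea, not the authors' implementation; the fiber parameters (β₂ = −21.7 ps²/km, γ = 1.3 W⁻¹km⁻¹, α = 0.2 dB/km) are typical standard single-mode fiber values assumed for illustration.

```python
import numpy as np

def ssfm(a0, length, dz, beta2, gamma, alpha, dt):
    """Propagate the complex baseband field a0 over `length` metres of fiber,
    alternating linear (CD + loss) and nonlinear (Kerr) steps of size dz.

    beta2 [s^2/m], gamma [1/(W m)], alpha [power loss in nepers/m], dt [s].
    """
    w = 2 * np.pi * np.fft.fftfreq(a0.size, d=dt)        # angular-frequency grid
    lin = np.exp((0.5j * beta2 * w**2 - alpha / 2) * dz) # CD + attenuation per step
    a = a0.astype(complex)
    for _ in range(int(round(length / dz))):
        a = np.fft.ifft(np.fft.fft(a) * lin)             # linear step (freq. domain)
        a = a * np.exp(1j * gamma * np.abs(a)**2 * dz)   # nonlinear step (time domain)
    return a

# Illustrative run: QPSK symbols over 2 km of assumed standard single-mode fiber.
rng = np.random.default_rng(0)
syms = np.exp(1j * (np.pi / 4 + np.pi / 2 * rng.integers(0, 4, 1024)))
a0 = np.sqrt(1e-3) * syms          # QPSK field at 0 dBm average power
a_out = ssfm(a0,
             length=2e3, dz=100.0,
             beta2=-21.7e-27,      # s^2/m  (-21.7 ps^2/km)
             gamma=1.3e-3,         # 1/(W m)
             alpha=0.2 / 4.343e3,  # 0.2 dB/km converted to nepers/m
             dt=1 / 10e9)          # one sample per symbol at 10 GBd
```

With one sample per symbol and no pulse shaping this only caricatures the channel, but it reproduces the qualitative behaviour the paper studies: nonlinear phase rotation and dispersion-induced spreading grow with fiber length and distort the received constellation that the autoencoder or DNN-based receiver must then classify.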
| File | Type | License | Size | Format | |
|---|---|---|---|---|---|
| Potì_fphy-12-1387284_2024.pdf (open access) | Publisher's version (PDF) | Creative Commons | 2.64 MB | Adobe PDF | View/Open |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.