In this letter, we propose a machine learning (ML) approach to peak-to-average power ratio (PAPR) reduction in the downlink of a digital massive multiple-input multiple-output (MIMO) system with orthogonal frequency division multiplexing (OFDM) signaling. Substantial PAPR reduction is achieved when the noise introduced by PAPR reduction is distributed across both the frequency and spatial domains subject to the maximum allowed error vector magnitude (EVM). However, such an approach requires time-consuming manual tuning of hyperparameters for each scenario. To overcome this problem, we first compute an optimal hyperparameter function and then derive an efficient approximation of it, enabling PAPR reduction based on classical ML approaches.
- machine learning
- PAPR reduction
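To make the quantity under discussion concrete, the sketch below (with assumed OFDM parameters, not taken from the letter) computes the PAPR of an oversampled OFDM symbol and applies simple amplitude clipping as a baseline PAPR-reduction step; the clipping error plays the role of the added noise that an EVM budget would bound. This is an illustration only, not the letter's ML-based method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical OFDM parameters (illustrative, not from the letter)
n_subcarriers = 256
oversampling = 4

# Random QPSK symbols on the active subcarriers
symbols = (rng.choice([-1.0, 1.0], n_subcarriers)
           + 1j * rng.choice([-1.0, 1.0], n_subcarriers)) / np.sqrt(2)

# Time-domain OFDM signal via oversampled IFFT (zero-padded spectrum)
spectrum = np.zeros(n_subcarriers * oversampling, dtype=complex)
spectrum[:n_subcarriers // 2] = symbols[:n_subcarriers // 2]
spectrum[-(n_subcarriers // 2):] = symbols[n_subcarriers // 2:]
x = np.fft.ifft(spectrum) * oversampling

def papr_db(signal):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    power = np.abs(signal) ** 2
    return 10.0 * np.log10(power.max() / power.mean())

# Amplitude clipping: limit the envelope to a threshold relative to the
# RMS level; the resulting error is the "noise" constrained by an EVM limit.
rms = np.sqrt(np.mean(np.abs(x) ** 2))
threshold = 1.6 * rms
amp = np.abs(x)
clipped = x * np.minimum(amp, threshold) / np.maximum(amp, 1e-12)

print(f"PAPR before clipping: {papr_db(x):.2f} dB")
print(f"PAPR after clipping:  {papr_db(clipped):.2f} dB")
```

The clipping ratio (1.6 here) is the kind of hyperparameter the letter proposes to tune automatically: too aggressive a threshold violates the EVM constraint, too lax a threshold leaves the PAPR high.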