5G ML Research: Beam Selection
Master's research project optimizing 5G beam selection using machine learning algorithms.
Python · PyTorch · Scikit-learn · Pandas · Matplotlib
Problem
In 5G mmWave networks, selecting the optimal beam pair between the base station and user equipment is critical but incurs high overhead if done via exhaustive search.
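The overhead argument can be made concrete with a back-of-the-envelope sketch (the codebook sizes below are illustrative, not the project's actual configuration): an exhaustive sweep must measure every TX/RX beam pair, while an ML-assisted sweep only measures a short candidate list proposed by the model.

```python
# Illustrative overhead comparison: exhaustive sweep vs. ML-assisted sweep.

def exhaustive_measurements(n_tx_beams: int, n_rx_beams: int) -> int:
    """Beam-pair measurements needed by a full exhaustive sweep."""
    return n_tx_beams * n_rx_beams

def ml_assisted_measurements(top_k: int) -> int:
    """Measurements needed when a model proposes top_k candidate pairs."""
    return top_k

n_tx, n_rx = 64, 4                            # hypothetical codebook sizes
full = exhaustive_measurements(n_tx, n_rx)    # 64 * 4 = 256 measurements
short = ml_assisted_measurements(3)           # measure only the top-3 guesses
print(full, short, f"reduction: {1 - short / full:.1%}")
```

The actual reduction depends on the codebook sizes and the chosen k; the point is that measurement cost drops from the product of the codebook sizes to a small constant.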
Approach
I investigated machine learning models that predict the best beam from the user's location and channel state information (CSI).
- Dataset: Used the DeepMIMO dataset.
- Models: Trained Random Forest and Neural Network models.
- Evaluation: Compared models on top-k prediction accuracy and on beam-search time reduction relative to exhaustive search.
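The Random Forest baseline can be sketched as follows. This is a minimal stand-in, not the project's code: synthetic user positions replace the DeepMIMO samples, and a toy angle-based rule generates the "best beam" labels.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for DeepMIMO: (x, y) user positions, and the label is
# the index of the best beam. Toy labelling rule: the best beam follows the
# angle from a base station at the origin to the user.
rng = np.random.default_rng(0)
n_samples, n_beams = 2000, 16
X = rng.uniform(0, 100, size=(n_samples, 2))      # user positions in metres
angles = np.arctan2(X[:, 1], X[:, 0])             # in [0, pi/2] here
y = np.floor(angles / (np.pi / 2) * n_beams).astype(int).clip(0, n_beams - 1)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(f"top-1 accuracy: {clf.score(X_te, y_te):.2f}")
```

With real DeepMIMO data the features would be positions and/or CSI vectors, but the fit/predict workflow is the same.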
Tools
- Python: Core language for the data science stack.
- PyTorch: Deep learning models.
- Scikit-learn: Traditional ML models (e.g. Random Forest).
- Pandas & Matplotlib: Data handling and visualization.
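A hypothetical sketch of the neural-network side, assuming a small MLP classifier that maps a feature vector (e.g. position plus coarse CSI features) to one logit per beam in the codebook; the layer sizes and feature dimension here are placeholders, not the project's architecture.

```python
import torch
import torch.nn as nn

class BeamMLP(nn.Module):
    """Small MLP: feature vector -> one logit per candidate beam."""
    def __init__(self, in_dim: int, n_beams: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_beams),   # logits over the beam codebook
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

model = BeamMLP(in_dim=4, n_beams=16)         # placeholder dimensions
logits = model(torch.randn(8, 4))             # batch of 8 feature vectors
top3 = logits.topk(3, dim=1).indices          # top-3 candidate beams each
print(logits.shape, top3.shape)
```

Taking `topk` over the logits is what turns the classifier into a candidate-list generator for the reduced beam sweep.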
Output & Impact
- Achieved 90% top-3 accuracy in beam prediction.
- Reduced the beam search overhead by approximately 70% compared to exhaustive search.
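For clarity, the top-k metric above counts a sample as correct if the true best beam appears anywhere among the model's k highest-scored beams. A minimal NumPy implementation (the scores and labels below are made-up examples):

```python
import numpy as np

def top_k_accuracy(scores: np.ndarray, y_true: np.ndarray, k: int) -> float:
    """Fraction of samples whose true beam is among the k top-scored beams."""
    top_k = np.argsort(scores, axis=1)[:, -k:]        # k best beams per row
    hits = (top_k == y_true[:, None]).any(axis=1)
    return float(hits.mean())

scores = np.array([[0.1, 0.6, 0.2, 0.1],
                   [0.5, 0.3, 0.1, 0.1],
                   [0.2, 0.1, 0.6, 0.1]])
y = np.array([1, 1, 2])
print(top_k_accuracy(scores, y, k=1))   # 2/3: row 1's top score is beam 0
print(top_k_accuracy(scores, y, k=2))   # 1.0: beam 1 is row 1's second best
```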
What I Learned
- Data preprocessing is often the most critical step in an ML pipeline.
- There is always a trade-off between model accuracy and inference latency, especially in real-time systems like 5G.