Fig 1.
Network slicing architecture.
Table 1.
Comparison of 5G and other networks.
Table 2.
Overview of the existing related models.
Fig 2.
Our INBSI model for 5G network slicing, utilizing VAE for anomaly detection, CNN with NADAM for slice prediction, and SHAP/LIME for interpretability to ensure efficient resource allocation and QoS compliance.
Fig 3.
Graphical representation of our proposed INBSI system.
Fig 4.
The proposed Network Bandwidth Slicing Identification (INBSI) system’s top-level paradigm.
Fig 5.
Working methodology of our proposed INBSI system.
Fig 6.
Variational autoencoder architecture.
Fig 7.
The proposed model architecture using the NADAM optimizer.
Table 3.
VAE performance baseline.
Fig 8.
Anomaly detection of the constructed data.
Fig 9.
Training Loss of VAE.
Fig 10.
Learning curves for the ML models.
Learning curves demonstrate model stability as the training data grows, which is critical for dynamic slicing, where traffic patterns evolve over time.
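As a minimal sketch of how such learning curves can be produced, the snippet below uses scikit-learn's `learning_curve` utility; the features `X`, labels `y`, and the decision-tree estimator are placeholder assumptions standing in for the actual slicing dataset and models.

```python
# Hypothetical sketch: computing a learning curve to check model
# stability as the training set grows (X, y are placeholder data).
import numpy as np
from sklearn.model_selection import learning_curve
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 8))                # placeholder features
y = (X[:, 0] + X[:, 1] > 0).astype(int)      # placeholder labels

sizes, train_scores, val_scores = learning_curve(
    DecisionTreeClassifier(max_depth=3), X, y,
    train_sizes=np.linspace(0.2, 1.0, 5), cv=5)

# A stable model shows the cross-validated score converging
# toward the training score as the training set grows.
print(sizes)
print(val_scores.mean(axis=1).round(2))
```

Plotting the mean training and validation scores against `sizes` yields the curves shown in the figure: a narrowing gap between the two indicates the model generalizes well as more traffic samples become available.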
Fig 11.
Confusion Matrices for the ML models.
Fig 12.
Classification Report for the ML models.
Fig 13.
Confusion Matrix and Classification report for DL models and Proposed CNN Model.
Fig 14.
Training accuracy and loss curves for the DL models with our proposed model.
Fig 15.
Comparison of the models.
Fig 16.
Achieved scores of these models based on performance metrics.
Fig 17.
2D PCA and t-SNE plots showing distinct clusters for network slices, indicating effective feature extraction and slice differentiation by the INBSI model.
Fig 18.
3D PCA and t-SNE plots demonstrating well-separated clusters for network slices, further validating the INBSI model’s ability to learn meaningful, discriminative feature representations.
Table 4.
Comparison of the models.
Table 5.
Inference Time Comparison of Models (Lower is Better).
Table 6.
Statistical Analysis of Model Performance.
Fig 19.
LIME interpretability with decision tree.
Fig 20.
LIME interpretability with our proposed CNN.
Fig 21.
LIME interpretability with MLP.
Fig 22.
SHAP feature importance and summary plot of proposed CNN.
Fig 23.
SHAP feature importance and summary plot of proposed CNN model.
Fig 24.
The tree-structured output of the XGBoost classifier on the SHAP model.
Fig 25.
Slice distribution using our proposed hybrid CNN model.
Table 7.
Comparison of the models used.