Stream Graph

AI Model Training Resources Stream

A stream graph (a stacked area chart with a symmetric baseline) visualizing resource allocation during AI model training, rendered in a neon palette on a dark background.

Python
import matplotlib.pyplot as plt
import numpy as np

COLORS = {
    'background': '#0a0a0f',
    'text': '#ffffff',
    'grid': '#333333',
}

np.random.seed(1010)  # reproducible noise
epochs = np.arange(0, 100)

# Synthetic training resources (percent utilization per epoch)
gpu_compute = 60 + 20 * (1 - np.exp(-epochs / 20)) + np.random.normal(0, 5, 100)    # ramps up, then saturates
memory = 30 + 10 * np.log1p(epochs) + np.random.normal(0, 3, 100)                   # slow logarithmic growth
io_bandwidth = 20 + 10 * np.sin(epochs * np.pi / 25) + np.random.normal(0, 3, 100)  # periodic bursts
cpu_util = 15 + 5 * np.cos(epochs * np.pi / 30) + np.random.normal(0, 2, 100)       # mild oscillation
network = 10 + 8 * (epochs > 50) + np.random.normal(0, 2, 100)                      # step up after epoch 50

# Keep every series positive so the stream bands never invert
data = [np.clip(d, 1, None) for d in [gpu_compute, memory, io_bandwidth, cpu_util, network]]
resource_colors = ['#6CF527', '#27D3F5', '#F5276C', '#F5B027', '#4927F5']  # neon palette

fig, ax = plt.subplots(figsize=(14, 6), facecolor=COLORS['background'])
ax.set_facecolor(COLORS['background'])

# baseline='sym' centers the stack around y=0, giving the streamgraph shape
ax.stackplot(epochs, *data, colors=resource_colors, alpha=0.85, baseline='sym',
             labels=['GPU Compute', 'Memory', 'I/O Bandwidth', 'CPU', 'Network'])

ax.axhline(0, color=COLORS['grid'], linewidth=0.5, alpha=0.5)
ax.set_xlim(0, 99)

ax.set_title('AI Training Resource Utilization', color=COLORS['text'], fontsize=14, fontweight='bold', pad=15)
ax.set_xlabel('Training Epoch', color=COLORS['text'], fontsize=11)
ax.set_ylabel('Resource Usage (%)', color=COLORS['text'], fontsize=11)

ax.legend(loc='upper center', bbox_to_anchor=(0.5, -0.12), frameon=False, labelcolor=COLORS['text'], fontsize=9, ncol=5)

for spine in ax.spines.values():
    spine.set_visible(False)
ax.tick_params(colors=COLORS['text'], labelsize=9)

plt.tight_layout()
plt.subplots_adjust(bottom=0.18)
plt.show()
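The streamgraph look above comes entirely from the `baseline` argument of `stackplot`: with the default `'zero'` the areas stack upward from the x-axis, while `'sym'` centers the stack symmetrically around y=0. A minimal side-by-side sketch (variable names and the output filename are illustrative, not part of the chart above):

```python
import matplotlib
matplotlib.use('Agg')  # non-interactive backend so this runs in scripts
import matplotlib.pyplot as plt
import numpy as np

x = np.arange(50)
series = [np.abs(np.sin(x / 8)) + 0.5, np.abs(np.cos(x / 6)) + 0.5]  # strictly positive

fig, (ax_zero, ax_sym) = plt.subplots(1, 2, figsize=(10, 3))
# Classic stacked area: bands sit on top of y=0
polys_zero = ax_zero.stackplot(x, *series, baseline='zero')
# Symmetric baseline: the whole stack is mirrored around y=0 (streamgraph)
polys_sym = ax_sym.stackplot(x, *series, baseline='sym')
ax_zero.set_title("baseline='zero'")
ax_sym.set_title("baseline='sym'")
fig.savefig('baseline_comparison.png', dpi=150)
```

With `'sym'`, the lower boundary of the first band dips below zero (half the total sits under the axis), which is what gives the stream its ribbon-like symmetry; with `'zero'`, positive data never crosses the axis.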
Library: Matplotlib

Category: Time Series
