Use dd to write the 50GB file to the raw disk, bypassing OS cache.
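A minimal sketch of such a dd run (the demo below writes a small file so it is safe to execute anywhere; for the real 50GB raw-disk test you would point `of=` at your target block device, e.g. `/dev/sdX` — a destructive operation — and raise `count` accordingly):

```shell
# Demo: write 16 x 1MB blocks of random data and flush before exiting.
# Real 50GB raw-disk version: of=/dev/sdX bs=1M count=51200 oflag=direct
# (oflag=direct bypasses the OS page cache, but requires a block device
#  or a filesystem that supports direct I/O)
dd if=/dev/urandom of=dd_test.bin bs=1M count=16 conv=fdatasync
```

`conv=fdatasync` forces a flush before dd reports its throughput, so the number reflects the disk rather than the page cache.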
```powershell
# Generates random data (slower, but realistic for encrypted traffic)
$buffer = New-Object byte[] (1MB)
$rng    = [System.Security.Cryptography.RandomNumberGenerator]::Create()
$stream = [System.IO.File]::OpenWrite('D:\50GB_random.bin')
for ($i = 0; $i -lt 50 * 1024; $i++) {   # 51,200 x 1MB = 50GB
    $rng.GetBytes($buffer)
    $stream.Write($buffer, 0, $buffer.Length)
}
$stream.Close()
```

Warning: Random generation on 50GB takes significant CPU time. Use the fsutil method for pure throughput testing.

Best for: DevOps, server admins, and data scientists
It is the "Goldilocks" size of synthetic data: too large for RAM caching (making it a true disk/network test), small enough to generate quickly on modern SSDs, and large enough to expose thermal throttling in NVMe drives or buffer bloat in routers.
The dd command has been the king of synthetic files for 40 years.
Upload your 50GB file to an S3 bucket using the AWS CLI.
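A sketch of that upload (the bucket name `my-test-bucket` is a placeholder, and the multipart values shown are illustrative; the CLI splits large objects into multipart chunks automatically, but raising the chunk size and concurrency helps at 50GB):

```shell
# Tune multipart settings for one large object (values are illustrative)
aws configure set default.s3.multipart_chunksize 64MB
aws configure set default.s3.max_concurrent_requests 20

# Upload the 50GB test file (my-test-bucket is a placeholder)
aws s3 cp ./50GB.bin s3://my-test-bucket/50GB.bin
```

This command requires configured AWS credentials and network access, so treat it as a template rather than a drop-in script.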
On random 50GB data, ZSTD will finish 5x faster than Gzip with similar ratios.

Scenario 4: Disk Throttling & Thermal Testing

NVMe SSDs have incredible burst speeds (7,000 MB/s), but after writing 20-30GB, the controller heats up and the SLC cache fills. The drive drops to "TLC direct write" speeds (1,500 MB/s).
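One way to see that knee in practice is to append fixed-size chunks and time each write; on a throttling drive the per-chunk time jumps once the SLC cache is exhausted. A sketch, assuming GNU dd and coreutils (the demo sizes are tiny so it runs anywhere; raise CHUNK_MB to 1024 and CHUNKS to 50 for a real 50GB sweep):

```shell
#!/bin/sh
# Append CHUNK_MB-sized chunks to one file and report each write's duration.
CHUNK_MB=8    # demo size; use 1024 (1GB) for real testing
CHUNKS=4      # demo count; use 50 for a full 50GB sweep
for i in $(seq 1 "$CHUNKS"); do
  start=$(date +%s%N)
  dd if=/dev/zero of=throttle_test.bin bs=1M count="$CHUNK_MB" \
     oflag=append conv=notrunc,fdatasync 2>/dev/null
  end=$(date +%s%N)
  echo "chunk $i: $(( (end - start) / 1000000 )) ms"
done
```

A healthy drive prints roughly constant times; a thermally limited or cache-exhausted one shows a sharp step up partway through the run.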