Tom Clancy's Rainbow Six Siege (Void Edge update) gameplay test/benchmark on PC. Low, Medium, High, Very High, and Ultra in-game settings on a GT 1030 and i3 7100 (V-Sync off). Resolution: 768p. Updated: June 1st.
PC Specs:
CPU - i3 7100 @ 3.9 GHz (2 cores, 4 threads)
GPU - Gigabyte GeForce GT 1030 2 GB GDDR5 [Driver version: 432.00]
RAM - 16 GB (2x8) RAMSTA 2400 MHz DDR4 [Dual Channel]
HDD - Seagate Barracuda 1 TB 7200 RPM
SSD - RAMSTA 120 GB SATA 3 SSD
MBD - HP 82A1
OS - Windows 10 Pro (64-bit)
At Low and Ultra, I used 100% resolution scaling. Why? Selecting the Low preset automatically disables the temporal injection features (T-AA and resolution scaling), so I had to use custom in-game settings instead. With 50% render scale on Low at 768p, the game averages 100-110 FPS, but the effective resolution is then no longer 768p; it's something far lower.
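To make the "far lower than 768p" point concrete, here is a minimal sketch of the arithmetic, assuming (as I understand Siege's render scaling to work) that the percentage is applied to each axis of the output resolution; the function name is my own, not anything from the game.

```python
# Sketch: how a render-scale percentage shrinks the internal resolution.
# Assumption: the scale percentage applies per axis, so pixel count drops
# by the square of the factor (50% scale -> 25% of the pixels).
def effective_resolution(width, height, scale_percent):
    """Return the internal render resolution for a given scale percentage."""
    factor = scale_percent / 100
    return round(width * factor), round(height * factor)

# 768p (1366x768) at 50% render scale:
print(effective_resolution(1366, 768, 50))   # -> (683, 384)
# versus 100%:
print(effective_resolution(1366, 768, 100))  # -> (1366, 768)
```

So at 50% the game is internally rendering roughly 683x384 and upscaling it, which is why the image looks blurry despite the "768p" output.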
At Ultra with 50% render scale, the game delivers a frame rate similar to Very High, which makes that combination excessive for no benefit. And pairing high-quality textures with 50% render scale makes no sense at all: "enjoying" a blurry image while the textures consume all the VRAM of a 2 GB card is a poor trade. 100% resolution scaling is the only way to stop the blur.
Low, Medium, and High give almost the same frame rate, since the GPU isn't even at 100% usage at those presets; the CPU is the limiting factor there.
Similar GPUs: RX 550, GTX 750 Ti 2 GB, Vega 11, GTX 660, MX 150
At present, Towsif_8_12 has 21,397 views across 11 Rainbow Six Siege videos, and less than an hour's worth of Rainbow Six Siege footage has been uploaded to the channel. This makes up 1.73% of the content Towsif_8_12 has uploaded to YouTube.