I observed a similar performance regression with the software renderer on my Debian Stretch (64-bit) system from GZDoom 3.3.2 to 3.4.1.
The CPU is an AMD Phenom II X4 945 (yeah, it's kinda old), and the GPU is an Nvidia GeForce GT 710, running Nvidia's proprietary driver, version 390.48. The screen resolution is 1366x768 in windowed mode.
The area I tested is the beginning of Doom II's map01. Here are the FPS values I observed with the various software rendering modes in both versions; the first value is from 3.3.2 (with "Software Canvas" set to OpenGL), the second from 3.4.1:
- Paletted, non-softpoly: ~230 fps -> ~120 fps
- Paletted, softpoly: ~110 fps -> ~60 fps
- True-color, non-softpoly: ~140 fps -> ~55 fps
- True-color, softpoly: ~50 fps -> ~52 fps
Unfortunately, I'm unable to get decent data from either the 'stat rendertimes' or the 'bench' console command: apart from the FPS and map coordinates (in the case of 'bench'), all the reported numbers are 0 or 0.000 or similar, which is certainly wrong. I have no idea why, and it happens with both GZDoom versions tested. I do get proper data when the OpenGL renderer is active, but this issue is about the software renderer, where I never get sane data.
I compile from source, and both 3.3.2 and 3.4.1 were Release builds. If any more information is needed, let me know.
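For reference, this is roughly how both versions were built. The tag names follow GZDoom's usual `g`-prefixed release tags, and the directory layout and job count are my own assumptions, not an exact transcript:

```shell
# Sketch of my build steps; paths and options are assumptions,
# following the standard GZDoom CMake workflow.
git checkout g3.4.1            # or g3.3.2 for the older build
mkdir -p build && cd build
cmake -DCMAKE_BUILD_TYPE=Release ..
make -j4                       # Phenom II X4 945: four cores
```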