This is said with almost no recent prior research, so probably stupid, but...
From what I remember from the last time I played with linear regulator supplies, the typical setup is: fixed AC voltage > rectifier > input caps > regulator (voltage divider/pot as the reference) > output caps > banana jacks.
With a fixed AC input voltage, the power dissipation in the regulator gets worse the lower your output voltage, because it's (V_in - V_out) × I_out. So for example, if your rectified DC before the regulator is 30V and you're outputting 5V @ 2A, the regulator is burning up 50W ((30V - 5V) × 2A) in order to deliver only 10W (5V × 2A).
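Just to put numbers on it, here's a quick Python sketch of that same back-of-the-envelope math (the 30V / 5V / 2A values are just the ones from the example above):

```python
# Rough linear-regulator dissipation estimate: power burned in the pass
# element is (V_in - V_out) * I_out, power delivered to the load is V_out * I_out.
def regulator_dissipation(v_in, v_out, i_out):
    return (v_in - v_out) * i_out

v_in = 30.0   # rectified/filtered DC ahead of the regulator, volts
v_out = 5.0   # output setting, volts
i_out = 2.0   # load current, amps

p_diss = regulator_dissipation(v_in, v_out, i_out)
p_load = v_out * i_out
print(f"dissipated: {p_diss:.0f} W, delivered: {p_load:.0f} W, "
      f"efficiency: {p_load / (p_diss + p_load):.0%}")
# -> dissipated: 50 W, delivered: 10 W, efficiency: 17%
```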
What if, instead of a fixed AC input and a variable voltage divider reference (pot), you used a variable AC input and a fixed voltage divider? In dollars-and-cents terms this probably only makes sense if you have a spare variac with no other future, but it might simplify the design and make for a more energy-efficient supply. Just set the fixed voltage divider so there's enough headroom to remove the ripple, and then control the output voltage from the variac knob. The divider ratio scales with whatever the AC input voltage is, so performance should be even throughout the range, with only slightly more dissipation at higher output voltages.
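For what it's worth, here's a rough sketch of how the dissipation would compare across the output range, assuming the variac is adjusted so the rectified DC always sits a fixed few volts above the output (the 3V headroom figure is just a guess, not a real design number):

```python
# Illustrative comparison: fixed 30 V DC input vs. a variac-tracked input
# that keeps a constant headroom above the output. All numbers are assumptions.
HEADROOM = 3.0    # assumed volts of headroom to ride out ripple/dropout
I_OUT = 2.0       # amps
FIXED_V_IN = 30.0 # volts of rectified DC with the fixed-transformer design

print(f"{'V_out':>6} {'fixed-input W':>14} {'tracked-input W':>16}")
for v_out in (5, 10, 15, 20, 25):
    p_fixed = (FIXED_V_IN - v_out) * I_OUT   # worst at low output voltages
    p_tracked = HEADROOM * I_OUT             # stays roughly constant
    print(f"{v_out:>6} {p_fixed:>14.0f} {p_tracked:>16.0f}")
```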
Am I missing something or should that work?