OK, sorry, I didn't realize you wanted it to be software programmable. There is a sneaky trick that might work for you. It involves messing with the 336's calibration gain settings, though, so we don't publish information on how to do this in the product manual, as some of these commands can ruin the instrument's ability to make measurements and would require it to be sent back to us for recalibration. The particular commands below shouldn't cause any long-term issues with your instrument, though, as any changes you make to the calibration settings are reset when the 336 is power cycled.
The terminal command uses the same structure as the commands shown in Section 6.6 of the 336 product manual and is:
CALG <channel>, 0, <value> (sets calibration gain)
CALZ <channel>, 0, <value> (sets calibration offset)
where:
<channel> = 6 (for Output 3)
<channel> = 7 (for Output 4)
<value> is the gain constant or zero offset applied to the output (ranges from 0 to 1)
So first you will want to query the output to find out what your gain constant and offset are at full scale. The examples I show will be for Output 3.
CALG? 6, 0
Your CALG value corresponds to the full 20 V of range, from -10 V to +10 V. Scale this CALG number to create a new full-scale range, e.g. if you want +/-6 V, then the full-scale range will be 12 V and your gain factor should be set to 60% (12/20) of what it currently is.
E.g. CALG 6, 0, 0.6 (assuming my CALG value was 1.0 to begin with)
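If it helps, the gain-scaling step above is just this arithmetic (a quick Python sketch; the function name and the 20 V default are mine, taken from the stock -10 V to +10 V range):

```python
def scaled_calg(current_calg, desired_full_range_v, default_full_range_v=20.0):
    """Scale the queried CALG value down to a narrower full-scale range."""
    return current_calg * (desired_full_range_v / default_full_range_v)

# +/-6 V target -> 12 V full scale; a starting CALG of 1.0 scales to 0.6
print(scaled_calg(1.0, 12.0))
```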
Now your output range will be around -10 V to +2 V (12 V full range). The next step is to change the offset to shift the max and min outputs so they're symmetrical. As a side note, you don't have to make these values symmetrical, but if you don't, be aware that for Outputs 3 and 4, "off" is just the mid-point of full scale, which would normally be 0 V. In this example, "off" or 0% would see the instrument generating a -4 V output.
The offset factor for this scenario can be calculated using: 0.5 - <full range>/40
In this example: 0.5 - 12/40 = 0.2
So you would enter CALZ 6, 0, 0.2
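The offset formula is equally simple to compute in code (again just a sketch; the function name is mine). Note that, as described below, the computed value is only a starting point and you'll still trim it against a voltmeter:

```python
def calz_offset(full_range_v):
    """Starting CALZ factor that re-centers the reduced range around 0 V:
    0.5 - <full range>/40."""
    return 0.5 - full_range_v / 40.0

# 12 V full range gives a starting offset factor of 0.2
print(calz_offset(12.0))
```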
This is where you should check the actual output with a voltmeter. The calculations used to generate these numbers won't result in a perfectly centered bipolar range, so you'll need to make slight trial-and-error adjustments to CALZ to get the 0% or "off" setting to produce 0 V. For the unit I have here, I had to shift my CALZ value from 0.2 to 0.2084 to get a 0.00 V reading.
The good news is that once you have a set of CALG and CALZ values that you're happy with for a given range, you can program them into your code to change the range whenever you want. Just remember that these settings are not permanent and the instrument will revert to its original calibration values (and the +/-10 V output) when you turn the 336 off.
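For example, reapplying a saved range after power-up just means resending the two commands over whatever interface you already use to talk to the 336. A sketch of formatting them (the helper name is mine, and the 0.2084 offset is the hand-trimmed value from my unit, not a universal constant; substitute whatever write call your own interface library provides):

```python
def range_commands(channel, calg_value, calz_value):
    """Format the CALG/CALZ terminal commands in the same structure as
    Section 6.6 of the 336 product manual."""
    return (f"CALG {channel}, 0, {calg_value}",
            f"CALZ {channel}, 0, {calz_value}")

# Output 3 (channel 6), +/-6 V range, with a hand-trimmed offset
cmds = range_commands(6, 0.6, 0.2084)
for cmd in cmds:
    print(cmd)
    # then send it over your instrument connection, e.g.:
    # instrument.write(cmd)
```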
Hope this is a better solution for you. Please let me know how this goes, or if you have any other questions.