For good control, LEDs are best driven at a specified current, not a specified voltage. This is because a very small change in voltage can produce a very large change in current, and thus in power and light output. Couple this with the fact that the voltage characteristics of LEDs are only loosely specified, and vary from one unit to the next, and you can see why we usually talk about LED current rather than voltage.
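To put a rough number on that sensitivity: LED current rises roughly exponentially with forward voltage, along the lines of the ideal diode equation (this sketch ignores series resistance and self-heating, and the ideality factor n = 2 below is an assumed value purely for illustration):

I ≈ Is * e^(Vf / (n * Vt)), where Vt ≈ 26 mV at room temperature

With n = 2, raising Vf by just 100 mV multiplies the current by about e^(0.1 / 0.052) ≈ 7. That kind of sensitivity is why holding a fixed voltage gives such poor control of brightness and power.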
You might luck out and put 6-8 V across a "12 V" LED and have it work well; the next one might not. This can depend on the design and/or the individual unit you get.
Many LED drivers, especially ones used for single-die LEDs, use a low-value resistor to 'sense' the current in the LED. If the driver senses the current is a bit low, it increases the output, and vice-versa. What the driver actually measures is the voltage across this resistor, which it interprets as current. So if you cut the resistance in half, the driver sees only half the expected voltage, concludes the current is half of what it should be, and doubles the actual output to compensate. Conversely, if you double the resistance, the driver halves the output current.
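As a sketch of the arithmetic, assume a hypothetical driver that regulates the voltage across the sense resistor to a fixed internal reference Vref (a common arrangement, but check your particular driver's datasheet). The regulated output current is then simply:

Iout = Vref / Rsense

For example, with an assumed Vref = 0.1 V and Rsense = 0.1 ohm, the driver settles at Iout = 1 A. Halve Rsense to 0.05 ohm and it settles at 2 A; double it to 0.2 ohm and you get 0.5 A.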
If you have a driver that is built this way, changing the current sense resistor is a good way to change the output. Keep in mind that increasing the output current tends to increase the stress on many of the components, and can shorten the driver's life, sometimes to a matter of seconds. On the other hand, reducing the output tends to reduce stress on the components, and may extend the driver's life. This is not an absolute rule; in certain cases the exact opposite happens. In general, someone familiar with the particular design can be pretty good at predicting which is which.
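A quick worked example, using the same hypothetical 0.1 V reference as above: a driver fitted with a 0.10 ohm sense resistor runs at 1 A. To derate it to about 0.7 A, you would increase the sense resistance to roughly 0.10 / 0.7 ≈ 0.14 ohm (then pick the nearest standard value), since the driver holds Vref = Iout * Rsense constant.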