John, the first thing to realize is that LEDs are rated by current, not voltage. When an LED is described in adverts as being 12V or 5V, that isn't the voltage you supply it with; it refers to the voltage the LED maintains across itself when allowed to pass a certain current. If you connect a voltage source directly across an LED, in most cases it will instantly burn out. You have to start with a voltage a little higher than the LED's voltage, then deliberately starve it of current to keep it within its operating limits. It is in that starved state that the rated voltage is specified.
In normal solar lights, the LED typically has between 2V and 3V across it, and a small circuit boosts the battery voltage from about 1.2V up to the LED voltage while also limiting the current it can draw. There is also a simple inhibit circuit that kills the LED power while there is voltage on the solar panel wires. When it gets dark enough, the panel stops producing voltage and the inhibit is lifted, so the light comes on.
Yes, you can use the board in the advert, but all it will do is switch the power on at a predetermined light level. You can use it to switch the 12V supply, then wire that supply to each of the modified lights. Your problem is that you plan to use a high-power LED in each light to get more output, and each of those LEDs needs its current limited. The LED will get hot, and unless you use a fairly advanced power-regulation design, the current limiter will also get hot.
I'm not trying to deter you from trying, just to point out that LEDs work quite differently to conventional light bulbs. A conventional bulb gets gradually brighter as you supply it with more voltage; an LED does nothing up to a certain voltage, then suddenly turns on at full power and burns out, unless you restrain it from doing so.
At its simplest, you wire a resistor in series with each LED to limit the current it can pass. That may be all you need. Calculating its value is simple (there is a worked sketch after this list):
1. Pick the current you want the LED to pass (use the manufacturer's data and reduce it to add a safety margin). The figure will probably be in mA. For example, if it is rated at 500mA, decide on maybe 450mA to actually use.
2. The data sheet will quote a figure for "Vf" (forward voltage); this is the voltage the LED will try to hold across itself.
3. The resistor value is: (your supply voltage - Vf) / chosen current in Amps. The result will be in Ohms. For example, with a 12V supply and a Vf of 3V at 450mA, the calculation is (12 - 3) / 0.45 = 20 Ohms.
4. The resistor power rating is calculated next; the formula is (voltage across the resistor) * (current through it). Using the same example, the resistor drops 12V down to the LED's 3V, so it has 9V across it with 0.45A flowing through it, and the power it dissipates is 9 * 0.45 = 4.05W. Pick a resistor rated for more power than it actually dissipates; in this case I would suggest at least 5W.
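If it helps to see the whole calculation in one place, here is a minimal sketch in Python. The 12V, 3V Vf and 450mA figures are just the worked example above, and led_resistor is an illustrative name; substitute your own LED's datasheet values.

```python
# Minimal sketch: sizing a series resistor for one LED on a fixed supply.
# Figures below are the worked example from this post, not universal values.

def led_resistor(supply_v, vf, current_a):
    """Return (resistance in Ohms, power dissipated in Watts)."""
    drop_v = supply_v - vf            # voltage the resistor must drop
    resistance = drop_v / current_a   # Ohm's law: R = V / I
    power = drop_v * current_a        # P = V * I across the resistor
    return resistance, power

r, p = led_resistor(supply_v=12.0, vf=3.0, current_a=0.45)
print(f"Resistor: {r:.1f} Ohms, dissipating {p:.2f} W")
# -> Resistor: 20.0 Ohms, dissipating 4.05 W
# Buy the next standard value at or above the result, with a power
# rating comfortably above what it dissipates (here, 5W or more).
```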
So using that example you would buy a 20 Ohm, 5W resistor and wire one in series with each of the LEDs. As I pointed out though, the total power dissipated in each lamp will be that 4.05W plus the heat from the LED itself (~1.5W), so it will get quite hot and waste a lot of power.
Brian.