Post by spiderbob on Oct 5, 2016 7:11:18 GMT -8
Just to start with, PWM is actually more "power efficient" than MPPT--meaning less total power is lost in the controller itself, so the heat sinks in the design can be smaller (and less expensive). Missing from most analyses of MPPT is that there is always a conversion loss with MPPT, and it tends to grow with the voltage difference between battery and panels. That's why PWM can actually beat MPPT under certain circumstances.
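To put rough numbers on the conversion-loss point, here is a little Python sketch. The panel figures and the 95% converter efficiency are assumptions I picked for illustration, not measurements from any particular controller; the point is just that when a warm panel's max-power voltage sits close to the battery's absorb voltage, the conversion loss can eat the whole MPPT advantage:

```python
# Back-of-the-envelope PWM vs MPPT for a voltage-matched panel on a warm day.
# All numbers are illustrative assumptions, not measured data.

V_MP = 15.0      # max-power voltage of a hot "12 V" panel (V)
I_MP = 7.5       # max-power current (A)
V_BATT = 14.4    # battery absorb voltage while charging (V)
MPPT_EFF = 0.95  # assumed DC-DC conversion efficiency of the MPPT stage

# PWM ties the panel (nearly) straight to the battery: the panel is pulled
# down to battery voltage and delivers roughly its max-power current
# (actually slightly more, since panel current rises a bit below V_MP).
pwm_watts = V_BATT * I_MP

# MPPT holds the panel at its max-power point, then loses a few percent
# converting that power down to battery voltage.
mppt_watts = V_MP * I_MP * MPPT_EFF

print(f"PWM  delivers ~{pwm_watts:.0f} W into the battery")   # ~108 W
print(f"MPPT delivers ~{mppt_watts:.0f} W into the battery")  # ~107 W
```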
I have found that most places that analyze MPPT assume panels with a 30V open-circuit voltage are being used in a 12V system. Ahhh, any good MPPT system will easily provide better performance in that case. They also may assume batteries are charging at 12 volts or even 11 volts, which is unrealistic. Lead acid batteries are typically below 13 volts only when discharging; watch your gauge and it will show 12.7 volts a lot of the time, or perhaps charging with very little charging current--meaning the actual potential gain in amps is not great.
But the benefit of MPPT becomes apparent if you use panels not voltage-matched to the battery, which so many people do when they start buying separate components and mixing and matching. In that case, MPPT will harvest more of the potential energy of the panels. For example, if you use 24 volt panels to charge a 12 volt battery system you must use MPPT; otherwise you would be using your panels very inefficiently. If you try to use PWM in that case, you are misusing the PWM technology.
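Here is the same sketch with an assumed, loosely typical "24 volt" panel (again, illustrative numbers, not a specific datasheet), showing why PWM throws away roughly half the panel in a mismatched system:

```python
# Mismatched system: a nominal "24 V" panel charging a 12 V battery bank.
# Panel figures are assumptions for illustration.

V_MP = 30.0      # max-power voltage of a typical "24 V" panel (V)
I_MP = 8.0       # max-power current (A)
V_BATT = 14.0    # charging voltage of the 12 V bank (V)
MPPT_EFF = 0.95  # assumed MPPT conversion efficiency

# PWM drags the panel down to battery voltage; current stays near I_MP,
# so everything above ~14 V of panel potential is simply thrown away.
pwm_watts = V_BATT * I_MP                # ~112 W

# MPPT runs the panel at 30 V and trades the extra voltage for current.
mppt_watts = V_MP * I_MP * MPPT_EFF      # ~228 W
mppt_batt_amps = mppt_watts / V_BATT     # ~16.3 A into the battery

print(f"PWM : {pwm_watts:.0f} W ({I_MP:.1f} A at {V_BATT:.1f} V)")
print(f"MPPT: {mppt_watts:.0f} W ({mppt_batt_amps:.1f} A at {V_BATT:.1f} V)")
```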
Another potential benefit of MPPT is that if the distance between panels and batteries is long, especially for cabin dwellers, smaller wire can be used by running the panels at higher voltage to the batteries. Running at twice the voltage cuts the required wire cross-section to 1/4 for the same percentage loss (twice the voltage means half the current, and wire loss goes as current squared), which on a long run can be a significant saving in copper.
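A quick sketch of the copper math (the resistivity is the standard value for copper; the run length, power, and loss budget are example assumptions):

```python
# Copper cross-section needed to hold wire loss to a fixed percentage,
# at single-panel voltage vs. double (two panels in series).
# Run length, power, and loss budget are example assumptions.

RHO_CU = 1.68e-8   # resistivity of copper (ohm*m)
RUN_M = 30.0       # one-way distance from panels to batteries (m)
POWER_W = 280.0    # power sent down the wire (W)
LOSS_FRAC = 0.03   # allow 3% of the power to be lost in the wire

def copper_area_mm2(volts):
    """Cross-section (mm^2) so that I^2*R loss equals LOSS_FRAC of POWER_W."""
    amps = POWER_W / volts
    max_resistance = LOSS_FRAC * POWER_W / amps**2  # R = P_loss / I^2
    length = 2 * RUN_M                              # out and back
    return RHO_CU * length / max_resistance * 1e6   # A = rho*L/R, in mm^2

for v in (17.0, 34.0):
    print(f"{v:4.0f} V run needs {copper_area_mm2(v):5.2f} mm^2 of copper")
# Doubling the voltage halves the current, so the same percentage loss
# tolerates 4x the resistance -- i.e. 1/4 the copper.
```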
If temperatures are low enough, the slightly lower power efficiency of MPPT will be more than compensated by the higher panel voltages, resulting in a little more battery current. In actual measurements made using a commonly sold MPPT solar controller, the crossover occurred below about 55 degrees F (in full sun, charging at more than 13 volts). We have all heard that solar performs better in cooler temperatures, NOT hot ones, and that is where MPPT picks up its slight advantage (think near the California coast). As temperature drops below that crossover (in full sun), MPPT gains some advantage, such as at high elevations in winter. Potentially this is at most about a 2.5% improvement in amps output for every 10 degrees F drop in temperature (or 4.6% per 10 degrees C); the data used in this experiment are from Kyocera KD-140 panels.
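Here is the arithmetic behind that 2.5%-per-10-degrees figure as a sketch. The baseline Vmp and the voltage temperature coefficient are ballpark assumptions for a KD-140-class panel (check your own datasheet), and remember that cells in full sun run well above ambient temperature:

```python
# How panel max-power voltage, and hence the MPPT advantage, grows in the
# cold. Baseline and coefficient are ballpark assumptions for a
# KD-140-class panel; substitute your panel's datasheet values.

V_MP_25C = 17.7   # assumed max-power voltage at 25 C cell temp (V)
TC_VMP = -0.0046  # assumed Vmp temp coefficient (-0.46%/C, ~2.5% per 10 F)
V_BATT = 14.4     # absorb charging voltage (V)
MPPT_EFF = 0.95   # assumed MPPT conversion efficiency

def mppt_gain_pct(cell_temp_c):
    """Percent more battery current from MPPT vs PWM at a cell temperature."""
    v_mp = V_MP_25C * (1 + TC_VMP * (cell_temp_c - 25))
    return (v_mp / V_BATT * MPPT_EFF - 1) * 100

# Note: cells in full sun run roughly 20-30 C hotter than the air.
for t in (65, 55, 45, 25, 0):
    print(f"cell {t:3d} C: MPPT gain over PWM ~ {mppt_gain_pct(t):+5.1f} %")
```

With these assumed numbers the crossover lands around a 55 C cell; where your real crossover falls depends on the actual converter efficiency and battery voltage.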
There are theoretical situations where MPPT could give a real advantage: when solar current is present but the batteries are quite low in charge, and the loads are heavy enough to exceed the solar current, so the batteries are still discharging despite the sun. Under these conditions the battery voltage could be at 12.5 volts (I noted 12.7 in my own measurements), or even lower. Again, using data from Kyocera panels ("Normal Operating Conditions"), there is a theoretical maximum gain over PWM of 20% in current at 20C (68F), assuming NO MPPT conversion loss and no voltage drop in the wires to the panels. With PWM, the voltage drop in the wires would not affect the charging current in this case, because the panel acts roughly as a current source below its max-power voltage. Now if in addition you lower the temperature to below freezing, 28 degrees F (while the sun is shining), you might get up to a THEORETICAL gain of nearly 30% while the batteries are discharging.
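To first order, the theoretical gain is just the ratio of panel max-power voltage to battery voltage, since PWM delivers roughly the max-power current either way. A sketch with assumed ballpark Vmp figures for real-sky (non-STC) conditions:

```python
# Best-case theoretical MPPT gain while heavy loads hold the battery down.
# Ideal case: no conversion loss, no wire drop. Vmp values are ballpark
# assumptions for a KD-140-class panel under real-sky conditions.

V_MP_20C = 15.0   # assumed max-power voltage at 20 C (68 F) cell temp (V)
TC_VMP = -0.0046  # assumed Vmp temp coefficient (per C)
V_BATT = 12.5     # battery dragged down by heavy loads (V)

for temp_c in (20, -2):  # 68 F and 28 F
    v_mp = V_MP_20C * (1 + TC_VMP * (temp_c - 20))
    # PWM current ~ I_mp; ideal MPPT current ~ V_mp * I_mp / V_batt,
    # so the fractional gain is simply V_mp / V_batt - 1.
    gain = (v_mp / V_BATT - 1) * 100
    print(f"{temp_c:3d} C cell: Vmp ~ {v_mp:.1f} V, gain ~ {gain:.0f} %")
```

Those assumptions land close to the 20% and near-30% figures above; any conversion loss and wire drop come straight off the top.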
The only REALLY BAD part of MPPT is all the hype surrounding it--for example, one manufacturer advertises "UP TO 30% OR MORE" power harvested from your panels (actually several if not all do this). If you are using solar panels properly matched to the batteries, 30% ain't gonna happen unless it's EXTREMELY cold and your batteries are abnormally low in charging voltage--which tends not to happen when it's cold (unless you assume the battery is still discharging while the sun is shining). Virtually all the analyses I've seen touting MPPT on the Internet ignore the conversion loss, assume really cold temperatures, assume unreasonably low charging voltages, assume no voltage drop in the wires from panels to batteries, use STC ratings for the panels (which the marketing types prefer) rather than more realistic NOCT ratings, and in some cases assume panels not voltage-matched to the batteries.
The other misleading thing about MPPT is that some manufacturers make meters that show both the solar current and the battery current. In almost all cases, for a well-designed MPPT unit the battery current will be greater. The engineers who build these know better, but the implication is that if you were NOT using MPPT you would be charging your batteries with only the SOLAR current you read on their meter. That's not true, because the PWM BATTERY current would always be higher than the MPPT SOLAR current: it is the nature of MPPT that maximum power occurs at a current below the panel's maximum, so the controller must hold the panel there, and the DC-DC stage then turns the extra panel voltage into extra battery-side current. To properly compare the two you need to compare MPPT against an actual PWM controller in the same circumstances.
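To see why the panel-amps-versus-battery-amps display flatters MPPT, here is a sketch with assumed panel figures (illustrative, not from any datasheet):

```python
# Why "solar amps vs battery amps" on an MPPT display overstates the win.
# Panel figures below are illustrative assumptions.

I_SC = 8.5       # panel short-circuit current (A)
I_MP = 7.9       # panel max-power current (A) -- always below I_SC
V_MP = 17.5      # panel max-power voltage (V)
V_BATT = 13.8    # battery voltage while charging (V)
MPPT_EFF = 0.95  # assumed conversion efficiency

mppt_solar_amps = I_MP                            # panel side of the meter
mppt_batt_amps = V_MP * I_MP * MPPT_EFF / V_BATT  # battery side of the meter

# A real PWM controller would deliver roughly I_MP into the battery
# (a bit more, in fact, somewhere between I_MP and I_SC).
pwm_batt_amps = I_MP

print(f"MPPT solar side  : {mppt_solar_amps:.1f} A")
print(f"MPPT battery side: {mppt_batt_amps:.1f} A")
print(f"PWM battery side : ~{pwm_batt_amps:.1f} A  <- the honest comparison")
```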
So there you have it, as I have learned from various sources. You decide. I use PWM now, but for those starting out who don't want to be concerned with the what or how, MPPT would be the better choice, especially as more and more people mix and match their components instead of voltage-matching their systems.