My solar power monitor uses components I salvaged from a broken USB power bank. After a year and a half of daily charge cycling, the charging circuit has gone out. I replaced it with an inexpensive commodity battery management system (BMS) module based on a 4056 chip. Then I let the module run unmodified for two days to verify everything worked as advertised.

These modules ship with the charging rate configured at one amp. The general rule of thumb is that the charging rate should stay below 1C, which is 2.6A for these 2.6Ah cells, so the default should be fine. But I wanted to lower the charging rate for several reasons:

  • This battery cell is almost a decade old, and aging cells naturally lose capacity along with tolerance for charging speed.
  • Charging at slower, gentler rates improves battery longevity, hopefully making this old battery last even longer.
  • There's no rush: drawing from the solar panel, this thing can charge over the entire span of daylight hours while the panels are producing power.
  • By reducing the power demand, I can activate this charging circuit even when the solar panels produce little power, such as on overcast or rainy days.
  • Slowly charging during daylight hours also reduces stress on the battery, which then only has to run the microcontroller on its own during early morning and early evening when sunlight is scant.
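The 1C rule of thumb mentioned above is simple arithmetic; a quick sketch (the capacity and rates come from this post, the function name is mine):

```python
def max_charge_current(capacity_ah: float, c_rate: float = 1.0) -> float:
    """Maximum recommended charging current for a given C-rate."""
    return capacity_ah * c_rate

# 1C for this 2.6Ah cell is 2.6A, so the module's 1A default is already
# well under the rule-of-thumb ceiling (roughly 0.38C).
print(max_charge_current(2.6))  # 2.6
```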

For those reasons, I had wanted to slow the charging rate on the original USB power bank circuit as well, but I didn't know how to work with its unmarked chip. Switching to a commodity BMS module means more information is available. I'm not sure exactly whose 4056 chip I have here, but since they seem to be interchangeable commodities, I downloaded one of the datasheets and learned that the charging rate is controlled by a resistor between pin 2 (PROG) and ground. The formula is (charging current) = 1200/R_PROG. Probing this module, I found the charge rate control resistor to be the one labeled R3, and its tiny number says "122". I understand that to mean 12 × 10² = 1200Ω = 1.2kΩ. This matches the default rate per the formula: 1A = 1200/1200.
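The resistor decoding and the datasheet formula quoted above can be sketched together (function names are mine, the formula is as quoted):

```python
def decode_smd_resistor(code: str) -> float:
    """Decode a 3-digit SMD resistor code: two significant digits
    followed by a power-of-ten multiplier digit."""
    return int(code[:2]) * 10 ** int(code[2])

def charge_current_amps(r_prog_ohms: float) -> float:
    """Programmed charge current per the 4056 datasheet: I = 1200 / R_PROG."""
    return 1200 / r_prog_ohms

r3 = decode_smd_resistor("122")   # 12 * 10^2 = 1200 ohms
print(charge_current_amps(r3))    # 1.0 (amps), the module's default
```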

The beauty of math is infinite, but real-world circuits have limits. What is a practical minimum for charging rate? The datasheet I consulted gave several examples, the lowest being a 0.12A rate via a 10kΩ resistor. Conveniently, the just-retired USB power bank circuit board has a 10kΩ resistor (labeled "103", meaning 10 × 10³ = 10,000Ω) on board. Since it's not doing anything anymore, I pulled it off and swapped it in for the default resistor at position R3. My lackluster soldering of surface mount devices (SMD) isn't pretty, but it's functional.
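Plugging the salvaged resistor into the same formula gives the new rate, plus a rough charge-time estimate (my own back-of-envelope figure: it covers only the constant-current phase and ignores the taper near full charge, so real charging takes longer):

```python
R_PROG = 10_000          # the salvaged "103" resistor: 10 * 10^3 ohms
current = 1200 / R_PROG  # 0.12A, matching the datasheet's lowest example

# Rough constant-current-phase estimate for the 2.6Ah cell; the actual
# time is longer because current tapers off near full charge.
hours = 2.6 / current
print(current)  # 0.12
print(hours)    # about 21.7 hours, i.e. more than one day of sunlight
```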

Now I have a low-demand charging circuit for my solar power monitor, letting me run it across more weather conditions while gently charging the single old lithium-ion battery cell to extend its operating life. It would be nice if I could do more to extend battery longevity: lithium-ion chemistry batteries are stressed when fully charged or fully discharged, so they are best kept partially charged. In this particular project, I handle both in software running on my ESP8266. It goes to deep sleep before voltage drops to a critical level, and it disables the solar panel DC buck converter before the battery is fully charged. It'd be nice if I could reduce software complexity by doing everything onboard the BMS module, but I can't: the low cutoff voltage is controlled by the mystery chip at location U2, and the maximum charging voltage is fixed at 4.2V for a 4056 chip.
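The software-side voltage window described above boils down to a simple threshold check. A minimal sketch of that logic, with the caveat that the threshold values and function name here are my own illustrative assumptions, not this project's actual numbers:

```python
# Illustrative sketch of the voltage-window logic; thresholds are
# hypothetical placeholders, not values from this project.
SLEEP_BELOW_V = 3.0   # go to deep sleep before reaching a critical low
STOP_CHARGE_V = 4.0   # disable the buck converter short of a full 4.2V

def next_action(battery_volts: float) -> str:
    """Decide what the microcontroller should do at this battery voltage."""
    if battery_volts <= SLEEP_BELOW_V:
        return "deep_sleep"        # protect against deep discharge
    if battery_volts >= STOP_CHARGE_V:
        return "disable_charger"   # avoid the stress of sitting at 100%
    return "run_and_charge"

print(next_action(2.9))   # deep_sleep
print(next_action(3.7))   # run_and_charge
print(next_action(4.05))  # disable_charger
```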

Perhaps in the future I'll find a charging module that would let me modify those parameters, but for today this is good enough for my solar monitoring project to return to service so I can resume my LEGO nostalgia tour. I was just getting to the good part: LEGO Technic sets!