Mini-grids are expected to play a key role in achieving universal access to electricity. The International Energy Agency estimates that 25% of people gaining access to electricity by 2030 will do so through a mini-grid [1]. Generation is often paired with a battery, primarily lead-acid or lithium-ion (li-ion) in this context. Battery storage can help to displace the use of diesel generators by resolving the mismatch between daytime solar photovoltaic (PV) generation and evening demand.

Batteries are a key cost component of mini-grids; however, how to economically choose and operate batteries in an off-grid context is often unclear [2]. Factors such as ambient temperature and high state of charge (SOC) have been shown to significantly increase the rate of capacity fade in li-ion batteries [3]. A good understanding of battery degradation is therefore required to maximise lifetime and reduce costs. Derating strategies, informed by degradation modelling, limit charging/discharging currents under conditions that would otherwise accelerate capacity fade. These methods can be implemented at little cost and have been shown to effectively manage factors such as battery temperature and SOC [4,5]. Because derating limits current, it has implications for system reliability, so it is important to fully understand the impact of these strategies in off-grid systems.

This study uses a mini-grid simulation and optimisation tool (Continuous Lifetime Optimisation of Variable Electricity Resources, CLOVER) [6,7] and coupled electrothermal and semi-empirical degradation models for a lithium iron phosphate cell (Full Battery System Model, FBSM) to model capacity fade for two locations in the Global South. The first location has a milder climate more suitable for li-ion operation (Gitaraga, Rwanda), whereas the second exhibits more extreme temperatures (Bhinjpur, India). CLOVER was used to optimally size solar mini-grid systems and to calculate financial and environmental variables.
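As an illustration of the kind of derating rule discussed above, the sketch below tightens the allowed charging/discharging current when cell temperature or SOC leaves a preferred operating window. All thresholds, limits, and the function itself are hypothetical assumptions for illustration, not values or logic taken from the study:

```python
def derate_current(requested_current_a: float, soc: float, temp_c: float,
                   max_current_a: float = 50.0) -> float:
    """Return an allowed current given SOC (0-1) and cell temperature (deg C).

    Positive current is charging, negative is discharging. All numeric
    thresholds here are illustrative placeholders.
    """
    allowed = max_current_a
    # Temperature derating: halve the current limit outside a 10-35 C window.
    if temp_c < 10.0 or temp_c > 35.0:
        allowed *= 0.5
    # SOC derating: block further charging above an upper SOC bound
    # (discharging, i.e. negative current, is still permitted).
    if soc >= 0.8 and requested_current_a > 0.0:
        return 0.0
    # Clip the request to the (possibly reduced) current limit.
    return max(-allowed, min(requested_current_a, allowed))
```

A rule of this shape makes the reliability trade-off explicit: blocked charging at high SOC and reduced current at temperature extremes both protect the cell, but can leave demand unmet.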
FBSM was used to model battery aging, accounting for cell self-heating and capturing calendar aging alongside three separate cycle aging mechanisms. The model was also used to evaluate the effect of derating strategies on battery degradation, in terms of capacity fade: in the first instance, degradation is calculated with no derating, and in the second with simple derating strategies that limit the SOC and battery temperature operating windows. Important factors for mini-grid operation, such as battery lifetime, unmet energy fraction, and Levelised Cost of Used Electricity (LCUE), are considered.

This study finds calendar aging to be the most significant degradation mechanism in all scenarios investigated, causing a minimum of 87% of total capacity loss. Calendar aging is primarily a function of ambient temperature and SOC, and extreme highs and lows in ambient temperature are found to accelerate degradation. In the case study, PV and battery sizes were relatively high with respect to the load, resulting in relatively low battery currents and a tendency of the system towards high SOC levels. Because of these low currents, Joule losses are very small and the influence of cell self-heating was found to be negligible.

SOC derating to achieve an average SOC of 50% is found to increase battery lifetime by over 11 years, by limiting time spent at high SOC, without impacting system reliability. Temperature derating is able to increase battery lifetime by up to 2.3 years. Derating led to increases in the unmet energy fraction and the frequency of blackouts; however, these increases did not result in any change to the emissions intensity of electricity or to the LCUE.
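The dependence of calendar aging on temperature and SOC can be sketched with a generic semi-empirical form of the kind used in models such as ref. [3]: square-root-of-time fade, accelerated by temperature via an Arrhenius factor and by high SOC. All coefficients below are placeholder assumptions, not fitted FBSM parameters:

```python
import math

def calendar_fade_fraction(t_days: float, temp_c: float, soc: float,
                           k_ref: float = 1.5e-3,    # assumed pre-factor
                           ea_j_mol: float = 2.0e4,  # assumed activation energy
                           t_ref_k: float = 298.15) -> float:
    """Fractional capacity loss from calendar aging after t_days at rest.

    Illustrative only: coefficients are placeholders, and the SOC stress
    term is a crude monotonic assumption.
    """
    r = 8.314  # gas constant, J/(mol K)
    t_k = temp_c + 273.15
    # Arrhenius acceleration relative to the reference temperature.
    arrhenius = math.exp(-ea_j_mol / r * (1.0 / t_k - 1.0 / t_ref_k))
    # Higher SOC increases fade (assumed linear stress factor).
    soc_stress = 0.5 + soc
    # Calendar fade commonly scales with the square root of time.
    return k_ref * arrhenius * soc_stress * math.sqrt(t_days)
```

Even this crude form reproduces the qualitative trends reported above: fade accelerates at high temperature and at high SOC, which is why holding the fleet near an average SOC of 50% extends lifetime.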
Capacity fade observed for one of the systems investigated in each of Gitaraga and Bhinjpur is shown in Figure 1.

Derating strategies are therefore shown to be a low-cost method of increasing battery lifetime without increasing the cost of electricity or its emissions intensity, thus improving the financial viability of mini-grids. Insights from degradation modelling enable better decision-making regarding the operation of li-ion batteries in this context.

Acknowledgements

This work was kindly supported by the EPSRC Faraday Institution Multi-Scale Modelling Project (EP/S003053/1, grant number FIRG003).

References

1. IEA, Energy Access Outlook (2017).
2. S. Few, O. Schmidt, and A. Gambhir, Energy Sustain. Dev., 48, 1–10 (2019). https://doi.org/10.1016/j.esd.2018.09.008
3. M. Schimpe et al., J. Electrochem. Soc., 165, A181–A193 (2018).
4. J. V. Barreras, T. Raj, and D. A. Howey, in IECON 2018 - 44th Annual Conference of the IEEE Industrial Electronics Society, p. 4956–4961, IEEE (2018). https://ieeexplore.ieee.org/document/8592901/
5. M. Schimpe, J. V. Barreras, B. Wu, and G. J. Offer, in PRiME 2020/238th ECS Meeting, Honolulu, HI, USA.
6. P. Sandwell (2018).
7. P. Sandwell, N. Ekins-Daukes, and J. Nelson, Energy Procedia, 130, 139–146 (2017).

Figure 1