How to prolong Lithium battery life? Overspec a bit! The rest of the tweaking may not be worth the complexity...
Some tips on sizing and battery setup, including why matching a big enough battery capacity to the inverter stresses the battery less, not to mention being able to offset more usage from the grid.
TL;DR:
A good ratio seems to be 2:1, e.g. a 5kW inverter with a 10kWh battery, allowing for a 0.5C charge rate.
Lithium is happiest when avoiding the extremes of its spec in terms of temperature, SoC and charge or discharge rates.
While most people might know it's best to avoid high DoD (depth of discharge) / low SoC (state of charge), a possibly lesser-known concern is that constantly topping up at 100% SoC also adds stress, especially at high temperature.
Notes about getting more cycles/energy out of Lithium Ion over its lifespan:
The ideal is to cycle batteries somewhere in the mid-to-upper SoC range. E.g.
Between 50% and 85% is where less degradation occurs during cycling.
More degradation happens heading to 0% or 100%.
Overspec:
Avoid deep discharge, i.e. don't let the SoC drop to 10% or less.
You can only do this if you have the extra headroom.
Try to avoid a maxed-out scenario where your battery will charge and discharge at 1C.
Avoid high charge or discharge rates, e.g. drop max charge and discharge to 0.5C.
10kWh battery too expensive? Say you only have a ~5kWh battery with a 5kW inverter? The discharge rate would need to be 1C to match the inverter, but you'd hopefully seldom be running at the full 5kW load. You can still charge at just ~0.5C (2500W) to help on that half of the cycling (see the sketch just after these notes).
At night, limit load a bit more because there's no PV output to support the load.
The grid can help with occasional spikes in load (you bought a hybrid after all!).
But a more aggressive discharge limit could run into an overload issue in backup / grid-offline mode, since only the battery / PV can supply the load.
Avoid high temp combined with 100% SoC (hard to do anything about this).
Sorry, it just so happens that peak solar and 100% SoC are likely at mid-morning / midday, which coincides with the highest ambient temperature. At night, when it's cooler, there's no PV to charge...
The inverter likely delegates this to the BMS.
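A minimal sketch of the C-rate arithmetic behind the notes above (the 2:1 rule, the ~5kWh compromise, and the 0.5C current cap in my own example below). The function names and the 48V nominal pack voltage are just my assumptions:

```python
def c_rate(power_w: float, capacity_kwh: float) -> float:
    """C-rate implied by charging/discharging at power_w from a battery of capacity_kwh.
    1C is the rate that would fill or empty the battery in one hour."""
    return power_w / (capacity_kwh * 1000)

def current_a(power_w: float, nominal_v: float = 48.0) -> float:
    """Approximate battery current for a given power at the nominal pack voltage."""
    return power_w / nominal_v

# 2:1 ratio: 5kW inverter with a 10kWh battery
print(c_rate(5000, 10))              # 0.5C at full inverter output

# Tighter budget: 5kW inverter with a ~5kWh battery
print(c_rate(5000, 5))               # 1C if you ever pull the full 5kW
print(c_rate(2500, 5))               # 0.5C if charging is capped at ~2500W

# Own example below: 9.6kWh battery capped at 100A on a 48V bus
print(current_a(4800), c_rate(4800, 9.6))   # 100.0 A, 0.5C
```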
Own example
I've set the Lithium battery SoC minimum to 30% and have charge and discharge limited to 0.5C (also recommended by Dyness, for example). This should strain the battery chemistry less.
E.g. 9.6kWh limited to 100A at 48V = 4800W (0.5C), close to the 5000W rating of the inverter. Note: 1C = 9600W for a 9.6kWh battery.
I'm even tempted to limit charge to 50A / 0.25C to mitigate the temperature rise from faster charging. I've noticed that a ~0.4C charge rate causes a fairly rapid rise in temperature. Why 0.4C? Because PV output over 4000W is rare and is only enough to supply ~80A / 0.4C to charge the battery. However, on cloudy days, having bursts of ~3 or 4kW (~0.4C) in clear moments is useful to compensate for the clouded moments. A charge clipped at 50A (~0.25C) would only be able to use ~2400W of PV power, wasting potential output if Zero Export is set.
I've also set the power limit to 4800W at night to limit discharge to 0.5C, since there's no PV to help supply the load. During the day, I'm happier with max limits of 5000W (or even 5500W), given PV can help supply the other few hundred watts (a quick check of this follows below).
I admit this might be nicer in theory than in practice, since few users would likely care to tweak complicated advanced charge settings with the aim of squeezing a bit more lifespan from their batteries. Most people likely prefer max battery performance? Anyhow, just sharing the idea.
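A quick check of those day vs night limits: with a bit of PV available, a higher daytime inverter limit can still keep the battery itself at or below 0.5C, because PV supplies part of the load. Purely illustrative numbers:

```python
BATTERY_KWH = 9.6

def battery_c_rate(load_w: float, pv_w: float) -> float:
    """C-rate seen by the battery when PV covers part of the load (losses ignored)."""
    battery_w = max(load_w - pv_w, 0)
    return battery_w / (BATTERY_KWH * 1000)

print(battery_c_rate(5500, 700))   # 0.5C: 5500W load with ~700W of PV help
print(battery_c_rate(4800, 0))     # 0.5C: the 4800W night-time limit with no PV
```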
E.g. System Settings. Basic logic (a rough sketch of this schedule follows after the list):
Offset consumption from grid by using battery at night.
Keep a 40% down to 30% reserve during the morning in case of grid failure or a cloudy morning.
Avoid grid charging except later in the day, to compensate on cloudy days when PV output wasn't adequate to meet the load, and to catch up on charging the battery.
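A rough sketch of that basic logic, written as a pseudo time-of-use schedule. The structure and field names are made up, not actual inverter settings; real inverters expose something similar as time slots with SoC targets and grid-charge flags:

```python
# Hypothetical time-of-use style schedule, illustrative only.
# (start hour, end hour, min SoC %, allow grid charging?)
schedule = [
    (22, 6,  40, False),  # overnight: offset the grid from battery, keep a 40% floor
    (6,  10, 30, False),  # morning: allow a dip to 30% until PV picks up
    (10, 16, 30, False),  # midday: PV recharges the battery, no grid charging
    (16, 22, 40, True),   # later in the day: allow grid charging to catch up after a cloudy day
]
```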

Related reading
To quote Solacity:
LFP prefers to sit at partial charge rather than being completely full or empty
To quote Battery University:
Exposing the battery to high temperature and dwelling in a full state-of-charge for an extended time can be more stressful than cycling
So limiting the charge SoC per temperature zone would be a nice advanced feature.
See:
https://www.solacity.com/how-to-keep-lifepo4-lithium-ion-batteries-happy/
Nice read, highly recommend.
Explains how Lithium isn't really more expensive than Lead Acid/GEL, etc.
Factor in labour and maintenance too, and Lithium seems worth the premium over Lead Acid/GEL, etc.
https://batteryuniversity.com/learn/article/how_to_prolong_lithium_based_batteries
Has nice graphs/tables showing how high DoD (depth of discharge), high temperature and state of charge affect usable cycles.
But it's a bit out of date and the info is likely based on lithium-cobalt, not lithium iron phosphate.
Why optimize: Battery cost is often the single biggest item in the overall system budget, so it makes sense to offer advanced battery lifespan care features.
As per the Solacity article (linked above):
You have just sold your first-born into slavery, remortgaged the house, and bought yourself a lithium-ion battery!
E.g., of the ~R120K (~$7500 USD) my components cost, the battery was ~R65K (~$4050 USD), over 50%!
Battery manufacturer BMS should know best? Or do they want you to buy another battery soonest?
An example of when the battery manufacturer's BMS doesn't do the best thing (from https://www.victronenergy.com/live/battery_compatibility:pylontech_phantom ):
When DVCC is enabled, the battery (via the CAN-bms) is responsible for the charge voltage. The Pylontech battery requests a charge voltage of 53.2V. We have however found that in practice this is too high. The Pylontech battery has 15 cells in series, so 53.2V equates to 3.55V per cell. This is very highly charged and makes the system prone to go overvoltage. It should also be noted that a LiFePO4 cell stores very little additional energy above 3.45V. For this reason we opted to override the BMS and cap the voltage at 52.4V. This sacrifices almost none of the capacity and greatly improves the stability of the system.
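Just to spell out the per-cell arithmetic from that quote (15 cells in series):

```python
cells = 15
for pack_v in (53.2, 52.4):
    print(pack_v, round(pack_v / cells, 3))   # ~3.547 V/cell vs ~3.493 V/cell
```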
Inverter feature to avoid constant charging near 100%?
This will likely never be implemented. It's too complex and too few people would care. Plus it second-guesses the battery BMS, which can be quite unwise. But if you're into premature over-optimization, one can dream...
Simple feature idea: Set a threshold to resume charging again if under 95% SoC
I've noticed Dell and Lenovo laptops have power manager options to only start charging the battery again once a fair amount of discharge has happened. This slightly mitigates the degradation near 100%, e.g. going 100%, 98%, 100%, 99%... E.g. after a 100% charge, including a complete absorption cycle, only apply charge again if SoC < 95%. This is vaguely similar to hysteresis for thermostats (avoiding switching too frequently at a set temperature or threshold). The battery cycles down to 95% first before going back up to 100%, instead of being held near 100% all the time with many micro charges and discharges (a rough sketch of this logic follows below).
But the above idea could be counterproductive if a longer charge from 95% to 100% builds up unnecessary heat.
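A rough sketch of that hysteresis idea, completely hypothetical (it's not an actual inverter or BMS setting):

```python
# Hypothetical charge-resume hysteresis, illustrative only.
RESUME_BELOW_SOC = 95   # only start charging again once SoC has drifted below this
TARGET_SOC = 100

charging = False

def should_charge(soc: float, pv_surplus: bool) -> bool:
    """Decide whether to charge, given the current SoC and whether PV surplus is available."""
    global charging
    if charging:
        if soc >= TARGET_SOC:
            charging = False        # finished the full charge (incl. absorption)
    elif soc < RESUME_BELOW_SOC and pv_surplus:
        charging = True             # resume only after a meaningful discharge
    return charging
```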
It turns out that, to some degree (pun intended), a BMS might mitigate temperature by limiting charge rates. E.g. I found this for Dyness (B4580): a table with SoC, charge current and temperature limits. However, it's very tolerant and only begins to limit current above 50 ℃ (a hypothetical derating sketch follows after the link below).

Found at: https://powerforum.co.za/topic/5547-dyness-powerbox-with-victron-settings-found/
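For illustration, a hypothetical sketch of that kind of temperature-based derating. The breakpoints below are made up, not Dyness's actual table (which also factors in SoC):

```python
def max_charge_current_a(cell_temp_c: float, rated_a: float = 100.0) -> float:
    """Hypothetical charge-current limit by cell temperature, illustrative only."""
    if cell_temp_c < 0:
        return 0.0              # typical LFP BMSes block charging below freezing
    if cell_temp_c <= 50:
        return rated_a          # full rated current in the normal band (very tolerant)
    if cell_temp_c <= 55:
        return rated_a * 0.5    # start limiting once the pack is hot
    return 0.0                  # stop charging entirely when far too hot
```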

I have Shoto 5.12kWh batteries, which are rated at 5000 cycles at 90% DoD. The installer has set up my Sunsynk to not discharge below 50%. This seems very conservative to me and wastes some battery capacity. Any thoughts on changing this to say 30% DoD? Would it have a big negative effect on lifespan?