Anything higher than -70 dBm is considered good signal power, but what would be the theoretical “maximum” value that you could achieve, or have seen “in the wild”, on regular consumer hardware (e.g. smartphones) and an average cell tower, without any data transfer issues? That is, the reading you would get if you held a phone while standing very near an average LTE cell tower? 0 dBm? Probably too high. -30?
It’s worth noting that the RSSI counts all of the signal in the air on a given channel and width. Think of the channel like a Wi-Fi channel, and the width like Wi-Fi’s 20/40/80/160 MHz channel widths, but for LTE. This means utilization (load) and useless signal (e.g. interference) are also counted.
The tower sends ‘test signals’ (reference signals) in a known sequence. The (linear) average received strength of one such ‘test signal’ is the RSRP. So the RSRP correlates more with your distance from the tower than the RSSI does.
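As a rough sketch of what “average strength” means here: the averaging is done linearly in milliwatts, not naively on the dBm values. The helper below is hypothetical (the real measurement happens inside the modem), but it shows the idea:

```python
import math

def rsrp_dbm(ref_element_powers_dbm):
    # Convert each dBm reading to milliwatts, average linearly,
    # then convert the mean back to dBm.
    mw = [10 ** (p / 10) for p in ref_element_powers_dbm]
    return 10 * math.log10(sum(mw) / len(mw))

# Reference signals received at slightly varying powers:
print(rsrp_dbm([-95.0, -96.2, -94.1, -95.5]))  # ≈ -95.1 dBm
```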
e.g. let’s take an RSRP of -44 dBm, at one point the highest value supported in measurement reports.
The channel is divided into grids (resource blocks) 0.180 MHz wide, and each can vary in interference, utilization, etc. For example, the load and interference, reflected in the RSRQ, may come to -14 dB. That’s 14 dB to add to the RSSI; there could be more. LTE can normally have up to 100 of these grids, and 100 times the power is 20 dB more, so there’s another 20 dB to add to the RSSI, with the above load and interference on average.

(-44 dBm) + 14 dB + 20 dB = -10 dBm
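The arithmetic above follows from the standard definition RSRQ = N × RSRP / RSSI (with N the number of resource blocks), which in dB terms rearranges to RSSI = RSRP + 10·log10(N) − RSRQ. A small sketch of that rearrangement:

```python
import math

def rssi_dbm(rsrp_dbm, rsrq_db, n_rb=100):
    # RSRQ = N * RSRP / RSSI (linear), so in dB:
    # RSSI = RSRP + 10*log10(N) - RSRQ
    return rsrp_dbm + 10 * math.log10(n_rb) - rsrq_db

# The worked example: RSRP -44 dBm, RSRQ -14 dB, 100 resource blocks.
print(rssi_dbm(-44, -14, 100))  # -44 + 20 + 14 = -10 dBm
```

With a quieter channel (RSRQ closer to -3 dB) the same RSRP would yield a noticeably lower RSSI, which is exactly the ambiguity described next.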
So, for the RSSI to be that high, the cell has to be not only extremely strong but also heavily loaded and full of interference. However, a much weaker signal on a busy channel can produce the same RSSI as a strong signal on a quiet one.
Overall, the RSSI is weird. -70 dBm is not necessarily good; it is ambiguous. Use the SNR if available (it’s simple: higher is better) and the RSRP.
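If you do fall back to RSRP, rough rule-of-thumb buckets like the following are often quoted in the field. The exact cutoffs below are my assumption, not from any spec:

```python
def classify_rsrp(rsrp_dbm):
    # Assumed rule-of-thumb thresholds, not from 3GPP.
    if rsrp_dbm >= -80:
        return "excellent"
    if rsrp_dbm >= -90:
        return "good"
    if rsrp_dbm >= -100:
        return "fair"
    return "poor (cell edge)"

print(classify_rsrp(-75))   # excellent
print(classify_rsrp(-105))  # poor (cell edge)
```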