Most advanced tools for browsing and querying logs are cloud-based these days. You have to ship your logs to them and accept pricing based on the uncompressed size of the data. This leaves teams paying a very high price for a very short log retention window.
It's a convenient arrangement for the engineering team: nobody has to maintain logging infrastructure, and everything just works (and is fast) in a cloud logging system.
Surprising billing
For example, a cloud logging platform will typically charge around $1.50 - $2.50 per GB ingested and stored for 30 days.
If you’re storing 20 GB of logs each day, you can expect a monthly bill of around $900 - $1,500. That’s a whopping number, and it doesn’t even include any long-term storage. Often, it's hard even to estimate the size of the bill, since it's based on consumption. The bill can grow uncontrollably when the rate of produced logs increases (for example, when DEBUG-level logging is left enabled in production).
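As a quick sanity check, here is a minimal sketch of that math in Python. The per-GB rates and daily volume are the illustrative figures from above, not any vendor's actual price list:

```python
# Back-of-the-envelope estimate for a consumption-based logging platform.
# All rates below are assumptions for illustration, not real vendor pricing.
DAILY_INGEST_GB = 20        # assumed log volume per day
BILLED_DAYS = 30            # billed retention window
PRICE_PER_GB_LOW = 1.50     # assumed low end, USD per GB ingested + 30-day storage
PRICE_PER_GB_HIGH = 2.50    # assumed high end

monthly_gb = DAILY_INGEST_GB * BILLED_DAYS
print(f"Monthly ingest: {monthly_gb} GB")
print(f"Estimated bill: ${monthly_gb * PRICE_PER_GB_LOW:,.0f}"
      f" - ${monthly_gb * PRICE_PER_GB_HIGH:,.0f}")
# -> Monthly ingest: 600 GB
# -> Estimated bill: $900 - $1,500
```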
Data is always hot and ready to be queried
You are paying for infrastructure that is ready to serve your queries 24/7. In practice, though, logs are usually queried ad hoc, when investigating an issue or reconstructing past events. You don’t really need the whole infrastructure serving you all the time, yet you still pay for it.
You cannot leverage compression
Logs compress well: applying standard gzip compression to your log files can yield a staggering 10x, or in some cases even 20x, space saving. This is because logs are highly repetitive; most of a log line is a fixed template and only a small portion varies.
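You can verify the ratio yourself. The sketch below generates a synthetic, repetitive log (the log format is made up for illustration) and gzips it in memory; on samples like this, gzip typically lands in the 10x-20x range:

```python
import gzip
import random

random.seed(0)
# Synthetic access log: a fixed template where only a few fields vary,
# which is roughly what real application logs look like.
raw = "\n".join(
    f"2024-01-15T12:00:{i % 60:02d}Z INFO http.server request completed "
    f"method=GET path=/api/v1/items/{random.randint(1, 500)} "
    f"status=200 duration_ms={random.randint(1, 90)}"
    for i in range(50_000)
).encode()

compressed = gzip.compress(raw)
print(f"raw:   {len(raw):,} bytes")
print(f"gzip:  {len(compressed):,} bytes")
print(f"ratio: {len(raw) / len(compressed):.1f}x")
```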
Yet cloud providers do not bill you based on the compressed size of your logs. The logs are most likely compressed in the backend to cut costs and increase margins, but you cannot leverage that to your advantage and lower the bill.