Living the power user life over here
(lemmy.dbzer0.com)
I use Arch btw
A dirty Linux admin here. Imagine you've ssh'd into the nginx log folder and all you want to know is which IPs have been hammering a certain URL over, let's say, the last seven days and getting 429s, most frequent first. In kiddie script it's something like:

```shell
find -mtime -7 -name "*access*" -exec zgrep $some_damned_url {} \; | grep 429 | awk '{print $3}' | sort | uniq -c | sort -rn | less
```

Depends on how your logs look (and I assume you've been rotating them; that's where the `zgrep` comes from), should be run in `tmux`, and could (should?) be written better and all, but my point is: do that for me in a GUI (I'm waiting ⏲)
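For what it's worth, the same idea can be sketched as a self-contained snippet. Everything here is invented for illustration (the temp log dir, the `/api/login` URL, the sample entries), and the field positions assume nginx's default "combined" log format, with the client IP in field 1 and the status code in field 9; adjust both to your own log layout:

```shell
# Fabricated sample data standing in for a rotated, gzipped access log.
logdir=$(mktemp -d)
printf '%s\n' \
  '10.0.0.1 - - [01/Jan/2024:00:00:00 +0000] "GET /api/login HTTP/1.1" 429 0 "-" "-"' \
  '10.0.0.1 - - [01/Jan/2024:00:00:01 +0000] "GET /api/login HTTP/1.1" 429 0 "-" "-"' \
  '10.0.0.2 - - [01/Jan/2024:00:00:02 +0000] "GET /api/login HTTP/1.1" 200 0 "-" "-"' \
  | gzip > "$logdir/access.log.1.gz"

# Matching on the status *field* ($9) instead of grepping for "429" avoids
# false hits when 429 shows up elsewhere on the line (byte count, timestamp).
find "$logdir" -mtime -7 -name "*access*" -print0 \
  | xargs -0 zgrep -h "/api/login" \
  | awk '$9 == 429 {print $1}' \
  | sort | uniq -c | sort -rn
```

Same shape as the one-liner above, just with the status check moved into awk so a stray "429" in a byte count can't pollute the tally.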
As a general rule, I have most of my app and system logs sent to a central log aggregation server. Splunk, Logentries, even CloudWatch can do this now.
But you are right: if there is an archaic server you need to analyse logs from, nothing beats find/grep/sed.
In Splunk this is a pretty straightforward query that can be piped to a stats count and sorted. I don't know if you'd exactly count that as a GUI, though.
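A rough sketch of what that query might look like in SPL. The index name, the URL, and the assumption that logs land under Splunk's standard `access_combined` sourcetype (which auto-extracts `clientip`, `status`, and `uri`) are all guesses about the setup, not anything stated in the thread:

```
index=nginx sourcetype=access_combined status=429 uri="/some/url"
| stats count by clientip
| sort -count
```

The `stats count by clientip | sort -count` stages do the same job as the `sort | uniq -c | sort -rn` tail of the shell pipeline.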