HACKER Q&A
📣 thojest

What is a simple tool for parsing and analyzing log files?


Hi there, I have a small app (frontend, backend, database) with <100 users. I would like to use my log files to analyze basic metrics per unit of time, for example:

- number of requests
- number of comments created
- number of shared links

All of this runs on a single server, no big infrastructure, no big data. I am unable to find a SIMPLE tool to collect my logs in some way, parse them, and visualize them, for example as histograms.

I know the ELK stack, Splunk, Graylog, and many others exist, but all of these solutions are much too complex; in particular, I do not want to spend weeks setting them up correctly. Furthermore, most of these solutions need an extra server to aggregate the log data in some time-series DB.

I would be very happy to hear about any open-source tool that can do this job.


  👤 baccredited Accepted Answer ✓
I've had luck with https://goaccess.io/

example: goaccess logfile.log -o report.html --log-format=COMBINED


👤 tekronis
Maybe check out LNAV: http://lnav.org/

There's also angle-grinder, which has fewer features but is also pretty useful: https://github.com/rcoh/angle-grinder


👤 gmuslera
Grafana's Loki may be lighter weight than the other examples you mentioned.

For some kinds of logs there are tools for summarization and reports (like awstats for web or pflogsumm for mail servers).

And, of course, for ad-hoc queries on existing logs, the standard text tools on a Linux box let you generate a lot of info.


👤 runjake
For web logs, I still use Webalizer. For everything else, as long as we are not talking tens of gigs, I’ll be using some mix of Perl, Python, Shell, Awk, etc.

👤 kjs3
Syslog + something to massage the data (sh/awk/sed, perl, python)? Like we've been doing for simple apps for decades?
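The "massage the data with python" approach can be sketched in a few lines. This is a minimal, illustrative example (the log format is assumed to be the standard Apache/Nginx combined format; adjust the regex to whatever your app actually emits):

```python
# Minimal sketch: count requests per hour from a combined-format access log.
# The timestamp regex is an assumption based on the standard combined format.
import re
from collections import Counter

# Matches the date-and-hour prefix of a combined-log timestamp, e.g.
# [10/Oct/2023:13:55:36 +0000] -> "10/Oct/2023:13"
TIMESTAMP = re.compile(r'\[(\d{2}/\w{3}/\d{4}:\d{2})')

def requests_per_hour(lines):
    """Return a Counter mapping 'dd/Mon/yyyy:hh' to request count."""
    counts = Counter()
    for line in lines:
        m = TIMESTAMP.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts

sample = [
    '127.0.0.1 - - [10/Oct/2023:13:55:36 +0000] "GET / HTTP/1.1" 200 2326',
    '127.0.0.1 - - [10/Oct/2023:13:59:01 +0000] "POST /comments HTTP/1.1" 201 512',
    '127.0.0.1 - - [10/Oct/2023:14:02:10 +0000] "GET /links HTTP/1.1" 200 1024',
]
print(requests_per_hour(sample))
# Counter({'10/Oct/2023:13': 2, '10/Oct/2023:14': 1})
```

Swap the regex for one matching your comment/link log lines and you get the other metrics the same way.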

👤 PaulHoule
pandas?
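For what it's worth, pandas does make the "per unit of time" part easy once the lines are parsed. A minimal sketch, again assuming combined-format access logs (the sample lines and field layout are illustrative, not from the thread):

```python
# Minimal pandas sketch: parse access-log lines, then resample per hour.
import re
import pandas as pd

# Pulls out the timestamp, method, and path from a combined-format line.
LINE = re.compile(r'\[(?P<ts>[^\]]+)\] "(?P<method>\S+) (?P<path>\S+)')

def log_to_frame(lines):
    """Parse access-log lines into a DataFrame with a datetime 'ts' column."""
    rows = [m.groupdict() for m in map(LINE.search, lines) if m]
    df = pd.DataFrame(rows)
    df["ts"] = pd.to_datetime(df["ts"], format="%d/%b/%Y:%H:%M:%S %z")
    return df

sample = [
    '127.0.0.1 - - [10/Oct/2023:13:55:36 +0000] "GET / HTTP/1.1" 200 2326',
    '127.0.0.1 - - [10/Oct/2023:13:59:01 +0000] "POST /comments HTTP/1.1" 201 512',
    '127.0.0.1 - - [10/Oct/2023:14:02:10 +0000] "GET /links HTTP/1.1" 200 1024',
]
df = log_to_frame(sample)
per_hour = df.set_index("ts").resample("h").size()  # requests per hour
print(per_hour)
```

From there, `per_hour.plot(kind="bar")` gives the histogram the OP asked about, and filtering on `path` (e.g. rows matching `/comments`) before resampling yields the per-metric counts.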