Log File Auditing Tools & Their Differences | Lesson 20/34 | SEMrush Academy

Log files contain the history of every person or crawler that has accessed your website. You will learn how to deal with log files.
Watch the full course for free:
https://bit.ly/3gNNZdu

1:01 Log File Analyzer
1:39 Elastic Stack
2:15 Other solutions

✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹
You might find it useful:
Understand how Google bots interact with your website by using the Log File Analyzer:
https://bit.ly/3cs0rfC

Learn how to use SEMrush Site Audit in our free course:
https://bit.ly/2Xsb3XT
✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹ ✹

Once you have your log files ready, it is time to start working with the data. There are different ways to approach this. Relying on a simple text editor to open the file is not a good idea: log files are often hundreds of megabytes in size, and opening a file that large will cause problems.
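Instead of loading the whole file into an editor, a large log can be streamed line by line. A minimal Python sketch (the file name access.log and the Googlebot filter are illustrative assumptions, not something the lesson prescribes):

```python
# Stream a large access log one line at a time instead of loading it whole.
# "access.log" is a hypothetical Apache/Nginx combined-format log file.

def count_googlebot_hits(path):
    hits = 0
    with open(path, errors="replace") as f:
        for line in f:  # only one line is held in memory at a time
            if "Googlebot" in line:
                hits += 1
    return hits
```

Because the file is consumed lazily, this works the same on a 10 KB sample and a 10 GB production log.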

For a small site, you can start with DIY solutions based on Excel or even Google Docs. You’d have to build filtering, cross-references, etc. manually, so it is not scalable. There are no nice dashboards and graphs – you’d need to build those first, which is clearly not the simplest way to approach this.
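The kind of filtering and cross-referencing you would otherwise build by hand in a spreadsheet can be sketched in a few lines of Python. This assumes the common Apache/Nginx "combined" log format; the regex, field names, and the simple Googlebot check are illustrative assumptions:

```python
import re
from collections import Counter

# Parse "combined"-format log lines and tally status codes per crawler.
# Real-world logs may deviate from this layout, so unmatched lines are skipped.
LINE_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

def status_by_bot(lines):
    counts = Counter()
    for line in lines:
        m = LINE_RE.match(line)
        if not m:
            continue  # line does not match the expected format
        bot = "Googlebot" if "Googlebot" in m.group("agent") else "other"
        counts[(bot, m.group("status"))] += 1
    return counts
```

A table like "404s seen by Googlebot vs. everyone else" then falls straight out of the counter, which is exactly the cross-reference a spreadsheet makes tedious.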

One of the better ways – especially from an SEO perspective – is using the Screaming Frog Log File Analyser. It is a beginner-level, desktop-based log file auditing tool with some very useful pre-defined reports. It has a simple interface where you can drill down into different reports, understand crawl events and behaviour, see response codes, etc.

However, it has no sharing capability, and you need to download log files manually from the server and import them into the tool; with large files this can take a very long time. It is a beginner-level tool for small and medium-sized sites.

Another solution is the Elastic Stack – formerly known as the ELK Stack. It consists of three tools:

Elasticsearch: search & analytics engine,
Logstash: server-side data processing pipeline,
Kibana: data visualisation (charts, graphs, etc.)

The great thing about the Elastic Stack is that it is open source and therefore free to use. On the other hand, you have to set it up and run it on your own server infrastructure, so it needs IT resources.
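As a rough illustration, a minimal Logstash pipeline wiring those three pieces together might look like this (the file path, Elasticsearch host, and grok pattern are placeholders for your own setup, not something the lesson prescribes):

```
# Minimal Logstash pipeline sketch – paths and hosts are placeholders.
input {
  file { path => "/var/log/nginx/access.log" }
}
filter {
  # Parse each raw line into structured fields (IP, status, user agent, ...)
  grok { match => { "message" => "%{COMBINEDAPACHELOG}" } }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
}
```

Kibana then reads the indexed events from Elasticsearch to build the charts and dashboards.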

Other, SaaS-based solutions include logrunner.io, logz.io & Loggly. They are all built on the Elastic Stack but focused on SEO auditing, so they offer dashboards where you can, for example, see crawl behaviour over time or response codes per crawler.

The beauty of SaaS solutions is that they work in near real time: you pipe your log files into the system, and very quickly you can see what’s happening on your website.

It is important to integrate log file analysis into your regular SEO workflow rather than relying on one-off audits. One-off audits are fine for a start, but log file audits become truly invaluable when you combine them with web crawl data on an ongoing basis. Besides, messing around with exports, uploads, and downloads is frustrating.

I’d generally recommend finding something that fits your requirements. All the tools have limitations, and pricing is typically based on the volume of data processed per month. The advantage of SaaS solutions is the ability to share reports with a team or a client. If you’re doing a migration, this makes things easier because you can see every event as it happens, in real time.

#TechnicalSEO #TechnicalSEOcourse #LogFileAnalysisTools #SEMrushAcademy
