Apache HTTP webserver log parser and viewer UI

Install using script

The command below will download the latest release and add the executable to your system's PATH. You can also use it to update Logdy.

$ curl | sh

Install with Homebrew (macOS)

On macOS you can use Homebrew to install Logdy.

$ brew install logdy

Download precompiled binary

Visit the releases page on GitHub and select the most recent release. Download a precompiled binary for your platform.

# For Linux (x86)
# For MacOS (ARM)
# Add Logdy to PATH:

More compilation targets

You can find more precompiled binaries on the GitHub releases page. We always build for the following OS/architecture pairs: linux/amd64, windows/386, windows/amd64, darwin/amd64, darwin/arm64, linux/arm64.

Run Logdy with a tail on Apache logs

By default, on Linux systems, Apache stores logs for virtual hosts in the /var/log/apache2 directory. However, this depends on the local configuration of each individual VirtualHost.

logdy stdin 'tail -f /var/log/apache2/access.log'

Enter Logdy web UI

Visit the address printed in the console output after starting Logdy; by default it is http://localhost:8080.

Build a custom parser for Apache logs

Given the following line from a sample Apache access log file (in the Combined Log Format):

- - [17/Jan/2024:19:35:00 +0100] "GET / HTTP/1.1" 200 864 "-" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.106 Safari/537.36"

We can create parser code based on a regular expression that parses the line into a JS object, which we then assign to json_content:

(line: Message): Message | void => {

	const logPattern = /^(\S+) (\S+) (\S+) \[([^\]]+)\] "(\S+) (\S+) (\S+)" (\d+) (\d+) "([^"]+)" "([^"]+)"$/;
	const logMatches = line.content.match(logPattern);

	if (logMatches) {
		line.is_json = true
		line.json_content = {
			ip: logMatches[1],
			dash1: logMatches[2],
			dash2: logMatches[3],
			timestamp: logMatches[4],
			method: logMatches[5],
			path: logMatches[6],
			protocol: logMatches[7],
			status: parseInt(logMatches[8]),
			bytesSent: parseInt(logMatches[9]),
			referer: logMatches[10],
			userAgent: logMatches[11],
		}
	}

	return line;
}

This way, we'll be able to use that object when defining columns.
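The same regular expression can be sanity-checked outside Logdy with plain Node.js. The script below is a minimal sketch: the host field (127.0.0.1) is a placeholder we add for illustration, since the regex expects three fields before the timestamp; the rest of the line is the sample from this page.

```javascript
// Standalone sanity check for the access-log regex used in the parser above.
// The host field (127.0.0.1) is a placeholder, not part of the original sample.
const logPattern = /^(\S+) (\S+) (\S+) \[([^\]]+)\] "(\S+) (\S+) (\S+)" (\d+) (\d+) "([^"]+)" "([^"]+)"$/;

const sample = '127.0.0.1 - - [17/Jan/2024:19:35:00 +0100] "GET / HTTP/1.1" 200 864 "-" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.106 Safari/537.36"';

const m = sample.match(logPattern);
if (!m) throw new Error("line did not match");

// Same field mapping as in the Logdy parser snippet.
const parsed = {
  ip: m[1],
  timestamp: m[4],
  method: m[5],
  path: m[6],
  protocol: m[7],
  status: parseInt(m[8]),
  bytesSent: parseInt(m[9]),
  referer: m[10],
  userAgent: m[11],
};

console.log(parsed.method, parsed.path, parsed.status); // GET / 200
```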

Display columns and filters

Logdy makes parsing and column selection a breeze. Use the built-in "autogenerate" feature to generate columns from the JSON object present in each message, then make any adjustments and customizations. Based on the columns you can also define facets, or use another great feature to generate those automatically.

With a JSON object in place, you can use auto-generated columns together with faceted columns.
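Conceptually, a faceted column is just the set of distinct values that column takes across your log lines, with a count per value. A minimal sketch over parsed log objects (the sample rows and the facet helper are hypothetical, for illustration only):

```javascript
// Count the distinct values of one column across parsed log objects --
// roughly the summary a faceted column presents in the UI.
// The rows below are made-up examples of parsed access-log lines.
const rows = [
  { method: "GET", status: 200 },
  { method: "GET", status: 404 },
  { method: "POST", status: 200 },
];

function facet(rows, column) {
  const counts = {};
  for (const row of rows) {
    const value = String(row[column]);
    counts[value] = (counts[value] || 0) + 1;
  }
  return counts;
}

console.log(facet(rows, "status")); // { '200': 2, '404': 1 }
```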

[Screenshot: the "Autogenerate columns" option in the Logdy UI]