Store Drupal logs on Amazon S3


Store Drupal logs on Amazon S3 via hook_watchdog, so you can offload heavy log tables from your Drupal database and read the entries back from S3 later.

To set this up in Drupal:

  • Implement the "hook_watchdog" hook, which lets modules route log events to custom destinations.
  • In our case, the custom destination is S3.
  • First, obtain access to AWS and the appropriate S3 bucket.
  • Store the S3 connection configuration details in Drupal.
  • Inside hook_watchdog, read the S3 auth configuration.
  • Create a connection to AWS S3; on success this returns an S3 client object.
  • Check whether a bucket exists to write the logs to; if not, create one on S3 for the logs (the sample code below simply skips writing when the bucket is missing).
  • Next, create a log directory and log file, and populate the file with the data passed to hook_watchdog.
  • Next, write to the S3 bucket with the parameters "putObject" expects.
  • Next, remove the log directory and file created along the way.
  • Check the bucket in the S3 console; you should see the logs.
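For the "store the connection configuration" step, one simple option in Drupal 7 is a single `variable_set()` object. The variable name `s3_config` and its properties below are assumptions chosen to match the sample hook later in this post; adjust them to your own setup.

```php
<?php
// Sketch: store the S3 connection details as one Drupal 7 variable.
// The name 's3_config' and its properties are assumptions that match
// the variable_get('s3_config') call in the sample hook below.
$s3_config = new stdClass();
$s3_config->s3_bucket = 'my-drupal-logs';  // hypothetical bucket name
$s3_config->s3_region = 'us-east-1';
$s3_config->s3_key    = 'YOUR_AWS_KEY';    // use your own credentials
$s3_config->s3_secret = 'YOUR_AWS_SECRET';
variable_set('s3_config', $s3_config);
```

In production you would more likely keep the credentials in `settings.php` or an IAM instance role rather than in the variables table.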

Here's the code that writes log entries via watchdog():

watchdog(
    'module_2',
    'Info: Successfully completed: @data',
    array('@data' => print_r($data, TRUE)),
    WATCHDOG_INFO
);
watchdog(
    'module_3',
    'Error: Issue while processing @data: @message',
    array(
        '@data' => print_r($data, TRUE),
        '@message' => $exception->getMessage(),
    ),
    WATCHDOG_ERROR
);

And here is the code that connects to S3 and writes the logs:

<?php

use Aws\S3\S3Client;

function audit_watchdog(array $log_entry) {
    // Set the modules which needs to be logged to S3
    $modules = array(
        'module_1',
        'module_2',
        'module_3',
    );
    if (in_array($log_entry['type'], $modules)) {
        if (!empty($log_entry)) {
            // Get the S3 config details
            $s3_config = variable_get("s3_config");
            $s3_bucket = $s3_config->s3_bucket;
            $s3_region = $s3_config->s3_region;
            $s3_key = $s3_config->s3_key;
            $s3_secret = $s3_config->s3_secret;
            try {
                if (!empty($s3_bucket) && !empty($s3_region) && !empty($s3_key) && !empty($s3_secret)) {
                    // Create AWS connection
                    $s3 = create_aws_connection($s3_region, $s3_key, $s3_secret);
                    if (is_object($s3)) {
                        if ($s3->doesBucketExist($s3_bucket)) {
                            $log_path = 'public://logs/to_aws';
                            // Create log directory
                            $dir_path = create_aws_log_directory($log_path);
                            // Create log file
                            $file_path = create_aws_log_file($log_entry, $dir_path);
                            // Get the name of log file
                            $keyname = get_keyname_for_log_file($log_entry['severity'], $file_path);
                            // Store the log file to S3
                            $s3->putObject([
                                'Bucket' => $s3_bucket,
                                'Key' => $keyname,
                                // Note: pass either 'Body' or 'SourceFile',
                                // not both; an empty 'Body' would win and
                                // upload an empty object.
                                'SourceFile' => $file_path,
                            ]);
                            // Remove the directory & file created
                            if (is_dir($dir_path)) {
                                rmdir_recursive($dir_path);
                                rmdir_recursive('public://logs');
                            }
                        }
                    }
                }
            }
            catch (Exception $e) {
                // Swallow the exception so a logging failure can never break
                // page delivery. Optionally report it somewhere cheap, e.g.
                // error_log($e->getMessage()).
            }
        }
    }
}

function create_aws_connection($s3_region, $s3_key, $s3_secret) {
    $s3 = new S3Client([
        'version' => 'latest',
        'region'  => $s3_region,
        'credentials' => [
            'key'    => $s3_key,
            'secret' => $s3_secret,
        ]
    ]);
    $buckets = $s3->listBuckets()->get('Buckets');
    if (!empty($buckets)) {
        return $s3;
    }
    return FALSE;
}

function create_aws_log_directory($log_path) {
    if (!is_dir($log_path)) {
        mkdir($log_path, 0777, true);
        chmod($log_path, 0777);
    }
    return $log_path;
}

function create_aws_log_file($log_entry, $dir_path) {
    $content = json_encode($log_entry);
    // Timestamp plus microseconds keeps file names unique within a second.
    $microseconds = preg_replace("/^.*\./", "", (string) microtime(TRUE));
    $log_file_name = $log_entry['type'] . '-' . date("Y-m-d-H-i-s") . '-' . $microseconds . '.log';
    $file_path = $dir_path . '/' . $log_file_name;
    file_put_contents($file_path, $content);
    chmod($file_path, 0777);
    return $file_path;
}

function get_keyname_for_log_file($severity, $file_path) {
    // Drupal 7 severity levels follow RFC 3164:
    // WATCHDOG_ERROR = 3, WATCHDOG_INFO = 6 (not 0 and 1).
    $watchdog_array = array(
        WATCHDOG_ERROR => "WATCHDOG_ERROR",
        WATCHDOG_INFO  => "WATCHDOG_INFO",
    );
    $label = isset($watchdog_array[$severity]) ? $watchdog_array[$severity] : 'WATCHDOG_OTHER';
    return 'drupal-logs/' . $label . '/' . basename($file_path);
}

function rmdir_recursive($dir) {
    foreach(scandir($dir) as $file) {
        if ('.' === $file || '..' === $file) continue;
        if (is_dir("$dir/$file")) rmdir_recursive("$dir/$file");
        else unlink("$dir/$file");
    }
    rmdir($dir);
}

Advantages of keeping logs on S3

  • Fewer dblog entries in the Drupal database.
  • The audit trail lives entirely outside Drupal, so the production instance stays lean.

Cheers :)
