
Archive

The log files processed by EventLog Analyzer are archived periodically for internal, forensic, and compliance audits. You can configure the following as per your requirements:

  • Archiving interval
  • Type of logs that need to be archived
  • Storage location of the archived files
  • Retention period

The archived files can be encrypted and time-stamped to make them secure and tamper-proof.

How to view archived logs?

To view your archived log data, go to the Settings tab in EventLog Analyzer and navigate to Admin Settings > Data Storage > Archives.


The Archived Logs page contains the following information:

  • Device - List of devices from which the logs are collected.
  • Format - The device type.
  • From and To - The time period during which the logs were collected and archived by EventLog Analyzer.
  • Size - Size of the archived log data collected from each device.
  • Integrity - The integrity of the archived files, whether they are intact or have been tampered with, is denoted by the following states:
    1. Verified - The archived logs are intact.
    2. Archive file is missing - The flat file was not found during the compression/zipping process.
    3. Archive file not found - The archived file is not available in the location where it was originally stored, as recorded in the DB.
    4. Archive file is tampered - The original archive file has been edited or partly deleted externally.

    Note: If a file has been deleted or tampered with, an email notification containing the message "Archive file is tampered" will be sent immediately.

    5. Archive file available - When the archive integrity check is disabled, both verified and tampered files carry this status.
    6. Archive file not available - When the archive integrity check is disabled and the archive file is either missing or not found in the original location, this status is shown.
    7. Access Denied - The remote location where the archives are stored cannot be accessed.
    8. Connectivity failed - The Amazon S3 bucket where the archives are stored is not accessible due to a network connectivity issue.
  • The status of the archival is indicated by the following five states:
    1. Loaded - The archived files have already been loaded into the database. Click View to view the file.
    2. Data already available - The archive file is in the Elasticsearch database.
    3. Data partially available - Some of the archive data is in the Elasticsearch database.
    4. Not Loaded - The archive file is not in the Elasticsearch database.
    5. File yet to be uploaded - The archive file has not been transferred to the specified zip location.
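The integrity states above come from EventLog Analyzer's own periodic checks. Conceptually, such a check compares a digest recorded at archiving time against the file on disk. The sketch below illustrates the idea with SHA-256; it is an assumption for illustration, not the product's actual mechanism:

```python
import hashlib

def sha256_of(path):
    """Hash a file in chunks so large archives need not fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def integrity_status(path, recorded_digest):
    """Map a digest comparison onto the statuses described above."""
    try:
        actual = sha256_of(path)
    except FileNotFoundError:
        return "Archive file not found"
    return "Verified" if actual == recorded_digest else "Archive file is tampered"
```

The digest would be recorded when the zip is created and re-checked on each integrity run; any byte-level edit to the file changes the hash and flips the status.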

How to view a specific archival file?

  • To view a specific archival file, click on the check box corresponding to Device.
  • To view the log files that were archived during a specific time, click on the calendar icon in the top right corner of the page and select the desired period.

How to filter and view a set of archive files?

To view files based on the size or status of the archive data, click on the filter icon next to Size or Status and set the appropriate values. The files will be filtered based on the given values.


How to sort the list of archive files?

Click the drop-down icon next to Device/From/To to sort the list in ascending order based on the respective column values. Click again to sort the list in descending order.


How to load archive files?

To load your archived files, go to the Settings page in EventLog Analyzer and navigate to Admin Settings > Data Storage > Archives.

  1. Check the status of the archived file corresponding to the device. If it shows Not Loaded, click on the Load Archive button to load the files to the database.
  2. Once the status of the file changes to Loaded, click on the corresponding View button to view the files.

Note: Archives stored in shared storage and S3 buckets will be downloaded to local storage and loading will be initiated.


Note: To unload a file, select the file and click on the Unload Archive button.


Note: If the status of the file says Data partially available and you proceed to load the archive, the data may be duplicated.


How to delete archive files?

To delete your archived files, go to the Settings page in EventLog Analyzer and navigate to Admin Settings > Data Storage > Archives.

  1. Select the archived file(s) by selecting the respective check box(es).
  2. Delete the archived file(s) by clicking on the Delete icon.
Note:

Deleting a device from the Manage Devices page removes the corresponding archive entries from the UI, but the physical files remain stored in case you need to retrieve them in the future. If these archives are no longer needed, delete the files manually to free up space.

Archives in yet-to-be-uploaded status can also be deleted. This deletes the file stored in the local temp location.
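Manual cleanup of leftover archive files can be scripted. A minimal sketch is shown below; the 90-day cutoff is an assumption, and you would point the function at your actual archive location and match the age to your own retention policy:

```python
import time
from pathlib import Path

def purge_old_zips(archive_dir, max_age_days=90):
    """Delete .zip files whose modification time is older than the cutoff.

    archive_dir and max_age_days are placeholders; adjust them to your
    environment before using anything like this against real archives.
    """
    cutoff = time.time() - max_age_days * 86400
    removed = []
    for zip_path in Path(archive_dir).glob("*.zip"):
        if zip_path.stat().st_mtime < cutoff:
            zip_path.unlink()
            removed.append(zip_path.name)
    return removed
```

Returning the list of removed names makes it easy to log what was deleted during each cleanup run.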


How to configure group-based/device-based archive settings?

To configure archival settings, click on Settings in the top right corner of the screen.

Configure the archive interval, retention period, encryption, and time-stamping of the archive files, as well as the location to save the archive files and the index files.

Note: Archiving and database storage are asynchronous operations. These operations are unrelated.


Configure Cloud Account:

  1. Ensure that archiving is enabled. It is enabled by default; use the toggle button to disable archiving.
  2. Enter the archive retention period for the archived files. The default period is forever.
  3. Logs can be archived in two formats - Raw Logs with Parsed Fields and Raw Logs. Logs are stored with metadata when you select the former, and without metadata for the latter.
  4. Note: Raw Logs require less storage space, but only basic reports can be generated from this data.

  5. Enter the flat file storage location in the Temp File Location field. Click Verify to validate the location.

    Note: By default, the location is set to local for optimal performance.

  6. Enter the storage location for the archived files in the Archive Zip Location field.
    1. Local - To store archives in a local location, choose Local from the dropdown and enter the storage location.
    2. Shared - To store archives in a shared location, select Shared from the dropdown menu and enter the storage location.
      • Select the authentication check box to enter the credentials required to access the shared location. Unchecking it makes the path accessible to everyone with share access.
      • Ensure that the remote machine is available and has sufficient read and write permissions for the share.
    3. S3 Bucket - To store archives in an S3 bucket, choose S3 Bucket from the dropdown and enter the folder name. By default, the folder name is "AwsArchive".
      1. Cloud account - Displays the configured cloud accounts. Select the required cloud account from the dropdown. To configure a cloud account, click "Configure Cloud Account" - this configures an AWS cloud account without a cloud trial. To configure a cloud account with a cloud trial, refer to this document.
      2. Note: Only cloud accounts configured through Domain and Accounts or from Archives are listed here.

      3. Buckets - Displays the configured buckets associated with the selected cloud account. Select the required bucket from the dropdown. Ensure that your bucket has sufficient permissions. To create a new bucket, enter the bucket name in the input field and click the "+" icon. By default, the bucket type is General Purpose and it is created in the home region. Ensure that the bucket name follows the S3 naming convention rules. If the archives stored in these buckets are to be encrypted, ensure that the buckets have the "Bucket Key Enabled" permission. By default, KMS keys are disabled when a bucket is created. To provide the permission, navigate to Amazon S3 → Buckets → (Bucket name) → Properties → Default Encryption → Bucket Key.
      4. Storage Type - Displays the S3 storage classes. The default storage type is S3 Standard, which comes at no additional cost. Other storage classes are priced based on the amount of storage used. See AWS Storage Pricing and Permissions.
      5. S3 Encryption - Displays the encryption options for the bucket. The default encryption is Amazon S3 managed keys (SSE-S3), the base level of encryption.
      6. AWS Key Management Service (AWS KMS) and dual-layer server-side encryption with AWS KMS keys (DSSE-KMS) can be configured if the respective KMS keys are already present. If no keys are available, you can generate a KMS key in the AWS console.

        Creating a KMS Key:

        1. AWS console → Services → All Services → Key Management Service → Create key.

        Permissions:

        • To list the configured KMS keys for your cloud account, provide the "kms:ListKeys" policy permission to your IAM user.
        • To encrypt the archives using a KMS key, provide the "kms:GenerateDataKey" policy permission to your KMS key (key ARN). Refer to this document for KMS key permissions.

        Pricing :

There are no additional charges for using default encryption for S3 buckets.

        For SSE-KMS and DSSE-KMS, AWS KMS charges apply and are listed at AWS KMS pricing.

  7. To secure the archival logs, enable flat file encryption. By default, it is disabled.
  8. Enter the log retention period for the loaded archive files. The default period is 7 days.
  9. Click on Advanced and fill in the following fields:
    1. Choose the time interval for file creation. The logs will be written to flat files at the specified time period.
    2. Note: The default interval is 8 hours.

    3. Choose the required time interval for creating a zip file. The flat files are compressed (at roughly a 40:1 ratio) and zip files are created at the specified interval.
    4. Note: The default interval is 1 day.

    5. Enable Archive Timestamping if required. By default, it is disabled.
    6. The Periodic Archive Integrity Check is enabled by default.
    7. Note: The default interval is 1 day.

  10. Save the settings and close the window. For instant archiving, click the Zip Now button next to the Zip Creation Interval field.
  11. Note: Files will be zipped locally and then transferred to the destination location, so additional disk space will be required. Ensure that the system has sufficient storage. To determine the disk space required in the local storage, calculate it using the tuning guide.

    Configure multiple archive settings by clicking on Create New Policy in the top right corner.
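The disk-space note above can be turned into a quick back-of-the-envelope estimate: before transfer, both the flat files and the resulting zip exist locally. A minimal sketch using the ~40:1 compression ratio quoted earlier (the 80 GB figure is purely illustrative):

```python
def required_local_space_gb(flat_file_gb, compression_ratio=40):
    """Estimate peak local disk usage during zipping.

    Flat files and the zip coexist locally until the transfer completes,
    so budget for the sum. The ~40:1 ratio is the figure quoted in the
    zip-interval step above; your actual ratio depends on log content.
    """
    zip_gb = flat_file_gb / compression_ratio
    return flat_file_gb + zip_gb

print(required_local_space_gb(80))  # 82.0 (80 GB flat + 2 GB zipped)
```

For precise sizing, use the tuning guide mentioned in the note; this sketch only shows the shape of the calculation.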


Additional configuration - Select the devices/groups for which the policy will be applied.

How to view configured Policy ?

Click on Settings at the top right corner of the screen. This will lead to the Archive Settings page which contains all the configured policies.

  • Policy Name - Specifies the name of the policy.
  • Archive Location - Shows the zip location of the policy.
  • Devices/Groups - Shows all the devices and groups added in the policy.
  • Size - Total size of archive of all the devices/groups added in the policy.
  • Retention period - Log retention period of the policy.
  • Status - Shows the status of the archival. The status will either be Success or Archiving Disabled.

Hover over a policy and click Edit to modify the configured settings.


You can also add a new policy by clicking the Create New Policy button in the top right corner of the Archive Settings page.


How to edit the priority of the policies?

To change the priority of the policies, click on Priority Policy, rearrange the policies by dragging and dropping them, and save.

Note: If a device/group has been added under multiple policies, the archive settings of the policy with the highest priority will be applied to that particular device/group.


How to check which policy applies to a specific device?

In the Settings tab of EventLog Analyzer, navigate to Admin Settings > Data Storage > Archives > Settings > Archive Summary.

  • Device - Shows the list of devices that are added in one or more policies.
  • Effective Policy Applied - Shows the policy that is applied to that particular device.
  • Location - Shows the zip location of the policy.
  • Total size - Shows the total size of archives for that particular device.
  • Size in location - Shows the size of the device archives collected under that specific policy.

How to check the server status and storage occupied?

Storage Summary:

In the Settings tab of EventLog Analyzer, navigate to Admin Settings > Data Storage > Archives > Archive Summary > Storage Based.

  • Server Name - Displays the list of servers configured in archive policies and archives.
  • No. of Devices - Displays the number of devices associated with the server.
  • No. of Archives - Displays the number of archives present for the respective server.
  • Size - Displays the total size of the archives in the server.
  • Last Modified Time - Displays the time the server's credentials were last updated.
  • Status - Displays the reachability status of the server:
    • Access Denied - The server is unavailable or does not have the necessary permissions.
    • Connectivity failed - The cloud is not reachable due to an internet disconnection.
  • Edit icon (Update Credentials) - Update the new credentials in case of a password change for the server (applicable only for shared servers).

Update the username and password for the server.


Archive troubleshooting cases

  1. Update path
    • Go to Settings > Admin Settings > Data Storage > Archives > More (in the top right corner) > Update path.
    • Select the old archive location in the dropdown, enter the new location where the archives were moved or are present in the Archives moved location field, and click Update.
    • Note: Update path is applicable only for local and shared locations.

  2. Update archive file integrity
    • Go to Settings > Admin Settings > Data Storage > Archives > More > Update path.
    • Click the refresh button in the top right corner to update the integrity status of the files.

    The File not Found status will change to Verified if the file is present in the directory specified in the DB. This will also change the status of tampered files to Verified.

    Integrity statuses like Access denied and Connectivity failed will be updated to Verified if the file is present in the respective location.

  3. To add archives in the DB
    • Go to Settings > Admin Settings > Data Storage > Archives > More > Add Archive Entries.
    • Enter the location where the archives are present. If needed, select Device and add the archives of a particular device.
    • Note: Add Archive Entries is applicable only for local and shared locations.

  4. If ES data is lost or corrupted
    • Go to Settings > Admin Settings > Data Storage > Archives > More > Rebuild Indexes.
    • Select the date range and the device for which the logs need to be indexed in ES from the archives. Click Rebuild.

Centralized Archiving:

If centralized archiving is enabled in the Admin Server, the S3 bucket option will not be displayed for new policies in the Managed Server. For existing policies configured with an S3 bucket, file transfers will continue to upload to S3 locations. Files will be downloaded locally and then transferred to the Admin Server. Ensure that you have sufficient storage, and change the location to local when centralized archiving is enabled, for optimal performance.


Steps to move EventLog Analyzer's Elasticsearch indices to a new location

Note:

ES\repo folder contains temporary files for ES archives

ES\data folder contains data

ES\archive folder contains ES archives

ES\repo, ES\data and ES\archive should never point to the same folder

Examples:

For a remote network path, use the following format:

  • path.data : ["//remote machine name/shared folder/data"]
  • path.repo : ["//remote machine name/shared folder/repo"]

For Windows local storage, use the following format:

  • path.data : ["C:\\ManageEngine\\EventLog Analyzer\\ES\\data"]
  • path.repo : ["C:\\ManageEngine\\EventLog Analyzer\\ES\\repo"]

For Linux local storage, use the following format:

  • path.data : ["/opt/ManageEngine/EventLog Analyzer/ES/data"]
  • path.repo : ["/opt/ManageEngine/EventLog Analyzer/ES/repo"]

Case 1: EventLog Analyzer as a standalone setup (Not integrated with Log360)

  1. Shut down EventLog Analyzer.
  2. Navigate to <Eventlog home>\ES\config\elasticsearch.yml, update path.data to include the new location and save the file.
  3. Move the files from <ManageEngine>\<Eventlog>\ES\data folder to the new location.
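The steps above amount to rewriting `path.data` in `elasticsearch.yml` and relocating the contents of the data folder. The sketch below demonstrates the mechanics in a scratch directory with placeholder paths; it is not a script to run against a real installation:

```python
import shutil
import tempfile
from pathlib import Path

# Scratch layout standing in for <Eventlog home> (placeholder, not a real install).
home = Path(tempfile.mkdtemp())
old_data = home / "ES" / "data"
new_data = home / "new_es_data"
config = home / "ES" / "config" / "elasticsearch.yml"
config.parent.mkdir(parents=True)
old_data.mkdir(parents=True)
new_data.mkdir()

config.write_text(f'path.data: ["{old_data}"]\n')
(old_data / "segment_0").write_bytes(b"")  # simulated index file

# Step 2: point path.data at the new location and save the file.
config.write_text(config.read_text().replace(str(old_data), str(new_data)))

# Step 3: move the contents of the old data folder to the new location.
for item in old_data.iterdir():
    shutil.move(str(item), str(new_data / item.name))
```

On a real system the product must be shut down first (step 1), and on Windows the backslashes in the YAML path need to be escaped as shown in the examples above.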

Case 2: EventLog Analyzer is integrated into Log360 and is installed with Log360 installer (Bundled):

In this case, EventLog Analyzer uses a common ES that is shared with other modules.

Note: With Log360, the integrated module has only one ES, which can be located in the Admin > Administration > Search Engine Management page. Click Details to see that it runs from the <ManageEngine>\elasticsearch\ES folder.

  1. Shut down EventLog Analyzer and Log360.
  2. Shut down the common ES.
    1. Open Command Prompt as the Administrator in <ManageEngine>\elasticsearch\ES\bin
    2. Run stopES.bat
  3. Navigate to <ManageEngine>\elasticsearch\ES\config\elasticsearch.yml, update path.data to include the new location and save the file.
  4. Move the files from <ManageEngine>\elasticsearch\ES\data folder to the new location.

Case 3: EventLog Analyzer is manually integrated into Log360:

In this case, EventLog Analyzer uses both its existing local ES (from before the integration) and the common ES (after the integration with Log360).

Note: By default, the integrated module has two ES instances, which can be located in the Admin > Administration > Search Engine Management page. Click Details to see that one runs from the EventLog Analyzer <Eventlog home>\ES folder and the other from the <ManageEngine>\elasticsearch\ES folder.

  1. Shut down EventLog Analyzer and Log360.
  2. Shut down the common ES.
    1. Open Command Prompt as the Administrator in <ManageEngine>\elasticsearch\ES\bin
    2. Run stopES.bat
  3. Navigate to <ManageEngine>\elasticsearch\ES\config\elasticsearch.yml, update path.data to include the new location and save the file.
  4. Move the files from <ManageEngine>\elasticsearch\ES\data folder to the new location.
  5. Navigate to <ManageEngine>\<Eventlog>\ES\config\elasticsearch.yml, update path.data to include the new location (different from the one given for common ES) and save the file.
  6. Move the files from <ManageEngine>\<Eventlog>\ES\data folder to the new location.

Steps to move EventLog Analyzer's Elasticsearch data to a new location

Note:

ES\repo folder contains temporary files for ES archives

ES\data folder contains data

ES\archive folder contains ES archives

ES\repo, ES\data and ES\archive should never point to the same folder

Examples:

For a remote network path, use the following format:

  • path.data : ["//remote machine name/shared folder/data"]
  • path.repo : ["//remote machine name/shared folder/repo"]

For Windows local storage, use the following format:

  • path.data : ["C:\\ManageEngine\\EventLog Analyzer\\ES\\data"]
  • path.repo : ["C:\\ManageEngine\\EventLog Analyzer\\ES\\repo"]

For Linux local storage, use the following format:

  • path.data : ["/opt/ManageEngine/EventLog Analyzer/ES/data"]
  • path.repo : ["/opt/ManageEngine/EventLog Analyzer/ES/repo"]

Case 1: EventLog Analyzer as a standalone setup (Not integrated with Log360)

  1. Shut down EventLog Analyzer.
  2. Navigate to <Eventlog home>\ES\config\elasticsearch.yml, update path.data to include the new data location and save the file.
  3. In <Eventlog home>\ES\config\elasticsearch.yml, update path.repo to include the new repository location (parallel to data directory) and save the file.
  4. Move the files from <ManageEngine>\<Eventlog>\ES\data folder to the new location.
  5. Create a folder with the name archive (parallel to the new data directory).
  6. Move the files from <ManageEngine>\<Eventlog>\ES\archive folder to the new folder named archive.

Case 2: EventLog Analyzer is integrated into Log360 and is installed with Log360 installer (Bundled):

In this case, EventLog Analyzer uses a common ES that is shared with other modules.

Note: With Log360, the integrated module has only one ES, which can be located in the Admin > Administration > Search Engine Management page. Click Details to see that it runs from the <ManageEngine>\elasticsearch\ES folder.

  1. Shut down EventLog Analyzer and Log360.
  2. Shut down the common ES.
    1. Open Command Prompt as the Administrator in <ManageEngine>\elasticsearch\ES\bin
    2. Run stopES.bat
  3. Navigate to <ManageEngine>\elasticsearch\ES\config\elasticsearch.yml, update path.data to include the new data location and save the file.
  4. Also update path.data in <Eventlog home>\ES\config\elasticsearch.yml to include the new data location (same data location as mentioned in step 3).
  5. Update path.repo in <ManageEngine>\elasticsearch\ES\config\elasticsearch.yml to the new repository location (parallel to the new data path).
  6. Update path.repo in <Eventlog home>\ES\config\elasticsearch.yml to the new repository location (same repository location as mentioned in step 5).
  7. Move the files from <ManageEngine>\elasticsearch\ES\data to the new location.
  8. Create a folder with the name archive (parallel to the new data directory).
  9. Move the files from <ManageEngine>\<Eventlog>\ES\archive folder to the new folder named archive.

Case 3: EventLog Analyzer is manually integrated into Log360:

In this case, EventLog Analyzer uses both its existing local ES (from before the integration) and the common ES (after the integration with Log360).

Note: By default, the integrated module has two ES instances, which can be located in the Admin > Administration > Search Engine Management page. Click Details to see that one runs from the EventLog Analyzer <Eventlog home>\ES folder and the other from the <ManageEngine>\elasticsearch\ES folder.

  1. Shut down EventLog Analyzer and Log360.
  2. Shut down the common ES.
  3. Open Command Prompt as the Administrator in <ManageEngine>\elasticsearch\ES\bin
  4. Run stopES.bat
I. Change in common ES
  1. Navigate to <ManageEngine>\elasticsearch\ES\config\elasticsearch.yml, update path.data to include the new location and save the file.
  2. Update path.repo in <ManageEngine>\elasticsearch\ES\config\elasticsearch.yml to include the new repository location (parallel to path.data).
  3. Move the files from <ManageEngine>\elasticsearch\ES\data to the new location.
II. Change in local ES (the path here should be different from the one given for common ES)
  1. Navigate to <ManageEngine>\<Eventlog>\ES\config\elasticsearch.yml, update path.data to include the new location (this should be different from the one given for common ES) and save the file.
  2. Update path.repo in <ManageEngine>\<Eventlog home>\ES\config\elasticsearch.yml to the same repository location as that of common ES.
  3. Create a folder with the name archive (parallel to the new data directory).
  4. Move the files from <ManageEngine>\<Eventlog>\ES\data to the new location.
  5. Move the files from <ManageEngine>\<Eventlog>\ES\archive folder to the new folder named archive.
Note: If you wish to set a dynamic key for encrypting the archive files, follow these steps:
  1. Go to the archive location. By default, files are archived at <EventLog Analyzer Home>\archive. Create a file named EncryptedKey.enc.
  2. Open the file in a text editor and enter the dynamic key as text. The key must be exactly 16 characters long.
  3. Restart the EventLog Analyzer service.
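Steps 1 and 2 can be sketched as below. The directory here is a scratch stand-in for <EventLog Analyzer Home>\archive, and the key is an arbitrary example; the restart in step 3 remains a manual action:

```python
import tempfile
from pathlib import Path

# Placeholder for <EventLog Analyzer Home>\archive (a scratch directory for the demo).
archive_dir = Path(tempfile.mkdtemp())

key = "A1b2C3d4E5f6G7h8"  # example dynamic key; must be exactly 16 characters
assert len(key) == 16, "dynamic key must be exactly 16 characters"

# Write the key as plain text into EncryptedKey.enc in the archive location.
(archive_dir / "EncryptedKey.enc").write_text(key)
```

Keep a copy of this key: the same EncryptedKey.enc file is needed to import these archives into another installation, as described below.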

If you wish to import the files archived using the above dynamic key in another installation of EventLog Analyzer, follow these steps first:

  1. Paste the EncryptedKey.enc file into the archive location of the installed product.
  2. Restart the product.
  3. Import the required archive files.

Copyright © 2020, ZOHO Corp. All Rights Reserved.
