How to Collect Data for HCP for Cloud Scale

Objective

This article covers data collection from HCP for Cloud Scale. If you experience an issue with your HCP for Cloud Scale solution, use the procedures in this section to collect the data that GSC needs to analyze and resolve your problem.

Environment

  • HCP for Cloud Scale (HCP-CS)
    • Version 2.6.x and earlier

Procedure

Collect diagnostic information for HCP for Cloud Scale with the log_download tool, which is located at this path on each HCP-CS instance:

/<hcpcs-installation-path>/bin/log_download

For instance, with the default HCP-CS installation directory, the tool is located at:

/opt/hcpcs/bin/log_download

Usage:

  • Gather logs for the past 24 hours:
    /<hcpcs-installation-path>/bin/log_download -d -l
  • Gather logs for a specific timeframe with the -t option (a worked example follows this list):
    /<hcpcs-installation-path>/bin/log_download -d -l -t yyyy-MM-dd,yyyy-MM-dd
  • Get information on running the tool:
    /<hcpcs-installation-path>/bin/log_download -h
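
For example, assuming the default installation path, the following command collects logs for the first week of January 2024 (the dates are placeholders; substitute your own window in yyyy-MM-dd format):

/opt/hcpcs/bin/log_download -d -l -t 2024-01-01,2024-01-07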

When you run the log_download script with the --output option, do not specify an output path that contains colons, spaces, or symbolic links. If you omit --output, the same restriction applies to the directory you run the script from: it must not contain colons, spaces, or symbolic links.
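
As a minimal sketch of a safe invocation, assuming the default installation path and that --output takes the destination directory as its argument (/var/tmp/hcpcs-logs is a placeholder path with no colons, spaces, or symbolic links):

/opt/hcpcs/bin/log_download -d -l --output /var/tmp/hcpcs-logs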

When you run the log_download script, all log files are automatically compressed and moved to the retired/ directory.
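
To verify the collected archives, you can locate the retired/ directory; this article does not state its exact location, so the command below simply searches under the default installation path and lists the contents of any match:

find /opt/hcpcs -type d -name retired -exec ls -lh {} \;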

Additional Notes

If an HCP-CS instance is down, you need to specify the --offline option to collect the logs from that instance. If your whole system is down, you need to run the log_download script with the --offline option on each instance, as in the sketch below.
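
A minimal sketch of offline collection across every instance, assuming the default installation path, that the hosts are still reachable over SSH, and placeholder hostnames (replace hcpcs-node1 through hcpcs-node3 with your instances' addresses):

for host in hcpcs-node1 hcpcs-node2 hcpcs-node3; do
    ssh root@"$host" /opt/hcpcs/bin/log_download --offline
done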
