Overview

Mission Control 1.x used InfluxDB to store historical data on Artifactory's usage and storage. Starting from version 2.0, Mission Control will use a new Elasticsearch database to store this data.

After you complete your Mission Control upgrade to version 2.0, you can start migrating data from InfluxDB to Elasticsearch.

Migration is Optional

If you're not interested in your historical data, there is no need to migrate it. Mission Control will start collecting data using Elasticsearch from version 2.0.

After the migration is complete, InfluxDB remains available for use if you require it.

The migration script requires Gawk version 3 or above. For more information, refer to the Gawk user guide.

Getting Started

Once the migration is triggered, the following script execution process takes place:

  1. Historical data is exported from InfluxDB to CSV files.
    The script sequentially exports data from the hour_data_policy, day_data_policy, week_data_policy, and year_data_policy tables in the database. Any issues with the data export are reported during this step (a quick way to inspect the source database beforehand is sketched after this list).
  2. The CSV files are converted and imported into Elasticsearch.
    This step is interactive and asks for user inputs.
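If you want to confirm what the export step will find, the InfluxDB command line client can list the databases and retention policies before the migration is triggered. This is only an illustrative check, assuming the influx 1.x CLI is available on the host and InfluxDB listens on its default port (8086):

influx -execute 'SHOW DATABASES'
influx -database 'mission_control' -execute 'SHOW RETENTION POLICIES'

The mission_control database and the policies listed above should appear in the output; if they do not, see the Troubleshooting section.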

Script User Inputs

The migration script will prompt you to enter the following parameters:

Prompt: Provide a working directory for writing the logs folder <./influxmigration/>.
Data required: A working directory to store logs and other data files. Press Enter to use the default ./influxmigration/ folder.

Prompt: Provide the baseurl for Elasticsearch <http://elasticsearch:9200/>.
Data required: The URL endpoint for Elasticsearch. For a Docker-based installation, press Enter to use the default value (http://elasticsearch:9200/). For Debian and RPM installations, provide the URL at which Elasticsearch is reachable; usually this is on the localhost (http://localhost:9200/).

Prompt: Provide the username for authentication with Elasticsearch.
Data required: The Elasticsearch username for authentication. If no authentication was configured in Elasticsearch, press Enter to skip this step.

Prompt: Provide the password for authentication with Elasticsearch.
Data required: The Elasticsearch password. This prompt appears only if a username was provided in the previous step.

Prompt: Provide the full path to the CSV location for conversion <./influxmigration/csv/>.
Data required: The path from which to pick up the CSV files. Press Enter to use the default path, or provide the correct path if you changed the working directory earlier.

Once all inputs have been provided, the script summarizes the information and prompts for confirmation before the import action is triggered.
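You can verify in advance that the Elasticsearch endpoint you plan to enter is reachable. This is only a suggested check; replace the URL and credentials with your own values, and drop the -u option if authentication is not enabled:

curl -s -u <username>:<password> "http://elasticsearch:9200/"

A JSON response containing the cluster name indicates the endpoint is reachable.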

Troubleshooting

If you encounter errors during execution, refer to the Troubleshooting section below.

Migration Folder Structure

Intermediate files and logs are stored in the following locations:

Migration logs: <jfmc_installation_folder>/influxmigration/logs
Exported CSV data: <jfmc_installation_folder>/influxmigration/csv
JSON chunks: <jfmc_installation_folder>/influxmigration/data
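If you need to inspect these intermediate files after a run, for example to verify that CSV files were actually produced, listing the working directory is enough (shown here with the default locations):

cd <jfmc_installation_folder>
ls influxmigration/logs influxmigration/csv influxmigration/data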

Triggering the Migration

Migrating historical data from InfluxDB to Elasticsearch can be triggered using the migration commands below.

During the migration process, you will need to provide the input parameters as described under Script User Inputs.

Docker Installation

Navigate to the Mission Control installation folder and execute the migrate command:

cd <jfmc_installation_folder>
mission-control migrateToElastic

Debian and RPM Installation

Navigate to the Mission Control installation folder and execute the migrate command:

Note: The InfluxDB installation folder (<influxdb_installation_folder>) is usually located at /opt/jfrog/mission-control/influxdb/usr/bin.

cd <jfmc_installation_folder>
cp migration/* <influxdb_installation_folder>
cd <influxdb_installation_folder>
./migrateInfluxToElastic.sh
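Once the migration completes, you can confirm that the historical data reached Elasticsearch by listing its indices. This is only a suggested check; adjust the URL (and add credentials if required) to match the Elasticsearch base URL you entered during the migration:

curl -s "http://localhost:9200/_cat/indices?v"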

Troubleshooting

Detailed run logs are available in the configured logs folder. The log for the current execution is located at:
<jfmc_installation_folder>/influxmigration/logs/log.out

If the script stops executing without reporting any errors, check the log file for more details.
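To follow a run in real time, or to scan a completed run for errors, standard shell tools are sufficient (assuming the default working directory):

tail -f <jfmc_installation_folder>/influxmigration/logs/log.out
grep -i error <jfmc_installation_folder>/influxmigration/logs/log.out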

Common errors and resolutions

 ./influxmigration/exportInflux.sh: line 13: ./influx: No such file or directory
Cause
This happens in Debian/RPM installations when the script cannot find the InfluxDB executable used to export data to CSV format.
Resolution
Copy the migration scripts to the InfluxDB installation folder before executing them.
Usually this folder is located at:
/opt/jfrog/mission-control/influxdb/usr/bin
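If you are unsure where the InfluxDB executable is located on your host, one way to find it before copying the scripts (the path below is the usual default and may differ on your installation):

find /opt/jfrog/mission-control -name influx -type f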
 Cannot find mission_control DB when running the script
Cause
The InfluxDB data folder is missing.
Resolution

Issue the following command:

curl -G "http://localhost:8086/query?pretty=true" --data-urlencode "q=show databases"

This shows whether the mission_control database exists in InfluxDB. If it does not, it is likely that you have not copied the InfluxDB data directory.
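For reference, the response is a JSON document whose values list should include mission_control. The exact contents depend on your installation; an illustrative shape:

{
  "results": [
    {
      "series": [
        {
          "name": "databases",
          "columns": ["name"],
          "values": [["_internal"], ["mission_control"]]
        }
      ]
    }
  ]
}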

 ERROR: Could not find ndjson files to insert to es.
Cause
This happens when the export of data from InfluxDB to CSV files has failed for some reason.
Resolution
Check the log for earlier errors, if any, and re-execute the script.
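Before re-executing, you can check whether the export step produced any CSV files at all (assuming the default working directory):

ls -l ./influxmigration/csv/

If the folder is empty, resolve the export errors first, for example the missing InfluxDB executable or database described above.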
 awk: ./influxmigration/convertor.awk: line 116: illegal reference to array fn1
Cause
This happens when the installed Gawk version is too old.
Resolution

Install Gawk version 3 or higher.
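To check the currently installed version, and to install or upgrade Gawk with your distribution's package manager (typical commands; they may differ on your system):

gawk --version
sudo apt-get install gawk    # Debian/Ubuntu
sudo yum install gawk        # RHEL/CentOS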

 Found earlier attempts to run migration ...
Cause
This happens when the script detects an earlier run of the migration. More details are printed to standard output following this message.
Resolution
In such cases, a repeated run of the script creates an extra index in Elasticsearch containing the same time-series data.
Although this is undesirable, it does not affect data integrity in the database. Because the data inserted by repeated runs belongs to the same time series, the graphs in the UI are not affected.