Import & Export Flows
The Nexla CLI can export flow specifications to JSON files and later import those specifications to create new flows in the same or a different user account. This is useful for:
- Flow Migration: Moving flows between environments or accounts
- Flow Replication: Creating copies of existing flows with modifications
- Backup & Recovery: Creating backups of flow configurations
- Development & Testing: Testing flow configurations in different environments
Export a Flow
Use this method to export one or more flows originating from a data source. Each Nexla source can have multiple flow branches connected to it.
Export Options
- Automatic Export: Use the -a option to automatically export all branches
- Selective Export: Call without the -a option to select specific branches interactively
```
nexla flows export

usage: nexla flows export [--source SOURCE] [--output_file OUTPUT_FILE] [options]

description: Export flow specification

arguments:
  --source SOURCE, -s SOURCE
                        id of source to be exported
  --output_file OUTPUT_FILE, -o OUTPUT_FILE
                        name of output file to be exported

options:
  -a, --all             Export all the flows of source (by default without entering the pipeline ids)
```
Export Examples
Export flows for source 9311 to a local file:
```shell
# Export specific flows (interactive selection)
nexla flows export -s 9311 -o ~/Desktop/export_file.json

# Export all flows automatically
nexla flows export -s 9311 -o ~/Desktop/export_file.json -a
```
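Exports can also be scripted. The sketch below assembles the `nexla flows export` argument list for a batch of sources so it can be handed to a process runner; the source IDs and file paths are placeholders, and actually running the commands assumes the `nexla` CLI is installed and on your `PATH`.

```python
import shlex

def build_export_cmd(source_id, output_file, all_branches=True):
    """Assemble the `nexla flows export` argument list for one source."""
    cmd = ["nexla", "flows", "export", "-s", str(source_id), "-o", output_file]
    if all_branches:
        cmd.append("-a")  # export every branch without interactive selection
    return cmd

# Build commands for a batch of (hypothetical) source IDs.
for sid in (9311, 9505):
    cmd = build_export_cmd(sid, f"export_{sid}.json")
    print(shlex.join(cmd))
    # To execute: subprocess.run(cmd, check=True)
```

Keeping the command as a list (rather than a formatted string) avoids shell-quoting issues if file paths contain spaces.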
Export Response Examples
```
# Example 1: With the -a option to export all flow branches of a source
➜ nexla flows export -s 9505 -o ~/Desktop/export_9505.json -a
[2022-06-17 11:10:57 UTC] Getting all pipeline ids...
[2022-06-17 11:10:57 UTC] Found 2 pipelines, exporting them
[2022-06-17 11:10:58 UTC] Creating template for dataset, sink and datamap
[2022-06-17 11:10:58 UTC] Scanning pipeline 1
[2022-06-17 11:11:01 UTC] Scanning pipeline 2
[2022-06-17 11:11:06 UTC] Fetching source details
[2022-06-17 11:11:06 UTC] exporting json..

# Example 2: Without the -a option, the CLI waits for user input to select flow branches
➜ nexla flows export -s 9505 -o ~/Desktop/export_9505.json
+-------------+--------+-------------------------------+-------------------------------+-------------+
| pipeline_id | source | detected_dataset              | dataset_1                     | destination |
+-------------+--------+-------------------------------+-------------------------------+-------------+
| 1           | 9505   | 14325 (1 - nexla_test,PAUSED) |                               | 8102 (1234) |
+-------------+--------+-------------------------------+-------------------------------+-------------+
| 2           | 9505   | 14325 (1 - nexla_test,PAUSED) | 14450 (1 - nexla_test,PAUSED) |             |
+-------------+--------+-------------------------------+-------------------------------+-------------+
Enter pipeline ids : 1
[2022-06-16 08:14:04 UTC] Creating template for dataset, sink and datamap
[2022-06-16 08:14:04 UTC] Scanning pipeline 1
[2022-06-16 08:14:08 UTC] Fetching source details
[2022-06-16 08:14:10 UTC] exporting json..
```
Import a Flow
Use this method to import one or more flows from a previously exported JSON file. This is a quick way to create replicas of data flows with modifications applied as needed.
Import Command
```
nexla flows import

usage: nexla [-h] [--input_file INPUT_FILE] [--properties PROPERTIES]

Import Flows

optional arguments:
  -h, --help            show this help message and exit
  --input_file INPUT_FILE, -i INPUT_FILE
                        path of json file to be imported
  --properties PROPERTIES, -p PROPERTIES
                        path of properties json file
```
Import Scenarios
Some flow import scenarios, such as migrating flows across environments/accounts, often require additional input. For example, credentials may need to be created for and/or assigned to relevant sources & destinations. You can provide this input via:
- Properties File: Include a properties JSON file with the import
- Interactive Input: Provide input during the import process
Import with Properties File
The properties file lets you map the credentials referenced in the exported flow to credentials in the target account, so the import can run without interactive prompts.
```shell
nexla flows import -i ~/Desktop/export_9505.json -p ~/Desktop/export_9505_properties.json
```
```
[2022-06-17 12:33:50 UTC] Using credential 6952 from properties file
[2022-06-17 12:33:50 UTC] Creating source : nexla_test
[2022-06-17 12:33:52 UTC] Data Source created with ID: 11204
[2022-06-17 12:33:53 UTC] Creating Dataset : 1 - nexla_test
[2022-06-17 12:33:56 UTC] ID: 17284, Name: 1 - nexla_test
[2022-06-17 12:33:59 UTC] Created Dataset with ID, 17284 from dataset 17283
[2022-06-17 12:33:59 UTC] Created Dataset id ====> 17284
[2022-06-17 12:33:59 UTC] Parent dataset id for sink is ===> 17283
[2022-06-17 12:33:59 UTC] Creating Sink : 1234
[2022-06-17 12:34:02 UTC] Sink created with ID: 9535, and associated with dataset 17283
```
Import without Properties File
When importing a flow without a properties file, the CLI prompts you to choose the credentials to use for each source and destination in the flow.
```shell
nexla flows import -i ~/Desktop/export_9505.json
```
```
[2022-06-18 04:24:59 UTC] Credential Name given on Exported Pipeline : sk21
[2022-06-18 04:24:59 UTC] Available gdrive credentials
[2022-06-18 04:24:59 UTC] credential_id credential_name
[2022-06-18 04:24:59 UTC] 7041 sk21 (Copy) (Copy)
[2022-06-18 04:24:59 UTC] 7039 sk21 (Copy)
[2022-06-18 04:24:59 UTC] 6952 sk21
Enter credential_id : 6952
[2022-06-18 04:25:09 UTC] Creating source : nexla_test
[2022-06-18 04:25:11 UTC] Data Source created with ID: 11213
[2022-06-18 04:25:13 UTC] Creating Dataset : 1 - nexla_test
[2022-06-18 04:25:15 UTC] ID: 17290, Name: 1 - nexla_test
[2022-06-18 04:25:18 UTC] Created Dataset with ID, 17290 from dataset 17289
[2022-06-18 04:25:18 UTC] Created Dataset id ====> 17290
[2022-06-18 04:25:18 UTC] Parent dataset id for sink is ===> 17289
[2022-06-18 04:25:20 UTC] Credential Name given on Exported Pipeline : Abs_test
[2022-06-18 04:25:20 UTC] credential_id credential_name
[2022-06-18 04:25:20 UTC] 7040 Abs_test (Copy) (Copy)
[2022-06-18 04:25:20 UTC] 7038 Abs_test (Copy)
[2022-06-18 04:25:20 UTC] 6954 Abs_test
[2022-06-18 04:25:20 UTC] 6953 Azure Blob Storage_test
Enter credential_id : 6954
[2022-06-18 04:25:30 UTC] Creating Sink : 1234
[2022-06-18 04:25:32 UTC] Sink created with ID: 9546, and associated with dataset 17289
```
Best Practices
Following these best practices ensures successful flow import and export operations while maintaining data integrity and avoiding common pitfalls.
Export Best Practices
- Use Descriptive Names: Use meaningful names for export files
- Include All Branches: Use the -a flag when you want to export the complete flow
- Verify Source ID: Double-check the source ID before exporting
- Test Exports: Verify exported files contain expected data
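A quick way to act on the last point is to sanity-check an exported file before archiving or migrating it. The internal structure of Nexla's export files is not documented here, so this sketch only checks what any successful export must satisfy: the file parses as JSON and is non-empty. (The sample file it writes is a stand-in, not a real export.)

```python
import json

def sanity_check_export(path):
    """Return the parsed export if the file is valid, non-empty JSON; raise otherwise."""
    with open(path) as f:
        data = json.load(f)  # raises json.JSONDecodeError on malformed JSON
    if not data:
        raise ValueError(f"{path} parsed but contains no data")
    return data

# Write a tiny stand-in file so the check can be demonstrated end to end.
with open("export_sample.json", "w") as f:
    json.dump({"source": {"id": 9505}}, f)

export = sanity_check_export("export_sample.json")
```

For a deeper check, open the exported file and confirm the expected sources, datasets, and sinks appear in it.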
Import Best Practices
- Prepare Credentials: Ensure target environment has necessary credentials
- Use Properties Files: For automated deployments, use properties files
- Verify Environment: Ensure target environment can support the flow
- Test Imports: Test imports in non-production environments first
Properties File Structure
The properties file should contain credential mappings and any environment-specific configurations:
```json
{
  "credentials": {
    "source_credential_name": "target_credential_id",
    "sink_credential_name": "target_credential_id"
  },
  "environment": {
    "org_id": 2,
    "owner_id": 6
  }
}
```
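For automated deployments, a properties file in this shape can be generated programmatically. The sketch below writes one matching the documented structure; the credential names, IDs, and file path are illustrative examples, not values from a real account.

```python
import json

def write_properties(path, credential_map, org_id=None, owner_id=None):
    """Write an import properties file: credential name -> target credential ID."""
    props = {"credentials": dict(credential_map)}
    env = {}
    if org_id is not None:
        env["org_id"] = org_id
    if owner_id is not None:
        env["owner_id"] = owner_id
    if env:
        props["environment"] = env  # environment block is optional
    with open(path, "w") as f:
        json.dump(props, f, indent=2)
    return props

# Map the exported flow's credential names to credential IDs in the target account.
props = write_properties(
    "export_9505_properties.json",
    {"sk21": "6952", "Abs_test": "6954"},
    org_id=2,
    owner_id=6,
)
```

Generating the file from a mapping keeps environment-specific credential IDs out of version control and makes migrations repeatable.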
Troubleshooting
When import or export operations encounter issues, these common problems and solutions can help you resolve them quickly.
Common Export Issues
- Source Not Found: Verify the source ID exists and is accessible
- Permission Denied: Ensure you have access to the source and its flows
- File Write Error: Check file permissions and disk space
Common Import Issues
- Credential Mismatch: Ensure target credentials exist and are compatible
- Resource Conflicts: Check for naming conflicts in target environment
- Permission Issues: Verify you have permission to create resources in target environment