# Update a Data Source

Data sources can be updated to modify their configuration, credentials, or operational settings. Updates let you adjust data source behavior without recreating the entire resource, preserving data continuity and flow relationships.
## Update Endpoint

The primary endpoint for updating data sources is:

```
PUT /data_sources/{source_id}
```

Example with curl:

```bash
curl -X PUT https://api.nexla.io/data_sources/5001 \
  -H "Authorization: Bearer <Access-Token>" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Updated S3 Source",
    "description": "Updated description for the source"
  }'
```
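The same call can be sketched in Python. `build_update_request` is a hypothetical helper that only assembles the request pieces; send them with any HTTP client (e.g. `requests.put(req["url"], headers=req["headers"], data=req["body"])`):

```python
import json

# Base URL assumed from the curl example above.
API_BASE = "https://api.nexla.io"

def build_update_request(source_id, access_token, fields):
    """Assemble the URL, headers, and JSON body for PUT /data_sources/{source_id}.

    `fields` is a dict of updateable fields (name, description, status, ...).
    """
    return {
        "url": f"{API_BASE}/data_sources/{source_id}",
        "headers": {
            "Authorization": f"Bearer {access_token}",
            "Content-Type": "application/json",
        },
        "body": json.dumps(fields),
    }

req = build_update_request(5001, "<Access-Token>", {
    "name": "Updated S3 Source",
    "description": "Updated description for the source",
})
```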
## Updateable Fields

Most data source fields can be updated, including:

### Basic Information

- `name`: Source display name
- `description`: Detailed description of the source
- `status`: Operational status (`ACTIVE`, `PAUSED`, `INIT`)

### Configuration

- `source_config`: Connector-specific settings
- `data_credentials_id`: Reference to different credentials
- `flow_type`: Processing type (`streaming`, `in_memory`, `replication`)

### Advanced Settings

- `code_container_id`: Reference to custom processing code
- `code_container`: Inline code container configuration
## Configuration Updates

Source configuration can be updated to adjust how data is extracted and processed.

### S3 Configuration Update

```json
{
  "source_config": {
    "bucket": "new-data-bucket",
    "prefix": "monthly/",
    "file_pattern": "*.parquet",
    "region": "us-west-2"
  }
}
```

### Database Configuration Update

```json
{
  "source_config": {
    "host": "new-db.example.com",
    "port": 5432,
    "database": "analytics",
    "incremental_column": "modified_date"
  }
}
```
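The update bodies above carry only the keys being changed, wrapped in a `source_config` envelope. A minimal Python sketch (the helper `config_update_body` is hypothetical, not part of any SDK) shows how such a body can be assembled:

```python
def config_update_body(**settings):
    """Wrap connector-specific settings in the source_config envelope
    expected by the update endpoint."""
    return {"source_config": settings}

# Mirrors the S3 example above: only the listed keys are sent.
body = config_update_body(
    bucket="new-data-bucket",
    prefix="monthly/",
    file_pattern="*.parquet",
    region="us-west-2",
)
```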
## Credential Updates

Data source credentials can be updated to use different authentication methods or connection details.

### Change Credential Reference

```json
{
  "data_credentials_id": 5002
}
```

### Update Inline Credentials

```json
{
  "data_credentials": {
    "name": "Updated FTP Credentials",
    "credentials_type": "ftp",
    "credentials": {
      "host": "new-ftp.example.com",
      "username": "newuser",
      "password": "newpass",
      "port": 22
    }
  }
}
```
## Flow Type Updates

The flow type can be updated to optimize performance for different use cases.

### Change Flow Type

```json
{
  "flow_type": "in_memory"
}
```

### Flow Type Considerations

- `streaming`: Default type, good for most use cases
- `in_memory`: High performance, higher resource usage
- `replication`: Optimized for data transfer scenarios
## Code Container Updates

Custom processing logic can be updated by modifying code container references or inline code.

### Update Code Container Reference

```json
{
  "code_container_id": 5003
}
```

### Update Inline Code Container

```json
{
  "code_container": {
    "name": "Updated Data Processor",
    "code_type": "python",
    "code": "def process_data(data): return data.lower().strip()",
    "resource_type": "source_custom"
  }
}
```
## Response Structure

Successful updates return the updated data source object:

```json
{
  "id": 5001,
  "owner_id": 2,
  "org_id": 1,
  "name": "Updated S3 Source",
  "description": "Updated description for the source",
  "status": "ACTIVE",
  "source_type": "s3",
  "source_config": {
    "bucket": "new-data-bucket",
    "prefix": "monthly/",
    "file_pattern": "*.parquet"
  },
  "data_credentials_id": 5002,
  "flow_type": "in_memory",
  "updated_at": "2023-01-15T15:30:00.000Z"
}
```
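Since the response echoes the updated object, a client can confirm the changes took effect by comparing requested fields against the returned body. A small sketch (the helper `unapplied_changes` is hypothetical):

```python
def unapplied_changes(response_body, requested):
    """Return the requested fields the response does not yet reflect;
    an empty dict means every change was applied."""
    return {
        key: value
        for key, value in requested.items()
        if response_body.get(key) != value
    }

response = {"id": 5001, "name": "Updated S3 Source", "flow_type": "in_memory"}
pending = unapplied_changes(response, {"name": "Updated S3 Source", "status": "ACTIVE"})
# pending == {"status": "ACTIVE"}: the name change is reflected, the status is not
```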
## Partial Updates

The update endpoint supports partial updates, allowing you to modify only specific fields:

### Update Only Name

```json
{
  "name": "New Source Name"
}
```

### Update Multiple Fields

```json
{
  "name": "Production Source",
  "description": "Production data source for customer analytics",
  "status": "ACTIVE"
}
```
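Because partial updates are supported, a client only needs to send the fields that actually changed. One way to compute such a minimal body, as a sketch (the helper `changed_fields` is hypothetical):

```python
def changed_fields(current, desired):
    """Build a partial-update body containing only the fields whose
    values differ from the current source object."""
    return {k: v for k, v in desired.items() if current.get(k) != v}

current = {"name": "Old Source Name", "status": "ACTIVE"}
desired = {"name": "Production Source", "status": "ACTIVE"}
payload = changed_fields(current, desired)
# payload == {"name": "Production Source"}: status is unchanged, so it is omitted
```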
## Update Considerations

When updating data sources, consider the following:

### Impact on Active Flows

- Configuration Changes: May require flow reactivation
- Credential Updates: Could affect ongoing data ingestion
- Status Changes: Pausing/activating affects all downstream flows

### Data Continuity

- Schema Changes: Updates may affect data set schemas
- Processing Logic: Code container changes alter data transformation
- Scheduling: Configuration updates may change ingestion timing

### Validation Requirements

- Credential Verification: New credentials must be validated
- Configuration Testing: Source-specific settings should be tested
- Flow Compatibility: Updates should maintain flow integrity
## Best Practices

To ensure successful data source updates:
- Test Changes: Validate updates in non-production environments
- Monitor Impact: Watch for effects on data flows and processing
- Backup Configuration: Document current settings before major changes
- Gradual Updates: Make changes incrementally to minimize disruption
- Validate Credentials: Ensure new credentials work before updating
- Check Dependencies: Verify updates don't break dependent resources
## Error Handling

Common update errors and solutions:
- Invalid Configuration: Verify source_config parameters for the connector type
- Credential Issues: Ensure new credentials are valid and accessible
- Permission Denied: Check that you have update access to the source
- Flow Conflicts: Resolve any active flows before making disruptive changes
- Validation Errors: Fix any schema or configuration validation issues
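In client code, these failure categories can be surfaced as remediation hints. The status-code mapping below is an assumption for illustration (consult the API's actual error responses for authoritative codes):

```python
# Assumed mapping from HTTP status codes to the failure categories listed above.
ERROR_HINTS = {
    400: "Invalid configuration or validation error: check source_config for the connector type",
    401: "Credential issue: ensure the token and data credentials are valid and accessible",
    403: "Permission denied: check that you have update access to the source",
    409: "Flow conflict: resolve active flows before making disruptive changes",
}

def update_error_hint(status_code):
    """Translate a failed update's HTTP status code into a remediation hint."""
    return ERROR_HINTS.get(status_code, f"Unexpected status {status_code}")
```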
## Related Operations

After updating a data source, you may need to:

### Test the Source

```
PUT /data_sources/{source_id}/test
```

### Reactivate if Needed

```
PUT /data_sources/{source_id}/activate
```

### Monitor Performance

```
GET /data_sources/{source_id}/metrics
```

### Check Flow Status

```
GET /data_sources/{source_id}/flow
```
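The follow-up routes above can be captured in a small lookup table so a client builds them consistently; this is a sketch mirroring the documented method/path pairs:

```python
# Follow-up endpoints from the section above, keyed by task.
FOLLOW_UPS = {
    "test": ("PUT", "/data_sources/{id}/test"),
    "activate": ("PUT", "/data_sources/{id}/activate"),
    "metrics": ("GET", "/data_sources/{id}/metrics"),
    "flow": ("GET", "/data_sources/{id}/flow"),
}

def follow_up_request(task, source_id):
    """Return the (method, path) pair for a post-update follow-up call."""
    method, path = FOLLOW_UPS[task]
    return method, path.format(id=source_id)
```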