
Transforms

Transforms in Nexla provide the data modification layer of the platform: they let you reshape, enrich, and restructure data as it flows through your processing pipelines, supporting everything from simple field edits to complex business logic.

Transform Overview

Data transforms are the core building blocks for data modification in Nexla. They combine pre-built functions with custom logic so you can manipulate data according to your business requirements and data quality standards.

Core Transform Capabilities

The transform system provides several key capabilities for effective data modification and processing.

Data Modification

Comprehensive data manipulation capabilities:

  • Field Transformation: Modify individual field values and formats
  • Data Type Conversion: Convert data between different types and formats
  • Value Mapping: Map values according to business rules and logic
  • Data Enrichment: Add calculated fields and derived values
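A minimal sketch of these four operations in a single per-record JavaScript transform (the record shape and the `STATUS_MAP` values are assumptions for illustration, not part of the Nexla API):

```javascript
// Illustrative single-record transform combining the four operations above.
// The field names and STATUS_MAP values are assumptions, not Nexla API.
const STATUS_MAP = { A: "active", I: "inactive" }; // value mapping rules

function transformRecord(record) {
  return {
    ...record,
    name: record.name.trim(),                        // field transformation
    amount: Number(record.amount),                   // type conversion (string -> number)
    status: STATUS_MAP[record.status] ?? "unknown",  // value mapping
    amount_with_markup: Number(record.amount) * 1.25 // enrichment: derived field
  };
}
```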

Business Logic Implementation

Implement complex business rules and logic:

  • Conditional Processing: Apply different transformations based on conditions
  • Data Validation: Implement custom validation rules and constraints
  • Business Calculations: Perform business-specific calculations and aggregations
  • Data Quality Rules: Apply data quality and business rule enforcement
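These ideas can be combined in one transform step; a hedged sketch in which the threshold, discount rate, and field names are illustrative:

```javascript
// Sketch: conditional processing, a custom validation rule, and a
// business calculation in one step. Values are illustrative only.
function applyBusinessRules(order) {
  const errors = [];
  if (typeof order.total !== "number" || order.total < 0) {
    errors.push("total must be a non-negative number"); // custom validation rule
  }
  const discount = order.total >= 1000 ? 0.25 : 0;      // conditional processing
  return {
    ...order,
    discount,
    final_total: order.total * (1 - discount),          // business calculation
    valid: errors.length === 0,                          // quality rule result
    errors
  };
}
```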

Integration and Workflow

Seamless integration with data processing workflows:

  • Nexset Integration: Integrate transforms with Nexset processing
  • Flow Integration: Apply transforms throughout data flows
  • Real-time Processing: Support real-time and batch processing modes
  • Error Handling: Robust error handling and recovery mechanisms

Transform Types

Nexla supports various transform types for different data modification needs.

Built-in Transforms

Pre-built transformation functions:

  • String Transforms: Text manipulation, formatting, and parsing
  • Numeric Transforms: Mathematical operations and calculations
  • Date Transforms: Date and time manipulation and formatting
  • Logical Transforms: Boolean operations and conditional logic
  • Array Transforms: Array manipulation and processing
  • Object Transforms: Object structure modification and restructuring

Custom Transforms

User-defined transformation logic:

  • JavaScript Transforms: Custom JavaScript transformation functions
  • Python Transforms: Python-based transformation scripts
  • Expression Transforms: Expression-based transformation rules
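Custom JavaScript transforms follow a function-per-record shape: the function receives one input record and returns the transformed record. A sketch in that style (the `phone_normalized` field and the 10-digit rule are assumptions for illustration):

```javascript
// Custom transform sketch: one record in, one record out.
// The phone_normalized field and 10-digit rule are illustrative assumptions.
function transform(input) {
  const digits = String(input.phone ?? "").replace(/\D/g, ""); // strip non-digits
  return {
    ...input,
    phone_normalized: digits.length === 10 ? digits : null // keep only 10-digit numbers
  };
}
```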

Specialized Transforms

Domain-specific transformation functions:

  • Data Quality Transforms: Data cleaning and quality improvement
  • Business Logic Transforms: Industry-specific business rules
  • Integration Transforms: System integration and data mapping
  • Compliance Transforms: Regulatory and compliance requirements

Transform Components

Understanding the key components of Nexla transforms helps you create effective data modification workflows.

Transform Definition

Core transform structure and configuration:

  • Transform Name: Unique identifier for the transform
  • Transform Type: Category and classification of the transform
  • Input Schema: Expected input data structure and types
  • Output Schema: Expected output data structure and types
  • Configuration: Transform-specific parameters and settings

Transform Logic

The actual transformation implementation:

  • Function Code: The transformation function or script
  • Parameters: Configurable parameters and options
  • Error Handling: Error handling and recovery logic
  • Performance Optimization: Optimization for processing efficiency

Transform Metadata

Additional information and management data:

  • Version Information: Transform version and change tracking
  • Documentation: Usage instructions and examples
  • Testing Information: Test cases and validation results
  • Performance Metrics: Processing performance and resource usage

Transform Operations

Core operations for managing transforms in your Nexla platform.

Create Transforms

Define new transformation functions:

POST /transforms
Create Transform: Request
{
  "name": "Customer Email Normalizer",
  "description": "Normalize customer email addresses to lowercase",
  "type": "string",
  "category": "data_quality",
  "input_schema": {
    "type": "object",
    "properties": {
      "email": {"type": "string"}
    },
    "required": ["email"]
  },
  "output_schema": {
    "type": "object",
    "properties": {
      "email": {"type": "string"},
      "email_normalized": {"type": "string"}
    }
  },
  "transform_function": {
    "language": "javascript",
    "code": "function transform(input) { return { ...input, email_normalized: input.email.toLowerCase() }; }"
  },
  "configuration": {
    "case_sensitive": false,
    "trim_whitespace": true
  }
}
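The `transform_function` in this request can be exercised locally (for example, in Node) before registering it:

```javascript
// The function from the request body above, run against a sample record.
function transform(input) {
  return { ...input, email_normalized: input.email.toLowerCase() };
}

console.log(transform({ id: 1, email: "Jane.Doe@Example.COM" }));
// -> { id: 1, email: 'Jane.Doe@Example.COM', email_normalized: 'jane.doe@example.com' }
```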

List Transforms

Retrieve available transforms:

GET /transforms
List Transforms: Response
{
  "transforms": [
    {
      "id": 6001,
      "name": "Customer Email Normalizer",
      "type": "string",
      "category": "data_quality",
      "description": "Normalize customer email addresses to lowercase",
      "created_at": "2023-01-15T10:00:00.000Z",
      "updated_at": "2023-01-15T10:00:00.000Z",
      "version": "1.0.0",
      "status": "ACTIVE",
      "owner": {
        "id": 42,
        "name": "John Doe"
      },
      "usage_count": 15
    },
    {
      "id": 6002,
      "name": "Date Formatter",
      "type": "date",
      "category": "formatting",
      "description": "Standardize date formats across datasets",
      "created_at": "2023-01-14T15:30:00.000Z",
      "updated_at": "2023-01-16T09:15:00.000Z",
      "version": "2.1.0",
      "status": "ACTIVE",
      "owner": {
        "id": 42,
        "name": "John Doe"
      },
      "usage_count": 28
    }
  ],
  "pagination": {
    "total": 2,
    "page": 1,
    "per_page": 20
  }
}

Update Transforms

Modify existing transform definitions:

PUT /transforms/{transform_id}
Update Transform: Request
{
  "version": "1.1.0",
  "description": "Enhanced email normalization with validation",
  "transform_function": {
    "language": "javascript",
    "code": "function transform(input) { const normalized = input.email.toLowerCase().trim(); return { ...input, email_normalized: normalized, is_valid: /^[^@]+@[^@]+\\.[^@]+$/.test(normalized) }; }"
  },
  "configuration": {
    "case_sensitive": false,
    "trim_whitespace": true,
    "validate_format": true
  }
}
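The v1.1.0 function above normalizes first, then validates; run locally, it behaves like this:

```javascript
// The updated function from the request body: normalize, then validate.
function transform(input) {
  const normalized = input.email.toLowerCase().trim();
  return {
    ...input,
    email_normalized: normalized,
    is_valid: /^[^@]+@[^@]+\.[^@]+$/.test(normalized)
  };
}
```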

Transform Integration

Integrate transforms with other Nexla components for comprehensive data processing.

Nexset Integration

Use transforms within Nexsets:

  • Transform Application: Apply transforms to Nexset data
  • Processing Pipeline: Build transformation pipelines within Nexsets
  • Data Flow: Control data flow through transform chains
  • Error Handling: Handle transform errors within Nexset processing

Flow Integration

Integrate transforms with data flows:

  • Flow Transforms: Apply transforms at flow checkpoints
  • Transform Chains: Build complex transformation workflows
  • Conditional Processing: Apply transforms based on flow conditions
  • Quality Gates: Use transforms as data quality gates
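A transform chain can be modeled as left-to-right function composition over per-record transforms; a minimal sketch with illustrative step functions (not built-in Nexla transforms):

```javascript
// Sketch: chain per-record transforms into one pipeline.
const chain = (...steps) => (record) => steps.reduce((r, step) => step(r), record);

// Illustrative steps, not built-in Nexla transforms.
const trimName = (r) => ({ ...r, name: r.name.trim() });
const upperName = (r) => ({ ...r, name: r.name.toUpperCase() });
const addLength = (r) => ({ ...r, name_length: r.name.length });

const pipeline = chain(trimName, upperName, addLength);
console.log(pipeline({ name: "  ada  " }));
// -> { name: 'ADA', name_length: 3 }
```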

Schema Integration

Integrate transforms with schema management:

  • Schema Evolution: Adapt transforms to schema changes
  • Type Safety: Ensure transform compatibility with schemas
  • Validation: Validate transform outputs against schemas
  • Documentation: Document transform input/output schemas

Transform Best Practices

To effectively use transforms in your Nexla platform:

  1. Design for Reusability: Create transforms that can be reused across different workflows
  2. Test Thoroughly: Test transforms with various input data scenarios
  3. Document Clearly: Provide clear documentation and usage examples
  4. Monitor Performance: Track transform performance and resource usage
  5. Version Control: Use proper versioning for transform changes

Transform Workflows

Implement structured workflows for effective transform management.

Transform Development Workflow

Standard workflow for developing transforms:

  1. Requirements Analysis: Analyze transformation requirements
  2. Design: Design transform structure and logic
  3. Implementation: Implement transform function and configuration
  4. Testing: Test transform with various input scenarios
  5. Documentation: Document transform usage and examples
  6. Deployment: Deploy transform to production environment

Transform Usage Workflow

Workflow for using transforms in data processing:

  1. Transform Selection: Select appropriate transforms for your needs
  2. Configuration: Configure transform parameters and settings
  3. Integration: Integrate transforms into processing workflows
  4. Testing: Test transform integration and performance
  5. Monitoring: Monitor transform execution and results
  6. Optimization: Optimize transform performance and efficiency

Error Handling

Common transform issues and solutions:

  • Input Validation: Ensure input data matches expected schemas
  • Performance Issues: Optimize transform logic for efficiency
  • Integration Problems: Verify proper integration with workflows
  • Schema Mismatches: Align transform schemas with data structures
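One way to address the input-validation and schema-mismatch issues is to guard each transform with an input check, so bad records surface as structured errors rather than runtime exceptions. A hedged sketch (the wrapper and result shape are illustrative, not a Nexla API):

```javascript
// Sketch: wrap a transform so missing fields and thrown errors are
// reported as structured results instead of crashing the pipeline.
function safeTransform(transformFn, requiredFields, record) {
  const missing = requiredFields.filter((f) => !(f in record));
  if (missing.length > 0) {
    return { ok: false, error: `missing fields: ${missing.join(", ")}`, record };
  }
  try {
    return { ok: true, record: transformFn(record) };
  } catch (err) {
    return { ok: false, error: err.message, record };
  }
}
```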

After working with transforms, you may need to:

Manage Transforms

GET /transforms
PUT /transforms/{transform_id}
DELETE /transforms/{transform_id}

Apply Transforms

POST /transforms/{transform_id}/apply
POST /data/transform

Monitor Usage

GET /transforms/{transform_id}/usage
GET /transforms/{transform_id}/performance