Posted today
Secret
Mid Level Career (5+ yrs experience)
Unspecified
No Traveling
IT - Support
Centurion is looking to hire an Azure Data Factory Engineer for a long-term federal government position. The ideal candidate will be a US citizen and hold an active Secret or Top Secret clearance. This position is 100% remote.
Requirements Gathering
Collaborate with business analysts, data architects, and stakeholders to gather requirements and ensure data integrity throughout the migration lifecycle.
Ability to create data mapping documents (source to target) by understanding business rules.
Data Understanding/Profiling
Perform data profiling, cleansing, and quality checks to ensure accuracy and completeness of migrated data.
Analyze and map source Oracle database schemas to target Dataverse tables/entities, defining transformation and cleansing logic as required.
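As a rough illustration of the profiling and quality checks described above, a minimal Python sketch might look like the following. The record layout and field names (emp_id, hire_date, email) are hypothetical examples, not taken from this posting.

```python
# Minimal data-profiling sketch: count null/blank values per field and
# duplicate primary keys in a batch of source records before migration.
# All field names below are hypothetical.
from collections import Counter

def profile(records, key_field):
    """Return per-field null/blank counts and a duplicate-key count."""
    null_counts = Counter()
    for rec in records:
        for field, value in rec.items():
            if value is None or (isinstance(value, str) and not value.strip()):
                null_counts[field] += 1
    key_counts = Counter(rec.get(key_field) for rec in records)
    duplicates = sum(n - 1 for n in key_counts.values() if n > 1)
    return {"rows": len(records),
            "nulls": dict(null_counts),
            "duplicate_keys": duplicates}

sample = [
    {"emp_id": 1, "hire_date": "2019-03-01", "email": "a@example.gov"},
    {"emp_id": 2, "hire_date": None,         "email": " "},
    {"emp_id": 2, "hire_date": "2021-07-15", "email": "b@example.gov"},
]
report = profile(sample, "emp_id")
```

In practice the records would come from an Oracle query or a staging extract, and the rules (which fields are mandatory, which keys must be unique) would come from the data mapping document.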
Technical Requirements for Development
Write and optimize SQL queries, stored procedures, and scripts for Oracle databases to support data migration and validation.
Implement data transformation processes to ensure data compatibility with Dataverse data types, relationships, and constraints.
Experience with Dataverse data types and CRUD operations
Experience configuring data pipelines in ADF (to extract, transform, and load data).
Experience using .csv files as sources of data for ADF pipelines.
Experience using Oracle CDC (change data capture) feeds as sources of data for ADF pipelines.
Experience logging errors (audit framework) from an ADF pipeline to a Dataverse table
Experience pushing data to Dataverse tables, including the transformations needed to match target field types (e.g., Dataverse lookups, choices, date/time).
Experience configuring pipeline settings to improve concurrency/performance (e.g., ADF cores, memory).
Experience with additional Azure services (e.g., Logic Apps, Data Lake, Azure Functions, Azure SQL).
Experience/exposure with Python and shell scripting is a plus.
Data Validation (Post-Migration)
Ability to generate a reconciliation report of the migrated data using SQL or scripts.
Code Migration
Must be familiar with Git (or a similar source code repository) and CI/CD processes.
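The post-migration reconciliation report mentioned above could be sketched along these lines. This is illustrative only: in a real migration the counts would come from queries against Oracle and Dataverse, and the table names used here are invented.

```python
# Minimal reconciliation sketch: compare per-table row counts from the
# source (Oracle) and target (Dataverse) and flag mismatches. The counts
# are hard-coded stand-ins for real query results; table names are
# hypothetical.
def reconcile(source_counts, target_counts):
    """Return a list of (table, source_rows, target_rows, status) rows."""
    report = []
    for table in sorted(set(source_counts) | set(target_counts)):
        src = source_counts.get(table, 0)
        tgt = target_counts.get(table, 0)
        status = "OK" if src == tgt else "MISMATCH"
        report.append((table, src, tgt, status))
    return report

source = {"employees": 1042, "departments": 12}
target = {"employees": 1042, "departments": 11}
rows = reconcile(source, target)
```

A fuller report would typically also compare checksums or key sets per table, not just row counts, so that silently dropped or altered rows are caught as well.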
group id: 91017959