Job Requirements
Location: Seattle, WA
Clearance: Secret
Polygraph: Unspecified
Career Level: Not specified
Salary: Not specified
Job Description
PDS Defense, Inc. is seeking a Computing Architect 4 in Seattle, WA. Job ID #216417.
Pay Rate: $78 - $83/hr
Job Description:
Be part of a high-performing software engineering organization focused on transforming the aviation training industry through Competency-Based Training and Assessment (CBTA) digital solutions.
In this role, you will leverage Unified Data Modeling (UDM) to design and govern data models, process layers, transformations, routing, and schema evolution for a data lake built on Azure Databricks and Microsoft Azure. You will map incoming and changing source data to the UDM, define and manage data transformation rules, and collaborate closely with distributed data engineering teams (including team members in India) to implement robust, scalable data pipelines and governance.
Position Responsibilities:
• Design, document, and maintain the Unified Data Model (UDM) artifacts and mappings.
• Model process layers in the data lake, defining transformation responsibilities and lineage across layers.
• Define and enforce rules for data processing, including routing tables, validation rules, enrichment logic, and error-handling policies.
• Author and maintain schema definitions, attribute dictionaries, and change management processes for schema evolution.
• Translate business and source system requirements into data transformation specifications to be implemented in Azure Databricks and downstream systems (a minimal sketch of such a transformation follows this list).
• Collaborate with data engineers to design performant transformations, partitioning, and storage strategies in the data lake.
• Review and approve data pipeline designs, ensuring adherence to UDM, governance, and security policies.
• Work with DevOps and engineering teams to operationalize CI/CD for Databricks notebooks, jobs, and infrastructure-as-code.
• Provide technical leadership and mentorship to data engineering teams, including remote collaboration with engineers located in India; coordinate design, implementation, and delivery across time zones (may require off-hour work).
• Establish data quality metrics, monitoring, and remediation guidance; ensure traceability and lineage from source to consumption.
• Participate in architecture and design reviews, code reviews, and agile ceremonies; drive best practices for data modeling and transformation.
• Communicate architecture decisions and trade-offs to stakeholders, product owners, and engineering teams.
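As a point of reference for the responsibilities above, the sketch below shows what a minimal raw-to-curated transformation on Azure Databricks with Delta Lake might look like in PySpark. It is purely illustrative: the table names (raw.training_events, curated.training_events, quarantine.training_events), column names, and validation rules are hypothetical, not the project's actual UDM or pipeline code.

```python
# Illustrative only: a toy raw-to-curated transformation in PySpark on
# Databricks with Delta Lake. Table names, column names, and the mapping/
# validation rules below are hypothetical placeholders, not the actual UDM.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Read a hypothetical raw (ingest-layer) table.
raw = spark.read.table("raw.training_events")

# Map source attributes to unified (UDM-style) names and derive a partition key.
mapped = (
    raw.withColumnRenamed("evt_ts", "event_timestamp")
       .withColumnRenamed("trainee", "trainee_id")
       .withColumn("event_date", F.to_date("event_timestamp"))
)

# A simple validation rule: required attributes must be present.
required = F.col("trainee_id").isNotNull() & F.col("event_timestamp").isNotNull()
valid = mapped.filter(required)
invalid = mapped.filter(~required)

# Write the curated layer; mergeSchema allows additive, backward-compatible
# attributes to evolve the table schema without breaking existing consumers.
(valid.write.format("delta")
      .mode("append")
      .option("mergeSchema", "true")
      .partitionBy("event_date")
      .saveAsTable("curated.training_events"))

# Route rows that fail validation to a quarantine table for remediation.
(invalid.write.format("delta")
        .mode("append")
        .saveAsTable("quarantine.training_events"))
```

In practice a routing table or rules engine would typically drive the mapping and validation steps rather than hard-coded logic, but the layering and schema-evolution pattern is the same.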
This position is On Site.
Project/day-to-day activities:
Designing, improving, communicating, and managing data and data models for a training analytics application.
Technical/Software Skills needed:
Unified Data Model (UDM)
Data lakehouse concepts
Business intelligence and analytics understanding
Basic Qualifications (Required Skill/Experience):
• 9+ years of experience in data architecture, data modeling, or related data engineering roles.
• Proven experience designing and implementing Unified Data Models (UDM).
• Strong expertise in Structured Query Language (SQL).
• Strong expertise in data modeling across multiple process layers (raw/ingest, transformed, curated) and defining transformation logic.
• Deep understanding of data governance, data lineage, metadata management, and data quality concepts.
• Demonstrated experience with Databricks and building/architecting data lake solutions.
• Experience defining schema evolution processes, new attribute definitions, and backward-compatible changes (a small illustration follows this list).
• Experience authoring routing tables, processing rules, and data routing/ingestion patterns.
• Experience collaborating with geographically distributed engineering teams and willingness to work off hours to coordinate with teams in India.
• Strong communication, documentation, and stakeholder engagement skills.
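The schema-evolution expectations above can be made concrete with a small compatibility check. The following Python sketch is purely illustrative; the attribute-dictionary layout, field names, and compatibility rules are assumptions for the example, not the program's actual governance artifacts.

```python
# Illustrative only: a toy backward-compatibility check for schema evolution.
# The attribute dictionary format and field names are hypothetical.
from typing import Dict

# Each schema version maps attribute name -> {"type": ..., "required": ...}.
v1 = {
    "trainee_id":      {"type": "string",    "required": True},
    "event_timestamp": {"type": "timestamp", "required": True},
}
v2 = {
    "trainee_id":      {"type": "string",    "required": True},
    "event_timestamp": {"type": "timestamp", "required": True},
    "competency_code": {"type": "string",    "required": False},  # new optional attribute
}

def is_backward_compatible(old: Dict[str, dict], new: Dict[str, dict]) -> bool:
    """A change is treated as backward compatible when no existing attribute
    is removed or retyped, and any newly added attribute is optional."""
    for name, spec in old.items():
        if name not in new or new[name]["type"] != spec["type"]:
            return False
    for name in new.keys() - old.keys():
        if new[name]["required"]:
            return False
    return True

print(is_backward_compatible(v1, v2))  # True: only an optional attribute was added
```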
Preferred Qualifications (Desired Skills/Experience):
• Prior experience in aviation, training analytics, or related operational data domains.
• Experience with Azure Data Factory, Delta Lake, Unity Catalog, or equivalent data governance tools.
• Familiarity with infrastructure-as-code and CI/CD practices for data pipelines (Terraform, Azure DevOps, GitOps).
• Knowledge of streaming ingestion patterns, Kafka/Event Hubs, and near-real-time processing.
• Experience with metadata platforms and data catalog tools (e.g., Purview, Alation).
• Experience with containerization and orchestration (Docker, Kubernetes) is a plus.
• Bachelor's or advanced degree in Computer Science, Information Systems, Engineering, or related field.
Education / Experience:
A technical bachelor's degree and typically 9 or more years' related work experience; or a master's degree and typically 7 or more years' related work experience; or a PhD and typically 4 or more years' related work experience; or an equivalent combination of education and experience. A technical degree is defined as any four-year degree, or greater, in a mathematical, scientific, or information technology field of study.
Benefits offered vary by contract. Depending on your temporary assignment, benefits may include direct deposit, free career counseling services, a 401(k), select paid holidays, short-term disability insurance, skills training, an employee referral bonus, an affordable medical coverage plan, and DailyPay (in some locations). For a full description of the benefits available to you, be sure to talk with your recruiter.
VEVRAA Federal Contractor / Request Priority Protected Veteran Referrals / Equal Opportunity Employer / Veterans / Disabled
To read our Candidate Privacy Information Statement, which explains how we will use your information, please visit http://www.tadpgs.com/candidate-privacy/ or https://pdsdefense.com/candidate-privacy/
The Company will consider qualified applicants with arrest and conviction records in accordance with federal, state, and local laws and/or security clearance requirements, including, as applicable:
- The California Fair Chance Act
- Los Angeles City Fair Chance Ordinance
- Los Angeles County Fair Chance Ordinance for Employers
- San Francisco Fair Chance Ordinance