Posted 1 day ago
Secret
IT - Database
Mechanicsburg, PA (On-Site/Office)
Overview
Nakupuna Solutions is looking for a Data Engineer to design, build, and maintain data pipelines and data warehouse infrastructure, ensuring reliable, scalable, and secure data integration that supports reporting and analytics. This is a mid-level role expected to execute data integration work independently and escalate complex issues to senior engineering as needed. This effort supports the Naval Supply Systems Command (NAVSUP).
Responsibilities
The following reflects management's definition of essential functions for this job but does not restrict the tasks that may be assigned.
- Design, build, and maintain ETL/ELT pipelines using Python and SQL to ingest data from Salesforce Education Cloud, flat files, and API endpoints into the data warehouse
- Implement robust error handling, logging, and automated monitoring to detect and track data quality issues
- Define schemas and curate unified student datasets with targeted completeness and accuracy for downstream reporting and modeling
- Optimize AWS Redshift and PostgreSQL performance through tuning, query optimization, and storage management
- Serve as escalation point for complex data discrepancies and pipeline failures; provide root-cause analysis and permanent fixes
- Coordinate with analysts and BI developers to validate transformations and resolve mismatches
Qualifications
Qualification/Skills:
- Hands-on experience designing and maintaining ETL/ELT pipelines using Python and SQL.
- Working knowledge of data warehouse concepts, schema design, and performance optimization.
- Experience implementing data quality checks, logging, and monitoring to detect pipeline failures and data anomalies.
- Ability to troubleshoot and remediate data discrepancies and integration issues across multiple source systems.
- Experience collaborating with BI developers and analysts to validate transformations and definitions.
Education and Experience: This position requires a bachelor's degree in Engineering, Computer Science, or equivalent. The following are desirable levels of experience:
- 5+ years of data engineering experience designing ETL/ELT pipelines and data warehouse architecture
- Advanced SQL skills (PostgreSQL, Redshift)
- Experience with Python-based ETL development and automation
- Familiarity with Salesforce data models is preferred
- Experience implementing monitoring, logging, and data quality checks
Clearance Requirements: Must be a U.S. Citizen. Must have an active Secret clearance.
Physical Requirements: The ideal candidate must at a minimum be able to meet the following physical requirements of the job with or without a reasonable accommodation:
- Ability to perform repetitive motions with the hands, wrists, and fingers.
- Ability to engage in and follow audible communications in emergency situations.
- Ability to sit for prolonged periods at a desk and work on a computer.