Posted today

Job Requirements

Chantilly, VA
Top Secret/SCI Full Scope Polygraph
Career Level not specified
Salary not specified

Job Description

Job Title: Data Architect
Location: Chantilly, VA
Clearance Required: TS/SCI with Full Scope Polygraph
Company: Quantum Science Solutions (QSS)
Rate: Open

Position Overview
Quantum Science Solutions (QSS) is seeking a Data Architect to design, build, and maintain scalable, high-performance data infrastructure supporting enterprise and operational systems.
This role is responsible for developing robust data pipelines, integrating diverse data sources, and ensuring the reliability, quality, and accessibility of data across cloud and on-premises environments. The ideal candidate will bring strong expertise in data engineering, cloud platforms, and data modeling, along with the ability to translate complex business requirements into scalable technical solutions.

Key Responsibilities
• Design, build, and maintain production-grade data pipelines using orchestration tools such as Apache Airflow or similar.
• Develop and manage ETL/ELT processes from diverse data sources including SaaS platforms, APIs, databases, files, and streaming systems.
• Integrate data from external systems via APIs, handling authentication, pagination, rate limiting, retry logic, and error handling.
• Transform semi-structured data (JSON, XML) into structured datasets with consistent schemas.
• Design and implement data warehouse architectures, including dimensional modeling and data marts.
• Develop conceptual, logical, and physical data models optimized for performance and scalability.
• Implement data quality checks, validation frameworks, and monitoring with alerting.
• Optimize pipelines for performance, cost efficiency, and reliability.
• Build and maintain cloud-based data infrastructure (AWS, Azure, or GCP) using infrastructure-as-code tools such as Terraform or CloudFormation.
• Implement CI/CD pipelines, version control (Git), and automated testing frameworks.
• Establish data governance, security controls, and compliance measures.
• Conduct data architecture assessments to identify technical debt, bottlenecks, and reliability issues.
• Translate business requirements into technical data solutions and provide guidance to stakeholders.
• Develop documentation and enable self-service data access and analytics.
• Troubleshoot data pipeline failures and perform root cause analysis.
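As a flavor of the API-integration work described above, here is a minimal, framework-agnostic sketch of paginated fetching with exponential-backoff retries. The `fetch_page` callable and the `results` response key are illustrative assumptions, not any specific vendor's API:

```python
import time
from typing import Callable, Iterator

def fetch_all_pages(
    fetch_page: Callable[[int], dict],   # hypothetical fetcher: page number -> parsed JSON dict
    max_retries: int = 3,
    backoff_seconds: float = 1.0,
) -> Iterator[dict]:
    """Yield records from a paginated API, retrying transient
    failures with exponential backoff."""
    page = 1
    while True:
        for attempt in range(max_retries):
            try:
                body = fetch_page(page)
                break
            except (ConnectionError, TimeoutError):
                if attempt == max_retries - 1:
                    raise  # exhausted retries; surface the error
                time.sleep(backoff_seconds * 2 ** attempt)  # 1s, 2s, 4s, ...
        records = body.get("results", [])
        if not records:
            return  # an empty page signals the end of pagination
        yield from records
        page += 1
```

Production pipelines would layer authentication and rate-limit handling (e.g., honoring `Retry-After` headers) on top of the same loop.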

Required Skills & Experience
• Demonstrated experience designing, building, and maintaining production data pipelines.
• Strong proficiency in SQL, including complex queries, optimization, and performance tuning.
• Experience integrating data from SaaS platforms and operational systems via APIs.
• Experience working with semi-structured data (JSON, XML) and transforming it into structured datasets.
• Experience developing robust API integrations with proper error handling and retry logic.
• Experience working with systems with limited documentation or vendor-specific data models.
• Strong knowledge of dimensional modeling and data warehouse design patterns.
• Proficiency in Python for data engineering and data processing libraries.
• Experience with cloud platforms (AWS, Azure, or GCP).
• Experience implementing ETL/ELT pipelines across diverse data sources.
• Experience with version control (Git) and software engineering best practices.
• Strong problem-solving and troubleshooting skills for complex data systems.
• Experience implementing data quality and validation frameworks.
• Ability to translate business requirements into technical solutions.
• Proven track record of delivering scalable, reliable data infrastructure.
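The semi-structured-to-structured transformation called out above can be sketched in a few lines. The field names (`sys_id`, `updated.on`) and the target `SCHEMA` are hypothetical examples, not a real source system's data model:

```python
import json

# Hypothetical target schema: every output row carries the same keys,
# regardless of which optional fields the source record had.
SCHEMA = ("id", "name", "status", "updated_at")

def normalize(record: dict) -> dict:
    """Map one semi-structured record onto a fixed, flat schema."""
    updated = record.get("updated")
    flat = {
        "id": record.get("sys_id") or record.get("id"),
        "name": record.get("name", ""),
        "status": record.get("status", "unknown"),
        # source may nest the timestamp or supply it flat
        "updated_at": updated.get("on") if isinstance(updated, dict) else updated,
    }
    return {k: flat.get(k) for k in SCHEMA}

raw = '{"sys_id": "a1", "name": "router-7", "updated": {"on": "2024-01-01"}}'
row = normalize(json.loads(raw))
```

The point of the pattern is that downstream consumers see one consistent schema even when upstream records vary.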

Desired Skills
• Experience with ServiceNow APIs, data models, and integrations.
• Experience with network management or IT operations systems (SolarWinds, NetIM, Forward Networks).
• Knowledge of ITSM, ITOM, and CMDB data structures.
• Experience with Apache Spark / PySpark.
• Experience with DBT (data build tool).
• Experience with streaming technologies (Kafka, Kinesis).
• Experience with data quality platforms (Great Expectations, Soda, Monte Carlo).
• Experience implementing data observability and monitoring solutions.
• Knowledge of Data Vault or advanced data modeling methodologies.
• Experience with Docker and Kubernetes for data workloads.
• Experience with multi-cloud architectures.
• Experience mentoring or leading data engineering efforts.
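The data-quality and validation items in both lists reduce to one core pattern: named checks applied per row, with failure counts feeding monitoring and alerting. A framework-agnostic sketch (tools such as Great Expectations formalize the same idea; the checks here are invented examples):

```python
from typing import Callable

Check = Callable[[dict], bool]

# Hypothetical checks; a real pipeline would source these from config.
CHECKS: dict[str, Check] = {
    "id_present": lambda row: bool(row.get("id")),
    "status_valid": lambda row: row.get("status") in {"active", "retired", "unknown"},
}

def validate(rows: list[dict]) -> dict[str, int]:
    """Return a failure count per check -- the raw material for alerting."""
    failures = {name: 0 for name in CHECKS}
    for row in rows:
        for name, check in CHECKS.items():
            if not check(row):
                failures[name] += 1
    return failures
```

Thresholds on these counts (e.g., alert when more than 1% of rows fail a check) are what turn validation into monitoring.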

Why Join QSS
You’re not just filling a role; you’re enabling data-driven, mission-critical capabilities. At QSS, we invest in innovation, value technical excellence, and empower our engineers to solve complex data challenges that make a real impact.

Employee Benefits
• Competitive Compensation & Incentives
• Premium Medical, Dental & Vision Plans
• Generous PTO and Paid Holidays
• 401(k) with Company Match
About Us
Quantum Science Solutions is traditional in the sense that it was built on core foundations of honesty and integrity. We take great pride in all of our efforts and projects, and we are proud to serve the front lines of cyber defense. We challenge our team members frequently and thrive on the motto “One for all, all for one.” At QSS, we invite individuals who think outside the box and are mentally stimulated by complex problems in the fast-paced world of cyber and technology. Our collaborative work environment allows you to think freely and develop ironclad solutions for our customers. At QSS, your ideas and designs are not only heard but recognized and rewarded. We support you and will provide the resources needed to fuel innovation and creativity. Imagine working on a cyber integration concept and seeing it turn into a fully funded project for a federal agency. Join the QSS team, where rather than making a difference, we are the difference.



Job Category: IT - Database
Clearance Level: Top Secret/SCI