Top Secret/SCI
Entry Level (less than 2 yrs experience)
No Traveling
Full Scope Polygraph
Engineering - Systems
Annapolis Jct, MD (On-Site/Office)
When You Love the Work You Do, Any Mission Is Possible
Position: Dataflow Engineer, All Levels
Location: Annapolis Junction, MD (Fort Meade area)
***(Active TS/SCI Clearance with a Full Scope Polygraph Required)
Who We Are:
Investing in our employees’ growth is a cornerstone of our philosophy. Joining Tiber means access to ongoing learning opportunities, mentorship, and an environment that actively supports professional development. We believe in providing the tools and resources our team needs to thrive and excel in their careers. We recognize that our collective strength lies in the diversity of our talents. Our collaborative work environment encourages cross-functional teamwork, idea sharing, and a collective pursuit of excellence. Your skills and perspectives will be integral to our shared success.
Position Summary:
We are seeking a Dataflow Engineer to architect, implement, and manage data movement and transformation pipelines across enterprise-level environments. This role requires a deep understanding of data modeling, metadata management, and data governance principles within a secure and compliant environment.
The successful candidate will ensure data flows adhere to Data Management Requirements (DMRs), utilize Enterprise Data Headers, and enforce Attribute-Based Access Controls (ABAC) to enable secure, policy-driven access to sensitive information.
Key Responsibilities:
• Design, implement, and manage secure dataflows across multiple environments and domains, ensuring efficient and traceable movement of data.
• Define and maintain data models (conceptual, logical, and physical) to support structured and unstructured data ingestion, transformation, and storage.
• Ensure adherence to Data Management Requirements (DMRs) across all data engineering processes, supporting auditability and governance.
• Implement and integrate Enterprise Data Headers (EDH) for enhanced metadata tagging, traceability, and cross-system interoperability.
• Apply and maintain Attribute-Based Access Control (ABAC) mechanisms to enforce data protection and dissemination rules based on user roles, attributes, and mission need.
• Work closely with cyber, DevSecOps, and software teams to ensure compliant, scalable data architectures.
• Support policy enforcement, tagging schemas, data catalogs, and data lineage tracking using modern data governance tools.
• Troubleshoot data delivery issues, system bottlenecks, and access control misconfigurations.
• Provide technical guidance to program managers, system integrators, and data consumers to ensure clarity and mission alignment.
Required Qualifications:
• Bachelor’s degree in Computer Science, Data Science, Information Systems, or a related field.
• Expertise in data modeling, relational and non-relational databases (SQL, NoSQL), and schema design.
• Demonstrated experience with DMRs and metadata-driven data governance processes.
• Hands-on knowledge of Enterprise Data Headers (EDH) implementation and tagging mechanisms.
• Proven understanding and implementation of ABAC using policy enforcement points and decision engines (e.g., XACML, OPA).
• Familiarity with secure data transfer protocols, cross-domain solutions (CDS), and data protection.
• Proficient with data management tools and languages (e.g., Apache NiFi, Kafka, Airflow, Python, Spark).
• Excellent problem-solving skills and the ability to collaborate with cross-functional teams.