Job Requirements
Chantilly, VA; Herndon, VA; McLean, VA
Top Secret/SCI with Polygraph
Career Level not specified
Salary not specified
Job Description
Overview
We are seeking a skilled Data Engineer with at least 5 years of experience designing, building, and maintaining scalable data pipelines and architectures. The ideal candidate will have strong expertise in data integration, ETL/ELT processes, and cloud-based data platforms, with a focus on delivering high-quality, reliable data solutions that support analytics and business decision-making.
What will you do?
- Design, develop, and maintain scalable data pipelines and workflows (ETL/ELT)
- Build and optimize data architectures, including data lakes, data warehouses, and data marts
- Ingest, process, and transform large volumes of structured and unstructured data
- Ensure data quality, integrity, and reliability across systems
- Collaborate with data analysts, data scientists, and business stakeholders to define data requirements
- Implement data governance, security, and compliance best practices
- Optimize data storage, query performance, and cost efficiency in cloud environments
- Monitor and troubleshoot data pipeline performance and failures
- Maintain documentation for data flows, systems, and processes
Do you have what it takes?
- Active TS/SCI with Polygraph required
- Bachelor's degree in Computer Science, Software Engineering, or a related field
- 5+ years of experience in data engineering or a related field
- Strong proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL, SQL Server)
- Experience with big data processing frameworks (e.g., Spark, Hadoop)
- Hands-on experience building ETL/ELT pipelines using tools such as Apache Airflow, Informatica, or similar
- Experience with cloud data platforms (AWS, Azure, or GCP)
- Proficiency in programming languages such as Python, Scala, or Java
- Experience with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery)
- Strong understanding of data modeling concepts (star/snowflake schemas)
group id: RTL806649