Job Requirements
Chantilly, VA
Top Secret/SCI Full Scope Polygraph
Career Level not specified
Salary not specified
Job Description
We’re looking for a Data Software Engineer to join a team focused on building and optimizing large-scale data systems in a cloud environment. This role is ideal for someone who enjoys working with complex datasets and owning the full data lifecycle—from ingestion and transformation to delivery and visualization.
You’ll design and develop scalable data pipelines and ETL processes, working with technologies like Java, SQL, and cloud-based platforms to process and analyze high volumes of data. The role involves hands-on work with data cleansing, profiling, and optimization, as well as building custom solutions to integrate new and existing data sources.
This is a great opportunity for someone who thrives in a big data, fast-paced environment, enjoys solving complex data challenges, and wants to contribute to systems that support mission-critical decision-making.
Duties, Tasks & Responsibilities
Designing and implementing large-scale ingest systems in a Big Data environment
Optimizing all stages of the data lifecycle, from initial planning and ingest through final display and beyond
Designing and implementing data extraction, cleansing, transformation, loading, and replication/distribution
Developing custom solutions/code to ingest and exploit new and existing data sources
Developing data profiling, deduping logic, and matching logic for analysis
Organizing and maintaining data layer documentation, so others are able to understand and use it
Collaborating with teammates, other service providers, vendors, and users to develop new and more efficient methods
Effectively articulating the risks and constraints associated with software solutions, based on environment
Required Experience, Skills, & Technologies
Strong software development experience, including significant Java development, data analysis/parsing, and SQL/database experience
Strong experience with the full data lifecycle, from ingest through display, in a Big Data environment
Strong experience with Java-related technologies, such as JDK, J2EE, EJB, JDBC, and/or Spring, and experience with RESTful APIs
Experience developing and performing ETL tasks in Linux and/or Cloud environments
Desired Experience, Skills & Technologies
Experience with Hadoop, HBase, and MapReduce
Experience with Elasticsearch
Experience working in a mission environment and/or with many different types of data