Posted: Today | Top Secret | Unspecified | Polygraph | IT - Database | GA (On-Site/Office)
Job Details
Data Engineer
Remote Opportunity
Top Secret Clearance Required
We are looking for a cleared Data Engineer to join our client, a technology company that is part of the global Accenture organization, and do work that matters in a collaborative and caring community, where you feel like you belong and are empowered to grow, learn, and thrive through hands-on experience, certifications, industry training, and more.
Join us to drive positive, lasting change that moves missions and the government forward!
The work:
- Pipeline Architect: Design, build, and maintain scalable end-to-end data pipelines using Databricks, Spark, and related technologies.
- Transformation Titan: Develop efficient data processing and transformation workflows to support analytics and reporting needs.
- Integration Hero: Integrate diverse data sources, including APIs, databases, and cloud storage, into unified datasets.
- Collaboration Champion: Work closely with cross-functional teams (data science, analytics, business units) to design and implement data solutions that align with business goals.
- Quality Guardian: Implement robust validation, monitoring, and observability processes to ensure data accuracy, completeness, and reliability.
- Automation Avenger: Contribute to data governance, security, and automation initiatives within the data ecosystem.
- Cloud Commander: Leverage AWS services (e.g., S3, Glue, Lambda, Redshift) to build and deploy data solutions in a cloud-native environment.
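As a purely illustrative sketch of the extract, transform, validate, and load flow described above (all names here are hypothetical; production work would use Databricks and Spark rather than plain Python):

```python
# Minimal extract -> transform -> validate -> load sketch.
# Hypothetical data and function names for illustration only.

def extract():
    # Stand-in for reading from an API, database, or S3 bucket.
    return [
        {"id": 1, "amount": "10.50", "region": "east"},
        {"id": 2, "amount": "3.25", "region": "west"},
    ]

def transform(rows):
    # Normalize types so downstream analytics can aggregate.
    return [{**r, "amount": float(r["amount"])} for r in rows]

def validate(rows):
    # Basic accuracy/completeness checks (the "Quality Guardian" role).
    assert all(r["id"] is not None for r in rows), "missing id"
    assert all(r["amount"] >= 0 for r in rows), "negative amount"
    return rows

def load(rows, target):
    # Stand-in for writing to a warehouse table.
    target.extend(rows)

warehouse = []
load(validate(transform(extract())), warehouse)
```

The same shape scales up directly: swap the list of dicts for a Spark DataFrame and the `load` step for a warehouse write.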
Here's what you need:
- Experience with cloud-based ETL services (e.g., AWS Glue, Google Cloud Dataflow, Azure Data Factory)
- Experience with cloud data warehousing technologies (e.g., Amazon Redshift, Google BigQuery, Snowflake)
- Experience with Python, SQL, Spark, and PySpark
- Experience with data platforms like Databricks, Palantir, and Snowflake
- Familiarity with data orchestration and data quality processes
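To illustrate the Python and SQL skills listed above, here is a minimal, hypothetical example that loads raw records into a table and aggregates them with SQL (using the standard-library sqlite3 module as a stand-in for a real warehouse):

```python
import sqlite3

# Hypothetical example: stage raw rows, then aggregate with SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("east", 10.5), ("east", 2.0), ("west", 3.25)],
)
# Totals per region, as a plain dict for downstream use.
totals = dict(
    conn.execute("SELECT region, SUM(amount) FROM sales GROUP BY region")
)
conn.close()
```

In Redshift, BigQuery, or Snowflake the aggregation SQL is essentially identical; only the connection layer changes.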
Bonus points if you have:
- Experience working with federal clients
- Experience with Docker/Kubernetes, Hadoop/Spark, NiFi, and the ELK stack
- Experience with Agile / Scrum
- Experience with COTS and open-source data engineering tools such as Elasticsearch and NiFi
- Data engineering certification such as Palantir Foundry Data Engineer, Azure Data Engineer Associate, Google Professional Data Engineer, IBM Certified Data Engineer, or similar
Security Clearance:
- Active Top Secret, TS/SCI, or TS/SCI with polygraph clearance
Job Requirements:
- Integrate new data and big data management technologies and software engineering tools
- Utilize new big data tools
- Ensure the data architecture is extensible for big data solutions
- Implement data tools to support the analytics and data science teams
- Live and breathe data systems architecture and engineering
- Design the data architecture and data integration layers
- Ensure that data center networks support data platform workloads
- Develop business critical data solutions
- Manage the migration of data from legacy systems to new data solutions
- Troubleshoot data processing and regular data loads
- Integrate new data technologies/tools across the enterprise
- Evangelize data best practices and implement analytics solutions
- Discover data across many different systems, data sources, and data types
- Collect and store big data
- Execute data center infrastructure engineering projects
- Implement data extraction tools that integrate a variety of data sources and data formats
- Solve big data problems with smart algorithmic solutions
- Assist with data-related technical issues
- Build and integrate data from various sources and manage big data
group id: 10238000