Job Requirements
Fort Meade, MD; Hill AFB, UT; Scott AFB, IL
Secret clearance
Polygraph not specified
Mid-Level Career (5+ yrs experience)
Salary not specified
Job Description
One of our top federal IT partners is hiring an experienced Data Integration Engineer to support efforts around its core data and analytics platform. You will join a team that securely connects more than 3 million end users at over 3,000 DoD and federal sites around the world. In this position, you will help lead data integration development, focusing on expanding the foundational Integrated Data Architecture platform and delivering modern analytics capabilities in support of mission-critical DoD network operations.
*Must work onsite three days per week (or as needed) at one of the following locations: Scott Air Force Base, IL; Fort Meade, MD; or Hill Air Force Base, UT.*
Apply today to connect with our team for more details!
REQUIREMENTS:
• 5+ years of hands-on data integration experience developing and implementing Kafka integrations with the ELK/Elastic Stack (Elasticsearch, Logstash, Kibana)
o Including work with RESTful APIs, connectors, and event-streaming pipelines (see the illustrative sketch after this list)
• 3+ years of Databricks experience building pipelines with Delta tables, performing data cleansing, and implementing the Bronze/Silver/Gold (medallion) architecture
• Extensive Python and/or Java programming experience
• Experience working in Agile development environment (SDLC) supporting sprint cycles, testing, and deployment activities
• Active Secret clearance or higher REQUIRED
• CompTIA Security+ certification required within the first 14 days of employment
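To make the Kafka-to-Elastic requirement concrete, here is a minimal, illustrative Python sketch of one common integration pattern: consuming JSON events from a Kafka topic and bulk-indexing them into Elasticsearch. The broker address, topic, index name, and field layout are placeholder assumptions, not details from this posting.

```python
# Hypothetical sketch: drain JSON events from a Kafka topic and bulk-index
# them into Elasticsearch. Broker, topic, and index names are placeholders.
import json

from confluent_kafka import Consumer
from elasticsearch import Elasticsearch, helpers

consumer = Consumer({
    "bootstrap.servers": "broker:9092",    # assumption: broker address
    "group.id": "elk-integration",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["network.events"])     # assumption: topic name

es = Elasticsearch("https://elastic:9200") # assumption: cluster URL

def drain(consumer, size=500, timeout=1.0):
    """Consume up to `size` messages and yield Elasticsearch bulk actions."""
    for msg in consumer.consume(num_messages=size, timeout=timeout):
        if msg.error():
            continue                       # skip transport-level errors
        yield {"_index": "network-events", "_source": json.loads(msg.value())}

try:
    while True:
        actions = list(drain(consumer))
        if actions:
            helpers.bulk(es, actions)            # one bulk request per batch
            consumer.commit(asynchronous=False)  # commit only after indexing
finally:
    consumer.close()
```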
PREFERRED SKILLSET:
• Confluent Certified Developer and/or Elastic Certified Engineer certifications
• Kubernetes containerization experience for cloud deployments and/or experience with AWS GovCloud environments
• Experience developing and deploying software in a DoD environment (DISA experience is a plus)
• Experience with Atlassian tools including JIRA and Confluence
RESPONSIBILITIES:
• Develop and implement integration solutions using Kafka and Elastic as the primary data architecture platforms, with expanded integration to other technologies, including but not limited to Databricks
• Integrate data sources into Databricks, Confluent (Kafka), and Elastic platforms in support of the GMS core data and analytics environment
• Develop Kafka system integrations and custom connectors, and work with ksqlDB and Kafka Streams for data processing based on the design solution (see the ksqlDB sketch after this list)
• Design and implement solutions within Databricks, including Delta Lake tables and medallion architecture layers, to support analytics and data transformation initiatives (see the Delta Lake sketch after this list)
• Develop Kafka-based integrations into Databricks and Elastic, including custom connectors, APIs, and event streaming pipelines
• Support the integration of new data sources, including DODNet, into the existing platform environment
• Support and maintain Elasticsearch and Logstash integrations to ensure continuity and stability of the existing ELK environment
• Automate the full software lifecycle from design and development through testing and deployment, including production environments
• Work with other members of the data integration team to propose solutions based on mission needs and platform strategies
• Develop DoD requirements, traceability, and detailed plans and schedules, including software systems engineering and interface documents (IDDs/ICDs)
• Interact with the customer to address data engineering technical considerations and associated problems, issues, or conflicts
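The ksqlDB work named above typically starts by registering a stream over an existing Kafka topic. As a hedged illustration, the Python sketch below submits a CREATE STREAM statement through ksqlDB's REST API (POST /ksql); the server URL, topic, and schema are assumptions for illustration only.

```python
# Hypothetical sketch: register a ksqlDB stream over a Kafka topic via the
# ksqlDB REST API. Server URL, topic, and column names are placeholders.
import requests

KSQLDB_URL = "http://ksqldb:8088"  # assumption: ksqlDB server address

statement = """
    CREATE STREAM network_events (
        host VARCHAR,
        severity VARCHAR,
        ts BIGINT
    ) WITH (
        KAFKA_TOPIC = 'network.events',
        VALUE_FORMAT = 'JSON'
    );
"""

# ksqlDB executes DDL statements submitted to the /ksql endpoint.
resp = requests.post(
    f"{KSQLDB_URL}/ksql",
    json={"ksql": statement, "streamsProperties": {}},
)
resp.raise_for_status()
print(resp.json())
```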
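For the Databricks medallion responsibility, the following PySpark sketch shows one conventional bronze-to-silver promotion: deduplicate raw events, drop malformed rows, and normalize timestamps into a cleansed Delta table. Table paths and column names (event_id, host, event_time) are assumptions, not details from this posting.

```python
# Hypothetical sketch: promote raw events from a bronze Delta table to a
# cleansed silver table. Paths and column names are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("medallion-demo").getOrCreate()

# Bronze layer: raw records landed as-is (e.g., by a Kafka streaming job).
bronze = spark.read.format("delta").load("/mnt/lake/bronze/network_events")

# Silver layer: deduplicated, validated, and conformed records.
silver = (
    bronze
    .dropDuplicates(["event_id"])                           # drop replayed events
    .filter(F.col("host").isNotNull())                      # discard malformed rows
    .withColumn("event_ts", F.to_timestamp("event_time"))   # parse timestamps
)

silver.write.format("delta").mode("overwrite").save("/mnt/lake/silver/network_events")
```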