
Data Engineer

Eliassen Group

Public Trust
Remote/Hybrid (Off-Site/Hybrid)

Description:
We are seeking a skilled Databricks Specialist / Data Engineer to design, implement, and optimize big data pipelines and analytics solutions. The ideal candidate will have hands-on experience with Databricks, Apache Spark, and cloud-based data platforms, along with a strong understanding of data governance, security, and compliance standards in regulated environments.

Location: Fully Remote

Pay Rate: $70 - $80 / hr

Clearance: Must have an Active MBI Public Trust

This is a contract-to-hire opportunity. Applicants must be willing and able to work on a W2 basis and convert to FTE at the end of the contract period. For our W2 consultants, we offer a great benefits package that includes Medical, Dental, and Vision benefits, 401k with company matching, and life insurance.

Key Responsibilities:
  • Design, develop, and maintain scalable ETL/ELT pipelines using Databricks and Apache Spark.
  • Work with large datasets from multiple sources (structured and unstructured) to support analytics and reporting.
  • Collaborate with agency teams to implement data lakehouse solutions and optimize data storage and retrieval.
  • Develop and maintain Delta Lake tables and pipelines to ensure data reliability and ACID compliance.
  • Implement data transformations, cleansing, and enrichment processes for downstream analytics.
  • Use Python, SQL, and other scripting languages to build and automate workflows.
  • Ensure compliance with federal security standards (e.g., NIST 800-53, IRS 1075).
  • Collaborate with data scientists, analysts, and stakeholders to deliver actionable insights.
  • Participate in performance tuning, troubleshooting, and optimization of Databricks clusters and workloads.


Required Qualifications:
  • Hands-on experience with Databricks and Apache Spark in production environments.
  • Strong data engineering skills including ETL/ELT pipelines, data modeling, and analytics.
  • Proficiency in Python, SQL, or Scala for data processing.
  • Experience with cloud platforms (AWS, Azure, or GCP), including storage, compute, and security controls.
  • Knowledge of Delta Lake, data lake architecture, and big data best practices.
  • Experience working in federal or highly regulated environments preferred.


Desired Skills:
  • Familiarity with data governance, metadata management, and lineage tracking.
  • Experience integrating Databricks with BI tools such as Power BI or Tableau.
  • Knowledge of machine learning workflows in Databricks.
  • Understanding of federal modernization initiatives and data sensitivity requirements.


Education & Certifications:
  • Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
  • Preferred certifications:
    • Databricks Certified Data Engineer
    • AWS / Azure / GCP Cloud Certifications
    • Certified Analytics Professional (CAP)