Top Secret/SCI
CI Polygraph
IT - Data Science
VA (On-Site/Office)
Overview
Iron EagleX (IEX), a wholly owned subsidiary of General Dynamics Information Technology, delivers agile IT and Intelligence solutions. Combining small-team flexibility with global scale, IEX leverages emerging technologies to provide innovative, user-focused solutions that empower organizations and end users to operate smarter, faster, and more securely in dynamic environments.
Responsibilities
Job Description:
Iron EagleX is seeking a Data Scientist SME to join our dynamic team in Crystal City, VA. This role creates and delivers innovative analytics through production software as a member of a fast-paced, multi-discipline team. This position is onsite and requires 60-90 days of annual travel to CONUS sites.
Job Duties Include (but not limited to):
- Build and ship Python-based analytic services (APIs, batch, streaming) with CI/CD, testing, and containerization.
- Build and update web-based GUIs (Streamlit, etc.) for the lightweight visualization of complex analytics.
- Design and harden automated methods for linking disparate sources across very large datasets.
- Own the lifecycle: ingest/ETL, modeling, deployment, and monitoring (performance, drift, data quality).
- Work with analysts, developers, and other data scientists to turn workflows into robust, automated pipelines.
- Apply supervised/unsupervised learning, graph methods, and statistical modeling.
- Enforce engineering rigor: unit and integration testing, typing, code review, and documentation.
- Identify data gaps/anomalies and improve TTPs and methodologies in classified environments.
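To give candidates a concrete sense of the record-linkage duty above, here is a toy sketch of linking entities across two sources by normalized name. The field names and the `link_records` helper are illustrative only; a production pipeline would add blocking and probabilistic matching:

```python
import difflib

import pandas as pd


def link_records(left: pd.DataFrame, right: pd.DataFrame,
                 key: str = "name", cutoff: float = 0.85) -> pd.DataFrame:
    """Link each row in `left` to the closest-matching name in `right` (toy example)."""
    norm = lambda s: s.str.lower().str.strip()  # simple normalization before matching
    candidates = norm(right[key]).tolist()
    # For each left-hand name, take the best fuzzy match above the cutoff, else None.
    matches = [
        (m[0] if (m := difflib.get_close_matches(n, candidates, n=1, cutoff=cutoff)) else None)
        for n in norm(left[key])
    ]
    left = left.assign(_match=matches)
    right = right.assign(_match=norm(right[key]))
    return left.merge(right, on="_match", how="left", suffixes=("_l", "_r")).drop(columns="_match")
```

Unmatched left-hand rows survive the left join with nulls, so downstream code can route them to review rather than silently dropping them.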
Qualifications
Required Skills & Experience:
- Production Python expertise (packages, APIs/services, packaging, dependency management).
- SQL proficiency; experience with large-scale systems (Trino, PostgreSQL, Hive; OpenSearch, Elasticsearch); distributed compute experience a plus.
- Proven model productionization (FastAPI, batch jobs, or streaming), CI/CD, Docker, and Linux.
- Strong with NumPy, SciPy, pandas, scikit-learn, Matplotlib; comfortable moving from exploration to maintainable code.
- Practical experience in entity resolution, graph/relational reasoning, and applied statistics.
- Experience integrating structured, unstructured text, and semi-structured data.
- Due to US Government Contract Requirements, only US Citizens are eligible for this role.
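"Moving from exploration to maintainable code" often means wrapping a notebook model in a typed, testable function. A minimal sketch with scikit-learn follows; the `train_classifier` name and its defaults are illustrative, not a prescribed team API:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler


def train_classifier(X: np.ndarray, y: np.ndarray, seed: int = 0) -> Pipeline:
    """Fit a scaled logistic-regression pipeline: one shippable object, not a notebook."""
    model = Pipeline([
        ("scale", StandardScaler()),               # normalize features before fitting
        ("clf", LogisticRegression(random_state=seed)),
    ])
    return model.fit(X, y)
```

Bundling the scaler and the model into one `Pipeline` keeps preprocessing and inference in lockstep at deploy time, which is exactly the kind of engineering rigor this role expects.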
Desired Skills:
- Orchestration & Data Quality: Airflow
- Distributed & Streaming: Spark, Kafka
- Development and CI/CD: Kubernetes, pytest
- Observability & Reliability: Data & model drift monitoring
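On the drift-monitoring line above, one common lightweight approach (not necessarily what this team uses) is a two-sample Kolmogorov-Smirnov test comparing a feature's reference distribution against live data:

```python
import numpy as np
from scipy.stats import ks_2samp


def drifted(reference: np.ndarray, live: np.ndarray, alpha: float = 0.01) -> bool:
    """Flag drift when the KS test rejects 'same distribution' at significance alpha."""
    return ks_2samp(reference, live).pvalue < alpha
```

A check like this runs per feature on a schedule and alerts when the live distribution shifts, covering the "data & model drift monitoring" responsibility at its simplest.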
Education & Certifications:
- Bachelor's degree in Computer Science, Statistics, Engineering, or a related field (or equivalent experience). Advanced degrees are a plus.
Security Clearance:
- An active TS/SCI security clearance is REQUIRED and candidates must have or be willing to obtain a CI Poly. Candidates without this clearance will not be considered.
Equal Opportunity Employer / Individuals with Disabilities / Protected Veterans