Today
Mid Level Career (5+ yrs experience)
$140,000
No Traveling
IT - Software
McLean, VA (Off-Site/Hybrid)
We’re Hiring: Senior Python Developers (4 Openings)
Apply now by sending your resume to awade@apexsystems.com
Locations: McLean, VA (Hybrid), Richmond, VA (RVA)
Duration: 6-month contract (possible extension)
Rate: $70/hr
Experience: 5–7 years (Senior Level)
Note: Prior Capital One experience is required.
Join a fast-moving surge team within Card Tech – Digital Experience (DX), supporting high-impact platforms that power real-time and batch messaging systems. You’ll help scale and optimize complex event-driven systems for one of the largest financial institutions in the U.S.
⸻
Key Responsibilities
• Develop and scale real-time and batch data pipelines
• Support ingestion from multiple event streams and batch files (see the sketch after this list)
• Design and build bulk pipelines and optimize existing ones for scalability
• Contribute to microservices architecture and API design
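For orientation only, here is a minimal Python sketch of the kind of stream-ingestion work these responsibilities describe. The Kinesis shard, the DynamoDB table name ("events"), and the record shape are assumptions for illustration, not details from the posting.

```python
# Hedged illustration: read one batch of records from a Kinesis shard and
# persist them to DynamoDB. Stream, table name, and record shape are
# hypothetical; a production pipeline would add error handling and checkpointing.
import json

import boto3

kinesis = boto3.client("kinesis")
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("events")  # hypothetical table name


def ingest_batch(shard_iterator: str, limit: int = 100) -> str:
    """Read up to `limit` records from a shard and write them to DynamoDB."""
    response = kinesis.get_records(ShardIterator=shard_iterator, Limit=limit)
    with table.batch_writer() as writer:  # batches the writes automatically
        for record in response["Records"]:
            event = json.loads(record["Data"])  # record payload arrives as bytes
            writer.put_item(Item=event)
    return response["NextShardIterator"]  # resume point for the next batch
```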
⸻
Must-Have Skills
• Python (primary) and Java (secondary)
• AWS: Lambda, ECS, EC2, Fargate
• Kafka or Kinesis
• Node.js
• DynamoDB
• RESTful APIs
• Experience building data pipelines (Spark experience preferred)
• Familiarity with performance testing is a plus
⸻
Interview Process
• 1 Round (1 hour)
• 2nd round only if needed
⸻
Platform Details
You’ll be working on two interconnected platforms (a rough code sketch of the first pattern follows this list):
• One platform listens to multiple event streams, ingests data, and triggers workflows using AWS Lambda functions
• The second platform reads from event stores, processes real-time and batch data, and is being optimized to support larger scale for Discover
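To make the first platform’s pattern concrete, here is a hedged Python sketch of a Lambda handler that decodes records from a Kinesis-style event stream and kicks off a downstream workflow via Step Functions. The state machine ARN, the environment variable, and the payload shape are assumptions for illustration, not details from the posting.

```python
# Illustrative sketch of the stream-listener pattern described above:
# decode each stream record and start one workflow execution per record.
import base64
import json
import os

import boto3

sfn = boto3.client("stepfunctions")
STATE_MACHINE_ARN = os.environ.get("STATE_MACHINE_ARN", "")  # hypothetical config


def handler(event, context):
    """Lambda entry point for a Kinesis event source mapping."""
    records = event.get("Records", [])
    for record in records:
        # Kinesis delivers the payload base64-encoded under record["kinesis"]["data"]
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        sfn.start_execution(
            stateMachineArn=STATE_MACHINE_ARN,
            input=json.dumps({"eventId": record["eventID"], "body": payload}),
        )
    return {"processed": len(records)}
```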
Apply now by sending your resume to awade@apexsystems.com