Hire.Monster

Software Engineer, Change data capture

Bengaluru

Job description

About the team

Stripe’s Change Data Capture (CDC) system powers the company’s data infrastructure by streaming real-time database changes into the Stripe Data Lake. That low-latency feed unlocks timely insights and workflows across Payments, Ledger, machine learning, fraud detection, product analytics, regulatory reporting, and financial reconciliation — and it underpins externally facing tools like Radar and Sigma. CDC operates at production scale (~1M events/sec) and supports a broad user base: engineering teams, data scientists, sales & operations, and finance.

CDC provides multiple products that let customers backfill historical data, stream real‑time changes into the Data Lake, and archive records to query‑optimized analytics stores — all without compromising the reliability, maintainability, or scalability of our transactional databases (for example, MongoDB).
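To make the streaming model concrete, here is a minimal sketch of how a downstream sink might apply change events. The event shape (an `op` code with `before`/`after` row images) follows the Debezium envelope convention mentioned below; the in-memory `table` dict stands in for a Data Lake sink and is purely illustrative, not Stripe's actual implementation.

```python
# Minimal sketch: applying Debezium-style change events to a keyed table.
# "c" = create, "u" = update, "d" = delete (Debezium op codes).

def apply_event(table: dict, event: dict) -> None:
    """Apply a single change event to a table keyed by row id."""
    op = event["op"]
    if op in ("c", "u"):
        row = event["after"]        # new row image
        table[row["id"]] = row
    elif op == "d":
        table.pop(event["before"]["id"], None)  # old row image carries the key

# A short stream of changes for one row (illustrative data only).
events = [
    {"op": "c", "before": None, "after": {"id": 1, "amount": 500}},
    {"op": "u", "before": {"id": 1, "amount": 500}, "after": {"id": 1, "amount": 750}},
    {"op": "d", "before": {"id": 1, "amount": 750}, "after": None},
]

table: dict = {}
for e in events:
    apply_event(table, e)
# After create, update, and delete, the row is gone again.
```

A real pipeline would read these envelopes from a log (e.g. Kafka topics populated by Debezium) and write to query-optimized storage, but the per-event apply logic has this basic shape.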

What you’ll do

As a Software Engineer on our team, you will:
  • Design, build, and maintain first- and next-generation versions of key CDC products, with an emphasis on usability, reliability, security, and efficiency.
  • Design ergonomic APIs and abstractions that create a great experience for internal Stripes, which in turn enhances the experience of millions of Stripe users.
  • Ensure operational excellence and keep the CDC platform highly available and reliable.
  • Collaborate nimbly with high-visibility teams and their stakeholders to support their key initiatives, while building a robust platform that benefits all of Stripe over the long term.
  • Plan for the growth of Stripe’s infrastructure by unblocking, supporting, and communicating proactively with internal partners to achieve results.
  • Connect your work with improvements in the usability and reliability of Open Source Software (OSS) like Debezium and contribute back to the OSS community.
Who you are

We’re looking for someone who meets the minimum requirements to be considered for the role. If you meet these requirements, you are encouraged to apply. The preferred qualifications are a bonus, not a requirement.
  • 8+ years of professional experience writing high-quality, production-level code, and an interest in data infrastructure.
  • Has experience operating or enabling large-scale, high-availability data pipelines, from design through execution and safe change management. Expertise in Spark, Flink, Airflow, Python, Java, SQL, and API design is a plus.
  • Has experience developing, maintaining, and debugging distributed systems built with open-source tools.
  • Has experience building infrastructure-as-a-product with a strong focus on users’ needs.
  • Has strong collaboration and communication skills, and can comfortably interact with both technical and non-technical participants.
  • Has the curiosity to continuously learn about new technologies and business processes.
  • Is energized by delivering effective, user-first solutions through creative problem-solving and collaboration.
  • Has experience writing production-level code; expertise in Java, Go, Scala, and Airflow is a plus.
  • Has experience designing APIs or building developer platforms
  • Has experience optimizing the end to end performance of distributed systems
  • Has experience with scaling distributed systems in a rapidly moving environment.
  • Has experience working with Airflow infrastructure.
  • Has a genuine enjoyment of innovation and a deep interest in understanding how things work.

Office-assigned Stripes spend at least 50% of their time in a given month in their local office or with users. This strikes a balance between bringing people together for in-person collaboration and learning from each other, and supporting flexibility in how individuals and teams make that work. Office location: Bengaluru, KA, India.

Published: 12.01.2026