- Key Responsibilities
- Strategic Data Modeling: Translate complex business requirements into efficient, scalable data models and schemas. You will design the logic that turns raw events into actionable business intelligence.
- Pipeline Architecture: Design, implement, and maintain resilient data pipelines that serve multiple business domains. You will ensure data flows reliably, securely, and with low latency across our ecosystem.
- End-to-End Ownership: Own the data development lifecycle completely—from architectural design and testing to deployment, maintenance, and observability.
- Cross-Functional Partnership: Partner closely with Data Analysts, Data Scientists, and Software Engineers to deliver end-to-end data solutions.
- What You Bring
- Your Mindset:
- Data as a Product: You treat data pipelines and tables with the same rigor as production APIs—reliability, versioning, and uptime matter to you.
- Business Acumen: You don't just move data; you understand the business questions behind the queries and design solutions that answer them.
- Builder's Spirit: You work independently to balance functional needs with non-functional requirements (scale, cost, performance).
- Your Experience & Qualifications:
- Must Haves:
- 6+ years of experience as a Data Engineer, BI Developer, or similar role.
- Modern Data Stack: Strong hands-on experience with dbt, Snowflake, Databricks, and orchestration tools such as Airflow.
- SQL & Modeling: Strong proficiency in SQL and deep understanding of data warehousing concepts (star and snowflake schemas).
- Data Modeling: Proven experience in data modeling and business logic design for complex domains—building models that are efficient and maintainable.
- Modern Workflow: Proven experience leveraging AI assistants to accelerate data engineering tasks.
- Bachelor’s degree in Computer Science, Industrial Engineering, Mathematics, or an equivalent analytical discipline.
- Preferred / Bonus:
- Cloud Data Warehouses: Experience with BigQuery or Redshift.
- Coding Skills: Proficiency in Python for data processing and automation.
- Big Data Tech: Familiarity with Spark, Kubernetes, Docker.
- BI Integration: Experience serving data to BI tools such as Looker, Tableau, or Superset.

#LI-Hybrid