
Data Engineer

Remote · Analytics · Other

Job Description

We are seeking a Data Engineer to build, maintain, and optimize the data ecosystem that powers Tidal’s ETF platform. This role sits at the intersection of data engineering, analytics, and infrastructure—responsible for developing robust, cloud-native data pipelines and ensuring reliable data delivery across business and operational systems. You will contribute to technical design, implement data engineering best practices, and collaborate with cross-functional partners to ensure that our data infrastructure supports accurate, timely, and compliant decision-making across the organization.

Responsibilities

What you'll do

1. Data Pipeline Development & Maintenance
  • Build, maintain, and optimize scalable data pipelines in AWS (or equivalent), leveraging technologies such as Snowflake, dbt, and modern orchestration frameworks (Airflow, Prefect, etc.); a minimal orchestration sketch follows this list.
  • Integrate complex financial data sources (Bloomberg APIs, custodial feeds, and fund administration data) into reliable, auditable data flows.
  • Contribute to the development of a modular, well-documented data ecosystem supporting analytics, reporting, and operations.
  • Participate in code reviews and ETL design discussions to ensure code quality, performance, and maintainability.

2. Data Quality & Reliability
  • Implement and maintain automated data validation and testing checks (e.g., Great Expectations or similar).
  • Monitor data pipeline performance and troubleshoot issues to ensure reliability and accuracy.
  • Adhere to established standards for data lineage, versioning, and documentation within the engineering environment.
  • Work with the team to ensure that all financial data meets Tidal's quality and regulatory expectations.

3. Collaboration & Business Support
  • Work with sales, operations, and trading teams to understand data requirements and implement infrastructure that delivers actionable insights.
  • Collaborate with analytics teams to support model optimization, dataset creation, and query performance in BI tools such as Sigma.
  • Assist in automating workflows to support data-driven decision-making across the ETF product lifecycle.

4. Engineering Best Practices
  • Write clean, efficient, and well-documented code, following best practices in data modeling and system design.
  • Contribute to the team's knowledge base by maintaining documentation on pipelines, design standards, and deployment processes.
  • Participate in sprint planning and stand-ups to ensure timely delivery of features and fixes.
  • Stay curious and keep up to date with emerging technologies and patterns in data engineering and cloud architecture.
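To make the orchestration work concrete, here is a minimal sketch of the kind of daily pipeline this role would own, written with Airflow's TaskFlow API. The pipeline, source, and table names are hypothetical placeholders rather than Tidal systems; treat it as an illustration under those assumptions, not a description of the actual platform.

```python
# Minimal daily pipeline sketch using Airflow's TaskFlow API (Airflow 2.4+).
# Names such as custodial_positions_pipeline and raw_positions are
# hypothetical placeholders for illustration only.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False, tags=["etf"])
def custodial_positions_pipeline():
    @task
    def extract() -> list[dict]:
        # Placeholder for pulling a custodial feed or Bloomberg API response.
        return [{"fund": "ETF_A", "shares": 1000}, {"fund": "ETF_B", "shares": 250}]

    @task
    def validate(rows: list[dict]) -> list[dict]:
        # Fail fast if a record is missing required fields or has bad values.
        for row in rows:
            if not row.get("fund") or row.get("shares", 0) < 0:
                raise ValueError(f"Invalid record: {row}")
        return rows

    @task
    def load(rows: list[dict]) -> None:
        # In practice this would write to a warehouse table (e.g., Snowflake);
        # here it only reports how many rows passed validation.
        print(f"Loading {len(rows)} validated rows into raw_positions")

    load(validate(extract()))


custodial_positions_pipeline()
```

In a production setting the extract and load steps would call real connectors, and the validation step would typically hand off to a framework such as Great Expectations before anything reaches downstream reports.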

Requirements

Qualifications

Education & Experience
  • Bachelor's degree in Computer Science, Data Engineering, or a related technical field, or equivalent practical experience.
  • 3–5 years of professional experience in data engineering or backend software development.
  • Experience building and maintaining cloud-based data architectures, preferably in financial or fintech environments.

Skills

Technical Skills
  • Proficient in Python and SQL for data transformation, pipeline orchestration, and automation (see the sketch after this list).
  • Strong working knowledge of data warehousing (Snowflake, Redshift, or equivalent) and cloud platforms (AWS preferred).
  • Hands-on experience with modern orchestration and transformation tools (Airflow, dbt, Prefect).
  • Familiarity with APIs, event-driven pipelines (Kafka or Kinesis), and ETL frameworks.
  • Solid understanding of version control (Git/GitHub), CI/CD, and Agile methodologies.
  • Familiarity with financial datasets (ETFs, trading, market data) is a plus.

Soft Skills
  • Strong communication and collaboration skills with technical and non-technical teams.
  • Problem-solving mindset with the ability to troubleshoot complex data issues.
  • Eager to learn and contribute to a culture of engineering excellence.
  • Detail-oriented and committed to data reliability and quality.
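As a rough illustration of the Python skill set described above, the sketch below hand-rolls a few data-quality rules (not-null, uniqueness, range) of the sort that a framework like Great Expectations formalizes. It is a generic, dependency-free example with hypothetical column names, not code from Tidal's stack.

```python
# Dependency-free sketch of basic data-quality checks (not-null, uniqueness,
# range). Column names such as "ticker" and "nav" are hypothetical examples.
from collections import Counter


def check_not_null(rows: list[dict], column: str) -> list[str]:
    # Flag any row where the column is missing or None.
    return [f"row {i}: {column} is null" for i, row in enumerate(rows) if row.get(column) is None]


def check_unique(rows: list[dict], column: str) -> list[str]:
    # Flag values that appear more than once in the column.
    counts = Counter(row.get(column) for row in rows)
    return [f"{column}={value!r} appears {n} times" for value, n in counts.items() if n > 1]


def check_range(rows: list[dict], column: str, low: float, high: float) -> list[str]:
    # Flag non-null values that fall outside the allowed interval.
    return [
        f"row {i}: {column}={row[column]} outside [{low}, {high}]"
        for i, row in enumerate(rows)
        if row.get(column) is not None and not (low <= row[column] <= high)
    ]


if __name__ == "__main__":
    sample = [
        {"ticker": "ETF_A", "nav": 101.25},
        {"ticker": "ETF_A", "nav": -3.00},   # duplicate ticker, negative NAV
        {"ticker": None, "nav": 98.40},      # missing ticker
    ]
    failures = (
        check_not_null(sample, "ticker")
        + check_unique(sample, "ticker")
        + check_range(sample, "nav", 0.0, 10_000.0)
    )
    for failure in failures:
        print("FAILED:", failure)
```

In a real pipeline these rules would run as a task inside the orchestrator (as in the earlier DAG sketch) and fail the run before bad data reaches downstream reports.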

  • Required Experience: 3–5 years of professional experience in data engineering or backend software development. Experience building and maintaining cloud-based data architectures, preferably in financial or fintech environments.
  • Required Education: Bachelor's degree in Computer Science, Data Engineering, or a related technical field, or equivalent practical experience.
Posted: 10.01.2026