Job Description
One of our clients, a leader in their industry, is seeking a Senior Data Engineer to design, build, and optimize their data infrastructure. You’ll play a key role in creating scalable, efficient data pipelines using dbt and BigQuery that deliver the insights driving business decisions.
Key Responsibilities
- Develop and maintain robust data pipelines using dbt and Google BigQuery.
- Collaborate with data teams to integrate, model, and transform data across systems.
- Optimize data workflows for performance and cost-effectiveness.
- Ensure data quality and consistency, implementing best practices in data warehousing.
- Design and manage ETL/ELT processes to handle large datasets.
- Monitor, troubleshoot, and resolve data pipeline issues.
- Mentor junior engineers and ensure adherence to coding standards.
Qualifications
- 5+ years of experience in data engineering and building data pipelines.
- Expertise in dbt and BigQuery.
- Strong skills in SQL and experience with Google Cloud Platform (GCP).
- Experience with ETL/ELT processes and data warehousing.
- Familiarity with Python and with tools such as Airflow, Kubernetes, and version control systems (e.g., Git).
- Strong problem-solving and troubleshooting skills.
Nice to Have
- Knowledge of AI/ML workflows.
- Experience with Snowflake, Redshift, or other data platforms.
Why Join
- Work on impactful data projects with a fast-growing client.
- Competitive compensation and career growth opportunities.