About the Role
We are seeking a skilled and detail-oriented Analytics Engineer to design, build, and maintain a centralized data mart that empowers data-driven decision-making across our organization. You will play a crucial role in bridging data engineering and analytics by transforming raw data into well-modeled, accessible, and reliable data assets for business intelligence, reporting, and advanced analytics.
Key Responsibilities:
- Design, develop, and maintain a centralized data mart to serve as the single source of truth for reporting and analytics.
- Work closely with data engineers to integrate data from multiple sources into clean, governed, and scalable models.
- Transform and model data using modern data transformation tools (e.g., dbt, Spark SQL, BigQuery SQL).
- Define and implement data modeling standards and best practices to ensure consistency, reusability, and documentation across data assets.
- Collaborate with analysts, data scientists, and business stakeholders to understand reporting and analytical needs and translate them into efficient data structures.
- Ensure data quality through rigorous validation, testing, and monitoring.
- Contribute to establishing data governance frameworks, including metadata management, data lineage, and access controls.
- Troubleshoot data issues and provide ongoing support to ensure the reliability and performance of the centralized data mart.
- Optimize query performance through tuning, query plan analysis, and distributed computing best practices.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Analytics, or a related field.
- 3+ years of experience in analytics engineering, data engineering, or BI engineering.
- Solid experience in designing and implementing data models for analytics use cases.
- Strong SQL skills and proficiency in data modeling (dimensional modeling, star/snowflake schemas).
- Experience with data transformation tools such as dbt or equivalent frameworks.
- Familiarity with modern data warehousing platforms (e.g., Snowflake, BigQuery, Redshift).
- Experience with orchestration tools (e.g., Airflow, Prefect) and version control (e.g., Git) is a plus.
- Understanding of data governance concepts including data quality, lineage, and documentation.
- Strong analytical and problem-solving skills with a business-oriented mindset.
- Excellent communication skills and ability to collaborate across technical and non-technical teams.