About the job
Role requirements:
- Collaborate with data engineers to understand ETL/ELT processes using Azure Data Factory (ADF) and Databricks.
- Develop test plans and test cases for data pipelines, including transformation logic validation and end-to-end data flow.
- Validate data accuracy, completeness, and consistency across systems (e.g., landing, staging, transformation, and reporting layers).
- Implement automated data quality and validation tests.
- Perform regression, integration, and performance testing on data workflows.
- Participate in code reviews, defect triaging, and sprint planning sessions.
- Monitor data pipeline executions and investigate test failures or anomalies.
- Document test results, defects, and quality metrics.
Preferred qualifications:
- Experience with PySpark or notebooks in Databricks.
- Exposure to Azure DevOps, unit testing frameworks, or Great Expectations for data testing.
- Knowledge of data warehousing or medallion architecture (bronze, silver, gold layers).
- Experience with data visualization tools (e.g., Power BI) is a plus.
…