Data Platform Operations Analyst

Praha / Hybrid
Location: Prague, full remote possible
Languages: Excellent English and Advanced Czech

Level: Senior
Form of cooperation: Contractor
Start date: ASAP
Allocation: Full-Time
Allocation length: Long term

  • We are looking for an analytical and detail-oriented Data Platform Operations Analyst to join our team. This position requires a strong problem-solving mindset and the ability to reverse-engineer and perform root cause analysis (RCA) on complex issues spanning multiple data layers. The ideal candidate has experience tracing problems from the visualization layer (Power BI) through model views to source databases, with deep knowledge of table relationships and measure calculations.

Issue Investigation & Root Cause Analysis:

  • Perform end-to-end issue analysis, tracing data inconsistencies or failures from Power BI reports back to model views, stored procedures, and source databases.
  • Identify and resolve data integrity issues by analyzing relationships between tables, measures, and transformations across the data stack.
  • Reverse-engineer data pipelines and dependencies to troubleshoot performance bottlenecks and ensure accurate reporting.
  • Debug ETL processes and optimize data ingestion workflows.
  • Maintain detailed documentation of RCA findings and recommend long-term solutions to prevent recurrence.
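
A minimal sketch of the kind of reconciliation such an investigation might start from, written in PySpark; the catalog, table/view names, columns, and date filter are hypothetical placeholders rather than part of this role's actual environment.

```python
# Hypothetical RCA reconciliation: compare an aggregate reported in Power BI
# against the model view and the source table to localize where a discrepancy appears.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

model_total = (
    spark.table("analytics.sales_model_view")          # placeholder model view
    .filter(F.col("order_date") >= "2024-01-01")
    .agg(F.sum("net_amount").alias("total"))
    .collect()[0]["total"] or 0
)

source_total = (
    spark.table("raw.sales_source")                     # placeholder source table
    .filter(F.col("order_date") >= "2024-01-01")
    .agg(F.sum("net_amount").alias("total"))
    .collect()[0]["total"] or 0
)

# A mismatch here points the RCA at the transformation layer between source and model;
# a match points it at the Power BI semantic layer (relationships, DAX measures, filters).
print(f"model view: {model_total}, source: {source_total}, delta: {model_total - source_total}")
```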

Azure Databricks Management:

  • Monitor Azure Databricks clusters, jobs, and data pipelines to ensure optimal performance and availability.
  • Develop and manage data ingestion and transformation workflows, ensuring data consistency and reliability.
  • Investigate data pipeline failures, query slowdowns, and unexpected transformations impacting business intelligence reporting.
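
As an illustration of job monitoring, the Python sketch below polls recent run states via the Databricks Jobs API 2.1 runs/list endpoint; the workspace host and token are assumed to come from environment variables, and a production monitor would add paging, filtering, and alerting.

```python
# Minimal job-run health check against the Databricks Jobs API 2.1 (runs/list).
import os
import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. https://adb-<workspace-id>.azuredatabricks.net
token = os.environ["DATABRICKS_TOKEN"]  # personal access token (placeholder source)

resp = requests.get(
    f"{host}/api/2.1/jobs/runs/list",
    headers={"Authorization": f"Bearer {token}"},
    params={"limit": 25, "completed_only": "true"},
    timeout=30,
)
resp.raise_for_status()

# Flag any completed run that did not finish with SUCCESS.
for run in resp.json().get("runs", []):
    state = run.get("state", {})
    if state.get("result_state") != "SUCCESS":
        print(f"Run {run['run_id']} ({run.get('run_name', 'unnamed')}): "
              f"{state.get('result_state')} - {state.get('state_message', '')}")
```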

Power BI Operations & Troubleshooting:

  • Support and maintain Power BI reports, dashboards, and datasets, ensuring data accuracy and timely updates.
  • Debug and resolve Power BI model issues, including relationships, DAX measures, and calculated columns.
  • Analyze query performance within Power BI and source databases, optimizing report refresh times and query execution.
  • Maintain user access controls, security policies, and governance best practices within the Power BI environment.
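
For refresh troubleshooting, a check along these lines (Python, Power BI REST API "Get Refresh History In Group") can surface failed dataset refreshes; the workspace and dataset IDs and the Azure AD access token are placeholders, and token acquisition (e.g. via MSAL) is out of scope here.

```python
# Sketch: list recent refreshes for one dataset and print the ones that did not complete.
import os
import requests

workspace_id = os.environ["PBI_WORKSPACE_ID"]   # hypothetical workspace (group) ID
dataset_id = os.environ["PBI_DATASET_ID"]       # hypothetical dataset ID
access_token = os.environ["PBI_ACCESS_TOKEN"]   # AAD token with dataset read permissions

url = (f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}"
       f"/datasets/{dataset_id}/refreshes?$top=5")
resp = requests.get(url, headers={"Authorization": f"Bearer {access_token}"}, timeout=30)
resp.raise_for_status()

for refresh in resp.json().get("value", []):
    if refresh.get("status") != "Completed":
        # serviceExceptionJson carries the error detail for failed refreshes
        print(refresh.get("startTime"), refresh.get("status"), refresh.get("serviceExceptionJson"))
```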

Data Integration and ETL Monitoring:

  • Implement, maintain, and optimize ETL processes using tools like Azure Data Factory and Databricks.
  • Monitor data pipeline performance metrics and proactively address performance bottlenecks.
  • Ensure data transformation logic aligns with business requirements, identifying and resolving anomalies (a minimal validation sketch follows this list).
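
A minimal PySpark sketch of a transformation step with fail-fast validation, assuming hypothetical staging and curated table names and illustrative guardrails.

```python
# Illustrative ETL step: load a staging table, apply a transformation, and stop the load
# if the result violates simple expectations, instead of propagating bad data downstream.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

staged = spark.table("staging.orders")  # placeholder staging table

transformed = (
    staged
    .withColumn("net_amount", F.col("gross_amount") - F.coalesce(F.col("discount"), F.lit(0)))
    .filter(F.col("order_status") != "CANCELLED")
)

row_count = transformed.count()
null_keys = transformed.filter(F.col("order_id").isNull()).count()

# Guardrails: an empty output or null business keys usually indicate an upstream problem.
if row_count == 0 or null_keys > 0:
    raise ValueError(f"ETL validation failed: rows={row_count}, null order_id={null_keys}")

transformed.write.mode("overwrite").saveAsTable("curated.orders")  # placeholder target
```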

Performance Monitoring and Optimization:

  • Conduct regular performance tuning and capacity monitoring for Azure Databricks and Power BI.
  • Analyze data usage patterns and recommend enhancements to improve efficiency and reduce costs.
  • Collaborate with engineers and analysts to implement best practices for data model optimization and visualization performance.
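
As a rough example of capacity monitoring, the sketch below snapshots file counts and sizes for a few Delta tables via DESCRIBE DETAIL; the table list is illustrative, and small-file compaction (OPTIMIZE) is only one of several possible follow-ups.

```python
# Capacity/health snapshot for a set of Delta tables: a typical first step
# when hunting slow refreshes or scans. Table names are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

tables = ["curated.orders", "curated.customers"]  # placeholder table names

for table in tables:
    detail = spark.sql(f"DESCRIBE DETAIL {table}").collect()[0]
    size_gb = detail["sizeInBytes"] / (1024 ** 3)
    avg_file_mb = (detail["sizeInBytes"] / detail["numFiles"] / (1024 ** 2)) if detail["numFiles"] else 0
    # Many small files are a common cause of slow scans; compaction may be warranted.
    print(f"{table}: {detail['numFiles']} files, {size_gb:.1f} GB, avg file {avg_file_mb:.0f} MB")
```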

Documentation & Collaboration:

  • Maintain clear documentation on issue resolutions, RCA processes, and troubleshooting methodologies.
  • Work closely with cross-functional teams, including data engineers, analysts, and business users, to enhance platform stability.
  • Participate in meetings, workshops, and knowledge-sharing sessions to drive continuous improvement in data operations.

Qualifications:

  • 3 to 5 years of experience in data operations, troubleshooting, and business intelligence support.
  • Strong analytical and debugging skills, with a track record of reverse-engineering and resolving complex data issues.
  • Expertise in tracing data lineage from Power BI visualizations to model views and underlying database tables.
  • Proficiency in Azure Databricks, including cluster management, job scheduling, and performance tuning.
  • Deep understanding of Power BI, including DAX, data modeling, relationships, and report optimization.
  • Strong SQL and Python skills for data analysis, troubleshooting, and ETL debugging.
  • Experience with cloud-based data platforms (Azure, AWS, or GCP) and data integration processes.
  • Excellent problem-solving and communication skills, with the ability to convey technical findings to non-technical stakeholders.
  • Relevant certifications in Azure, Power BI, or Data Engineering are a plus.