Senior Cloud Data Engineer

Cepal Hellas Financial Services S.A.
Full-time
On-site
Remote

Role Purpose

The Senior Cloud Data Engineer will be responsible for operating and optimizing our cloud-based data processing environment. The role involves working with Databricks, AWS services, Spark, Unity Catalog, and Delta Lake to ensure that data pipelines and analytics workloads run efficiently, securely, and reliably.

Main responsibilities

  • Refines data transformations using PySpark and Spark SQL within Databricks notebooks, enabling efficient processing of large-scale datasets.
  • Leverages orchestration tools such as Apache Airflow to automate, schedule, and oversee data workflows for consistent and reliable execution.
  • Participates in code reviews, testing, and documentation as part of the development lifecycle.
  • Supports and troubleshoots Databricks jobs, Spark workloads, and AWS-based data processes across DEV/QA/Production.
  • Optimizes Databricks clusters and jobs for performance and cost.
  • Maintains and improves existing data pipelines built with AWS CodePipeline, Delta Lake, and Databricks Notebooks.
  • Works closely with data engineering and analytics teams to improve data quality and pipeline reliability.
  • Maintains and enhances CI/CD workflows for Databricks deployments using AWS tools.
  • Manages access controls with IAM and Unity Catalog, ensuring secure and compliant data usage.
  • Performs regular monitoring, troubleshooting, and root-cause analysis of data and compute workloads.

Requirements

Education, Experience and Technical Skills

  • Bachelor’s degree in Computer Science, Information Systems, Data Engineering, or a similar field
  • A Master’s degree will be considered an asset
  • 5+ years of experience in big data operations or cloud-based data engineering
  • Strong hands-on experience with AWS, Databricks, Delta Lake, and Apache Spark
  • Proficient in Python, SQL, and PySpark
  • Experience with CI/CD, version control, and release processes (AWS CodePipeline, Git)
  • Experience with monitoring, debugging, and optimizing ETL/ELT and Spark workloads
  • Knowledge of data governance frameworks and exposure to enterprise security or regulated environments will be considered an asset

Competencies

  • Excellent problem-solving skills and attention to detail
  • Strong communication skills and the ability to work collaboratively in a team environment
  • Effective time management, with the ability to multitask and prioritize work