Senior Data Engineer

Req Id: 20340
Job Family: Data Management
Location: Porto, PT, 4100-136


Purpose

  • The Senior Data Engineer is a hands-on technical expert responsible for delivering complex data solutions within SBM’s lakehouse ecosystem. This role focuses on designing, building, and optimizing data pipelines and transformations across the Medallion architecture (Bronze, Silver, and Gold), ensuring high levels of performance, scalability, and governance. Working closely with data architects, the Senior Data Engineer ensures alignment with enterprise data models and standards while mentoring peers, promoting engineering excellence, and driving continuous improvement of SBM’s metadata-driven data platform.

Responsibilities

  • Lead and participate in the design, development, and implementation of scalable data engineering solutions across the Medallion architecture (Bronze, Silver, Gold).
  • Design and maintain efficient data pipelines and integration processes using Azure Data Factory, Synapse, and Azure Data Lake Storage.
  • Develop, test, and deploy SQL and PySpark ETL/ELT workflows, ensuring data quality, consistency, and performance.
  • Collaborate with data architects, data scientists, analysts, and business stakeholders to define data requirements and deliver high-quality solutions aligned with governance and modeling standards.
  • Monitor, troubleshoot, and optimize data pipelines and architectures for scalability, reliability, and efficiency.
  • Ensure compliance with data governance, security, and regulatory standards throughout the data lifecycle.
  • Maintain clear technical documentation and promote reusable frameworks, automation, and best practices.
  • Mentor junior engineers and foster a culture of technical excellence, collaboration, and continuous improvement.

Education

  • Bachelor’s degree in Computer Science, Information Technology, Data Science, or a related field is required.
  • A Master’s degree in Data Science, Computer Science, or a related field is an advantage but not mandatory.
  • Advanced certifications in data engineering or related disciplines (e.g., Google Cloud Professional Data Engineer, AWS Certified Big Data – Specialty, Microsoft Certified: Azure Data Engineer Associate – DP-203, or Microsoft Certified: Fabric Analytics Engineer Associate – DP-600) are preferred.
  • Proficiency in English is essential for effective communication with team members, stakeholders, and external vendors.

Experience

  • 7+ years of professional experience in data engineering with proven expertise in Azure technologies, including at least 2 years in a leadership or senior technical role.
  • Strong track record in designing, implementing, and managing complex data pipelines, ETL/ELT processes, and integration solutions.
  • Advanced knowledge of data modeling, dimensional design, and data warehousing using SQL, NoSQL, and columnar databases.
  • Hands-on experience with Azure Data Factory, Synapse (dedicated and serverless), and Azure Data Lake Storage.
  • Proficiency in Python and PySpark for scalable data processing; additional experience with Java or Scala is a plus.
  • Expert-level SQL skills for complex transformations, optimization, and performance tuning.
  • Familiarity with Agile and DevOps methodologies, including CI/CD implementation using Azure DevOps or GitHub Actions.
  • Working knowledge of data visualization tools such as Power BI, Tableau, or Looker.
  • Solid understanding of data governance, data quality, and security best practices (RBAC, encryption, GDPR).
  • Strong leadership and communication skills, with proven ability to mentor engineers and collaborate effectively with cross-functional teams.
  • Exposure to Microsoft Fabric and Cognite Data Fusion (CDF) is an advantage.

Functional Competencies

  • Compliance
  • Governance, Risk and Control
  • Analytics and Reporting
  • IT Tools and Applications
  • Cost and Budget Control
  • Business Partnering
  • Digital Savvy
  • Business Acumen
  • Contract Management
  • Management of Change Application