Curriculum Vitae

Professional Summary

I am a Data Engineer and Software Developer specializing in building efficient data pipelines and designing maintainable
solutions to complex data problems. With experience in both the telecommunications and insurance sectors, I focus on
cloud-based data solutions and ETL frameworks.

Experience

Senior Data Engineer

Zürich Insurance | October 2022 - September 2023

  • Led the implementation of an Azure Databricks cloud solution within the ETL framework
  • Designed and developed Data Warehouse systems
  • Managed migration of ETL workflows from legacy systems to Azure Databricks
  • Led sub-projects and conducted team training sessions
  • Collaborated with IT development, business units, and management to implement requirements
  • Demonstrated strong project leadership and team coordination skills

Data Engineer

A1 Telekom Austria | September 2021 - October 2022

  • Implemented Data Engineering & Data Science solutions in A1’s Analytical Environment
  • Developed solutions using Python, Relational Databases, Apache Spark, Hadoop, and Hive
  • Prototyped and implemented cloud solutions using Azure services (Data Factory, Event Hubs, Synapse)
  • Worked with containerization technologies (Docker, Kubernetes/OpenShift)
  • Demonstrated high reliability and initiative in project delivery

Graduate

A1 Telekom Austria | September 2019 - August 2021

  • Began career in the Transformation, Market & Corporate Functions unit
  • Developed foundational skills in data engineering and business transformation

Education

Doctor of Natural Sciences (Dr. rer. nat.)

University Name | 2016 - 2019

  • Specialized in Data Science and Engineering
  • Research focus on scalable data processing systems

Master of Computer Science

University Name | 2014 - 2016

  • Focus on Software Engineering and Database Systems
  • GPA: 3.8/4.0

Bachelor of Computer Science

University Name | 2010 - 2014

  • Focus on Software Engineering and Database Systems
  • GPA: 3.7/4.0

Skills

Technical Skills

  • Programming Languages: Python, SQL
  • Data Engineering: Apache Spark, Hadoop, Hive, Azure Databricks
  • Cloud Platforms: Azure (Data Factory, Event Hubs, Synapse)
  • Database Systems: Relational Databases, Data Warehousing
  • Containerization: Docker, Kubernetes/OpenShift
  • ETL/ELT: Data Pipeline Development, Data Migration
  • Version Control: Git, GitHub
  • CI/CD: Jenkins, GitHub Actions

Soft Skills

  • Strong project leadership and team coordination
  • Excellent communication with stakeholders
  • Self-motivated and initiative-driven
  • High reliability and responsibility
  • Customer-oriented approach
  • Strong organizational skills
  • Efficient and goal-oriented work style

Languages

  • English (Professional)
  • German (Professional)
  • Russian (Native)

Professional Achievements

  • Led the migration of ETL workflows to Azure Databricks
  • Established new cloud-based data solutions
  • Developed and implemented complex data warehouse systems
  • Trained team members in new technologies
  • Consistently delivered high-quality results under pressure

Contact Information

  • Email: [[email protected]]
  • LinkedIn: linkedin.com/in/andrei-bazarenko
  • GitHub: github.com/Gexar