Data Engineer
Transforming raw data into business value through automated pipelines and thoughtful engineering.
Scroll to explore
About Me
I'm a mid-level Data Engineer based in Vilnius, Lithuania, currently at IBSettle. I specialize in designing and maintaining automated ETL pipelines that transform chaotic data into reliable, business-ready insights.
From SQL and Python to Apache Superset and Data Warehouses — I build systems that run 24/7, so decision-makers never have to wait for their data.
The Journey
From my first SQL query to architecting fintech data warehouses.
@ Kaunas University of Technology
Foundation Quest
"Learn the Language of Business"
Where analytical thinking began. Understanding business logic before writing code gave me the perspective to build solutions that actually matter.
Further Learning (2021-2023)
"Before you can engineer data, you need to understand what it means to the business."
@ Vivus Finance
Main Quest
"Decode the Spreadsheets"
Skills Unlocked
Achievement Unlocked
First Production Dashboard
My journey began in the finance world, where every number tells a story. At Vivus Finance, I learned that data isn't just rows in a table — it's the pulse of business health.
Daily financial and collateral analysis taught me to build Excel models that later evolved into automated reporting systems. Here I discovered a fundamental truth: a good analyst doesn't just read data — they tell its story.
"My first SQL query in production took 47 minutes to run. That day I learned what a WHERE clause is and why it matters."
Boss Defeated
The Chaos of Unstructured Data
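The 47-minute query lesson above boils down to one idea: without a WHERE clause (and an index to serve it), the database scans every row. A minimal sketch using SQLite's in-memory engine (the table and column names here are hypothetical, chosen for illustration):

```python
# Sketch of why a WHERE clause and an index matter: an unfiltered query
# touches all rows, while a filtered one narrows the work to the rows
# that actually match. Table/column names are made up for this example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE loans (id INTEGER PRIMARY KEY, status TEXT)")
conn.executemany(
    "INSERT INTO loans (status) VALUES (?)",
    [("active" if i % 100 == 0 else "closed",) for i in range(10_000)],
)

# Full scan: counts every one of the 10,000 rows.
all_rows = conn.execute("SELECT COUNT(*) FROM loans").fetchone()[0]

# Filtered: WHERE restricts the result; an index on the filtered column
# lets the engine locate matches without scanning the whole table.
conn.execute("CREATE INDEX idx_loans_status ON loans (status)")
active = conn.execute(
    "SELECT COUNT(*) FROM loans WHERE status = 'active'"
).fetchone()[0]

print(all_rows, active)  # 10000 100
```

On a 10,000-row toy table the difference is invisible; on a production table with millions of rows, it is the difference between 47 minutes and a few seconds.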
@ Vičiūnai Group
Main Quest
"Guard the Data Kingdom"
Skills Unlocked
Achievement Unlocked
Master of Clean Records
Vičiūnai Group — one of the largest food industry companies in Lithuania — became my new arena. Here I wasn't just an analyst; I became a data guardian, responsible for master data integrity across all systems.
Every day was a battle against duplicates, inconsistencies, and legacy system quirks. I learned that data governance isn't just a set of rules — it's a philosophy of how an organization respects its data.
"Found a record that existed in 4 different systems with 4 different spellings. That 'client' was the same company. Welcome to enterprise data."
Boss Defeated
The Dragon of Dirty Records
@ Vičiūnai Group
Epic Quest
"Bridge Two Worlds"
Skills Unlocked
Achievement Unlocked
Zero Data Loss Migration
Promotion to Senior Migration Specialist came with the biggest challenge yet — migrating the entire company's data infrastructure to SAP. This wasn't just a technical project; it was a business transformation.
Customer, vendor, and tax data — all of it had to "travel" from legacy Navision to the modern SAP world. Every record was carefully cleansed, transformed, and validated before migration.
"Migration without data loss is like a heart transplant where the patient has to keep working. Not a single second of downtime was acceptable."
Boss Defeated
The Ancient Legacy Systems
@ Sanitex
Nov 2024 — Nov 2025
✅ Quest Completed
"Orchestrate the FMCG Data Flow"
Skills Unlocked
Achievement Unlocked
Pipeline Orchestrator
At Sanitex — one of the largest FMCG distribution companies in the region — I built and maintained automated data pipelines that powered real-time business decisions.
Apache Airflow DAGs were my daily symphony. Each DAG was like an orchestra: different instruments (data sources) had to play in sync so that business reports arrived on time and error-free.
I integrated data from multiple internal and external sources, ensuring quality through validation, transformation, and monitoring. Every row had its journey — from source to dashboard.
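The "orchestra" idea behind an Airflow DAG can be sketched in plain Python with the standard library's `graphlib`: each task declares its upstream dependencies, and a topological sort yields an order in which every task runs only after its sources are ready. The task names below are illustrative, not from a real pipeline:

```python
# Hedged sketch of DAG-style orchestration: tasks map to the set of
# upstream tasks they depend on, and a topological sort produces a
# valid execution order. Task names are hypothetical.
from graphlib import TopologicalSorter

dag = {
    "extract_sales": set(),
    "extract_inventory": set(),
    "transform_merge": {"extract_sales", "extract_inventory"},
    "load_warehouse": {"transform_merge"},
    "refresh_dashboard": {"load_warehouse"},
}

order = list(TopologicalSorter(dag).static_order())
print(order)  # extracts first, dashboard refresh last
```

Airflow adds scheduling, retries, and monitoring on top of this core idea, but the contract is the same: a report only refreshes once every upstream "instrument" has played its part.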
"A good Data Engineer is invisible. When everything works — no one knows you exist. But when something breaks... everyone knows your name."
Boss Defeated
The Never-Ending Report Requests
@ IBSettle
Dec 2025 — Present
Legendary Quest
"Tame the Financial Data Chaos"
Skills Mastered
Current Mission
Data Warehouse Architect
Now at IBSettle — a mature fintech company where I'm designing and building ETL pipelines that integrate diverse data sources into our central Data Warehouse.
My daily mission: create automated data collection scripts, optimize SQL queries for performance, troubleshoot data quality issues, and implement best practices across the data infrastructure.
Collaborating with analysts and stakeholders, refactoring legacy tables, and bringing order to financial data chaos. In fintech, every number must be precise — there's no room for "approximately correct."
"Discovered that half the critical business logic lives in JSON fields and longtext columns. Welcome to fintech data engineering."
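When business logic hides in JSON stored in text columns, the first step is to parse it out and cast values explicitly. A minimal sketch, assuming a hypothetical settlements table and key names, with SQLite standing in for the real warehouse:

```python
# Sketch of extracting a value from JSON stored in a longtext column.
# Table, column, and key names are made up for illustration. In fintech,
# "1.25" must become an exact Decimal, never a lossy float.
import json
import sqlite3
from decimal import Decimal

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE settlements (id INTEGER PRIMARY KEY, meta TEXT)")
conn.execute(
    "INSERT INTO settlements (meta) VALUES (?)",
    (json.dumps({"fee_pct": "1.25", "currency": "EUR"}),),
)

# Read the raw JSON text back out and parse it in Python.
raw = conn.execute("SELECT meta FROM settlements WHERE id = 1").fetchone()[0]
meta = json.loads(raw)
fee_pct = Decimal(meta["fee_pct"])  # exact decimal arithmetic

print(fee_pct, meta["currency"])  # 1.25 EUR
```

A cleaner long-term fix is to promote such fields into typed warehouse columns during ETL, so downstream queries never have to parse JSON at read time.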
The journey continues... // Level 5 Active
The Arsenal
All the tools and technologies I use daily