kathanp.github.io

LinkedIn | Email | Download Resume

đź‘‹ About Me

I am a Data Integration Analyst (Engineering Focused) with 3+ years of experience in FinTech (Flexiti, Questrade), bridging the gap between Data Engineering and System Reliability.

I don’t just build pipelines; I ensure they stay up. I specialize in the Modern Data Stack (Databricks, Python, SQL) and have a strong focus on Incident Response and Automated Data Quality.

Core Stack: Python (Pandas/Boto3), Databricks (Lakehouse), SQL, AWS.


🛠️ Key Projects

1. Enterprise Financial Data Lake Integration

The Problem: Financial data was siloed across three different legacy systems (CRM, Marketing, Core Banking), requiring manual extraction that delayed reporting by 48 hours.

The Solution: I architected a Databricks Lakehouse solution to unify these sources into a “Single Source of Truth.”

The Architecture: Financial Data Lake Architecture

The Workflow: Incident Response Workflow

The Result:

- ⚡ Reduced processing time by 30%
- 📉 Eliminated manual file transfers completely
- 🟢 Achieved 99.9% uptime for critical financial reporting

---
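A minimal sketch of the unification pattern behind this project, using pandas in place of the actual Databricks/Spark job; the table names, column names, and join key are all illustrative, not taken from the production system:

```python
import pandas as pd

# Illustrative extracts from the three siloed legacy systems
crm = pd.DataFrame({"customer_id": [1, 2], "name": ["Ana", "Bo"]})
marketing = pd.DataFrame({"customer_id": [1, 2], "campaign": ["spring", "fall"]})
core_banking = pd.DataFrame({"customer_id": [1, 2], "balance": [1200.0, 300.0]})

def unify(sources, key="customer_id"):
    """Left-join a list of extracts on a shared key into one wide table
    (the 'Single Source of Truth' shape; in Databricks this would be a
    Delta table built by a scheduled Spark job rather than pandas)."""
    base, *rest = sources
    for df in rest:
        base = base.merge(df, on=key, how="left")
    return base

unified = unify([crm, marketing, core_banking])
print(unified.columns.tolist())
# → ['customer_id', 'name', 'campaign', 'balance']
```

The same left-join-on-a-stable-key shape scales up: in the Lakehouse, each source lands as a bronze table and the unified view is materialized downstream.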

2. Automated Incident Response System

The Problem: SQL timeouts and payment gateway errors were creating “operational noise,” requiring manual triage that distracted the engineering team.

The Solution: Built an automated monitoring framework using Python and CloudWatch/Datadog.

The Result:

- Reduced Mean Time to Detection (MTTD) from hours to minutes.
- Allowed the team to focus on root-cause fixes rather than triage.
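The core of this kind of framework is signature matching plus an alert threshold. Below is a stdlib-only sketch of that idea; the error signatures and threshold are hypothetical, and in the real system the alert would be emitted to CloudWatch/Datadog rather than returned:

```python
import re

# Hypothetical error signatures the monitor watches for
PATTERNS = {
    "sql_timeout": re.compile(r"SQLTimeout", re.I),
    "gateway_error": re.compile(r"PaymentGateway(Error|5\d\d)", re.I),
}

def triage(log_lines, threshold=3):
    """Count matches per signature and return only the signatures that
    cross the alert threshold -- filtering 'operational noise' from
    signals worth paging on."""
    counts = {name: 0 for name in PATTERNS}
    for line in log_lines:
        for name, pattern in PATTERNS.items():
            if pattern.search(line):
                counts[name] += 1
    return [name for name, n in counts.items() if n >= threshold]

logs = ["SQLTimeout on orders db"] * 3 + ["PaymentGatewayError 502"]
print(triage(logs))
# → ['sql_timeout']
```

Thresholding per signature is what turns raw error volume into a low-noise signal: a single transient gateway error stays quiet, while a burst of SQL timeouts fires immediately, which is how detection drops from hours to minutes.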


📜 Certifications


🚀 Experience Snippets