Certified Data Management Professional
Google Certified Cloud Architect
TOGAF - Enterprise Architecture
Agile Project Manager | Scrum Master
SAFe 6.0 Agilist
Projects
Lean Agile Product Development - Google Cloud Products

- Maintained a healthy product backlog, ensured team efficiency, and implemented lean Agile processes to stabilize velocity and enhance productivity.
- Served as a servant leader, empowering the team to take ownership of tasks and collaborate toward shared goals; developed and managed project roadmaps, resource strategies, and delivery schedules while providing stakeholders with structured status updates.
- Managed onboarding, conflict resolution, performance management, and skill development; promoted productivity tools such as Rally, JIRA, and Azure DevOps for a more efficient workflow.
- Organized daily stand-up meetings to foster communication, address roadblocks, and clarify acceptance criteria; worked closely with Product Owners to prioritize backlog items based on customer feedback.
- Facilitated sprint ceremonies (planning, stand-ups, reviews, and retrospectives) while upholding Agile principles; partnered with business leaders to address project dependencies and risks, ensuring sprint objectives were met.
- Used burn-down charts and velocity tracking to monitor progress toward goals; led the organization's Agile transformation, promoting value delivery, autonomy, and self-management across teams.
Google Cloud Solution Development & Architecture

- Piloted a fast-paced project, standing up all infrastructure and delivering 5 data products in Google Cloud within one month across 2 agile iterations.
- Mentored and supervised a team of 18, ensuring high productivity and achieving all committed deployments each month.
- Migrated 2 applications of roughly 10K lines each to Google Cloud Platform using Python, Dataproc, BigQuery SQL, Terraform, Tekton, Looker, and Astronomer.
- Orchestrated implementation of DevSecOps tooling to scan the entire Git repository for security vulnerabilities, achieving regulatory compliance with 0 open security issues.
- Delivered 3 committed major Google Cloud migration deployments to enable partner business support, generating $900K in revenue from each product.
- Designed a technical roadmap for 3 major deployments using DataStage ETL, delivering incremental value with each iteration and yielding $1M in overall revenue.
Data Mesh - Data Access Control & Data Domain Ownership

- Implemented Data Mesh using Dataplex discovery frameworks.
- Leveraged Dataplex features to enable data profiling, data quality, and secured access.
- Authorized consumer service accounts and individual IDs for specific data domains based on data domain type and classification, consumer type and role, and legitimate business reasons for access provisioning.
Customer 360 - Single Source of Customer Data

- Engineered an impactful Customer 360 model by enriching customer data using Python, Apache Spark, Azure DevOps Server, GitHub, Jupyter Notebook, and the Big Data Hadoop platform.
- Augmented datasets that drive more effective weekly Next Best Offer campaigns across all product lines.
- Formulated the integration of 20+ ML scores, including customer traits and interest scores, propensity, uplift, and abnormality, using Python and Apache Spark hosted on the Infosys NIA Hadoop platform.
Predictive Modelling and Churn Reduction

- Reduced churn by 9% through pre-paid campaigns to targeted customers, applying supervised ML techniques with Apache Spark, Python, H2O, MLOps, MLflow, and Adobe Campaign Manager.
- Reduced churn by 10% through Pay Monthly campaigns by identifying customers likely to accept offers, using supervised machine learning in the Microsoft Azure ML ecosystem.
- Appraised vendor tools for ML observability and performed a buy-vs-build value analysis.
- Built "ask", a question-answering NLP bot that helps new team members quickly find source data availability and the names of databases, tables, and views.
Domain Driven Feature Store

• Designed the Feature Store for customer data, product data, and campaign offer data.
• Designed the NoSQL data model to support the Action Management framework, covering offer attributes, offer eligibility criteria, and offer thresholds.
• Deployed the R-Shiny pilot pipelines to support the Offer Action framework for all action updates.
Batch & Stream Pipeline Integration Model

• Designed 20+ data products for the Product Data squad using Infosys NIA and IBM DataStage ETL tools to support the Pay Monthly, Broadband, Copper, and Managed Services product lines.
• Converted 20K lines of Oracle stored procedures into an ETL application using IBM Netezza, DataStage, and DB2 to build the call center system, with Genesys Info Mart (call center employees' salary and wages) as the source data.
• Designed and deployed 25 ETL pipelines for the source-aligned system in an IBM DB2 data warehouse, ingesting data from multiple source systems across Orders, Accounts, Connections, and Credits.
• Designed analytical applications for an automotive firm using Teradata ETL tools (MLOAD, FASTLOAD, BTEQ, TPUMP).
Customer Base Reporting - Acquisition, Base & Churn (ABC)

• Designed and deployed the Acquisition-Base-Churn (ABC) ETL pipeline using IBM DataStage, IBM DB2 UDB, and IBM Netezza.
• Augmented datasets for product-line-level reporting using the IBM DB2 DWM reference model, saving $5M over the previous Oracle solution.
• Engineered the calculation of customer states from product usage and the date and time of usage, especially for pre-paid customers, helping identify potential churners.
ELT Pipelines - Google Bigtable

• Designed and built NoSQL data model solutions using Google Bigtable for a US retail client.
• Deployed data integration pipelines using Pentaho tools, integrated Apache Hadoop data with Amazon S3 files, and developed data marts for reporting for a US banking client.
IBM Master Data Management - Customer Data Mart

• Architected IBM DataStage ETL jobs (Change Data Capture) for customer master data handling.
• Delivered Consumer Data Refinery (CDR) data marts, applying Customer Knowledge Services data domain expertise.
• Designed and deployed IBM DB2 UDB stored procedures for the CDR data marts.