About Me


Hello! Hola! Namaste! My name is Vaibhav. People call me VJ to save time and to avoid mispronouncing my name, and I like it.
In my three years of professional experience, I've worked primarily in data-driven roles across automation delivery, ranging from junior contributor to individual contributor working directly with leadership. But, to this day, what I find most rewarding about my career choice is being of service to the clients and teams I've worked with – and producing shared success from which we can all grow.
I've built and learned from teams what it means to be high-performing, emphasizing how shared values like trust, vulnerability, and accountability can lead to extraordinary team achievements. As a data enthusiast, my first responsibility is to foster a sense of confidence and safety among our teams and the clients we serve. Second, I believe that, as professionals, we should exude craftsmanship in everything we touch, produce, and communicate.

Roles I excel in: Data Analyst, Data Scientist, Data Engineer, Business Analyst

Clients and colleagues describe me as enthusiastic, energetic, trustworthy, committed, empowering, and detail-oriented. One of the things that makes me laugh about myself is that my spreadsheets are far better organized than my bed. I could spend hours explaining how difficult it can be to deliver successful projects: to succeed, teams and clients must align on common goals and values while also understanding their constraints.


Education


  • Northeastern University, Boston

    Master's Student, Data Analytics


    • Relevant Courses: Predictive Modelling, Data Management and Big Data, Data Mining, Probability & Statistics, Machine Learning, Data Visualization.
  • NMIMS University, India

    Bachelor of Technology, Computer Engineering


    • Relevant Courses: C, C++, Data Structures, Algorithms, Microprocessors, DBMS, Java, Advanced Java, Operating Systems, Web Technology, Theory of Computation, Compiler Design, Data Mining, Big Data, Artificial Intelligence, Python

Skills


  Programming

Python
R
pandas
NumPy
dplyr
JavaScript
TensorFlow

  Data Engineering

SQL
PostgreSQL
MongoDB


  Cloud Technologies & Other Tools

AWS
Kubernetes
Snowflake
Docker
GCP


  Tools

Tableau
Power BI
Looker
GitHub


Experience


  Download My Résumé

  • Toyota Financial Services, TX

    Data Engineer


    • Collaborated closely with Finance teams to develop SOX-compliant automated SQL queries, reducing audit validation time across multi-environment systems and driving process efficiencies.
    • Developed end-to-end data pipelines using Python and SQL to ingest and transform data, applying Data Vault techniques in Snowflake to ensure data integrity and scalability across Finance-related processes.
    • Contributed to the migration of on-premises data infrastructure to the cloud, ensuring minimal downtime and data integrity throughout the transition.
    • Designed and implemented scalable data engineering frameworks, enabling teams to automate Data Vault load processes and streamline real-time data ingestion, impacting strategic financial insights for over 1,000 internal stakeholders.
    • Leveraged AWS event services to build event-driven data workflows, automating real-time data ingestion from S3 to Finance reporting layers, enhancing timely data delivery for decision-making.
  • Sezzle Inc, Remote

    Analytics Engineer


    • Utilized AWS Redshift and SQL to assist in maintaining and optimizing existing data pipelines.
    • Contributed to the development of dbt models and supported the design of Redash dashboards for visualizing key metrics.
  • TakeOff Technologies, MA

    Data Analyst Co-op


    • Built out the data and reporting infrastructure from the ground up using Looker and SQL to provide real-time insights into the product, marketing funnels, and business KPIs.
    • Built operational reporting in Looker to find areas of improvement for contractors, resulting in quarterly incremental revenue.
    • Worked with stakeholders to understand business needs and translate those needs into actionable reports in Looker and Snowflake, saving 18 hours of manual work each week.
    • Delivered presentations on ad-hoc research and findings from disparate sources to upper-level management.
    • Used business intelligence tools (Looker) to create 15+ dashboards and 25+ ad hoc reports to address business problems and streamline processes.
  • Accenture Inc., India

    Associate Software Engineer (Data Engineer)


    • Co-developed a SQL Server database system to maximize performance benefits for clients.
    • Developed custom ETL solutions, batch processing, and real-time data ingestion pipelines to move data in and out of Hadoop using Python and shell scripts.
    • Wrote complex SQL queries, stored procedures, triggers, views, cursors, joins, constraints, DDL, DML, and user-defined functions to implement business logic.
    • Worked extensively with data migration, data cleansing, data profiling, and ETL processes for data warehouses.
    • Designed and published visually rich and intuitive Tableau dashboards and Crystal Reports for executive decision-making.
    • Worked extensively on data validation between Hive source and target tables using automated Python scripts.
  • CatchSavvy Solutions, India

    Data Engineer Intern


    • Strategized ETL processes and maintained data pipelines across millions of rows of data, reducing manual workload by 43%.
    • Maintained large databases and used various professional statistical techniques to collect, analyze, and interpret financial data from customers and partners; also responsible for carrying out A/B testing.
    • Contributed to the design and development of new quantitative models and a data warehouse to help the company stabilize and maximize efficiency.
  • Nextsavy Technologies LLP, India

    Technology Analyst Intern


    • Identified and derived key features from unstructured data by converting it from HDFS to an RDBMS using MySQL.
    • Maintained large databases and used various professional statistical techniques to collect, analyze, and interpret data from customers and partners.