Kislay Srivastava

Vetted Talent

I have over 8 years of experience in Python development, working primarily with the Django and Flask frameworks to build scalable web applications and deploy them to the cloud.

I am confident in my ability to take on complex projects and provide innovative solutions that meet the needs of clients.

  • Role

    Software Development - Other

  • Years of Experience

    9 years

  • Professional Portfolio

    View here

Skillsets

  • Python - 8 Years
  • Django - 5 Years
  • Flask - 4 Years
  • AWS - 4 Years
  • AWS Lambda (Python & JavaScript) - 4 Years
  • SQL - 5 Years
  • Data Warehousing - 5 Years
  • Web Development
  • Docker - 3 Years
  • Kubernetes - 2 Years
  • Machine Learning - 2 Years
  • Databricks - 4 Years
  • DBT - 2 Years
  • NumPy - 2 Years
  • Pandas - 4 Years
  • Snowflake - 4 Years
  • Snowflake Administration - 2 Years
  • SQLAlchemy - 2 Years
  • GCP - 3 Years
  • Azure - 2 Years
  • Jenkins - 2 Years
  • Git - 4 Years
  • Bash
  • Shell Scripting - 4 Years
  • Airflow - 4 Years
  • Data Engineering - 6 Years
  • Node.js - 1 Year
  • FastAPI - 2 Years
  • Microservices - 3 Years
  • REST APIs - 4 Years
  • API Development - 2 Years
  • GraphQL - 1 Year
  • Data Science - 4 Years
  • JavaScript - 2 Years
  • TypeScript - 1 Year
  • React.js - 2 Years
  • Next.js - 1 Year
  • HTML5/CSS3 - 3 Years
  • PostgreSQL - 5 Years
  • MySQL - 5 Years
  • MSSQL - 4 Years
  • MongoDB - 1 Year
  • Cassandra - 1 Year
  • Neo4j - 2 Years
  • DynamoDB - 2 Years
  • Redis - 2 Years
  • NoSQL - 2 Years
  • Relational Databases - 5 Years
  • Database Design - 2 Years
  • Complex SQL Queries - 4 Years
  • BigQuery - 2 Years
  • ETL/ELT - 5 Years
  • Hive - 1 Year
  • Hadoop - 3 Years
  • HDFS - 2 Years
  • Apache Spark - 4 Years
  • PySpark - 4 Years
  • Apache Kafka - 2 Years
  • Apache Iceberg - 1 Year
  • AWS Glue - 3 Years
  • Redshift - 4 Years
  • S3 - 6 Years
  • EKS - 1 Year
  • CloudWatch - 3 Years
  • AWS CodePipeline - 1 Year
  • GuardDuty - 1 Year
  • AWS Native Analytics (RDS/SageMaker/Bedrock) - 4 Years
  • AWS Infrastructure - 3 Years
  • Identity and Access Management - 4 Years
  • Terraform - 1 Year
  • IaC - 2 Years
  • Containerization - 3 Years
  • DevOps - 3 Years
  • DataOps - 3 Years
  • GitHub Actions - 1 Year
  • Celery - 1 Year
  • DRF - 2 Years
  • Data Integration - 4 Years
  • Data Pipelines - 4 Years
  • Data Migration - 4 Years
  • Data Modeling - 4 Years
  • Data Processing - 4 Years
  • Data Lakes - 5 Years
  • Data Analysis - 5 Years
  • Data Visualization - 2 Years
  • Data Scraping - 2 Years
  • Clickstream Data - 1 Year
  • Tableau - 2 Years
  • Big Data - 5 Years
  • Cloud Computing - 5 Years
  • Cloud Integration - 3 Years
  • Distributed Systems - 4 Years
  • Architecture Design - 6 Years
  • System Design - 4 Years
  • Software Engineering - 4 Years
  • Algorithms - 4 Years
  • DSA - 4 Years
  • OOP
  • OS - 2 Years
  • Linux - 4 Years
  • Networking - 2 Years
  • Cybersecurity - 2 Years
  • Artificial Intelligence - 4 Years
  • Deep Learning - 2 Years
  • Natural Language Processing - 3 Years
  • TensorFlow - 1 Year
  • PyTorch - 3 Years
  • BERT - 1 Year
  • LLMs - 1 Year
  • Generative AI - 2 Years
  • OpenAI
  • ChatGPT
  • Reinforcement Learning - 2 Years
  • Model Evaluation - 4 Years
  • Monte Carlo Simulation / Bayes' Theorem - 2 Years
  • Elasticsearch
  • Java - 1 Year
  • Spring Boot - 1 Year
  • Gradle - 1 Year
  • Apache Tomcat - 1 Year
  • Go - 1 Year
  • Async
  • Integration Testing - 3 Years
  • Third-party API Integrations - 3 Years
  • Automation - 5 Years
  • Code Quality - 5 Years
  • Backend Development - 5 Years
  • Product Development - 4 Years
  • Leadership - 2 Years
  • Business Acumen - 4 Years
  • Mortgage Industry - 1 Year
  • Startups - 1 Year
  • English

Vetted For

13 Skills
  • Data Engineer II (Remote): AI Screening
  • Result: 76%
  • Skills assessed: Airflow, Data Governance, Machine Learning and Data Science, BigQuery, ETL Processes, Hive, Relational DB, Snowflake, Hadoop, Java, PostgreSQL, Python, SQL
  • Score: 68/90

Professional Summary

9 Years
  • Nov 2022 - Present (2 yr 2 months)

    Senior Software Engineer

    Miratech Pvt Ltd
  • Nov 2022 - Present (2 yr 2 months)

    Senior Software Engineer - Data Platform

    Miratech Pvt Ltd
  • Apr 2022 - Nov 2022 (7 months)

    Senior Lead Engineer

    Apisero Integration Pvt Ltd
  • Apr 2022 - Nov 2022 (7 months)

    Senior Backend Engineer

    Apisero Integration Pvt Ltd
  • Jun 2021 - Nov 2022 (1 yr 5 months)

    Software Developer 1

    Larsen & Toubro Infotech
  • Dec 2015 - May 2021 (5 yr 5 months)

    Software Engineer 1

    Infosys
  • Dec 2015 - May 2021 (5 yr 5 months)

    Senior Software Engineer

    Infosys Ltd

Applications & Tools Known

  • Python
  • PySpark
  • AWS (Amazon Web Services)
  • Apache Airflow
  • Snowflake
  • MySQL
  • Docker
  • Kubernetes
  • Django
  • Django Rest Framework
  • Flask
  • Athena
  • Azure Databricks
  • Azure Blob Storage
  • Azure App Service
  • Tableau
  • AWS S3
  • AWS Glue
  • AWS RDS
  • AWS Fargate
  • AWS Elastic Beanstalk
  • AWS EMR
  • TensorFlow
  • Pandas
  • Kafka
  • Jenkins
  • CI/CD
  • Databricks

Work History

9 Years

Senior Software Engineer

Miratech Pvt ltd
Nov 2022 - Present (2 yr 2 months)
    My client is the world's leading asset manager. As part of the Index team, I am responsible for enabling the pipelines that onboard new clients. I integrate data from several sources using Python and PySpark and push it to downstream systems according to the expected internal models. The resulting data platform is part of a Django-based web application deployed on AWS Elastic Beanstalk, with MySQL (a managed AWS RDS instance) as the application database. I developed fan-out mechanisms with Kafka to share transformed data with the registered clients. The platform is fully cloud native and auto-scalable. I achieved 17% higher throughput for my workflows through asynchronous processing and event-based triggers, and enhanced the UX by adding AWS ElastiCache as a separate component.
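The Kafka-based fan-out above can be sketched in miniature as follows. This is an illustrative simplification, not the production code: an in-memory registry stands in for the broker, and all names (client names, record fields) are hypothetical.

```python
# Minimal sketch of a fan-out mechanism: one transformed record is
# delivered to every registered downstream client. In production this
# role is played by a Kafka topic with one consumer group per client.

from collections import defaultdict

class FanOutPublisher:
    def __init__(self):
        # client name -> list of records delivered to that client
        self.subscribers = defaultdict(list)

    def register(self, client_name):
        self.subscribers[client_name]  # creates an empty delivery queue

    def publish(self, record):
        # every registered client receives its own copy of the record
        for queue in self.subscribers.values():
            queue.append(dict(record))

publisher = FanOutPublisher()
publisher.register("client_a")
publisher.register("client_b")
publisher.publish({"index": "XYZ", "value": 101.5})
# both client_a and client_b now hold an independent copy of the record
```

The point of the pattern is that producers publish once and the broker handles per-subscriber delivery, which is what makes the platform scale as clients are added.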

Senior Software Engineer - Data Platform

Miratech Pvt ltd
Nov 2022 - Present (2 yr 2 months)
    My client is the world's leading asset manager, and I am responsible for onboarding new clients and proposing enhancements to the existing pipelines. As part of the Index data team, I integrate data from several sources using Python and PySpark and push it to downstream systems according to the expected internal models. The resulting data platform is part of a Django-based web application deployed on AWS Elastic Beanstalk. I created several data pipelines leveraging the AWS analytics stack with a Snowflake data warehouse: the initial ETL was handled by AWS Glue, and after ingestion the data was largely moved using Snowflake tasks and streams. I also developed fan-out mechanisms to share transformed data with the registered clients.
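The task-and-stream flow above relies on incremental consumption: like a Snowflake stream, each run of a task sees only the rows appended since its last read. A minimal sketch of that offset-tracking idea, with purely illustrative table and field names:

```python
# Stream-style incremental processing: the consumer tracks an offset
# into an append-only table and each consume() call returns only the
# delta, mirroring how a Snowflake task reads a stream.

class AppendOnlyTable:
    def __init__(self):
        self.rows = []

class Stream:
    def __init__(self, table):
        self.table = table
        self.offset = 0  # position of the last consumed row

    def consume(self):
        delta = self.table.rows[self.offset:]
        self.offset = len(self.table.rows)
        return delta

table = AppendOnlyTable()
stream = Stream(table)

table.rows.extend([{"id": 1}, {"id": 2}])
first = stream.consume()   # the two initial rows
table.rows.append({"id": 3})
second = stream.consume()  # only the newly appended row
```

Because each run handles just the delta, downstream processing cost scales with new data rather than with the full table.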

Senior Lead Engineer

Apisero Integration Pvt ltd
Apr 2022 - Nov 2022 (7 months)
    My client was a global supply-chain platform, and I worked as the designer of new integration pipelines. I enhanced the existing Django data platform, which was hosted in an Azure App Service environment, and created new pipelines from scratch using Azure Blob Storage with Snowflake as the backend data warehouse. I built event-based pipelines using Azure Event Hubs and custom Azure Functions, used Python code to pre-process the landed data, and carried out post-ingestion processing with Snowflake stored procedures.

Senior Backend Engineer

Apisero Integration Pvt ltd
Apr 2022 - Nov 2022 (7 months)
    My client was a global supply-chain platform, and I worked as the designer of new integration pipelines. I enhanced the existing Django application, which was hosted in an Azure App Service environment, and worked with HTTP and WebSocket APIs to create efficient web handlers. I was responsible for recreating the system's REST APIs and making them more standardized. I used PostgreSQL as the transactional database, connected the application to it via JDBC connections, and leveraged the React/HTML/CSS combination to make the user interface more responsive.

Software Developer 1

Larsen & Toubro Infotech
Jun 2021 - Nov 2022 (1 yr 5 months)
    Created event-driven, high-throughput apps using Kafka and the Django framework. Deployed the application on AWS Elastic Beanstalk to leverage its high scalability and managed infrastructure. Designed the system along the lines of a microservice-based architecture, replacing the earlier monolithic application. Created an integration with a MySQL instance and handled database lookups and queries. Was responsible for setting up the final CI/CD pipelines via Jenkins.

Software Engineer 1

Infosys
Dec 2015 - May 2021 (5 yr 5 months)

    I worked on several projects during my time here. I started my professional journey as a Python Django developer and later moved to cloud-based migration projects.

    1. Python Django Developer (client: Morgan Stanley, 2016-2019): My team and I were tasked with creating and maintaining a simple MVC app that performed minimal transformations on input files and wrote the transformed files to an AWS S3 bucket location.
    2. Python/PySpark Developer, AWS Cloud (client: Ameriprise Financial Services, 2019-2021): I was in charge of a data migration project in which data was ingested through AWS Glue, processed downstream by PySpark programs running on EMR clusters, then written to an S3 bucket and visualized using Amazon Athena.

Senior Software Engineer

Infosys Ltd
Dec 2015 - May 2021 (5 yr 5 months)
    Created and deployed several Django-based web applications on the cloud. Initially, I was responsible for developing and personalizing certain web applications, working with HTML, CSS, and JavaScript to optimize their UIs. More recently, I worked in the big data landscape, using AWS EMR to run Python and PySpark scripts and creating Airflow DAGs to schedule their execution. I used Pandas/PySpark to transform the input data. The main objective was to create a data pipeline that ingested source files from various systems and pushed the processed rows to AWS S3.
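The ingest-transform-push pipeline described above can be sketched as three small stages. This is a toy illustration of the shape of such a job, not the production code: the CSV payload, field names, and the list standing in for the S3 sink are all hypothetical.

```python
# Minimal extract -> transform -> load sketch. The transform step
# mirrors the kind of row-level cleanup done with Pandas/PySpark;
# the "sink" list stands in for the AWS S3 destination.

import csv
import io

RAW = "id,amount\n1, 10.5 \n2,3.25\n"  # illustrative source payload

def extract(text):
    # parse the raw CSV payload into dict rows
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # strip whitespace and cast string fields to proper types
    return [{"id": int(r["id"]), "amount": float(r["amount"])} for r in rows]

def load(rows, sink):
    # in production: write objects to the S3 bucket
    sink.extend(rows)

s3_sink = []
load(transform(extract(RAW)), s3_sink)
# s3_sink now holds [{'id': 1, 'amount': 10.5}, {'id': 2, 'amount': 3.25}]
```

Keeping the stages as separate functions is what lets each one be swapped for a heavier implementation (Glue for extract, PySpark for transform) without touching the others.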

Achievements

  • Developed fan-out mechanisms to share transformed data with registered clients.
  • Successfully developed and tested end-to-end ETL pipelines for automated ingestion, storing the results in a cloud warehouse.
  • Created and deployed several Django-based web applications on the cloud.
  • Revised concepts of RDBMS, data warehousing, and Python development.
  • PySpark and Hadoop technologies played a key part in the final capstone project.
  • Studied SDLC concepts and pipelines in the data cloud.
  • Worked on backend frameworks such as Django and Flask, along with containerized services.
  • Learned to use MS Azure services to create event-based data pipelines.
  • Created big data pipelines using GCP services.
  • Used the TensorFlow library to create ML models.
  • Mostly worked with the Pandas and PySpark APIs of Python.
  • Project work and lab sessions used IBM DSX (Data Studio) as the service provider.
  • IBM Certified Backend Engineer
  • Infosys Python Associate
  • Infosys Certified Python Developer
  • Architecting Solutions on AWS

Major Projects

4 Projects

Enterprise Data Platform

Blackrock Pvt ltd
Nov 2022 - Present (2 yr 2 months)

    My team and I were tasked with identifying a viable alternative to the pre-existing Index data platform (mostly Perl and a SAP Sybase database, along with several antiquated in-house tools).

    1. As part of the modernization, we moved from a traditional architecture to a more cloud-native approach.
    2. The next priority was to become as vendor-agnostic as possible, which led to the choice of Snowflake with DBT as an ELT framework.
    3. I was regularly involved in POCs for enhancing the existing Index platform.
    4. As a Python developer, I was in charge of understanding the legacy Perl code and migrating it to more efficient Python scripts.

Offline Verification of Digital Signatures using ANN models

    Created a neural network identifier for offline signature verification, using supervised learning algorithms to build the classifier.

Financial Services Guidance

Ameriprise Financial Services
Aug 2019 - May 2021 (1 yr 9 months)
    Python/PySpark Developer, AWS Cloud: I was in charge of a data migration project in which data was ingested through AWS Glue, processed downstream by PySpark programs running on EMR clusters, then written to an S3 bucket and visualized using Amazon Athena.

Spreadsheet comparator app

Morgan Stanley
May 2016 - Aug 2019 (3 yr 3 months)
    Python Django Developer: My team and I were tasked with creating and maintaining a simple MVC app that performed minimal transformations on input files and wrote the transformed files to an AWS S3 bucket location.

Education

  • Master of Technology (CSE)

    Indian Institute of Technology, Dhanbad (2015)
  • Bachelor of Technology (CSE)

    SRM University, Chennai (2011)

Certifications

  • IBM Certified Data Engineer (07/2022 - present)
  • IBM Certified Data Science Professional (09/2019 - present)
  • IBM Certified Backend Engineer
  • IBM Backend Developer (01/2023 - present)
  • Infosys Certified Python Associate
  • Infosys Certified Python Developer
  • GCP Big Data and ML Engineer (01/2020 - present)
  • MS Azure for Data Engineering (08/2022 - present)
  • Meta Backend Developer
  • Architecting Solutions on AWS (01/2024 - present)

Interests

  • Travelling
  • Watching Movies
  • Exercise
  • Cricket