
Ahmar Hashmi


Results-oriented Senior Technical Lead with 8+ years of experience developing and leading the implementation of innovative solutions. Experienced in project management activities including project scoping, estimation, planning, finalization of technical/functional specifications and resource administration. Strong expertise in web development technologies such as Node.js, React.js, Express.js, TypeScript, Python, C# and Angular. Knowledge of SQL and NoSQL databases and of CI/CD pipelines and tools. In-depth knowledge of Docker and AWS (Lambda, EC2, CodeDeploy) for managing and scaling applications. Proven experience in architectural decision-making and system design. Solid planning and organizational skills, coordinating all aspects of each project from inception through completion. Strong team builder and facilitator who fosters an atmosphere that encourages highly talented professionals to balance high-level skills with maximum production.

  • Role

    Consultant

  • Years of Experience

    8.3 years


Skillsets

  • Python - 6 Years
  • JavaScript - 7 Years
  • C# - 4 Years
  • C++
  • TypeScript
  • Node.js
  • React.js
  • Express
  • Hapi
  • FastAPI - 3 Years
  • GraphQL
  • MongoDB
  • MySQL - 5 Years
  • AWS - 5 Years
  • Azure

Vetted For

14 Skills
  • MERN Stack Developer (AI Screening): 77%
  • Skills assessed: Debugging, Engineering Background, Relational Databases, Troubleshooting, Node.js, React.js, Agile, GitHub, JavaScript, MongoDB, Redux, REST APIs, SQL, TypeScript
  • Score: 69/90

Professional Summary

8.3 Years
  • Sep, 2025 - Present 5 months

    Consultant

    Synechron
  • Jul, 2024 - Present 1 yr 7 months

    Backend Developer

    Genie Connections
  • Jul, 2024 - Present 1 yr 7 months

    Backend Developer

    Koottu
  • Nov, 2023 - May, 2024 6 months

    Technical Lead

    IQ-Line
  • Jul, 2024 - Jan, 2025 6 months

    Senior Engineer

    Saagas AI
  • Mar, 2025 - Jun, 2025 3 months

    Technical Lead

    Turing
  • Jul, 2023 - Feb, 2024 7 months

    Backend AI Developer

    Million Labs
  • Mar, 2023 - Oct, 2023 7 months

    Senior Backend Engineer

    DarioHealth
  • Jun, 2022 - Mar, 2023 9 months

    Senior Backend Developer

    BizzTM Technologies
  • Apr, 2017 - Feb, 2018 10 months

    Full Stack Developer

    CusXP
  • Mar, 2018 - May, 2019 1 yr 2 months

    Software Engineer

    Blackmagic Design
  • Sep, 2019 - Feb, 2022 2 yr 5 months

    Software Development Engineer II

    Smiths Detection
  • Nov, 2016 - Feb, 2017 3 months

    Software Engineer intern

    Works Applications

Applications & Tools Known

  • NodeJS
  • AWS Lambda
  • AWS EC2
  • Apollo GraphQL
  • MongoDB
  • Azure DevOps
  • Docker Swarm
  • Qt
  • C++
  • C#
  • jQuery
  • Bootstrap
  • Virtuoso
  • Ansible
  • Azure Data Factory
  • Bubble

Work History

8.3 Years

Consultant

Synechron
Sep, 2025 - Present 5 months
    Building an Open Finance framework for UAE's leading bank, ENBD. Technology used: JavaScript, NodeJS, TypeScript, MongoDB, Microsoft Copilot.

Backend Developer

Genie Connections
Jul, 2024 - Present 1 yr 7 months
    Building and maintaining the backend architecture of the Genie Connections application. Technology used: NodeJS, Mongoose, JavaScript, AWS (Lambda, EC2, API Gateway, CloudWatch), MongoDB.

Backend Developer

Koottu
Jul, 2024 - Present 1 yr 7 months
    Building and maintaining the backend architecture of the Koottu application. Technology used: NodeJS, Hapi, AWS (Lambda, EC2, API Gateway, CloudWatch), MongoDB.

Technical Lead

Turing
Mar, 2025 - Jun, 2025 3 months
    Trained and benchmarked the Gemini LLM against various other LLMs such as Claude and ChatGPT. Technology used: JavaScript, React, Python, FastAPI, TypeScript.

Senior Engineer

Saagas AI
Jul, 2024 - Jan, 2025 6 months
    Built a platform to help AI creators and artists securely publish and fairly monetize their work. Technology used: Python, FastAPI, React, TypeScript, AWS, MongoDB.

Technical Lead

IQ-Line
Nov, 2023 - May, 2024 6 months
    Led a cross-functional team of 5 developers in the design, development and implementation of a cutting-edge Laboratory Information Management System (LIMS) for healthcare laboratories. Implemented automated deployment of the LIMS across 100+ diagnostic centers pan-India. Technology used: NodeJS, React, TypeScript, AWS (Lambda, EC2, CloudWatch), PostgreSQL.

Backend AI Developer

Million Labs
Jul, 2023 - Feb, 2024 7 months
    Developed an image viewer application with features such as drawing shapes, labelling, transformations, touch-device support, pressure sensitivity and color coding. Developed a web application that classifies transactions by transaction code, by reading messages on the phone, and uses them to build monthly budgets with AI. Technology used: Bubble, Python.

Senior Backend Engineer

DarioHealth
Mar, 2023 - Oct, 2023 7 months
    Owned the overall accounts-system microservice-based architecture across multiple applications, designing and implementing APIs from scratch. Technology used: NodeJS, React, TypeScript, AWS (Lambda, EC2, CloudWatch), PostgreSQL.

Senior Backend Developer

BizzTM Technologies
Jun, 2022 - Mar, 2023 9 months
    Built an e-commerce platform catering to tier-II and tier-III cities for their regular household needs. Designed APIs that enable core e-commerce functionality, from product exploration to order placement. Set up the entire deployment lifecycle from scratch for development, staging and production environments. Technology used: NodeJS, Apollo GraphQL, AWS (Lambda, EC2, CodeDeploy), MongoDB, Python, Bitbucket Pipelines.

Software Development Engineer II

Smiths Detection
Sep, 2019 - Feb, 2022 2 yr 5 months
    Created a cloud-based solution for enhancing airport security and automating customs clearance. Took complete handover of the project's backend codebase from Hitachi Consultancy and understood the end-to-end architecture of the project. Technology used: C#, C++, Python, MySQL, MongoDB, RabbitMQ, Azure DevOps, Docker Swarm.

Software Engineer

Blackmagic Design
Mar, 2018 - May, 2019 1 yr 2 months
    Added new features and fixed bugs in DaVinci Resolve's Edit and First Cut pages. Technology used: Qt, C++.

Full Stack Developer

CusXP
Apr, 2017 - Feb, 2018 10 months
    Integrated third-party APIs, performed sentiment analysis, built an AI consensus engine and generated real-time automated reports on the CusJo platform using Natural Language Processing techniques. Built a survey bot at CusXP to automate survey form filling in a chat format. Technology used: C#, AngularJS 1.6, Google Natural Language API, Dialogflow.

Software Engineer intern

Works Applications
Nov, 2016 - Feb, 2017 3 months
    Built Inventory Management System software to track the life cycle of products in an inventory. Built Personal Information Management System software to manage employees' personal details, attendance records, leaves, payroll information, etc. Technology used: JavaScript, jQuery, Bootstrap.

Achievements

  • Obtained All India Rank 36 out of 1,55,190 students in GATE 2014 (99.97 percentile)
  • Worked as a Teaching Assistant (TA) throughout the MTech tenure, 2014-16
  • Qualified for the ACM-ICPC Asia regional finals held at Amritapuri, 2013
  • Achieved first rank in the inter-college coding competition at the techno-management festival of IIEST Shibpur, 2014
  • Received a full scholarship from the State Government (WBMDFC) during B.E., 2010-14
  • Completed vocational training on Core Java at Moniba Academy, jointly partnered with IBM, 2012

Major Projects

3 Projects

Efficient learning of pre-ordering rules for Machine Translation

May, 2015 - Jun, 2016 1 yr 1 month
    Learned rules incrementally for re-ordering sentences using selective examples and source-side grammar. Retained both the syntactic structure and lexical-level details in the generated rules. Designed a deterministic framework that can extend to any language pair while performing better than the baseline translation.

Understanding Social Behaviour by Sensing Contexts

Jul, 2013 - Jun, 2014 11 months
    Performed face detection first, then created clusters for each facial expression. Used a KNN classifier to assign a new image to its corresponding cluster. Tools/technology used: MATLAB.

Study of Semantic Query Retrieval from Triple Store using SPARQL

May, 2013 - Aug, 2013 3 months
    Analyzed how graph databases are stored. Performed queries written in SPARQL via a Java program on various graphs. Tools/technology used: SPARQL, Virtuoso, Protege.

Education

  • MTech in Computer Science Engineering

    IIT Bombay (2016)
  • B.E. in Computer Science and Technology

    IIEST, Shibpur (2014)

Certifications

  • Machine Learning by Andrew Ng, Stanford University, on Coursera (2015)
  • Deep Learning by Andrew Ng, Stanford University, on Coursera (2017)
  • Microsoft Specialist: Programming in HTML5 with JavaScript and CSS3
  • Certified Associate GraphQL Developer by Apollo (2022)
  • Completed the BizzTM Connect program in Node.js & Apollo GraphQL Server (2022)
  • Cleared a course on IT Foundation Skills to be recognized as a Cognizant Certified Student (2013)

Interests

  • Cricket
  • Travelling

AI-Interview Questions & Answers

    Hi, everyone. I'm a software development professional. My educational background: I did my bachelor's in Computer Science from IIEST Shibpur, joining in 2010 and graduating in 2014. After that I went for my master's and completed an MTech in Computer Science Engineering from the prestigious IIT Bombay, graduating in 2016. Since then I've been working as a software developer for a few companies, and my total work experience is more than 7 years. I started my career as a .NET developer with Angular as the front-end technology, and I have around 4 years of work experience as a .NET developer. For the past 3 years I've been working mostly in the JavaScript domain, and I have full-stack experience in Node.js as well as React.js. Apart from this, I have good experience in both SQL and NoSQL databases, and I also have good experience with Docker.

    To mitigate SQL injection risk in a Node.js application, we can use an ORM or parameterized queries rather than concatenating user input into hard-coded query strings. Beyond that, we should check the inputs received from the front end before running the query and do sanity checks to verify that each input is valid; that is another useful technique.
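
    A minimal sketch of the parameterized-query approach described above, using the mysql2 library; the table, columns and connection settings are illustrative assumptions, not taken from an actual project:

        // Hypothetical example: the "?" placeholder lets the driver escape the
        // value, so user input is never concatenated into the SQL string.
        const mysql = require('mysql2/promise');

        const pool = mysql.createPool({ host: 'localhost', user: 'app', database: 'shop' });

        async function findUserByEmail(email) {
          const [rows] = await pool.execute(
            'SELECT id, name FROM users WHERE email = ?',
            [email]
          );
          return rows[0] || null;
        }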

    To improve the load time of a single-page application built on React with many components, we can do code splitting: break the bundle into smaller chunks that are loaded on demand. We can do lazy loading using React Router, optimize images and other assets, and make use of CDNs, which also help. We can minimize the bundle size, minify and bundle the CSS, and add caching. All of these activities help improve the load time.
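
    A minimal sketch of route-level code splitting with React.lazy and React Router v6, as mentioned above; the page components and paths are illustrative:

        import React, { lazy, Suspense } from 'react';
        import { BrowserRouter, Routes, Route } from 'react-router-dom';

        // Each lazy() import becomes its own chunk, downloaded only when the
        // route is visited instead of shipping in the initial bundle.
        const Home = lazy(() => import('./pages/Home'));
        const Reports = lazy(() => import('./pages/Reports'));

        export default function App() {
          return (
            <BrowserRouter>
              {/* Suspense renders a lightweight fallback while a chunk loads */}
              <Suspense fallback={<div>Loading…</div>}>
                <Routes>
                  <Route path="/" element={<Home />} />
                  <Route path="/reports" element={<Reports />} />
                </Routes>
              </Suspense>
            </BrowserRouter>
          );
        }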

    We would use a separate interface for MongoDB and a separate interface for SQL, and bring them together behind a common API. Imagine an API that needs access to data in both of these databases: it connects to each of them using its own respective ORM or client, and it fetches (or pushes) data through those separate interfaces. Then, based on the data received from both sources, we can act on it: modify it or apply our business logic to it.
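
    A rough sketch of that idea, assuming Mongoose for the MongoDB side and mysql2 for the SQL side; the model, table and field names are hypothetical:

        const mongoose = require('mongoose');
        const mysql = require('mysql2/promise');

        // Each database gets its own client/ORM ("separate interface").
        const Profile = mongoose.model('Profile', new mongoose.Schema({ userId: Number, bio: String }));
        const sqlPool = mysql.createPool({ host: 'localhost', user: 'app', database: 'billing' });

        // The common API layer queries both sources and merges the results
        // before any business logic is applied to the combined data.
        async function getAccountOverview(userId) {
          const [profile, [invoices]] = await Promise.all([
            Profile.findOne({ userId }).lean(),
            sqlPool.execute('SELECT id, amount FROM invoices WHERE user_id = ?', [userId]),
          ]);
          return { profile, invoices };
        }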

    For a content management system, to design a MongoDB schema that handles multiple content types and their associations, we can store each piece of content as a JSON document and query on that. Imagine the documents stored in the DB, each carrying its content type and its associations. We can have a collection named contents holding different content types: one type can be an article or other text-based content, another an image, another a video or document. Each content document also stores an associations array recording which related content it is linked to, where every association has a type and a content ID. We can also track the author, so there is a separate author collection and each content document is tagged to one author. Articles or videos can additionally be tagged, so we can keep another entity, a tag collection, as well. That should be a good schema design in MongoDB for handling multiple content types and their associations.
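
    A possible Mongoose rendering of the schema described above; the exact fields and enum values are illustrative:

        const mongoose = require('mongoose');
        const { Schema } = mongoose;

        const authorSchema = new Schema({ name: String, email: String });

        const contentSchema = new Schema({
          // One contents collection holds every content type, discriminated by "type".
          type: { type: String, enum: ['article', 'image', 'video', 'document'], required: true },
          title: String,
          body: String,       // text-based content
          mediaUrl: String,   // image/video/document content
          author: { type: Schema.Types.ObjectId, ref: 'Author' },
          tags: [String],
          // Associations to related content: each entry keeps the relation type
          // and the id of the associated document.
          associations: [{
            relation: String,
            content: { type: Schema.Types.ObjectId, ref: 'Content' },
          }],
        }, { timestamps: true });

        const Author = mongoose.model('Author', authorSchema);
        const Content = mongoose.model('Content', contentSchema);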

    Here we can make use of something like Elasticsearch, or an in-memory cache, to store the most recently or most frequently accessed data. For repetitive queries we make sure the result is stored in the cache and served from the cache rather than making a DB query each time. That is how we can implement the caching mechanism.
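
    A minimal sketch of that cache-aside idea using a plain in-memory Map with a TTL; fetchProductFromDb is a hypothetical stand-in for the real database query:

        const cache = new Map();
        const TTL_MS = 60 * 1000; // how long a cached entry stays fresh

        async function getProduct(id, fetchProductFromDb) {
          const hit = cache.get(id);
          if (hit && Date.now() - hit.storedAt < TTL_MS) {
            return hit.value;                          // served from cache, no DB query
          }
          const value = await fetchProductFromDb(id);  // cache miss: query the database
          cache.set(id, { value, storedAt: Date.now() });
          return value;
        }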

    In this particular code snippet, we are not checking whether the query actually returns a user object or not, so we have to add that null check. Only if the user exists should we return res.status(200).json(user); if the user object doesn't exist, the code as written would end up producing a 500 error.
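
    The snippet itself is not reproduced here, but the fix described would look roughly like this Express/Mongoose handler; the route and model names are assumptions:

        app.get('/users/:id', async (req, res) => {
          try {
            const user = await User.findById(req.params.id);
            if (!user) {
              // Null check: respond with an explicit 404 when no user matches,
              // instead of falling through and failing later.
              return res.status(404).json({ error: 'User not found' });
            }
            return res.status(200).json(user);
          } catch (err) {
            return res.status(500).json({ error: 'Internal server error' });
          }
        });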

    The situation here is that we are trying to find a user and update their email. We need to add a check before calling findByIdAndUpdate, because it might be that the user does not exist at all. The if-condition that checks the user should come before this findByIdAndUpdate call: only if the user exists do we update. That is probably the issue.
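
    A sketch of one way to cover the missing-user case: findByIdAndUpdate resolves to null when no document matches, so the handler can check the value it returns (the route and fields here are assumptions):

        app.put('/users/:id/email', async (req, res) => {
          const user = await User.findByIdAndUpdate(
            req.params.id,
            { email: req.body.email },
            { new: true, runValidators: true } // return the updated doc and validate the email
          );
          if (!user) {
            // The user does not exist at all, so report that instead of updating.
            return res.status(404).json({ error: 'User not found' });
          }
          return res.status(200).json(user);
        });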

    In one of our previous roles, the entire application had a Node.js back end and the database was MongoDB. We ran into a situation where the number of users was scaling up and the MongoDB database was getting slow. We used MongoDB's built-in replication feature to increase the cluster size: earlier we had only one node, and we scaled it up to three nodes, with one primary and two secondaries. As the number of requests to the database increased, it could go from a single node to serving from multiple nodes. This was a feature of MongoDB itself, and that is how we managed to resolve it.
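
    As a rough illustration of that setup, a Mongoose connection string pointing at a three-node replica set, with reads allowed to go to the secondaries; the host names and replica-set name are hypothetical:

        const mongoose = require('mongoose');

        mongoose
          .connect(
            'mongodb://db1.internal:27017,db2.internal:27017,db3.internal:27017/app' +
              '?replicaSet=rs0&readPreference=secondaryPreferred'
          )
          .then(() => console.log('Connected to the replica set'));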

    To ensure code quality, we run unit tests and measure code coverage. We also make use of the hooks provided by GitLab to ensure the code is of good quality and that all the tests pass; only then is the feature merged into the main branch. This kind of code-quality gate is provided by repository providers such as GitHub or GitLab. We use GitLab, and that is how we have set it up.
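
    A small illustration of such a gate, assuming Jest as the test runner: a coverage threshold in jest.config.js makes the test run fail, and therefore blocks the merge in CI, when coverage drops below the chosen floor (the numbers are illustrative):

        // jest.config.js
        module.exports = {
          testEnvironment: 'node',
          collectCoverage: true,
          coverageThreshold: {
            // The run exits non-zero if global coverage falls below these values.
            global: { branches: 80, functions: 80, lines: 80, statements: 80 },
          },
        };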

    For managing data consistency, we use a message queue. Any change is published to the queue so that the different services consuming from it pull the data one at a time; using this queue architecture is how we manage data consistency across services.
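
    A compact sketch of that pattern using amqplib (RabbitMQ); the queue name, event shape and connection URL are assumptions:

        const amqp = require('amqplib');

        async function main() {
          const conn = await amqp.connect('amqp://localhost');
          const channel = await conn.createChannel();
          await channel.assertQueue('user-events', { durable: true });

          // Producer side: publish a change event instead of writing directly
          // into another service's database.
          channel.sendToQueue(
            'user-events',
            Buffer.from(JSON.stringify({ type: 'user.updated', userId: 42 })),
            { persistent: true }
          );

          // Consumer side: process events one at a time and acknowledge each
          // message only after the change has been applied locally.
          channel.consume('user-events', (msg) => {
            if (!msg) return;
            const event = JSON.parse(msg.content.toString());
            // ...apply "event" to this service's own data store...
            channel.ack(msg);
          });
        }

        main();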