Full-stack software developer with 3+ years of experience across e-commerce, supply chain software, education, and government services. Hands-on experience building scalable websites and applications with a wide range of front-end and back-end technologies, including JavaScript, Node.js, Python, Vue.js, React, MongoDB, and PostgreSQL. A 2019 B.E. + M.Sc. dual-degree graduate of BITS Pilani, interested in both back-end and front-end development, solving analytical problems, and experimenting with new technologies.
Overview
3 years of professional experience
6 years of post-secondary education
8 Certifications
Work History
Software Development Engineer 2
Zeno Health
10.2023 - Current
Zeno Health is an omnichannel pharmacy with 200+ offline stores and e-pharma web and mobile applications through which users can order medicines.
The backend for both the customer-facing application and the admin panels is built on Node.js with Express; the frontend uses Vue.js.
Designed and developed a lead management system for capturing and managing leads generated through health camps and other marketing activities.
Identified improvements to the existing Elasticsearch search query by benchmarking against competitor e-pharma websites such as Tata 1mg and Flipkart Health+. Modified the search query and changed how the index is built, improving the quality of search results with more accurate and predictable matches (an illustrative query sketch follows this list).
Worked on merging Zeno Health's backend operations with those of its new acquisition: backend events such as order creation, order modification, and pincode serviceability changes were captured and sent via AWS SQS to stay in sync with Zeno Health's common backend.
Ideated and developed a panel for migrating and tracking high-value recurring chronic-care customers acquired from a competitor. The panel supported setting reminders against each lead, efficient querying of leads, a summary segment tracking lead conversion, and a timeline of all activities performed on each lead.
Fixed various production bugs in legacy code involving coupon application and discount-calculation logic.
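The query tuning mentioned above was roughly of the following shape. This is an illustrative sketch only: the production backend is Node.js, and the index name, fields, and boosts below are assumptions; the Elasticsearch query DSL itself is the same across clients.

```python
# Illustrative sketch only: index name, fields, and boosts are hypothetical.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

def search_medicines(term: str):
    """Typo-tolerant product search with per-field boosting."""
    return es.search(
        index="medicines",
        query={
            "multi_match": {
                "query": term,
                "fields": ["name^3", "salt_composition^2", "manufacturer"],
                "fuzziness": "AUTO",  # tolerate small misspellings in user input
            }
        },
        size=20,
    )
```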
Software Development Engineer 1
PandoCorp
03.2022 - 10.2023
Pando is a supply-chain technology company serving Fortune 500 companies; its product is a cloud-based SaaS platform.
Worked across various modules and microservices such as manage, track-trace, and courier-service.
Wrote REST APIs using Express and Fastify (Node.js) to process various stages of indent/order management.
Wrote an API to track halt durations of trucks carrying consignments.
Revamped the courier integration API to handle creation, updating, and deletion of bookings made with leading international and domestic courier service providers such as Safexpress, FedEx, DHL, and Delhivery.
Developed and maintained frontend components using Vue.js.
Integrated the Google Maps library into the Vue.js application for route visualization.
Used AWS services such as Lambda for scheduled (cron-style) jobs and S3 buckets for asset management (an illustrative handler sketch follows this list).
Worked on POCs to improve the search-query performance of indents/orders.
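Purely as an illustration of the Lambda-based cron jobs mentioned above (the actual Pando services are Node.js), a scheduled job could look like the Python sketch below; the bucket name, key layout, and the EventBridge schedule that would trigger the handler are all assumptions.

```python
# Hypothetical sketch of a scheduled Lambda job writing a small status report to S3.
# Assumes an EventBridge schedule rule invokes this handler and that a bucket named
# "pando-exports" exists; both are illustrative placeholders.
import json
from datetime import datetime, timezone

import boto3

s3 = boto3.client("s3")

def handler(event, context):
    """Run a periodic housekeeping task and record when it ran."""
    report = {
        "ran_at": datetime.now(timezone.utc).isoformat(),
        "source": event.get("source", "manual"),
    }
    s3.put_object(
        Bucket="pando-exports",
        Key=f"cron-reports/{report['ran_at']}.json",
        Body=json.dumps(report).encode("utf-8"),
        ContentType="application/json",
    )
    return {"statusCode": 200}
```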
Software Development Engineer
Ardhas Technology India Pvt. Ltd.
06.2021 - 03.2022
Feature development and maintenance of https://www.ilearn.gov.in/
iLearn is an education/scholarship portal hosted by the Ministry of External Affairs of India. It is used by African nationals to access Indian university degree programs and MOOCs (NPTEL/SWAYAM). The project was previously handled by IIT Madras.
Completed knowledge transfer (KT) and took ownership of all existing software, code, documents, and related artifacts from the previous partner agency, IIT Madras. iLearn's existing web application was built in Python (webapp2 framework) and ran on Google App Engine, GCP's serverless platform, with Google Firestore (a serverless NoSQL document store) as the main database and Google Cloud Storage for larger documents such as student images.
Coordinated with new higher educational institutions (HEIs) and provided the API setup for 6 universities to integrate their student enrolment pages with the iLearn portal, enabling seamless transfer of student information from iLearn to each HEI.
Developed new admin modules for tracking student enrollment status and student course progress, and for managing university fees, semester course details, and scholarship approvals. Built a CSV and document upload feature within the student progress and student enrollment modules; errors found in an uploaded CSV were available for download.
Developed dashboards using Chart.js for tracking student-related data, e.g. country-wise enrollment, university-wise enrollment, enrollment status, university-wise scholarships, and semester-wise student progress.
Developed a reports module that generates summary reports for parameters specified by the client.
Wrote a Python script that generates the summary data shown in the dashboard and report modules; it runs as a cron job on Google App Engine at specified intervals (a simplified sketch of such a job follows this list).
Provided inputs for all related documentation (SRS, user manuals) of the new modules as per client requirements.
Migrated the project to a new cloud service and revamped the UI of the portal; migrated data from Firestore to MongoDB.
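A simplified sketch of the summary-generation job described above; the Firestore collection and field names are hypothetical, and on App Engine the function would sit behind a handler registered in cron.yaml so the platform can invoke it on a schedule.

```python
# Illustrative sketch: collection and field names are made up.
from collections import Counter

from google.cloud import firestore

db = firestore.Client()

def generate_enrollment_summary():
    """Aggregate per-country enrollment counts and store them for the dashboard."""
    counts = Counter()
    for doc in db.collection("students").stream():
        student = doc.to_dict()
        counts[student.get("country", "Unknown")] += 1

    # Persist the pre-computed summary so the dashboard/report modules read it cheaply.
    db.collection("summaries").document("enrollment_by_country").set(
        {"counts": dict(counts), "total": sum(counts.values())}
    )
```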
Chatbot Development for Indian Embassy in Singapore
https://www.hcisingapore.gov.in/
Developed the backend for an FAQ-based chatbot POC using the FastAPI framework (Python) and the pretrained spaCy NLP model en_core_web_md for finding similar questions. Text was preprocessed before matching by removing stop words, punctuation, and pronouns using spaCy (a simplified matching sketch follows).
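A minimal sketch of the similarity matching described above, assuming en_core_web_md is installed; the FAQ entries and the 0.75 threshold are placeholders, and in the actual POC a function like this would sit behind a FastAPI endpoint.

```python
# Minimal sketch of FAQ matching with spaCy vector similarity; FAQ data is illustrative.
import spacy

nlp = spacy.load("en_core_web_md")

FAQ = {
    "How do I apply for a scholarship?": "Use the scholarship section of the portal.",
    "Which documents are needed for a visa?": "See the consular services checklist.",
}

def preprocess(text: str) -> str:
    """Drop stop words, punctuation and pronouns before comparing vectors."""
    doc = nlp(text)
    kept = [t.text for t in doc if not (t.is_stop or t.is_punct or t.pos_ == "PRON")]
    return " ".join(kept) or text

def best_answer(question: str, threshold: float = 0.75):
    """Return the answer for the most similar FAQ question, if similar enough."""
    query = nlp(preprocess(question))
    scored = [(query.similarity(nlp(preprocess(q))), q) for q in FAQ]
    score, match = max(scored)
    return FAQ[match] if score >= threshold else None
```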
Education
B.E. Chemical Engineering + M.Sc. (Dual Degree)
BITS Pilani
India
07.2013 - 07.2019
Skills
Node JS
Vue JS
JavaScript
MySQL
MongoDB
PostgreSQL
Redis
Flutter
Hadoop
Spark
Kafka
Elasticsearch
GCP - Google Cloud Platform
Solr
Flask
Django
Spring Boot
React JS
AWS
Java
Languages
English
Advanced
Tamil
Advanced
Certification
Introduction to Algorithms and Analysis, IIT Kharagpur, NPTEL
Publication
Biokinetics of fed-batch production of poly (3-hydroxybutyrate) using microbial co-culture.
Subramanian, A.M., Nanjan, S.E., Prakash, H., et al. (1 Feb 2020). https://link.springer.com/article/10.1007/s00253-019-10274-7
Projects
Stock Portfolio and Watchlist application
Users can create, view, and edit their watchlists and portfolios. Stock prices are updated in real time using WebSockets (Socket.io), and notification popups and emails are sent when price targets are reached. The React.js application is served via Django.
Used Django REST Framework to create JSON REST API endpoints for the React.js application, and Socket.io to update stock data in the watchlists and portfolio in real time. Used a PostgreSQL database to store user, watchlist, portfolio, and stock-specific data, along with up to 5 years of historical stock data.
Created a Celery-based web scraper for periodically fetching stock-related data such as news and historical prices; essential data was stored in a cache and later bulk-updated into the PostgreSQL database (a simplified Celery sketch follows this section).
Created a Node.js-based Socket.io server for updating stock prices and dashboard data in real time.
Used JWT-based authentication between the services; user authentication was handled via Django as well as Google OAuth.
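A simplified sketch of the Celery-based periodic fetcher described above; the broker URL, schedule, symbol, and news endpoint are placeholders, and the cache/bulk-update step is only indicated in a comment.

```python
# Illustrative Celery periodic task; broker, endpoint, and schedule are placeholders.
import requests
from celery import Celery
from celery.schedules import crontab

app = Celery("stocks", broker="redis://localhost:6379/0")

@app.task
def fetch_stock_news(symbol: str):
    """Fetch the latest headlines for one symbol."""
    resp = requests.get(
        "https://example.com/api/news", params={"symbol": symbol}, timeout=10
    )
    resp.raise_for_status()
    # In the project, results were cached here and later bulk-written to PostgreSQL.
    return resp.json()

@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    # Refresh news for a watchlist symbol every 30 minutes.
    sender.add_periodic_task(crontab(minute="*/30"), fetch_stock_news.s("AAPL"))
```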
Twitter Feed Consumption and Analysis
Created services for receiving and analysing a Twitter feed based on keyword/hashtag filters set by the user.
Used Kafka as the messaging and storage layer for receiving and forwarding the continuous Twitter feed. Created a Java-based Kafka producer using Twitter developer tools that continuously sends the Twitter feed to the Kafka service.
Used a cloud-based Elasticsearch cluster (bonsai.io) as the endpoint for feed consumption and analysis; a Kafka consumer service runs in the background, transferring the Twitter data from Kafka to Elasticsearch (a consumer sketch follows this section).
Observed increased usage of Bitcoin hashtags during recent periods.
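The consumer side of the pipeline looked roughly like the sketch below, written here in Python for illustration (the producer in this project was Java-based); the topic name, broker address, Bonsai credentials, and tweet field names are placeholders.

```python
# Illustrative Kafka -> Elasticsearch consumer; connection details are placeholders.
import json

from elasticsearch import Elasticsearch
from kafka import KafkaConsumer

es = Elasticsearch("https://user:password@my-cluster.bonsai.io:443")
consumer = KafkaConsumer(
    "tweets",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    tweet = message.value
    # Index each tweet by its id so re-processing the topic stays idempotent
    # (assumes the classic Twitter v1.1 payload with an "id_str" field).
    es.index(index="tweets", id=tweet["id_str"], document=tweet)
```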
POC using Hadoop Ecosystem for a Mobile Engagement Company
Scaled up and migrated a legacy application running on Oracle and Java to Hadoop and Java to handle an increase in volume from 200 million to 500 million SMS per day, with peak volumes reaching 100 million per hour.
Wrote consumer APIs to consume data from Kafka and load it into Solr and HBase.
Used Oozie to schedule overnight summary-generation jobs that move data from HBase to Hive.
Exported the external summary tables from Hive to MySQL using Sqoop.
Stored historical data in Parquet format to save space and used Apache Drill for fast retrieval (a minimal Parquet-writing illustration follows).
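A tiny illustration of the Parquet point above, written in Python with pandas/pyarrow for brevity (the actual POC sat on the Hadoop ecosystem and was queried via Apache Drill); the column names and figures are made-up placeholders.

```python
# Illustration only: columnar, compressed Parquet vs. row-oriented text storage.
import pandas as pd

# Made-up summary rows standing in for nightly aggregates.
summary = pd.DataFrame(
    {"date": ["2020-01-01", "2020-01-02"], "sms_count": [210_000_000, 230_000_000]}
)

# Snappy-compressed Parquet preserves column types and compresses far better than CSV,
# which is what makes long historical retention cheap to store and fast to scan.
summary.to_parquet("daily_sms_summary.parquet", compression="snappy", index=False)
```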