Currently working as Senior Big Data Engineer at Experian Services India Private Limited, Hyderabad, with 9 years of IT experience. Hands-on experience with Apache Spark, Python, Scala, boto3, the Chalice framework, AWS (IAM, Lambda, S3, API Gateway, DynamoDB, EMR, CloudWatch, SES, SNS), Apache Airflow, Hive, Sqoop, HDFS, UNIX/Linux commands, shell scripting, Oracle 11g SQL, PL/SQL, and the Control-M scheduling tool. Automated manual tasks using UNIX shell scripting. Delivered development projects following agile methodologies and worked on production support projects, gaining strong diagnostic skills by identifying the root causes of major production issues and resolving incident/problem tickets. Quick to learn modern technologies and work efficiently with them; proactive in sharing knowledge by conducting training sessions; capable of independently managing assigned work.
Data structures, OOP, OS internals (UNIX)
Experian, Banking and Finance | March 2020 – present | Developer, Sr. Big Data Engineer
Product development providing an end-to-end solution for customers to access services offered by Experian in the credit bureau space.
Environment: AWS (Lambda, API Gateway, S3, EMR, DynamoDB, IAM, SQS, SES, SNS), Python, Spark, boto3, AWS Chalice framework, Terraform, Airflow, Jenkins, Git
- Developed Python code using boto3 for DynamoDB and integrated it with the Chalice framework (see the sketch below).
- Configured Chalice to create the required infrastructure as code during deployment.
- Developed API Gateway services, integrated them with Lambda for execution-state management, and exposed them as endpoints to the UI.
- Developed Spark jobs and implemented orchestration with Airflow.
- Developed CI/CD pipelines in Jenkins.

JPMC, Banking and Finance | June 2017 – February 2020 | Developer, IT Analyst
Ingesting, transforming, and loading data from a Windows source server into the conformed zone of a Hadoop (Linux) cluster.
Environment: Apache Spark, Scala, Hive, Sqoop, UNIX shell scripting, Control-M
- Set up end-to-end SFTP between the source (Windows) and destination (UNIX) servers.
- Developed an SSIS package and Sqoop jobs to pull data from the source system.
- Created an automation script for creating and loading Hive external tables (see the second sketch below).
- Performed end-to-end testing of the UDS data flows for data loads.
- Automated recurring tasks using UNIX shell scripting.
- Created Control-M jobs to trigger the ingestion jobs.

JPMC, Banking and Finance | April 2016 – May 2017 | Developer, IT Analyst
Migrating Oracle-based PL/SQL code to Ab Initio graphs that consume data from the Hadoop semantic layer.
- Analyzed the PL/SQL code and converted it into Ab Initio graphs without changing functionality.
- Coordinated with the on-site team for day-to-day updates and requirement clarification.
- Performed end-to-end testing of the Ab Initio graphs for data loads.
- Automated recurring tasks using UNIX shell scripting.
- Created Control-M jobs to trigger the graphs.
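A minimal sketch of the boto3 + Chalice pattern described in the Experian project above, not the production code: a Chalice route backed by a DynamoDB table. The app name, table name, key schema, and routes are hypothetical placeholders.

```python
import boto3
from chalice import Chalice, NotFoundError

app = Chalice(app_name='bureau-services')    # hypothetical app name
dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('service-requests')   # hypothetical table

@app.route('/requests/{request_id}', methods=['GET'])
def get_request(request_id):
    # Fetch a single item by partition key; return 404 if absent.
    resp = table.get_item(Key={'request_id': request_id})
    item = resp.get('Item')
    if item is None:
        raise NotFoundError('request %s not found' % request_id)
    return item

@app.route('/requests', methods=['POST'])
def create_request():
    # Persist the JSON request body as a new item.
    body = app.current_request.json_body
    table.put_item(Item=body)
    return {'status': 'created', 'request_id': body.get('request_id')}
```

Deploying with `chalice deploy` provisions the Lambda function and API Gateway endpoints, which is the infrastructure-as-code behavior referenced above.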
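A hedged sketch of the Hive external-table automation described in the JPMC ingestion project, re-expressed in PySpark with Hive support; the original was a shell-based automation script, and the database, schema, and HDFS path here are illustrative placeholders.

```python
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName('uds-ingestion')        # hypothetical app name
         .enableHiveSupport()
         .getOrCreate())

LANDING_PATH = '/data/landing/accounts'   # hypothetical HDFS path

# Create an external table over the landing zone if it does not exist.
spark.sql("""
    CREATE EXTERNAL TABLE IF NOT EXISTS conformed.accounts (
        account_id STRING,
        balance    DECIMAL(18,2)
    )
    PARTITIONED BY (as_of_date DATE)
    STORED AS PARQUET
    LOCATION '{path}'
""".format(path=LANDING_PATH))

# Register new partitions after files arrive via SFTP/Sqoop.
spark.sql('MSCK REPAIR TABLE conformed.accounts')
```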
JPMC, Banking and Finance | February 2015 – March 2016 | Developer, Systems Engineer (SE)
Evaluating and mitigating the impact of migrating existing home-equity loans from VLS to MSP, with thorough validation and testing so that end users and applications are not affected by the migration.
- Analyzed the application and performed a detailed impact analysis to ensure a smooth transition through the upgrade/migration.
- Coordinated with the on-site team for day-to-day updates and requirement clarification.
- Performed end-to-end testing of the updated scripts.
- Collaborated with the different teams on the VLS-MSP migration project.
- Automated recurring tasks using UNIX shell scripting.
- Modified the existing Oracle procedures, adding new queries so the procedures could process MSP data.
- Developed the L2 tool in UNIX shell scripting, which parses a large volume of SQL scripts and produces a detailed report containing an object-level mapping (a sketch of the idea follows below).
- Developed the L3 tool in UNIX shell scripting, which extends the report to object-level plus attribute-level mapping.
- Developed Control-M XML draft parsing tools.
- Automated bulk table loads from CSV files received from SharePoint.
- Automated bulk report generation from the source tables and the upload of reports to user-specific SharePoint paths.
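The L2 tool above was written in UNIX shell; this is a rough Python re-sketch of the same idea, scanning a directory of SQL scripts and reporting which database objects each script references. The regex and paths are illustrative, not the original implementation.

```python
import os
import re
import sys

# Match object names following FROM / JOIN / INSERT INTO / UPDATE.
OBJECT_RE = re.compile(
    r'\b(?:FROM|JOIN|INSERT\s+INTO|UPDATE)\s+([A-Za-z_][\w.]*)',
    re.IGNORECASE)

def object_mapping(sql_dir):
    """Return {script_name: sorted list of referenced objects}."""
    mapping = {}
    for name in sorted(os.listdir(sql_dir)):
        if not name.lower().endswith('.sql'):
            continue
        with open(os.path.join(sql_dir, name)) as fh:
            objects = set(OBJECT_RE.findall(fh.read()))
        mapping[name] = sorted(objects)
    return mapping

if __name__ == '__main__':
    # Usage: python l2_mapping.py /path/to/sql/scripts
    for script, objects in object_mapping(sys.argv[1]).items():
        print('%s: %s' % (script, ', '.join(objects)))
```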