Collaborative engineering professional with 7.10 years of IT experience, including 5.10 years in Big Data (Hadoop) with strong working knowledge of PySpark. Demonstrated history of using Microsoft Azure cloud services and over 1.5 years of leading a team of data engineers. Extensive hands-on exposure to Apache Spark, Hive, Sqoop, HDFS, and automation tools such as Autosys. Strong understanding of the Telecom and Healthcare domains.
Objective : AbbVie is an American publicly traded biopharmaceutical company founded in 2013 as a spin-off of Abbott Laboratories, headquartered in North Chicago, United States. Its products include Humira, Skyrizi, Rinvoq, Norvir, Mavyret, Kaletra, and Zinbryta, and it employs almost 50,000 people. By far AbbVie's biggest sales generator is Humira, which doctors prescribe to treat arthritis and Crohn's disease; it accounts for more than half of AbbVie's total annual sales.
Project Responsibilities:
Abbvie Sypher Build, Abbvie Marketo Development
• Understand the existing design and logic of the legacy project.
• Understand the Salesforce UI application.
• Analyze the existing SQL queries in the Teradata application.
• Develop the base layer tables (dimension and fact tables) based on the requirements.
• Base layer creation – load the base layer tables by running Unix jobs through the Autosys automation tool.
• Perform Sanity and SIT testing of base layer tables in DEV, QA and PROD.
• KPI layer build – perform metric calculations by creating a branch in Azure DevOps and load the data into Postgres/Snowflake tables as per the client requirement (see the sketch after this list).
• Perform Jira activities in each sprint.
• Worked with PySpark, Python scripting, Jupyter, pgAdmin, and Snowflake for code preparation and automation.
• Attend client and status calls; also prepare the consolidated report and full-day status for the entire team, and represent the team on client calls.
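A minimal PySpark sketch of the KPI layer build described above, assuming hypothetical base-layer tables (base.dim_physician, base.fact_rx), an illustrative metric, and placeholder Postgres connection details; this is not the actual project code.

from pyspark.sql import SparkSession, functions as F

spark = (SparkSession.builder
         .appName("kpi_layer_build")
         .enableHiveSupport()
         .getOrCreate())

# Read the base-layer dimension and fact tables (names are assumptions).
dim = spark.table("base.dim_physician")
fact = spark.table("base.fact_rx")

# Example metric: total prescriptions per region.
kpi = (fact.join(dim, "physician_id")
           .groupBy("region")
           .agg(F.sum("rx_count").alias("total_rx")))

# Publish the KPI table to Postgres over JDBC; connection values are placeholders.
(kpi.write.format("jdbc")
    .option("url", "jdbc:postgresql://<host>:5432/<db>")
    .option("dbtable", "kpi.trx_by_region")
    .option("user", "<user>")
    .option("password", "<password>")
    .option("driver", "org.postgresql.Driver")
    .mode("overwrite")
    .save())

The same DataFrame could instead be written to Snowflake with the Spark Snowflake connector when that is the target required by the client.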
Objective : BDM (Bell Data Management)
Bell offers a variety of advanced digital video, high-speed Internet and telephone services over its own IP network. All dataset information is captured in an RDBMS that holds the digital media information for every customer. Business customers of all sizes are provided with high-speed Internet, phone and long-distance services, as well as data and video transport services. Bell Media offers local and national cable advertising in both traditional and new media formats, along with promotional opportunities and production services that enhance the business for customers across the regions.
Role: System Engineer
Technical Responsibilities:
· Moved all product information of existing customers to HDFS for further processing.
· Worked on Spark DataFrame transformations to implement the business analysis logic and applied actions on top of the transformations (an illustrative sketch follows this list).
· Imported and exported data between the RDBMS and HDFS/Hive using Sqoop.
· Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
· Loaded and transformed large sets of semi-structured data.
· Implemented test scripts to support test-driven development and continuous integration.
· Involved in requirement gathering, design, development, and testing.
· Analyzed the requirements to set up the cluster.
· Created external Hive tables on top of the parsed data (a further sketch follows this list).
· Wrote Hive queries for data analysis to meet the business requirements.
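An illustrative PySpark sketch of the DataFrame transformations mentioned above; the HDFS path, column names, and Hive table name are assumptions made for the example, not the actual BDM code.

from pyspark.sql import SparkSession, functions as F

spark = (SparkSession.builder
         .appName("bdm_customer_products")
         .enableHiveSupport()
         .getOrCreate())

# Read raw product records landed on HDFS (path is hypothetical).
raw = spark.read.option("header", True).csv("hdfs:///data/bdm/products/")

# Transformations: keep active products, stamp the load date, de-duplicate.
cleaned = (raw.filter(F.col("status") == "ACTIVE")
              .withColumn("load_dt", F.current_date())
              .dropDuplicates(["customer_id", "product_id"]))

# Action: persist the result as a managed Hive table (database assumed to exist).
cleaned.write.mode("overwrite").saveAsTable("bdm.customer_products")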
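Likewise, a sketch of creating an external Hive table over parsed data and running an analysis query through spark.sql; the location, schema, and query are hypothetical.

from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("bdm_external_tables")
         .enableHiveSupport()
         .getOrCreate())

# External table over parsed usage data already sitting in HDFS.
spark.sql("""
    CREATE EXTERNAL TABLE IF NOT EXISTS bdm.parsed_usage (
        customer_id STRING,
        product_id  STRING,
        usage_gb    DOUBLE
    )
    STORED AS PARQUET
    LOCATION 'hdfs:///data/bdm/parsed/usage/'
""")

# Example analysis query: total usage by product.
top_usage = spark.sql("""
    SELECT product_id, ROUND(SUM(usage_gb), 2) AS total_gb
    FROM bdm.parsed_usage
    GROUP BY product_id
    ORDER BY total_gb DESC
""")
top_usage.show(10)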