Every problem is a gift—without problems we would not grow.
Tony Robbins
Summary
A high-impact Big Data/Cloud technical architect, consultant, engineer, and data analyst, and a key business driver, specializing in the creation, development, and administration of end-to-end, scalable, data-driven big data applications in the cloud.
With a 10+ year track record, I deliver data-centric solutions that drive monetization, cost savings, and product optimization, and I build sophisticated applications with versatility, scalability, and reliability.
Skilled in migrating legacy workloads to dynamic, resilient, and scalable platforms that help clients make the most of the cloud. Proficient in conducting technology workshops and training for onshore and offshore clients on demand.
An accomplished big data infrastructure platform architect, able to identify gaps and shape a comprehensive, end-to-end design and technology strategy.
A successful consultant who drives organizational change through research, optimization, and systems development. Experienced in conferring with employees and management to address problems with internal controls and procedures that negatively impact business operations. Focused on reducing costs, streamlining processes, and maximizing resource utilization.
Overview
10 years of professional experience
2 languages
2 years of post-secondary education
Work History
Xebia IT Architect India Pvt. Ltd.
Bangalore, Karnataka
Principal Consultant
02.2018 - Current
Job overview
Project 1: Resolving OTT data inconsistencies in ETL through Python, with automation through Workato
Ensured accuracy and consistency of revenue data by addressing inconsistencies among various file formats used by different OTT platforms.
Enabled seamless integration of consistently formatted data into Salesforce and Snowflake for enhanced analysis and dashboarding capabilities.
Automated the ETL process through the Workato iPaaS tool and by deploying Python code in AWS Lambda.
Led workshops with clients to identify their needs and develop appropriate solutions.
Advised clients on how to best utilize resources to improve operational efficiency.
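The format-normalization step in Project 1 can be sketched as a minimal Lambda-style Python handler. This is an illustrative sketch only: the common schema, the per-platform column names, and the field mappings below are assumptions, not the production code or the real OTT feed formats.

```python
import csv
import io
import json

# Hypothetical common schema; the real OTT feeds and field names are
# illustrative assumptions, not the actual production schema.
COMMON_FIELDS = ["platform", "title", "period", "revenue_usd"]

def normalize_csv(text, platform):
    """Map one platform's CSV column names onto the common schema."""
    # Assumed per-platform column mapping.
    mapping = {"Title": "title", "Month": "period", "Net Revenue": "revenue_usd"}
    rows = []
    for raw in csv.DictReader(io.StringIO(text)):
        row = {"platform": platform}
        for src, dst in mapping.items():
            row[dst] = raw.get(src)
        row["revenue_usd"] = float(row["revenue_usd"])
        rows.append(row)
    return rows

def normalize_json(text, platform):
    """Map one platform's JSON records onto the common schema."""
    rows = []
    for raw in json.loads(text):
        rows.append({
            "platform": platform,
            "title": raw["content_name"],
            "period": raw["report_month"],
            "revenue_usd": float(raw["earnings"]),
        })
    return rows

def handler(event, context=None):
    """Lambda-style entry point: pick a parser by file type and normalize."""
    if event["file_type"] == "csv":
        return normalize_csv(event["body"], event["platform"])
    return normalize_json(event["body"], event["platform"])
```

Once every feed is reduced to one schema this way, the downstream Salesforce/Snowflake load and dashboards no longer need per-platform logic.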
Project 2: Driving innovation through a Data Lake on AWS Cloud
Domain: Global Management Consulting Firm
Duration: Dec 2019 – Sept 2021
Tools & Technologies: AWS Cloud Services (IAM, VPC, S3, EC2, EMR, Athena, Lambda, RDS, Redshift, SNS, SageMaker, CloudWatch, CloudTrail, AWS CLI, Trusted Advisor, GuardDuty, KMS), Python 3.5, Qubole Data Service, Hadoop, Spark, Hive, Terraform
Project Description: Constructed an enterprise-level Data Lake platform on AWS Cloud for the firm's Technology and Digital unit, encompassing platform evaluation, development of dev/test/production environments, and hardening of landing zones with robust security controls. Migrated on-premises/in-house applications to cloud-based infrastructure through the implemented data lake.
Responsibilities:
• Develop implementation plans for the big data and Data Lake architecture team; work with the global architecture team to drive the technical strategy and architecture.
• Work with Application Architecture and Development teams to identify requirements for distributed computing services.
• Prepare technology blueprints, supporting technical documentation, and the technology roadmap.
• Engage in technology sourcing and selection activities, working with third-party vendors to identify solution options.
• Research and evaluate new tools and/or new architectures, reviewing them with analytics and data science teams as applicable.
• Cyber and firmwide security: established a secure VPC- and firewall-based cloud infrastructure and ensured that application infrastructure complies with the firm's policies and security requirements.
• Work with the engineering team to define best practices and processes supporting the entire infrastructure lifecycle (plan, build, deploy, operate), including automated lifecycle activities: self-service, orchestration and provisioning, and configuration management.
• Work closely with the platform delivery team, solutions architects, and application architects to move applications to the cloud and ensure environments are built accurately, to standard, and in line with the firm's security requirements.
• Used Python to scrape data from the web and provide it to the analytics team.
• Configured Git connectivity for instances running in the cloud to commit and read code from the version control repository; also connected SnapLogic and GitHub to automate ETL pipelines with CI/CD.
• Develop easy-to-use documentation for the frameworks and tools built, enabling adoption by other teams.
• Iterate rapidly and work collaboratively with product owners, developers, and other members of the development team
• Work with other team members, including DBAs, other ETL developers, technical architects, QA, business analysts, and project managers.
• Work directly with project owners across the business to plan, manage, and execute key projects.
• Follow and improve established processes with an agile approach to deliver ETL- and program-based solutions.

Project 3: Product feature development using Python
Domain: Global Management Consulting Firm's Product Unit
Duration: June 2018 – Nov 2019
Tools & Technologies: Python 3.5, GitLab, Jenkins, SQL DB, GitHub, PyCharm
Project Description:
· Developed end-to-end Python code to handle different kinds of client data and automate various ETL processes of a campaign and promotion management product engine for retail, where users can create different kinds of promotions and campaigns.
· Automated vendor process creation through a Python ETL pipeline, including file validation, vendor creation, user creation from values in the file, and policy and group creation for a central authentication system.
· Used Python to automate the client's authorization process for different users according to their roles by invoking the appropriate policies; this also included policy-creation and role-creation APIs in the customer authentication system.

Project 4: AI chatbot for the legal team
Domain: Artificial Intelligence
Duration: Feb 2018 – May 2018
Tools & Technologies: Python 3.5.2, NumPy, TensorFlow 1.2, NLTK, REST API, and HTML for the UI
Project Description:
Created a TensorFlow-based chatbot that generates replies based on what it learned from various input documents.
· Implementation is based on the legacy seq2seq model (static RNN) in TensorFlow version 1.2 and higher.
· The bot was created with a broader vision of using it across various domains and different kinds of support activities, such as policy, sales, and HR.
· The bot was trained on different kinds of conversational data, so it can also handle general conversation.
Affine Analytics
Bangalore
Senior Software Engineer
11.2014 - 02.2018
Job overview
Designed and developed an end-to-end pipeline system for the largest US-based online travel company (Expedia).
Extended the breadth and depth of data analysis by integrating raw data sources as inputs (e.g., turning user browsing logs into business insights).
Determined business objectives and built, debugged, deployed, and monitored production-level PySpark code through effective cross-functional collaboration with three teams (Data Science, ETL, and Campaign Analytics) in a dynamic, Agile/Lean environment.
Led cross-functional and cross-departmental collaborations, driving multidisciplinary team communication and cohesion.
Built an analytical framework over the existing data for deep-dive insights and business decisions.
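The browsing-log-to-insight step above can be illustrated with a minimal plain-Python sketch. The production code was PySpark, and the `timestamp|user|page` log format here is an assumption for illustration, not the actual schema:

```python
from collections import Counter

# Assumed raw log format: "timestamp|user_id|page" per line.
def parse_log_line(line):
    ts, user, page = line.strip().split("|")
    return {"ts": ts, "user": user, "page": page}

def page_view_counts(lines):
    """Aggregate raw browsing-log lines into per-page view counts,
    a simple example of the kind of business insight derived."""
    counts = Counter(parse_log_line(l)["page"] for l in lines if l.strip())
    return dict(counts)
```

In the real pipeline the same group-and-count logic would be expressed as a PySpark `groupBy(...).count()` over the distributed log dataset.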
Trendwise Analytics
Bangalore
Big Data /Social Media Analytics Intern
05.2014 - 10.2014
Job overview
Project Description: A proof of concept for analytics and enrichment of viewer data collected from various set-top boxes and social media.
Responsibilities: Designed concepts to capture and store data in real time and apply analytics over it.
Stored the captured data in a distributed environment (HDFS, using Sqoop and Flume); analyzed the data by performing EDA, applying rule-based analytics, and interpreting the results using Hive and R.
Data Visualization: Learned Tableau and created dashboards to visualize the generated results.
Connected Tableau to various databases and fetched data to build dashboards.
Designed rich graphical visualizations in Tableau, including drill-down and drop-down options.
Created new derived fields from existing columns and applied filters to aggregate data.
Education
SRM University
Chennai
Master of Technology in Database Systems
09.2012 - 05.2014
Rajiv Gandhi Proudyogiki Vishwavidyalaya
Indore
Bachelor of Engineering in Computer Science and Engineering
Skills
AWS Cloud Service
IAM
Athena
VPC
S3
EC2
EMR
Lambda
RDS
Redshift
EBS
SNS
CloudWatch
CloudTrail
AWS CLI
Trusted Advisor
GuardDuty
KMS
Microsoft Azure
Databricks
Azure Data Factory
Azure Storage Service
Framework/Components
Apache Spark
Hadoop
Hive
Sqoop
Flume
Jenkins Pipeline
Languages
Python
PySpark
SQL
Shell Scripting
Software/Tools/Utilities
Qubole Data Service platform
Jenkins
PyCharm
GitLab CI/CD
GitHub version control
Data Querying Tool
MySQL
MS SQL Server
Oracle 11g
AWS Athena
Spark SQL
Operating Systems
Windows 7/10
Linux (Ubuntu, Centos, Red Hat)
Mac OS X
Machine Learning
Basics of Linear Regression
Logistic Regression
Classification
TensorFlow
Personal Information
Gender: Male
Marital Status: Married
Publications
"Analytics and Prediction over Student's Record," Ananya Chandraker & G. Vadivu, International Journal of Advances in Science, Engineering and Technology, Volume 2, April edition, ISSN 2321-9009
"Mobile Computing Devices and Their Future," National Conference on Business Technologies (TBTC-09)
Work Preference
Work Type
Full Time, Contract Work, Part Time
Work Location
On-Site, Remote, Hybrid
Important To Me
Company Culture, Work-life Balance, Career Advancement
Software
AWS Cloud
Python
PySpark
SQL
Big Data
Interests
Learning new tools and technology
Travel
Trekking
Contact Me
ananyachandraker03@gmail.com
Timeline
Principal Consultant
Xebia IT Architect India Pvt. Ltd.
02.2018 - Current
Senior Software Engineer
Affine Analytics
11.2014 - 02.2018
Big Data /Social Media Analytics Intern
Trendwise Analytics
05.2014 - 10.2014
SRM University
Master of Technology in Database Systems
09.2012 - 05.2014
Rajiv Gandhi Proudyogiki Vishwavidyalaya
Bachelor of Engineering in Computer Science and Engineering