From: route@monster.com

Sent: Monday, September 28, 2015 12:59 PM

To: hg@apeironinc.com

Subject: Please review this candidate for: Talend

 

This resume has been forwarded to you at the request of Monster User xapeix03

Narender Kanuganti 

Last updated:  06/09/14

Job Title:  not specified

Company:  not specified

Rating:  Not Rated

Screening score:  not specified

Status:  Resume Received


Alpharetta, GA  30005
US

Mobile: 470-564-4000   
naren.kanuganti@gmail.com
Contact Preference:  Email

Quick View Links:

Resume Section

Summary Section

 

 

RESUME

  

Resume Headline: BG Resume

Resume Value: z26xtgvg5ha8f22j   

  

 

Narender Reddy Kanuganti, MBA

Certified Big Data Consultant

naren.bigdatapro@gmail.com | Cell: 470-564-4000

 

BIG DATA Practice: Hadoop · Data Integration · Data Warehousing · Java · Linux · Business Intelligence · E-Commerce · Web Analytics · SDLC · Rally/Agile/Scrum.

SUMMARY:                                                                                        

·  Tech-savvy, techno-functional Project Manager/Scrum Master turned Big Data Consultant with 11+ years of total IT experience across a broad spectrum of industries (IT, Retail, Telecom, Healthcare, Insurance, Logistics), performing Business Analysis, Predictive Data Analysis, Data Science Analytics, Requirements Elicitation, Business Modeling, Use Case development with UML methodology, and UAT; a get-things-done attitude and a strong boots-on-the-ground mentality for steering projects into the right portfolios and creating win-win situations.

·  About 3 years of hands-on experience in Big Data technologies such as Hadoop and its ecosystem, staying hands-on with the technology while also engaging top-management stakeholders.

·  Skilled at securing sign-off from key (and tough) stakeholders through diplomacy, acting as a liaison that bridges the gap between the business and technical worlds; keeps all stakeholders in sync, participates in all phases of the product life cycle from inception to launch, and ensures project deliverables stay within time and scope.

 

·  Effective communicator, good listener, and team player with strong technical documentation skills and an analytical bent of mind; a strong willingness to learn new systems, good coordination skills, and the ability to ramp up quickly and hit the ground running.

 

 

·  Team-oriented, self-starting professional working with cross-functional teams in a techno-functional environment, playing the dual roles of Big Data Analyst and Jr. Project Manager intermittently, focusing on project budgets and timelines while avoiding scope creep.

·  Proactive about entering the developer's world: willing to wrangle with code, get hands-on with hardware and software, troubleshoot and resolve technical issues, take ownership of the product, and drive the project end to end.

·  A strong deal-maker with customer-focused negotiating and solution-finding skills, able to wear different hats in different scenarios; able to read code, critically analyze bugs and defects, and deliver on end users' and business requirements.

·  Self-starting, self-motivated team leader; detail-oriented, able to think outside the box, with a thirst for learning new systems, adaptability to changing corporate environments, and a focus on improving the company's bottom line. Strong in Agile and Waterfall methodologies; expert at managing the contracts backlog and prioritizing against project deadlines.

·  Experienced in full-lifecycle implementations of Big Data (Hadoop) and CRM (Salesforce) solutions: gathering requirements from clients, performing customizations, setting up configurations, and performance testing. Able to demystify predictive analytics and turn an algorithm's output into actionable insights that add business value to the enterprise. Strong understanding of validation and documentation principles.

·  Expert at gathering, defining, and capturing requirements from front-end business stakeholders with clarity and without ambiguity, then working with the back-end technical team to ensure development designs meet the business needs and specifications, keep all team members aligned with project deliverables, and avoid scope creep.

 

·  Strong understanding of the Software Development Life Cycle (SDLC), including good knowledge of the RUP methodology; expertise in SOA; able to reproduce, investigate, and debug software issues. Knowledge of both Agile and Waterfall development methodologies. Enterprise web development experience with AccuProcess, SaaS (Ajax), and Informatica Cloud; extensive knowledge of data warehouse concepts, HP PPM, and SAP Business Objects.

·  Extensive experience interacting with offshore/virtual teams and stakeholders; problem management analysis; eliciting requirements and creating Business Requirement Documents (BRD), User Requirement Specifications, Functional Requirement Documents (FRD), System Requirement Specifications (SRS), and test plans; analyzing and creating use cases, use case diagrams, process flow diagrams, BPMN diagrams, activity diagrams, and system workflows. Strong understanding of all versions of SharePoint.

·  Experience applying the Rational Unified Process (RUP) methodology using modeling and requirements-documentation tools such as MS Office, MS Visio, and MS Project.

·  Conducting and facilitating JAD sessions and communicating with key stakeholders, development teams, SMEs, system analysts, business analysts, project managers, and external vendors.

·  Excellent skills in business analysis, BAM, OO analysis, requirements analysis, business modeling, and use case development using UML methodology. Support Analytics and BI Technology (ABIT) team deliverables.

·  Experience customizing portal sites on SharePoint 2007 and 2010, plus Rational RequisitePro, Team Foundation Server, a variety of e-commerce applications, cloud computing, and app deployment. Strong understanding of SOA architecture designs and concepts.

·  "Cradle to grave" experience with large retail payer system implementations: analyzing, interpreting, and configuring business requirements based on the client's design documents. Strong experience in retail claims processing.

SKILLS:

 

TECHNICAL SKILLS:

·  Big Data concepts and architecture; Hadoop (HDFS); writing MapReduce jobs and algorithms; tools including Pig, Hive, HBase, Impala, NoSQL, Cassandra, IBM Data Explorer, ZooKeeper, Sqoop, Mahout, Pentaho, Vertica, Informatica, Talend, and Teradata Aster.

·  Hadoop cluster administration, configuration, monitoring, debugging, and performance tuning. Able to implement Hadoop-based solutions and offer best practices in the Big Data space.

·  Hadoop ecosystem: setting up clusters and nodes; end-to-end maintenance and tuning of cluster nodes; troubleshooting technical issues and offering solutions.

·  Demonstrated ability to influence and consult (providing options with pros, cons, and risks) while providing thought leadership to sponsors/stakeholders in solving business-process and technical problems associated with the day-to-day running of project tasks.

·  Strong understanding of the Hadoop ecosystem and of Business Intelligence and ETL tools that sit on top of Hadoop, including Vertica, Pentaho, Sqoop, Oozie, Flume, HBase, Tableau, Teradata, DataStax, Datameer, and Mahout (machine learning); web analytics (Omniture); MPP (Massively Parallel Processing) in Teradata.

·   Ability to write MapReduce Programs and create Business Intelligence reports from the output.

·  Daily support of several Hadoop and data warehouse appliances, including monitoring capacity, throughput, health, and usage, plus clickstream analysis of web logs.

·   Collaborate with various application development teams to design solutions for multi-tenant platforms

·  Collaboration with vendors and users to coordinate and accomplish repairs, upgrades, patches, and other enhancements, additions, or replacements.

·  Query analysis and tuning advice for end users, to maintain throughput and reliable operation.

·  Scripting to deploy monitors, checks, and other sys admin function automation.

·   Production Support for any problem leading to acceptable resolution, including daytime, nighttime, and weekend support if required.

·  Performs incident resolution, problem determination, and root cause analysis; familiar with hardware and software diagnostic tools for monitoring performance and performing problem determination.

·  Oversee installations and monitor and manage change to servers (overall server change management). Oversee implementation of security guidelines to prevent unauthorized access to servers, and report any violations.

·  Ability to Monitor and tune operating systems to achieve optimum performance levels in standalone and multi-tiered environments. 

·  Collaborate with Systems Engineering and Network Engineering; solve complex and recurring operational issues and develop corrective actions as needed.

·  Interact regularly with Metrics team, developers, engineers, and the IT outsourcer to ensure the Company’s Reliability, Availability and Serviceability (RAS) metrics are sustained and improved from current level.

·  Develop and direct enhancement of application monitoring, reporting, error handling and recovery to ensure customer satisfaction, improved operational efficiency, and improved employee technical knowledge and training.

·  Participate in the evaluation, recommendation, and selection of hardware and software solutions. Review, evaluate, design, implement, and maintain internal and external data.

·  Identify data sources, constructs data decomposition diagrams, provides data flow diagrams and documents the process. 

·  Writes code for database access, modification, and construction, including MapReduce programs, Pig/Hive scripts, SQL-H, stored procedures, etc.

·  Developed and reviewed project plans; identified and resolved issues; communicated the status of assigned projects to users and managers.

·  Experience in operational support and hands-on implementation of Hadoop-based Big Data platforms. Gathers requirements, builds logical models, and provides quality documentation of detailed user requirements for the design and development of systems.

 

         Databases: HBase, NoSQL, SQL Server, Oracle, Impala, MongoDB, Cassandra.

 

BUSINESS SKILLS:

·  People skills: networking with people, connecting the dots, and getting things done; something of a jack of all trades.

·  Facilitate JAD sessions and PowerPoint presentations of new products and services to a wide spectrum of audiences, clients, and business users.

·  Test cases; Vision, Scope, and SRS documents; UAT sessions with stakeholders.

·  Gap analysis, impact analysis, SWOT analysis, and feasibility analysis; product marketing and sales.

·  Focus on the end game, prioritize requirements, and achieve goals within timelines and budget.

·  Business process analysis and research using iRise software and Enterprise Architect.

·  Use case modeling and analysis; troubleshooting software bugs; defect management.

·  Functional Requirement Gathering & Technical Requirements Development.

·  Prototyping / Wireframes & Mockup Screen Creations

 

               BUSINESS DOCUMENTATION:

·  SharePoint, Vision documentation, HyperOffice, Alfresco, Epiware, Team Foundation Server.

·  Documentation using reporting tools such as Ab Initio, Crystal Reports, and Business Objects.

·  BPMN Diagrams, Business Requirements Documents & Functional Requirement Docs.

·  Storyboards, scenarios, personas; test plans, test scripts, and test cases; prototypes.

EDUCATION:

·  Diploma in Computer Applications, Jan 2004

·  Master of Business Administration (MBA), July 1994

·  Bachelor of Sciences (BS), Aug 1992

·         Certified “Cloud Computing Consultant” RACKSPACE/SALESFORCE.COM/IBM

·        HADOOP Admin Certification. (Big Data)

·  Member, AHIMA (American Health Information Management Association)

·  Member, PMI (Project Management Institute); Membership ID #2504042

·  Member, IIBA (International Institute of Business Analysis)

·         Currently Working on PMP (Project Management Professional)

PROFESSIONAL EXPERIENCE:                                                                

        Walgreens, H.Q. (TCS)                                                                                  April 2014 – Current

        Deerfield, IL

        Big Data Engineer

·  Jr. Architect role

 

o          Setting up the Hadoop cluster from the ground up and establishing end-to-end visibility of the data flow. Provide technical direction in a team that designs and develops path-breaking, large-scale cluster data-processing systems.

o          Take ownership of the Big Data strategy, draw the roadmap, design new data pipelines from the legacy systems, recommend best practices, and lay down the blueprints for the project deliverables.

o          Help establish thought leadership in the big data space by contributing internal papers, technical details/ recommendations and best practices to stakeholders.

o          Testing, fine-tuning, and diagnosis of clusters; applying fixes; configuring; benchmarking; capacity planning; disaster/failure recovery automation; detection and repair of data corruption; optimizing the cluster for better performance. Interact with the vendor (Cloudera) on any technical issues.

o          Maintain the cluster with detailed information to support the sales teams; identify trends, forecast from reports, understand and highlight anomalies, and improve performance within each sales division; comfortable working with both technical and non-technical groups.

o          Hadoop production support, change management, maintenance, capacity planning, compression techniques, and performance component verification. Plan production cut-over/deployment, recommend industry best practices, execute the project end to end from conception to final output, and resolve technical issues encountered during the production phase.

o          Gathered requirements, built logical models, and provided documentation; benchmarked systems, analyzed system bottlenecks, and proposed solutions to eliminate them; interacted with the vendor to raise tech-support tickets to resolve issues. Subdivided complex applications during the design phase into smaller, more manageable pieces.

 

 

 

UPS (United Parcel Service) Supply Chain Solutions                                   July 2013 – March 2014

(World H.Q.)

Atlanta, GA.

 

BIG DATA ENGINEER /Hadoop Admin- Developer (Cluster setup & Hadoop Map Reduce Team)

              Use Cases: Fraud Detection, Efficient Routing, On-Time Deliveries of Shipments.

·  Big Data Analysis & Optimization / Architecture

(HADOOP-Proof of Concepts /ORION Big Data Project Implementation)

·  CRM –Third Party Software Integrations

 

Responsibilities: Map Reduce Jobs & Hadoop Cluster Maintenance

    BIG Data – JAVA/ Hadoop – ORION Software Development Project

·  Project name: ORION (On-Road Integrated Optimization and Navigation). Created actionable insights from unstructured logistics telematics data, crunching big data on package information and user preferences to build efficient routing for drivers, leading to savings of around $50 million a year at one mile saved per day for every UPS driver.

·  Playing a hands-on hybrid Hadoop Admin-cum-Developer role: pulling large data sets into HDFS per the use case, writing Java-based algorithms/MapReduce jobs to run in the MapReduce framework, and analyzing the emitted output to create statistical and graphical visualization reports for business users with BI tools such as Tableau and QlikView.

·  Hands-on experience creating MapReduce jobs, working in the development environment, troubleshooting, and analyzing end results to create actionable insights and graphical dashboards that feed Business Intelligence reporting and DSS (Decision Support Systems), using third-party tools such as Tableau, Pentaho, IBM BigInsights, IBM Data Explorer, Teradata Aster, and Splunk.

·  Strong understanding of Hadoop, its ecosystem and architecture, and the associated sub-projects that sit on top of Hadoop, such as Hive, Pig, HBase, and Sqoop.
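The hybrid admin/developer bullets above describe the classic Hadoop pattern: pull data into HDFS, run a map phase, let the framework shuffle by key, then reduce. A minimal sketch of that map → shuffle → reduce flow in plain Python (the ORION jobs themselves were written in Java against the Hadoop API; the record fields and route names below are purely illustrative):

```python
from collections import defaultdict

# Illustrative records standing in for data pulled into HDFS
# (hypothetical fields: route_id, package_id).
records = [
    ("ATL-01", "pkg1"), ("ATL-01", "pkg2"),
    ("BOS-07", "pkg3"), ("ATL-01", "pkg4"), ("BOS-07", "pkg5"),
]

def mapper(record):
    # Emit (key, 1) for each package on a route, like a Hadoop Mapper.
    route_id, _pkg = record
    yield route_id, 1

def shuffle(pairs):
    # Group values by key, as the MapReduce framework does between phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reducer(key, values):
    # Sum the counts for one key, like a Hadoop Reducer.
    return key, sum(values)

mapped = [pair for rec in records for pair in mapper(rec)]
counts = dict(reducer(k, v) for k, v in shuffle(mapped).items())
print(counts)  # {'ATL-01': 3, 'BOS-07': 2}
```

The real jobs ran this same three-step shape distributed across a cluster, with the output feeding BI tools such as Tableau.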

·  ADMIN /Jr Architect  Role

 

o          Testing, fine-tuning, and diagnosis of clusters; applying fixes; configuring; benchmarking; capacity planning; disaster/failure recovery automation; detection and repair of data corruption; optimizing the cluster for better performance. Interact with the vendor (Cloudera) on any technical issues.

o          Maintain the cluster with detailed information to support the sales teams; identify trends, forecast from reports, understand and highlight anomalies, and improve performance within each sales division; comfortable working with both technical and non-technical groups.

o          Hadoop production support, change management, maintenance, capacity planning, compression techniques, and performance component verification. Plan production cut-over/deployment, recommend industry best practices, execute the project end to end from conception to final output, and resolve technical issues encountered during the production phase.

o          Gathered requirements, built logical models, and provided documentation; benchmarked systems, analyzed system bottlenecks, and proposed solutions to eliminate them; interacted with the vendor to raise tech-support tickets to resolve issues. Subdivided complex applications during the design phase into smaller, more manageable pieces.

o          Communicate the concepts to back-end developers and explain the dependencies.

·  Working directly with UPS clients to map out their existing business processes and provide system-based predictive-analytics solutions that increase efficiency and reduce operating costs; setting up automation in their newly planned systems and integrating with the UPS IT environment using Big Data solutions for increased productivity and customer satisfaction and reduced customer churn.

·  ETL jobs: performed ETL on structured (transaction), semi-structured (user behavior), and unstructured (text) data, developing algorithms and systems before ingesting the data into HDFS using state-of-the-art open-source platforms such as Talend, Pentaho, Splunk, Hive, and Pig.

o   Mahout: restructured a Random Forest algorithm to obtain 95% prediction accuracy on a transactional EDI-856 Advance Ship Notice dataset (using Mahout).

o   Deployed multi-node Cloudera Distribution Hadoop clusters (60 nodes, versions 1.x and 2.x) to prototype solutions using Mahout (0.7, 0.8), building predictive models from millions of retail EDI-856 transactions in the UPS database. This helped reduce product recalls and shipment-specification issues for UPS's pharma clients and enhanced customer satisfaction.

o   Classification of 1 GB of data on a local cluster of 450+ cores at scale using Hadoop MapReduce. Involved in ETL processing, then integrating and transforming data and content to deliver authoritative, consistent predictive analytics to top management on a regular basis throughout the life cycle.

o   Engineered a platform for high-volume data analytics deployed in a cloud environment; accelerated processing/communication time using memory optimization and faster communication links.

o   Design BI dashboards, scorecards, charts/graphs, drill-downs, and dynamic reports to meet the needs of the top management and decision makers.

·  Cassandra: used the DataStax distribution of Cassandra (peer-to-peer) to handle a real-time operational data store for online transactional applications and a read-intensive database for large-scale Business Intelligence (BI) systems; created graphical BI dashboards from ad-hoc query output for top management.

·  Storm: expert in using Storm for real-time processing and predictive analytics, ingesting telemetric/satellite data and triggering operational alerting systems and scheduled announcement systems to drivers on the road in real time.

·  Clickstream analysis of web logs to create actionable and meaningful insights.

·  Web analytics: measuring and collecting off-site and on-site web logs to analyze and report internet data for the purposes of understanding and optimizing web usage, enhancing KPIs, and improving the customer web-browsing experience.

·  Strong understanding of web analytics tools such as Omniture and WebTrends.

·  Cradle-to-grave understanding of the Hadoop ecosystem (HDFS/MapReduce), Java-related projects, and other Hadoop-related projects such as Pig, Hive, NoSQL, ZooKeeper, Sqoop, Mahout, and Cassandra. Expert in Massively Parallel Processing (MPP) architecture in Teradata.

·  Hadoop ecosystem: setting up clusters and multi-node environments, maintenance, troubleshooting, and tuning of the clusters. Involved in integrating Hadoop into existing technology stacks and software portfolios to achieve maximum business value.

·  Ability to design solutions independently based on high level architecture

·  Implemented Hadoop based solutions and developed governance strategy and provided architectural recommendations on integration standards.

·  Architected and designed solutions for the business to deliver business value.

·  Estimated workload profiles (analytical processing, data processing, ad-hoc processing, etc.); ETL using tools such as Pentaho, HP Vertica, Informatica, Hive, and Pig.

·  Determine Workload Types, Data Landings, Estimate amount of data/ intervals, Determine data retention periods, any transformations, types/number of integrations, Plan compression levels.

·  Estimate directionality of integration, Describe use cases; Determine SLAs, SLA uptime guidelines, Availability Guidelines, performance guidelines; Cluster Configurations/refinements; Set initial configuration changes, OS and cluster type configs

·  Determined and implemented security policies, i.e. user-ids, schemas, etc.

·  Strong requirements gathering via JAD sessions and user interviews to seek clarity and avoid ambiguity; prepared functional documents such as BRDs, use cases, and Software Requirements Specifications (SRS); set up design sessions with back-end developers to keep all team members in sync with business expectations and engage all stakeholders throughout the project life cycle.
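The Storm bullet above describes real-time alerting on ingested telemetry. As a rough illustration of the underlying idea (a rolling window over a stream, alerting when a threshold is crossed), here is a plain-Python simulation; the stream values, window size, and limit are hypothetical, not taken from the UPS system:

```python
from collections import deque

# Simulated telemetry stream (e.g., vehicle speeds in mph).
stream = [55, 57, 60, 72, 75, 74, 58, 56]
WINDOW, LIMIT = 3, 70  # illustrative window size and alert threshold

alerts = []
window = deque(maxlen=WINDOW)
for reading in stream:
    window.append(reading)
    # Alert when the rolling average of the last WINDOW readings exceeds LIMIT,
    # analogous to a Storm bolt firing into an operational alerting system.
    if len(window) == WINDOW and sum(window) / WINDOW > LIMIT:
        alerts.append(round(sum(window) / WINDOW, 1))

print(alerts)  # [73.7]
```

In the production topology this logic would run continuously over an unbounded stream, with the alert step wired to the driver announcement system rather than a list.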

 

Environment: Hadoop ecosystem, Java, .NET, Agile, MS Office, cloud computing.

 

The Home Depot (World H.Q.)                                                                      June 2012 – June 2013

             Atlanta, GA.

             Big Data Consultant / Hadoop Engineer

 

             Role & Responsibilities: Admin-cum-Developer.

            HADOOP Cluster Implementation Strategy/Big Data /WMOS Solutions /SOA Architecture

             Software Enhancements/Business Intelligence/ Data Migration/SHAREPOINT.

 

Big Data Consultant for Big Data downstream projects.

·  Experience deploying best practices and methodologies to define Hadoop (Cloudera) infrastructure to roll out releases into production.

·  Clickstream analysis of web logs to perform basket analysis and create meaningful, actionable insights such as consumer buying patterns and predictive analytics to prevent customer churn and pre-empt competitors by bringing the most desired items to store shelves.

·  Targeted marketing: the Hadoop framework was deployed to increase sales volume and conversion rates, reduce stock-outs and lead times, and compete more effectively with alternative web-based e-commerce options like HD.com.

·  Capacity forecasting: Hadoop data output provided an updated view of order inventory, enabling real-time pricing tools that incorporate projections and actual behavior to maximize high-fixed-cost, low-variable-cost inventory; forecasting and the demand-supply cycle helped save 28% of revenue and maintain a healthy growth trajectory.

·  Hands-on experience with MapReduce jobs on Hadoop-based distributed systems (e.g., MapReduce, Hive, HBase, Pig, Flume), using Java programs extensively.

·  Responsible for writing MapReduce programs; imported and exported data between HDFS and other RDBMSs using Sqoop/Hive; involved in loading data from the UNIX file system to HDFS.

·  Expert-level experience architecting, building, maintaining, and performance-tuning an enterprise-grade commercial Hadoop distribution (Cloudera CDH).

·  Experience suggesting industry best practices towards development, testing, implementation and post production support

·  Work with large data sets, automate data extraction, build monitoring/reporting  and high-value, automated clickstream Analysis and offering Business Intelligence solutions.

·  Build monitoring solution(s) for the Big Data infrastructure to understand the health of the infrastructure.

·  Scale the current infrastructure to handle greater levels of processing capability, accessibility, and reliability using data storage methods such as sharding, partitioning and data modelling.

·  Provide thought leadership, strategy and lead innovation by exploring, investigating, recommending, benchmarking and implementing data centric technologies for the platform

·  Develop data architectural strategies at the modeling, design and implementation stages to address product requirements

·  Setting up Hadoop clusters and HDFS/MapReduce jobs. Skilled in administering, installing, configuring, troubleshooting, securing, backing up, performance-monitoring, and fine-tuning Hadoop clusters. Experience using Sqoop, ZooKeeper, and Cloudera Manager. Good knowledge of Hadoop cluster architecture, cluster monitoring, and huge data set integration.

·  Hadoop MapReduce programs enabled better understanding of customer basket size and structure, real-time access to inventory levels, and insight into trade and promotion effectiveness, helping refine future advertising campaigns and align inventory levels by location. They also provided an updated view of order inventory, enabling real-time pricing tools that incorporate projections and actual behavior to maximize high-fixed-cost, low-variable-cost inventory.

·  The output from Hadoop jobs helped adjust content to each user to attract and retain customers, thereby improving sales/usage volume and stopping customer churn.

·  Conducting JAD sessions and stand-up meetings; setting up workshops for clients and for technical and business stakeholders; acting as a liaison between business stakeholders and the technical team.

 

Tools: Pentaho, Teradata Aster, Vertica, Splunk, Talend, Tableau.
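The clickstream/basket-analysis bullets above boil down to counting how often items co-occur across customer baskets. A small Python sketch of that counting step (the baskets and item names are invented for illustration; the production pipeline used the Hadoop tools listed above):

```python
from collections import Counter
from itertools import combinations

# Hypothetical baskets recovered from clickstream/transaction logs.
baskets = [
    {"drill", "screws", "anchors"},
    {"drill", "screws"},
    {"paint", "brush"},
    {"drill", "anchors"},
]

# Count every unordered pair of items that appears together in a basket;
# high-count pairs drive "frequently bought together" merchandising insights.
pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

print(pair_counts[("anchors", "drill")])  # 2
print(pair_counts[("drill", "screws")])   # 2
```

At retail scale the same pair-counting maps naturally onto a MapReduce job: the mapper emits pairs per basket, and the reducer sums the counts.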

 

CVS Caremark / Tata Consultancy Services                                             May 2011 – May 2012

             Cumberland, RI.

                                                                                                            

            BIG Data Analyst /Hadoop Consultant 

 

  Project:

Big Data / Hadoop POC (Proof of Concept): data analytics and third-party data integration.

              Use cases: fraud detection, preventing customer churn, patient clinical-notes integration.

   Role & Responsibilities:

·  Started my Hadoop journey here, setting up a new POC on Amazon EMR; after the successful POC, we collaborated with Cloudera distribution technicians to set up, configure, and test a multi-node cluster for development and production.

·  Worked on a new customer data management solution that provides a consolidated view of customers, products, related organizations, and orders. Data was exported from MS SQL Server, and other application data was ingested into Hadoop for further predictive analysis using tools such as Tableau.

·  The data is fed into ETL and processed using Hive to de-normalize and aggregate the disparate data sources. Customer profiles are categorized and product profiles are built using Pig. The processed data is then moved into Hive for real-time access via a REST-based API.

·  Admin expert in Hadoop cluster maintenance, monitoring, and tuning, and in creating BI reports using tools such as Tableau and MicroStrategy.

·  Exported data from SQL Server to HDFS using Sqoop and NFS mounts, and created graphical analytical reports using IBM BigInsights.

·  Importing and exporting data between HDFS and RDBMS/Hive using Sqoop.

·  Wrote Hive and Impala queries for data analysis to meet the business requirements.

·  Used high-level data-flow Pig Latin scripts to process the datasets using business-logic algorithms.

·  Involved in creating Hive tables (internal and external), loading them with data, and writing Hive queries that run internally through the MapReduce framework.

·  Load and transform large sets of structured, semi structured and unstructured data.

·  Responsible for managing data coming from different sources; gained good experience with NoSQL/HBase databases.

·  Involved in loading data from UNIX file system to HDFS. Scheduled Hadoop jobs using Oozie.

·  Cluster coordination services through ZooKeeper. Exported the analyzed data to relational databases using Sqoop for visualization and to generate reports for the BI team.

·  Analyzed large data sets to determine the optimal way to aggregate them, created graphical visualizations in Tableau per users' ad-hoc queries, and deployed them to Tableau Server.

·  Worked on Recommendation Algorithms based on Pearson's Correlation and other distance measures. 

·  Worked on analyzing the Hadoop cluster and different big data analytic tools, including Pig, the HBase database, and Sqoop.

·  Installed and configured Flume, Hive, Pig, Sqoop, and HBase on the Hadoop cluster. Implemented a 30-node Hadoop cluster on Amazon EMR and then on the Cloudera commercial distribution.

·  Worked on cluster installation, commissioning and decommissioning of data nodes, name node recovery, capacity planning, and slots configuration. Set up a Hadoop cluster on Amazon EC2 using Whirr for a POC.

·  Resource management of the Hadoop cluster, including adding and removing cluster nodes for maintenance and capacity needs. Created HBase tables to store variable data formats of PII data coming from different portfolios. Implemented best-income logic using Pig scripts.

·  Responsible for managing data coming from different sources. Installed and configured Hive and wrote Hive UDFs.

·  Experienced in managing and reviewing Hadoop log files.

·  Supported setup of the QA environment and updated configurations for implementing Pig and Sqoop scripts.
 
Environment: Hadoop, HDFS, Hive, Flume, HBase, Sqoop, Pig, Java (JDK 1.6), Eclipse, MySQL, Ubuntu, ZooKeeper, Amazon EC2, Solr.

 

IBM - Express Scripts Inc. / Medco, Minneapolis, MN                                                    April 2010 - April 2011

 

            Business Data Analyst                                                                                      

·  Served as a data analyst performing data integration using TIBCO tools for the newly merged Express Scripts and Medco.

·  Conducted gap, impact, and pricing analysis, and fleshed out high-level requirements from the business users and other stakeholders involved in this initiative.

 

Hewlett-Packard, Windsor, CT                                                                   Jan 2010 - March 2010

           Business Data Analyst        

·  Provided production support, testing, and analysis of defects related to transaction sets 837/835/277CA, using the HP Quality Center Requirements and Defects modules and Edifecs; worked collaboratively with the technical team to fix defects and find appropriate solutions to the satisfaction of the business, trading partners, and end users.

·  Coordinated the upgrade of X12 transaction code sets 277, 837P, 835, and 834 to HIPAA compliance.

·  Used iRise software extensively to prototype projects for end users and business users.

 

Motorola Mobility - Horsham, PA                                                                        Jan 2006 - Dec 2009

Data Integration Analyst/CRM Admin                                            

         

      Roles & responsibilities:

        •    Performed data integration using TIBCO and third-party software integrations. Created dashboards and scheduled automatic refresh and email delivery.

·  Responsible for all dashboards, metrics, and analytics for global operations.

·  Analyzed the needs of 1000+ users and updated requirements. Identified risks and involved management in decision making.

 

Frontier Communications, Virginia                                                                 July 2003 - Dec 2005

             Business Systems Analyst

           PROJECT: Customer Churn.

Responsible for creating and reviewing business requirements, functional specifications, project schedules, technical documentation and test plans.

 

FOREIGN LANGUAGES:

          Telugu, Hindi, Urdu; basic proficiency in German and Spanish

 

PROFESSIONAL AFFILIATIONS & MEMBERSHIPS:

  •          Rackspace Certified “Cloud Solutions Consultant”
  •          Hadoop Admin Certification
  •          AHIMA Member (American Health Information Management Association)
  •          IIBA Member (International Institute of Business Analysis)
  •          PMI Member # 2504042 (Project Management Institute)
  •          HL7 Member (Health Level Seven International)

 

VISA STATUS :

                   I am a US citizen (naturalized); no sponsorship needed.

     Willing to relocate; up to 100% travel acceptable.

                  LinkedIn Account:      http://www.linkedin.com/in/narenkanuganti/



Additional Info


 

Current Career Level:

Experienced (Non-Manager)

Years of relevant work experience:

7+ to 10 Years

Date of Availability:

Immediately

Work Status:

US - I am authorized to work in this country for any employer.

Active Security Clearance:

None

US Military Service:

Citizenship:

US citizen

 

 

Target Company:

Company Size:

 

Target Locations:

Selected Locations:

US-GA-Atlanta North

Relocate:

Yes

Willingness to travel:

Up to 100%

 

Languages:

    English  - Fluent
    Hindi    - Fluent
    Spanish  - Beginner
    Telugu   - Advanced
    Urdu     - Fluent