From: route@monster.com

Sent: Monday, September 28, 2015 1:03 PM

To: hg@apeironinc.com

Subject: Please review this candidate for: Talend

 

This resume has been forwarded to you at the request of Monster User xapeix03

Naresh N 

Last updated: 09/28/15

Job Title: Not specified

Company: Not specified

Rating: Not Rated

Screening score: Not specified

Status:  Resume Received


Fairfield, CT  06824
US

Home: (512) 535-2697
hadoopte@gmail.com
Contact Preference: Telephone


 

 

RESUME

  

Resume Headline: Naresh N - Big Data/Hadoop Developer

Resume Value: a62avh7cqdr69m6z   

  

 


SUMMARY:

·               Over 7 years of industry experience in application development and maintenance, data management, statistical programming, data analysis, and data visualization.

·               Hands-on experience with major components of the Hadoop ecosystem: Hive, HDP, HBase, Pig, Sqoop, Oozie, ZooKeeper, and MapReduce.

·               Good understanding of Hadoop architecture and the components of a Hadoop cluster, including the JobTracker, TaskTracker, NameNode, and DataNodes.

·               Experience tuning Hive queries to improve performance.

·               Developed scripts and batch jobs to schedule various Hadoop programs.

·               Experience in automating Hadoop installation and configuration and in maintaining the cluster using tools like Puppet and Chef.

·               Developed Java MapReduce programs for the analysis of sample log files stored in the cluster.

·               Experience in implementing join operations using Pig Latin.

·               Experienced in processing different file formats such as Avro, XML, JSON, and SequenceFile using MapReduce programs.

·               Knowledge of NoSQL databases such as HBase and Cassandra.

·               Worked on importing and exporting data between HDFS/Hive and databases such as Oracle and Teradata using Sqoop.

·               Hands-on experience with Amazon Web Services, including EC2, S3, EMR, RDS, and DynamoDB.

·               Extensive experience in SQL and NoSQL development.

·               Extensive experience in all phases of the software development lifecycle (SDLC).

·               In-depth understanding of data structures and algorithms.

·               Experience in deploying applications to heterogeneous application servers such as Tomcat, WebLogic, and Oracle Application Server.

·               Worked in multi-clustered environments and set up the Cloudera Hadoop ecosystem.

·               Background with traditional databases such as Oracle, Teradata, Netezza, and SQL Server, as well as ETL tools/processes and data warehousing architectures.

·               Experience in Object-Oriented Analysis and Design (OOAD) and software development using UML methodology; good knowledge of J2EE and core Java design patterns.

·               Experienced in preparing and executing unit test plans and unit test cases using JUnit and MRUnit (a sketch follows this summary).

·               Experience with build tools like Maven and Ant.

·               Worked on web languages such as HTML, CSS, PHP, and XML, and on web methodologies including Web Services and SOAP.

·               Worked with version control tools like CVS, SVN, and Git.

·               Knowledge of creating PL/SQL stored procedures, packages, functions, and cursors.

·               Experience with Scrum, Agile, and Waterfall models.

·               Worked on operating systems including Unix/Linux and Windows XP.

·               Goal-oriented self-starter, quick learner, and team player, proficient in handling multiple projects.
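
A minimal, illustrative sketch of the kind of Java MapReduce log analysis and MRUnit unit testing mentioned in this summary; the mapper, class names, and log-line format are assumptions, not taken from an actual project:

    import java.io.IOException;

    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mrunit.mapreduce.MapDriver;
    import org.junit.Test;

    public class LogLevelMapperTest {

        // Emits (logLevel, 1) per line; assumes lines look like "timestamp LEVEL message".
        public static class LogLevelMapper
                extends Mapper<LongWritable, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);

            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                String[] fields = value.toString().split("\\s+", 3);
                if (fields.length >= 2) {
                    context.write(new Text(fields[1]), ONE);
                }
            }
        }

        @Test
        public void countsOneErrorLine() throws IOException {
            // MRUnit drives the mapper in isolation, without a cluster.
            MapDriver.newMapDriver(new LogLevelMapper())
                    .withInput(new LongWritable(0), new Text("2015-09-28 ERROR disk full"))
                    .withOutput(new Text("ERROR"), new IntWritable(1))
                    .runTest();
        }
    }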

 

 

SKILLS:

·               Hadoop, MapReduce, Sqoop, Hive, HDP, Pig, Flume, Oozie

·               Storm, ZooKeeper, MapReduce, HDFS, Splunk

·               Shell, Python, Avro

·               AIX 5.1, Red Hat Linux, CentOS, Windows

·               HBase, Cassandra

·               Apache Contributor

·               AWS (Amazon Web Services): EMR, RDS, EC2, S3

·               Puppet and Chef configuration management utilities

·               JIRA, SDLC, MongoDB

·               Cloudera, Hortonworks

·               Data Warehouse / Business Intelligence (BI)

·               DataStage, Talend Open Studio

·               IBM DB2, Teradata, MySQL, NoSQL

·               Data Pipeline and Redshift

·               ETL tools (Informatica)

·               Java, J2EE, Spring, Hibernate

·               MySQL, Oracle 10g, MS SQL Server

·               HTML, CSS, JavaScript, Ajax

·               Agile, Scrum, Waterfall

 

 

PROFESSIONAL EXPERIENCE:

 

General Electric, Fairfield, CT                                                                                Oct 2014 - Present

Big Data/Hadoop Developer

 

Responsibilities:

·           Working on Big Data integration and analytics based on Hadoop and Web method technologies.

·           Implementing Hadoop on the Amazon cloud (AWS) EC2 service, creating and managing a few instances to gather and analyze data log files.

·           Handling imports of data from various sources, such as relational database systems, into HDFS using Sqoop.

·           Involved in scheduling the Oozie workflow engine to run multiple Hive and Pig jobs.

·           Developed Unix shell scripts to automate Sqoop ingestion from various sources, generated common templates for Sqoop data transfers and file movement, and created validation scripts.

·           Working on Unix shell scripts for business processes and for loading data from different interfaces to HDFS.

·           Wrote a Sqoop incremental import job to move new and updated data from the database to HDFS.

·           Experience in joining raw data with reference data using Pig scripting.

·           Analyzed the data by performing data profiling.

·           Created an Oozie coordinator workflow to execute the Sqoop incremental import job daily.

·           Optimized MapReduce jobs to use HDFS efficiently through various compression mechanisms (see the sketch after this list).

·           Working at GE on loading files from MongoDB into Hive and HDFS.

·           Designed techniques and wrote Java programs and Linux shell scripts, using various data-parsing techniques in addition to MapReduce jobs, to migrate large volumes of text and byte data to NoSQL stores.

·           Evaluated the Puppet framework and tools to automate cloud deployment and operations.

·           Developed use cases and technical prototypes for implementing Pig, HDP, Hive, and HBase.

·           Working on configuring and maintaining the Hadoop environment on AWS.

·           Modifying Chef recipes used to configure the Hadoop stack.

·           Delivered working widget software for big data analytics using Ext JS 4, HTML5, RESTful web services, JSON Store, Linux, Hadoop, ZooKeeper, NoSQL databases, Java, Spring Security, and JBoss Application Server.
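
A minimal job-driver sketch of one common way to enable the MapReduce compression referenced above, assuming Snappy is available on the cluster; the class name and path handling are illustrative:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.compress.CompressionCodec;
    import org.apache.hadoop.io.compress.SnappyCodec;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class CompressedJobDriver {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Compress intermediate map output to cut shuffle I/O.
            conf.setBoolean("mapreduce.map.output.compress", true);
            conf.setClass("mapreduce.map.output.compress.codec",
                    SnappyCodec.class, CompressionCodec.class);

            Job job = Job.getInstance(conf, "compressed-log-analysis");
            job.setJarByClass(CompressedJobDriver.class);
            // Real mapper/reducer classes would be set here.
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));

            // Compress the final output stored in HDFS.
            FileOutputFormat.setCompressOutput(job, true);
            FileOutputFormat.setOutputCompressorClass(job, SnappyCodec.class);
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }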

 

 

 

Toyota, Saline, Michigan                                                                                              Mar 2013 - Oct 2014

Big Data/Hadoop Developer

 

Responsibilities:

·           Experience in installation, configuration, and management of Hadoop clusters using Cloudera Manager.

·           Deployed and managed multi-node Hadoop clusters with different Hadoop components using Cloudera Manager and Hortonworks Ambari.

·           Analyzed clients’ existing Hadoop infrastructure to understand performance bottlenecks and provided performance tuning accordingly.

·           Involved in collecting and aggregating large amounts of streaming data into HDFS using Flume, and defined channel selectors to multiplex data into different sinks.

·           Installed and configured MapReduce, Hive, and HDFS; implemented a CDH3 Hadoop cluster on CentOS; assisted with performance tuning and monitoring.

·           Created Cassandra tables using CQL to load large sets of structured, semi-structured, and unstructured data coming from UNIX, NoSQL, and a variety of portfolios.

·           Experience in writing custom UDFs for Hive (UDF, UDAF, UDTF) and Pig (Eval, Filter, Load, Store) in Java.

·           Created a POC to store server log data in Cassandra to identify system alert metrics.

·           Involved in writing Flume and Hive scripts to extract, transform, and load the data into the database.

·           Implemented complex MapReduce programs to perform map-side joins using the distributed cache (a sketch follows this list).

·           Thoroughly tested MapReduce programs using the MRUnit and JUnit testing frameworks.

·           Assisted with data capacity planning and node forecasting.

·           Used Pig as an ETL tool to perform transformations, event joins, bot-traffic filtering, and some pre-aggregations before storing the data in HDFS.

·           Effectively used Oozie to develop automated workflows of Sqoop, MapReduce, and Hive jobs.
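
A sketch of the map-side join technique referenced above: the small reference file is shipped through the distributed cache and loaded into memory in setup(), so the join needs no shuffle. The file name, field layout, and driver call are illustrative assumptions:

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;
    import java.util.HashMap;
    import java.util.Map;

    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    public class MapSideJoinMapper extends Mapper<LongWritable, Text, Text, Text> {
        private final Map<String, String> refTable = new HashMap<>();

        @Override
        protected void setup(Context context) throws IOException, InterruptedException {
            // The driver is assumed to have called
            //   job.addCacheFile(new URI("/ref/customers.txt#customers.txt"));
            // the '#' fragment symlinks the file into the task's working directory.
            try (BufferedReader in = new BufferedReader(new FileReader("customers.txt"))) {
                String line;
                while ((line = in.readLine()) != null) {
                    String[] parts = line.split("\t", 2);   // custId \t custName
                    if (parts.length == 2) {
                        refTable.put(parts[0], parts[1]);
                    }
                }
            }
        }

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split("\t"); // custId \t orderAmount
            String name = refTable.get(fields[0]);
            if (name != null) {
                // Join performed entirely on the map side; no reducer needed.
                context.write(new Text(fields[0]), new Text(name + "\t" + fields[1]));
            }
        }
    }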

 

 

FedEx, Memphis                                                                                            Dec 2011 - Mar 2013

Big Data/Hadoop Developer

 

Responsibilities:

·           Installed and configured Hadoop in the cloud through Amazon Web Services.

·           Developed MapReduce jobs in Java for data cleaning and preprocessing.

·           Wrote a Sqoop incremental import job to move new and updated data from the database to HDFS.

·           Imported and exported data into HDFS and Hive using Sqoop.

·           Used Bash shell scripting, Sqoop, Avro, Hive, HDP, Redshift, Pig, and Java MapReduce daily to develop ETL, batch processing, and data storage functionality.

·           Responsible for developing a data pipeline using Flume, Sqoop, and Pig to extract data from weblogs and store it in HDFS.

·           Worked on NoSQL databases including HBase and MongoDB.

·           Worked on designing and developing ETL workflows in Java for processing data in HDFS/HBase using Oozie.

·           Used the Hadoop MySQL connector to store MapReduce results in an RDBMS (see the sketch after this list).

·           Worked on Hadoop installation and configuration of multiple nodes on AWS EC2.

·           Developed simple to complex MapReduce jobs using Hive and Pig.

·           Worked on automating the monitoring and optimization of large-volume data transfer processes between Hadoop clusters and AWS.

·           Designed and implemented data processing using AWS Data Pipeline.
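
One way the MySQL connector bullet above can be realized is Hadoop's DBOutputFormat, which writes reduce output straight into a table over JDBC; the table, columns, JDBC URL, and credentials below are placeholders, not values from an actual project:

    import java.io.DataInput;
    import java.io.DataOutput;
    import java.io.IOException;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    import org.apache.hadoop.io.Writable;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.db.DBConfiguration;
    import org.apache.hadoop.mapreduce.lib.db.DBOutputFormat;
    import org.apache.hadoop.mapreduce.lib.db.DBWritable;

    public class MySqlSink {
        // One output row: (word, count); emitted as the reducer's output key.
        public static class WordCountRecord implements Writable, DBWritable {
            private String word;
            private long count;

            public WordCountRecord() { }
            public WordCountRecord(String word, long count) {
                this.word = word;
                this.count = count;
            }

            public void write(PreparedStatement stmt) throws SQLException {
                stmt.setString(1, word);
                stmt.setLong(2, count);
            }
            public void readFields(ResultSet rs) throws SQLException {
                word = rs.getString(1);
                count = rs.getLong(2);
            }
            // Writable methods are unused for an output-only job.
            public void write(DataOutput out) throws IOException { }
            public void readFields(DataInput in) throws IOException { }
        }

        public static void configure(Job job) throws IOException {
            DBConfiguration.configureDB(job.getConfiguration(),
                    "com.mysql.jdbc.Driver",
                    "jdbc:mysql://dbhost:3306/analytics",   // placeholder URL
                    "etl_user", "secret");                  // placeholder credentials
            job.setOutputFormatClass(DBOutputFormat.class);
            DBOutputFormat.setOutput(job, "word_counts", "word", "count");
        }
    }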

 

 

Comcast, Denver, CO                                                                                 Jan 2010 - Dec 2011

JAVA Developer

 

Responsibilities:

·           Designed and developed the application in Java; developed SQL queries and stored procedures for the application.

·           Delivered the project in an Agile model, tracking progress through daily scrums and addressing design defects.

·           Provided a robust, secure, and scalable e-business platform.

·           Worked as a core Java developer with hands-on experience in Struts, Spring, and Hibernate.

·           Used JavaScript and the Struts validation framework for front-end validations.

·           Analyzed system requirements and prepared the system design document.

·           Developed a dynamic user interface with HTML and JavaScript using JSP and Servlet technology.

·           Designed and developed a subsystem in which Java Message Service (JMS) applications communicate with MQ to exchange data between different systems (a sketch follows this list).

·           Performed end-to-end development and testing for change requests.

·           Applied knowledge of the Agile Product Quality Management and Agile Product Collaboration modules to develop major changes in the Agile application.

·           Designed ER diagrams for all the databases using DB Designer, an open-source tool.

·           Designed class diagrams and use case diagrams using an open-source tool.

·           Created and executed test plans using Quality Center (formerly TestDirector).

·           Developed the database schema and SQL queries for querying an Oracle 9i database.

·           Reviewed and edited data forms using Microsoft Excel.

·           Interacted and communicated with key stakeholders to understand business problems and define the analytical approach to resolving them.

·           Involved in all facets of application development, from system design and implementation to maintenance, support, testing, and documentation.

·           Helped other team members resolve technical issues on the application integration and configuration side.
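
A minimal sketch of the JMS point-to-point send described in the MQ bullet above, using the standard JMS 1.1 API; the JNDI names and class name are illustrative assumptions:

    import javax.jms.Connection;
    import javax.jms.ConnectionFactory;
    import javax.jms.MessageProducer;
    import javax.jms.Queue;
    import javax.jms.Session;
    import javax.jms.TextMessage;
    import javax.naming.InitialContext;

    public class OrderEventSender {
        public void send(String payload) throws Exception {
            // JNDI lookups resolve to the MQ provider configured in the app server.
            InitialContext jndi = new InitialContext();
            ConnectionFactory factory =
                    (ConnectionFactory) jndi.lookup("jms/QueueConnectionFactory");
            Queue queue = (Queue) jndi.lookup("jms/OrderQueue");

            Connection connection = factory.createConnection();
            try {
                Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
                MessageProducer producer = session.createProducer(queue);
                TextMessage message = session.createTextMessage(payload);
                producer.send(message);
            } finally {
                connection.close();
            }
        }
    }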

 

 

Standard & Poor’s USPF value Driven Surveillance, Virtusa, Hyderabad, India                                              Apr 2009-Jan 2010

JAVA Developer

 

Responsibilities:

·           Understood the requirements from the detailed Software Requirements Specifications.

·           Developed the presentation layer using JSP, HTML, and CSS, with client-side validations in JavaScript.

·           Involved in designing and developing the e-commerce site using JSP, Servlets, EJBs, JavaScript, and JDBC (a sketch follows this list).

·           Designed and developed the application in Java; developed SQL queries and stored procedures for the application.

·           Provided a robust, secure, and scalable e-business platform.

·           Used Eclipse 6.0 as the IDE for application development.

·           Understood the relationships between entities in the physical data model.

·           Conducted Knowledge Transfer (KT) sessions for new recruits on the business value and technical functionality incorporated in the developed modules.

·           Configured the Struts framework to implement the MVC design pattern.

·           Designed and developed the GUI using JSP, HTML, DHTML, and CSS.

·           Worked with JMS for the messaging interface.

·           Developed the database schema and SQL queries for querying an Oracle 9i database.

·           Reviewed and edited data forms using Microsoft Excel.

·           Served as the SVN administrator and resolved pending SVN issues.

·           Helped other team members resolve technical issues on the application integration and configuration side.
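
An illustrative servlet from the JSP/Servlet stack mentioned above: it reads a request parameter and forwards to a JSP view. The class name, parameter, and JSP path are assumptions:

    import java.io.IOException;

    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class ProductSearchServlet extends HttpServlet {
        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            String query = req.getParameter("q");
            req.setAttribute("query", query);
            // Forward to the JSP that renders the results (the view layer).
            req.getRequestDispatcher("/WEB-INF/jsp/results.jsp").forward(req, resp);
        }
    }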

 

 

Virtusa Ind Pvt. Ltd., Hyderabad, India                                                                                                                   Jul 2008 - Apr 2009

JAVA Developer

 

Responsibilities:

·           Worked on Java Struts 1.0 with an Oracle database to develop an internal application for ticket creation.

·           Worked heavily with Struts tags, using Struts as the front controller of the web application; implemented the Struts framework according to the MVC design pattern (a sketch follows this list).

·           Involved in the analysis, design, implementation, and testing of the project.

·           Implemented the presentation layer with HTML, XHTML, and JavaScript.

·           Created web.xml, struts-config.xml, and validation.xml files to integrate all the components in the Struts framework.

·           Developed web components using JSP, Servlets, and JDBC.

·           Worked on MySQL 5.0 and SQL Developer for fetching and adding data.

·           Designed tables and indexes.

·           Wrote complex SQL and stored procedures.

·           Prepared documentation and user guides identifying the attributes and metrics needed by the business.

·           Involved in fixing bugs and unit testing with test cases using JUnit.

·           Developed user and technical documentation.
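
A sketch of a Struts 1.x Action, the controller piece of the MVC arrangement referenced above; the class name, request parameter, and forward name are illustrative and would have to match entries in struts-config.xml:

    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    import org.apache.struts.action.Action;
    import org.apache.struts.action.ActionForm;
    import org.apache.struts.action.ActionForward;
    import org.apache.struts.action.ActionMapping;

    public class TicketLookupAction extends Action {
        @Override
        public ActionForward execute(ActionMapping mapping, ActionForm form,
                HttpServletRequest request, HttpServletResponse response)
                throws Exception {
            String ticketId = request.getParameter("ticketId");
            request.setAttribute("ticketId", ticketId);
            // "success" must match a <forward> defined in struts-config.xml.
            return mapping.findForward("success");
        }
    }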

 



Experience

Job Title: Big Data/Hadoop Developer

Company: General Electric Company

Experience: - Present

 

Additional Info


 

Current Career Level:

Experienced (Non-Manager)

Work Status:

US - I am authorized to work in this country for my present employer only.

 

 

Target Job:

Target Job Title:

Big Data/Hadoop Developer

 

Target Company:

Company Size:

Occupation:

Quality Assurance/Safety

·         ISO Certification

 

Target Locations:

Selected Locations:

US-CT-Danbury/Bridgeport