From: route@monster.com

Sent: Monday, September 28, 2015 12:59 PM

To: hg@apeironinc.com

Subject: Please review this candidate for: Talend

 

This resume has been forwarded to you at the request of Monster User xapeix03

Chiranjit D. 

Last updated:  08/14/14

Job Title:  Not specified

Company:  Not specified

Rating:  Not Rated

Screening score:  Not specified

Status:  Resume Received


Annapolis, CA  95412
US

trial_version_3@yahoo.com
Contact Preference:  Email


 

 

RESUME

  

Resume Headline: Chiranjit D.

Resume Value: 8k9wc2udehg9ihjv   

  

 

Chiranjit D.

 

·   7+ years of professional experience installing, configuring, and administering Hadoop clusters across major Hadoop distributions.

·   Highly capable of processing large sets of structured, semi-structured and unstructured data and supporting systems application architecture.

·   Extensive skillset includes data ingestion pipeline design, Hadoop information architecture, statistical data analysis, data modeling and data mining, text mining, machine learning and advanced data processing.

 

Professional Highlights

·   Experienced in installing, configuring, and administering Hadoop clusters of major Hadoop distributions.

·   Understanding of statistics and experience applying related algorithms.

·   Knowledge in Data Structures.

·   Experience implementing open-source frameworks such as Struts, Spring, Hibernate, and Web Services.

·   Experience deploying applications on heterogeneous application servers: Tomcat, WebLogic, and Oracle Application Server.

·   Implemented unit testing using JUnit during projects.

·   Excellent knowledge of implementing MapReduce for various large-scale cloud applications using Hadoop and HDFS.

·   Developed code in R and Python to add, modify, or analyze data.

·   Experienced in using RHadoop; also worked with Ruby.

·   Hands-on experience developing MapReduce code in R and Python.

·   Experience writing data access layers using JDBC.

·   Experience installing, configuring, and using ecosystem components such as Hadoop MapReduce, HDFS, HBase, ZooKeeper, Spark, Oozie, Hive, Cassandra, Sqoop, Pig, Flume, Avro, Chukwa, Whirr, MongoDB, Pentaho Kettle, Hortonworks, and Talend.

·   Experience applying OOAD concepts to real-world projects using use cases, sequence diagrams, and class diagrams in UML with Rational Rose.

·   Knowledge of Agile software development using the Scrum process; exposure to tools such as VersionOne and Rally for sprint planning, daily standups (dashboard), and sprint reviews.

·   Experience using GlowCode for performance analysis and detecting memory leaks in deployed code.

·   Experience creating test scripts to implement unit tests and ATPs.

·   Excellent analytical, problem-solving, communication, and interpersonal skills; a strong team player with a can-do attitude who can also work independently and communicate effectively with all levels of the organization, including technical staff, management, and customers.

·   Extensive experience in designing analytical/OLAP and transactional/OLTP databases.

·   Proficient with ERwin for designing backend data models and entity-relationship diagrams (ERDs) for star schemas, snowflake dimensions, and fact tables.

·   Excellent leadership and time-management skills.

·   Excellent communication skills, both written (documentation) and verbal (presentation).

·   Responsible team player who can also work independently with minimal supervision.

 

Technical Skills

Java Technologies: Java 5, Java 6, JAXP, AJAX, I18N, JFC Swing, Log4j, Java Help API

J2EE Technologies: JSP 2.1, Servlets 2.3, JDBC 2.0, JNDI, XML, JAXP, Java Beans

Methodologies: Agile, UML, OOP, Scrum, Design Patterns

Languages: R, MATLAB, Python, Ruby

Frameworks: Struts, Core Spring, Spring DAO, Spring MVC, Hibernate

Big Data Ecosystem: Hadoop, MapReduce, HDFS, HBase, ZooKeeper, Azure, Hive, Pig, Sqoop, Cassandra, Oozie, Flume, Chukwa, Pentaho Kettle, Talend

Databases: Oracle 10g, DB2, MySQL, MS SQL Server 2005, RESTful API, Derby, MS Access

Application Servers: Apache Tomcat 5.x/6.0, JBoss 4.0

Web Tools: HTML, JavaScript, XML, DTD, Schemas, XSL, XSLT, XPath, DOM, XQuery

Tools: SQL Developer, Eclipse, Maven, Ant, JUnit, TestNG, Jenkins, SoapUI, DbVisualizer

IDEs / Testing Tools: NetBeans, Eclipse, WSAD, RAD

Operating Systems: Windows, Linux

 

 

Professional Experience

 

One Gas, Tulsa, OK

Senior Hadoop Developer, Sep 2012 – Present

 

·   Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java, R, and Python for data cleaning and preprocessing.

·   Configured and tested Hadoop in non-distributed mode, as a single Java process, using Hadoop 0.21.0 and Java 1.6.

·   Configured and tested Hadoop in pseudo-distributed mode, where each Hadoop daemon runs in a separate Java process.

·   Worked in a Cloudera environment with desktop virtualization using VMware Player to provision Hadoop jobs on the department cluster.

·   Imported and exported data into HDFS and Hive using Sqoop.

·   Experienced in defining job flows

·   Experienced in managing and reviewing Hadoop log files

·   Supported MapReduce programs running on the cluster.

·   Involved in loading data from UNIX file system to HDFS.

·   Installed and configured Hive and documented Hive UDFs.

·   Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.

·   Performed data analysis using regression models in R and Python.
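
The MapReduce-based cleaning work described above can be illustrated with a small sketch. The following is a minimal, hypothetical Hadoop Streaming-style mapper/reducer pair in Python; the tab-separated field layout (user, item, amount) is an assumption for illustration, not the actual project schema.

```python
def mapper(lines):
    """Map phase: parse and clean raw tab-separated records,
    emitting (user, amount) pairs. Field layout is hypothetical."""
    for line in lines:
        fields = line.rstrip("\n").split("\t")
        if len(fields) != 3:            # drop malformed records
            continue
        user, item, amount = fields
        try:
            yield user, float(amount)   # drop non-numeric amounts
        except ValueError:
            continue

def reducer(pairs):
    """Reduce phase: sum amounts per user. Input must arrive sorted
    by key, as the Hadoop shuffle guarantees."""
    current, total = None, 0.0
    for key, value in pairs:
        if key != current:
            if current is not None:
                yield current, total
            current, total = key, 0.0
        total += value
    if current is not None:
        yield current, total

# Hadoop Streaming would run mapper and reducer as separate processes
# reading stdin; locally the two phases can be chained in-process:
#   results = list(reducer(sorted(mapper(open("records.tsv")))))
```

With Hadoop Streaming, each function would be wrapped in a small script that reads standard input and prints tab-separated key/value lines.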

 

Environment:

Hadoop, MapReduce, HDFS, Hive, Java (JDK 1.6), Hadoop distributions (Hortonworks, Cloudera, MapR, DataStax), IBM DataStage 8.1 (Designer, Director, Administrator), flat files, Oracle 11g/10g, PL/SQL, SQL*Plus, Toad 9.6, Windows NT, UNIX shell scripting, Autosys r11.0.

 

 

 

Mercedes, Montville, NJ

Hadoop Developer, Feb 2011 – Aug 2012

 

·   Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java, R, and Python for data cleaning and processing.

·   Administered a Hadoop cluster with 20 physical machines with around 100 virtual nodes and an HDFS file system with 5 TB capacity.

·   Automated all the jobs, for pulling data from FTP server to load data into Hive tables, using Oozie workflows.

·   Performed shell-based administration of workspaces for students and peers, assigning permissions via the DFSShell command-line interface and the DFSAdmin command set for HDFS access.

·   Extensively worked with Apache platforms such as HDFS, HBase, and Pig for large data set analysis.

·   Designed the process implementation of an HRMS SaaS solution.

·   Designed and implemented MapReduce algorithms in Java for clustering problems using byte streams to minimize memory usage.

·   Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.

·   Responsible for ETL design, development, testing, and code review, along with supporting documents such as unit test cases and technical handover documents.

·   Responsible for integration testing and user acceptance testing.

·   Developed complex ETL mappings and their corresponding sessions, worklets, and workflows.

·   Performed end-to-end testing of data warehouse / data mart loads.

·   Responsible for understanding the scope of the project and requirements gathering.
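
The clustering MapReduce work mentioned above was done in Java on the project; as a sketch only, one k-means iteration in MapReduce style looks like this in Python (the 2-D points and the dictionary-based shuffle are illustrative assumptions, not the production algorithm).

```python
import math

def nearest(point, centroids):
    """Index of the centroid closest to point (Euclidean distance)."""
    return min(range(len(centroids)),
               key=lambda i: math.dist(point, centroids[i]))

def kmeans_step(points, centroids):
    """One MapReduce-style k-means iteration:
    map: assign each point to its nearest centroid;
    reduce: average the points assigned to each centroid."""
    dim = len(centroids[0])
    sums = {i: [0.0] * dim for i in range(len(centroids))}
    counts = {i: 0 for i in range(len(centroids))}
    for p in points:                      # map + shuffle
        i = nearest(p, centroids)
        counts[i] += 1
        sums[i] = [s + x for s, x in zip(sums[i], p)]
    return [                              # reduce
        [s / counts[i] for s in sums[i]] if counts[i] else centroids[i]
        for i, _ in enumerate(centroids)
    ]
```

In a real Hadoop job, the map output would be partitioned by centroid index and each reducer would average one cluster; iterating `kmeans_step` until the centroids stop moving completes the algorithm.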

 

Environment:

Hadoop, MapReduce, HDFS, Hive, Java (JDK 1.6), Hadoop distributions (Hortonworks, Cloudera, MapR, DataStax), IBM DataStage 8.1 (Designer, Director, Administrator), flat files, Oracle 11g/10g, PL/SQL, SQL*Plus, Toad 9.6, Windows NT, UNIX shell scripting, Autosys r11.0.

 

 

Yankee Candle, South Deerfield, MA

Hadoop Administrator, July 2009 – Jan 2011

 

·   Supported code/design analysis, strategy development and project planning.

·   Developed multiple MapReduce jobs in Java for data cleaning and preprocessing.

·   Assisted with data capacity planning and node forecasting.

·   Analyzed requirements from customers and participated in the Agile process.

·   Extracted data from Teradata and pushed it into Hadoop using Sqoop.

·   T-logs are pulled from the log server and stored on the FTP server hourly; this data is pushed into Hadoop and then deleted from the FTP server.

·   Knowledge of Java Virtual Machines (JVMs) and multithreaded processing.

·   Developed Pig Latin scripts to extract data from web server output files and load it into HDFS.

·   Developed Pig UDFs to pre-process the data for analysis.

·   Developed Hive queries for the analysts.

·   Developed Oozie workflows to automate loading data into HDFS and pre-processing it with Pig.

·   Responsible for managing data coming from different sources.

·   Supported MapReduce programs running on the cluster.

·   Performed transactional data analysis using R and Python.
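
The hourly T-log ingestion and transactional analysis described above can be sketched as a small Python aggregation. The ISO-timestamp log line format here is an assumption for illustration, not the project's actual log layout.

```python
from collections import Counter
from datetime import datetime

def hourly_counts(log_lines):
    """Count transactions per hour from timestamped log lines.
    Assumes each line starts with an ISO-format timestamp (hypothetical)."""
    counts = Counter()
    for line in log_lines:
        ts = line.split(" ", 1)[0]           # first token: timestamp
        hour = datetime.fromisoformat(ts).replace(
            minute=0, second=0, microsecond=0)  # truncate to the hour
        counts[hour] += 1
    return counts
```

On the cluster, the same rollup would typically be expressed as a Hive query or Pig script grouping by the truncated hour.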

 

 

 

Environment:

Hadoop, MapReduce, HDFS, Hive, Java (JDK 1.6), Hadoop distributions (Hortonworks, Cloudera, MapR, DataStax), Spring 2.5, Hibernate 3.0, JSF, Servlets, JDBC, JSP, JSTL, JPA, JavaScript, Eclipse 3.4, log4j, Oracle 10g, CVS, CSS, XML, XSLT, SMTP, Windows XP.
 

 

 

Bhartiya Technologies, India

Java/J2EE Consultant, Oct 2007-April 2009

 

·   Involved in various phases of Software Development Life Cycle.

·   Used MyEclipse 6.0 as the IDE for application development.

·   Validated all forms using Struts validation framework and implemented Tiles framework in the presentation layer.

·   Configured the Struts framework to implement the MVC design pattern.

·   Designed and developed GUI using JSP, HTML, DHTML and CSS.

·   Worked with JMS for messaging interface.

·   Used Hibernate for handling database transactions and persisting objects.

·   Deployed the entire project on WebLogic application server.

·   Used AJAX for interactive user operations and client side validations.

·   Used XML for ORM mapping relations between the Java classes and the database.

·   Used XSL transforms on certain XML data.

·   Developed Ant scripts for compilation and deployment.

·   Performed unit testing using JUnit.

·   Extensively used log4j for logging.

·   Used Subversion as the version control system

·   Performed data cleaning and analysis using R.

 

Environment:

Java/J2EE, Oracle 10g, SQL, PL/SQL, JSP, EJB, Struts, Hibernate, WebLogic 8.0, HTML, AJAX, JavaScript, JDBC, XML, UML, JUnit, log4j, MyEclipse 6.0.


 

 

Hierarchy Technologies, India

Java/J2EE Developer, Aug 2006 – Sep 2007

 

·   Involved in analyzing client requirements and converting them into technical specifications.

·   Worked on analysis, design, and coding for client development using the J2EE stack on the Eclipse platform.

·   Involved in creating web-based Java components such as client applets and client-side UI using JFC in Eclipse.

·   Developed PL/SQL stored procedures to perform complex database operations.

·   Designed and developed SQL queries in the application

·   Developed Design documents for various components identified in the system.

·   Generated the Hibernate XML and Java Mappings for the schemas

·   Used Rational Application Developer (RAD) as Integrated Development Environment (IDE).

·   Extensively used Core Java, Servlets, JSP and XML.

·   Used Struts 1.2 in presentation tier.

·   Used Subversion as the version control system

·   Involved in various phases of Software Development Life Cycle.

·   Created UML Diagrams (Class and Sequence) during Design Phase using Visio.

·   Deployed the entire project on WebLogic application server.

·   Performed transactional data analysis in R and MATLAB.
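
The regression-style transactional analysis mentioned above was done in R and MATLAB on the project; as a sketch only, a closed-form ordinary least-squares fit looks like this in Python (the example data is invented for illustration).

```python
def linear_fit(xs, ys):
    """Ordinary least-squares fit of y = a + b*x, closed form:
    b = cov(x, y) / var(x), a = mean(y) - b * mean(x)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b   # (intercept, slope)
```

For example, fitting the points (0, 1), (1, 3), (2, 5), (3, 7) recovers intercept 1 and slope 2; R's `lm(y ~ x)` computes the same coefficients.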

 

 

Environment:

Java/J2EE, Oracle 10g, SQL, PL/SQL, JSP, EJB, Struts, Hibernate, WebLogic 8.0, HTML, AJAX, JavaScript, JDBC, XML, JMS, XSLT, UML, JUnit, log4j, MyEclipse 6.0.

 

 

Experience

 

Job Title: Senior Developer

Company: One Gas, Tulsa, OK

Experience: - Present

 

Additional Info

 

Current Career Level:

Experienced (Non-Manager)

Work Status:

US - I am authorized to work in this country for any employer.

Active Security Clearance:

None

US Military Service:

Citizenship:

None

 

 

Target Job:

Target Job Title:

Senior Developer

Desired Job Type:

Employee

Desired Status:

Full-Time

 

Target Company:

Company Size:

 

Target Locations:

Selected Locations:

US-CA-Marin County/North Bay

Relocate:

No