From:                              route@monster.com

Sent:                               Monday, September 28, 2015 1:01 PM

To:                                   hg@apeironinc.com

Subject:                          Please review this candidate for: Talend

 

This resume has been forwarded to you at the request of Monster User xapeix03

Narendar Dheeravath 

Last updated:  07/29/15

Job Title:  not specified

Company:  not specified

Rating:  Not Rated

Screening score:  not specified

Status:  Resume Received


Charlotte, NC  28211
US

Home: 7178025374   
narendar25d@gmail.com
Contact Preference:  Telephone


 

 

RESUME

  

Resume Headline: Narendar Dheeravath - Java Lead Developer

Resume Value: jbwktt8e5kbff2wt   

  

 

Narendar Dheeravath

Java Lead Developer

narendar25d@gmail.com

717-802-5374

Summary:

·               8+ years of experience in software engineering, encompassing the complete SDLC: design, development, and maintenance of application software in web-based, online transaction processing (OLTP), n-tier, and client/server environments

·               Implemented business solutions in Banking, Health Insurance Exchange, Credit Report & Bureau, and Insurance by understanding the core business model of each industry.

·               Specialized in web-based applications developed with Java SE 1.4+ and Java EE technologies.

·               Very good experience in Hadoop, HDFS, Hive, Talend, Spark, YARN, and Big data Technologies.

·               Good experience in services integration and Service Component Architecture, including Service-Oriented Architecture (SOA) and IBM Enterprise Service Bus.

·               Good experience in the design, development, and implementation of services integration using IBM technologies such as WebSphere Process Server (WPS), WebSphere ESB (WESB), Web Services, WSRR, and DataPower

·               Good working experience with IBM Integration Designer, Rational Application Developer, WebSphere Business Process Manager, and IBM DataPower.

·               Very good experience in designing and developing applications using Spring Framework, Spring MVC, JSP, Servlets, JNDI, JDBC, EJB 2.1, Hibernate 3.0, AJAX, Google Web Toolkit, and Struts2

·               Very Good experience with jBPM4 (Java Business Process Management), JMS and ActiveMQ, MQ Series

·               Very Good experience in Java Middle Tier Development, Message Queuing, ESB

·               Very good experience in SQL (Structured Query Language)

·               Very good experience with Business Rule Management Systems (JBoss Drools)

·               Good experience with design patterns such as MVC, Service Locator, Factory, Singleton, Session Facade, Command, and Data Access Object

·               Experienced in SOA and Web Services with Axis2 and Apache CXF (WSDL, XML and End Points), JAX-WS and JAX-RS

·               Working experience on application servers such as Apache Tomcat, Jetty, IBM WebSphere, and JBoss 5.0+.

·               Working experience on ISAM (IBM Security Access Manager), IBM Enterprise Service Bus, and IBM FileNet Content Manager

·               Good experience with SQL on IBM DB2, MS SQL Server, MySQL, and Oracle databases

·               Hands-on experience with other technologies such as HTML, JavaScript, ANT, Log4J, JUnit 4.0, TestNG, JBoss Rules (Drools), XML, XSD, DOM, SAX, XPATH, XSLT, Castor, JDOM, and XMLBeans

·               Worked with various development methodologies, including SDLC (waterfall model), Agile (Scrum), and iterative software development

·               Experience with version control systems such as Git, CVS, SVN, and IBM Rational ClearCase

Achievements:

·               Received an outstanding performance award from the Access Health CT (State of Connecticut Health Insurance Exchange) client while at Deloitte.

·               Received an outstanding performance award for the integration of jBPM in the NextGen project at Experian

 

Education & certifications:

·               Master's in Information Technology from the University of Hyderabad, India, 2007.

·               Bachelor's in Computer Science and Information Technology from J.N.T.U. Hyderabad, India, 2004.

 

Technical Skills:

Languages                   

Java, C, C++, XML/XSL, SQL, JavaScript

Java Technologies

Servlets, JSP, SAX/DOM, Web Services (JAX-WS and JAX-RS), SOAP, WSDL, JAXB, Java Mail, HTML, JSON, JSTL, JXL, POI, JNDI, JMS, JPA, JDBC.

Servers                         

Tomcat, JBoss 5, Jetty, IBM WebSphere 7.0

Operating Systems     

Ubuntu Linux, Red Hat Linux, Windows 7 and other flavors.

Frameworks

Spring 3.1, Hibernate 3.3, Struts 2, Tiles, Google Web Toolkit, Dozer Mapping, LDAP, JPA, Struts, WEKA, ISAM, ESB, FileNet, JAXB, Castor, XML Beans, JAXP, JDOM, DOM4j, XSLT, XSL-FO, XPATH, XML Schema, XSD, JSON

Rule Engine

JBoss Rules (Drools 5.1), ILOG Configurator

Workflow Engine

JBPM (Java Business Process Management)

Messaging Engine

ActiveMQ, JMS, MQ Series

Web services

Axis2, Apache CXF, JAX-WS, JAX-RS, SOA, IBM ESB, WSRR, WebSphere Process Server (WPS), IBM DataPower, WebSphere ESB (WESB)

Databases                     

IBM DB2, Oracle, MySQL, MS SQL Server

IDE                                

Eclipse 3.0+, RAD 8

Version Control Tools   

SVN, Git, IBM Rational ClearCase

Tools

Maven, ANT, Log4J, JUnit, Hudson, Selenium, PMD, Cygwin, JMeter, JProfiler, Java VisualVM

BigData Technologies

Hadoop, HBase, Hive, Oozie, Flume, HDFS, Talend, YARN, Spark, MongoDB

 

Professional Experience:

Citi, Charlotte, NC.                                                                                                      Sep 2014 – Present

Java Lead Developer

Pre-Provision Net Revenue (PPNR) and Ruby Reporting

Developed a financial risk reporting system for PPNR and Ruby to ensure compliance with federal CCAR regulations.

Analyzed CCAR data in each cycle and generated reports in the application.

 

Responsibilities:

 

·   Analysis, design, and development of the Ruby reporting system.

·   Involved in requirement analysis and sprint planning.

·   Led a team of 10 members and assigned tasks to them

·   Designed and developed the multithreaded handling of the web services client.

·   Led the development of Java and Spring Batch jobs and database modifications

·   Design and development of batch jobs to load the data into the Ruby DB schema

·   Design and development of batch jobs to load the data into the PPNR DB using Spring Batch

·   Design and development of data validation rules and the data validation framework

·   Design and development of rules based service

·   Design and development of data suppression rule

·   Design and Development of batch jobs using spring batch

·   Analysis of requirement of CCAR- PPNR application

·   Design and Development of PPNR application

·   Design and development of schema migration

·   Design and Development of data migration service

·   Design and development of stored procedures in Oracle, invoked from Java

·   Design and development of the NIR Liv Sync and Ruby Sync web services.

·   Design and development of PPNR admin, PPNR Contributor, PPNR Template Loader application

·   Design and Development of ruby web services integration with PPNR

·   Involved in deployment of the PPNR application into different environments.

·   Wrote unit test cases for the PPNR application.
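The multithreaded web-service client handling listed above can be sketched with plain java.util.concurrent. This is only a sketch: fetchReport, the cycle IDs, and the pool size are hypothetical stand-ins for the real PPNR/Ruby service call, which the resume does not show.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.*;

// Sketch of multithreaded web-service client handling.
// Assumption: fetchReport stands in for the real SOAP/REST endpoint call.
class ParallelServiceClient {
    // Hypothetical service call; the real client would invoke a remote endpoint.
    static String fetchReport(String cycleId) {
        return "report-" + cycleId;
    }

    // Fan the requests out over a fixed thread pool and collect results in order.
    public static List<String> fetchAll(List<String> cycleIds, int threads)
            throws InterruptedException, ExecutionException {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        try {
            List<Future<String>> futures = new ArrayList<>();
            for (String id : cycleIds) {
                futures.add(pool.submit(() -> fetchReport(id)));
            }
            List<String> results = new ArrayList<>();
            for (Future<String> f : futures) {
                results.add(f.get()); // propagates failures from worker threads
            }
            return results;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(fetchAll(List.of("2024Q1", "2024Q2"), 4));
    }
}
```

Collecting futures in submission order keeps results deterministic even though the calls run concurrently.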

 

Environment: Java, J2EE, Maven, Hibernate, ExtJS, web services, Oracle, SVN, Eclipse, Spring, Spring MVC, RESTful web services, JSON, Spring AOP, Talend, WebSphere, Jetty, Sonar, Jenkins, Spring Batch, Drools, SOAP

 

 

 

 

 

Capital One, Richmond, VA.                                                                                                      Feb 2014 to Sep 2014

Senior Hadoop Java developer

ISRM (Information Security Risk Management) and Seed Data Hub

The EDS team took an initiative to develop a generic framework that can be used by all internal teams that manage data on the order of terabytes. The core of this framework is the Hadoop ecosystem for the data analytics part, integrated with the Drools framework for data validation and minimal transformations. The framework also provides push and pull functionality for files in the HDFS cluster. ISRM is a security and risk management system that calculates risk scores and identifies security breaches based on data retrieved from source systems: it reads the HDFS file system and analyzes the data to calculate the risk score. The legacy ISRM system is being migrated to big data, Java, and open-source technologies.

 

Responsibilities:

·               Involved in design and development of frameworks for Seed Data Hub.

·               Implemented the rules framework for ISRM which handles the transformation rules

·               Design and development of data quality framework

·               Design and development of a file broker for the HDFS file system

·               Design and development of data extraction framework from HDFS file system.

·               Design and development of a data acquisition framework using Spring XD

·               Integrated the rules framework with Talend

·               Implementation of Talend jobs to extract data from different systems.

·               Design and development of a Talend POC for big data

·               Involved in registering the data sets into data registry frameworks.

·               Design and development of Talend jobs that connect to the Hadoop ecosystem to load data into Hive tables for data analytics
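The data-validation rules framework described above was built on Drools in the actual project; as a rough illustration of the idea, here is a minimal predicate-based sketch. The class name, rule names, and record shape are all hypothetical.

```java
import java.util.*;
import java.util.function.Predicate;

// Minimal sketch of a data-validation rule framework: each rule is a named
// predicate over a flat record; violations() reports every rule that fails.
// (The real system expressed such rules in Drools, not plain Java.)
class ValidationFramework {
    record Rule(String name, Predicate<Map<String, String>> check) {}

    private final List<Rule> rules = new ArrayList<>();

    public ValidationFramework addRule(String name, Predicate<Map<String, String>> check) {
        rules.add(new Rule(name, check));
        return this;
    }

    // Return the names of all rules the record violates.
    public List<String> violations(Map<String, String> row) {
        List<String> failed = new ArrayList<>();
        for (Rule r : rules) {
            if (!r.check().test(row)) failed.add(r.name());
        }
        return failed;
    }

    public static void main(String[] args) {
        ValidationFramework v = new ValidationFramework()
                .addRule("id present", row -> row.containsKey("id"))
                .addRule("score numeric", row -> row.getOrDefault("score", "").matches("\\d+"));
        System.out.println(v.violations(Map.of("id", "42", "score", "abc")));
    }
}
```

Keeping each rule a named predicate makes the framework easy to extend and lets failures be reported per rule, which is the behavior a Drools-based validator provides out of the box.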

 

 

Environment: J2EE, Spring, Drools, Talend, Hadoop, HDFS, Hive, Pig, YARN, HBase, Teradata, Oracle, DB2, SVN, Eclipse, Jenkins, Spring XD

 

Deloitte Consulting LLP, Camp Hill, PA.                                                                                       Mar 2013 to Feb 2014

Senior Java SOA developer

Connecticut - Health Insurance Exchange (CT-HIX)

A Health Benefit Exchange is a key provision of the Affordable Care Act that creates a new marketplace for each state to offer health benefits to individuals, families, and small businesses. Under national health reform, states must have an Exchange. The basic functionality of the CT-HIX product is to create a new health benefits marketplace for residents of the state of Connecticut. Modules include an admin module to create users and an eligibility module to create the user application form, which can be filled in online or offline. Online, applicants enter their data themselves and can modify the details; offline, applicants submit data by phone or on paper, and a worker fills in the application as a power user. The application runs on two WebSphere server profiles, customer and worker. Enrollment is the process that follows successful eligibility validation, after which the applicant is enrolled in the CT-HIX system. Various federal government rules are applied to validate an applicant's application form for a subsidy on health care insurance.

 

Responsibilities:

·               Design and Development of frameworks for projects.

·               Design and Development of JAX-WS and JAX-RS Web services

·               Design and Development of Simulator for FDSH

·               Design and development of Resource Bundle for internationalization.

·               Split the project into two different profiles for worker and customer.

·               Design and Development of Application Portal and Worker Portal

·               Design and development of Interceptors for the User, flow control and URL restriction.

·               Design and development of navigation flow of various pages.

·               Design and development of Hibernate Ehcache as the second-level cache.

·               Design and Development of Application Retry

·               Design and Development of Asynchronous web services

·               Design and Development of web services client to communicate with Federal Data Hub through ESB

·               Design and development of the DAO layer using the Hibernate framework and an abstract DAO design pattern.

·               Design and Development of Security Frameworks using IBM ISAM

·               Design and Development of web services client frameworks to communicate with federal web services using soap.

·               Involved in Eligibility frameworks online application development.

·               Involved in Eligibility frameworks online application change report development.

·               Involved in Eligibility frameworks Power User development.

·               Involved in Eligibility frameworks application validation development

·               Involved in implementing the JMS Queue to consume the response.

·               Integrated the anti-virus check for file upload

·               Developed the frameworks for web services retry. 

·               Involved in Fortify scan reviews for security

·               Involved in AppScan reviews for security

·               Develop the framework for the components per requirements.

·               Coding, debugging, Unit testing & reviewing the code.

·               Analysis of the business solution and development of the business requirement definition

·               Reported project status to the project leader and kept it updated
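The web-services retry framework mentioned above can be sketched as a small generic helper that retries a call with linear backoff. This is a sketch under assumptions: the Callable stands in for the real SOAP client invocation, and the attempt count and backoff values are illustrative.

```java
import java.util.concurrent.Callable;

// Sketch of a web-service retry framework: retry a call up to maxAttempts
// times, sleeping a little longer after each failure, and rethrow the last
// exception if every attempt fails.
class RetryTemplate {
    public static <T> T withRetry(Callable<T> call, int maxAttempts, long backoffMillis)
            throws Exception {
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return call.call();
            } catch (Exception e) {
                last = e;
                if (attempt < maxAttempts) {
                    Thread.sleep(backoffMillis * attempt); // linear backoff
                }
            }
        }
        throw last;
    }

    public static void main(String[] args) throws Exception {
        int[] failures = {2}; // simulate a call that fails twice, then succeeds
        String result = withRetry(() -> {
            if (failures[0]-- > 0) throw new RuntimeException("transient fault");
            return "ok";
        }, 5, 10);
        System.out.println(result);
    }
}
```

A real retry framework would typically retry only on transient faults (timeouts, connection resets) rather than on every exception, but the control flow is the same.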

 

Learning:

§    Exposure to Web Sphere application server 7.0 and RAD.

§    Exposure to Health Care Exchange.

§    Exposure to WebSphere FileNet and ESB (Enterprise Service Bus).

§    Exposure to ISAM-IBM Security Access Manager.

§    Exposure to data analysis and ETL (Extract Transform and Load) Process.

 

Environment: J2EE, Struts2, Spring, Hibernate, Crystal Reports (to generate the reports), JMS, IBM FileNet Content Manager, XML, JSP, JavaScript, jQuery, AJAX, IBM WebSphere 7.0 Application Server, Java 1.6, SOAP-based web services, WebSphere Enterprise Service Bus (ESB), ISAM (IBM Security Access Manager), and DB2.

 

Award and Appreciations:

·  Received an outstanding performance award from the Access Health CT (State of Connecticut Health Insurance Exchange) client at Deloitte on 26 Oct 2013.

 

Client:  Experian, CA, USA                                                                                                     Jul 2012 to Feb 2013

Position: Senior Java Developer

Product Delivery System

Product Delivery System is a product that reports the credit score of a consumer or business based on the credit profile, applicable law, data providers, domain knowledge experts, and clients.

Responsibilities:

·               Owner of a complete module in the development phase (Product Delivery System).

·               Planning of complete application and assigning the tasks to team members

·               Designed the UML diagram and process flow for PDS product

·               Design and development of the complete product delivery framework

·               Design and development of the display rules framework

·               Development of data validation rules frameworks

·               Design and development of consumer and commercial web services using Apache CXF

·               Deployment of PDS application to servers

·               Developed the technical proof of concepts in Hadoop to setup the clusters

·               Developed Map Reduce/EMR jobs to analyse the data and provide heuristics and reports. The heuristics were used for improving campaign targeting and efficiency

·               Setup and benchmarked Hadoop / HBase clusters for internal use

·               Responsible to manage data coming from different sources

·               Experienced in defining job flows

 

Environment: Spring, Java SE 6, Hadoop, Drools, Java EE, EJB 3.0, IBM DB2, JBoss 5.1, XML, XSLT, JDBC, JAXB, SOAP, REST, Apache CXF, Servlets, JMS, ActiveMQ, Pentaho Kettle, MongoDB, jBPM, GWT, Maven, Jetty Server, Sonar, Git, JUnit.

 

Social Media Sentiment Data Analysis                                                                               May 2012 – Jul 2012

POC (Proof of concept)

The POC was based on Twitter, using twitter4j together with Hadoop, Flume, Oozie, and WEKA. It collected Twitter data periodically on an hourly basis. Flume was configured as a system service to load the Twitter data onto HDFS, and the Oozie workflow scheduler was configured as a service to manage the Apache Hadoop jobs. When data became available on HDFS, Oozie triggers loaded it and created Hive tables; data rendered from the Hive tables via Hive queries (which internally run MapReduce jobs) was presented to the front end so that users could see the sentiment for a given term (e.g. Obama, Sachin, CapitalOne). For this, WEKA was trained as a Naive Bayes classifier on positive and negative terms to classify sentences by polarity. The POC ran on Amazon EC2 using a cluster of 10 micro nodes.
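The polarity-classification step can be illustrated with a much simplified sketch. The actual POC trained a WEKA Naive Bayes classifier; the lexicon-count approach below only conveys the idea of scoring a sentence by positive versus negative terms, and the word lists are made up for illustration.

```java
import java.util.Set;

// Simplified illustration of polarity classification: count positive and
// negative lexicon hits and report the sign of the score. A trained Naive
// Bayes classifier (as in the POC) would weight terms by learned
// probabilities instead of counting them equally.
class PolaritySketch {
    static final Set<String> POSITIVE = Set.of("good", "great", "love", "win");
    static final Set<String> NEGATIVE = Set.of("bad", "poor", "hate", "fail");

    public static String classify(String tweet) {
        int score = 0;
        for (String w : tweet.toLowerCase().split("\\W+")) {
            if (POSITIVE.contains(w)) score++;
            if (NEGATIVE.contains(w)) score--;
        }
        return score > 0 ? "positive" : score < 0 ? "negative" : "neutral";
    }

    public static void main(String[] args) {
        System.out.println(classify("Great win today"));
        System.out.println(classify("bad call, hate it"));
    }
}
```

In the POC this logic ran inside MapReduce jobs over the Flume-ingested HDFS data, with Hive exposing the aggregated results to the front end.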

Responsibilities:

·               Setting up and running Hadoop cluster

·               Writing MapReduce jobs based on requirements.

·               Setting up Hive tool as service and run hive queries for data analysis

·               Design and Development of frameworks to collect the twitter data periodically using Twitter4j API

·               Managed Apache Hadoop jobs using Oozie.

·               Developed Map Reduce/EMR jobs to analyse the data and provide heuristics and reports. The heuristics were used for improving campaign targeting and efficiency

 

Environment: Java 1.7, Hadoop (MapReduce), Hive, Eclipse, Flume, Oozie, Ubuntu, GWT Visualization, Twitter4j, WEKA

 

Experian, CA, USA.                                                                                                       Oct 2011 to May 2012 

Position: Sr. Developer

NextGen Credit Report & Bureau, Australia             

A credit reporting and scoring system for consumer and commercial credit, built on top of the NextGen core system developed earlier and customized organically based on requirements gathered from law, data providers, domain knowledge experts, and clients. It was developed by a 7-member cross-functional agile team.

Responsibilities:

·               Design and development of complete product framework for PDS, BOS(Bureau Orchestrations Studio)

·               Technical proof of concept development –jBPM and Integration with product

·               Reformatter POC using Altova Map Force Tool –Technical Proof of Concepts

·               Develop the framework for the components per requirements.

·               Developed custom API for their customization requirement.

·               Assigning the tasks to team members

·               Designed the process flow, UML diagram of consumer and commercial credit report for PDS

·               Design and development of the Australian Securities and Investments Commission (ASIC) web services integration with NextGen web services (third-party calls) to display ASIC data in PDS

·               Training on the architecture, skills to new entrants.

·               Coding, debugging, Unit testing & reviewing the code.

·               Profiled the whole product using JProfiler and Java VisualVM, identified bottlenecks, and fixed them

·               Improved the performance of product for PDS

·               Analysis of the business solution and develop Business requirement Definition

·               Development of technical specification, component detailed requirements based on the requirements.

·               Involved in preparing detail design by understanding specification documents.

·               Involved in planning the tasks to be taken up sprint by sprint basis.

·               Involved in implementing the Credit Report Architecture leveraging Java /J2ee

·               Involved in moving the code to New structure to facilitate and remove the dependencies of the modules

·               Worked in developing Interfaces to interact with projects.

·               Involved in deploying the project to the different Environments.

·               Worked actively in Modularizing and Generalization of code to facilitate Customization with only limited changes

·               Worked Actively with Business Team to get the clarifications and inputs as we have continuous Integration

·               Process development  and adhere to the services level agreement

·               Data analysis and design

·               Reporting project status to the project leader and updates the project status

·               Coordinated with project teams located in various geographies on resources and scheduling

·               Perform unit testing and system testing

·               Deployment of the application on UAT/QA server.

·               Packaging and delivery of the application.

 

Environment: Spring, Java SE 6, Drools, Java EE, Hadoop, HDFS, Hive, EJB 3.0, IBM DB2, JBoss 5.1, XML, XSLT, JDBC, JAXB, JAX-WS, Apache CXF, Servlets, JMS, ActiveMQ, Pentaho Kettle, MongoDB, jBPM, GWT, Maven, Jetty Server, Sonar, Git, JUnit.

 

Experian, CA, USA                                                                                                       Nov 2010 – Oct 2011

Position: Sr. Developer

NextGen Credit Bureau Core

Experian is a global leader in providing information, analytical tools, and marketing services to organizations and consumers to help manage the risk and reward of commercial and financial decisions. Using a comprehensive understanding of individuals, markets, and economies, Experian helps organizations find, develop, and manage customer relationships to make their businesses more profitable. Experian promotes greater financial health and opportunity among consumers by enabling them to understand, manage, and protect their personal information, helping them control the financial aspects of key life events and make the most advantageous financial and purchasing decisions.

The objective of NextGen Global Credit Bureau is to develop a software product that enables Experian to rapidly expand its consumer and commercial geographic coverage around the world. It gives customers fast and easy access to an extensive range of up-to-date and accurate data about consumers and businesses across a wide variety of industry sectors.

Responsibilities:

·               Leading a complete module in the development phase (Product Delivery System).

·               Technical proof of concept -jBPM in development

·               Designing & finalizing the process flow, UML

·               Design and developed the data services to get the data from database

·               Integrated the services in jBPM Workflow Engine

·               Held discussions with the clients

·               Training on the architecture, skills to new entrants.

·               Coding & reviewing the code.

·               Analysis of the business solution and develop Business requirement Definition

·               Development of technical specification, component detailed requirements based on the requirements.

·               Development of Logical Database design and model

·               Involved in preparing detail design by understanding specification documents.

·               Involved in planning the tasks to be taken up sprint by sprint basis.

·               Involved in implementing Pentaho transformations and jobs to create and maintain a DWH (data warehouse).

·               Involved in moving the code to New structure to facilitate and remove the dependencies of the modules

·               Implemented Cross Module Functionality to facilitate Data Transfer across Modules

·               Worked in developing Interfaces to interact with projects.

·               Involved in deploying the project to the different Environments.

·               Worked actively in Modularizing and Generalization of code to facilitate Customization with only limited changes

·               Worked Actively with Business Team to get the clarifications and inputs as we have continuous Integration

·               Data analysis and design

·               Reporting project status to the project leader and updates the project status

·               Coordinated with project teams located in various geographies on resources and scheduling

 

Environment: Spring, Java SE 6, Hadoop, HDFS, Hive, Drools, Java EE, EJB 3.0, IBM DB2, JBoss 5.1, XML, XSLT, JAXB, Apache CXF, Servlets, JMS, ActiveMQ, Pentaho Kettle, MongoDB, jBPM, Maven, Jetty Server, GWT, Sonar, Git, IBM ClearCase, JUnit, JPA, SQL

 

Value Momentum Software India (P), Hyderabad                                         Mar 2009 – Nov 2010

Software Engineer

IFOUNDRY Product (Insurance Product Life Cycle Management)

PLCM implements product strategies across the organization in a timely, efficient, and cost-effective manner. All the activities related to definition, approval, and maintenance that occur in an insurance product's lifecycle after the product has been conceptualized and designed are within the scope of PLCM. A product's definition consists of various constructs such as coverages, limits, terms, business rules, calculations, product constraints, and data specific to that product. The PLCM engine interprets the fulfillment and risk information provided by the customer to select a set of best-suited products, and refines the products based on the variations defined by the modelers for the jurisdiction, distribution channel, and specific risk profile. The engine also determines the conditions that would apply to the configured recommendation and computes the price. IFoundry PLCM is architected and designed based on SOA standards and guidelines. To external applications, it exposes a set of services that can be consumed by end-client applications such as a quoting system or policy issuance system for product configuration, eligibility, and pricing/rating.
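The select-and-refine behavior described above can be sketched as a filter over a product catalog followed by a rating step. This is only an illustration: the Product fields, constraint checks, and pricing formula are assumptions, not the real PLCM model.

```java
import java.util.List;
import java.util.stream.Collectors;

// Sketch of the PLCM engine idea: select products whose constraints admit
// the customer's jurisdiction and risk profile, then compute an illustrative
// price. All fields and formulas here are hypothetical.
class PlcmSketch {
    record Product(String name, String jurisdiction, int maxRiskScore, double baseRate) {}

    // Select products eligible for the given jurisdiction and risk score.
    public static List<Product> eligible(List<Product> catalog, String jurisdiction, int riskScore) {
        return catalog.stream()
                .filter(p -> p.jurisdiction().equals(jurisdiction))
                .filter(p -> riskScore <= p.maxRiskScore())
                .collect(Collectors.toList());
    }

    // Illustrative rating: base rate loaded linearly by risk.
    public static double price(Product p, int riskScore) {
        return p.baseRate() * (1.0 + riskScore / 100.0);
    }

    public static void main(String[] args) {
        List<Product> catalog = List.of(
                new Product("HomeBasic", "CA", 60, 500.0),
                new Product("HomePlus", "CA", 80, 700.0),
                new Product("HomeBasic", "NY", 60, 550.0));
        for (Product p : eligible(catalog, "CA", 70)) {
            System.out.println(p.name() + " -> " + price(p, 70));
        }
    }
}
```

In the real product, the eligibility and rating logic lived behind SOA services so that quoting and policy issuance systems could consume them, as the description notes.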

Responsibilities:

·               Involved in design and development of frameworks for components as per requirement.

·               Involved in design and development of web services using Axis2

·               Developed the data services to get the data from database using jdbc and spring jdbc templates

·               Designed and developed the getCoverages and getConfigureProduct web services

·               Designed and developed the getInsurableItems web service for products in the product model component

·               Developed the getRoles web service for products in the product model component

·               Involved in the design of product configuration, publish, and execution components

·               Involved in the integration of publish and execution component with the controller

·               Involved in the design of the Eligibility Rules Component (Publish and Execution) using drools

·               Implementation of the rules Publish Component using Drools API

·               Implementation of the rules Execution component using Drools API

 

Environment: Core Java, J2EE, Drools, Oracle 9i, MS SQL Server, ANTLR, XSLT, XML, Axis2, JBoss, Apache Tomcat, Maven, ILOG Configurator, JBDD, Spring, Hibernate, EJB, DOM, SVN

 

Value Momentum Software India (P), Hyderabad                                         Aug 2007 – Feb 2009

Software Engineer

Interactive Solver Product

Interactive Solver helps the client application configure a product interactively and dynamically. All II (insurable item) and role-based conditional product rules, together with the non-conditional product rules, are solved in the getCoveragesWithII service; the remaining conditional constraints (those not II- or role-based) are solved through the Interactive Solver. Once the user selects (or deselects) any part (optional package or coverage), or selects or inputs a value for a building block that participates in the condition part of a product rule, the product rules are fired through the Interactive Solver, and the product page refreshes automatically if the product model has changed.
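The solve-on-selection loop described above can be sketched as a tiny rule engine that re-evaluates conditional rules whenever a part is selected or deselected. The rule contents and the model representation are illustrative assumptions, not the real Interactive Solver API.

```java
import java.util.*;
import java.util.function.Consumer;
import java.util.function.Predicate;

// Sketch of the interactive-solver loop: every user selection updates the
// model, then all conditional rules are re-evaluated and may update the
// model further (which the UI would render as a page refresh).
class InteractiveSolverSketch {
    private final Map<String, Boolean> selections = new HashMap<>();
    private final List<Map.Entry<Predicate<Map<String, Boolean>>,
                                 Consumer<Map<String, Boolean>>>> rules = new ArrayList<>();

    public void addRule(Predicate<Map<String, Boolean>> when,
                        Consumer<Map<String, Boolean>> then) {
        rules.add(Map.entry(when, then));
    }

    // Called on every user (de)selection; fires all rules whose condition holds.
    public void select(String part, boolean chosen) {
        selections.put(part, chosen);
        for (var rule : rules) {
            if (rule.getKey().test(selections)) rule.getValue().accept(selections);
        }
    }

    public boolean isSelected(String part) {
        return selections.getOrDefault(part, false);
    }

    public static void main(String[] args) {
        InteractiveSolverSketch solver = new InteractiveSolverSketch();
        // Illustrative rule: choosing flood coverage forces the home package on.
        solver.addRule(s -> s.getOrDefault("floodCoverage", false),
                       s -> s.put("homePackage", true));
        solver.select("floodCoverage", true);
        System.out.println(solver.isSelected("homePackage"));
    }
}
```

A production solver would also fire rules transitively until the model reaches a fixed point; this sketch runs a single pass per selection for simplicity.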

Responsibilities:

·               Involved in design and development of interactive solver frameworks and service

·               Involved in design of data base table in SQL server and oracle

·               Developed the interactive solver web service for getting the product model

·               Developed the interactive solver API for solving of product based rules dynamically and interactively.

 

Environment: Core Java, J2EE (JSP, Servlets), Oracle 9i, MS SQL Server, XSLT, XML, Axis2, JBoss, Apache Tomcat, Maven, ILOG Configurator, JBDD, Hibernate, DOM, SVN, Spring, JavaScript, AJAX, Drools, Struts2

 

 

 



Experience


 

Job Title:  Java Lead Developer

Company:  CitiGroup

Experience:  – Present

 

Additional Info


 

Current Career Level:

Experienced (Non-Manager)

Date of Availability:

Within 2 weeks

Work Status:

US - I am authorized to work in this country for any employer.

Active Security Clearance:

None

US Military Service:

Citizenship:

None

 

 

Target Job:

Target Job Title:

Java Lead Developer

Desired Job Type:

Employee

Desired Status:

Full-Time

 

Target Company:

Company Size:

Occupation:

IT/Software Development

·         Software/Web Development

 

Target Locations:

Selected Locations:

US-NC-Charlotte

Relocate:

Yes

Willingness to travel:

Up to 25% travel

 

Languages:

English – Advanced