From: route@monster.com

Sent: Monday, September 28, 2015 1:00 PM

To: hg@apeironinc.com

Subject: Please review this candidate for: Talend

 

This resume has been forwarded to you at the request of Monster User xapeix03

Ranga Vure 

Last updated:  05/26/15

Job Title:  Not specified

Company:  Not specified

Rating:  Not Rated

Screening score:  Not specified

Status:  Resume Received


Pleasanton, CA  94588
US

raangs.jobs@gmail.com
Contact Preference:  Telephone


 

 

RESUME

  

Resume Headline: Ranga Vure - Hadoop Lead

Resume Value: qznuu7h5qq8eeb5b   

  

 

 

RANGA, VURE (804-274-9656)

Work Authorization:  H1B

E-mail:  raangs@gmail.com

                           

Professional Profile:

§    Total of 15 years of IT experience in design and development.

§    2+ years of experience in the Hadoop ecosystem (HDFS, MapReduce, Hive, Pig, Talend) using CDH.

§    Experience in design and implementation of data pipelines using Java MapReduce, Pig, Hive, and Talend, as well as Python with Hadoop Streaming.

§    Strong understanding of Hadoop ecosystem tools such as Oozie, Flume, Kafka, and Spark.

§    Strong Java background with 12 years' experience in Java/J2EE with Oracle/DB2/SQL Server.

§    Experience in design and development of applications using J2EE, JSP/Servlets, EJB, Hibernate, Struts, Web Services (SOAP, REST), XML/XSLT, and LDAP.

§    Strong relational database experience with Oracle (SQL, PL/SQL), DB2, and SQL Server, plus NoSQL (MongoDB) and HAWQ (MPP).

§    Designed and developed applications using industry-standard open source Java frameworks such as Spring, Struts, Hibernate, JUnit, Log4j, Quartz, jBPM, Drools, and ActiveMQ, and build/CI tools Maven, Ant, Nexus, Jenkins, and Hudson.

§    Total of 5 years of experience in product development and Scrum Agile software development.

§    Worked on most UNIX flavors and Linux; strong scripting skills in Perl, Shell, and AWK.

 

Certifications

§    Cloudera Certified Developer for Apache Hadoop (CCDH)

§    Sun Certified Programmer for the Java 2 Platform

§    Sun Certified Java Web Component Developer (SCWCD)

§    Pega Certified System Architect (CSA)

§    Certified Scrum Master

 

GE, San Ramon, CA                                                         Jan 2015 – Present

Role: Data Lead Engineer

NextGenOps is an operational analytics system: IT operations, application, and infrastructure data is collected from various sources to provide analytics and correlations.

 

·   Data ingestion using Sqoop from various sources

·   Incremental data ingestion and CDC implementation

·   Building data pipelines/ETL using Talend, Hive, HAWQ, and MapReduce, consuming data from Splunk and Flume

·   Implemented REST APIs
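The incremental ingestion and CDC work above amounts to a timestamp-based upsert of delta batches onto a base snapshot. A minimal sketch in plain Python; the field names (`id`, `last_updated`) are illustrative, not the actual GE pipeline schema:

```python
# Hedged sketch of a CDC-style incremental merge: a delta batch is
# applied onto a base snapshot, and the record with the newer
# `last_updated` timestamp wins. Field names are illustrative.

def merge_incremental(base, delta):
    """Apply a delta batch onto a base snapshot; newer records win."""
    merged = {rec["id"]: rec for rec in base}
    for rec in delta:
        current = merged.get(rec["id"])
        if current is None or rec["last_updated"] > current["last_updated"]:
            merged[rec["id"]] = rec
    # Return a deterministic ordering for downstream consumers.
    return sorted(merged.values(), key=lambda rec: rec["id"])
```

In a real pipeline the same logic is typically expressed as a Sqoop incremental import followed by a reconciliation join in Hive or MapReduce; this snippet only shows the merge rule itself.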

 

CapitalOne Inc, Richmond, VA: Greenhouse                         Apr 2013 – Jan 2015

Role: Hadoop Architect

The Greenhouse team is CapitalOne's Big Data/Hadoop technology team, with the objective of providing Hadoop/Big Data solutions for data processing, analytics, ETL, and visualization.

·   Design and implementation of a data pipeline using Java MapReduce for EBCDIC datasets

·   Design and implementation of a data pipeline using Hadoop Streaming, Python, and Avro for processing Credit Bureau datasets

·   Design and implementation of a data quality framework using Java MapReduce and Drools (a Java-based rule engine)

·   Demonstrated the Cascading framework as a solution for migrating Capital One's batch applications to the Hadoop environment

·   Implemented POCs using Hive, Pig, and Talend

·   Performance tuning of MapReduce, Pig, and Hive jobs

·   Exposure to Datameer, Tableau, Talend, IPython, and R on Hadoop

·   Provided technical direction to the team and resolved technical issues during development
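The Hadoop Streaming pipelines above boil down to line-oriented mapper and reducer scripts that exchange tab-separated records over stdin/stdout. A minimal sketch in Python; the record layout and key choice are illustrative, not the actual Credit Bureau schema:

```python
from itertools import groupby

def mapper(lines):
    # Emit (key, 1) per input record; the key here is simply the first
    # tab-separated field (an illustrative layout, not the real one).
    # In a real Streaming job, `lines` would be sys.stdin.
    for line in lines:
        fields = line.rstrip("\n").split("\t")
        if fields and fields[0]:
            yield fields[0], 1

def reducer(pairs):
    # Hadoop Streaming delivers mapper output to the reducer sorted by
    # key, so consecutive pairs with the same key can be grouped and
    # their counts summed.
    for key, group in groupby(pairs, key=lambda kv: kv[0]):
        yield key, sum(count for _, count in group)
```

The framework handles the shuffle and sort between the two scripts; each script only has to read lines and write `key\tvalue` lines back out.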

 

Product: NextGen, Experian US              May 2011 – Mar 2013

Role: Tech Lead

 

NextGen is a Credit Bureau suite that provides credit reports (as a J2EE web application) for consumers and businesses. The system collects large volumes of data from bureau members across data types such as credit accounts, bankruptcy data, and court cases, and generates credit reports. The first customization of NextGen is in production for the Australian consumer bureau: http://www.experian.com.au/

 

·   Design and development of the core framework using Core Java, Spring, and Pentaho ETL

·   Design and development of a rules-processing framework using J2EE and Drools

·   Design and development of the ASIC framework, which processes Credit Bureau data from SOAP web services

·   Responsible for deliverables of the Australia Credit Bureau system

·   Led a team of 15 to deliver the Australia Credit Bureau system

·   Technical direction to the Consumer, Commercial, and Accumatch scrum teams

·   Designed and implemented normalization and validation components

·   Integration and deployment of J2EE and Pentaho ETL applications

·   Helped the teams resolve technical impediments, dependencies, etc.

·   Resolved build and deployment issues in different environments

 

Environment: J2EE, GWT, ActiveMQ, Spring, Drools, JBoss, DB2, Kettle/Pentaho, MongoDB

 

Company: PegaSystems Pvt Ltd, Hyderabad, India                Apr 2009 – May 2011

Role: Tech Lead

Product: PRPC

PRPC is Pega's Java-based BPM product and a world leader in the BPM domain. PRPC Reporting and PRPC BIX are subsystems of PRPC.

 

·   Design and development of the PRPC Reporting subsystem using Java and JDBC

·   Responsible for deliverables as part of product releases; successfully delivered PRPC v5.5 SP2, v6.1, and v6.2 (in progress)

·   Led a team of 8 to deliver product releases and provided technical direction

·   Took the role of Scrum Master in Agile practice

·   Supported and maintained PRPC BIX v2.1, v2.2, and v2.3

·   Overall responsibility for implementing and managing the PRPC Reporting and PRPC BIX subsystems

·   Fixed customer issues and delivered hotfixes (HFIXes) to customers

 

Environment: J2EE, PRPC, Tomcat, Oracle/DB2/SQL Server

 

Company: Innominds Software Pvt Ltd, Hyderabad, India                Jan 2006 – Mar 2009

Role: Tech Architect

Client: Nokia, Bangalore    Product: OVICP

OVICP is a single door for managing all device and Nokia application settings in one place. It allows end users to manage their data from the device/mobile or from the web-based application www.ovi.com, keeping the data in sync at all times.

 

·   Took the role of Scrum Master in Agile practice

·   Design and development of REST-based web services

·   Design and development of a storage framework using Spring and Hibernate

 

Environment: J2EE, JBoss, Oracle 10g, Oracle Application Server, Oracle Identity Server

Client: Infogix    Product: ControlsAssure

Role: Tech Lead

ControlsAssure is an automated controls solution that assures the accuracy, consistency, and reliability of information. The product verifies balances, reconciles, and tracks critical business information, and controls that information within single applications and across multiple applications, systems, and databases.

 

·   Implemented the use cases using JSP, Struts, EJB, and the Hibernate O-R mapping layer

·   Created and managed Hibernate mappings and persistence objects

·   Used the Apache FOP (Formatting Objects Processor) open source framework for producing PDF documents

 

Environment: J2EE, WebSphere, Oracle 9i/DB2

 

Company: Centrata Software Pvt Ltd, Bangalore, India                 Dec 2003 – Dec 2005

Product: IT Service Delivery Management Suite

·   Developed D2RT, an integration of a J2EE web application and a .NET Windows application using web services

·   Implemented a publishing framework (.NET client) that creates and publishes content to the content server

·   Overall responsibility for delivering the Content Management Framework (J2EE & .NET)

 

Environment: J2EE, WebLogic, Oracle 9i/SQL Server 2000

 

Company: TAG Inc, Louisville, KY                                  Jan 2003 – Oct 2003

 

Client: Brown & Williamson Tobacco, Louisville, KY (Jan 2003 – Oct 2003)

Project: Appress Framework

 

·   Design and development of an application infrastructure framework using Microsoft .NET technologies

·   The framework includes ASP.NET-based presentation themes, enabling applications to change look and feel with just a configuration-file change

·   Provided a role-based authentication and authorization system driven by metadata defined in Active Directory (LDAP), as well as personalization features to render menus

·   The framework provides a tool to generate a data access layer (based on ADO.NET) to relieve programmers of routine data access coding

·   Used open source tools NAnt, NUnit, log4net, and NDoc

 

Environment:  C#, ASP.NET, ADO.NET, .NET Framework, Active Directory, IIS, SQL Server 2000

 

Company: Tek Systems Inc, Louisville, KY                            Sep 2002 – Dec 2002

 

Client: Yum! Brands, Louisville, KY (Sep 2002 – Dec 2002)

Project: EDB

 

·   Design and development of an intranet web application to maintain data about Yum! Brands franchises (Pizza Hut, KFC, Taco Bell)

·   Designed and developed assemblies to implement business logic using ADO.NET and XML

 

Environment:  C#, ASP.NET, SQL Server 2000, IIS 5.0, Visual Studio .NET

 

Company: Rapidigm Inc, Pittsburgh, PA                          Nov 2000 – Aug 2002

 

Client: GE Appliances, Louisville, KY (Nov 2000 – Aug 2002)

 

§    Involved in design and development of a Struts-like J2EE framework based on the Model-View-Controller (MVC) design pattern.

§    Implemented a JUnit-based unit-testing process and an Ant-based nightly build and integration-testing process.

§    Performed code reviews and provided technical support for other application teams.

§    Maintained all production applications running on NAS/iPlanet, Enhydra, and WebLogic.

§    Fixed bugs in the TSF Java framework.

 

Project: LDAP Admin Tool

 

LDAP Admin Tool is a web-based administration tool for managing roles, capabilities, and other data stored in LDAP.

·   Designed the objects using Rational Rose

·   Developed the objects for managing roles and capabilities

·   Developed the JSPs for displaying forms for adding roles, users, and capabilities

 

 

Environment:  Java, Perl, LDAP API, Netscape Directory Server 4.15/iPlanet, HP-UX, Sun Solaris, Linux, Netscape Application/Enterprise Server, Enhydra

 

Company: Chenab Infotech Pvt Ltd, Mumbai, India                           Aug 1998 – Oct 2000

 

Client: iMedeon Inc, Pittsburgh, PA (Apr 2000 – Oct 2000)

Project: iM:Work – Mobile Field Service Solution

 

The iM:Work software application suite is a 100% web-based wireless solution for companies with mobile field service resources. The suite fully integrates a company's mobile resources for improved customer service, allowing organizations to more effectively plan, communicate, manage, execute, and analyze any type of field work.

·   Developed servlets to generate HTML pages

·   Developed business objects (SilverStream-specific) for the servlets

·   Developed data source objects (SilverStream-specific)

·   Performed database interactions from the servlets using JDBC

 

Environment: SilverStream Application Server 3.0, Servlets, JDBC, JavaScript, Oracle 8i.

 

Client:  A.F.S. (Swissair) (Aug 1998 – Mar 2000)

Project:  Revenue and Cargo Accounting System (RACAS)

 

RACAS is an accounting system developed by Chenab for Swissair Cargo. It consists of the following modules: Load, Pro-Ration, Invoicing, Balancing, Auditing, and Administration Procedures.

 

·   Developed the stand-alone monthly-closing application for the Pro-Ration, Balancing, and OutBilling modules using Pro*C/C++

·   Developed load scripts to load data from text files using PL/SQL, SQL*Loader, AWK, and shell scripts

·   Generated text-based reports using Pro*C/C++

 

 

Education   

       B.Tech from Nagarjuna University, AP (1994)

       PGDCA from CMC Gachibowli and trainee in the TC/4 project, Hyderabad (1997)




Experience


 

Job Title:  Hadoop Solution Architect

Company:  GE

Experience:  – Present

 

Additional Info


 

Current Career Level:

Experienced (Non-Manager)

Work Status:

US - I am authorized to work in this country for my present employer only.

Active Security Clearance:

None

US Military Service:

Citizenship:

None

 

 

Target Job:

Target Job Title:

Hadoop Lead

Desired Job Type:

Employee
Temporary/Contract/Project

Desired Status:

Full-Time

 

Target Company:

Company Size:

Occupation:

IT/Software Development

·         Software/System Architecture

 

Target Locations:

Selected Locations:

US-CA-Oakland/East Bay

Relocate:

No

Willingness to travel:

No Travel Required

 

Languages:

English:  Fluent