From:                              route@monster.com

Sent:                               Monday, September 28, 2015 1:01 PM

To:                                   hg@apeironinc.com

Subject:                          Please review this candidate for: Talend

 

This resume has been forwarded to you at the request of Monster User xapeix03

Rajesh Reddy 

Last updated:  07/21/15

Job Title:  not specified

Company:  not specified

Rating:  Not Rated

Screening score:  not specified

Status:  Resume Received


San Francisco, CA  94102
US

rajesh.hadoop222@gmail.com
Contact Preference:  Email


 

 

RESUME

  

Resume Headline: Rajesh H - Sr. Tableau/Hadoop Developer

Resume Value: 5ux3pewwsr96mf2k   

  

 

Rajesh Reddy
Sr. Tableau/Hadoop Developer
TOOLS/METHODS
Tableau (Dev & Admin), HDFS,
MapReduce, Pig, Hive, Impala,
HBase, MarkLogic, Sqoop, Oozie,
Zookeeper, Flume, Core Java,
Eclipse, NetBeans, COBOL, Java,
KSH & Markup Languages,
GreenplumDB, PostgreSQL,
MySQL, Oracle
8i/9i/10g/11g/11i/r12, DB2, IMS,
Putty, WINSCP, EDI(Gentran),
Streamweaver, Compuset
RELEVANT EXPERIENCE
9+ years of overall IT experience in a variety of industries, which
includes hands on experience in Big Data technologies, Reporting and
Data Visualization tools.
SKILLS INVENTORY
§ 3+ years of comprehensive experience in Tableau (Dev & Admin)
and Big Data/Hadoop, which includes MR (MapReduce), Pig, Hive,
Impala, Oozie, Flume, Zookeeper, NoSQL DBs such as HBase, and
exposure to MarkLogic
§ Hands-on experience working with the Tableau Suite (8.x/7.x)
(Desktop, Server, Reader, Online, Public)
§ Experience in designing stunning visualizations using Tableau
software and publishing and presenting dashboards on web and
desktop platforms
§ Experience in installation, configuration and administration of
Tableau Server in a multi-server and multitier environment.
CLIENTS
GE Oil & Gas
REI Systems
Nationwide Insurance
Capital one
Wachovia Bank
Ingram Micro
ABN-AMRO
§ Designed and deployed reports with Drill Down, Drill Through and
Drop down menu option and Parameterized and Linked reports.
§ Very good understanding of Hadoop ecosystem components such as
Sqoop2, Spark and YARN.
§ Experience in Data Analysis, Data Validation, Data Verification,
Data Cleansing, Data Completeness and identifying data
mismatch.
§ Experience in working with MR, PIG scripts & HIVE query Language.
§ Experience in importing and exporting data using Sqoop from
HDFS to Relational Database Systems and vice-versa and
exposure to HVR and CDC
§ In depth understanding/knowledge of Hadoop Architecture and
various components such as HDFS, Job Tracker, Task Tracker,
Name Node, Data Node and MapReduce concepts
EDUCATION
Bachelor of Engineering, VIT,
India
§ Experience in analyzing data using Hive QL, Pig Latin, and custom
MapReduce programs in Java.
§ Extensive experience with SQL, PL/SQL, PostgreSQL and database
concepts
§ Knowledge of NoSQL databases such as HBase and Cassandra
§ Knowledge of job workflow scheduling and monitoring tools like
Oozie and Zookeeper
§ An individual with excellent interpersonal and communication
skills, strong business acumen, creative problem solving skills,
technical competency, team-player spirit, and leadership skills
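The Sqoop import/export round trip mentioned in the summary can be sketched as a pair of commands. This is a hedged sketch: the JDBC URL, user, table names and HDFS paths are illustrative assumptions, and the commands are echoed rather than executed since they need a live Hadoop cluster.

```shell
#!/bin/sh
# Sketch only: hypothetical Oracle connection, tables and HDFS paths.
# Commands are echoed and logged, not executed (they need a live cluster).
run() { echo "+ $*"; echo "$*" >> sqoop_cmds.log; }
: > sqoop_cmds.log

# RDBMS -> HDFS
run sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username etl_user -P \
  --table PO_SPEND \
  --target-dir /data/raw/po_spend \
  --num-mappers 4

# HDFS -> RDBMS (the target table must already exist)
run sqoop export \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username etl_user -P \
  --table PO_SPEND_AGG \
  --export-dir /data/out/po_spend_agg
```

On a real cluster, `run` would simply be removed so the commands execute directly.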
PROJECTS
GE Oil & Gas, Houston, TX
Role: Sr. Tableau/Hadoop Developer
Oct 2014 to Present
GE Oil & Gas provides products ranging from drilling equipment and subsea systems to turbomachinery solutions and downstream processing. IT Sourcing has 23 different SAP, Oracle DB and flat-file systems, which are to be consolidated into one large Greenplum cluster.
Responsibilities:
· Build front-end dashboards with stunning visualizations in Tableau for spend, supplier, P&L and year-to-month analysis based on Oracle and Greenplum DBs, with Drill Down, Drill Through and Drop-down menu options, Parameters and hyperlinked reports.
· Create views and dashboards in Tableau Desktop and publish them to Tableau Server for analytics and customization using filters and actions for business users and the internal team.
· Collaborate with business users to gather requirements for building Tableau reports per business needs.
· Create detailed reports (hyperlinked with filters) for business users who work closely with suppliers for first-hand information.
· Responsible for providing demo to clients on regular basis and help business users on how to
navigate through Tableau dashboards.
· Evaluate data connections for live connection or data extracts based on factors such as data
size, real-time reporting.
· Handled Tableau admin activities such as site creation, adding site admins, users, schedules &
subscriptions, server maintenance, connectivity, backup etc.
· Load, Clean, Enrich data in Greenplum DB using PIG
· Responsible for preparing technical specifications, analyzing functional Specs, development
and maintenance of code.
· Design and development of SQL/PostgreSQL queries/procedures for creation of data extracts
for PO/Receipt/Invoice spends
· Use in depth features of Tableau like Data Blending from multiple data sources to attain better
data analysis.
· Handled installation of additional worker nodes when data increase was anticipated.
· Handle Informatica integration between warehouse and other teams for clean-up and
enrichment of data.
Environment: Tableau Desktop 8.1, Tableau Server 8.1, Oracle (11i & r12), Greenplum, HDFS,
Pig, Informatica, Talend
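The Tableau Server admin/backup routine described above could be scripted roughly as follows. A sketch only: Tableau Server 8.x ran on Windows, so the real job would be a scheduled batch script; the paths and retention window are assumptions, and the commands are echoed rather than run.

```shell
#!/bin/sh
# Hypothetical nightly maintenance for Tableau Server 8.x (tabadmin is its
# admin CLI). Paths/retention are assumptions; commands are echoed/logged.
run() { echo "+ $*"; echo "$*" >> tabadmin_cmds.log; }
: > tabadmin_cmds.log

BACKUP_DIR=/tableau/backups
run tabadmin backup "$BACKUP_DIR/tabserver" -d    # -d appends the date
# prune backups older than 14 days
run find "$BACKUP_DIR" -name 'tabserver*.tsbak' -mtime +14 -delete
run tabadmin cleanup                              # purge old logs/temp files
```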
REI Systems, Sterling, VA
Role: Sr. Hadoop/Tableau Developer
Apr 2013 to Oct 2014
REI works with different government sectors in the space of decision and data analytics, dashboards,
business process, etc.
Responsibilities:
· Worked with the Data Science team to gather requirements for various data mining projects
· Distributing the work between team members and tracking the development progress.
· Interaction with client regarding the progress of the project on regular basis.
· Responsible for development of Pig (involving UDF’s) scripts & Map-Reduce programs for data
cleansing and application of business logic.
· Involved in running Hadoop jobs for processing millions of records of text data.
· Developed Simple to complex MapReduce Jobs using Pig and Hive
· Developed chained MapReduce jobs in java for data cleaning and preprocessing
· Strong understanding on MRV1 and MRV2 execution environments for MapReduce Jobs in
Hadoop.
· Involved in the development of the Hive/Impala scripts for extraction, transformation and
loading of data into other data warehouses.
· Involved in loading data from LINUX file system to HDFS
· Responsible for managing data from multiple sources
· Extracted files from Oracle through Sqoop and placed in HDFS.
· Experienced in running Hadoop streaming jobs to process terabytes of xml format data.
· Created views in Tableau Desktop that were published to internal team for review and further
data analysis and customization using filters and actions.
· Upgraded Tableau Server 7.0 to 8.1 on development, stage and production environments.
· Handled admin activities such as site creation, adding site admins, users, schedules &
subscriptions, server maintenance, connectivity, backup etc.
· Used Hue UI for viewing output of Hive queries.
· Built collections of Hadoop workflow jobs, with several actions arranged in a control dependency,
using Oozie.
· Upgraded Tableau Server 8.1 to 8.2 on development, stage and production environments
in a phased manner.
· Created and maintained Technical documentation for launching HADOOP Clusters and for
executing Pig Scripts and Hive queries
· Responsible for providing demos to the client on a regular basis.
· Responsible for preparing technical specifications, analyzing functional specs, and development
and maintenance of code.
Environment: CDH4 (MRV1 and MRV2), HDFS, Pig, Hive, Impala, MapReduce, Sqoop, LINUX,
Hue, Tableau Desktop, Tableau Server
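The data-cleansing step described above was done with Pig scripts and MapReduce jobs on HDFS; at its core it means dropping malformed records and exact duplicates. A toy shell equivalent on made-up sample data:

```shell
#!/bin/sh
# Toy shell analogue of the Pig/MapReduce cleansing step (sample data made up).
{
  printf '2014-06-01\tlogin\talice\n'
  printf '2014-06-01\tlogin\talice\n'   # exact duplicate
  printf 'bad-record-no-tabs\n'         # malformed row
  printf '2014-06-02\tlogout\tbob\n'
} > events.txt

# keep only well-formed 3-field records, then drop exact duplicates
awk -F'\t' 'NF == 3' events.txt | sort -u > events.clean
cat events.clean
```

The same filter-then-dedupe shape is what the real Pig `FILTER`/`DISTINCT` pipeline expressed at cluster scale.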
Nationwide Insurance Company, Columbus, OH
Role: Hadoop Developer
Feb 2012 to Apr 2013
Nationwide Mutual Insurance Company is a leading provider for Insurance and Financial services in
USA. Nationwide Mutual Insurance Company offers diverse and comprehensive products and
services that provide a strong and stable option for customers.
Responsibilities:
· Worked on Hadoop cluster scaling from 4 nodes in development environment to 8 nodes in
pre-production stage and up to 24 nodes in production.
· Involved in complete Implementation lifecycle, specialized in writing custom MapReduce, Pig
and Hive programs.
· Exported the analyzed data to the relational databases using Sqoop for visualization and to
generate reports for the BI team.
· Extensively used HiveQL queries to search for particular strings in Hive tables in HDFS.
· Performed various performance optimizations such as using distributed cache for small datasets,
partitioning and bucketing in Hive, and map-side joins.
· Developed customized UDFs in Java to extend Hive and Pig Latin functionality.
· Created HBase tables to store various data formats of data coming from different portfolios.
· Managing and scheduling Jobs to remove the duplicate log data files in HDFS using Oozie.
· Used Flume extensively in gathering and moving log data files from Application Servers to a
central location in Hadoop Distributed File System (HDFS).
· Implemented test scripts to support test driven development and continuous integration.
· Responsible to manage data coming from different sources.
· Analyzed the Cassandra database and compared it with other open-source NoSQL
databases to determine which best suited the current requirements.
· Used File System check (FSCK) to check the health of files in HDFS.
· Involved in creating dashboards by extracting data from different sources.
· Created dashboard using parameters, sets, groups and calculations.
· Involved in creating interactive dashboard and applied actions (filter, highlight and URL) to
dashboard.
· Involved in creating calculated fields, mapping and hierarchies.
· Created drill through reports in dashboard.
· Developed the UNIX shell scripts for creating the reports from Hive data.
· Experienced on loading and transforming of large sets of structured, semi structured and
unstructured data.
· Analyzed large amounts of data sets to determine optimal way to aggregate and report on it.
Environment: Hadoop, Java, UNIX, HDFS, Pig, Hive, MapReduce, Sqoop, NoSQL DB’s,
Cassandra, Hbase, LINUX, Flume, Oozie, Tableau Desktop
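The "UNIX shell scripts for creating the reports from Hive data" above can be sketched with awk. In the real job something like `hive -e 'SELECT ...' > claims.tsv` would produce the tab-separated input; the column names and values here are made-up sample data.

```shell
#!/bin/sh
# Sketch of a report script over Hive output (made-up sample input; in
# production a `hive -e` query would generate claims.tsv).
{
  printf 'auto\t1200\n'
  printf 'home\t800\n'
  printf 'auto\t300\n'
  printf 'life\t500\n'
} > claims.tsv

# total per line of business, largest first
awk -F'\t' '{ sum[$1] += $2 }
            END { for (k in sum) printf "%s\t%d\n", k, sum[k] }' claims.tsv \
  | sort -k2,2nr > report.tsv
cat report.tsv
```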
Capital One, Richmond, VA
UNIX, Compuset, Hadoop Developer
Oct 2009 to Feb 2012
Enterprise Fulfillment IT is a generation 2.5 platform sourcing engagement leveraging state-of-the-art
document automation, batch processing and monitoring systems to serve the credit card, bank
and mortgage portfolios of Old National Bank. It is a massive engine that processes millions of events to
generate letters, statements, checks and other outgoing communication while also receiving checks,
remits, online payments and other forms of payment from customers. Built on UNIX AIX and mainframes, the
project also includes aggregation, reconciliation and vendor interaction.
Responsibilities:
· Implemented 60+ minor enhancements in the period, covering customer management,
recoveries, fulfillment management, customer acquisitions, rewards, high-yield accounts and
money market accounts. Changes included the creation of new communication methods, not
limited to letters, statements and checks.
· Represented platform in work identification, scoping, allocation and prioritization meetings
with business users, application owners, architects including VPs.
· Identified several new business streams for the application including online presentation of
certain instruments, simplified handling of change requests, simplified content changes etc.
· Built the application, used by 40+ users, from scratch, identifying user requirements,
technology requirements, and functional and non-functional specifications. Previously, offer
presentation was handled through manual updates.
· Identified application performance parameters using industry standard tools and documented
them for future applications built in the same space.
· Automated all business processes related to message processing viz. offer submission, offer
review, artwork upload, text submission, rules coding, offer approval and final database
· Identified as a major contributor to Customer Acquisitions for work in letters processing.
Worked extensively on Visual Compuset. Built several reusable styles for different needs.
· Interacted with 4 external vendors to set up file transfer and reports. Setup new interfaces for
the application internally and externally. Worked in different transfer protocols including FTP,
SFTP and Connect Direct.
· Acted as the enhancements lead managing 5 resources with work done in both waterfall and
periodically in Agile.
· Loading of data into HDFS, processing of data using PIG scripts and loading of data into Hive.
Environment: UNIX (AIX), KSH, Compuset, Java, HDFS, PIG, Hive
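The HDFS-load, Pig-clean, Hive-load flow in the last bullet might look like the following. The file and script names are hypothetical, and the commands are echoed and logged rather than run, since they require a cluster.

```shell
#!/bin/sh
# Sketch of the HDFS -> Pig -> Hive flow (hypothetical paths and script
# names; commands are echoed and logged, not run, since they need a cluster).
run() { echo "+ $*"; echo "$*" >> etl_cmds.log; }
: > etl_cmds.log

run hadoop fs -put /staging/payments.dat /data/raw/payments
run pig -f clean_payments.pig   # Pig script filters and enriches the raw data
run hive -e "LOAD DATA INPATH '/data/clean/payments' INTO TABLE payments"
```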
Wachovia Bank (Now Wells Fargo), Hyderabad, India
COBOL and DB2 Developer
Sept 2008 – Oct 2009
The project involved handling the ACH payments gateway for Wachovia Bank, enhancing existing code
to support a high volume of child-support payments.
Responsibilities:
· Implementation of convenience check and ACH origination programs for payment processing.
· Resolution of file formatting issues facilitating ACH origination.
· Initiation of specifically formatted ACH addenda records to process a high volume of child
support payments.
· Developed materials in support of operations training relative to check processing and ACH.
· Delivered, on client request, a rapid implementation adding high-volume processing to the ACH
origination process.
Environment: IBM Mainframe OS/390, COBOL, DB2
Ingram Micro, Bangalore, India
EDI Specialist
Dec 2006 – Aug 2008
The project involves maintaining and developing EDI (Electronic Data Interchange) applications that are
continuously requested by internal business units within Ingram. Ingram Micro uses EDI for the majority
of its business communication with customers, so EDI applications form a critical part of the
Ingram IT system. Responsibilities include mostly production support activities: providing solutions to
business user/trading partner queries (Severity 3 and 4) as well as mainframe job failures (Severity 1
and 2). The execution methodology includes Trading Partner Communication and Trading Partner Profile
Setup in Gentran (Mapping).
Responsibilities:
· New EDI implementations for EDI X12 documents such as 850, 855, 810 and 856; test case
execution, build deployment and post-production support. Enhanced existing EDI documents based on
customer requirements.
· Worked on JCL and COBOL component enhancements related to EDI implementations.
· Support to Business, Customer representatives EDI related queries.
· Worked on tickets and job failures raised by business users and clients; responsible for finding the
root cause and resolution of Severity 3 and 4 problems. Most fixes involved COBOL, IMS
and DB2 programs and JCL changes.
· Performed regression testing for the purchase order processing program to make sure
that changes did not affect PO processing, since it was considered very important to the
business.
Environment: IBM Mainframe OS/390, COBOL, DB2 (Platinum, Inter-test etc), IMS, EDI Gentran
ABN-AMRO, Pune, India
Test Engineer
Oct 2005 – Nov 2006
The project involved testing an application built by TCS for ABN-AMRO Bank. The knowledge transition
was done in 3 waves, each wave consisting of a different number of applications. Black-box testing was
done on a front-end system named HAAI, and the screen flow of the system was tested with the help of test
cases.
Responsibilities:
· Understanding the business Requirements and Technical Requirements.
· Preparing the Test Cases according to the requirements.
· Implemented the test cases.
Environment: Test Director



Experience


 

Job Title:  Sr. Tableau/Hadoop Developer

Company:  GE Oil & Gas

Experience:  - Present

 

Additional Info


 

Current Career Level:

Experienced (Non-Manager)

Work Status:

US - I am authorized to work in this country for my present employer only.

 

 

Target Job:

Target Job Title:

Sr. Tableau/Hadoop Developer

 

Target Company:

Company Size:

Occupation:

IT/Software Development

·         General/Other: IT/Software Development

 

Target Locations:

Selected Locations:

US-CA-San Francisco