From:                              route@monster.com

Sent:                               Thursday, September 24, 2015 11:54 AM

To:                                   hg@apeironinc.com

Subject:                          Please review this candidate for: Application

 

This resume has been forwarded to you at the request of Monster User xapeix03

Sushil Saxena 

Last updated:  06/30/15

Job Title:  Not specified

Company:  Not specified

Rating:  Not Rated

Screening score:  Not specified

Status:  Resume Received


Fremont, CA  94536
US

sks.hadoop@gmail.com
Contact Preference:  Email


 

 

RESUME

  

Resume Headline: Hadoop Admin & Architect

Resume Value: qwn3jj74ykb6m5tb   

  

 

SUSHIL KUMAR SAXENA

SKS.Hadoop@gmail.com 510-736-8493 (Cell), Fremont, California

Hands-on Technical Solution/Application Architect and Hadoop Administrator working in Big Data and Cloud technologies for the last 3.5 years, with around 18 years of overall software industry experience. Proven hands-on experience leading the architectural design of large-scale, high-availability Big Data solutions; core skills in Big Data solution architecture and administration, data architecture and modeling, web application development, technical management, and effective handling of large-scale dynamic data sets.

 

Hands-on experience with Big Data solutions includes Hadoop, HDFS, MapReduce, Hive, HBase, MongoDB, Cassandra, SQRRL, Accumulo, Impala, Oozie, Flume, ZooKeeper, YARN, Solr, Kafka, Spark, Sqoop, Ranger, Falcon, and Puppet. Hands-on coding in Python, Java, Linux (Bash), SQL, and PHP.

 

PROFESSIONAL EXPERIENCE

 

Optum Insight - United Health Group, MN/CA, USA    4/2015 – 6/2015

Hadoop Administrator and Architect

 

Worked on healthcare payer data analytics.

 

·   Defined and set up Hive and HDFS security with Ranger, Kerberos, and LDAP

·   Optimized and supported Hive analytics queries.

·   Architected a Hadoop innovation cluster with SQRRL, Spark, Puppet, and HDP 2.2.4

·   Managed a 300+ node HDP 2.2.4 cluster with 3 petabytes of data using Ambari 2.0 on Linux CentOS 6.5

·   Skills used: Hortonworks HDP 2.2.4, Ambari 2.0, Ranger, Kerberos, Hive

 

Quinnox (DatuHealth), USA    1/2015 – 3/2015

Big Data Solution and NoSQL Data Architect, Administrator

 

Worked on healthcare wellness Big Data analytics and data security for a multi-tenant application.

 

·   Created the application architecture for Big Data analytics/ETL using Flume, Talend, Accumulo, and a REST API

·   Designed and developed SQRRL sources, models, and E-R diagrams with cell-level security on the Accumulo NoSQL DB

·   Set up and installed a Hadoop cluster (with YARN/MapReduce) and an enterprise data warehouse

·   Built high-availability (HA) architectures and deployed them with Big Data technologies

·   Set up the platform build on AWS and implemented Kerberos and LDAP security

·   Skills used: WSO2 API Gateway, Talend ESB/ETL, AWS, Liferay, SQRRL, Accumulo, Flume, Kerberos, REST API, Solr, PostgreSQL, Nagios, Ambari, Hortonworks distribution.

 

Symantec, USA    9/2014 – 12/2014

NoSQL DB Solution Architect

 

Big Data solution for data protection, backup, and restore on Amazon Web Services using a NoSQL DB

 

·   Understood the technical opportunities and limitations of the various technologies at our disposal.

·   Determined the viability of a business problem for a Big Data NoSQL database solution and migration.

·   Created, managed, and monitored MongoDB on an AWS-hosted platform with CentOS.

·   Developed a solution for event push/pull using RESTful web services.

 

Linkquest Telecom, India    7/2013 - 9/2014

Big Data Solution / Hadoop Architect & Administrator

 

Telecom Big Data analytics platform with multiple sources of data (devices, portal clickstream, applications, RDBMS)

 

·   Determined the viability of a business problem for a Big Data solution. Defined a logical architecture of the layers and components of the Big Data solution, and selected the right products to implement it.

·   Alongside the immediate architecture work, provisioned, monitored, supported, evolved, and evangelized the chosen technology stack(s).

·   Planned and managed HDFS storage capacity. Advised the team on tool selection, best practices, and optimal processes using Sqoop, Oozie, Hive, HBase, Cassandra, Pig, Flume, and Bash shell scripting.

·   Facilitated access/ETL for large data sets using Pig/Hive/HBase/Python on the Hadoop ecosystem.

·   Installed the OS and administered the Hadoop stack with the CDH5 (with YARN) Cloudera distribution, including configuration management, monitoring, debugging, and performance tuning.

·   Managed Hadoop operations on a multi-node HDFS cluster using Cloudera Manager. Monitored the cluster with Ganglia.

·   Managed massively parallel processing with Impala alongside HBase and Hive.

·   Worked on Pentaho to provide data integration, reporting, data mining, and ETL.

·   Designed a technical solution for real-time analytics using Kafka, Storm, and HBase.

·   Managed data security and privacy with Kerberos and role-based access.

 

Nutrihealth Wellness, India    5/2012 - 6/2013

Big Data Solution Architect

 

Wellness Big Data analytics with a recommendation engine.

 

·   Interacted with customers, business partners, and all stakeholders to understand the business objectives and drive solutions that effectively met client needs.

·   Sketched the Big Data solution architecture, then monitored and governed the implementation.

·   Designed strategies and programs to collect, store, analyze, and visualize data from various sources for specific projects.

·   Worked with Big Data scientists and engineers to produce powerful data processes for real-time analytics and reporting applications, as well as building the necessary hardware and software for the chosen Big Data solution.

·   Devised and led the implementation of the next-generation architecture for more efficient data ingestion and processing; formulated procedures for planning and executing system upgrades for all existing Hadoop clusters.

·   Participated in the development and execution of system and disaster recovery processes, and actively collaborated on all security hardening processes on the cluster.

·   Upgraded the Hadoop cluster from CDH 4.1 to the CDH5 (Hadoop with YARN) Cloudera distribution.

·   Supported the BI data analysts and developers with Hive/Pig development.

 

iProzone Managed Hadoop Cluster, UK  6/2011 - 4/2012

AWS Cloud Hadoop Administrator

 

·   Architected and built a scalable cloud computing environment with Amazon’s EC2.

·   Designed the MongoDB NoSQL database schema.

·   Manage run-time configuration, processes, scaling, backup and recovery, monitoring, and performance tuning for production MongoDB instances.

·   Set up multi-node Hadoop cluster with configuration management/deployment tool (Puppet).

·   Set up cluster monitoring and alerting mechanism using Ganglia, Nagios and Bash Script.

 

iProzone, India    9/2009 - 5/2011

Software Architect

 

·   Led the analysis of deep technical problems and advised on solutions.

·   Designed high-performance internet-facing services by creating scalable, extensible, reliable, maintainable, interoperable systems.

·   Designed modular components of a distributed data processing and management infrastructure.

·   Provided supporting information to the engineers to aid in the creation of functional specifications.

·   Provided technical leadership, working closely with cross-functional engineering teams in an agile software development environment.

·   Conceptualized, designed, and developed a web-based application using PHP, MySQL, and JSON ZingChart. Developed many robust routines and functions in MySQL.

 

World Wide Resilience, Netherlands      8/2008 - 7/2009

Software Architect

 

·   Developed PHP and Java code and participated in code reviews.

·   Designed and developed an innovative recruitment solution (INNOVIRSA) using data points such as time on a page in the assessment, keystrokes, public-domain data, social network data, interaction data during the application, and the resume.

·   Designed and developed web services to expose data.

 

Mamut, Norway 4/2008 - 8/2008

Software Solution Architect

 

·   Managed system analysis and design for the MamutOne SaaS-based cloud application.

·   Created data models from conceptualization to database optimization.

 

JK Technosoft, India    6/2006 - 3/2008

Sr. Technical Solution Architect / Data Architect

 

·   Oversaw technical solution delivery with a team of 45.

·   Designed solutions for data management initiatives: master data management, data warehousing, business intelligence, event processing, business rules management, and compliance reporting.

·   Designed and developed SSIS and SSAS services using SQL Server 2008. Created data models in ER/Studio.

·   Worked as solution designer and architect for a SaaS system. Developed a PACS PoC using Mirth, OpenMRS, JSP, and Java.

·   Set up visualization and reporting on SharePoint using web parts developed in ASP.NET/C#.

 

Heartland Hospital, USA 6/2004 - 6/2006

Sr. Product Engineer / Application Architect/ Data Architect

 

·   Worked across multiple phases of development (SDLC) within a project as a data solution architect, providing a holistic view and focusing on the coordination and integration of individual components.

·   Designed and developed an SOA-based Electronic Medical Record System (EMR) and Hospital Revenue Cycle Management (eHIMS): HIPAA (EDI)- and HL7-compliant healthcare products using Java, J2EE, .NET, and Oracle.

·   Set up enterprise data standards, policies, and data development, including data modeling.

·   Worked within the overall enterprise architecture team. Designed the system using Mirth messaging in an SOA architecture.

 

First Data (FDMS), USA    1/2003 - 4/2004

Principal Data Architect / DBA

 

·   Created a scalable data architecture, design, and model to handle and analyze millions of events daily.

·   Maintained MS SQL Server; developed stored procedures and views for optimized data extraction and processing.

·   Worked on the Cognos data warehouse for OLAP cube development, data visualization, and reporting.

·   Developed ETL processes (using SQL scripts and DTS) for data sources from mainframe and legacy systems.

·   Designed large data warehouses with data-model design techniques such as star schema and snowflake schema.

 

General Motors (GMAC), USA 12/2001 - 12/2003

Data Architect / Database Administrator

 

·   Created data models, from conceptualization to database optimization, extending to SQL development and administration.

·   Administered Oracle databases on the Linux platform, developing shell scripts and automating tasks with them.

·   Managed server backup and security, performance tuning and capacity planning, operations, and troubleshooting.

·   Developed code in Java.

 

Ventaso, Inc., USA    4/2000 - 7/2001

Sr. Data Architect / Database Administrator

 

·   Designed database tables and related objects; reviewed software engineers’ database object designs.

·   Established T-SQL programming best practices for use by software engineers during code reviews.

·   Provided reports on database management system performance for MS SQL Server and Oracle 8.0.

·   Designed and developed scripts for ETL and DTS packages. Administered UNIX and Windows servers.

 

Government Departments, India    8/1996 - 2/2000

Software Developer

 

EDUCATION and CERTIFICATION

·   Post Diploma in Computer Programming, 1987 – AMU, India

·   Master of Science (Math with Computer), 1986 – Delhi University, India

·   Certified in MongoDB, Cassandra, PMP, Unix Admin (SCSA), Oracle (OCP), Microsoft (MCSE), Cisco (CCNA)

·   IBM Big Data University Certificate of Achievement in HBase, Hive, and Oozie



Experience


 

Job Title

Company

Experience

Hadoop Admin & Architect

Unitedhealth Group Incorporated

- Present

 

Additional Info


 

Current Career Level:

Experienced (Non-Manager)

Date of Availability:

Immediately

Work Status:

US - I am authorized to work in this country for my present employer only.

Active Security Clearance:

None

US Military Service:

Citizenship:

None

 

 

Target Job:

Target Job Title:

Hadoop Admin & Architect

 

Target Company:

Company Size:

Occupation:

IT/Software Development

·         Database Development/Administration

·         Software/System Architecture

 

Target Locations:

Selected Locations:

US-CA-Oakland/East Bay

Relocate:

No

Willingness to travel:

Up to 25% travel

 

Languages:

Languages

Proficiency Level

English

Fluent