From:                              route@monster.com

Sent:                               Thursday, September 24, 2015 11:55 AM

To:                                   hg@apeironinc.com

Subject:                          Please review this candidate for: Application

 

This resume has been forwarded to you at the request of Monster User xapeix03

Mohan Bandaru

Last updated:  12/15/14

Job Title:  not specified

Company:  not specified

Rating:  Not Rated

Screening score:  not specified

Status:  Resume Received


 

 

RESUME

  

Resume Headline: Mohan Bandaru

Resume Value: gg6hsfgdp2rv2s7v   

  

 

 

Mohan Bandaru

Hyderabad, India

 

Mobile: +91 9642271508                                                        Email: bandarumohan89@gmail.com

 

                                Professional Summary

A competent professional with over 4 years of IT experience, with extensive work in the area of integration. Worked in an R&D team on product development in the power utility sector. Spearheaded the design and development of mission-critical systems based on Spring MVC and Hibernate, and has experience working with MVC architecture using the Spring Framework. Nearly 4 years of working experience with the Pentaho BI Suite and Mule ESB, covering service-oriented architecture, enterprise application integration and application development.

 

·  Extensively worked on both the Enterprise and Community editions of Mule ESB.

·  Developed a custom logging framework for Mule ESB applications, with log analysis performed using Logstash, Elasticsearch and Kibana.

·  Implemented clustering and throttling for Mule-based solutions.

·  Formulated the build and deployment process for Mule projects, both on-premise and in the cloud, using Maven and the CloudHub CLI tool.

·  Experienced in developing and deploying Mule projects on CloudHub.

·  Used MUnit for mocking and unit testing Mule flows.

·  Experienced in using the enterprise security and notification features provided by Mule.

·  Good hands-on experience using Mule connectors such as FTP, File, SFTP, IMAP, Salesforce and NetSuite as part of integration work.

·  Good hands-on programming experience developing web-based and client-server applications using Java, J2EE, JSP, Servlets, Spring, JDBC, JMS and JNDI.

·  Hands-on experience across the complete software development life cycle; strong in developing specifications, functional and technical requirements, and process flows, with extensive use of OOAD concepts.

·  Experienced with Model View Controller (MVC) architecture and the ZK framework.

·  Experienced in technologies such as Mule ESB, WSO2 ESB, HornetQ, Hazelcast, ActiveMQ, XML and Log4j, with good experience developing and deploying web-based applications on application servers such as WebSphere 1.7 and JBoss 6.

·  Migrated an application from JSP/Servlets to the ZK (Zkoss) framework.

·  Tightly integrated Mule ESB with different applications.

·  Integrated Open Flash Chart components into the application to generate graphs for analytics.

·  Hands-on experience with web services for publishing and consuming data.

·  Worked on role-based authentication and successfully integrated it into the application.

·  Integrated generated Pentaho BI reports into the application.

·  Worked on the tetra data Quartz scheduler.

·  Profound experience using ActiveMQ and HornetQ message queues along with the Hazelcast server.

·  Proficient in designing reports using the Pentaho Report Designer.

·  Transformed data using Kettle, the Pentaho Business Intelligence ETL tool.

·  Strong communication, presentation, analytical and problem-solving skills.

·  Worked with the Smooks framework integrated with Mule ESB and WSO2 ESB.

·  Successfully deployed the application on the GlassFish server, with good hands-on experience clustering GlassFish.

·  Worked on different databases including Oracle, DB2, MS SQL and Derby.

 

 

                                 Professional Career

 

Experience    Company Name              Designation               Duration

1 year        White Sky Labs (current)  Integration Consultant    Jan 2014 to date

3.1 years     Phoenix IT Solutions      Software Engineer         Nov 2010 to Dec 2013

 

 

 

 

 

 

 

                            Certifications & Achievements

 

1)      Certified Associate MuleSoft Developer.

2)      Received a Letter of Appreciation for implementing the project at DABS (Da Afghanistan Breshna Sherkat) in Afghanistan under USAID while working at Phoenix IT Solutions.

3)      Part of the team that worked toward receiving IBM safe validation certification for the product ‘Meter Data Management System’ at Phoenix IT Solutions.

 

 

                                 Professional Experience

 

Key Projects and Products handled:

 

Integration Development Team

Integration Consultant,

White Sky Labs,

Jan 2014 to date

 

Project: AMBU CRM Integration

Client: AMBU, Denmark

Role: Integration Consultant

Nov 2014 to date

 

Description:

 

The Mule integration layer exposes a suite of stateless services for the creation, update and association of the entities, which facilitates orchestration and weaving of business processes by Ambu. These services are invoked from an application called the Sales Tracing Tool (STT).

 

The integration layer exposes a suite of RESTful web services to facilitate the creation and update of accounts, sales order headers and sales order lines, and the association of the various entities with the book entity.
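For illustration only, below is a minimal sketch of the kind of stateless create/update service described above, written as a plain JAX-RS resource rather than the actual Mule configuration; the SalesOrderResource class, the /salesorders path and the placeholder identifier are hypothetical and not taken from the project.

    import javax.ws.rs.Consumes;
    import javax.ws.rs.POST;
    import javax.ws.rs.PUT;
    import javax.ws.rs.Path;
    import javax.ws.rs.PathParam;
    import javax.ws.rs.Produces;
    import javax.ws.rs.core.MediaType;
    import javax.ws.rs.core.Response;

    // Hypothetical stateless resource illustrating the create/update style of service
    // exposed by the integration layer; entity names and paths are examples only.
    @Path("/salesorders")
    public class SalesOrderResource {

        @POST
        @Consumes(MediaType.APPLICATION_JSON)
        @Produces(MediaType.APPLICATION_JSON)
        public Response createOrderHeader(String orderJson) {
            // In the real integration layer this would call the backend system;
            // here we echo an identifier to show the stateless request/response shape.
            String createdId = "SO-0001"; // placeholder id
            return Response.status(Response.Status.CREATED)
                    .entity("{\"id\":\"" + createdId + "\"}")
                    .build();
        }

        @PUT
        @Path("/{id}/lines")
        @Consumes(MediaType.APPLICATION_JSON)
        public Response updateOrderLines(@PathParam("id") String id, String linesJson) {
            // Update/association of sales order lines with the given header id.
            return Response.ok().build();
        }
    }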

 

 

Responsibilities:

 

·   Implemented a RAML-based approach to expose APIs via Mule API Manager.

·   Handled secured web service calls.

·   Transformed requests into payloads for secured web service calls using Mule DataMapper and XSLT.

·   Worked in sprint-based implementation.

·   Wrote MUnit test cases for end-to-end testing and implemented mocking during testing.

·   Implemented exception handling for error-prone conditions.

·   Provided an application tracing utility for analyzing Mule logs, exposing complete Mule transactions via a UI developed in Flex.

·   Handled DB data interactions with JSON input provided to DB stored procedures.

·   Contributed to writing the design document.

·   Implemented a cluster-based deployment approach.

Environment:

Mule API Manager, Mule ESB, Mule Anypoint Studio, Mule runtime 5.2, Spring, Java, SOAP, REST, JUnit, MUnit, Maven, JIRA, GitHub, Flex, Logstash, Elasticsearch.

 

Project: Salesforce-Xero Integration

Client: Internal Project (White Sky Labs)

Role: Integration Consultant

Oct 2014 to Nov 2014

Description:

Integrated the Salesforce application with the Xero accounting software using Mule ESB. Configured push topics to listen for opportunity events such as opportunity creation and close-won, and created a customer contact in Xero when an opportunity is won.

 

Responsibilities:

 

·   Developed a Mule connector to interact with the Xero system.

·   Gathered requirements and planned the integration of both SaaS applications using Mule ESB.

·   Configured push topics on the Salesforce application and listened to them for events at the Mule ESB flow level.

·   Implemented exception handling for error-prone conditions.

·   Configured business events in Mule flows to track each request and its statistics in CloudHub.

·   Defined a Mule flow to retrieve all case-closed opportunities and place the order in NetSuite.

Environment: Mule ESB, Mule Anypoint Studio, CloudHub (Dec-2013 Mule runtime), Salesforce, NetSuite, MUnit, JUnit, Java, Spring, Maven, JIRA, GitHub.

 

 

Project: MTUDDA

Client: MTU Detroit Diesel

Role: Integration Consultant

April 2014 to Sep 2014

 

Description:

 

MTU Detroit Diesel currently integrates with Oracle E-Business Suite (OEBS) using either custom interfaces or the iSupplier portal. The project is aimed at provisioning an effective integration platform that simplifies and automates the processing of DTNA invoice files, DTNA purchase order (PO) files and Advanced Shipping Notification (ASN) messages from DTNA, as well as interfacing with FMS. The program is proposed to be delivered in two phases, with Phase 1 comprising interfacing with FMS, processing of ASNs for indirect shipments from DTNA, processing of the C2ID56 file from DTNA, processing of milestones for DTNA indirect shipments, and ASN file generation for DTNA indirect shipments. Phase 2 comprises transmission of PO files to DTNA and processing of invoices from DTNA.

 

Responsibilities:

 

·   Developed Mule flows based on the design.

·   Explored almost all Mule components and designed the integration components that gave the best results in terms of performance.

·   Implemented exception handling for error-prone conditions.

·   Configured business events in Mule flows to track each request and its statistics in CloudHub.

·   Worked on connectors such as SMTP, SFTP, FTP and IMAP, overriding the existing functionality to suit the requirements.

·   Handled DB data insertion by constructing SQL Struct objects and calling DB stored procedures (see the sketch after this list).

·   Contributed to writing the design document.

·   Implemented custom business events for tracking in CloudHub.
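A minimal JDBC sketch of the stored-procedure bullet above: constructing a SQL Struct and passing it to a database stored procedure. The connection URL, the INVOICE_REC object type and the LOAD_INVOICE procedure are hypothetical placeholders, not the actual project objects.

    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Struct;

    public class InvoiceLoader {
        public static void main(String[] args) throws Exception {
            // Hypothetical connection details; the real project used the DB configured in Mule.
            try (Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/ORCL", "app_user", "app_pwd")) {

                // Build a SQL STRUCT matching a user-defined object type (INVOICE_REC is an example name).
                Struct invoiceRec = con.createStruct("INVOICE_REC",
                        new Object[] { "INV-1001", "DTNA", 2500.00 });

                // Call a stored procedure that accepts the struct (LOAD_INVOICE is an example name).
                try (CallableStatement cs = con.prepareCall("{call LOAD_INVOICE(?)}")) {
                    cs.setObject(1, invoiceRec);
                    cs.execute();
                }
            }
        }
    }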

Environment:

Mule ESB, Mule Anypoint Studio, CloudHub (Dec-2013 Mule runtime), BoneCP, BeanIO, Spring, Java, SOAP, REST, JUnit, MUnit, Maven, JIRA, GitHub.

 

Project: Salesforce-NetSuite Integration (a pre-sales project)

Client: MuleSoft Inc.

Role: Integration Consultant

March 2014 to April 2014

Description:

A pre-sales demo project working with the Salesforce Streaming API and its integration with the NetSuite application. Integrated the Salesforce application with NetSuite using Mule ESB. Configured push topics to listen for opportunity events such as opportunity creation and close.

 

Responsibilities:

 

·   Gathered requirements and planned the integration of both SaaS applications using Mule ESB.

·   Configured push topics on the Salesforce application and listened to them for events at the Mule ESB flow level.

·   Implemented exception handling for error-prone conditions.

·   Configured business events in Mule flows to track each request and its statistics in CloudHub.

·   Defined a Mule flow to retrieve all case-closed opportunities and place the order in NetSuite.

Environment:

                   Mule ESB, Mule Anypoint Studio, CloudHub (Dec-2013 Mule runtime), Salesforce, NetSuite, MUnit, JUnit, SOAP, Java, Spring, Maven, JIRA, GitHub.

 

 

 

Product Development Team

Software Engineer,

Phoenix IT Solutions,

Nov 2010 to Dec 2014

 

 

The Meter Data Management System handles the acquisition of data from meters. It is a central data repository in which all received meter data is maintained. The data is cleansed of junk values: validations are performed on the data and certain algorithms are applied for data verification. Once the data is validated, analysis and reporting are done per the client's requirements. The validated data is also used by other modules, such as Billing and Metering, for generating bills. These validations mainly focus on the load profile and billing data obtained, to find anomalies within the data.

 

Responsibilities:

·   Interacted with the utility client in finalizing the requirements.

·   Worked on on-screen alarm notifications along with email/SMS notifications based on configuration.

·   Successfully handled large data volumes through the application.

·   Integrated third-party applications with Mule ESB and Hazelcast.

·   Generated graphs and reports for the Dashboard and Reporting modules.

·   Integrated Mule ESB with the application for capturing data.

·   Created users and assigned roles.

·   Integrated the workflow rule engine (jBPM) for performing the validation process.

·   Also integrated WSO2 ESB with the application for capturing data, providing the same functionality as Mule ESB.

 

Project: AEC - MDMS

Client: AEC (Advanced Electronics Company)

Role: Software Developer

August 2012 to date

Description:

 

AEC is a private-sector company serving local, national, regional and international clients within the defense, telecom and manufacturing industries. The AEC charter includes the design, development and manufacturing of electronic products and systems, along with the provision of upgrades and logistical support.

 

Responsibilities:

·   Interacted with the utility client to gather requirements and convert them into technical specifications/artifacts.

·   Worked on role-based authentication and successfully integrated it into the application.

·   Designed and developed application login screens with single sign-on using HTML, JSP, Servlets and JavaScript over Pentaho BI.

·   Implemented Spring MVC architecture and increased modularity by separating cross-cutting concerns using Spring AOP (see the sketch after this list).

·   Generated graphs and reports for the Dashboard and Reporting modules.

·   Scheduled Kettle transformations for periodic execution at specified time intervals using Mule ESB.

·   Integrated Mule ESB with the application for capturing data from a JMS topic.

·   Integrated Spring with Mule ESB, with substantial experience publishing and consuming data using web services.

·   Performed data migration using ETL, designed reports using the Pentaho Report Designer and integrated them with the application.
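A minimal sketch of the kind of cross-cutting concern handled with Spring AOP in the bullets above; the aspect name and the package in the pointcut are hypothetical examples, not the project's actual classes.

    import org.aspectj.lang.ProceedingJoinPoint;
    import org.aspectj.lang.annotation.Around;
    import org.aspectj.lang.annotation.Aspect;

    // Hypothetical timing/logging aspect applied around service-layer methods.
    @Aspect
    public class ServiceTimingAspect {

        @Around("execution(* com.example.mdms.service..*(..))")
        public Object logExecutionTime(ProceedingJoinPoint pjp) throws Throwable {
            long start = System.currentTimeMillis();
            try {
                return pjp.proceed(); // run the actual service method
            } finally {
                long elapsed = System.currentTimeMillis() - start;
                System.out.println(pjp.getSignature() + " took " + elapsed + " ms");
            }
        }
    }

In a Spring XML configuration this aspect would be declared as a bean and enabled with <aop:aspectj-autoproxy/>, keeping the timing/logging concern out of the business classes.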

 

Environment: J2EE, Hibernate 3.0, Spring MVC, JBoss 6, Java, JDBC, JPQL, UML, HTML, JavaScript, CSS, XML, jQuery, Log4j, ZK, HornetQ, Mule ESB, Pentaho BI, Quartz 1.8, GlassFish.

 

Project: KESIP (Kabul Electricity Service Improvement Program)

Client: Da Afghanistan Breshna Sherkat (DABS)

Operating System: Red Hat Linux 6

Role: Software Developer

August 2011 to July 2012

Description:

 

The Government of the Islamic Republic of Afghanistan, with the assistance of USAID, started a challenging project for the commercialization and management of the Kabul Electricity Directorate and the newly formed national electricity utility Da Afghanistan Breshna Sherkat (DABS). The main goal of the project is to commercialize the electricity distribution services in Kabul and build the capacity of DABS to operate on a full cost-recovery basis. USAID, the World Bank and other international donors have worked closely with Afghanistan's government to establish DABS as a commercialized public entity to increase the availability of electricity for Afghan homes and businesses. USAID has invested $28 million in DABS and pledged another $20 million during the next two years as part of its overall $1.7 billion energy sector program in Afghanistan.

 

Responsibilities:

 

·   Worked on the dashboard, reports and configuration modules.

·   Captured data from FTP using Mule ESB and performed the validation process by holding the read XML data in a staging database table and finally moving the validated data to the transactional database table.

·   Developed reusable xaction components for emailing reports based on report input parameters.

·   Worked on the Pentaho Quartz scheduler framework for scheduling reports for automatic generation of periodic data views (daily, weekly, monthly, yearly); see the scheduling sketch after this list.

·   Worked on jBPM workflow for the data validation process, applying validation algorithms along with editing of invalid data and estimation of missing data.

·   Developed a role-based authentication module by creating roles for login users and limiting access to the application.

·   Integrated the workflow rule engine (jBPM) with Mule ESB for performing the validation process.

·   Developed web services for publishing and consuming data.
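A minimal sketch, in the spirit of the scheduler bullet above, of report scheduling with the Quartz 1.8 API; the job class, group names and cron expression are hypothetical and stand in for the actual Pentaho report jobs.

    import org.quartz.CronTrigger;
    import org.quartz.Job;
    import org.quartz.JobDetail;
    import org.quartz.JobExecutionContext;
    import org.quartz.Scheduler;
    import org.quartz.impl.StdSchedulerFactory;

    public class DailyReportScheduler {

        // Hypothetical job that would trigger report generation.
        public static class ReportJob implements Job {
            public void execute(JobExecutionContext context) {
                System.out.println("Generating daily report...");
            }
        }

        public static void main(String[] args) throws Exception {
            Scheduler scheduler = new StdSchedulerFactory().getScheduler();
            // Quartz 1.x style: construct JobDetail and CronTrigger directly.
            JobDetail job = new JobDetail("dailyReport", "reports", ReportJob.class);
            CronTrigger trigger = new CronTrigger("dailyReportTrigger", "reports", "0 0 6 * * ?");
            scheduler.scheduleJob(job, trigger);
            scheduler.start();
        }
    }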

 

Environment: J2EE, Hibernate 3.0, Spring MVC, JBoss 4, Java, JDBC, JPQL, UML, HTML, JavaScript, CSS, XML, jQuery, Log4j, ZK, ActiveMQ, Hazelcast, Mule ESB, Pentaho BI, Quartz 1.8, jBPM.

 

Project: TANGEDCO - MDMS

Client: TNEB (Tamil Nadu Electricity Board)

Role: Senior Software Engineer

April 2013 to date

Description:

 

For utilities, a Meter Data Management System is an overall strategy, or process, for building decision-support systems and environments that support both everyday tactical decision-making and long-term business strategy. A data warehouse is designed to manage historical data that does not get updated once it is processed into the model.

 

MDMS is a data warehouse primarily designed for storing and managing vast amounts of historical interval meter data (5-minute, 15-minute, etc.), billing data and load profile data, along with the ancillary information needed to effectively analyze and report useful information.

 

An MDMS implementation positions a utility to utilize an enterprise-wide meter data store to link information from diverse sources and make the information accessible for a variety of user purposes, such as monitoring system performance, analysis and historical operational reporting.

 

Responsibilities:

 

·   Involved in formulating and preparing the design and to-be-process documents based on R-APDRP standards for the utility.

·   Interacted with the utility client to gather requirements and convert them into technical specifications/artifacts.

·   Integrated with the billing system through web services.

·   Developed an integration component with a third-party application using ESB and ActiveMQ JMS.

·   Designed and developed application login screens with single sign-on using HTML, JSP, Servlets and JavaScript over Pentaho BI.

·   Implemented Spring MVC architecture and increased modularity by separating cross-cutting concerns using Spring AOP.

·   Worked on the data acquisition system for real-time data on request.

·   Scheduled Kettle transformations for periodic execution at specified time intervals using Mule ESB.

·   Integrated Mule ESB with the application for capturing data from FTP.

·   Integrated Spring with Mule ESB, with substantial experience publishing and consuming data using web services.

·   Performed data migration using ETL, designed reports using the Pentaho Report Designer and integrated them with the application.

·   Worked on system performance.

·   Worked on JPivot and OpenI OLAP aggregation.

 

Environment: J2EE, Hibernate 3.0, Spring MVC, JBoss 6, Java, JDBC, JPQL, UML, HTML, JavaScript, CSS, XML, jQuery, Log4j, ZK, ActiveMQ, Mule ESB, Pentaho BI, Quartz 1.8, Maven.

Project: PMS (PowerSync Middle-tier System)

Client: Raagaa Value Add Technologies

Role: Middleware System Developer

July 2013 to Aug 2013

Description:

 

PowerSync is a Magento extension that provides a simple and flexible way of integrating Magento orders, carts, products, order notes and customers with native Salesforce objects. Synchronization takes place in real time, eliminating the wait period for changes to appear in either system.

 

The goal of this project is to build a middle-tier system that acts as the integration layer between systems like Magento and Salesforce, and also to offload features such as data mapping from the client to the server.

 

PMS (PowerSync Middle-tier System) is an integration system that exposes integration points to seamlessly integrate with different customer systems, on-premise as well as in the cloud.

The objective of the design is to develop a new PMS system that is a highly robust, secure, reliable and scalable web-based application.

 

The PMS system mainly comprises the following modules: Customer Account Sync Service, Product Sync Service, Order Sync Service, Custom Settings/Data Mapping Service, Data Backup Service and Customer Account Management Service.

 

 

Responsibilities:

 

·   Involved in formulating and preparing the middleware design.

·   Worked with RESTful web services in both consuming and subscribing scenarios.

·   Designed a main flow that exposes an HTTP endpoint to the external system to receive the request, a router that analyzes the request message header and routes the request to the appropriate child data-sync flow, and finally a connector to the target system (see the routing sketch after this list).

·   Parsed the JSON data in the message payload of requests made to the system via web services and saved the data at the staging level.

·   Formulated a router component that inspects the payload header action field and routes the request to the appropriate child flow; the main flow passes the JSON message to the child flow.

·   Interacted with the client to gather requirements and convert them into technical specifications/artifacts.

·   Developed an integration component with a third-party application using ESB and ActiveMQ JMS. Implemented Spring MVC architecture and increased modularity by separating cross-cutting concerns using Spring AOP.

·   Built Kettle transformations for saving data from the JMS queue into staging tables at the middle tier, integrated with Apache Camel.

·   Integrated JMS with Apache Camel, with substantial experience publishing and consuming data using web services.
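A minimal Apache Camel sketch of the routing described above: consume the JSON message from a JMS queue and route on an "action" header to child routes. The broker URL, queue name, header values and child endpoints are hypothetical placeholders, not the project's actual configuration.

    import org.apache.activemq.camel.component.ActiveMQComponent;
    import org.apache.camel.CamelContext;
    import org.apache.camel.builder.RouteBuilder;
    import org.apache.camel.impl.DefaultCamelContext;

    public class PmsRouteSketch {
        public static void main(String[] args) throws Exception {
            CamelContext context = new DefaultCamelContext();
            // Register an embedded ActiveMQ broker for the sketch.
            context.addComponent("activemq",
                    ActiveMQComponent.activeMQComponent("vm://localhost?broker.persistent=false"));
            context.addRoutes(new RouteBuilder() {
                @Override
                public void configure() {
                    // Main flow: receive the message and route on the hypothetical 'action' header,
                    // mirroring the router component described above.
                    from("activemq:queue:pms.inbound")
                        .choice()
                            .when(header("action").isEqualTo("accountSync"))
                                .to("direct:accountSync")
                            .when(header("action").isEqualTo("orderSync"))
                                .to("direct:orderSync")
                            .otherwise()
                                .to("direct:unhandled");

                    // Child flows (stubs) that would persist to staging tables.
                    from("direct:accountSync").log("Account sync payload: ${body}");
                    from("direct:orderSync").log("Order sync payload: ${body}");
                    from("direct:unhandled").log("Unhandled action: ${header.action}");
                }
            });
            context.start();
            Thread.sleep(60000); // keep the context running briefly for the sketch
            context.stop();
        }
    }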

 

Environment: Spring Data JPA, Spring MVC, Java, JDBC, JPQL, Log4j, ActiveMQ, JMS, Apache Camel, Quartz 1.8, Maven, REST.

 

 

 

                          Skill Set

Business Intelligence Tools:    Pentaho Kettle, Pentaho Report Designer, Pentaho Schema Workbench

Web Technologies:               XML, JS, jQuery with AJAX

APIs:                           Collections, Reflection, Multithreading, Design Patterns, JAXB

J2EE Technologies:              Servlets 2.0, JSP 2.0, JDBC

Frameworks:                     ZK, Spring 3.x

ESB:                            Mule, Apache Camel, WSO2

Enterprise Integration:         RESTful, SOAP, WSDL, JMS, JPA

Development Tools:              TOAD for Oracle, TOAD for DB2, DB Solo, SQL Developer, Eclipse

Languages:                      Java 6.0, SQL, HTML, XML

Databases:                      Oracle 10g, 11gR1, 11gR2, IBM DB2, MS-SQL

Build Tools:                    Maven, Mule Studio, Anypoint Studio and Fuse IDE

Operating Systems:              Linux, Windows

Web and Application Servers:    JBoss 6, WebSphere 1.7, GlassFish

CloudHub:                       Mule CloudHub runtime

 

 

         Education

 

2006 – 2010:

 

Bachelor of Technology: Viswanadha Institute of Technology and Management, Visakhapatnam, India with an aggregate of 71.00%

 

2004-2006:

 

Intermediate: Narayana Junior College, Visakhapatnam, India with an aggregate of 85%

 

2003-2004:

 

SSC:  Nalandha Talent School, Visakhapatnam, India with an aggregate of 80%

 

         Personal Details

 

Name: Mohan Bandaru

 

DOB: June 3rd, 1989

 

Father's Name: B. Mutyalu

 

Place of Birth: Visakhapatnam

 

Phone No: 09642271508

 

Marital Status: Single

 

 



Additional Info


 

Current Career Level:

Experienced (Non-Manager)

Work Status:

US - I require sponsorship to work in this country.

 

 

Target Company:

Company Size:

 

Target Locations:

Selected Locations:

US