
Naresh Nalluri - CV

Published on: Mar 3, 2016
Source: www.slideshare.net


Transcripts - Naresh Nalluri - CV

NARESH NALLURI
Software Research and Development – Senior ETL Consultant, Hadoop
Mobile: +65-94597839 | Email: nati689@gmail.com

PROFILE
• IT professional with 7 years of experience in IBM DataStage & QualityStage 7.x/8.x/9.x and 8 months of experience in Pentaho Data Integration (Kettle) and data modeling.
• Working knowledge of other ETL tools: Talend, Ab Initio, Informatica PowerCenter, MSBI and ODI.
• Knowledge of Big Data: Hadoop (HDFS, MapReduce, Hive, Pig, Flume, Sqoop) and Hadoop integration with ETL tools (Pentaho).
• Knowledge of relational databases: Oracle 8i/9i/10g/11g/12c, DB2 8.1/9.7, and SQL/PL/SQL.
• Exposure to the full project life cycle, from conception through implementation.
• Experienced in Windows, UNIX and AIX development environments.
• Outstanding communication, presentation and problem-solving skills.
• Seeking a challenging career as a Software Engineer with a high-growth organization in the IT sector.

TECHNICAL SKILLS
ETL Tools: IBM DataStage & QualityStage 7.x/8.x/9.x, Pentaho Data Integration (Kettle), Talend, Informatica PowerCenter, Ab Initio
Programming Languages: Oracle SQL/PL/SQL
Databases: Oracle 8i/9i/10g/11g/12c, DB2 8.1/9.7, Netezza, PostgreSQL 9.4
Platforms: Microsoft Windows, UNIX, Linux and AIX
Data Modeling: Enterprise Architect, Oracle JDeveloper
Reporting Tool: Cognos 10.2.2

EDUCATION
• M.Sc (Hons.) Biological Sciences & B.E (Hons.) Civil Engineering, Birla Institute of Technology and Science (BITS), Pilani – July 2002 to June 2007

PROFESSIONAL EXPERIENCE
• Database and ETL Developer – Sopra Steria Asia Pte Ltd, Singapore
  Designation: Database Developer | Duration: 8 months (Feb 2015 – till date) | Projects: iFOS
• Senior ETL Consultant – M1 Limited, Singapore (through Emerio GlobeSoft Pte Ltd)
  Designation: Senior ETL Consultant | Duration: 23 months (Mar 2013 – Jan 2015) | Projects: BnCC
• ETL Consultant – Citibank, Singapore (through Emerio GlobeSoft Pte Ltd)
  Designation: ETL Consultant | Duration: 15 months (Nov 2011 – Feb 2013) | Projects: IVR Datawarehouse
• DataStage Developer – IBM, Kuala Lumpur, Malaysia (through i2s Enterprise Solutions Sdn Bhd)
  Designation: Software Engineer | Duration: 12 months (Nov 2010 – Oct 2011) | Projects: IT Bancassurance Project, BDW Upgrade and Migration
• DataStage Developer and Tester – IBM India Pvt Ltd, Hyderabad, India
  Designation: Application Programmer | Duration: 13 months (Sep 2009 – Oct 2010) | Projects: IPS Test Automation
• DataStage Developer – Tech Mahindra, Hyderabad, India
  Designation: Developer | Duration: 23 months (Oct 2007 – Sep 2009) | Projects: CVX-Project Lynx

PROJECTS

February 2015 – till date
Project Title: iFOS (Integrated Field Operations System); Hong Kong Transport System
Client: NEA, Singapore
Team Size: 10
Technology: Pentaho Data Integration (PDI), Cognos 10.2.2, Enterprise Architect, Oracle JDeveloper, Oracle 12c, Windows, PostgreSQL
Project Description: NEA is the National Environment Agency in Singapore. The project covers the development of a new mobile and desktop application called iFOS, the migration of inspection records (food hygiene, port health, sanitation, vector control, etc.) from the legacy systems to the new system, and the development of reports for end users.
Responsibilities:
• Developed the data model for iFOS.
• Database design and development.
• Data migration from the legacy systems to the new system using Pentaho Data Integration (a simplified sketch of this kind of load follows below).
• Installed and configured the Cognos server.
• Developed reports using Cognos.
• Developed ETL scripts using Pentaho Data Integration to load data from Excel into PostgreSQL for another project, Hong Kong Transport System.
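For illustration only: the actual migration was built as Pentaho Data Integration transformations, but the underlying pattern (stage the extracted legacy records, then load only new rows into the new PostgreSQL schema so the job can be re-run) can be sketched in plain SQL. The table and column names below are hypothetical, not the iFOS schema.

  -- Hypothetical staging table mirroring the flat extract from the legacy system / Excel sheet.
  CREATE TABLE IF NOT EXISTS stg_inspection (
      legacy_id       VARCHAR(30),
      inspection_type VARCHAR(50),   -- e.g. food hygiene, port health, vector control
      premise_code    VARCHAR(20),
      inspected_on    DATE,
      inspector_name  VARCHAR(100),
      remarks         TEXT
  );

  -- Load from staging into the target table of the new schema, skipping rows
  -- already migrated so the job stays re-runnable (works on PostgreSQL 9.4).
  INSERT INTO ifos_inspection (legacy_id, inspection_type, premise_code,
                               inspected_on, inspector_name, remarks)
  SELECT s.legacy_id, s.inspection_type, s.premise_code,
         s.inspected_on, s.inspector_name, s.remarks
  FROM   stg_inspection s
  WHERE  NOT EXISTS (SELECT 1
                     FROM   ifos_inspection t
                     WHERE  t.legacy_id = s.legacy_id);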
March 2013 – January 2015
Project Title: BnCC
Client: M1 Limited, Singapore
Team Size: 11
Technology: IBM DataStage 8.5, Oracle 11g, UNIX
Project Description: M1 is a telecom company in Singapore whose billing system is called Arbour. The source systems Arbour and Proximity are being upgraded to Comverse and WebPlus. The EDW has around 400 DataStage server jobs that load data into its tables, so the ETL design has to be changed in line with the source-system changes.
Responsibilities:
• Developed Excel reports using SQL based on user requests.
• Developed and modified DataStage jobs according to the changes in the source systems.
• Supported BAU activities.
• Provided user support and fixed production issues.
• Worked as both ETL developer and administrator.

November 2011 – February 2013
Project Title: IVR Datawarehouse, Rainbow
Client: Citibank, Singapore
Team Size: 5
Technology: IBM DataStage 8.5, Oracle 10g, UNIX, AIX
Project Description: IVR stands for Interactive Voice Response. The data warehouse maintains customer-call data for the South East Asia Pacific countries: Singapore, Malaysia, Thailand, Indonesia, Guam, Vietnam and the Philippines. The IVR host captures all customer calls in XML format and sends them to the DWH. A daily batch captures the calls made on that particular day: DataStage jobs read the XML data, apply the transformations and load the results into the data warehouse (a simplified SQL sketch of this extraction follows below). The transformed data feeds various BO reports across the South East Asia Pacific countries.
Responsibilities:
• Involved in the development of minor and major enhancements.
• Batch monitoring.
• User support and fixing of production issues.
• Database clean-up.
• Kept the file systems within the threshold limits.
• Involved in all production-support activities.
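The XML-to-relational step was implemented with DataStage stages rather than hand-written SQL, but the shape of the extraction can be sketched with Oracle's XMLTABLE. The element names, columns and staging table below are hypothetical, not the project's actual schema.

  -- Hypothetical sketch of shredding an IVR call-log XML document (stored as an
  -- XMLType column) into rows for the daily load; names are illustrative only.
  INSERT INTO dwh_ivr_call (call_id, country_code, call_start, menu_path, agent_transfer)
  SELECT x.call_id,
         x.country_code,
         TO_DATE(x.call_start, 'YYYY-MM-DD HH24:MI:SS'),
         x.menu_path,
         x.agent_transfer
  FROM   ivr_xml_stage s,
         XMLTABLE('/calls/call'
                  PASSING s.xml_payload
                  COLUMNS call_id        VARCHAR2(30)  PATH 'callId',
                          country_code   VARCHAR2(2)   PATH 'country',
                          call_start     VARCHAR2(19)  PATH 'startTime',
                          menu_path      VARCHAR2(200) PATH 'menuPath',
                          agent_transfer VARCHAR2(1)   PATH 'agentTransfer') x
  WHERE  s.load_date = TRUNC(SYSDATE);   -- only the calls captured for that day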
March 2011 – October 2011
Project Title: BDW Upgrade and Migration
Client: CIMB Bank, Kuala Lumpur, Malaysia
Team Size: 15
Technology: IBM DataStage 8.5, DB2 9.1, AIX
Project Description: This project has been running since 2003 and has more than 4,000 server jobs running daily, with a few jobs running weekly and monthly. The daily batch was taking more than 17 hours, so the project's main aim was to cut that down to at most 12 hours. To achieve this, some of the server jobs were converted to parallel jobs.
Responsibilities:
• Migrated server jobs to parallel jobs.
• Fine-tuned the parallel jobs.
• Replaced the server jobs with parallel jobs in the sequences and monitored the runs.
• Synchronized the data.
• Fixed issues and provided support.
• Scheduling on a daily, weekly and monthly basis.

November 2010 – February 2011
Project Title: IT Bancassurance Project
Client: Tokio Marine Life, Kuala Lumpur, Malaysia
Team Size: 3
Technology: IBM DataStage & QualityStage 8.1 and 8.5, DB2, Windows
Project Description: TML has an ongoing partnership with RHB Bank and aims to provide an exclusive front-end portal to RHB Bank users, offering operational ease and superior usability to RHB Bank's relationship managers, insurance consultants and financial planners. The solution requires a DB and integration layer. The integration layer designed for this implementation is based on TML's needs: an end-of-day batch process that runs bi-directionally. For example, all the data from Elixir, Life Asia and Group Asia is pushed up to the DB layer, and all the e-business submission data is pushed down to the respective consumers.
Responsibilities:
• Responsible for creating both inbound and outbound interfaces using DataStage.
• Used QualityStage to cleanse the data.
• Designed and developed DataStage jobs and sequence jobs using DataStage Designer.
• Used almost all of the stages available for sequence jobs.
• Wrote shell scripts for different tasks (e.g. FTP get and put).
• Scheduling on a daily basis.
• Involved in fine-tuning the jobs.
• Involved in bug fixing during the testing phase.
• Involved in migrating the project from DataStage 8.1 to 8.5.

September 2009 – October 2010
Project Title: IPS Test Automation
Client: IBM ISL
Team Size: 8
Technology: DataStage 8.5, Oracle, DB2, Netezza, Windows, UNIX, Linux, RFT
Project Description: IPS Test Automation consolidates the component test suites developed with different frameworks into a single execution and reporting mechanism across multiple platforms, using one automation solution that centralizes execution and results for test suites across components and teams and significantly increases resource and hardware efficiency.
Responsibilities:
• Responsible for developing the parallel jobs for DataStage PX GUI test cases using Designer.
• Responsible for developing the server jobs using Designer.
• Automated the DataStage Administrator and Director test cases.
• Automated the Information Server Manager test cases.
• Wrote automation scripts using RFT for all of the above components.

October 2007 – September 2009
Project Title: CVX-Project Lynx
Client: Chevron Texaco, USA
Team Size: 20
Technology: DataStage Enterprise Edition 7.5, Oracle 10g, UNIX, Cognos 8, Windows NT
Project Description: Lynx is a downstream project to implement targeted, sustainable improvements in supply chain optimization processes, technology/information and organizational capabilities. This module deals with the integration of subsystems. The source data is provided by SAP systems as text files; that data is populated into the SQL database by applying Chevron business rules with the ETL tool, DataStage. The Information Architecture (IA) builds are then established to complete the integration.
Responsibilities:
• Understood the business rules as per the Functional Specification and Technical Specification.
• Obtained a detailed understanding of the data sources, flat files and complex data schemas in the LYNX IA module.
• Designed and coded the ETL logic using DataStage 7.5 EE to enable the initial load and incremental processing from Oracle and SQL Server, covering error strategy and exception handling, restartability and recovery, data clean-up, validation and monitoring (a simplified sketch of the incremental-load pattern follows below).
• Used DataStage Designer to design and develop jobs for extracting, cleansing, transforming, integrating and loading data into the different data marts and the IA EDW.
• Used shared containers for code reusability and for implementing predefined business logic; worked with the modeler and suggested changes to the physical model to support the business requirements.
• Analyzed the performance of the jobs and the project and enhanced performance using standard techniques; troubleshooting was done using the debugging tool.
• Created and scheduled the job sequences by checking job dependencies.
• Automated runs using batch logic, scheduling jobs on a daily, weekly or yearly basis depending on the requirement, using DataStage Director.
• Performed intensive component testing (CT1) and integration testing, and developed checklists and test templates.
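The initial and incremental loads were implemented as DataStage 7.5 EE jobs, but the underlying delta-load pattern can be sketched in Oracle SQL with a MERGE. The staging and target tables below are hypothetical, not the Lynx IA schema.

  -- Hypothetical sketch of the incremental (delta) load pattern:
  -- today's staged SAP extract is merged into the target, updating existing
  -- rows and inserting new ones, so reruns do not create duplicates.
  MERGE INTO ia_edw_shipment t
  USING (SELECT shipment_id, plant_code, material_code, quantity, ship_date
         FROM   stg_sap_shipment
         WHERE  extract_date = TRUNC(SYSDATE)) s
  ON (t.shipment_id = s.shipment_id)
  WHEN MATCHED THEN
    UPDATE SET t.plant_code    = s.plant_code,
               t.material_code = s.material_code,
               t.quantity      = s.quantity,
               t.ship_date     = s.ship_date
  WHEN NOT MATCHED THEN
    INSERT (shipment_id, plant_code, material_code, quantity, ship_date)
    VALUES (s.shipment_id, s.plant_code, s.material_code, s.quantity, s.ship_date);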
