Narendra Nath Marreddy Ab Initio and Big Data 7.8 years

Published on: Mar 3, 2016
Source: www.slideshare.net


Transcripts - Narendra Nath Marreddy Ab Initio and Big Data 7.8 years

Narendra Nath Marreddy
(602) 900 6328 | nandu03.1985@gmail.com

SUMMARY
• Around 8 years of experience designing, developing, implementing and testing data warehouse applications in the banking domain, with extensive hands-on experience in Ab Initio development, DB2, Oracle, UNIX, shell scripting and the Control-M scheduler.
• Around two years of experience in a Big Data environment creating Hive queries, PIG scripts and Oozie workflows.
• Good knowledge of multifile systems (MFS), PDL, meta-programming and continuous flows.
• Hands-on experience using Ab Initio PLAN.
• Basic knowledge of Spark.
• Worked with banking giants American Express and Barclays.
• Specialized in ETL methodology for supporting data analysis in a large-scale ETL environment.
• Good knowledge of and experience with the parallelism concepts of Ab Initio.
• Expert-level knowledge and understanding of Ab Initio (ETL) architecture, design and integration, with hands-on expertise in designing, developing and delivering ETL solutions.
• Has worked with critical components: Transform (Reformat, Filter by Expression, Join, Dedup, etc.), Partition (Partition by Expression, Partition by Key, Partition by Round-robin, etc.), Departition (Concatenate, Gather, Merge, Interleave), Sort (Sort, Sort within Groups) and multistage components (Rollup, Scan, Normalize). Has worked with LOOKUPs.
• Excellent UNIX experience; developed multiple UNIX shell scripts.
• Hands-on experience with DB2 UDB, SQL and query fine-tuning techniques.
• Basic knowledge of Teradata.
• Hands-on experience using Control-M to build the automation layer.
• Experience using Ab Initio continuous flows.
• Good knowledge of parallelism concepts (data, pipeline and component parallelism).
• Worked with the BMC scheduling tool Control-M and the Control-M File Watcher.
• Led a team in an onsite-offshore model.
• Coordinated with customers to solve business and technical issues; analyzed and understood customer requirements and created high-level design documents.
• Created test plans and captured test results.
• Handled project estimation and work-effort estimation, planned deliverables, and delivered error-free code on time for various projects.
• Efficient software engineering skills in generating design and architecture artifacts using the SPRINT process (Envisioning, TBD, RT, RD & WC).
• Extensive knowledge of and working experience with the Software Development Life Cycle (SDLC) and Agile methodologies.
• Experience developing applications using both Waterfall and Agile methodologies.
• Client appreciation for providing add-ons and for the successful completion of critical projects.
• A hard-working and talented professional with excellent communication, analytical and programming skills, the ability to work effectively in a team, the flexibility to work on different technologies, and a self-learner with proven expertise in development activities, including requirement analysis, design, testing and client interaction.
TECHNICAL SKILLS
• ETL tools : Ab Initio 1.15, 2.14, 3.0.3, 3.1.2, 3.1.7.5
• Scripting : Korn shell scripting, PIG
• Database systems : DB2 UDB 8.0/9.0, Oracle, Hive
• DB utilities : DB2 import, DB2 export, DB2 load
• Scheduling tools : Control-M, Oozie
• Operating systems : Windows (98/XP/Vista), UNIX
• Project management : MS Office 2003, 2007
• Version control tools : EME
• Methodologies : Scrum, Agile, SDLC
• Domain experience : Banking and Financial Services (BFS), Retail

EDUCATION
• Bachelor of Sciences, Kakatiya University, India

CERTIFICATIONS AND TRAININGS
• AMEX certified in Agile, Waterfall, SDLC, Capital Markets, Payments and Banking.
• Trained campus hires on Ab Initio.

ACHIEVEMENTS
• Syntel TEAM award SMART for 2011.
• Syntel SYNERGY award for 2012.
• Syntel SYNERGY award for 2014.
• Syntel SPOT recognition award for 2014.

PROFESSIONAL PROFILE
Employer 1 : Syntel
Client : American Express (AMEX), Phoenix, AZ
Duration : Sep 2010 – till date

American Express: American Express is a global financial, travel and network service provider. AMEX issues corporate cards to its corporate clients, which helps companies and institutions manage their travel, entertainment and purchasing needs.

Project #1 : GLOBAL DATA FILE SERVICES
Role : ETL Developer and Module Lead
Environment : Ab Initio, shell scripting, DB2, UNIX, Windows XP, Control-M
Description : Similar to a report, a data file is a collection of data. Unlike a report, which is formatted to be "easy to read" by a person, data files are formatted to be "easy to read" by a computer. Data files do not include the titles, spacing or highlights that make a report easy for a user to read. Instead, they contain rows of data in specific positions that make it efficient for another system to read and integrate the data. Data file users can leverage the data to better manage their business:
• Audits can be faster, more reliable and more complete
• Corporate spending policies can be enforced more effectively
• Supplier negotiations are better supported
• The risk of fraud and delinquency is reduced

Roles and Responsibilities:
• As a Technical Lead, led the requirements analysis; produced and gained user acceptance of detailed functional specifications; created and presented live prototype demos; and provided technical development leadership, including application design, assignment and supervision of coding done by junior developers, and coding of any unassigned or lagging areas of development.
• Prepared the high-level design for the ETL data flow and the recovery mechanism in case of a failure.
• Interacted with the clients to gather the requirements for each sprint.
• Held technical design discussions with the portfolio architect on the requirements.
• Built a reusable generic subgraph that can be leveraged for the validation layer.
• Built the partitioning graph based on transaction types to process transaction addenda data and generate the data files.
• Used a multifile system (MFS) to attain data parallelism, which resulted in faster execution (5k files created in less than 30 minutes).
• Used XML components to create files in XML format.
• Used Excel components to create Excel files.
• Used MQ and JMS (Java Message Service) to interact with online screens.
• Used transformation components to generate summarized transaction data at the highest hierarchy level as well as for individual card members.
• Built a reusable generic graph to handle double-byte characters.
• Used PDL with meta-programming to automate the creation of files with different layouts depending on the market.
• Used Ab Initio PLAN and packaged the entire application, which gave more control over the execution process.
• Built a reconciliation process in Ab Initio with the help of the built-in types and the transformation and aggregation components.
• Guided and worked with the team in developing ETL procedures and functions/wrapper scripts in accordance with business requirements (a sketch of such a launch wrapper follows this section), and was involved in SQL optimization of existing queries and in identifying problems in the graphs and modules and arriving at solutions.
• Extensively interacted with the EME to maintain version control on objects, using features like check-in and check-out.
• Involved in unit testing and integration testing with the interfacing systems.
• Involved in preparing the implementation plans and participated in pre-install activities.
• Also wrote all customized application documentation, provided user training, supervised and coordinated all phases of testing (through user-acceptance testing), conducted code reviews, managed project timelines, and worked with the business manager to provide a smooth implementation and transition to the support group.
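As an illustration of the wrapper scripts mentioned above, here is a minimal Korn shell sketch that launches a deployed Ab Initio graph, captures its log and propagates the exit status to the scheduler. The paths, sandbox location and graph name are hypothetical placeholders, not the actual project setup.

    #!/bin/ksh
    # Minimal launch wrapper for a deployed Ab Initio graph.
    # All names below are hypothetical placeholders.
    SANDBOX=/apps/gdfs/sandbox                   # assumed project sandbox
    GRAPH=$SANDBOX/run/create_data_files.ksh     # deployed graph script
    LOG=/apps/gdfs/logs/create_data_files.$(date +%Y%m%d%H%M%S).log

    echo "$(date): launching $GRAPH" | tee -a "$LOG"
    ksh "$GRAPH" >> "$LOG" 2>&1
    rc=$?

    if [ $rc -ne 0 ]; then
        echo "$(date): graph failed with rc=$rc, see $LOG" >&2
        exit $rc                                 # let the scheduler see the failure
    fi
    echo "$(date): graph completed successfully" | tee -a "$LOG"

A wrapper of this shape lets a scheduler such as Control-M treat the graph as an ordinary job step: the non-zero exit code is enough to halt the downstream chain.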
Project #2 : DaaS - Data as a Service
Role : Big Data Developer
Environment : PIG, Hive, shell scripting, Oozie scheduler
Duration : Sep 15 – till date

Description : DaaS is based on the concept that the product, data in this case, can be provided on demand to the user regardless of the geographic or organizational separation of provider and consumer. The benefits:
• Flexible format
• Open API framework
• Low total cost of ownership
• Better time to market
• Self-service model

Roles and Responsibilities:
• A user requests a data file via the front-end Java screens called @work Reporting. Once the user submits the request, @work Reporting frames a query based on the selection, writes the query to a file, and transmits the file to the server.
• Created a file watcher that waits for the query file; once the file arrives, the watcher kicks off the data file creation process (a sketch follows this section).
• Created Hive queries to extract data from Corner Stone as per the user's request.
• Created PIG scripts to transform the data as per the user's requirements.
• Created Oozie workflows to automate loading data into the Hadoop Distributed File System and PIG jobs to pre-process the data.
• Created Hive tables to load large sets of structured, semi-structured and unstructured data coming from UNIX, NoSQL and a variety of portfolios.
• Assisted with data capacity planning and node forecasting.
• Collaborated with the infrastructure, network, database, application and BI teams to ensure data quality and availability.
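A minimal Korn shell sketch of the file-watcher idea described above. The directory names, file pattern and the hive invocation are assumptions for illustration, not the production implementation.

    #!/bin/ksh
    # Poll a landing directory for query files and kick off the extract.
    # Paths, pattern and the hive call are illustrative placeholders.
    LANDING=/data/daas/landing        # where @work Reporting drops query files
    POLL_SECS=60

    while true; do
        for f in "$LANDING"/query_*.hql; do
            [ -e "$f" ] || continue   # glob matched nothing this cycle
            echo "$(date): picked up $f"
            # Run the extract described by the query file.
            if hive -f "$f" > "${f%.hql}.out" 2> "${f%.hql}.err"; then
                mv "$f" "$LANDING/processed/"
            else
                mv "$f" "$LANDING/failed/"
            fi
        done
        sleep "$POLL_SECS"
    done

Moving each query file to a processed/ or failed/ directory keeps the polling loop idempotent: a file is picked up exactly once, and failures stay visible for reruns.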
Project #3 : MSSU RERUN
Role : ETL Developer and Module Lead
Environment : Ab Initio, shell scripting, DB2 UDB 9.0, UNIX, Windows XP, Control-M
Duration : Jun 14 – Nov 14

Description : MSSU is a process that identifies and pulls users' requests for reports from database tables and passes the report requests to the end graphs, which in turn create the reports and send them to the end users on request, on a daily basis, on a cyclic basis, or at whatever frequency is specified. RERUN is additional functionality being added so that a user can place a request for any historical report generated earlier and receive a duplicate copy of it.

Roles and Responsibilities:
• Interacted with the clients to gather the requirements for each sprint.
• Held technical design discussions with the portfolio architect on the requirements.
• Built the conditional DML to parse the transactional input file, as it arrives with different record types.
• Created a validation layer in Ab Initio and a response-file creation system to validate the data and provide an automated response to the source systems indicating the errors in the data they sent over.
• Built generic transformations using the Ab Initio built-in types and components to handle the multi-byte characters coming in the accounts-receivable feed.
• Built the partitioning graph based on record types to process transaction addenda.
• SIT/UAT-related activities.
• Promoted the code to production and provided warranty support.

Project #4 : MSSU KR TEMPLATES MIGRATION
Role : ETL Developer and Module Lead
Environment : Ab Initio, shell scripting, DB2 UDB 9.0, UNIX, Windows XP, Control-M
Duration : Dec 13 – Sep 14

Description : KR files for the JAPA region are being migrated from the old MSSU to the new MSSU. The reasons for migrating to the new process are delays in delivering the files to GDFS, missing transactions, and duplicate data in some files. The standalone constraint is that existing functionality must not be affected. A common framework will be developed to support all existing KR file functionality with higher performance; the missing-transaction and duplicate-data issues, however, will be fixed by modifying the code in the old MSSU.

Roles and Responsibilities:
• Worked as a Project Lead.
• As PL, led requirements analysis; produced and gained user acceptance of detailed functional specifications; created and presented live prototype demos; and provided technical development leadership, including application design, assignment and supervision of coding done by junior developers, and coding of any unassigned or lagging areas of development.
• Updated the MSSU process to support two new frequencies: weekly, and any date in a month.
• The MSSU process was trying to process around 13 million records in a single flow, which caused the graph to run for 35-50 minutes. Updated the process to partition the data into multiple flows so that more records are processed at the same time, reducing the execution time to 15 minutes (a shell-level sketch of the idea follows this section).
• Also wrote all customized application documentation, provided user training, supervised and coordinated all phases of testing (through user-acceptance testing), conducted code reviews, managed project timelines, and worked with the business manager to provide a smooth implementation and transition to the support group.
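To illustrate the partition-and-parallelize idea above outside Ab Initio, here is a hedged Korn shell sketch: split one large input into N chunks and process the chunks as concurrent background jobs. The file names, the per-chunk processor and the degree of parallelism are hypothetical.

    #!/bin/ksh
    # Split a large input into N chunks and process them in parallel.
    # INPUT and process_chunk.ksh are illustrative placeholders.
    INPUT=transactions.dat
    N=8                                  # assumed degree of parallelism

    TOTAL=$(wc -l < "$INPUT")
    CHUNK=$(( (TOTAL + N - 1) / N ))     # lines per partition, rounded up
    split -l "$CHUNK" "$INPUT" part_     # produces part_aa, part_ab, ...

    for p in part_*; do
        ./process_chunk.ksh "$p" &       # one background job per partition
    done
    wait                                 # block until every partition finishes
    echo "$(date): all partitions processed"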
Project #5 : CUSTOMIZED REPORTING TRANSFORMATION
Role : ETL Developer
Environment : Ab Initio, shell scripting, DB2 UDB 9.0, UNIX, Windows XP, Control-M
Duration : Aug 13 – Nov 13

Description : This project makes improvements to the Customized Reporting tool that will drive more client enrollment, usage and satisfaction. The new functionality allows clients/users to migrate away from legacy tools. In the old architecture, a user needed access to GIDM to create reports and to CIW to view them. The new web screen, GMIP, combines the functionality of GIDM and CIW in a single, user-friendly web screen. The project involves creating hierarchy details and allowing users to access, create and view reports on the new web screen (GMIP): users can create new reports, edit old report setups, delete reports, and view and delete processed reports. As part of the ETL team, we load data into the various tables that the web screen uses to display information. The jobs are critical because the data is displayed directly on the web, so the architecture and framework must be efficient.

Roles and Responsibilities:
• Worked as a Sr. Analyst Programmer.
• Developed ETL procedures and functions/wrapper scripts in accordance with business requirements and was involved in SQL optimization of existing queries.
• Created a graph that reads the input given by the user via MQ and creates a data file. The user can choose from a list of 13 templates, and the same graph can create a data file in any of the 13 layouts.
• Built the ability in the graph to identify special characters in the input data and replace each one with an appropriate value, using a set of predefined combinations provided by the business (a sketch of this step follows this section).
• Provided support for SIT/UAT.
• Performed code migrations as part of weekly/monthly change request releases.
• Troubleshot, fixed and implemented bug fixes and enhancements within tight deadlines.
• Developed high-level and detailed technical and functional documents, consisting of detailed design documentation, functional test specifications with use cases, and unit test documents.
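A hedged Korn shell/awk sketch of the special-character replacement step described above. The mapping file, its format and its contents are assumptions; the real rules are the business-provided combinations.

    #!/bin/ksh
    # Replace special characters using a business-supplied mapping file.
    # charmap.txt format (tab-separated, illustrative):
    #   ©<TAB>(c)
    #   ™<TAB>(tm)
    IN=$1
    OUT=${IN%.dat}_clean.dat

    awk -F'\t' '
        NR == FNR { map[$1] = $2; next }   # first file: load the mapping
        {
            # Swap every mapped character; keys are treated as regexes,
            # which is fine for literal symbols like the ones above.
            for (c in map) gsub(c, map[c])
            print
        }
    ' charmap.txt "$IN" > "$OUT"
    echo "cleaned file written to $OUT"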
Project #6 : MANAGEMENT SYSTEM SUPPORT UTILITY (MSSU)
Role : ETL Developer
Environment : Ab Initio, shell scripting, DB2 UDB 8.0, UNIX, Windows XP, Control-M
Duration : Mar 12 – Aug 13

Description : The project aims to send transaction/profile information to end users, on their request, in the form of files/reports. The files/reports can be sent in any one of several file formats, such as ASCII, EBCDIC or XML, according to the end user's request, and are delivered on request, on a daily basis, on a cyclic basis, or at whatever frequency is specified. MSSU is one such process: it identifies and pulls the request from database tables and passes the report request to the end graphs, which in turn create the reports and send them to the end users.

Roles and Responsibilities:
• Worked as a Sr. Analyst Programmer.
• Developed ETL procedures and functions/wrapper scripts in accordance with business requirements and was involved in SQL optimization of existing queries.
• Built graphs to read data from tables and validate it against business rules; data that passes validation is further refined to fit the business needs and later delivered to the clients via SFT.
• Built the ability in the graph to track records that fail the validation rules; if the rejected-record count crosses a predefined threshold, the graph aborts and an email alert is sent to the client and the data administrators describing the issue (a sketch of this guard follows this section).
• Provided support for SIT/UAT.
• Performed code migrations as part of weekly/monthly change request releases.
• Troubleshot, fixed and implemented bug fixes and enhancements within tight deadlines.
• Developed high-level and detailed technical and functional documents, consisting of detailed design documentation, functional test specifications with use cases, and unit test documents.
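A minimal Korn shell sketch of the reject-threshold guard described above. The reject file name, threshold value and mail addresses are illustrative assumptions, not the production configuration.

    #!/bin/ksh
    # Abort the job chain and alert if rejects exceed a threshold.
    # File name, threshold and recipients are illustrative placeholders.
    REJECT_FILE=rejects.dat
    THRESHOLD=1000
    ALERT_TO="dba-team@example.com client-contact@example.com"

    REJECTS=$(wc -l < "$REJECT_FILE")
    if [ "$REJECTS" -gt "$THRESHOLD" ]; then
        echo "Load aborted: $REJECTS rejected records (threshold $THRESHOLD)." |
            mailx -s "MSSU load aborted: reject threshold breached" $ALERT_TO
        exit 1                           # non-zero status aborts downstream steps
    fi
    echo "Reject count $REJECTS is within threshold; continuing."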
Project #7 : GDC SALES ENABLEMENT
Role : ETL Developer
Environment : Ab Initio, shell scripting, DB2 UDB 8.0, UNIX, Windows XP, Control-M
Duration : May 11 – Feb 12

Description : The objective of the process is to extract the GDC CM Charge Volume feed, the GDC BCA Charge Volume feed and the GDC BCA Demographics feed and, after suitable transformations on each feed, generate a report per feed and send the generated reports to the GDC Business Planning Group through email.

Roles and Responsibilities:
• Worked as a Sr. Analyst Programmer.
• Developed ETL procedures and functions/wrapper scripts in accordance with business requirements and was involved in SQL optimization of existing queries.
• Built graphs to read the input files, which an upstream system transmits to our servers via SFT, and create three files out of every input file as per the mapping documents. Once created, the files are transmitted to three different downstream systems via email (a sketch of the three-way split follows this section).
• Created a file watcher to check for the input files; once the files are received on our servers, the watcher kicks off the graphs.
• Provided support for SIT/UAT.
• Performed code migrations as part of weekly/monthly change request releases.
• Troubleshot, fixed and implemented bug fixes and enhancements within tight deadlines.
• Developed high-level and detailed technical and functional documents, consisting of detailed design documentation, functional test specifications with use cases, and unit test documents.
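A hedged Korn shell/awk sketch of the one-input, three-outputs split described above. The record-type values, field delimiter, output names and the mail step are assumptions; the real routing follows the project's mapping documents.

    #!/bin/ksh
    # Route each record of one input file to one of three output files.
    # Delimiter, type values and file names are illustrative placeholders.
    IN=$1

    awk -F'|' '
        $1 == "CM"  { print > "cm_charge_volume.dat";  next }
        $1 == "BCA" { print > "bca_charge_volume.dat"; next }
                    { print > "bca_demographics.dat" }
    ' "$IN"

    # Hand each output to its downstream (illustrative mail step).
    for f in cm_charge_volume.dat bca_charge_volume.dat bca_demographics.dat; do
        uuencode "$f" "$f" | mailx -s "GDC feed: $f" downstream@example.com
    done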
Project #8 : CARD DAILY DATA FEED (KR1025), EMEA (MEXICAN MARKET)
Role : ETL Developer
Environment : Ab Initio, shell scripting, DB2 UDB 8.0, UNIX, Windows XP, Control-M
Duration : Oct 10 – May 11

Description : The KR1025 daily file format was developed by Corporate Services to summarize transaction data and has developed into a general-use file for interfacing with automated expense systems. Since most expense automation technology is driven from the US, there has been an increasing requirement for global standardization on a format compatible with the one used in the US. As a result, all Amex regional operations have developed versions of the KR1025. There are certain minor differences between versions, to accommodate local system and regulatory requirements, but all versions are broadly interchangeable. Typically, individual KR1025 files are generated for a given sub-corp, and multiple KR1025 files are generated for clients with several sub-corps. However, a few clients in Mexico are unable to process multiple KR1025 files. To enable these clients to use the KR1025 data file, the individual KR1025 files (each based on one sub-corp) need to be merged under one holding corp.

Roles and Responsibilities:
• Worked as an Analyst Programmer.
• Developed ETL procedures and functions/wrapper scripts in accordance with business requirements and was involved in SQL optimization of existing queries.
• Built graphs to create a data file with some 400 fields as per the mapping provided and send these files to clients via SFT. Multiple instances of these graphs can run at the same time to handle the data file volume.
• Provided support for SIT/UAT.
• Performed code migrations as part of weekly/monthly change request releases.
• Troubleshot, fixed and implemented bug fixes and enhancements within tight deadlines.
• Developed high-level and detailed technical and functional documents, consisting of detailed design documentation, functional test specifications with use cases, and unit test documents.

Employer 2 : JSC InfoTech
Client : Barclays & CompuCredit
Duration : Aug 2008 – Sep 2010

Project #1 : WAREHOUSE MIGRATION (BARCLAYS)
Role : ETL Developer
Environment : Ab Initio, shell scripting, Oracle 9, UNIX, Windows XP
Duration : Jun 09 – Aug 10

Description : The project focuses on the data integration process for maintaining information in the Information Warehouse database. The new subject area provides more flexibility for defining and identifying new banking accounts and transactions from internal and external systems. In addition, the data is integrated with a prioritization and hierarchy of information defined by the business.

Roles and Responsibilities:
• Developed Ab Initio graphs that run on a daily (EOD) basis.
• Developed relational graphs that handle a specific processing order, where one load process must start only after another load process completes (a sketch of this ordering follows this section).
• Developed connect graphs that act as a bridge to the downstream systems' load processing.
• Interacted with the client daily on requirements and development, which made development faster and improved the process.
• SIT/UAT/installation of projects.
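A minimal Korn shell sketch of the load-ordering idea above: run the dependent load only after the first load completes successfully. The script names are hypothetical placeholders, not the actual Barclays graph names.

    #!/bin/ksh
    # Run load steps strictly in order; stop at the first failure.
    # Script names below are illustrative placeholders.
    run_load() {
        name=$1 script=$2
        echo "$(date): starting $name"
        ksh "$script"
        rc=$?
        if [ $rc -ne 0 ]; then
            echo "$(date): $name failed (rc=$rc); downstream loads skipped" >&2
            exit $rc
        fi
        echo "$(date): $name finished"
    }

    run_load "accounts load"     ./load_accounts.ksh
    run_load "transactions load" ./load_transactions.ksh  # needs accounts first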
Project #2 : DMA FILE PROCESSING
Role : ETL Developer
Environment : Ab Initio, shell scripting, Oracle 9, UNIX, Windows XP
Duration : May 08 – Jun 09

Description : The project aimed to develop a system for CompuCredit, UK, in which several types of mainframe files are input to the file processing; they need to be authenticated, have business rules applied, be processed, and be loaded into the data warehouse. The objective of the project is to make the whole system more manageable and to reduce the processing time from 10 hours to 3-4 hours.

Roles and Responsibilities:
• Developed Ab Initio graphs to load data from sources including flat files into the Oracle database.
• Involved in unit testing and peer-to-peer testing of the graphs.
• Ran scripts through UNIX shell programs.
• Interacted with the client daily for requirement gathering and status updates.
• Monitored the daily DWH load and worked on Change Requests (CRs).
