NARENDER REDDY ANDRA
Mobile: +91-8686627747
Email id: narender.acn@gmail.com

Narender Reddy Andra Profile

Published on: Mar 3, 2016
Source: www.slideshare.net



S U M M A R Y

ROLES & SKILLS
INFORMATICA Developer/Lead
BUSINESS SOLUTIONS: DW IMPLEMENTATION, ETL IMPLEMENTATIONS
ENVIRONMENT & TOOLS: INFORMATICA & COMPONENTS, INFORMATICA IDQ & IDE, INFORMATICA MDM & IDD, Erwin Data Modeler
DATABASES: TERADATA, ORACLE, SQL SERVER, DB2
DOMAIN: BANKING, INSURANCE, AUTOMOBILE MANUFACTURING, TELECOM
CLIENTELE: BANK OF AMERICA, STATE FARM, ZURICH, BMW MOTOR CORP, HONEYWELL, CISCO
OTHER TOOLS: TIDAL SCHEDULING TOOL, CONTROL-M, Remedy, HP QC

PROFILE:
• Around eight years of overall IT experience in system analysis, design and development in the field of data warehouses and databases.
• Around eight years of strong experience in the design and implementation of data warehousing applications, including analysis, development and administration, using Informatica Data Quality (IDQ), Informatica Master Data Management (MDM), Informatica Data Director (IDD), Informatica Power Center 9.x/8.x and Teradata.
• Hands-on Business Intelligence experience using Business Objects 5.1.
• Experience with databases Oracle 11.2/10g/9i/8i, TOAD, IBM DB2, MS SQL Server 2005/2000 and Teradata. Extensively worked with ETL tools to extract data from various sources including Oracle, DB2 and flat files.
• Well experienced in the design and implementation of Informatica Data Quality, covering both development and configuration.
• Experience in Informatica MDM development to obtain golden records.
• Extensive experience creating applications in Informatica Data Director.
• Hands-on experience using Teradata with Informatica.
• Knowledge of Informatica Express, BDE, Metadata Manager and Informatica Cloud.
• Performed analysis, design, development, testing, documentation, maintenance and support of complex applications and processes.
• Participated in business requirements sessions, data model design walkthroughs and infrastructure reviews.
• Hands-on experience in gathering requirements, preparing functional documents, and translating business requirements into technical database requirements and design.
• Reviewed and created MDM design documents, configuration and interfaces.
• Created functions in the MDM Hub.
• Used Address Doctor in both the MDM Hub and IDQ.
• Created base object tables, validation rules, mappings, trust, and match & merge in the Hub Console to generate golden records.
• Created queries and packages in the Hub.
• Drove requirements and analysis for data quality rule definition, threshold analysis, and defect/rule-violation reporting.
• Extensive experience using Informatica Data Quality 9.x, including Address Doctor and IDE.
• Created mappings, mapplets and rules in IDQ to address, cleanse and transform data.
• Created table profiling and join profiling; created mappings using transformations like Validator, Match and Merge.
• Experience deploying the required Data Quality framework and solutions.
• Experienced in creating scripts using Teradata BTEQ, TPump, FastLoad, MultiLoad and other utilities.
• Experience in performance tuning of sources, targets, mappings and sessions. Worked with and modified Oracle stored procedures; experienced in loading data into data warehouses/data marts using Informatica.
• Knowledge of the Software Development Life Cycle (SDLC), including requirement analysis, design, development, testing and implementation. Provided end-user training and support.
• Experience with TOAD and SQL Developer to test, modify and analyze data, create indexes, and compare data across schemas.
• Hands-on experience with the Tidal and Control-M schedulers.
• Created complex views, materialized views, procedures, triggers and packages.
• Worked in a 24/7 support environment and handled on-call support.
• Wrote test cases for developed mappings and routines and carried out testing in the UAT environment before promoting to production.
• In-depth experience in troubleshooting and resolving production data issues.
• Coordinated with various teams in the migration/deployment process for promoting applications to higher environments; enhancement and defect management.
• Expert in incident management: worked through critical incidents, manned the command center, and conducted incident-resolution bridge calls.
• Worked on problem tickets and made code fixes in the existing production system.
• Handled L1 and L2 calls as part of production support.
• Highly adaptive to a team environment, with a proven ability to work in a fast-paced setting and excellent communication skills.
• Strong analytical and problem-solving skills.
• Exceptional ability to quickly master new concepts and technologies.
• Used the Erwin Data Modeler tool for data warehouse design.
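As a concrete sketch of the Teradata utility scripting mentioned in the profile above, a minimal shell wrapper that generates and submits a BTEQ script might look like the following. The logon string, database/table names and file paths are illustrative assumptions, not details from any of the projects below:

```shell
#!/bin/sh
# Sketch: generate a BTEQ script that checks a staging-table load.
# Logon string, table name and paths are illustrative assumptions.
BTEQ_SCRIPT=./load_check.bteq

cat > "$BTEQ_SCRIPT" <<'EOF'
.LOGON tdprod/etl_user,etl_password;
SELECT COUNT(*) FROM dw.claims_stg;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.QUIT 0;
EOF

# On a host with the Teradata tools installed, the script would be run as:
#   bteq < "$BTEQ_SCRIPT" > load_check.log 2>&1
#   rc=$?
#   [ $rc -ne 0 ] && echo "BTEQ exited with return code $rc"
echo "Generated $BTEQ_SCRIPT"
```

The same pattern (generate the utility script, submit it, branch on the return code) applies to FastLoad and MultiLoad jobs as well.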
E D U C A T I O N / E X P E R I E N C E S

Masters of Computer Applications, Osmania University (Post Graduate)
Bachelor of Computer Applications, Osmania University

P R O F E S S I O N A L  S U M M A R Y

Bank Of America, UK                                MAR 2015 – Till Date
MDM/IDQ Consultant
Project: PODS MDM

Bank of America offers industry-leading support to more than 3 million small business owners through a suite of innovative, easy-to-use online products and services. The company serves clients through operations in more than 40 countries. As part of a strategic roadmap focused on enhancing operational excellence and improving the customer experience, PODS is undertaking various enterprise data management initiatives aimed at implementing a robust Customer Relationship Marketing solution. In line with this roadmap, the MDM project is one such initiative, sponsored by Sales and Marketing, that focuses on providing an in-depth understanding of the customer in support of the Sales and Marketing objective of enhancing customer acquisition and retention.

Responsibilities:
• Designed and developed MDM/IDQ jobs.
• Played an extensive role in business requirement gathering, requirement analysis, and data rule creation in IDQ and MDM development.
• Created mappings and cleansing functions to load staging tables in MDM.
• Created base objects and relationships in MDM.
• Configured relations among tables to obtain lookups.
• Configured match/merge in MDM to obtain golden records.
• Configured validation and trust.
• Created queries and packages in the Hub Console.
• Configured batch groups to load the jobs.
• Configured auto merge and manual merge.
• Created applications using IDD.
• Obtained golden records using the Informatica MDM Hub Console.
• Loaded the landing tables with the help of ETL jobs and IDQ mappings and workflows.
• Created mapping applications in IDQ to load landing data.
• Created rules, mapplets and mappings in IDQ.
• Drove requirements and analysis for data quality rule definition, threshold analysis, and defect/rule-violation reporting.
• Created profiles and scorecards in IDQ.
• Created logical data objects and SQL services in IDQ.
• Created scheduling scripts to run the IDQ jobs.
• Tested the code and deployed it to the production environment.
• Prepared high-level and low-level design documents.
• Implemented Type-1 and Type-2 mappings for both dimensions and facts.
• Designed complex mappings with complex business logic.
• Developed data profiling, data mapping, data validation and data manipulation using IDQ to maintain data quality.
• Created table profiles and join profiles.
• Created mappings and mapplets using the Informatica IDQ tool.
• Used the Address Validator transformation (Address Doctor) to validate incoming addresses.
• Analyzed/profiled data for the incoming sources. Used Source Analyzer, Warehouse Designer, Mapping Designer and Mapplet Designer.
• Developed mappings, sessions and workflows using Informatica Power Center Designer and Workflow Manager.
• Participated in resource planning, estimation, design, coding, unit testing and production support.
• Worked with various teams: QA, DBA, system maintenance and support, underwriters and compliance.
• Created and reviewed unit test case documents; did peer reviews of code.
• Assisted testers with data issues as well as tested the ETL flows.
• Provided pre- and post-production assistance.
• Uploaded/modified customer parameter files in UNIX to use variables for workflow sessions.
• Played the admin role and performed admin tasks such as code migration.

Environment: Informatica Power Center, Informatica IDQ, Informatica MDM, Oracle 10g, HP Defect Tracker.

State Farm, Bloomington                            Sep 2014 – Feb 2015
Sr. ETL Developer/Lead
Project: DIS-MOM (Management Organization Marketing)

State Farm is a group of insurance and financial services companies in the United States and Canada. The group's main business is State Farm Mutual Automobile Insurance Company, a mutual insurance firm that also owns the other State Farm companies.
The data related to their operations is present in various source systems such as Oracle. The DIS-MOM data warehouse stores terabytes of premium, claims and accounting information. This data is loaded into data marts for actuarial and financial analysis.

Responsibilities:
• Designed and developed the ETL/MDM/IDQ framework, a collection of all claims and customer information.
• Played an extensive role in business requirement gathering, requirement analysis, database design and ETL design.
• Implemented the Informatica MDM Hub.
• Obtained golden records using the Informatica MDM Hub Console.
• Created base objects and relationships in MDM.
• Designed the schema and the relationships between tables.
• Created and configured schema objects.
• Configured relations among tables to obtain lookups.
• Created mappings and cleansing functions to load staging tables in MDM.
• Configured match/merge in MDM to obtain golden records.
• Configured validation and trust.
• Created queries and packages in the Hub Console.
• Configured batch groups to load the jobs.
• Configured auto merge and manual merge.
• Hands-on experience with IDD.
• Loaded the landing tables with the help of ETL jobs.
• Tested the code and deployed it to the production environment.
• Redesigned the existing ODS data mart to increase performance and reduce maintenance cost; successfully implemented this in production.
• Prepared high-level and low-level design documents.
• Worked as onsite lead for the offshore team; coordinated with offshore team members to explain business requirements and assign tasks.
• Implemented Type-1 and Type-2 mappings for both dimensions and facts.
• Designed complex mappings with complex business logic.
• Developed data profiling, data mapping, data validation and data manipulation using IDQ to maintain data quality.
• Created table profiles and join profiles.
• Created mappings and mapplets using the Informatica IDQ tool.
• Used the Address Validator transformation (Address Doctor) to validate incoming addresses.
• Analyzed/profiled data for the incoming sources. Used Source Analyzer, Warehouse Designer, Mapping Designer and Mapplet Designer.
• Developed mappings, sessions and workflows using Informatica Power Center Designer and Workflow Manager.
• Worked with the Teradata database, writing BTEQ queries and loading utilities using MultiLoad, FastLoad and FastExport.
• Automated and scheduled UNIX shell scripts and Informatica sessions and batches using Control-M.
• Participated in resource planning, estimation, design, coding, unit testing and production support.
• Worked with various teams: QA, DBA, system maintenance and support, underwriters and compliance.
• Created and reviewed unit test case documents; did peer reviews of ETL code.
• Assisted testers with data issues as well as tested the ETL flows.
• Provided pre- and post-production assistance.
• Uploaded/modified customer parameter files in UNIX to use variables for workflow sessions.
• Played the admin role and performed admin tasks.
• Worked with shortcuts across shared and non-shared folders.
• Migrated the code to the UTE and production environments.
• Security management and code migrations.
• Used LDAP systems to provide access.

Environment: Informatica Power Center 8.1/9.1.0, Informatica IDQ, Informatica MDM, Teradata, Oracle 10g, Toad, Control-M, Unix Shell Scripting (Korn shell), SQL, PL/SQL, MS Excel, HP Defect Tracker.

Zurich, UK                                         Feb 2014 – Sep 2014
Sr. ETL Developer/Lead
Project: DIS-MOM (Management Organization Marketing)

Zurich is a global insurance company organized into three core business segments: General Insurance, Global Life and Farmers. The SAP Convergence program Lion (also known as HUDSON) was started at the beginning of 2010 to help Zurich achieve the objectives defined in the Global Finance Strategy.

Responsibilities:
• Designed and developed the ETL/MDM/IDQ framework, a collection of all claims and customer information.
• Performed detailed analysis and design of mappings, mapplets and worklets for the clients in order to take precautionary measures.
• Developed and implemented parameters so that, whenever the Finance HUB requires changes, the necessary applications can be switched, reused or newly developed.
• Created reusable mappings for UK GI, UK Life, Isle of Man and Ireland Life onto the SAP Convergence Core platform based on the functional specs.
• Involved in the requirement gathering, build, UAT and deployment phases.
• Created a coding system showing the sections to be agreed by each BU whenever a change of relevance to a BU takes place.
• Optimized query and session performance; delivered correct objects to the client on time.
• Prepared documentation describing program development logic, coding, testing and changes.
• Created the tables required for the project; interacted with the DBA, business and operations teams.
• Involved in requirement gathering.
• Coordinated with the client-side IT team and business users.
• Coordinated with teams and ensured activity completion in line with customer expectations.
• Understood change requests and performed impact analysis.
• Created database objects such as views and materialized views as per the requirements.
• Involved in the UAT (User Acceptance Testing) phase.

BMW                                                June '11 – Feb '14
Sr. Systems Programmer Analyst
Project: SDWH, CAESER-INFO, ODS
Bayerische Motoren Werke AG (BMW) is a German automobile, motorcycle and engine manufacturing company. BMW is known for its performance and luxury vehicles and is a global leader in premium car sales. Accenture is a business partner in IT services and maintains nearly 70 BMW applications for locations in Germany and the USA (Woodcliff Lake, Hilliard). The Sales Tracking and Reporting (STAR), SDWH (Service DWH), CAESER-INFO, ODS and BSC (Balanced Scorecard) applications are part of BMW's enterprise data warehouse. In these applications, overall vehicle sales and order flows are tracked by country and reported accordingly. Data is fetched from different source systems such as legacy systems, flat files and SAP systems, then processed and populated using Informatica Power Center.

Responsibilities:
• Implemented ETL solutions using IDQ, MDM and Power Center.
• Performed administration tasks as well.
• As part of master data management, worked on the Informatica MDM tool to derive golden records.
• Configured match/merge in MDM.
• Created mappings to load staging tables in the Hub Console.
• Configured trust and relationships.
• Created queries and packages.
• Configured batch groups in the MDM Hub.
• Configured the MDM Hub to derive golden records.
• Implemented Informatica SIF using the MDM Hub.
• Attended requirement gathering meetings and delivered code as per client requirements.
• Developed, tested and deployed the new ETL flows into production.
• Worked as an SME for the applications (SDWH, STAR, CAESER-INFO, ARP, ODS).
• Worked as offshore lead; coordinated with team members to explain business requirements and assign tasks.
• Prepared Informatica ETL design documents and an operations manual as part of new requirements.
• Implemented Type-1 and Type-2 mappings for both dimensions and facts.
• Extracted data from various sources such as XML files, flat files, Oracle and Microsoft Access databases using Informatica Power Center.
• Worked with the Java and SAP transformations along with Lookup, Stored Procedure, Expression and Joiner transformations.
• Did peer reviews of ETL code.
• Performed ETL testing using SQL Developer and the Informatica Debugger.
• Tuned mapping-level objects for overall performance; implemented Informatica session-level partitioning and recovery strategy.
• Addressed the issues raised by the business team and attended SLA meetings.
• Fixed bugs in the existing production system.
• Fine-tuned existing mappings by finding the bottlenecks in the code.
• Used partitioning and pushdown optimization to tune the system.
• Used stored procedures to refresh materialized views, enable and disable indexes, and refresh table statistics.
• Used the Informatica Data Quality tool to profile and standardize data.
• Worked extensively with profiling, Address Doctor, Standardizer, Parser, Merge and other IDQ transformations to improve the quality of the data in the system.
• Created new mappings, profiles and mapplets in IDQ.
• Used the Tidal scheduling tool and Informatica scheduling for job scheduling.
• Created new jobs in Tidal and set up their dependencies accordingly.
• Monitored, assigned, accepted and resolved Tivoli Service Desk tickets.

Environment: Informatica Power Center 8.6.1, Informatica IDQ, MDM, Oracle 10g, SQL Server, Toad, Control-M, Unix AIX, Teradata, Unix Shell Scripting (Korn shell), SQL, PL/SQL, MS Excel, AR Remedy Ticket Tracking System.

Honeywell, US                                      APRIL '10 – MAR '12
ETL Developer
Project: ACS CP/S (Automation and Control Systems & Sensing and Control Product Search)

The Honeywell CPS data team extracts and loads data from various sources such as Oracle, SQL Server and DB2 databases and the SAP GUI. The CP/S program is very large, running 14 super tankers and 20+ projects at any time, so a common reporting structure for the data management of all projects within the program is essential. The team leverages the same utility database as the data processes, centrally locates all project plan details in a central set of tables, and provides reporting to the deployment leads, data leads and the full data team for activity management, follow-up and issue resolution. It is an SAP integration project migrating from a non-legacy system to SAP R/3.

Responsibilities:
• Built the Acquire and Localize processes based on mapping documents and direction from data leads and super users.
• Involved in development of new objects never loaded before, developing Informatica mappings and workflows.
• Added new data-cleansing rules and bug fixes to Localize based on data action inputs from data leads and super users.
• Worked with Informatica PowerCenter client tools such as Source Analyzer, Warehouse Designer, Mapping Designer and Transformation Developer.
• Used most of the transformations, such as Source Qualifier, Application Source Qualifier, Expression, Aggregator, connected and unconnected Lookups, Filter, Router, Sequence Generator, Sorter and Joiner.
• Developed SQL scripts for business data loads.
• Imported data from various sources, transformed it and loaded it into data warehouse targets using Informatica client tools.
• Used shortcuts to reuse objects without creating multiple objects in the repository and to inherit changes made to the source automatically.
• Used Workflow Manager for workflow and session management and database connection management.
• Optimized query performance and session performance.
• Delivered correct objects to the client on time.
• Prepared documentation describing program development, logic, coding, testing, changes and corrections.
• Developed complex mappings in Informatica to load data from various sources, using transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank and Router. Used the Debugger to test the mappings and fix bugs.
• Played an extensive role in business requirement gathering, requirement analysis, database design, ETL design, and development and implementation of the end-to-end solution using SDLC techniques.
• Involved in conceptual, logical and physical database design using dimensional data modelling.
• Developed mapplets using the corresponding sources, targets and transformations.
• Executed sessions and sequential and concurrent batches for proper execution of mappings and sent e-mail using Server Manager.
• Used session partitions, dynamic cache memory and index cache to improve the performance of the Informatica server.
• Assisted in migrating the repository.
• Created database triggers for data security.
• Optimized SQL queries for better performance.
• Designed Informatica transformations for loading data from various sources such as flat files and ODBC sources.
• Worked closely with software developers to isolate, track and troubleshoot defects.
• Created data maps using the Informatica Power Exchange interface for Power Center.
• Worked on various Informatica tuning issues and fine-tuned transformations to make them more efficient in terms of performance.
• Participated in user meetings, gathering requirements and discussing the issues to be resolved; translated user inputs into ETL design documents.
• Performed analysis, design, development, testing and implementation of Informatica transformations and workflows for extracting data from multiple legacy systems.
• Developed procedures to populate the customer data warehouse with transaction data, cycle and monthly summary data, and historical data.
• Modified and enhanced the existing PL/SQL code where required.
• Designed and documented the validation rules, error handling and test strategy of the mappings.
• Responsible for defect tracking and management metrics.
• Logged defects and submitted change requests using the defects module of Test Director.
• Provided pre- and post-production assistance.

Environment: Informatica Power Center 8.6, Power Exchange, Oracle 9i/8i, Toad, SQL, SQL Server 2000

Cisco, US                                          May 2008 – May 2010
ETL Informatica Developer
Project: EDWTD (Enterprise Data Warehouse Teradata)

EDWTD is a data warehouse built on the Teradata platform by Cisco. It acts as a state-of-the-art, near-real-time data warehouse and the SSOT (an enterprise-level initiative to store exactly one record) for different transactions. The EDW provides information for data mining and extracts the information needed for telecom services, plans and revenues, pulling data from various source systems. Erwin was used for dimensional modelling, and the star schema was loaded into the Oracle database with Business Objects as the corporate reporting tool.
Responsibilities:
• Developed several mappings and mapplets using the corresponding sources, targets and transformations.
• Worked on dimensional modelling to design and develop star schemas.
• Worked closely with user decision makers to develop the transformation logic to be used in Informatica.
• Analyzed the source data and decided on the appropriate extraction, transformation and loading strategy.
• Worked extensively with all the transformations, including Filter, Aggregator, Expression, Router, Lookup, Update Strategy, Sequence Generator and Rank.
• Gathered and analyzed business requirements.
• Used Workflow Manager for workflow and session management, database connection management, and scheduling of jobs to be run in the batch process.
• Developed a number of complex Informatica mappings, mapplets and reusable transformations for weekly data loads.
• Fixed invalid mappings; tested stored procedures and functions; performed unit and integration testing of Informatica sessions, batches and target data.
• Designed Informatica transformations for loading data from various sources such as flat files.
• Worked with the Teradata database, writing BTEQ queries and loading utilities using MultiLoad, FastLoad and FastExport.
• Worked closely with software developers to isolate, track and troubleshoot defects.
• Worked on various Informatica tuning issues and fine-tuned transformations to make them more efficient in terms of performance.
• Assisted in preparing the Informatica Transformation Development Standards document, which describes the general guidelines for Informatica developers, the naming conventions to be used in transformations, and the development and production environment structures.
• Monitored the jobs handled in three refreshes (CORP, APAC and EMEA).
• Attended to and resolved production job failures.
• Escalated to existing CDW SMEs when an issue could not be resolved or pertained to data.
• Received pages from the Cisco Operations team for any failed jobs in the CDW production environment, acknowledging (via ST chat) to the Cisco Operations team that the issue was being looked at.
• Used the Alliance Remedy tool for case management.
• Proactively worked on PVCS, $Universe, Informatica, Teradata and Linux.
• Ran month-end/quarter-end jobs (most of which need manual intervention) and resolved issues.
• Communicated to the concerned mail aliases any failure that resulted in missing existing SLAs, for reasons within or beyond our control.
• Interacted with the DBA, sys-admin, Operations and $U teams.
• Participated in infrastructure process improvement and improvement of existing tools, as well as in the definition of any new processes and tools as needed.
• Helped the application support team in case of any issues.
• Proactively participated in performance tuning of jobs that had shown degradation in performance or had started to impact SLAs.
• Scheduled the batches and sessions at the specified frequency.
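The monitored batch execution described above (run a refresh job, log the result, retry, and flag failures for the on-call engineer) can be sketched as a small shell wrapper. The log path, the single-retry policy and the example pmcmd invocation are illustrative assumptions, not the actual production setup:

```shell
#!/bin/sh
# Sketch of a monitored batch wrapper: run a job command, log the outcome,
# retry once on failure, and record a FAILED marker from which an on-call
# page or Remedy case could be raised. Log path and retry policy are
# illustrative assumptions.
LOG=${LOG:-./refresh_job.log}

run_job() {
    "$@" >>"$LOG" 2>&1
    rc=$?
    if [ $rc -ne 0 ]; then
        echo "RETRY after rc=$rc: $*" >>"$LOG"
        "$@" >>"$LOG" 2>&1
        rc=$?
    fi
    if [ $rc -ne 0 ]; then
        echo "FAILED rc=$rc: $*" >>"$LOG"
    fi
    return $rc
}

# A real invocation might wrap an Informatica workflow start, e.g.:
#   run_job pmcmd startworkflow -sv Int_Svc -d Domain_Prod -f CORP wf_daily_refresh
```

A scheduler such as $Universe or Control-M would call this wrapper and page the operations team when it exits non-zero.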
Environment: Informatica Power Center, Business Objects 6.1/5.1, Oracle 9i/8i, Toad, SQL Server 2000, DB2 7.0, Teradata
