Teradata/Informatica/ETL Resume
Desired Industry: Information Technology
SpiderID: 84985
Desired Job Location: Dallas, Virginia
Date Posted: 5/23/2025
Type of Position: Contractor
Availability Date: 2 weeks
Desired Wage:
U.S. Work Authorization: Yes
Job Level: Management (Manager, Director)
Willing to Travel:
Highest Degree Attained: Masters
Willing to Relocate: Yes
Objective: I am a Senior Consultant with more than 15 years of experience in solution design, development, and stakeholder engagement across the Telecom, Banking, Retail, F&B, and Insurance domains. My key skills are Teradata, Snowflake, ETL, and UNIX. I also have experience as a Data Analyst, Migration Lead, cross-team coordinator, and mentor. I would like to contribute to design, development, and analysis.
Experience:
2023.11 – Present  Senior Consultant, Verizon Wireless
Project: GCOMM
• Create ETL data pipelines to load data from cloud storage to BigQuery
• Create DAGs using Python and the framework, and build data pipelines using Airflow (a minimal sketch follows this role)
• Create tables and views for GCP and migrate them through the Git CI/CD Jenkins pipeline
• Understand the new framework for deploying various objects to GCP
• Create/modify JSON files to integrate with the framework
• Load and maintain Airflow and incremental metadata in Google Datastore
Project: EDW Core Services
• Provide on-call support and work on production failures and their resolution
• Create MLoad, TPT, and BTEQ scripts as needed for various enhancements
• Create UNIX scripts as per new requirements
• Create PRs to mitigate and apply quick fixes for production failures
• Discuss various requirements and enhancements with the business
• Create complex SQL queries as per business requirements
• Perform unit and SIT testing for tickets and enhancements
• Coordinate with the business to complete UAT and migrate code to production
• Create new jobs and deploy them into the scheduler
Project: Rich Call Data (RCD)
• Create TPT load scripts to load call data from UB to EDW
• Create BTEQ scripts to aggregate data
• Create FastExport scripts to generate summary reports and share them with CTIA
Project: Elvis
• Analyze all existing tables in EDW to identify the impacted list as part of the upstream loan number changes
• Create BTEQ to replicate the DWE feed logic so the feeds can still be generated after migration to GCP
Tools: Teradata, UNIX, GCP, JIRA, Agile methodologies, Jenkins, Git
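The Airflow/GCP pipeline work listed under the GCOMM project could look roughly like the following minimal sketch of a daily GCS-to-BigQuery load DAG. The DAG name, bucket, object pattern, and destination table are hypothetical placeholders, and the sketch assumes the apache-airflow-providers-google package; it illustrates the general approach rather than the project's actual pipeline.

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

    # Hypothetical daily load: pick up flat files landed in a GCS bucket and
    # append them to a BigQuery staging table.
    with DAG(
        dag_id="gcs_to_bq_daily_load",        # hypothetical DAG name
        schedule_interval="@daily",
        start_date=datetime(2024, 1, 1),
        catchup=False,
    ) as dag:
        load_to_bq = GCSToBigQueryOperator(
            task_id="load_gcs_to_bigquery",
            bucket="example-landing-bucket",            # hypothetical bucket
            source_objects=["edw/feed_*.csv"],          # hypothetical object pattern
            destination_project_dataset_table="example-project.staging.feed_raw",  # hypothetical table
            source_format="CSV",
            write_disposition="WRITE_APPEND",
            autodetect=True,                            # infer schema from the files
        )

In practice a DAG like this would be generated or configured through the in-house framework and deployed via the Git/Jenkins CI/CD pipeline described above.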
2019.10 – 2023.05  Senior Consultant/Data Architect, ABInbev (St. Louis, MO)
Project: People Transformation
The project involved migration of legacy systems to Teradata. My responsibilities were:
• Work with the Data Modeler and collect the technical requirements
• Extract data from multiple source systems and load it into Teradata
• Write complex SQL queries on Teradata and SQL Server as per business requirements
• Create utility scripts such as MLoad, FLoad, and TPT to load data from flat files
• Create FastExport scripts to export data to flat files
• Create mappings, workflows, and sessions using Informatica as per business requirements
• Create export scripts such as FastExport and TPT export to take data out of Teradata and load it into Azure SQL Server
• Write BTEQ scripts to transform data and load it into temporary tables
• Tune poorly performing SQL queries
• Work with the team to resolve Data Validation and UAT bugs raised by the Zones and generate new files
• Design and develop a data quality framework to ensure the accuracy of the data
• Design and develop reconciliation reports to perform automatic data validation between source and target systems (a minimal sketch appears after the Societe Generale role below)
• Perform code review and performance tuning as needed
Tools: Teradata 16, Azure, SQL Server, Unix, DevOps, SAP

2018.11 – 2019.08  Senior Consultant/Data Engineer, Volvo
Project: IMS (Inventory Management System)
The project involved modifying the existing application as per new requirements and enhancements. I worked directly with the customer to discuss requirements and delivery. The major activities were creating technical documentation, development, unit testing, and UAT support. I created various Teradata scripts such as BTEQ, MLoad, FLoad, and FastExport, and was involved in discussions related to data model modifications. The project was implemented in Agile.
Tools: Teradata 14, SQL Server, Unix, Quality Center

2016.03 – 2018.05  Technical Architect/Lead, Societe Generale
Project: IFRS9 Implementation
The project involved modifying the existing SAFIR application and aligning it to IFRS9 standards. The project was implemented using Agile methodology. My responsibilities included design, development, support for testing (unit, system integration, user acceptance), creating technical design documentation (high and low level), involvement in functional discussions, and stakeholder management. I developed Teradata scripts (TPT, FastLoad, MLoad, BTEQ), Informatica mappings and workflows, and shell scripts (to automate Teradata and Informatica data flows). Through data analysis I contributed to data model improvements.
Tools: Teradata 14.1, Informatica, Unix, HP Agile Manager, SVN
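The reconciliation reporting mentioned in the ABInbev role above can be reduced, in its simplest form, to a row-count comparison like the minimal Python sketch below. Table names, hosts, and credentials are hypothetical placeholders; the sketch assumes the pyodbc and teradatasql drivers and shows only the cheapest first check of source-to-target validation, not the project's actual framework.

    import pyodbc          # SQL Server source (assumes ODBC Driver 17 is installed)
    import teradatasql     # official Teradata Python driver

    # Hypothetical table pair to reconcile after a load run.
    SOURCE_TABLE = "dbo.employee_dim"       # hypothetical SQL Server table
    TARGET_TABLE = "edw_db.employee_dim"    # hypothetical Teradata table

    def row_count(cursor, table):
        """Return the row count of a table via a simple COUNT(*)."""
        cursor.execute(f"SELECT COUNT(*) FROM {table}")
        return cursor.fetchone()[0]

    # Connection details are placeholders; a real job would read them from config.
    sqlserver_conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=example-sql-host;DATABASE=people;UID=etl_user;PWD=secret"
    )
    teradata_conn = teradatasql.connect(
        host="example-td-host", user="etl_user", password="secret"
    )

    src = row_count(sqlserver_conn.cursor(), SOURCE_TABLE)
    tgt = row_count(teradata_conn.cursor(), TARGET_TABLE)

    # A fuller reconciliation report would also compare checksums or key sums;
    # row counts are the first, cheapest check.
    status = "MATCH" if src == tgt else "MISMATCH"
    print(f"{SOURCE_TABLE} -> {TARGET_TABLE}: source={src}, target={tgt}, {status}")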
2013.07 – 2016.02  Solution Designer, Capgemini
The project involved migration of retail data from SAP to the Teradata EDW. I was the solution designer and ETL lead for the project. My responsibilities included solution design, covering the design of the ETL path from SAP to Teradata using BODS and Teradata scripts, and review and approval of the semantic and front-end design. I also created the schedule and job design in the Maestro scheduler, and was involved in creating low-level technical specifications for the ETL team, designing ETL strategies, development, and code review.
Tools: Teradata 13, UNIX, BODS, SharePoint, Quality Center, Maestro
2011.02 – 2013.06  Technical Lead, InRhythm Technical Solutions
The project involved enhancements to a data mart. The data mart receives daily sales data from the enterprise data warehouse (EDW), aggregated across hierarchies such as customer, product, time, and geography; this aggregated data supports the KPIs required by the business. I was responsible for offshore delivery and was involved in coding, testing, technical documentation, and mentoring and guiding the team, along with onsite-offshore coordination.
Tools: Teradata 13, UNIX, Tivoli
2008.10 – 2010.12  Senior Developer - Teradata/Informatica, AT&T
Project: Sunrise Dashboard Application
Sunrise is a dashboard application that presents business KPIs. These KPIs are computed daily by aggregating EDW data and published to the dashboard. The project involved requirements for enhancements and fixes. My responsibility was to create the technical solution for these requirements, along with development, unit testing, and migration of the code to production. I created Teradata BTEQ, FastExport, FastLoad, and MLoad scripts, and UNIX scripts to automate the Informatica and Teradata code.
Tools: Teradata 12, Unix, Test Director
2006.10 – 2008.09  Teradata Developer, Nationwide Insurance
The project involved working on various enhancements as per business requirements. My responsibility as a developer was to create various Teradata scripts such as MLoad, FLoad, FastExport, and BTEQ as per the requirements.
Tools: Teradata 12, Informatica 8.1, Unix, Test Director
Education:
2004 – 2004  M.S. in Computers, Texas A&M University (unfinished)
2000 – 2003  Master of Computer Applications, Madras University, Chennai
1997 – 2000  Bachelor of Commerce, Sri Venkateswara University, Tirupati
Skills:
Databases: Teradata, Oracle, SQL Server (On-Prem & Cloud), Postgres
ETL Tools: Informatica, Data Stage
Cloud Technologies: Azure
Data Modelling Tools: Erwin, Visio
Scripting Languages: UNIX
Data Visualization Tools: Power BI