Job Description
As a Data Engineer in a data migration role, you will use your extensive MS SQL Server and Pentaho platform skills to help lead and improve the S&P Global Platforms. You will build, maintain, and extend a data migration repository to successfully migrate data from source systems to target systems.
What’s in it for you: This position provides a tremendous opportunity to grow personally and professionally. You will learn and enhance your skills while working on enterprise-level products and new technologies, and use your technical project management skills to lead large projects end to end in an Agile environment across one or more OTC teams. We invest in the right technology and people so we can turn raw data into actionable insights. That is why we are critical to governments, companies, and individuals the world over. Our systems support operational excellence in all that we do.
Responsibilities
- Responsible for end-to-end implementation of MS SQL Server scripts, including views, stored procedures, triggers, advanced joins, etc.
- Perform data extraction and data loading, resolve errors, and check filter criteria.
- Understand the steps: input, output, and connection to the database.
- Understand jobs and transformations, and inspect the data.
- Understand Salesforce basics: fields, objects, and lookups.
- Knowledge of schemas, databases, and views.
- Provide technical knowledge and expertise to support the requirements definition, design, and development of data warehouse components/subsystems, specifically Extract-Transform-Load (ETL) or Extract-Load-Transform (ELT) technologies and/or Change Data Capture (CDC) technologies and implementations.
- Retrieve the fallout report, analyze bugs and errors, and understand how to obtain the output file.
- Research, design, develop, and modify the ETL and related database functions, implementing changes/enhancements to the system.
- Design ETL processes and develop source to target transformations, and load processes.
- Develop, test, integrate and deploy Kettle transformations including performing configuration management activities related to the ETL/CDC environment.
- Resolve requests for assistance in troubleshooting issues assigned to the team.
- Support functional/regression testing activities in support of system functionality releases, patches, and upgrades.
- Support production jobs, debug any issues during failures, perform root cause analysis (RCA), and fix the issue to resume operations.
- Knowledge of scheduling tools such as TWS, Informatica schedulers, and SSIS is a plus.
- Analyze process improvement areas and recommend changes to processes and procedures for efficiency, cost savings, etc.
- Work in partnership with key business users to identify potential ways of improving the efficiency and/or effectiveness of current business operations.
- Build a deep technical understanding of how the business operates: departmental/divisional structure, functions, processes, procedures, and current application functionality.
- Assist with the design of MS SQL solutions and project planning; add value in all stages of project work (definition, development, deployment).
- Strong background in Pentaho, Informatica, or similar ETL tools.
- Know how to perform data migration using tools such as Data Loader, Workbench, and dataloader.io.
- Lead and coordinate QA, UAT, and go-live activities.
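The extract, transform, load, and fallout-reporting responsibilities above can be sketched in miniature. This is a hypothetical Python example, not code from an actual S&P Global system: an in-memory SQLite database stands in for MS SQL Server, a CSV string stands in for a flat-file source, and all table and column names are illustrative.

```python
import csv
import io
import sqlite3

# Hypothetical source extract: a flat file with one malformed row.
SOURCE_CSV = """account_id,account_name,annual_revenue
A001,Acme Corp,125000
A002,Globex,not_a_number
A003,Initech,98000
"""

def run_etl(raw_csv):
    """Extract rows from CSV, transform/validate them, load the good
    rows into SQLite, and collect failures as a fallout report."""
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE account (account_id TEXT PRIMARY KEY, "
        "account_name TEXT, annual_revenue REAL)"
    )
    fallout = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        try:
            # Transform: trim text fields, cast revenue to a number.
            record = (
                row["account_id"].strip(),
                row["account_name"].strip(),
                float(row["annual_revenue"]),
            )
        except ValueError as exc:
            # Fallout report: capture the row and why it failed.
            fallout.append({"row": row, "error": str(exc)})
            continue
        conn.execute("INSERT INTO account VALUES (?, ?, ?)", record)
    conn.commit()
    return conn, fallout

conn, fallout = run_etl(SOURCE_CSV)
loaded = conn.execute("SELECT COUNT(*) FROM account").fetchone()[0]
print(loaded)        # 2 rows loaded
print(len(fallout))  # 1 row in the fallout report
```

In a real Pentaho/Kettle job the extract, transform, and load stages would be separate steps in a transformation, but the same validate-then-route-failures pattern applies.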
What We’re Looking For
- Thorough knowledge of delivering projects in an agile scrum environment
- Able to provide leadership and be a productive, participating member of the team.
- Must have strong SQL skills.
- Must be able to review code related to customization as well as integration.
- Be the lead subject matter expert in driving industry best practices for the Salesforce ecosystem and associated integrated tools.
- Experience developing and maintaining code using MS SQL, SSIS, and Pentaho Data Integration.
- Experience developing Pentaho Data Integration jobs for integrations with Salesforce, flat files, MS SQL Server, Oracle, and other applications.
- Experience working with Salesforce data objects and their validations, and writing SOQL queries to extract data from Salesforce.
- Prior experience transforming data from a legacy Salesforce and Zuora implementation to a Salesforce CPQ and Billing implementation using Pentaho Data Integration.
- Strong analytical and debugging skills.
- Experience designing large-scale, high-performance data integration solutions.
- Knowledge of Oracle applications and of performing integrations with them.
- Knowledge of the Zuora subscription and billing model is a plus.
- Experience working in an Agile methodology and delivering code within stipulated timelines.
- Experience writing performance-efficient, complex SQL queries and stored procedures in Microsoft SQL Server, and tuning existing queries and procedures.
- Perform unit testing, code deployments, and maintenance activities on existing code.
- Perform load runs to higher environments and resolve any bugs identified.
- Experience working with, or knowledge of, other ETL tools such as Informatica is a plus.
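The SQL skills listed above (views, joins, indexing for query performance) can be illustrated with a minimal sketch. SQLite stands in for Microsoft SQL Server here, so T-SQL-specific features such as stored procedures are omitted, and the schema and object names are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical customer/order schema.
cur.executescript("""
CREATE TABLE customer (customer_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE sales_order (
    order_id INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customer(customer_id),
    amount REAL
);
-- Index the foreign key so the join below can seek instead of scan,
-- the same tuning step you would take in MS SQL Server.
CREATE INDEX ix_order_customer ON sales_order(customer_id);

-- A view encapsulating a join plus aggregation.
CREATE VIEW v_order_totals AS
SELECT c.name, SUM(o.amount) AS total
FROM customer c
JOIN sales_order o ON o.customer_id = c.customer_id
GROUP BY c.name;
""")

cur.executemany("INSERT INTO customer VALUES (?, ?)",
                [(1, "Acme"), (2, "Globex")])
cur.executemany("INSERT INTO sales_order VALUES (?, ?, ?)",
                [(10, 1, 100.0), (11, 1, 50.0), (12, 2, 75.0)])
conn.commit()

# totals maps customer name -> summed order amount.
totals = dict(cur.execute("SELECT name, total FROM v_order_totals"))
print(totals)
```

The same view, written in T-SQL with `CREATE VIEW`, would behave identically; in production you would also compare execution plans before and after adding the index.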