How to Crack This Job:
- First, make sure the skills required for this job role appear in your resume before you apply; matching your resume to the listed requirements makes it much easier to get shortlisted.
- Next, prepare for the selection exam by studying the skills it covers; focused preparation makes it much easier to get selected for this role.
Education Qualification for This Role:
- Bachelor's degree in Engineering/Computer Science
Eligibility Criteria:
- 0-3 years of hands-on experience with at least one of the following databases: Postgres, Oracle, Teradata, DB2, Sybase, etc.
- Experience with Linux and Windows operating platforms is a must.
Skill Requirements for This Job:
- Demonstrated strong, hands-on experience and background in database principles and concepts.
- Strong experience in designing and developing large-scale data solutions, with a solid understanding of data modeling, data processing, data warehousing, and best practices.
- Proficient in SQL and writing/understanding stored procedures, packages and PL/SQL programming in general.
- Proficient in database performance tuning, with a clear understanding of indexing, partitioning, SQL plan analysis, the optimizer, hints, etc.
- Expertise in Extract-Transform-Load (ETL)/Extract-Load-Transform (ELT) concepts, with a clear understanding of each stage and the relevant tools/technologies.
- Up to 3 years of hands-on expertise with ETL tools such as Informatica, DataStage, Talend, etc., and the ability to back up that experience with good examples of designing/developing robust ETL processes.
- Strong background in data profiling and data analytics workloads, and their role in the design/development of data-related processes.
- Ability to work with the multiple teams involved in an application's end-to-end design, and to understand and clearly articulate the data pipelines and the intricacies/inter-dependencies involved.
- Provide support/guidance to the development teams during the analysis, development, review and testing processes.
- Exposure to any cloud platform (AWS/GCP/Azure) and knowledge of data engineering/processing on the cloud using tools/technologies such as Python, Spark, Kafka, Glue, etc. is an added advantage.
- Good communication and interpersonal skills.
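The tuning skills listed above (indexing, reading SQL plans) can be practiced without any commercial database. As a purely illustrative sketch, assuming nothing beyond Python's built-in sqlite3 module (the table and index names here are made up for the demo), this shows how adding an index changes a query plan from a full table scan to an index lookup:

```python
import sqlite3

# In-memory database with a small sample table (names are illustrative only).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE ratings (id INTEGER PRIMARY KEY, issuer TEXT, grade TEXT)")
cur.executemany(
    "INSERT INTO ratings (issuer, grade) VALUES (?, ?)",
    [(f"issuer_{i}", "AAA") for i in range(1000)],
)

# Without an index on issuer, the planner must scan the whole table.
plan_before = cur.execute(
    "EXPLAIN QUERY PLAN SELECT grade FROM ratings WHERE issuer = 'issuer_500'"
).fetchall()
print(plan_before)

# After creating an index, the planner can do a direct index lookup instead.
cur.execute("CREATE INDEX idx_ratings_issuer ON ratings (issuer)")
plan_after = cur.execute(
    "EXPLAIN QUERY PLAN SELECT grade FROM ratings WHERE issuer = 'issuer_500'"
).fetchall()
print(plan_after)
```

The same EXPLAIN-driven workflow carries over to Postgres or Oracle in interviews, though each engine formats its plan output differently.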
Roles and Responsibilities for This Job:
- Responsible for owning the design/development aspects of database/ETL operations for multiple critical ratings applications in MIS Tech.
- Must be able to operate effectively and collaboratively in a team of skilled professionals with a wide variety of skillsets, and establish cordial, healthy working relationships.
- Analyze the business and technical requirements to define and design the database/ETL service interfaces needed to support them.
- Plan, engineer, design, implement, and manage database/ETL services for the technologies in scope.
- Collaborate with the developers and application leads/architects to deliver solutions based on Moody’s standards and requirements.
- Partner with application support and development teams to deliver resilient and cost-effective solutions that meet demanding business requirements.
- Provide stability to the current architectural solutions and explore opportunities to redesign/rewrite components to make the solutions robust, reliable and resilient.
- Play a key role in sunsetting some of the legacy applications/processes by carefully evaluating the inter-dependencies, performing impact analysis, and migrating them to modern data platforms in the future.
- Lead the developers on the team by guiding them in a direction aligned with Moody's future vision from a technology/culture perspective.
- Enable an environment of continuous learning, knowledge sharing, and teamwork.
Location:
- Bangalore, Karnataka, IN | Gurgaon, Haryana, IN.
FOR MORE DETAILS | CLICK HERE |
APPLY THROUGH LINK | CLICK HERE |