
Prudential's purpose is to be partners for every life and protectors for every future. Our purpose inspires everything we do, creating a culture in which diversity is celebrated and inclusion is assured for our people, customers, and partners. We provide a platform for our people to do their best work and make an impact on the business, and we support our people's career ambitions. We pledge to make Prudential a place where you can Connect, Grow, and Succeed.
This role designs data integration: ingesting, transforming, and integrating structured, semi-structured, and unstructured data, and delivering it to a scalable data architecture platform that consolidates data from various sources into a single data source.

Job Responsibilities:
Ingest, transform and blend/integrate structured and unstructured data, and deliver it to an MDM system, data warehouse and/or data lake
Apply both technical processes and business logic to combine data from disparate sources into cohesive, meaningful and valuable information, with quality, governance and compliance considerations
Design, build and manage the information and big data infrastructure that helps analyze and process the data the organization requires, and optimize systems to perform smoothly
Prepare data for data scientists' exploration and discovery processes
Evaluate, compare and improve different approaches, including design pattern innovation, data lifecycle design, data ontology alignment, annotated datasets, and Elasticsearch-based approaches
Job Requirements:
Qualifications
University degree in Computer Science, Engineering, or a related technical field
5+ years of experience with Flink/Spark and Databricks
5+ years of experience with Azure (DP-200, DP-201, or DP-203 certification is a plus)
3+ years of experience mining data as a data analyst
3+ years using programming languages such as Java, DAX, MDX, SQL, or Python
3+ years of experience with Power BI, Tableau or Qlik
3+ years of experience in the life insurance domain is required
Passionate about analytics and machine learning technology and applications, and eager to learn
English communication skills
Knowledge and skills
Good knowledge of Big Data technologies such as Spark and Hadoop/MapReduce
Good knowledge of Azure services such as Storage Accounts, Azure Databricks, etc.
Good knowledge of SQL and excellent coding skills.
Strong knowledge of data modeling and data mining.
Self-development, communication and problem-solving skills
Open-minded, multi-tasking, team-oriented, flexible, and interested in learning new things
Competencies
Leadership Capability Model - Intermediate level
Prudential is an equal opportunity employer. We provide equality of opportunity and benefits for all who apply and who perform work for our organisation, irrespective of sex, race, age, ethnic origin, educational, social and cultural background, marital status, pregnancy and maternity, religion or belief, disability, part-time/fixed-term work, or any other status protected by applicable law. We encourage the same standards from our recruitment and third-party suppliers, taking into account the context of grade, job and location. We also allow for reasonable adjustments to support people with individual physical or mental health requirements.
Job ID: 146489283