The Role
The intern will build data pipelines, orchestrate workflows, model data, and collaborate on data discovery, gaining knowledge of modern data technologies.
Job Description
Keppel is a leading global asset manager and operator with strong capabilities in energy & environment, urban development and connectivity, creating solutions for a sustainable future.
The intern will be responsible for the following:
• Building data pipelines
• Workflow orchestration
• Data modelling
• Collaborating with the business on data discovery exercises
Learning Outcomes:
• Exposure to cutting-edge modern data technologies and an understanding of Keppel's business model
Job Requirements:
• Python and SQL are a must.
• Experience with Airflow, dbt, or Snowflake is preferred.
• Basic knowledge of data warehouse design principles.
Business Segment: Corporate
Platform: Operating Division
Top Skills
Airflow
dbt
Python
Snowflake
SQL
The Company
What We Do
Keppel is a global asset manager and operator with strong expertise in sustainability-related solutions spanning the areas of infrastructure, real estate and connectivity. Headquartered in Singapore, Keppel operates in more than 20 countries worldwide, providing critical infrastructure and services for renewables, clean energy, decarbonisation, sustainable urban renewal and digital connectivity.
Keppel creates value for investors and stakeholders through its quality investment platforms and diverse asset portfolios, including private funds and listed real estate and business trusts, and has a total portfolio with more than S$65 billion of assets under management.