AWS Data Engineer

Reposted 21 Days Ago
Johannesburg, City of Johannesburg, Gauteng
In-Office
Senior level
Information Technology • Consulting
The Role
Develop and manage data pipelines on AWS, utilizing Glue and various AWS services for ETL processes, and collaborate with stakeholders to design data solutions.

Amidel is an Information Technology and Business Consulting Company that provides highly specialised solutions to large and small enterprises in both the private and public sectors.

We are seeking a highly motivated and experienced Intermediate/Senior AWS Data Engineer. The ideal candidate should have a strong background in C# or Python programming and building data pipelines on AWS, especially with AWS Glue Jobs using PySpark or AWS Glue Spark. This role offers an exciting opportunity to collaborate with leading financial institutions, contributing to the design and implementation of data pipelines. The candidate should have at least the AWS Certified Data Engineer - Associate certificate.

Job family: Data Engineering/Computer Science/Finance

Minimum Experience: 
5 Years Python/C# Development
3 Years AWS Data Engineering

AWS Certified Data Engineer - Associate certificate

Education Requirements: 
Bachelor's degree in Computer Science, Information Systems, or related field.
Advantageous: AWS Certified Machine Learning – Specialty Certificate


Contract Type: Permanent


RESPONSIBILITIES

Responsibilities differ across client engagements but may include:

·        Create data models that extract information from various sources and store it in a usable format.

·        Lead the design, implementation, and successful delivery of large-scale, critical, or complex data solutions involving significant volumes of data.

·        Apply expertise in SQL and a strong understanding of ETL and data modelling.

·        Ingest data into AWS S3 and perform ETL into RDS or Redshift.

·        Use AWS Lambda (C# or Python) for event-driven data transformations.

·        Design and implement security measures to protect data from unauthorized access or misuse.

·        Maintain data integrity by designing backup and recovery procedures.

·        Automate the migration process in AWS from development to production.

·        Deliver digestible, contemporary, and timely data content to support and drive business decisions.

·        Take part in all aspects of data engineering, from delivery planning, estimating, and analysis through to data architecture, pipeline design, delivery, and production implementation.

·        From day one, contribute to the design and implementation of complex data solutions, ranging from batch to streaming and event-driven architectures, across cloud, on-premise, and hybrid client technology landscapes.

·        Optimize cloud workloads for cost, scalability, availability, governance, and compliance.
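The event-driven Lambda responsibility above can be sketched as a minimal Python handler. This is an illustrative sketch only, not Amidel's actual pipeline code: the event follows the standard S3 PUT notification shape, and the bucket and key names are hypothetical.

```python
import json


def lambda_handler(event, context):
    """Minimal sketch of an event-driven transformation.

    Reads object references from a standard S3 event notification and
    returns normalized records for a downstream ETL step. The "queued"
    status tag is an illustrative transformation, not a real pipeline.
    """
    records = []
    for rec in event.get("Records", []):
        bucket = rec["s3"]["bucket"]["name"]
        key = rec["s3"]["object"]["key"]
        records.append({"source": f"s3://{bucket}/{key}", "status": "queued"})
    return {"statusCode": 200, "body": json.dumps(records)}
```

Invoking the handler with a sample S3 PUT event returns one normalized record per uploaded object, which a downstream Glue job or queue consumer could then pick up.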



QUALIFICATIONS AND COMPETENCIES



·        Must have experience with AWS Glue Jobs using PySpark or AWS Glue Spark.

·        Real-time ingestion using Kafka is an added advantage.

·        Strong SQL and C# or Python programming knowledge.

·        Object-oriented principles in C# or Python: classes and inheritance.

·        Expert knowledge of data engineering packages and libraries and related functions in C# or Python.

·        AWS technical certifications (Developer Associate or Solutions Architect).

·        Experience with development and delivery of microservices using serverless AWS services (S3, RDS, Aurora, DynamoDB, Lambda, SNS, SQS, Kinesis, IAM).

·        Ability to understand and articulate requirements to technical and non-technical audiences.

·        Experience working with relational databases such as PostgreSQL, SQL Server, and MySQL.

·        Knowledge of scripting and automation using tools such as PowerShell, Python, Bash, Ruby, or Perl.

·        Stakeholder management and communication skills, including prioritisation, problem-solving, and interpersonal relationship building.

·        Ability to troubleshoot data issues and errors effectively and efficiently.

·        Strong experience in SDLC delivery, including waterfall, hybrid, and Agile methodologies.

·        Experience delivering in an agile environment.

·        Experience in implementing and delivering data solutions and pipelines on AWS Cloud Platform.

·        A strong understanding of data modelling, data structures, databases, and ETL processes.

·        An in-depth understanding of large-scale data sets, including both structured and unstructured data.

·        Knowledge and experience in delivering CI/CD and DevOps capabilities in a data environment.

·        Ability to clearly communicate complex technical ideas.

·        Experience in the financial industry is a plus.

·        An AWS Certified Machine Learning – Specialty Certificate is an advantage.

If you are passionate about working with data and have a desire to work in a dynamic and challenging environment, we encourage you to apply. This is an excellent opportunity to make a significant impact in a leading investment bank and to grow your career in data engineering.

Top Skills

AWS
AWS Glue
AWS Lambda
AWS RDS
AWS Redshift
AWS S3
C#
Kafka
Perl
PowerShell
PySpark
Python
Ruby
SQL
The Company
HQ: Gauteng
30 Employees
Year Founded: 2013

What We Do

Amidel is an Information Technology and Business Consulting Company that provides highly specialised solutions to both large and small enterprises in the private and public sectors. Founded in 2013, Amidel employs experienced professionals and industry leaders with over 124 years of combined experience and an average of 8 years' experience each.

