AWS Data Engineer

Posted 5 Days Ago
Bengaluru, Karnataka
Mid level
Software • Consulting
The Role
The AWS Data Engineer will design and develop scalable ETL pipelines, collaborate with cross-functional teams to meet project objectives, perform data analysis to address quality issues, automate deployment processes in AWS, and document data governance practices while staying updated on data engineering best practices.

Company Description

Our parent organization, NEC Corporation, is a $25 billion company with offices spread across the globe.
NEC is a multinational provider of information technology (IT) services and products, headquartered in Tokyo, Japan, with 122 years of experience evolving with technology and innovation. It is recognized globally as a ‘Top 50 Innovative Company’.


NEC Corporation has established itself as a leader in the integration of IT and network technologies while promoting the brand statement of “Orchestrating a brighter world.” 
NEC enables businesses and communities to adapt to rapid changes in both society and the market, providing the social values of safety, security, fairness, and efficiency to promote a more sustainable world where everyone has the chance to reach their full potential.

NEC Software Solutions (India) Private Limited is based in Mumbai (Worli & Airoli) and Bangalore, with an employee strength of 1,500+.
It is one of the foremost providers of end-to-end IT services across various sectors.
We work with diverse industry verticals, including publishing, media, financial services, retail, healthcare, and technology companies around the world. Our customers range from two-person startups to billion-dollar listed companies.

We have more than 30 years of experience in providing end-to-end IT services across the globe and have earned a reputation for delighting our customers by consistently surpassing expectations and helping them deliver robust, market-ready software products that meet the highest standards of engineering and user experience.
Supported by more than 1,300 exceptionally talented professionals, we are a hub for offshore support and technology services.

Job Description

What you’ll be doing: 

  • Building Scalable Data Pipelines: Designing and developing high-quality, scalable ETL pipelines for processing big data using AWS analytical services, leveraging no-code tools and reusable Python libraries to ensure efficiency and reusability (see the sketch after this list).
  • Collaborating & Aligning with Project Goals: Working closely with cross-functional teams, including senior data engineers, engineering managers, and business analysts, to understand project objectives and deliver robust data solutions, following Agile/Scrum principles to drive consistent progress.
  • Data Discovery & Root Cause Analysis: Performing data discovery and analysis to uncover data anomalies, while identifying and resolving data quality issues through root cause analysis. Making informed recommendations for data quality improvement and remediation.
  • Automating Deployments: Managing the automated deployment of code and ETL workflows within cloud infrastructure (AWS preferred) using tools such as GitHub Actions, AWS CodePipeline, or other modern CI/CD systems to streamline processes and reduce manual intervention.
  • Effective Time Management: Demonstrating strong organizational and time management skills, prioritizing tasks effectively, and ensuring the timely delivery of key project milestones.
  • Documentation & Data Mapping: Developing and maintaining comprehensive data catalogues, including data mapping and documentation, to ensure data governance, transparency, and accessibility for all stakeholders.
  • Learning & Contributing to Best Practices: Continuously improving your skills by learning and implementing data engineering best practices, staying updated on industry trends, and contributing to team knowledge-sharing and codebase optimization.
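
To illustrate the kind of pipeline work described above, here is a minimal sketch of an AWS Glue (PySpark) job that reads raw JSON from S3, deduplicates it, and writes partitioned Parquet back to S3. The job parameters, bucket paths, and the event_id/event_date columns are hypothetical placeholders; an actual pipeline would depend on the project's data model and tooling.

```python
# A minimal sketch of a Glue ETL job (hypothetical paths and columns).
import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Job parameters are passed by Glue as --JOB_NAME, --source_path, --target_path.
args = getResolvedOptions(sys.argv, ["JOB_NAME", "source_path", "target_path"])

sc = SparkContext()
glue_context = GlueContext(sc)
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read raw, semi-structured JSON events from S3.
raw = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": [args["source_path"]]},
    format="json",
)

# Drop duplicate records and null keys before loading downstream.
df = raw.toDF().dropDuplicates(["event_id"]).filter("event_id IS NOT NULL")

# Write partitioned Parquet for efficient querying from Redshift Spectrum or Athena.
df.write.mode("overwrite").partitionBy("event_date").parquet(args["target_path"])

job.commit()
```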


What we’re looking for: 

  • Experience: 2 to 5 years of experience in data engineering or related analytical roles, with a minimum of 2 years working on cloud and big data technologies on AWS. AWS experience is highly preferred, and familiarity with Google BigQuery and Google Analytics 4 is a plus.
  • Data Expertise: Strong analytical skills in handling and processing structured and semi-structured datasets, with hands-on experience in designing and implementing scalable data engineering solutions on AWS.
  • Cloud Technologies: Proficiency in building data pipelines and working with data warehousing solutions on AWS (Redshift, S3, Glue, Lambda, etc.; see the load sketch after this list). Experience with alternative cloud platforms (e.g., Google Cloud, Azure) is a bonus.
  • Programming Skills: Strong programming proficiency in Python, with additional experience in Java/Scala being a plus. Ability to write efficient, reusable, and scalable code to process large datasets.
  • Data Warehousing: Proven experience with modern data warehousing tools like AWS Redshift, Snowflake, or equivalent platforms, with a focus on performance optimization and query tuning.
  • Version Control & Automation: Hands-on experience with version control systems like GitHub, GitLab, or Bitbucket, and with CI/CD pipelines using tools like GitHub Actions, AWS CodePipeline, Jenkins, etc., to ensure smooth, automated deployments.
  • Data Governance & Security: Knowledge of data governance practices, compliance standards, and security protocols in a cloud environment.
  • Optional Skills: Experience in business intelligence (BI) tools like Tableau, Power BI, or QuickSight, and exposure to data visualization techniques will be an advantage.
  • Collaboration & Problem Solving: Ability to work in a cross-functional team, collaborating closely with data scientists, analysts, and product managers to deliver high-impact data solutions. Strong problem-solving skills and adaptability to changing business requirements.
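
As a concrete example of the AWS and Python skills listed above, the following sketch loads curated Parquet from S3 into Redshift using the Redshift Data API and polls for completion. All names (region, cluster, database, user, IAM role, bucket, and table) are hypothetical placeholders, not details of the actual environment.

```python
# A minimal sketch of a Redshift load step (all identifiers are hypothetical).
import time
import boto3

client = boto3.client("redshift-data", region_name="ap-south-1")

COPY_SQL = """
COPY analytics.events
FROM 's3://example-bucket/curated/events/'
IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
FORMAT AS PARQUET;
"""

# Submit the COPY statement against the cluster.
resp = client.execute_statement(
    ClusterIdentifier="analytics-cluster",
    Database="prod",
    DbUser="etl_user",
    Sql=COPY_SQL,
)

# Poll the statement until Redshift reports a terminal state.
while True:
    status = client.describe_statement(Id=resp["Id"])["Status"]
    if status in ("FINISHED", "FAILED", "ABORTED"):
        break
    time.sleep(5)

print(f"COPY finished with status: {status}")
```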

Additional Information

Excellent communication skills are required.

Top Skills

Python
The Company
Hemel Hempstead
1,380 Employees
On-site Workplace

What We Do

Innovation when it matters most. We build software and services that help keep people safer, healthier, and better connected worldwide. Our customers are national governments and international health bodies. They’re also police forces, emergency services, local authorities, and housing providers, all working to prevent harm and provide the right support. Our software and services get them great outcomes.

