AWS Cloud Data Engineer

Chantilly, VA, USA

Overview

BigBear.ai is seeking a highly skilled and experienced AWS Cloud Data Engineer to join our dynamic team. The ideal candidate will be responsible for building and optimizing our data ingestion processes using a variety of AWS services. A strong background in AWS environments, particularly in constructing data lakes, is highly desirable.

Hybrid, with 3 days onsite in Chantilly, VA.

What you will do

  • Design and implement data ingestion pipelines using AWS services such as DMS, DataSync, Glue, Athena, Lambda, S3, RDS, and Step Functions
  • Develop and optimize data processing and storage strategies, ensuring efficient data flow within the AWS ecosystem
  • Architect and build AWS-based data lakes, ensuring scalability, reliability, and security
  • Collaborate with cross-functional teams to understand data needs and implement solutions that meet business objectives
  • Ensure data solutions are compliant with security protocols and best practices
  • Monitor, troubleshoot, and continuously improve data systems and processes
  • Stay abreast of the latest AWS offerings and technologies, identifying opportunities for system enhancements.

What you need to have

  • Bachelor's Degree in Computer Science, Engineering, or Information Technology and 2+ years of experience; or
    • Master's Degree and 0-3+ years of experience; or
    • In lieu of a Bachelor's degree, 8+ additional years of relevant experience
  • Clearance: able to obtain and maintain active Secret level clearance
  • CompTIA Security+ certification
  • Minimum of 2 years of experience in data engineering, with a strong focus on cloud-based solutions
  • Demonstrated experience with AWS data services (DMS, DataSync, Glue, Athena, Lambda, S3, RDS, Step Functions)
  • Proven expertise in designing and building data lakes in an AWS environment
  • Strong knowledge of database design, data modeling, and data warehousing principles
  • Proficiency in SQL and experience with programming/scripting languages (e.g., Python, Java)
  • Excellent problem-solving skills and the ability to work in a fast-paced, dynamic environment.

What we'd like you to have

  • AWS Certified Data Analytics - Specialty or AWS Certified Big Data - Specialty certification
  • Experience in working with large, complex datasets and real-time data ingestion
  • Familiarity with additional AWS services, Infrastructure as Code (IaC) tools such as CloudFormation, and DevOps tools such as GitLab
  • Strong communication and collaboration skills, with the ability to convey complex technical concepts.

About BigBear.ai

BigBear.ai delivers AI-powered analytics and cyber engineering solutions to support mission-critical operations and decision-making in complex, real-world environments. BigBear.ai's customers, which include the US Intelligence Community, Department of Defense, the US Federal Government, as well as customers in manufacturing, healthcare, commercial space, and other sectors, rely on BigBear.ai's solutions to see and shape their world through reliable, predictive insights and goal-oriented advice. Headquartered in Columbia, Maryland, BigBear.ai is a global, public company traded on the NYSE under the symbol BBAI. For more information, please visit: http://bigbear.ai/ and follow BigBear.ai on Twitter: @BigBearai.
