Hadoop Application Support Lead

Job Description:
Position Summary
The following is a summary of the essential functions of this job. Other duties, both major and minor, may be performed that are not mentioned below, and specific activities may change from time to time.

  • Support multiple projects with competing deadlines.
  • Monitor production applications 24/7 to ensure SLAs are met.
  • Triage and remediate job failures.
  • Work with upstream/downstream applications to resolve delays or data issues.
  • Generate and submit reports to senior management.
  • Sound understanding of and experience with the Hadoop ecosystem (Cloudera); able to understand and explore its constantly evolving tools and apply them appropriately to the problems at hand.
  • Experience working with a big data implementation in a production environment.
  • Experience with HDFS, MapReduce, Hive, Impala, and Linux/Unix technologies is mandatory.
  • Experience with Flume, Kafka, or Spark is an added advantage.
  • Experience with Unix shell scripting is mandatory.
  • Able to analyze existing shell, Python, or Perl code to debug issues or enhance it.
  • Perform maintenance activities such as performance and storage maintenance and platform cleanup.
  • Sound knowledge of relational databases (SQL) and experience with large SQL-based systems.
  • Strong IT consulting experience across data warehousing engagements, handling large data volumes and architecting big data environments.
  • Deep understanding of algorithms, data structures, performance optimization techniques, and software development in a team environment.
  • Benchmark and debug critical issues with algorithms and software as they arise.


Primary Skill: Hadoop
Secondary Skill: Java
Tertiary Skill: Python
Required Skills

  • Bachelor's degree in a technical or business-related field, or equivalent education and related training.
  • Ten years of experience in data warehousing architectural approaches and a minimum of five years in big data (Cloudera).
  • Exposure to and strong working knowledge of distributed systems.
  • Excellent understanding of client-service models and customer orientation in service delivery.
  • Ability to grasp the 'big picture' for a solution by considering all potential options in the impacted area.
  • Aptitude to understand and adapt to newer technologies.
  • Ability to work collaboratively with teammates to achieve a mission.
  • Presentation skills to prepare and present to large and small groups on technical and functional topics.


Desired Skills

  • Previous experience in the financial services industry.
  • Previous experience in production support.
  • Broad BofA technical experience, a good understanding of existing testing/operational processes, and an open mind about how to enhance them.
  • Understanding of industry trends and relevant application technologies.
  • Experience in designing and implementing analytical environments and business intelligence solutions.


Shift: 1st shift (United States of America)
Hours Per Week: 40
