GCP Data Engineer

Remote

Looking for W2 candidates only.

Required minimum qualifications include:

· Creating/updating ETL specifications and supporting documentation

· Developing ETL utilizing tools such as Informatica workflows, scripting and load utilities, Talend, etc.

· Knowledge of big data ingestion using Talend, Sqoop, Hive, etc.

· Implementing data flow scripts using technologies such as Unix / Sqoop / HiveQL / Pig scripting

· Designing schemas, data models, and data architecture

· Designing, building and supporting data processing pipelines to transform data in Big Data or Teradata platforms

· Working with business analysts to understand business requirements and use cases

· Solving problems and fixing technical issues

· Leading day-to-day activities and influencing team members to use, follow, and support Agile software development practices

· Collaborating with the product owner and key partners in Project Management, Business, QA, and Technology Operations to ensure high-quality delivery of software projects on time and on budget

Ideal candidates will also have the following desired qualifications:

· 6+ years of experience with standard methodologies for building and designing ETL code

· Strong SQL experience with the ability to develop, tune and debug complex SQL applications is required

· Knowledge in schema design, developing data models, and demonstrable ability to work with complex data is required

· Experience with ETL tools such as Informatica, Talend, or similar is required

· Strong Teradata coding and utilities experience is a plus

· Experience working in large environments such as RDBMS, EDW, NoSQL, etc. is a plus

· Hands-on experience with Hadoop, MapReduce, Hive, Spark, Kafka, and HBase is a strong plus

· Understanding of Hadoop file formats and compression

· Experience with scheduling tools (e.g., Control-M, ESP)

· Understanding of standard methodologies for building Data Lake and analytical architecture on Hadoop is preferred

· Scripting/programming with UNIX, Java, Python, Scala, etc. is preferred

· Exposure to cloud data platforms such as GCP/BigQuery would be a plus

· Knowledge in real-time data ingestion is helpful

· Experience collaborating with business and technology partners and offshore development teams

· Good interpersonal, analytical, problem-solving, and organizational skills

· Excellent written/verbal communication skills

More Information on NucleusTeq
NucleusTeq operates in the Information Technology industry. The company is located in Phoenix, AZ, was founded in 2018, and has 500 total employees. It offers perks and benefits such as dental, vision, health, and life insurance, a 401(k), and a performance bonus.