GCP Data Engineer
Looking for W2 candidates only
Required minimum qualifications include:
· Creating/updating ETL specifications and supporting documentation
· Developing ETL using tools such as Informatica (workflows, scripting, and load utilities), Talend, etc.
· Knowledge of big data ingestion using Talend, Sqoop, Hive, etc.
· Implementing data flow scripts using technologies such as Unix, Sqoop, HiveQL, and Pig scripting
· Designing schemas, data models, and data architecture
· Designing, building, and supporting data processing pipelines to transform data on Big Data or Teradata platforms
· Working with business analysts to understand business requirements and use cases
· Solving problems and fixing technical issues
· Leading day-to-day activities and influencing team members to use, follow, and support Agile software development practices
· Collaborating with the product owner and key partners in Project Management, Business, QA, and Technology Operations to ensure high-quality delivery of software projects on time and on budget
In addition, ideal candidates will also have the following desired qualifications:
· 6+ years of experience with standard methodologies for designing and building ETL code
· Strong SQL experience with the ability to develop, tune and debug complex SQL applications is required
· Knowledge of schema design, experience developing data models, and a demonstrable ability to work with complex data are required
· Experience in ETL tools such as Informatica, Talend, or any other tools is required
· Strong Teradata coding and utilities experience is a plus
· Experience working in large data environments such as RDBMS, EDW, NoSQL, etc. is a plus
· Hands-on experience with Hadoop, MapReduce, Hive, Spark, Kafka, and HBase is a strong plus
· Understanding of Hadoop file formats and compression codecs
· Experience with scheduling tools (e.g., Control-M, ESP)
· Understanding of standard methodologies for building Data Lake and analytical architecture on Hadoop is preferred
· Scripting/programming with UNIX, Java, Python, Scala, etc. is preferred
· Exposure to cloud data platforms such as GCP/BigQuery would be a plus
· Knowledge of real-time data ingestion is helpful
· Experience collaborating with business and technology partners and offshore development teams
· Good interpersonal, analytical, problem-solving, and organizational skills
· Excellent written/verbal communication skills