Big Data Engineer

Sorry, this job was removed at 10:04 p.m. (CST) on Tuesday, June 28, 2022

What you'll do...
Position: Big Data Engineer
Job Location: 350 N Saint Paul St, Dallas, TX 75201
Duties: Defines the data elements and business requirements for the Enterprise Data Lake. Analyzes and strategizes programs to collect, store, and visualize data from various sources, and designs data elements into a structured data model. Designs the data model at the enterprise level using modeling tools such as Erwin, Visio, and Lucidchart. Prepares and documents metadata definitions for the Enterprise Data Lake. Designs and develops big data applications using Core Java and Scala, and deploys the applications and workflows into cloud environments. Builds applications in the Hadoop ecosystem using real-time streaming technologies such as Spark. Works with cloud-based systems, including Infrastructure- and Platform-as-a-Service offerings. Works with NoSQL databases such as Cassandra and MongoDB. Utilizes ETL technologies to create data pipelines, and interprets business requirements, converting them into technical designs. Writes shell scripts on Linux to invoke and schedule applications; maintains proficient knowledge of version control systems such as Git and SVN. Writes complex SQL and database-level programs such as procedures, functions, and triggers using Oracle SQL and PL/SQL, along with real-time monitoring and tuning of applications using database tools such as SQL explain plan, SQL Profiler, AWR reports, and SQL Analyzer.
Minimum education and experience required: Bachelor's degree or the equivalent in Computer Science, Information Technology, Engineering, or a related field plus 5 years of experience in software engineering or related experience.
Skills Required: Must have experience with: coding and testing in relational database management systems (Oracle); performance tuning in SQL and PL/SQL; performing impact analysis on database objects; building dashboards in Tableau; performing server-level activities using Unix shell scripts; job monitoring, scheduling, and configuration in UC4/Automic; creating data models such as Star and Snowflake schemas; designing ETL processes and pipelines in Hadoop using Hive/Spark or Python; and building data pipeline jobs using cloud technologies such as GCP, Azure, or AWS. Employer will accept any amount of experience with the required skills.
Wal-Mart is an Equal Opportunity Employer.
