Spark Scala Data Developer
Technical/Functional Skills
1. Must have knowledge of Spark and Spark SQL - Spark Core, Spark SQL, DataFrames, Spark Streaming, and cluster concepts (driver node, worker nodes, stages, executors)
2. Knowledge of Scala.
3. Knowledge of data quality practices.
4. Knowledge of HDFS, Hive, Pig, Sqoop, Spark, and Kafka
5. Real-time and batch processing
6. AWS - knowledge of its architecture and service offerings, in addition to general cloud concepts
7. Must have solid hands-on experience with the AWS CLI and console for IAM, S3, EC2, and Lambda
8. ETL data pipelines.
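The Spark SQL, DataFrame, and data-quality skills above can be sketched as a minimal batch job. This is an illustrative sketch only: the `orders` data, column names, and quality rules are hypothetical, and `local[*]` stands in for a real cluster.

```scala
import org.apache.spark.sql.SparkSession

object DataQualityCheck {
  def main(args: Array[String]): Unit = {
    // Local session for illustration; in production the master comes from the cluster.
    val spark = SparkSession.builder()
      .appName("DataQualityCheck")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Hypothetical input standing in for ingested records.
    val orders = Seq(
      ("o1", "2024-01-01", 120.0),
      ("o2", null, 80.5),
      ("o3", "2024-01-03", -5.0)
    ).toDF("order_id", "order_date", "amount")

    // Simple data-quality rules: non-null date, non-negative amount.
    val invalid = orders.filter($"order_date".isNull || $"amount" < 0)
    println(s"Invalid rows: ${invalid.count()}")

    // The same check expressed in Spark SQL over a temp view.
    orders.createOrReplaceTempView("orders")
    spark.sql(
      "SELECT count(*) AS valid FROM orders " +
      "WHERE order_date IS NOT NULL AND amount >= 0"
    ).show()

    spark.stop()
  }
}
```

The DataFrame filter and the SQL query are equivalent; both compile to the same Catalyst plan, which is why the role lists them together.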
Experience Required
6-8 years
Roles & Responsibilities
1. Develop enterprise applications using agile methodology.
2. Build real-time streaming data ingestion using Spark and Scala.
3. Extract, transform, and load data from various source systems to AWS S3 using Spark SQL, meeting specific business requirements in Scala and PySpark.
4. Develop Spark applications in Databricks notebooks; deploy them to a cluster; and schedule, monitor, and troubleshoot job failures.
5. Identify functional test scenarios in line with business objectives.
6. Support testing and documentation efforts.
7. Understand and consume application requirements and use cases
8. Design and document technical solution strategies and architectures fulfilling complex business requirements
9. AWS - knowledge of its architecture and service offerings, in addition to general cloud concepts.
10. Must have solid hands-on experience with the AWS CLI and console for VPC, S3, EC2, and Lambda.
11. Experience with Athena, SNS, SQS, Redshift, DynamoDB, CloudWatch, IAM, etc. is preferred.
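The extract-transform-load-to-S3 responsibility above can be sketched as follows. The source path, table, column names, and the `example-bucket` S3 bucket are all hypothetical; in a real deployment the S3A connector would be authorized via IAM roles or Hadoop configuration rather than hard-coded credentials.

```scala
import org.apache.spark.sql.SparkSession

object EtlToS3 {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("EtlToS3")
      .getOrCreate()

    // Extract: read raw CSV from a hypothetical HDFS location.
    val source = spark.read
      .option("header", "true")
      .csv("hdfs:///data/raw/customers")

    // Transform: express the business rule in Spark SQL.
    source.createOrReplaceTempView("customers")
    val cleaned = spark.sql(
      """SELECT customer_id, upper(country) AS country
        |FROM customers
        |WHERE customer_id IS NOT NULL""".stripMargin)

    // Load: write partitioned Parquet to S3 via the s3a:// scheme.
    cleaned.write
      .mode("overwrite")
      .partitionBy("country")
      .parquet("s3a://example-bucket/curated/customers/")

    spark.stop()
  }
}
```

Partitioning the output by a query-predicate column (here `country`) lets downstream engines such as Athena prune partitions instead of scanning the whole dataset.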