WHAT YOU WILL GET TO DO
- Provide top-quality solution design and implementation for clients.
- Design and execute data abstractions and integration patterns (APIs) to support complex distributed computing problems.
- Ensure that data security, governance, and compliance best practices are embedded into all solutions.
- Support the scoping and estimation of proposed solutions.
- Engage with our clients to understand their strategic objectives.
- Translate business requirements into scalable, cost-effective technology solutions that optimize data availability, performance, and usability.
- Work with client and engagement leaders to understand their strategic business objectives and align data architecture solutions accordingly.
- Identify gaps in data infrastructure, governance, or integration, and work with clients to resolve them in a timely manner.
- Utilize big data technologies to architect and build scalable data solutions.
- Architect and implement top-quality data solutions using cloud (AWS, GCP, Azure), big data frameworks (Apache Spark, Kafka, Databricks), and modern data platforms (Snowflake, BigQuery, Redshift).
- Stay up to date with emerging technology trends in cloud data platforms, AI-driven analytics, and data architecture best practices to make recommendations that align with client needs.
- Participate in pre-sales activities, including proposal development, RFI/RFP responses, solution presentations, and shaping solutions from the client’s business problem.
- Act as a thought leader in the industry by creating written collateral (white papers or POVs), speaking at events, and creating or participating in internal and external events.
- Mentor and guide data engineers, analysts, and solution architects on data engineering best practices, architecture frameworks, and cloud infrastructure.
WHAT YOU BRING TO THE TEAM
- Bachelor’s Degree in Computer Science, Information Technology, Data Science, Engineering, or a related field, plus 5 years of experience in related occupations.
- 5+ years of experience in the following: professional development experience architecting and implementing big data solutions; programming and scripting languages: Java, Scala, Python, or shell scripting.
- 4+ years of experience in the following: cloud platforms: AWS, Azure, or GCP; one or more of the following ETL tools: Informatica, Talend, IBM DataStage, Azure Data Factory, or AWS Glue; at least two of the following big data tools and technologies: Linux, Hadoop, Hive, Spark, HBase, Sqoop, Impala, Kafka, Flume, Oozie, or MapReduce; performing complex data migrations to and from disparate data systems/platforms, as well as to/from the cloud: AWS, Azure, or GCP.
- 3+ years of experience with data visualization tools: Power BI, Tableau, Looker, or similar.
- Telecommuting is permitted. 40 hours/week. Must have authorization to work permanently in the U.S. Interested applicants may apply at www.jobpostingtoday.com, Ref #98372, for consideration.
BENEFITS
- Health Care Plan (Medical, Dental & Vision)
- Retirement Plan (401k, IRA)
- Life Insurance (Basic, Voluntary & AD&D)
- Unlimited Paid Time Off (Vacation, Sick & Public Holidays)
- Short Term & Long Term Disability
- Employee Assistance Program
- Training & Development
- Work From Home
- Bonus Program
What We Do
Wavicle Data Solutions is a trusted cloud, data, and analytics consulting and development partner for businesses that want to get more value from growing volumes of data. Well-known brands across industries work with our team of data management experts, cloud migration consultants, and analytics professionals to modernize their data environments, build robust analytics solutions, and leverage machine learning to improve predictive insights.
Combining our expertise in big-data architectures with artificial intelligence and machine learning concepts, we help enterprises imagine new ways to manage costs, increase sales, and become more efficient. Whatever your business goal, we’ll help you tap into the right data and get it to the right place at the right time to the right user. Wavicle was founded in 2013 on the principle that modern organizations rely on data to drive their businesses, yet often lack the time, staff, or knowledge to leverage this valuable resource.
We invest heavily to stay ahead of the technology curve through ongoing training, certifications, and proof-of-concept endeavors. We’re also accredited Talend, AWS, Snowflake, Databricks, and Tableau partners, among others. As technology experts, we help you navigate the maze of options and challenges to deliver innovative solutions that unlock the potential of your data.
Our capabilities include: data engineering and integration services, cloud migration services, modern data architecture services, enterprise data lakes/data warehouse services, ETL migration services, analytics dashboard and reporting development, machine learning model development, and data science services.
Headquartered in Oak Brook, IL, USA, and Coimbatore, TN, India, Wavicle has more than 500 employees delivering solutions to customers around the globe.








