Location: Jordan
The Opportunity
As a Data Engineer – AIoT and IoT Analytics, you will design and implement intelligent data infrastructure for ingesting, processing, and analyzing large-scale sensor and machine data. You’ll build reliable, secure, and scalable pipelines—both in the cloud and at the edge—powering analytics and AI across distributed IoT systems. You’ll also bring Infrastructure as Code (IaC) principles to automate and standardize deployments for AIoT data platforms.
Key Responsibilities
- Design and implement streaming and batch data pipelines for ingesting telemetry, time-series metrics, and edge-generated events
- Build and extend AIoT DataOps and MLOps components to support model versioning, deployment, and continuous training
- Build ingestion and processing pipelines for both structured and unstructured IoT data
- Apply Infrastructure as Code (IaC) practices to provision, version, and automate deployment of data processing platforms using tools like Terraform, Pulumi, or Ansible
- Implement data governance, quality checks, and policy enforcement across environments
- Collaborate with solution architects, data scientists, embedded engineers, and backend, ML, and product teams to optimize edge-cloud data pipelines
- Deploy and monitor infrastructure across hybrid and multi-cloud environments, ensuring high availability, low latency, and secure communication
- Work with MQTT brokers, Kafka, and message-driven architectures to connect data streams from devices to AI pipelines
- Enable time-series storage, analytics, and alerting for sensor data, system logs, and inference results
- Support real-time analytics for anomaly detection, predictive maintenance, and operational optimization
- Standardize infrastructure and pipeline deployment through templated, repeatable workflows integrated with CI/CD
- Optimize data workflows for performance and reliability, driving performance tuning and architectural decisions based on scale, volume, and velocity requirements
- Develop scalable ETL frameworks that integrate with our analytics platforms
- Comply with QHSE (Quality, Health, Safety, and Environment), Business Continuity, Information Security, Privacy, Risk, Compliance Management, and Governance policies, procedures, plans, and related risk assessments
Requirements
- Bachelor’s degree in Computer Science, Engineering, or a related technical field
- 5-8 years of experience in data engineering, with a strong emphasis on IoT, streaming, or AI-integrated platforms
- Strong programming skills in Python, Scala, or Java, and fluency in SQL
- Proven experience with tools like Apache Spark, Flink, Beam, Airflow, ClickHouse, Kafka, or Temporal
- Hands-on experience implementing Infrastructure as Code (IaC) using Terraform, Pulumi, or Ansible
- Familiarity with containerized data workloads (Docker, Kubernetes) and hybrid deployments
- Experience in designing dimensional and time-series data models
- Understanding of data lifecycle management, data lineage, and access control
- Ability to work across cloud and edge environments, supporting cloud-native and resource-constrained IoT systems
- Fluency in English and Arabic is required
Benefits
Class A Medical Insurance
What We Do
Optimiza is a leading regional Systems Integration and digital transformation solutions provider that supports its clients' pursuit of operational excellence and profitability.
Our IP solutions cover a wide spectrum of sectors and provide clients with highly secure, user-friendly, versatile, and seamless systems in a variety of work areas including document management, healthcare, insurance, accounting, HR, and banking.
With over 41 years of operational experience, hundreds of projects delivered, and intellectual capital that spans multiple industry sectors, Optimiza's team of over 400 experts is fully capable of integrating and delivering innovative consulting, business, and technology solutions with a commitment to excellence and client satisfaction.