DATA INTEGRATION ENGINEER
As a Data Integration Engineer, you will design, implement, and maintain data integration solutions that handle real-time streaming data from sources such as IoT/IIoT devices, third-party APIs, and raw files. Your main objective will be to process data in real time and turn it into valuable insights for our organization, working with a diverse range of Big Data tools and technologies. The successful candidate will have experience in embedded systems bring-up, requirements engineering management, systems integration, and developing programs to drive HW and SW planning, as well as the ability to articulate the big picture. This role is fully remote and open to applicants globally.
Additionally, you will be involved in the development of a Data Streaming platform using Rust.
Responsibilities:
- Data Integration Design: Collaborate with cross-functional teams to understand data requirements, source systems, and data formats, and design efficient data integration pipelines for real-time data streaming from multiple sources.
- Programming Languages: Develop custom data processing components and applications using Java and Python to meet specific business requirements.
- ETL Development: Implement Extract, Transform, Load (ETL) processes to ingest and transform data from various streaming sources into a format suitable for analysis and storage.
- Real-time Data Processing: Develop and optimize data processing workflows to ensure timely handling of streaming data, maintaining low-latency and high-throughput capabilities.
- Big Data Tools: Use and maintain Big Data tools such as Apache NiFi, Spark, Kafka, and Druid/TimescaleDB to build scalable and robust data integration solutions.
- Message Broker Configuration: Set up and configure message brokers such as RabbitMQ and Kafka, along with messaging protocols such as AMQP, to enable efficient data exchange between systems and applications.
- IoT/IIoT Protocols Integration: Integrate and work with IoT/IIoT protocols such as MQTT, SNMP, and CoAP, as well as TCP and WebSocket transports, to capture data from edge devices and industrial systems (a minimal ingestion sketch follows this list).
- Data Quality and Validation: Implement data validation checks and data quality measures to ensure the accuracy and reliability of the integrated data.
- Performance Monitoring: Monitor the performance and health of data integration pipelines, making necessary adjustments to optimize data flow and resource utilization.
- Troubleshooting and Issue Resolution: Diagnose and resolve issues related to data integration, ensuring smooth and uninterrupted data streaming.
- Collaboration: Work closely with data scientists, data analysts, and other stakeholders to understand data needs and deliver actionable insights.
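To make the above concrete, here is a minimal, illustrative Python sketch of the kind of pipeline this role builds: an MQTT-to-Kafka ingestion bridge with a basic validation step. The broker addresses, topic names (sensors/#, telemetry.raw), and required fields are hypothetical placeholders, and it assumes the paho-mqtt (1.x callback API) and confluent-kafka client libraries; treat it as a sketch, not a description of our platform.

    import json

    import paho.mqtt.client as mqtt
    from confluent_kafka import Producer

    # Hypothetical Kafka endpoint -- replace with a real bootstrap server.
    producer = Producer({"bootstrap.servers": "localhost:9092"})

    def on_connect(client, userdata, flags, rc):
        # Subscribe once the MQTT session is up (paho-mqtt 1.x callback signature).
        client.subscribe("sensors/#", qos=1)

    def on_message(client, userdata, msg):
        # Basic validation: drop payloads that are not JSON or miss required fields.
        try:
            reading = json.loads(msg.payload)
        except ValueError:
            return
        if "device_id" not in reading or "value" not in reading:
            return
        # Forward the validated reading to Kafka, keyed by device for partitioning.
        producer.produce(
            "telemetry.raw",
            key=str(reading["device_id"]),
            value=json.dumps(reading),
        )
        producer.poll(0)  # serve delivery callbacks without blocking

    client = mqtt.Client()  # paho-mqtt 1.x; 2.x additionally requires a CallbackAPIVersion
    client.on_connect = on_connect
    client.on_message = on_message
    client.connect("mqtt.example.com", 1883)  # hypothetical broker host
    client.loop_forever()

In production, a bridge like this would also need reconnection handling, delivery callbacks, dead-letter routing for rejected payloads, and metrics, which is exactly where the monitoring and troubleshooting responsibilities above come in.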
Technical Requirements:
- Strong experience in designing and implementing data integration solutions for real-time streaming data.
- Proficiency in using Big Data technologies such as Apache NiFi, Apache Spark, Kafka, and Druid.
- Familiarity with message brokers such as RabbitMQ and Kafka, and messaging protocols such as AMQP, for data exchange and event-driven architectures.
- Hands-on experience with IoT/IIoT protocols such as MQTT, SNMP, and CoAP, as well as TCP and WebSocket transports.
- Proficiency in programming languages such as Java and Python for developing custom data processing components.
- Understanding of data streaming platforms; experience with Rust is a plus.
- Knowledge of data quality assurance and validation techniques to ensure reliable data.
- Ability to troubleshoot and resolve issues in data integration and streaming pipelines (a brief consumer-side monitoring sketch follows this list).
- Strong analytical and problem-solving skills, with a keen eye for detail.
- Excellent communication and teamwork skills to collaborate effectively with cross-functional teams.
- Experience with cloud-based platforms and distributed systems is advantageous.
- A never-ending curiosity and eagerness to learn and work with new tools and technologies.
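As a consumer-side counterpart to the earlier sketch, here is a minimal illustration of the kind of monitoring mentioned above: flagging end-to-end latency on a Kafka topic. The broker address, group id, topic name, and 5-second threshold are hypothetical, and it assumes the confluent-kafka Python client.

    import time

    from confluent_kafka import Consumer

    # Hypothetical broker, group id, and topic.
    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",
        "group.id": "latency-monitor",
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["telemetry.raw"])

    try:
        while True:
            msg = consumer.poll(1.0)
            if msg is None:
                continue
            if msg.error():
                print("consumer error:", msg.error())
                continue
            # End-to-end latency: wall clock now minus the message timestamp
            # assigned by the producer or broker.
            _ts_type, ts_ms = msg.timestamp()
            latency_ms = time.time() * 1000 - ts_ms
            if latency_ms > 5000:  # hypothetical alerting threshold
                print(f"lagging {latency_ms:.0f} ms on "
                      f"{msg.topic()}[{msg.partition()}]@{msg.offset()}")
    finally:
        consumer.close()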
What We Do
Mi-C3 International Limited is a Malta-based company established in 2014 by its CEO and founder, Glen Scott.
The history and philosophy of Mi-C3 are captured in its name, which combines two acronyms. The genesis of the technology was in supporting Command and Control for organisations that needed to improve their situational awareness in order to manage and secure their facilities. Our technology then enhanced Command and Control with Collaboration (C3): organisations were empowered with multi-directional communications, more decision-making could be automated, and information flows were organised so that reams of data are transformed into meaningful intelligence (Mi).
Generations of the Mi-C3 software began as a project in the early 2000s, before the Internet of Things (IoT) was widely understood, as a solution to help a large oil and gas company operating across multiple countries manage a complex operation and considerable security risk spread over a wide and poorly accessible geographical area. It has since grown to cover more than 30,000 sites and monitor more than 1 million endpoints.
Over the years, this solution has evolved into our award-winning flagship product, AFFECTLI, which has delivered significant value to clients through a combination of cutting-edge technologies in the fields of digital process transformation, smart workspaces, AI, machine learning, data management, and analytics.
AFFECTLI won the Malta MCA eBusiness Award for Best B2B Application 2018, was shortlisted for a nomination in the World Summit Awards 2018 under the Business & Commerce category, and is ISO 9001:2015, ISO/IEC 20000:2011, and SOC 2 Type II compliant.