We are seeking a motivated AI & Data Engineer with 2–3 years of experience to join our innovative team. This role sits at the intersection of data engineering, cloud analytics, and applied AI, where you’ll design scalable data pipelines while also enabling LLM-powered applications and machine learning workflows.
You will collaborate closely with data scientists, ML engineers, and product teams to transform raw data into intelligent insights, AI-driven features, and production-ready LLM solutions.
Key Responsibilities
Data Engineering & Analytics
Design, build, and optimize ETL pipelines using Apache Airflow
Build and maintain Snowflake data warehouses for scalable ingestion, transformation, and analytics
Develop data workflows for structured and semi-structured data using AWS services
Integrate data from APIs, databases, and cloud storage using Python
Implement data quality checks, monitoring, and validation frameworks
Design and document data models to support analytics and ML workflows
Document pipeline architectures, schemas, and best practices
AI, ML & LLM Enablement
Collaborate with data scientists and ML engineers to prepare datasets for AI/ML training and inference
Support LLM-powered applications, including:
Prompt engineering and optimization
Fine-tuning LLMs (instruction tuning, domain adaptation)
Managing embeddings, vector databases, and retrieval pipelines
Build data pipelines to support RAG (Retrieval-Augmented Generation) workflows
Develop Streamlit applications for data exploration, AI demos, and internal tools
Assist in deploying and monitoring AI/ML pipelines in production environments
Required Qualifications
2–3 years of professional experience in Data Engineering, AI Engineering, or a related role
Strong hands-on experience with Apache Airflow
Practical experience with AWS services (S3, Lambda, Glue, EC2, or similar)
Experience working with Snowflake for data warehousing
Strong Python skills for data processing, automation, and API integration
Experience building data apps or dashboards using Streamlit
Solid understanding of data modeling and ETL design
Working knowledge of AI/ML workflows, especially data preparation
Familiarity with LLMs, including:
Prompt engineering
Fine-tuning or adapting LLMs
Working with embeddings and vector stores
Bachelor’s degree in Computer Science, Data Engineering, AI, or a related field, or equivalent practical experience
Preferred Qualifications
Advanced experience with AWS data and ML services
Deep expertise in Snowflake performance tuning and optimization
Experience building complex Streamlit applications
Advanced Python skills for large-scale data wrangling
Experience with LLM fine-tuning, evaluation, and deployment
Familiarity with LangChain, LlamaIndex, or similar LLM frameworks
Knowledge of Airflow best practices for scalable pipeline design
Experience with AI/ML model deployment and monitoring
What We Do
With sales of over $500 million and more than 200 employees across the globe, Monark is an established leader in property development, e-commerce, and business consulting. Since 2001, Monark has built a diverse portfolio of luxury properties throughout North America, ranging from top-tier golf courses to more than $100 million worth of real estate, including townhomes, hotels, and resorts. Monark specializes in all aspects of the construction industry, from the planning stage through development. In addition to construction and property management, the Monark Group comprises multiple successful e-commerce sites and consulting firms. With proven success in the e-commerce world, independent retailers look to Monark to develop and support their online retail businesses; to date, Monark has invested in over 30 major e-commerce sites and numerous business start-ups. Monark is committed to providing superior customer service and owes much of its success to this commitment. Today, Monark continues to grow its prospering business portfolio and to help investors realize their dreams.