Architect - DE

Bengaluru, Bengaluru Urban, Karnataka
Expert/Leader
Artificial Intelligence • Big Data • Machine Learning
The Role
As a Data Architect, you will design and deliver big data pipelines for healthcare organizations using cloud services and data ingestion technologies. Responsibilities include leading data engagements, ensuring data quality, mentoring team members, and collaborating with various stakeholders. You will also implement data solutions and contribute to the development of scalable data architectures.

While technology is the heart of our business, a global and diverse culture is the heart of our success. We love our people and we take pride in providing them with a culture built on transparency, diversity, integrity, learning and growth.
If working in an environment that encourages you to innovate and excel, not just in your professional life but in your personal life as well, interests you, you will enjoy your career with Quantiphi!

Role: Technical Architect - Data

Experience Level: 8 to 14 Years

Work location: Mumbai, Bangalore, Trivandrum

You will be working as a Data Architect within the domain, designing and delivering big data pipelines for structured and unstructured data running across multiple geographies, helping healthcare organizations achieve their business goals through data ingestion technologies, cloud services and DevOps. You will work with architects from other specialties, such as cloud engineering, software engineering and ML engineering, to create platforms, solutions and applications that cater to the latest trends in the healthcare industry, such as digital diagnosis, software as a medical product and AI marketplaces, amongst others.

Requirements:
  • More than 15 years of experience in Technical, Solutioning, and Analytical roles.
  • 5+ years of experience in building and managing Data Lakes, Data Warehouses, Data Integration, Data Migration and Business Intelligence/Artificial Intelligence solutions on Cloud (GCP/AWS/Azure).
  • Ability to understand business requirements, translate them into functional and non-functional areas, define non-functional boundaries in terms of Availability, Scalability, Performance, Security, Resilience etc.
  • Experience in architecting, designing, and implementing end to end data pipelines and data integration solutions for varied structured and unstructured data sources and targets.
  • Experience working in distributed computing and enterprise environments such as Hadoop and GCP/AWS/Azure Cloud.
  • Well versed with various Data Integration and ETL technologies on Cloud, such as Spark, PySpark/Scala, Dataflow, Dataproc, EMR, etc.
  • Experience working with traditional ETL tools such as Informatica, DataStage, OWB, Talend, etc.
  • Deep knowledge of one or more Cloud and On-Premise Databases such as Cloud SQL, Cloud Spanner, Bigtable, RDS, Aurora, DynamoDB, Oracle, Teradata, MySQL, DB2, SQL Server, etc.
  • Exposure to one or more NoSQL databases such as MongoDB, CouchDB, Cassandra, graph databases, etc.
  • Experience in architecting and designing scalable data warehouse solutions on Cloud, on BigQuery or Redshift.
  • Experience working with one or more data integration, storage and data pipeline toolsets such as S3, Cloud Storage, Athena, Glue, Sqoop, Flume, Hive, Kafka, Pub/Sub, Kinesis, Dataflow, Dataproc, Airflow, Composer, Spark SQL, Presto, EMRFS, etc.
  • Preferred: experience working with Machine Learning frameworks such as TensorFlow, PyTorch, etc.
  • Good understanding of Cloud solutions for IaaS, PaaS, SaaS, Containers and Microservices Architecture and Design.
  • Ability to compare products and tools across technology stacks on Google, AWS, and Azure Cloud.
  • Good understanding of BI Reporting and Dashboarding and one or more associated toolsets such as Looker, Tableau, Power BI, SAP BO, Cognos, Superset, etc.
  • Understanding of Security features and Policies in one or more Cloud environments like GCP/AWS/Azure.
  • Experience in business transformation projects involving the movement of On-Premise data solutions to Clouds such as GCP/AWS/Azure.

Responsibilities:

  • Lead multiple data engagements on GCP Cloud for data lakes, data engineering, data migration, data warehouse, and business intelligence.
  • Interface with multiple stakeholders within IT and business to understand the data requirements.
  • Take complete responsibility for the successful delivery of all allocated projects on the parameters of schedule, quality and customer satisfaction.
  • Be responsible for the design and development of distributed, high-volume, multi-threaded batch, real-time and event processing systems.
  • Implement processes and systems to validate data and monitor data quality, ensuring production data is always accurate and available for the key stakeholders and business processes that depend on it.
  • Work with the Pre-Sales team on RFPs and RFIs, helping them create data solutions.
  • Mentor young talent within the team, and define and track their growth parameters.
  • Contribute to building assets and accelerators.
Other Skills:
  • Strong communication and articulation skills.
  • Good leadership skills.
  • A good team player.
  • Good analytical and problem-solving skills.

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!

Top Skills

AWS
Azure
GCP
Hadoop
PySpark
PyTorch
Scala
Spark
TensorFlow
The Company
HQ: Marlborough, MA
3,494 Employees
On-site Workplace
Year Founded: 2013

What We Do

Quantiphi is an award-winning AI-first digital engineering company driven by the desire to solve transformational problems at the heart of business.
Quantiphi solves the toughest and most complex business problems by combining deep industry experience, disciplined cloud and data-engineering practices, and cutting-edge artificial intelligence research to achieve quantifiable business impact at unprecedented speed.
