Job Description
Purpose of the role
To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, ensuring that all data is accurate, accessible, and secure.
Accountabilities
- Build and maintenance of data architecture pipelines that enable the transfer and processing of durable, complete and consistent data.
- Design and implementation of data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures.
- Development of processing and analysis algorithms fit for the intended data complexity and volumes.
- Collaboration with data scientists to build and deploy machine learning models.
Analyst Expectations
- Have an impact on the work of related teams within the area.
- Partner with other functions and business areas.
- Take responsibility for the end results of a team’s operational processing and activities.
- Escalate breaches of policies and procedures appropriately.
- Take responsibility for embedding new policies and procedures adopted due to risk mitigation.
- Advise and influence decision making within own area of expertise.
- Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulations and codes of conduct.
- Maintain and continually build an understanding of how own sub-function integrates with the function, alongside knowledge of the organisation’s products, services and processes within the function.
- Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation’s sub-function.
- Make evaluative judgements based on the analysis of factual information, paying attention to detail.
- Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents.
- Guide and persuade team members and communicate complex / sensitive information.
- Act as contact point for stakeholders outside of the immediate function, while building a network of contacts outside team and external to the organisation.
All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
Join us as a Data Engineer at Barclays where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences.
Primary Skills:
Build and maintenance of data architecture pipelines that enable the transfer and processing of durable, complete and consistent data. Design, development and testing of complex ETL components, drawing on key critical skills relevant for success in the role, such as Ab Initio ETL, cloud data technologies, Python, Spark, SQL and Big Data (Hadoop, Hive, Impala), as well as other job-specific technical skills.
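For illustration only, below is a minimal sketch of the kind of PySpark ETL component this role involves, assuming a Spark environment; the paths and column names are hypothetical, not Barclays systems.

# Minimal PySpark ETL sketch: extract raw transaction data, transform it into a
# complete and consistent form, and load a partitioned, analysis-ready dataset.
# All paths and column names are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("transactions_etl").getOrCreate()

# Extract: raw records landed as Parquet (hypothetical path)
raw = spark.read.parquet("/data/raw/transactions")

# Transform: drop incomplete records and standardise types
clean = (
    raw.dropna(subset=["account_id", "amount", "txn_date"])
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .withColumn("txn_date", F.to_date("txn_date"))
)

# Aggregate to daily totals per account
daily = clean.groupBy("account_id", "txn_date").agg(
    F.sum("amount").alias("daily_total"),
    F.count(F.lit(1)).alias("txn_count"),
)

# Load: write the curated output partitioned by date for downstream analysis
daily.write.mode("overwrite").partitionBy("txn_date").parquet("/data/curated/daily_totals")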
This role is based in Pune.
What We Do
Barclays is a British universal bank. We are diversified by business, by different types of customers and clients, and by geography. Our businesses include consumer banking and payments operations around the world, as well as a top-tier, full service, global corporate and investment bank, all of which are supported by our service company which provides technology, operations and functional services across the Group.
With over 325 years of history and expertise in banking, Barclays operates in over 40 countries and employs approximately 83,500 people. Barclays moves, lends, invests and protects money for customers and clients worldwide.