You will be responsible for
As a Big Data Architect you will design the solutions required to automate administration, data capture, processing, and data modelling processes to meet the business's analytics needs.
The resulting design may run on multiple platforms and may be composed of multiple software packages and custom components. This role defines best practices in the critical evaluation, selection, and/or development of the software components and hardware requirements of the applications and data, and in the development of the application, including the evaluation and selection of development methods, development processes, best practices, and tools.
- Introduce and follow good development practices, innovative frameworks, and technology solutions that help the business move faster.
- Drive continuous improvement and innovation.
- Work effectively with cross-functional teams.
- Define a Big Data architecture catering to the business's analytics and data warehousing requirements.
- Take responsibility for big data storage, processing, and analytics use-case development, and design analytics models that meet non-functional requirements (performance, availability, scalability, and integrity).
- Analyze complex and new analytics requirements (real-time, advanced, and predictive analytics) and data warehousing requirements; anticipate potential problems and future trends, and assess potential integration solutions, impacts, and risks.
- Develop and implement Big Data solutions when assigned to work on delivery projects.
- Adapt communications and approaches to conclude technical scope discussions with various partners, resulting in common agreements.
- Actively consult, conduct pre-sales, develop POCs, and participate in incubation, accelerator, and product programmes.
How we'll help you grow
- You'll have access to all the technical and management training courses you need to become the expert you want to be.
- You'll have the opportunity to work in many different areas to figure out what really excites you.
- Architecting, designing, and delivery experience in building data lake solutions.
- Hands-on delivery experience with Hadoop distributions (IBM BigInsights, Azure HDInsight, Hortonworks, Cloudera, MapR); data processing and modelling using R and Spark; machine learning using Spark ML and H2O; BI tools (Cognos, QlikView, Tableau, SAP BO, etc.); graph databases (Neo4j, Apache Titan, etc.); and query engines (Big SQL, Apache Drill, etc.).
- Proficient with NoSQL databases (key/value, document, column-family, and graph databases), Apache Solr, Lucene, etc.
- Good understanding of NoSQL platforms such as HBase, Couchbase, Vertica, MongoDB, and Cassandra.
- Suitable qualifications and industry certifications.
Production-level hands-on experience:
- Experience with large scale Hadoop administration and development (required)
- A data set of 1 petabyte or larger
- 5+ years of Big Data ecosystem experience, including administration, development, cloud, and application integration
- 3+ years of consulting and team lead experience
- 5+ years on enterprise projects: customer centricity, optimization, predictive engines, enterprise data hubs
We strive for authority. This can only be achieved by working with the best people, offering them the most challenging projects, and creating a continuous learning environment.
All this is in place so you can accelerate your career.
What can you expect?
- Inspiring working environment
- The most challenging assignments
- An in-house knowledge-sharing session (XKE) every second week
- Freedom to accelerate
- Much more!
At Xebia you'll find like-minded colleagues who are forerunners in their field, are used to getting customer organizations moving, and have the courage to leave the beaten track. By sharing knowledge with customers and communities, we constantly broaden our expertise and decide what the next cool thing to work on is.
We challenge you to accelerate your personal development. Curious? We'd be happy to tell you more, or to invite you to one of our events!
Curious to find out more?
You are welcome to get to know us.
Get in touch with Hari: email@example.com