Big Data Architect (Level: 11/10/9/8)

  • IT Company
  • Posting date: 17.04.2018
    The application deadline for this vacancy has passed.

Job description

Location: Latvia

What you will do:

• Participating in vendor landscape and technology assessments (covering, for example, Hadoop, Spark, Kafka, and NoSQL)
• Executing proofs of concept to assess the value of Big Data use cases
• Leading solution architecting activities in the area of Big Data for development teams
• Architecting modern data solutions in a hybrid environment of traditional and modern data technologies
• Building and implementing architecture roadmaps for next-generation Big Data analytics solutions for our clients
• Assessing and defining tactical and strategic opportunities that enable our clients to achieve new business capabilities through Big Data and traditional tools and technologies
• Leading and consulting clients through full-lifecycle implementations, from requirements analysis, platform selection, technical architecture design, and application design and development through testing and deployment
• Supporting sales cases for Information Management work
• Adopting, utilizing, and further developing Accenture's Information Management approaches, methodologies, and assets
• Actively participating in related Big Data communities and technology forums

Basic qualifications

• Experience in building and implementing architecture roadmaps for next-generation enterprise data and analytics solutions
• Experience in designing, building, and operationalizing production-grade Big Data solutions using modern data technologies such as Hadoop and/or NoSQL databases
• Experience with open-source ecosystem components such as Apache Spark, Apache Flume, Apache Oozie, Apache Hive, Apache Drill, Apache Kylin, Apache Flink, Apache HBase, Apache Cassandra, and Apache Kafka
• Hands-on coding experience in one or more of Scala, Java, and Python

Preferred skills

• Experience architecting large-scale Hadoop/NoSQL operational environments for production deployments
• Understanding of traditional ETL tools (Pentaho, Talend, Informatica), RDBMS, and SQL
• Metadata management with Hadoop and NoSQL data in a hybrid environment
• Experience in large-scale cloud data solutions using platforms such as AWS, Azure, or Google Cloud Platform
• Experience in re-architecting and rationalizing traditional data environments with Hadoop or NoSQL technologies
• Knowledge of enterprise security frameworks and tools
• Experience with containerization and resource management frameworks such as Docker and/or Mesos
• Related certifications such as TOGAF

Benefits:

- All relocation costs covered for the candidate and his/her family
- Full health insurance
- Signing bonus
- Performance bonus paid twice a year
- Return flight tickets to the candidate's home country

Interested candidates should send their CV to [email protected], indicating the position title in the subject line of the message.
Applications that do not follow this format will not be considered.