Big Data Solution Architect
Ever since its inception, Cronos has demonstrated extraordinary growth, both in revenue and in staff numbers. Cronos' ability to adapt swiftly to ever-changing market demands has made it the preferred technology partner for many companies on the lookout for high-quality ICT solutions.
A thorough understanding of customers' needs and a continuous quest for innovation, combined with a consistent yet flexible attitude towards excellence, have paved the way to rapid growth and success.
Cronos is looking to hire a Big Data Solution Architect to help our customers build a Big Data environment on premises or in the cloud. He or she is able to describe the structure and behaviour of a Big Data solution and how it can be delivered using Big Data technology such as Hadoop, and needs hands-on experience with Hadoop applications (e.g. administration, configuration management, monitoring, debugging, and performance tuning).
The Big Data Solution Architect is responsible for managing the full lifecycle of a Hadoop solution, including:
• Working in customer-facing roles to contribute throughout the end-to-end delivery lifecycle of complex and large-scale technology solutions
• Use case and business case development
• Vendor landscape and technology assessment
• Solution Architecture formulation (including role of conventional information management platforms)
• Proof of concept planning through execution to prove the value of Big Data use cases
• Full lifecycle implementation, from requirements analysis and platform selection through technical architecture design, application design and development, testing, and deployment
• Assessing emerging trends and developments in the Big Data domain
The ideal candidate brings:
• Deep experience across systems integration, information management, data management and architecture, and business analytics
• Solution architecture development, including work estimation, resource allocation, and initial pricing of proposals; experience in presenting solutions to senior stakeholders both internally and at the client side
• Hands-on expertise with Apache Software Foundation projects and subprojects such as HDFS, MapReduce, Flume, YARN, Pig, Hive, HBase, Spark, Solr, Mahout,… preferably via distributions from vendors such as Cloudera, Hortonworks,…
• Firm understanding of major programming/scripting languages such as Java, PHP, Ruby, Python, and R, as well as Linux shell scripting
• Solid knowledge of RDBMS, ETL, and BI tooling
• Previous experience with cloud computing infrastructure (e.g. Amazon Web Services EC2) is an asset