
Seeking Big Data Engineer Consultants for Global Consulting Company

on Fri, 06/10/2016 - 14:29

I am responsible for hiring Big Data professionals for Accenture.

Accenture serves over 90% of the Fortune Global 100, has offices in 200 cities across more than 55 countries, and was ranked by Forbes as one of the Top 25 Happiest Places to Work.

To view and apply for the Big Data Engineer Consultant role, visit Accenture.com, go to the Careers link, and enter the following in the search bar: 00386879

To view and apply for the Sr. Manager level Big Data role, enter the following in the search bar: 00386926

My email: Kelly.b.Smith@Accenture.com

*No agencies, please.

 

Organization: Digital Growth Platform

Location: Multiple US Locations

Travel: 80-100% (Monday-Thursday/Friday)

The Big Data Engineer Consultant empowers clients to turn information into action by gathering, analyzing, and modeling client data to enable smarter decision making. The consultant uses a broad set of analytical tools and techniques to develop quantitative and qualitative business insights, and works with partners as necessary to integrate systems and data quickly and effectively, regardless of technical challenges or business environments.

Job Description

Data Engineers at the Consultant level are responsible for the architecture, design, and implementation of full-scale Hadoop- and NoSQL-based solutions, including data acquisition, storage, transformation, security, data management, and data analysis using these technologies. A solid understanding of the infrastructure planning, scaling, design, and operational considerations unique to Hadoop, NoSQL, and other emerging data technologies is required. We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate an ability to identify and apply Hadoop and NoSQL solutions to data challenges and deliver better data solutions across industries.

Basic Qualifications

  • Bachelor's degree in Computer Science, Engineering, or Technical Science, or 3 years of IT/programming experience
  • Minimum 2 years of building and deploying Java applications in a Linux/Unix environment
  • Minimum 1 year of designing and building large-scale data loading, manipulation, processing, analysis, blending, and exploration solutions using Hadoop/NoSQL technologies (e.g., HDFS, Hive, Sqoop, Flume, Spark, Kafka, HBase, Cassandra, MongoDB)
  • Minimum 1 year of architecting and organizing data at scale for Hadoop/NoSQL data stores
  • Minimum 1 year of coding with MapReduce (Java), Spark, Pig, Hadoop Streaming, HiveQL, or Perl/Python/PHP for data analysis of production Hadoop/NoSQL applications

Preferred Skills

  • Minimum 2 years designing and implementing relational data models and working with RDBMS
  • Minimum 2 years working with traditional as well as Big Data ETL tools
  • Minimum 2 years of experience designing and building REST web services
  • Designing and building statistical analysis models, machine learning models, and other analytical models on large data sets using these technologies (e.g., R, MLlib, Mahout, Spark, GraphX)
  • Minimum 1 year of experience implementing large-scale cloud data solutions using AWS data services (e.g., EMR, Redshift)
  • 2+ years of hands-on experience designing, implementing, and operationalizing production data solutions using emerging technologies such as the Hadoop ecosystem (MapReduce, Hive, HBase, Spark, Sqoop, Flume, Pig, Kafka, etc.), NoSQL (e.g., Cassandra, MongoDB), in-memory data technologies, and data munging technologies
  • Architecting large-scale Hadoop/NoSQL operational environments for production deployments
  • Designing and building different data access patterns from Hadoop/NoSQL data stores
  • Managing and modeling data using Hadoop and NoSQL data stores
  • Metadata management with Hadoop and NoSQL data in a hybrid environment
  • Experience with data munging / data wrangling tools and technologies