Big Data Senior Developer


Eagle is assisting our client in the search for a Big Data Senior Developer. This is a three (3) month contract position with a top tier organization, scheduled to begin immediately.

Summary:

Our client's Network Big Data team is seeking an experienced senior developer capable of owning the technical delivery of complex development projects.

The suitable candidate will have advanced development skills in a real-time Big Data environment, including experience with the following tools:

Tool/Environment                                    Skill level required        Minimum years' work experience
Scala                                               Expert                      3
Kafka                                               Expert                      3
HDFS                                                Expert                      3
Spark Streaming                                     Expert                      3
Oozie                                               Expert                      3
Docker                                              Strong working knowledge    1
Kubernetes                                          Strong working knowledge    1
Solr                                                Expert                      2
Linux Scripting                                     Expert                      3
Sqoop                                               Expert                      2
sshfs                                               Strong working knowledge    1
Networking and connectivity                         Expert                      3
sftp, ssh, ssl                                      Expert                      3
Hadoop and Kafka environments with Kerberos enabled Expert                      3

  • The developer must have extensive experience reading real-time data from a Kerberized Kafka cluster using Scala/Spark Streaming and making data available in real-time to Kerberized Hadoop cluster data stores including:
    • Hive tables accessible via Impala;
    • Solr;
  • Developer must be capable of independently working through technical challenges and providing thought leadership to junior and intermediate developers on the team;
  • Developer must have experience creating solutions consistent with security best practices to ensure that sensitive data is secured properly in flight and at rest;
  • The candidate must have excellent communication skills, as they will be working directly with the business in an Agile environment to understand and refine technical requirements.
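To illustrate the core responsibility above, here is a minimal sketch of the kind of pipeline the role involves: a Spark Structured Streaming job consuming from a Kerberized Kafka topic and landing the data in an HDFS location backing a Hive table that Impala can query. All names (brokers, topic, paths) are illustrative assumptions, and Kerberos credentials are assumed to be supplied externally (keytab and JAAS configuration via spark-submit), as is typical for such clusters.

```scala
// Sketch only: reads a Kerberized Kafka topic with Spark Structured Streaming
// and writes micro-batches as Parquet into HDFS for a Hive/Impala table.
// Broker list, topic, and paths below are hypothetical placeholders.
import org.apache.spark.sql.SparkSession

object NetworkEventsIngest {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("network-events-ingest")
      .enableHiveSupport()
      .getOrCreate()

    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker1:9093")     // hypothetical brokers
      .option("subscribe", "network.events")                 // hypothetical topic
      .option("kafka.security.protocol", "SASL_SSL")         // Kerberized cluster
      .option("kafka.sasl.kerberos.service.name", "kafka")
      .load()
      .selectExpr("CAST(value AS STRING) AS payload", "timestamp")

    // Land micro-batches under the HDFS path of a Hive external table;
    // Impala sees new data after a metadata refresh on that table.
    events.writeStream
      .format("parquet")
      .option("path", "/data/network_events")                // illustrative HDFS path
      .option("checkpointLocation", "/checkpoints/network_events")
      .start()
      .awaitTermination()
  }
}
```

A production version would additionally parse and validate the payload, handle schema evolution, and encrypt data in flight (SASL_SSL above) and at rest (e.g., HDFS transparent encryption), in line with the security requirements listed.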

Other attributes that are valuable for the role include:

  • Proven skills in developing high-quality, highly optimized, high performance and maintainable software for big data solutions specifically in the Hadoop ecosystem;
  • Experience in architecture, design, software development, testing, deployment, maintenance, production and operation of data solutions;
  • Experience building and testing code in non-production environments, including unit, regression, performance and end-to-end testing;
  • Working experience developing projects in IntelliJ or Eclipse with Maven and integrating to GitHub;
  • Able to follow software development life cycle (SDLC), development and security standards;
  • Ability to measure software performance in non-production and production environments and improve its efficiency;
  • The ability to troubleshoot connectivity issues between source and target systems, including problems with routing and firewall rules;
  • The ability to support customer issues and incidents regarding the big data platforms through to resolution;
  • Automating repetitive yet complex tasks to streamline operations;
  • Exposure to Continuous Improvement methods;
  • Proficient understanding of distributed computing principles; and,
  • The candidate should possess a degree in Engineering, Mathematics, Science, or Computer Science, or alternatively a diploma in software development with a focus on Big Data languages/tools.

 Don’t miss out on this excellent career opportunity, apply online today!

Eagle is an equal opportunity employer and will provide accommodations during the recruitment process upon request. We thank all applicants for their interest; however, only candidates under consideration will be contacted. Please note that your application does not signify the beginning of employment with Eagle and that employment with Eagle will only commence when placed on an assignment as a temporary employee of Eagle.

JOB ID# 54421

  • Posted On: April 02, 2018
  • Job Type: Contract
  • Location: Toronto/GTA ON