Kafka Developer

We are seeking a Kafka Developer with development experience in the Hadoop ecosystem. The position will primarily be responsible for interfacing with key stakeholders and applying technical proficiency across different stages of the Software Development Life Cycle, including requirements elicitation, application architecture definition and design. You will play an important role in creating high-level design artifacts, deliver high-quality code for a module, lead validation for all types of testing, and support activities related to implementation, transition and warranty. You will be part of a learning culture where teamwork and collaboration are encouraged, excellence is rewarded, and diversity is respected and valued.

Required Qualifications:
  • Candidate must be located within commuting distance of Phoenix, AZ or be willing to relocate to the area. This position may require travel in the US and Canada.
  • Bachelor’s Degree or foreign equivalent; work experience will be considered in lieu of a degree
  • 2+ years of experience with Information Technology
  • Experience in the Hadoop ecosystem, e.g. Hadoop, Cloudera, Scala, Spark, Kafka
  • Strong knowledge of object-oriented concepts, data structures and algorithms
  • Good experience in end-to-end implementation of DW/BI projects, especially data warehouse and data mart development
  • Knowledge of and experience with the full SDLC
  • Experience with Lean / Agile development methodologies
  • U.S. Citizenship or Permanent Residency required; we are not able to sponsor at this time
Preferred Qualifications:
  • 1.5+ years of experience in the software development life cycle
  • 1.5+ years of experience in Project life cycle activities on development and maintenance projects
  • 1+ year of experience in the Hadoop ecosystem, e.g. Hadoop, Spark, Kafka, Python
  • At least 1 year of experience in Relational Modeling, Dimensional Modeling and Modeling of Unstructured Data
  • Good understanding of Data integration, Data Quality and data architecture
  • Good expertise in impact analysis for changes and issues
  • Experience in preparing test scripts and test cases to validate data and maintaining data quality
  • Strong understanding of, and hands-on programming/scripting skills in, UNIX shell, Perl, and JavaScript
  • Experience with design and implementation of ETL/ELT framework for complex warehouses/marts. Knowledge of large data sets and experience with performance tuning and troubleshooting
  • Hands-on development, with a willingness to troubleshoot and solve complex problems
  • Exposure to CI/CD practices
  • Ability to work in a team in a diverse, multi-stakeholder environment
  • Ability to communicate complex technology solutions to diverse audiences, namely technical, business and management teams
  • Excellent verbal and written communication skills
  • Experience and desire to work in a Global delivery environment
Please send your resume to ben.cofield@noviglobal.com for more information.

Novi Global does not discriminate in any way and is acting as an Employment Agency in relation to these vacancies.
Job Type: General
Job Title: Kafka Developer
Location: United States - Arizona
Sector: General
Salary: £Negotiable
