IT/Software/Technology

Senior Hadoop Developer

Contract

Strategic Staffing Solutions

STRATEGIC STAFFING SOLUTIONS HAS AN OPENING!

Strategic Staffing Solutions is currently looking for a Senior Hadoop Developer for a contract opening with one of our largest clients located in Charlotte, NC!

This is a contract opportunity with our company that MUST be worked on a W2 basis only. There is no C2C eligibility for this position. Visa sponsorship is available! The details are below.

Location: Charlotte, NC 

Duration: 13+ Months

To apply: Please email your resume in Word format to Bob Cromer at bcromer@strategicstaff.com and reference Job Order #203505, or click the Apply button.

Job Description:

  • Lead or participate in complex initiatives in selected domains.
  • Assure quality, security and compliance for supported systems and applications.
  • Serve as a technical resource in finding software solutions. Review and evaluate user needs and determine requirements.
  • Provide technical support, advice, and consultation on issues relating to supported applications.
  • Create test data and conduct interface and unit tests.
  • Design, code, test, debug and document programs using Agile development practices.
  • Ensure compliance and risk management requirements for the supported area are met, and work with other stakeholders to implement key risk initiatives.
  • Research and resolve process-related problems, and recommend solutions and process improvements.
  • Assist other individuals in advanced software development.
  • Collaborate and consult with peers, colleagues, and managers to resolve issues and achieve goals.
  • Design and build data services that deliver Strategic Enterprise Risk Management data.
  • Design high-performing data models on big-data architecture as data services.
  • Design and build a high-performing, scalable data pipeline platform using Hadoop, Apache Spark, MongoDB, and object storage architecture (see the sketch after this list).
  • Design and build data services on a container-based architecture such as Kubernetes and Docker.
  • Partner with enterprise data teams such as Data Management & Insights and Enterprise Data Environment (Data Lake) to identify the best place to source the data.
  • Work with business analysts, development teams and project managers for requirements and business rules.
  • Collaborate with source system and approved provisioning point (APP) teams, Architects, Data Analysts and Modelers to build scalable and performant data solutions.
  • Work effectively in a hybrid environment where legacy ETL and data warehouse applications coexist with new big-data applications.
  • Work with infrastructure engineers and system administrators as appropriate in designing the big-data infrastructure.
  • Work with DBAs in the Enterprise Database Management group to troubleshoot problems and optimize performance.
  • Support ongoing data management efforts for Development, QA, and Production environments.
  • Utilize a thorough understanding of available technology, tools, and existing designs.
  • Leverage knowledge of industry trends to build best-in-class technology that provides competitive advantage.
  • Act as an expert technical resource to programming staff in the program development, testing, and implementation process.
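
For a sense of the day-to-day work, here is a minimal PySpark sketch of the kind of batch pipeline described above: read raw records from object storage, apply a simple transformation, and persist the result as partitioned Parquet. The bucket, paths, and column names are hypothetical assumptions for illustration, not client specifics.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("risk-data-pipeline").getOrCreate()

    # Read raw records from S3-compatible object storage (the s3a:// scheme
    # assumes the Hadoop S3A connector is on the classpath).
    raw = spark.read.option("header", "true").csv("s3a://example-bucket/raw/trades/")

    # Normalize a date column and drop rows missing a required field.
    cleaned = (
        raw.withColumn("trade_date", F.to_date("trade_date", "yyyy-MM-dd"))
           .filter(F.col("notional").isNotNull())
    )

    # Write partitioned Parquet so downstream consumers get columnar reads
    # with column pruning and predicate pushdown.
    (cleaned.write.mode("overwrite")
            .partitionBy("trade_date")
            .parquet("s3a://example-bucket/curated/trades/"))

    spark.stop()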

Required Qualifications:

  • 4+ years of Specialty Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education.
  • 5+ years of application development and implementation experience
  • 5+ years of experience delivering complex enterprise-wide information technology solutions
  • 5+ years of ETL (Extract, Transform, Load) Programming experience
  • 3+ years of reporting experience, analytics experience or a combination of both
  • 4+ years of Hadoop development/programming experience
  • 5+ years of operational risk or credit risk or compliance domain experience
  • 5+ years of experience delivering ETL, data warehouse and data analytics capabilities on big-data architecture such as Hadoop
  • 6+ years of Java or Python experience
  • 5+ years of Agile experience
  • 5+ years of design and development experience with columnar databases using Parquet or ORC file formats on Hadoop
  • 5+ years of Apache Spark design and development experience using Scala, Java, Python, or DataFrames with Resilient Distributed Datasets (RDDs); see the sketch after this list
  • 2+ years of experience integrating with RESTful APIs
  • Excellent verbal, written, and interpersonal communication skills
  • Experience designing and developing data analytics solutions using object data stores such as S3
  • Experience in Hadoop ecosystem tools for real-time & batch data ingestion, processing and provisioning such as Apache Spark and Apache Sqoop
  • Ability to work effectively in virtual environment where key team members and partners are in various time zones and locations
  • Ability to interact effectively and confidently with senior management
  • Knowledge and understanding of project management methodologies used in waterfall or Agile development projects
  • Knowledge and understanding of DevOps principles
  • A BS/BA degree or higher in information technology; Master's degree required
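
As a quick illustration of the Parquet and Spark DataFrame/RDD experience the qualifications call for, the hypothetical sketch below reads a columnar dataset with column pruning and predicate pushdown, then drops to the underlying RDD for row-level processing. The path and column names are assumptions for illustration only.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("columnar-read-demo").getOrCreate()

    # Selecting only the needed columns and filtering early lets Spark push
    # the work down into the Parquet reader.
    df = (
        spark.read.parquet("hdfs:///data/risk/positions/")
             .select("account_id", "exposure")
             .filter("exposure > 1000000")
    )

    # The same data as a Resilient Distributed Dataset, for functional-style
    # row-level processing: largest exposure per account, top 10.
    top_accounts = (
        df.rdd
          .map(lambda row: (row["account_id"], row["exposure"]))
          .reduceByKey(max)
          .take(10)
    )
    print(top_accounts)
    spark.stop()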

The S3 Difference:

The global mission of S3 is to build trusting relationships and deliver solutions that positively impact our customers, our consultants, and our communities. The four pillars of our company are to:

  • Set the bar high for what a company should do
  • Create jobs
  • Offer people an opportunity to succeed and change their station in life
  • Improve the communities where we live and work through volunteering and charitable giving

As an S3 employee, you’re eligible for a full benefits package that may include:

  • Medical Insurance
  • Dental Insurance
  • Vision Insurance
  • 401(k) Plan
  • Vacation Package
  • Life & Disability Insurance Plans
  • Flexible Spending Accounts
  • Tuition Reimbursement


Job ID: JOB-203505
Publish Date: 23 Jun 2022
