Job Description
Role: Hadoop Administrator
Visa: GC/USC/GC-EAD
Duration: 6 months
A LinkedIn profile is required.
KEY ROLES & RESPONSIBILITIES (required):
Manage and maintain the Spark environment for uninterrupted availability.
2+ years of hands-on experience supporting Spark ecosystem technologies in production.
Maintain, upgrade, and install production environments for large-scale data and analytical applications.
3+ years of hands-on scripting experience with Python or Anaconda.
A background as a SQL DBA would be a big plus.
Experience with data science tools (e.g., DataRobot) is also a plus.
Routinely check, back up, and monitor the entire system.
Ensure that connectivity and the network are always up and running, and plan for capacity upgrades or downsizing as the need arises.
Focus on continuous improvement, including the use of appropriate tools, techniques, and automation.
Gather and present metrics and activity reports.
EDUCATION
A bachelor's degree in informatics or life sciences, or equivalent work experience, is required.
PREFERRED QUALIFICATIONS (optional):
A Master of Science (MS) degree in a relevant area of study is preferred.
Scripting experience with Bash, Python, or other shell scripting; must feel comfortable writing automation scripts.
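As a rough illustration of the kind of automation script the role calls for (the monitoring responsibilities above mention routine system checks), here is a minimal Python sketch; the function name, threshold, and monitored path are hypothetical and would vary per environment:

```python
import shutil

def check_disk_usage(path="/", threshold_pct=80.0):
    """Return (used_pct, over_threshold) for the filesystem holding `path`.

    `threshold_pct` is an assumed alerting threshold, not a standard value.
    """
    usage = shutil.disk_usage(path)
    used_pct = usage.used / usage.total * 100
    return used_pct, used_pct >= threshold_pct

if __name__ == "__main__":
    # Flag the root filesystem when usage crosses the threshold.
    pct, over = check_disk_usage("/")
    status = "ALERT" if over else "OK"
    print(f"[{status}] root filesystem at {pct:.1f}% capacity")
```

In practice a script like this would be scheduled (e.g., via cron) and extended to cover Spark worker nodes and backup targets rather than a single mount point.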