• Preserve customer trust
  • Think customer first
  • Demonstrate extreme ownership
  • Fail fast and learn fast
  • Know when to lead and know when to follow
  • Support your team
  • Design, implement, and deploy data-intensive applications at global scale
  • Be proactive in identifying and solving operational issues
  • Monitor and evaluate application performance
  • Provide technical input; evaluate and recommend new ways to improve the reliability, scalability, and maintainability of the application
  • Build data pipelines and ETL processes, and manage high-volume data across distributed systems
  • Research, analyze, and formulate recommendations on technologies, products, and solutions that fulfill customer requirements within the enterprise
  • Have fun doing it
Basic Qualifications:
  • BS or MS in computer science or related fields
  • Experience working within product development teams and using tools such as GitHub, Jenkins, and Jira
  • Hands-on experience with threat/anomaly detection and prevention systems/tools
  • Professional experience developing and deploying production-level code in Java
  • Experience administering Big Data systems and related technologies, such as Hadoop (Hortonworks) and Elastic (ELK)
Preferred Qualifications:
  • BS in Computer Science or related field with 3+ years of experience or MS in Computer Science or related field with 2+ years of experience
  • Experience troubleshooting issues in complex, distributed, multi-tier architectures
  • Experience building data-intensive distributed systems
  • Experience in security engineering and operations related to threat detection systems/tools
  • Experience with any of Apache Kafka, Hive, or Hadoop
  • Experience running analytics on large data sets
  • Experience with Elasticsearch (ELK stack)
  • Experience with DevOps and Infrastructure as Code (SaltStack, Puppet)
  • Experience developing and deploying containerized applications (Docker) on Kubernetes
  • Experience in software development using Python, Go, or SQL