• Apply proven expertise to build high-performance, scalable data warehouse applications
• Securely source external data from numerous global partners
• Intelligently design data models for optimal storage and retrieval
• Deploy comprehensive data quality checks to ensure high data quality
• Optimize existing data pipelines, implement new ones, and maintain all domain-related pipelines
• Ownership of the end-to-end data engineering component of the solution
• Collaboration with the program’s SMEs and data scientists
• Proficiency in LAMP and Big Data stack environments (Hadoop, MapReduce, Hive)
• Competence with relational databases (Oracle, MySQL, Vertica)
• Experience working with enterprise DE tools (Informatica); ability to learn in-house DE tools
• Coding and scripting experience with Python, Java, PHP, SQL, and CLI