The Big Data Design and Support role is responsible for designing and building world-class, high-volume, real-time data ingestion and processing frameworks and advanced analytics on big data platforms. The candidate will research, develop, optimize, and innovate frameworks and patterns for enterprise-scale data analysis and computation as part of our Big Data team.
• Strong Informatica knowledge and skills, especially Informatica Big Data Management
• Strong business/technical oral and written communication skills
• Minimum of 5 years of work experience in IT and data ecosystems
• Experience in process analysis, requirements analysis, and systems engineering
• Independent, structured, and analytical working style
• 4+ years of experience with programming/scripting languages such as Java, Scala, Python, R, and Pig
• 2+ years of experience developing solutions in cloud environments (Azure, AWS, etc.) with a focus on the analytics stack
• 2+ years of experience with big data streaming frameworks, data processing, and real-time ingestion patterns
• Experience working with and evaluating open source technologies and demonstrated ability to make objective choices
• Provide on-call support
• Experience architecting highly scalable, highly concurrent, and low-latency systems
• Experience scaling applications to process multiple petabytes of data
• Experience with Visualization Tools such as Tableau
• Knowledge of software/data design methods, data structures, and modeling standards
• Knowledge of Chef scripting and Docker containers
• Working experience with continuous integration and continuous delivery (CI/CD) tools
• Experience with multiple cloud computing platforms
• Experience with more than one data streaming technology
• Understanding of machine learning frameworks (e.g., Apache Mahout, Spark MLlib)
• Ability to communicate complex architectures and insights in a clear, precise, and actionable manner