It's fun to work in a company where people truly BELIEVE in what they are doing!
We're committed to bringing passion and customer focus to the business.
Job Location: All Fractal Locations

Job Description

Platform:
• Google Cloud Platform
• Cloud Storage buckets
• Dataproc
• Dataflow
• BigQuery
• Cloud Run
• IAM
• Composer (Airflow)
• Big Data technologies: Spark, Hive
• Python, Scala
• SQL
• CI/CD: Jenkins/GitHub/Nexus

Outcomes/Success Criteria
• Drive the implementation of the defined innovation strategy through the design and architecture of analytics platform components/services on Google Cloud Platform infrastructure, and provide technical expertise to deliver advanced analytics
• Design, build and operate a secure, highly scalable, reliable platform to consume, integrate and analyze complex data using a variety of best-in-class platforms and tools
• Collaborate with global IT teams to develop Big Data reference architecture patterns for data ingestion, data processing, analytics and data science solutions
• Drive innovation by developing proofs of concept and prototypes to illustrate approaches to technology and business problems
• Provide strong technical skills, with the ability to design, architect, and work through low-level implementation details
• Be a hands-on developer, building scalable, real-time Big Data systems while providing a deep business understanding of CPG functions and processes
• Proven experience working with globally distributed agile teams
• Develop and maintain modern development and data management methodologies

Qualifications & Experience
• A bachelor's degree in computer science, math, physics, engineering or a related field
• 3+ years of progressive experience in data and analytics, with at least 6 months of experience on Google Cloud Platform
• 3+ years of overall progressive experience working in software engineering teams
• Experience with the Google Cloud Platform technology stack for data analytics and Big Data is required (BigQuery: DWH concepts, ANSI SQL; Bigtable: HBase; Dataproc: Spark/Hadoop; Dataflow: Apache Beam with Scala/Python)
• Experience developing data pipelines, ETL processes, data platforms, data modelling and data governance processes to deliver end-to-end solutions
• Solid foundation in design, data structures and algorithms; strong analytical and debugging skills; experience with customer-facing products
• Good understanding of private and public cloud design considerations and limitations in the areas of virtualization, global infrastructure, distributed systems, load balancing, networking, massive data storage, Hadoop, MapReduce and security
• Strong programming experience in Python and Scala
• Experience with development tools such as Eclipse and IntelliJ
• Experience with Jenkins, GitHub flow and an artifact repository
• Experience developing real-time data streaming pipelines
• Drive innovation by assessing, piloting and building DevOps/cloud tooling and services to improve overall developer experience and productivity
• Expertise and experience in building large-scale, cloud-based and open-source projects
• Hands-on, solution-driven attitude, with the ability to turn around quick insights while delivering strategic capabilities
• Ability to multi-task across multiple complex programs at the same time in an agile environment
• Strong organization and prioritization skills, along with outstanding written and verbal communication skills
• Knowledge of agile software development methodologies
If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!