Job Summary
Job Description
DUTIES: Provide technical leadership in manipulating and transforming large, complex data using SQL and ETL processes; manage code repositories using GitHub; work in an Agile development environment; use Python for data manipulation, analysis, and scripting; create data visualizations and dashboards using Tableau and ThoughtSpot; work with telecommunications architecture and wireless technologies; analyze and process large volumes of data using Spark and Hadoop; process data using Databricks, Linux, and Unix shell scripting; deploy, manage, and scale data infrastructure and services using AWS; design and architect databases and applications, and lead projects through implementation, using an in-depth understanding of the product life cycle and industry technical knowledge; determine the appropriate data storage approach for optimal data organization; determine how tables relate to each other and how fields interact within the tables to develop relational models; collaborate with technology and platform management partners to optimize data sourcing and processing rules and ensure appropriate data quality; create system architecture, designs, and specifications, using in-depth engineering skills and knowledge to solve difficult development problems and achieve engineering goals; work closely with a variety of team members to clearly define data product requirements and technical roadmaps; determine and source appropriate data for a given analysis; work with data modelers/analysts to understand the business problems they are trying to solve, then create or augment data assets to feed their analysis; integrate knowledge of business and functional priorities; and guide and mentor junior-level engineers. Position is eligible to work remotely one or more days per week, per company policy.
REQUIREMENTS: Bachelor’s degree, or foreign equivalent, in Computer Science, Engineering, or a related technical field, and five (5) years of experience manipulating and transforming data using SQL and ETL processes; managing code repositories using GitHub; and working in an Agile development environment; of which three (3) years of experience include using Python for data manipulation, analysis, and scripting; creating data visualizations and dashboards using Tableau and ThoughtSpot; and working with telecommunications architecture and wireless technologies; of which one (1) year of experience includes analyzing and processing large volumes of data using Spark and Hadoop; processing data using Databricks, Linux, and Unix shell scripting; and deploying, managing, and scaling data infrastructure and services using AWS.
Disclaimer:
This information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications.
Skills
Data Visualization, Extract Transform Load (ETL), Structured Query Language (SQL)

We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That's why we provide an array of options, expert guidance and always-on tools that are personalized to meet the needs of your reality—to help support you physically, financially and emotionally through the big milestones and in your everyday life.
Please visit the benefits summary on our careers site for more details.
Company: Comcast
Employee Type: Full time
Location: United States
Salary: $107,800 - $200,200