
"We are looking for a Senior Data Engineer at GLS/NXT to ensure the reliability and integrity of our growing data infrastructure and analytics platform. In this role, you will work at the intersection of data engineering, product, and operations - driving quality across our data pipelines, models, and warehouse architecture. You'll design and implement testing strategies, proactively identify data quality issues, and collaborate with teams to build scalable, well-documented solutions."
"Collaborate across functions to scale GLS/NXT data infrastructure. Contribute to the roadmap and development of GLS/NXT data products. Design, build, monitor, and scale data pipelines for Data Warehousing, integrating various data sources and destinations. Develop data models and schemas supporting analytical requirements. Work with other engineering teams to resolve architecture and infrastructure challenges, designing effective solutions. Participate in cross-functional projects supporting data applications and reporting."
"Bachelor's degree in Computer Science or a related engineering field. 5+ years of proven experience in data engineering roles. Expertise in designing, building, monitoring, and scaling data engineering pipelines. Hands-on experience with workflow orchestration tools such as Apache Airflow for designing, scheduling, and managing complex data pipelines. Proficient in modern data programming languages, including Python and Go. Extensive hands-on experience with cloud platforms, specifically AWS and GCP. Familiarity with data warehousing technologies, including Google BigQuery and Snowflake. Proficient in SQL for diverse data sources,"
In short: design, build, monitor, and scale data pipelines and warehouse architectures integrating diverse data sources and destinations. Develop data models and schemas to support analytical requirements and reporting. Implement testing strategies and proactive monitoring to identify and resolve data quality issues. Collaborate with product, operations, and engineering teams to scale infrastructure, resolve architecture challenges, and contribute to data product roadmaps. Participate in cross-functional projects supporting data applications and reporting, and maintain comprehensive systems documentation. Requirements: a bachelor's degree in Computer Science or a related field, 5+ years of data engineering experience, and proficiency with Airflow, Python, Go, cloud platforms (AWS, GCP), BigQuery, Snowflake, and SQL.