Senior Data Engineer
About The Position
We are looking for a Senior Data Engineer to join our data science team! Get ready to work with everything data-related: expanding and optimizing our data pipeline architecture and data flows, building and improving our data systems, and collecting data for cross-functional teams.
Come work in our Ramat Hahayal office and seize an exceptional opportunity to tackle multiple challenging projects while leveraging data sources to come up with new and innovative ideas!
What’s the Job?
- Create and maintain optimal data pipeline architecture
- Assemble large, complex data sets that meet both functional & non-functional business requirements
- Identify, design, and implement improvements in internal processes: manual processes automation, data delivery optimizations, infrastructure re-designs for greater scalability, etc.
- Use SQL and big data cloud technologies to build the infrastructure required for optimal loading, extraction, and transformation of data from a wide variety of sources
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
- Work with data teams to assist with data-related technical issues and support their needs associated with data infrastructure
- Create data tools for analytics and help our team of data scientists turn our products into the most advanced in the industry
- Work with data and analytics experts, striving to improve our data systems functionality
Requirements
- 5+ years of experience as a Data Engineer
- 5+ years of experience with data warehousing and storage technologies such as Amazon Redshift, Snowflake, Redis, and MongoDB, and experience with BI solutions development (DWH, ETL)
- 5+ years of experience with SQL and ETL tools such as SSIS or Informatica
- 5+ years of experience with Python
- 5+ years of experience in open-source Big Data tools like Hadoop, Hive, Spark, Presto, Sqoop
- Experience with data platform architecture, in particular Big Data architecture, and a deep understanding of big data storage formats, especially Apache Parquet and Snappy compression
- Experience with AWS cloud services such as S3, EMR, Athena, Lambda, Kinesis, Glue, etc.
- Experience with Docker, Kubernetes, Fargate, ECS, and Jenkins-based CI/CD
- Experience with Cython, NumPy, pandas, Koalas, PyArrow, fastparquet, SciPy, Celery, asyncio, Django
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
- Willing to collaborate and problem-solve in an open team environment
- Flexible and adaptable in learning and understanding new technologies
- B.Sc. or higher degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field
- Experience with Linux administration
- Experience with software architecture
- Understanding of cloud security best practices
- Experience with agile development (Scrum, Kanban)
- Experience supporting and working with cross-functional teams in a dynamic environment
- Ability to take ownership and facilitate consensus among a diverse group of stakeholders
PLEASE NOTE: ONLY CVs IN ENGLISH WILL BE ACCEPTED.
Webpals Group is a leading digital performance publisher. We deliver the most relevant and valuable users to global online businesses. Our most talented technology and digital marketing experts are committed to driving exceptional performance.
You’ll be joining our Tech Department – the tech nerds who work on challenging projects spanning cutting-edge data-driven content, ML/AI aimed at personalized customer journeys, a data pipeline platform, and more.