
JOB DESCRIPTION


 

What you will do

  • Build data pipelines that collect and transform data to support ML models, analysis and reporting.
  • Work in a high-volume production environment, making data standardized and reusable, from architecture to production.
  • Work with off-the-shelf tools including DynamoDB, SQS, S3, Redshift, Snowflake, and MySQL, but often push them past their limits.
  • Work with an international multidisciplinary team of data engineers, data scientists and data analysts.


Who you are:

  • 5+ years of experience in data engineering / software engineering in the big data domain.
  • 5+ years of coding experience with Python or an equivalent language.
  • SQL expertise, working with various databases (relational and NoSQL), data warehouses, external data sources and AWS cloud services.
  • Experience in building and optimizing data pipelines, architecture and data sets.
  • Experience with ML pipelines and MLOps tools.
  • Familiarity with the data engineering tech stack - ETL tools, orchestration tools, microservices, Kubernetes (K8s), Lambda functions.
  • End-to-end experience - owning features from the idea stage through design, architecture, coding, integration, and deployment.
  • Experience working with cloud services such as AWS, Azure, or Google Cloud.
  • B.Sc. in Computer Science or an equivalent STEM degree.



Salary

Competitive (monthly based)

Location

Tel Aviv, Tel-Aviv District, Israel

Job Overview
Job Posted: 1 week ago
Job Type: Hybrid
Job Role: Engineer
Education: Bachelor's Degree
Experience: 5 - 10 Years
Slots: 1
