- Apply diverse data technologies such as Spark, Kafka, ClickHouse, and similar tools from the Big Data landscape to build business-critical, scalable, and robust data pipelines and intuitive data products that enable self-service data discovery and analysis;
- Design an extensible, highly scalable, and optimized real-time data analytics platform;
- Own the data products critical for product innovation and recommendation engines;
- Continually acquire new data sources to develop an increasingly rich dataset that characterizes users, content, and marketing;
- Be comfortable outside your comfort zone: explore new tech, build your own tools, or find a new way to solve an old problem.
- Senior-level experience in software engineering, deployment, and integration with data delivery systems and other components, including building microservices and providing APIs for model access;
- Proficiency in at least one high-level programming language (Scala, Java, Python, or equivalent);
- Experience with data environments such as data lakes, data warehouses, and data marts, with data transformation concepts, and with working on large data volumes;
- Experience building ETL pipelines with Apache Airflow to perform feature engineering on large-scale datasets;
- Experience sourcing and modeling data from application APIs;
- Experience building stream-processing applications using Spark Streaming, Kafka Streams, or similar.
- Work in an international IT product company with offices in 4 countries
- Remote full-time work or work from a comfortable office; what matters is the result, not where you work from
- Flexible schedule: you only need to coordinate time zones and keep enough overlap in working hours with the team
- 4 paid sick days and 1 day off per calendar year
- Sports program compensation
- Free online English lessons with a native speaker
- Generous referral program bonuses, paid to both the referring employee and the candidate who accepts the offer
- Training, internal workshops, participation in international professional conferences and corporate events
- A comprehensive relocation program for both current employees and new hires.
- Experience with low-latency NoSQL datastores (such as HBase, Cassandra, MongoDB), relational databases (such as MySQL, Postgres), and search systems (such as Elasticsearch, Solr) is a plus;
- Experience with running Machine Learning models in production is a plus;
- Familiarity with DevOps tools (e.g. Ansible, Docker, Kubernetes);
- Master's, Ph.D., or equivalent experience in Software Engineering, Mathematics, or Computer Science.