Key responsibilities:
- Designing, creating, testing, and deploying Apache Flink jobs for stream data processing (see the illustrative sketch after this list).
- Maintaining and developing the Apache Flink cluster, including monitoring, diagnosing, and resolving performance and stability issues.
- Collaborating with the BI team on designing and optimizing queries and structures in the Apache Doris database, and assisting in the conversion of Qlik reports.
- Implementing and maintaining a data warehouse for reporting and analytical systems.
- Designing, implementing, and further developing a data collection system for water meters and power generators (IoT) using technologies such as MQTT and Modbus.
- Participating in integration projects at client sites, including occasional onsite visits throughout Poland.
- Contributing to the creation and improvement of processes and best practices related to maintaining production and test environments.
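For illustration only, below is a minimal sketch of the kind of Flink streaming job the first responsibility refers to: reading raw readings from a Kafka topic and applying a trivial transformation using the Flink DataStream API with the Kafka connector. The broker address, topic, group id, and class name are hypothetical placeholders, and a real job would add parsing, aggregation, and a sink (for example, Apache Doris).

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class MeterReadingsJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Kafka source; broker, topic, and group id are placeholder values.
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("kafka:9092")
                .setTopics("meter-readings")
                .setGroupId("flink-meter-readings")
                .setStartingOffsets(OffsetsInitializer.latest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        env.fromSource(source, WatermarkStrategy.noWatermarks(), "meter-readings-source")
           // Placeholder transformation: a production job would parse, aggregate,
           // and write the readings to a sink such as Apache Doris.
           .map(value -> value.trim())
           .print();

        env.execute("meter-readings-job");
    }
}
```

This is a sketch under stated assumptions, not a description of the team's actual pipelines; the real jobs, topics, and sinks depend on the project.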
Job requirements:
- Experience working with Apache Flink, Apache Kafka, and Apache Doris.
- Good self-organization skills, with the ability to solve problems independently and to collaborate with a team.
- Ability to create and maintain clear pre-implementation and post-implementation documentation.
- Basic knowledge of database configuration, administration, and optimization (MSSQL, MySQL/MariaDB, PostgreSQL, Apache Doris).
- Knowledge of SQL and principles of data modeling in analytical and reporting environments.
- Proficiency in English sufficient to read technical documentation and to communicate with vendor support teams.
We offer:
- Attractive salary (B2B contract)
- Exposure to new technologies
- Involvement in the implementation and maintenance of multiple projects
- Hands-on experience with a variety of IT environments and infrastructures
- Access to internal and external training
- Team-building events and a casual work atmosphere (check our social media to see what we mean)
- Opportunity for promotion to a Team Leader position for individuals who show commitment, initiative, and leadership skills
- Subsidies for training and courses
- Company phone for personal use
- No dress code
- Coffee / tea
Nice-to-have skills:
- Experience with Change Data Capture (CDC), ZooKeeper, Ceph, Keepalived, Proxmox, or a similar stack.
- Good knowledge of Linux operating systems (RHEL and derivatives, Debian and derivatives) in the scope of daily administration.
- Experience in the data warehouse area, including architecture design, data processing pipelines, and the integration of streaming data.