In this role, you will:
- Take ownership of, improve, scale, and iterate on existing data processing pipelines.
- Design and implement new data processing pipelines.
- Collect and monitor performance metrics.
- Play a central role in discussing and implementing security best practices.

What we expect from you:
- Good knowledge of at least one programming language (C++, Python, Scala, Java).
- Experience with AWS services (S3, EC2, IAM, EMR, Glue, Athena, Kinesis) or another cloud platform.
- Experience with the Hadoop (or similar) ecosystem.
- Comfortable with SQL, with a good understanding of SQL engine basics.
- Solid understanding of distributed computing principles.
- Experience with workflow management tools (Airflow, Oozie, Luigi).
- Ability to explain your position and give arguments for your decisions.
- Creative, resourceful, and innovative problem solving.
- Good communication skills in English, both written and spoken.
- Prior experience in the mapping, navigation, or automotive industry.
- Hands-on experience with data processing platforms/frameworks (Spark or similar).
Vacancy published on 26 February 2025 in Minsk.