Offer overview

Location
Warszawa, Mazowieckie
Type of employment
Full-time
Net salary
16,800 zł - 25,200 zł per month
Date posted
2 years ago

Details

Offer ID
2559
Work mode
Hybrid
Company size
200+
Salary
Stated in the offer
Technologies used
Python, Snowflake, Unix, BI, Database
Contract type
B2B
Recruitment
Online
Recruitment language
Polish, English
Benefits
Private medical care, course funding, international projects, small teams, flexible working hours
Experience level
Mid
Required skill
Python

Offer description

  • Experience in an agile environment with quick turnarounds and strict deadlines.
  • Familiarity with CI/CD concepts and processes.
  • Minimum 3 years of experience developing or architecting large-scale data solutions, including data ingestion pipelines: performing architectural assessments, crafting architectural options and analyses, and finalizing the preferred solution alternative in collaboration with IT and business stakeholders.
  • Solid Python programming experience.
  • Hands-on experience with Docker, Git, and REST APIs.
  • Linux and Bash skills.
  • A proactive approach and out-of-the-box thinking.
  • Hands-on experience building complex applications.
  • Solid experience in and understanding of architecting, designing, and operationalizing large-scale data and analytics solutions.

About the position / project

A new Python Developer will join the Data Warehousing Development team.

In this team you will develop, maintain, test, and evaluate data-sourcing solutions within data platforms. The team is also involved in designing solutions that match platform capabilities to prioritized business challenges, and is responsible for delivering the ETL development capability on the platform.

You can work hybrid from Tricity or Warsaw, because we value direct contact.

Responsibilities

  1. Developing backend applications combining Python with a Snowflake connection.
  2. Developing Unix, Python, and similar scripts to extract, transform, and load data.
  3. Providing production support for Data Warehouse issues such as data-load and transformation problems.
  4. Translating BI and reporting requirements into database and reporting designs.
  5. Understanding data pipelines and modern ways of automating them using cloud-based solutions.
  6. Testing and clearly documenting implementations so others can easily understand the requirements, implementation, and test conditions.
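The extract-transform-load flow from point 2 can be sketched in plain Python. This is a minimal, hypothetical illustration only: the field names and functions are invented for the example, and in a real setup the load step would write to Snowflake (e.g. via the snowflake-connector-python package) rather than an in-memory list.

```python
# Hypothetical ETL sketch; names and data are illustrative, not from the offer.
import csv
import io

RAW_CSV = """id,amount,currency
1,100.50,PLN
2,200.00,PLN
"""

def extract(raw: str) -> list:
    """Extract: parse raw CSV text into a list of row dictionaries."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list) -> list:
    """Transform: cast string fields to typed values for the warehouse."""
    return [
        {"id": int(r["id"]), "amount": float(r["amount"]), "currency": r["currency"]}
        for r in rows
    ]

def load(rows: list, target: list) -> None:
    """Load: append rows to an in-memory stand-in for a warehouse table."""
    target.extend(rows)

warehouse = []
load(transform(extract(RAW_CSV)), warehouse)
print(len(warehouse))  # 2
```

In production the same three stages typically run inside a scheduled pipeline, with the load stage executing parameterized INSERT or COPY statements against the target database.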