Senior Data Engineer (Networking Domain)

Offer overview

Location
Wrocław, Dolnośląskie
Type of work
Full-time
Net salary
26,000 zł - 35,000 zł per month
Date posted
2 years ago

Details

Offer ID
3276
Work mode
Remote
Company size
more than 200
Salary
Listed in the offer
Technologies used
Python, GCP, AWS, Linux, BigQuery, Redshift, SQL, CI/CD, Docker, Kubernetes
Contract type
B2B
Recruitment
Online
Recruitment language
Polish
Benefits
Private medical care, Training courses funded, Team-building events, No dress code, Lunch card, Free coffee, Play room, Shower, Bicycle parking, Startup atmosphere, International projects, Small teams, Flexible working hours
Experience level
Senior / Expert
Required
Python

Requirements

  • Hands-on experience with GCP's BigQuery and Cloud Functions (or equivalent services on other clouds, e.g. AWS Redshift and Lambda); a short BigQuery query sketch follows this list
  • Expertise in the telemetry domain
  • Experience with writing technical documentation (HLD, LLD)
  • Experience with solution architecture (Open Architecture, C4 modeling)
  • Knowledge of networks, network security and network platforms
  • Experience with data stream processing
  • Very good understanding of database design concepts and approaches
  • Knowledge of Python
  • Knowledge of query languages (BigQuery/SQL/…)
  • Experience with CI/CD tools and processes
  • Basic understanding of virtualization technologies, with an emphasis on Kubernetes and Docker, would be a plus
  • Experience with non-relational databases would be a plus
  • Good communication skills and English (C1 level: fluent, communicative and technical); ability to discuss technical solutions with the team and the customer’s technical representatives and to validate the solution with the client
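
As a concrete, purely illustrative example of the BigQuery and Python skills listed above, the short sketch below queries a hypothetical telemetry table with the google-cloud-bigquery client; the project, dataset, and column names are assumptions, not details from the offer.

    from google.cloud import bigquery

    client = bigquery.Client()  # project and credentials come from the environment

    # Hypothetical table and columns, shown only to illustrate the query style.
    query = """
        SELECT device_id, AVG(latency_ms) AS avg_latency_ms
        FROM `example-project.telemetry.network_metrics`
        WHERE ingestion_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
        GROUP BY device_id
        ORDER BY avg_latency_ms DESC
        LIMIT 10
    """

    for row in client.query(query).result():
        print(row.device_id, row.avg_latency_ms)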

Offer description

The project and the team

Our client is a networking company building an ETL solution that gathers telemetry data and exposes it through an API. The platform is being extensively extended and integrated with the customer’s other subsystems.

What else you should know:

  • The team consists of fewer than 10 people, including an architect, an engineering manager, and software/data engineers familiar with numerous APIs and with data structuring and processing techniques, presenting output in multiple ways depending on business needs
  • We use the Agile approach
  • Our tech stack for the project includes: the core of the system on GCP (BigQuery, Cloud Functions, Cloud Run, Scheduler, API Gateway), and systems integrated with the core: AWS, Azure, other relational and non-relational databases, data repositories and custom APIs (see the Cloud Function sketch after this list)
  • The client is based in the US
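
As a hedged illustration of how the core GCP pieces above typically fit together (not the client's actual code), the sketch below shows an HTTP-triggered Cloud Function that streams telemetry records into BigQuery; the table name and payload shape are assumptions.

    import functions_framework
    from google.cloud import bigquery

    TABLE_ID = "example-project.telemetry.raw_events"  # hypothetical table
    client = bigquery.Client()

    @functions_framework.http
    def ingest_telemetry(request):
        """Accept a JSON payload of telemetry events and stream them into BigQuery."""
        payload = request.get_json(silent=True) or {}
        rows = payload.get("events", [])
        if not rows:
            return ("no events provided", 400)

        # Streaming insert; a load job would usually be preferred for large batches.
        errors = client.insert_rows_json(TABLE_ID, rows)
        if errors:
            return ({"status": "error", "details": errors}, 500)
        return ({"status": "ok", "inserted": len(rows)}, 200)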

We work on multiple interesting projects at a time, so we may invite you to an interview for another project if we see that your competencies and profile are well suited for it.

Responsibilities

  1. Designing an ETL system’s architecture in a cloud (GCP) environment 
  2. Implementing features in Python, utilizing GCP services (BigQuery, Cloud Functions, Cloud Run, Scheduler, API Gateway)
  3. Implementing open telemetry solutions suitable for either OLTP or OLAP approaches 
  4. Investigating possible bottlenecks and improving overall ETL performance 
  5. Taking part in technical design discussions 
  6. Delivering automatic tests for your code (a minimal pytest sketch follows this list)
  7. Validating the solution with the client (demo) 
  8. Working according to Agile methodology and collaborating with the client’s teams
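
For point 6, the sketch below shows the kind of automatic test that typically accompanies ETL code: a pure transformation function plus pytest cases. The function and its field names are hypothetical and serve only to illustrate the testing style.

    import pytest

    def normalize_latency(record: dict) -> dict:
        """Convert a raw latency reading in seconds to milliseconds (hypothetical transform)."""
        if "latency_s" not in record:
            raise ValueError("missing latency_s")
        return {**record, "latency_ms": record["latency_s"] * 1000.0}

    def test_normalize_latency_converts_units():
        out = normalize_latency({"device_id": "r1", "latency_s": 0.042})
        assert out["latency_ms"] == pytest.approx(42.0)

    def test_normalize_latency_rejects_missing_field():
        with pytest.raises(ValueError):
            normalize_latency({"device_id": "r1"})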