Senior Data Engineer

SeQura
100% remote from Europe
Full-time

Job description

Hi there! We are SeQura, an innovative, fast-growing fintech company based in sunny Barcelona, born in 2013, seeking to be the preferred payment platform that helps people buy the things they love, paying in an instant, fair, and flexible way. Humanizing financial services is our main mantra, and that’s why our actions and work are based on transparency, ambition, respect, and collaboration.

Our growth and financing model directly impacts our culture and approach to business. We have a long-term vision and operate with freedom; we are a company independent of venture capital, and we are the owners of our own destiny.

We are a big community formed not only by us but also by more than 4,000 Merchants who offer our payment methods to more than 1 million Shoppers, who have led us to the highest NPS in our industry in Spain.

We have an exciting new opening for a Senior Data Engineer to join our growing team of data and analytics experts. You will play a key role in designing and developing our data ecosystem, big data pipelines, data tooling, and processes. You will also help make our data accessible, reliable, and efficient to query. 

You will contribute your knowledge and experience to improve and expand the company's data flows. The ultimate goal is to set up, evolve, and maintain an infrastructure so that the different company departments can independently understand, handle, and manipulate their own data and KPIs.

If you're a proactive, determined, detail-oriented, and hands-on person who wants to help improve SeQura's product, with ownership and the chance to work hand in hand with seasoned professionals in a collaborative environment, this is the perfect place for you.

In our growing (40+ people) tech & product team, we have a pragmatic, agile, quality-oriented development methodology.

We like lean & agile methodologies, XP, good practices, design principles, and patterns.

We enjoy a healthy culture that pursues a balance between personal and professional life, encourages us to keep growing both professionally and personally, and promotes people’s creativity and productivity.

We have amazing growth opportunities and a solid customized career plan for you. We also offer relocation support and remote work, with the option to come to our AMAZING offices, where you will be given the best equipment to work with.

 

Our Data journey

We started our data journey just 3 years ago and it has already been an amazing ride! We’ve built a passionate and vibrant team, organized in a hub & spoke model, with roles such as data scientists, data engineers, and analytics engineers.

Our long-term vision is to move towards a Data Mesh model, where we will split people into producer and consumer domains.

For this reason we’re building a modern data stack, with technologies like Airflow, DBT, Tableau, and other viz tools, on top of a Kappa Architecture powered by Debezium as CDC and Kafka. All tools run within the Kubernetes paradigm and with infrastructure as code.

Working on the AWS cloud, we have an S3 Data Lake with several layers and a Data Warehouse in Redshift feeding Tableau dashboards and our first Metabase visualizations.
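As a purely illustrative sketch (not SeQura's actual code), this is roughly how a stack like the one above is often wired together: an Airflow DAG that triggers DBT runs and tests against the warehouse. The DAG name, paths, and schedule below are assumptions.

```python
# Illustrative sketch only: a minimal Airflow DAG that runs DBT models
# against a warehouse such as Redshift. Project paths, DAG id, and
# schedule are assumptions, not SeQura's actual configuration.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_transformations",  # hypothetical DAG name
    start_date=datetime(2022, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Build all DBT models; DBT reads warehouse credentials from profiles.yml
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
    )

    # Run DBT tests only after the models have been built
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
    )

    dbt_run >> dbt_test
```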

 

What your day-to-day would look like... 🧐

You will help implement the new Modern Data Infrastructure, working closely with the DevOps team.
You will incorporate a Data Governance system into the data infrastructure and ensure GDPR compliance.
You will work closely with Domain Analytics Engineers to understand their needs and provide them with the most useful tools for analytical purposes.
You will develop complex and efficient pipelines to transform raw data sources into powerful, reliable components of our data lake.
You will optimize and improve existing features or data processes for performance and stability.
You will coach analysts and data scientists on software engineering best practices.

 

You’ll be a good fit if you have… 🙂

Demonstrable experience (5+ years) working as a Data Engineer on AWS cloud or similar.
Strong SQL knowledge. You can write complex queries and optimize them for performance, scalability and ease of maintenance.
Fluent programming skills in Python.
3+ years of working experience with cloud providers, especially AWS.
Experience with task orchestration tools (e.g. Airflow, Luigi).
Familiarity with modern Data Warehouses such as Redshift, Snowflake, or BigQuery.
Experience developing transformation jobs with a SQL-based transformation framework like DBT or a distributed computing framework like Apache Spark.
Familiarity with software development best practices and their application to analytics (version control, testing, CI/CD, automation).
Excellent interpersonal skills, including written and verbal communication, for working with Analytics Engineers and Data Scientists.
English proficiency.


It would be nice if you also have... 🤭

Fluent spoken and written Spanish.
Experience in the following technologies:
Familiarity with infrastructure and automation tools (Terraform, Kubernetes, or similar).
Great Expectations.
Experience with building stream-processing systems, using solutions such as Kafka and Spark Streaming.
Data modeling: snowflake or star schemas.
Git / GitHub Actions
Defining Data Governance systems

 

You’re going to be a match if you are... 🤗

Agile: You thrive in a fast-paced environment and can deliver quality work quickly and with grace.
Growth Mindset: eager to create new processes, build new teams, and foster interdisciplinary exchange.
Proactive, accountable, analytical, and pragmatic.
Someone with high attention to detail and a strong sense of product ownership.
A team player.
A good communicator.
Motivated to learn and grow.
Passionate about driving change and efficiency with partners from different disciplines.

 

Source: sequra.recruitee.com

Employment type

Full-time, 100% remote.

Location

Only for people located in Europe.

SeQura

Published 28/09/2022
