Data and Integration Engineer

Job number: 1170
Date posted: Jan 04, 2023
Job category: BI / Big Data
Job benefits

  • Pleasant work environment
  • Learning and development
  • Employee benefits
  • Attractive compensation
  • Career guidance

About ELCA

We are ELCA, one of the largest Swiss IT companies, with over 2,000 experts. We are multicultural, with offices in Switzerland, Spain, France, Vietnam and Mauritius. As one of Switzerland’s best employers of 2022, we are proud to rank among the top 20 preferred companies in the “Internet, Telecommunication and IT” category. Since 1968, our team of engineers, business analysts, software architects, designers and consultants has provided tailor-made and standardized solutions to support the digital transformation of major public administrations and private companies in Switzerland. Our activity spans multiple fields of leading-edge technology such as AI, machine and deep learning, BI/BD, RPA, blockchain, IoT and cybersecurity.

ELCA is one of Switzerland’s biggest information technology companies.

Since the company was founded in 1968, we have offered our customers a single source for the complete spectrum of IT services including consulting, development, and operations.

Our team of 2,000 employees advises companies on the best use of modern information technologies, develops and implements efficient and stable solutions, and applies its know-how to ensure excellence in use.

ELCA Engineering is growing and our Business Intelligence and Big Data (BI/BD) business line is looking for a Data Engineer.

Your role

The BI/BD business line is responsible for providing technology platforms that drive our customers’ growth in analytics, data science, and data integration.

We are looking for a motivated Data Engineer, with experience in the implementation of complex distributed IT environments processing or integrating large amounts of data.

Ideally, you are an experienced professional with competencies either in big data technologies such as Apache Spark and MPP databases, or in data integration (APIM, integration processes, stream processing), using either PaaS tools in the cloud or one of the market leaders (Mulesoft, Software AG, WSO2…).

  • You will work with business and software engineering teams to design and build modern/agile integration platforms from the ground up, driven by concrete business use cases
  • You will participate in teams through the end-to-end project lifecycle, covering the initial conception, business requirements, software architecture, implementation, and flawless delivery, acting as an experienced professional and coach for more junior team members
  • You will work with business analysts and data scientists to understand and implement their use cases
  • You will support pre-sales on tender responses, proof-of-concept work, and the design of modern/agile data and integration platforms, both on premise and in the cloud
  • You will participate in group-wide thought leadership initiatives to advance our data and integration platform practice and sustain our technical excellence

Our offer

  • A dynamic, collaborative work environment with a highly motivated, multicultural team across our international sites
  • Attractive career prospects and personal development through training and coaching
  • The chance to make a difference in people’s lives by building innovative solutions
  • Various internal coding events (hackathons, brown-bag sessions); see our technical blog
  • Monthly after-works organized at each location
  • Good work-life balance (2 days per week from home)
  • Attractive pension fund with three employee contribution options
  • Premium worldwide coverage with Zurich insurance, fully paid by the company
  • Half-fare SBB travelcard
  • Mobile and internet discount program

Your profile

  • 4+ years of experience in designing and implementing large-scale integration platforms
  • Either expertise in data engineering (covering data collection, preparation/transformation, and serving data to end users, plus knowledge of data federation, data mesh, and data product concepts), or expertise in integration patterns and best practices such as synchronous vs. asynchronous communication, RESTful APIs, messaging, publish-subscribe, micro-integrations, and API management
  • Strong SQL expertise is required
  • Strong knowledge of DevOps and tooling for CI/CD pipelines, automation, infrastructure as code, automated testing, code quality
  • Experience with one of the hyperscaler clouds (AWS, GCP, Azure), or with Snowflake or Databricks; a certification is a plus
  • Good understanding of distributed systems, real-time vs. (micro-)batch processing and related design principles and patterns
  • Good communication skills, fluent in French or German, and in English
