OpenCredo

WHO WE'RE LOOKING FOR

OpenCredo (OC) is a UK-based software development consultancy helping clients achieve more by leveraging modern technology and delivery approaches. We are a community of passionate technologists who thrive on delivering pragmatic solutions to our clients' most complex challenges. Curious, tenacious, but always sensitive to our clients' context, we are not afraid to speak our minds to help steer our clients towards understanding and achieving their key goals.

We are looking for a hands-on, senior-level data engineer with experience tackling a variety of data-centric problems and challenges. You understand the benefits of event streaming and the cases for ETL and batch processing, and have experience bringing these approaches together in a coherent solution. You will have worked with modern data technologies, including RDBMS as well as various NoSQL solutions, ideally operating at scale. You are excited by the prospect of data meshes and their potential to change the business and data landscape.

REASONS TO WORK HERE

We’ll give you…

  • A highly competitive basic salary
  • 5% matched contributory pension
  • Private Health Insurance
  • Life Insurance
  • 25 days’ holiday plus public holidays (plus an extra day for each year of service)
  • Childcare vouchers
  • Cycle to work scheme
  • A high spec laptop (of course!)


Need more reasons? Here are a few more...

  • Work with some of the most exciting new technologies
  • Spark off co-workers who’ll challenge your thinking and help you to achieve your potential
  • Deal openly and honestly with customers
  • Benefit from a transparent environment including regular company meetings where we discuss anything and everything
  • Have exceptional opportunities as a speaker, blogger and contributor to open source projects. We have some great connections in the wider technology community that we encourage our team to make the most of!
  • Work alongside senior leaders who understand and value passionate technologists
  • Enjoy coming to work! We’re a friendly, sociable bunch who genuinely support each other and have a lot of fun.

REQUIREMENTS

What we’re looking for:

  • Data Expert: You are comfortable articulating and explaining key data concepts, as well as diving into the details and the nitty-gritty where required. You are confident in your ability to deliver and grow as a technical data expert.
  • Data Pipelines & Lifecycle: You are comfortable designing and building data pipelines, and have a solid appreciation of how all phases of the data lifecycle, from ingestion through cleansing, transformation and analysis, fit together.
  • A Problem Solver with a Can-Do Attitude: You can be relied on as the person who gets stuck in and makes things happen.
  • Distributed Systems Experience: You have developed and worked with big data architectures. You are aware of the fallacies of distributed computing and how these impact and relate to complex data platforms and storage solutions. You know what the CAP theorem is.
  • A Skilled Technologist: You have a background in programming and creating data-centric solutions using code. You are an accomplished programmer in one or more programming languages.
  • Innovation & Continuous Learning: You enjoy and actively seek to learn about new technologies and techniques in the Data and AI/ML space.

Experience with as many of these as possible:

  • Real-time Streaming: Design & development of real-time data streaming solutions following modern industry practices, using technologies such as Apache Kafka, Flink, Pulsar, Spark and Beam
  • Cloud Vendor Solutions: Design & Development of data pipelines or solutions within one or more cloud providers (AWS, GCP, Azure) utilising a mixture of open source as well as vendor-specific data offerings.
  • Event-Driven Solutions: Hands-on architecting, design and development of event-driven systems, with appropriate use of techniques such as event sourcing, CQRS and domain-driven design
  • Data Modelling & Data Engineering: A solid understanding of data modelling and experience with data engineering tools and platforms such as Kafka, Spark, various RDBMS databases and Hadoop
  • ML & AI: Integration and productionisation (MLOps) of ML models
  • Graph Technologies: Understanding of connected data problems and challenges, and of the technologies used to manage them (e.g. graph databases such as Neo4j)
  • Data at Scale: Design/implementation of large multi-terabyte relational models or a big data cluster rollout

Aptitude and attitude are what make a great OCer, and interviewing with OC is not a box-ticking exercise in technologies. As a rough guideline, however, we think you are likely to have the following on your CV:

  • 5+ years of data-centric development experience
  • Strong communication skills; ability to articulate your ideas and thoughts with others
  • Demonstrable experience with modern data platforms, practices and approaches, such as streaming, event sourcing and data pipelines
  • Deep expertise in at least one modern data technology or solution stack

Apply Now


We are radical Problem Solvers

Adept at meeting a plethora of technology challenges, we modernise legacy systems, implement brand new solutions, and harness contemporary technologies. 

We can guide and strategise, architect and code, collaborate and empower. Being highly adaptable means we are able to work across diverse environments, industries and contexts. Ever pragmatic, we carefully curate our teams with our clients’ needs firmly in mind. 

We endeavour, at all times, to make the complex simple.