OpenCredo (OC) is a UK-based software development consultancy helping clients achieve more by leveraging modern technology and delivery approaches. We are a community of passionate technologists who thrive on delivering pragmatic solutions to our clients' most complex challenges. Curious and tenacious, but always sensitive to our clients' context, we are not afraid to speak our minds to help steer our clients towards understanding and achieving their key goals.

We are looking for a hands-on, senior-level data engineer with experience tackling a variety of data-centric problems and challenges. You understand the benefits of event streaming and the cases for ETL and batch processing, and have experience bringing these approaches together in a coherent solution. You will have worked with modern data technologies, including RDBMS as well as various NoSQL solutions, ideally operating at scale. You are excited by the prospect of data meshes and their potential to change the business and data landscape.
What we’re looking for:
● Data Expert: You are comfortable articulating and explaining key data concepts, as well as diving into the details and the nitty-gritty where required. You are confident in your ability to deliver and grow as a technical data expert.
● Data Pipelines & Lifecycle: You are comfortable designing and building data pipelines, and have a solid appreciation of how all phases of the data lifecycle — from ingestion through cleansing, transformation, and analysis — fit together.
● A Problem Solver with a Can Do Attitude: You can be relied on as the person who gets stuck in and makes things happen.
● Distributed Systems Experience: You have developed and worked with big data architectures. You are aware of the fallacies of distributed computing and how they affect complex data platforms and storage solutions. You know what the CAP theorem is.
● A Skilled Technologist: You have a background in programming and creating data-centric solutions using code. You are an accomplished programmer in one or more programming languages.
● Innovation & Continuous Learning: You enjoy and actively seek to learn about new technologies and techniques in the data and AI/ML space.
We’ll give you…
Need more reasons? Here are a few more...
Experience with as many of these as possible:
● Real-time Streaming: Design and development of real-time data streaming solutions, leveraging modern industry practices and technologies such as Apache Kafka, Flink, Pulsar, Spark, and Beam
● Cloud Vendor Solutions: Design and development of data pipelines or solutions within one or more cloud providers (AWS, GCP, Azure), utilising a mixture of open-source and vendor-specific data offerings
● Event-Driven Solutions: Hands-on architecting, design, and development of event-driven systems, with appropriate use of techniques such as event sourcing, CQRS, and domain-driven design
● Data Modelling & Data Engineering: A solid understanding of data modelling, and experience with data engineering tools and platforms such as Kafka, Spark, various relational databases, and Hadoop
● ML & AI: Integration and productionisation (MLOps) of ML models
● Graph Technologies: Understanding of connected-data problems and the technologies used to manage them (e.g. graph databases such as Neo4j)
● Data at Scale: Design and implementation of large, multi-terabyte relational models or a big data cluster rollout
● Alternative Data Technologies: Understanding or experience of one or more of: data lakehouses, data warehouses, data lakes
Adept at meeting a wide range of technology challenges, we modernise legacy systems, implement brand-new solutions, and harness contemporary technologies.
We can guide and strategise, architect and code, collaborate and empower. Being highly adaptable means we are able to work across diverse environments, industries and contexts. Ever pragmatic, we carefully curate our teams with our clients’ needs firmly in mind.
We endeavour, at all times, to make the complex simple.