JAMES DICKSON
DEVOPS ENGINEER
MSci. Physics
DETAILS
ADDRESS: Newry, United Kingdom
PHONE: N/A
EMAIL: jamespd1550@gmail.com
LINKEDIN: https://linkedin.com/in/jdickson1/
GITHUB: https://github.com/jdickson1992
BLOG: https://jdickson.dev/
SKILLS
Languages: q/kdb+, Python
EDA: Apache Kafka
Cloud Engineering: AWS
Containerisation: Docker
Orchestration: Kubernetes, Swarm, EKS, ECS, Helm
IaC: Terraform, CloudFormation
Configuration Management: Ansible
CI/CD Pipelines: GitHub Actions, Jenkins
Observability: Prometheus, Loki, Grafana, Alertmanager
Linux: shell, awk
Version Control: Git
Code / Security Scanning: SonarQube, Snyk, Trivy, Defender for Cloud
PROFILE
A DevOps Engineer with a strong coding background, skilled in architecting and deploying
scalable, secure cloud infrastructures. Proven expertise in containerising applications within
Kubernetes environments, improving deployment cadence through advanced CI/CD pipelines,
and configuring immutable infrastructure using Infrastructure as Code (IaC). I am a well-rounded
and professional team player, always striving to build strong relationships with those around me
and to make a positive impact within any organisation I work with.
EMPLOYMENT HISTORY
DevOps Engineer, MRP
Oct 2020 — Mar 2024
United Kingdom
Spearheaded the migration from a monolithic system to a cloud-native, containerised SaaS
platform, driving modernisation efforts to align with industry best practices and enhance
scalability and agility.
Engineered CI/CD pipelines using GitHub Actions for a multi-environment setup, automating build
and deployment processes to ensure rapid and reliable delivery of software.
Implemented Kubernetes for container orchestration, optimising resource utilisation and enabling
seamless scaling of applications to meet fluctuating demand.
Created Helm templates to streamline the deployment of applications to EKS clusters, ensuring
consistency and efficiency in the release process.
Leveraged Terraform and CloudFormation to design and provision infrastructure as code, enabling
the creation of immutable infrastructure and safer deployment processes.
Established a comprehensive monitoring solution utilising Prometheus, Loki, and Grafana stack,
providing real-time visibility into system performance and reliability.
Ensured secure credential management using Secrets Manager, maintaining compliance with
security best practices and regulatory requirements.
Integrated a variety of SAST, SCA and DAST security scanning tools into CI/CD pipelines,
implementing a shift-left approach to identify and remediate code quality issues early whilst also
reducing potential risks associated with third-party dependencies.
Operated and maintained self-managed Kafka clusters, providing scalable and reliable event
streaming with throughput reaching tens of millions of events to support critical business
operations and extensive data processing requirements.
Enhanced web-scraping efficiency by integrating Scrapy with Kafka, resulting in a ten-fold increase
in data throughput and processing capabilities.
Developed a custom microservice to ingest web traffic hits from a CloudFront application,
leveraging serverless architecture with AWS Lambda, enabling real-time data processing and
analysis.
Senior q/kdb+ Engineer, FD Technologies
Aug 2019 — Oct 2020
United Kingdom
Served as lead developer in the migration towards containerisation for a Tier 1 investment bank.
Spearheaded the modernisation of legacy infrastructure by implementing a next-generation
kdb+/Kafka trading platform in the North American region, delivering heightened scalability,
flexibility, and performance.
Transformed traditional tickerplant components into a Kafka cluster, facilitating the retrieval of
offsets and data by RDBs and other subscribers upon startup, thus enabling horizontal scalability
and independent subscriber placement on separate hosts.
Ensured compliance with a latency Service Level Agreement (SLA) of under 1 second. Upon
completion, the platform processed nearly 2 billion orders daily, offering low-latency,
high-performance access to sales and trading activity.
Orchestrated the migration of q processes responsible for transforming FIX order messages into
q binary format to a hybrid cloud platform, enabling horizontal scalability and elasticity.
Implemented parser processes deployed to isolated spot servers across multiple data centres,
automatically spawning additional instances during high-volume trading periods, such as market
open and close, to meet latency SLAs.
q/kdb+ Engineer, FD Technologies, Centre of Excellence
Nov 2015 — Aug 2019
United Kingdom
Served as a kdb+ developer in an offshore centre of excellence for the Fixed Income Division of
a Tier 1 Investment Bank.
Developed, tested, and deployed new client request functionality for FX and FX Options plants,
collaborating with desk quants to maintain Fixed Income kdb+ analytic libraries and stored
procedures.
Provided on-call support for production issues, ensuring prompt resolution and minimal
disruption to operations.
Acted as a matrix manager, liaising with five internal teams to oversee on-the-ground activities,
including maintaining workspace professionalism, conducting 1:1 colleague engagements, and
managing monthly timesheets.
EDUCATION
MSci. Physics, First Class Honours (1:1)
Sep 2011 — Jul 2015
Queen's University Belfast
Completed a broad range of modules including Quantum Mechanics, Astrophysics, General
Relativity, Atomic and Condensed Matter Physics, Solid State Physics, Electromagnetism,
Nuclear and Fundamental Physics, and Biological Physics.
Certified Machine Learning Engineer, Udacity, Mar 2018 — Jan 2019
Cloud DevOps Engineer, Udacity, Jul 2020 — Dec 2020
OPEN-SOURCE PROJECTS
Kafka Scrapy Connect, PyPI
A custom library that integrates Scrapy with Kafka to improve broad crawl
throughput - https://pypi.org/project/kafka-scrapy-connect/
Kafka KRaft, GitHub Action
Builds a lightweight Kafka cluster for seamless integration testing in CI/CD pipelines -
https://github.com/marketplace/actions/kafka-kraft
References available upon request.