
Capco

Senior DevOps Engineer (AWS) - Payments (She/He/They)

Posted 5 Hours Ago
Remote or Hybrid
Hiring Remotely in Poland
Senior level
The Senior DevOps Engineer will design and manage AWS infrastructures, automate processes, and build CI/CD pipelines while collaborating with development teams to implement DevOps practices.

CAPCO POLAND 

*We are looking for Poland-based candidates. 

At Capco Poland, we’re not just another consultancy — we’re the spark behind digital transformation in the financial world. As a global leader in technology and management consulting, we thrive on helping clients tackle the toughest challenges across banking, payments, capital markets, wealth, and asset management.

The Project:

Join a greenfield IT transformation in the fintech sector, working with Java 21/Spring Boot and a strong focus on software quality and craftsmanship practices. We are currently looking for a skilled, product-oriented DevOps Software Engineer to join one of our product-focused DevOps teams. It’s a great opportunity to help businesses of all shapes and sizes accelerate their growth journey - quickly, simply, and securely. We are the innovators at the heart of the payments technology industry, shaping how the world pays and gets paid. Our technology powers the growth of millions of businesses across five continents. And just as we help our customers accelerate their business, we are committed to helping our people accelerate their careers. Together, we shape evolution.

Our Goal:

Delivering a fantastic payment experience that customers love is what drives us forward. Our cloud-native acquiring platform processes billions of card transactions, including authorization processing and clearing & settlement, ensuring trust and reliability for our customers.

Role Overview:

We are looking for a DevOps Engineer with over 5 years of hands-on experience designing, implementing, and operating AWS cloud infrastructures, automating processes, and building secure, scalable CI/CD pipelines. A strong background in collaborating with development teams to enable DevOps and DevSecOps best practices in production environments is essential.

Responsibilities include:

  • Active participation and contribution in all phases of the product lifecycle, including discovery, delivery, and operations
  • Delivering committed objectives, adapting as needed to reach the goals you and your team have set

Key tech stack: AWS, Terraform (IaC), EKS, CI/CD, Python
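The stack above pairs Python with AWS, which in practice often means small boto3-based internal tools. As a hypothetical sketch (the function, tag names, and instance IDs are invented for illustration, not taken from the posting), here is a helper that scans the response shape returned by the EC2 `describe_instances` API for instances missing a required tag. The sample response is inlined so the logic can be shown without a live AWS call; in real tooling it would come from `boto3.client("ec2").describe_instances()`.

```python
# Hypothetical internal-tooling helper: find EC2 instances missing a tag.
# Operates on the response shape of EC2 describe_instances; in real use
# the dict would come from boto3.client("ec2").describe_instances().

def find_instances_missing_tag(response, required_tag):
    """Return IDs of instances that lack `required_tag`."""
    missing = []
    for reservation in response.get("Reservations", []):
        for instance in reservation.get("Instances", []):
            tag_keys = {t["Key"] for t in instance.get("Tags", [])}
            if required_tag not in tag_keys:
                missing.append(instance["InstanceId"])
    return missing

# Inlined sample standing in for a live describe_instances response.
sample = {
    "Reservations": [
        {"Instances": [
            {"InstanceId": "i-0aaa", "Tags": [{"Key": "env", "Value": "prod"}]},
            {"InstanceId": "i-0bbb", "Tags": []},
        ]}
    ]
}

print(find_instances_missing_tag(sample, "env"))  # -> ['i-0bbb']
```

A script like this would typically feed a tagging-compliance report or a CI gate rather than act on instances directly.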

Role Requirements:

  • Cloud Platform - AWS
    • 5+ years of hands-on experience with AWS.
    • Strong knowledge of core AWS services, including:
      • ECS, EKS, Lambda, EC2, S3, EBS, VPC, IAM, CloudWatch, CloudTrail
      • Route 53, API Gateway, EventBridge, SNS, SQS
      • RDS, Aurora PostgreSQL, DynamoDB
    • Experience with AWS Cognito:
      • User pools, identity pools, app clients, and authentication flows.
    • Solid understanding of AWS security best practices:
      • Least-privilege IAM, KMS, Secrets Manager, ACM certificates.
    • Experience with Route 53:
      • Public and private hosted zones, DNS management, and service integration.
    • Familiarity with multi-account architectures, AWS Landing Zones, and permission boundaries.
    • Proven ability to design and maintain secure, scalable, production-grade AWS infrastructures.
    • AWS Solutions Architect (Associate/Professional) certification or equivalent experience.
  • Infrastructure as Code - Terraform
    • Advanced experience with Terraform:
      • Reusable and versioned module design.
      • Remote state management using S3 and DynamoDB.
      • Multi-environment (dev, qa, sta, prod) IaC architectures.
    • Experience managing Terraform state, handling drift, and refactoring IaC.
    • Familiarity with:
      • Checkov, TFLint, terraform-docs for security and code quality.
    • Terraform Associate certification (preferred).
  • Configuration Management & Automation
    • Solid experience with Ansible:
      • Development of roles, playbooks, inventories, and reusable automation patterns.
      • Integration with CI/CD pipelines and hybrid environments.
  • Operating Systems & Scripting
    • Strong knowledge of Linux, especially RHEL-based systems.
    • Experience working with bastion hosts and secure SSH access.
    • Proficiency in Python:
      • Automation scripts, AWS integrations (boto3), internal tooling.
    • Experience using Node.js for scripting and CI/CD related tasks.
  • CI/CD & DevSecOps
    • Experience designing and maintaining end-to-end CI/CD pipelines.
    • Hands-on experience with pipelines for backend and frontend applications.
    • Tools and platforms:
      • Jenkins, GitLab CI, Azure DevOps, GitHub Actions
    • Integration of:
      • SonarQube for code quality.
      • Security scanning and policy-as-code tools (Checkov, Sentinel, etc.).
      • Automated testing and validation stages.
    • Experience with container registries:
      • Amazon ECR, JFrog Artifactory, Nexus.
  • Containers & Orchestration
    • Strong experience with Docker:
      • Image optimization, multi-stage builds, private registries.
    • Good working knowledge of Kubernetes, preferably EKS.
    • Familiarity with GitOps practices and tools:
      • ArgoCD, FluxCD (nice to have).
    • CKA / CKAD certifications are a plus.
  • Observability & Monitoring
    • Hands-on experience with Grafana and Prometheus for infrastructure and Kubernetes monitoring.
    • Experience with AWS CloudWatch (metrics, logs, alarms, dashboards).
    • Experience with Loki for centralized log aggregation.
    • Working experience with Datadog for infrastructure monitoring, APM, and alerting.
    • Experience with Splunk and Logz.io for log management and analysis.
    • Knowledge of alerting best practices, SLIs/SLOs, and incident troubleshooting.
  • Databases
    • Working knowledge of SQL and NoSQL databases:
      • PostgreSQL, MySQL, MongoDB.
    • Experience with AWS-managed databases:
      • RDS, Aurora PostgreSQL, DynamoDB.
    • Experience supporting database-backed applications in cloud environments.
  • Additional Skills
    • Proficiency with Git and common branching strategies.
    • Experience with monitoring, logging, and alerting solutions.
    • Understanding of immutable infrastructure and continuous delivery principles.
    • Strong troubleshooting skills across infrastructure, networking, and application layers.
    • Strong documentation skills and experience with knowledge transfer.
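The security requirements above call for least-privilege IAM. As a minimal sketch of what that means in practice (the bucket name and helper function are hypothetical, not part of the posting), here is a Python function that builds a read-only IAM policy document scoped to a single S3 bucket: list on the bucket, get on its objects, and nothing else.

```python
import json

def s3_read_only_policy(bucket_name):
    """Build a least-privilege IAM policy document granting read-only
    access to a single S3 bucket: ListBucket on the bucket itself,
    GetObject on its objects, and no other actions or resources."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": ["s3:ListBucket"],
                "Resource": f"arn:aws:s3:::{bucket_name}",
            },
            {
                "Effect": "Allow",
                "Action": ["s3:GetObject"],
                "Resource": f"arn:aws:s3:::{bucket_name}/*",
            },
        ],
    }

# The resulting JSON could be attached to a role via IAM or
# templated into a Terraform aws_iam_policy resource.
print(json.dumps(s3_read_only_policy("example-bucket"), indent=2))
```

Note the two statements: bucket-level actions (`s3:ListBucket`) take the bucket ARN, while object-level actions (`s3:GetObject`) take the `/*` object ARN, a distinction least-privilege policies have to get right.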
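The observability requirements mention SLIs/SLOs. A minimal sketch of the arithmetic behind an availability SLO and its error budget (the request counts and SLO target are illustrative, not from the posting):

```python
def availability_sli(good_events, total_events):
    """SLI: fraction of requests that succeeded."""
    return good_events / total_events

def error_budget_remaining(sli, slo_target):
    """Fraction of the error budget still unspent.
    Budget = 1 - slo_target; spent = 1 - sli; floored at zero
    once the budget is exhausted."""
    budget = 1.0 - slo_target
    spent = 1.0 - sli
    return max(0.0, (budget - spent) / budget)

# Illustrative numbers: 999,500 of 1,000,000 requests succeeded,
# measured against a 99.9% availability SLO.
sli = availability_sli(999_500, 1_000_000)   # 0.9995
print(error_budget_remaining(sli, 0.999))    # ~0.5, about half the budget left
```

Alerting on the burn rate of this budget, rather than on raw error counts, is the usual way SLI/SLO practice turns into actionable pages.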

We offer a flexible collaboration model based on a B2B contract, with the opportunity to work on diverse projects.

ONLINE RECRUITMENT PROCESS STEPS

  • Screening call with Recruiter
  • Technical Interview
  • Client Interview
  • Feedback / Offer

We have been informed of several recruitment scams targeting the public. We strongly advise you to verify identities before engaging in recruitment-related communication. All official Capco communication will be conducted via a Capco recruiter.

Top Skills

Ansible
Aurora PostgreSQL
AWS
AWS RDS
Azure DevOps
CI/CD
Docker
DynamoDB
EKS
GitHub Actions
GitLab CI
Grafana
Jenkins
Kubernetes
Prometheus
Python
Terraform
