Grid Dynamics — Job Openings

  • Type: Outsource
  • Employees: 1001-5000
  • Founded: 2006
  • Offices: Kyiv, Lviv, Kharkiv, Dnipro, Yerevan, Chisinau, Belgrade, Gdansk, Krakow, Warszawa, Wroclaw, Dallas
  • Domains: E-commerce / Marketplace, Energy, Healthcare / MedTech / LifeScience, Industry / Manufacturing, Insurance, Machine Learning / Big Data, Media / Entertainment, Mobile, Retail, Software Development & Hi-Tech, Telecom / Communications

Current job openings at the company

Experience: 7+ years · Senior · Full-time · English: not important · Test task included · Office: Kyiv, Lviv, Kharkiv, Dnipro, Odesa
12.03.2025
  • Apache Spark
  • Presto
  • Hive
  • Apache Flink
  • Apache Beam
  • AWS
  • Kubernetes
  • Java
  • Python
  • Ray
  • ML
  • AI

We are looking for engineers well versed in cloud data infrastructure to advise customers by building solutions with diverse technologies such as Spark, Flink, Kafka, Ray, and others. As part of this group, you will educate customers on the value proposition of the cloud data platform, address their challenges, and deliver solutions that go from proof of concept to production. You will be expected to partner effectively with cross-functional engineering teams and customers.
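
As an illustration of the kind of batch-processing work this role involves, here is a minimal PySpark sketch that aggregates events from object storage. The bucket paths, job name, and column names are hypothetical; the sketch only shows the general shape of a Spark batch job of the sort that might later run on EMR or Kubernetes, or be ported to Flink for streaming.

```python
# Minimal PySpark batch-job sketch (hypothetical paths and columns).
from pyspark.sql import SparkSession

def main() -> None:
    spark = (
        SparkSession.builder
        .appName("daily-event-aggregation")  # hypothetical job name
        .getOrCreate()
    )

    # Read raw events from object storage; the S3 path is an assumption.
    events = spark.read.parquet("s3a://example-bucket/events/date=2025-03-12/")

    # Count events per user and event type.
    daily_counts = (
        events
        .groupBy("user_id", "event_type")
        .count()
        .withColumnRenamed("count", "event_count")
    )

    # Write results back for downstream consumers (path is hypothetical).
    daily_counts.write.mode("overwrite").parquet(
        "s3a://example-bucket/aggregates/date=2025-03-12/"
    )

    spark.stop()

if __name__ == "__main__":
    main()
```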

Essential functions

  • Must have:
    • AWS experience and working knowledge of operating Kubernetes
    • Excellent communication skills
    • Big data experience with Spark or Flink
    • Experience with automation for testing, monitoring and CI/CD

Qualifications

  • 8+ YOE, with 5+ years of experience working with big data technologies and cloud environments
  • Hands-on experience with batch processing (Spark, Presto, Hive) or streaming (Flink, Beam, Spark Streaming)
  • Experience with AWS and knowledge of its ecosystem; experience scaling and operating Kubernetes
  • Excellent communication skills are a must, along with experience working directly with customers to explain how they would use the infrastructure to build solutions that meet their business goals
  • Proven ability to work in an agile environment and adapt flexibly to changes
  • Able to work independently and research possible solutions to unblock customers
  • Programming experience in Java or Python
  • Fast learner; experience with other common big data open-source technologies is a big plus
  • Knowledge of machine learning is a nice-to-have

Would be a plus

  • Experience working in a customer-facing or consulting role
  • Programming experience in Java and Python
  • Knowledge of Ray
  • Knowledge of machine learning and AI
Experience: 4+ years · Senior · Full-time · English: not important · Test task included · Office: Kyiv, Lviv, Kharkiv, Dnipro, Odesa
12.03.2025
  • Java
  • Spring Boot
  • React.js
  • AWS
  • EC2
  • AWS Lambda
  • RESTful API
  • CI/CD
  • Docker
  • Kubernetes
  • Angular
  • Vue.js
  • Kafka

We are seeking a highly skilled Full Stack Engineer with expertise in back-end development and front-end technologies. The ideal candidate will be responsible for designing, implementing, and deploying services with a Java Spring Boot back end and a React.js front end, leveraging their experience with AWS and Kafka. The candidate should be able to solve complex problems, take ownership of the end-to-end development lifecycle, from code to release to monitoring, and bring a DevOps mindset. Our software engineering teams embrace our “you build it, you run it” approach and work across the complete technology stack, ranging from ReactJS / HTML / CSS to Java / TypeScript / NodeJS, AWS, CI/CD, and Infrastructure as Code (IaC).

Essential functions

  • Back-End Development: Develop robust and scalable back-end services using Java Spring Boot.
  • Front-End Integration: Develop the front end of the service using React.js.
  • Cloud Infrastructure: Deploy and manage the service on AWS, ensuring high availability, scalability, and performance.
  • Continuous Integration & Deployment: Implement CI/CD pipelines to automate model deployment and updates.
  • Monitoring & Optimization: Monitor model performance and make necessary adjustments to ensure accuracy and efficiency.
  • Collaboration: Work closely with product managers, data scientists, and other engineers to deliver end-to-end solutions.

Qualifications

  • Education: Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
  • Experience:
    • 4+ years of experience in software engineering.
    • Strong proficiency in Java Spring Boot for back-end development.
    • Solid experience with React.js for front-end development.
    • Hands-on experience with AWS services (e.g., EC2, Lambda).
  • Skills:
    • Proficient in Java Spring Boot and React.js.
    • Experience with RESTful APIs and microservices architecture.
    • Knowledge of DevOps practices, including CI/CD and containerization (Docker/Kubernetes).
    • Excellent problem-solving and analytical skills.

Would be a plus

  • Familiarity with other front-end frameworks like Angular or Vue.js.
  • Experience with Kafka for real-time data streaming and processing
Experience: not important · Senior · Full-time · English: Intermediate / B1 · Test task included · Office: Kyiv, Lviv, Kharkiv, Dnipro, Odesa
12.03.2025
  • GCP
  • GitLab
  • Linux
  • Bash
  • Python
  • CI/CD
  • Git
  • Kubernetes
  • Docker
  • Helm

The customer is planning to migrate its data to GitLab. Experienced engineers with migration expertise are needed to assist the current team.

Client description:

The client, along with Bloomingdale's, is one of the oldest and biggest department store chains in the USA. The client focuses on online retail, and the DevOps team is constantly improving and refining the delivery process. Engineers work on CI/CD pipelines based on Jenkins, with pipelines written in Groovy, communicate with the customer's Development and QA departments, and help them deliver application code (mostly written in Java) from GitLab to production. The DevOps team supports its own portal for deploying test environments, which is based on GCP and Kubernetes. The infrastructure, which changes right along with the code requirements, is monitored by Prometheus. The variety of tasks from different areas ensures that engineers not only deal with routine problems but also improve their skills and learn new technologies.
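
Since the stack combines Bash/Python scripting with Jenkins and Kubernetes, a small automation step might look like the sketch below: a Python helper (hypothetical deployment name and namespace) that waits for a Kubernetes rollout to finish and could be invoked from a Jenkins pipeline stage.

```python
# Sketch of a CI/CD helper: wait for a Kubernetes rollout to complete.
# The deployment name and namespace passed in are hypothetical placeholders.
import subprocess
import sys

def wait_for_rollout(deployment: str, namespace: str, timeout: str = "300s") -> int:
    """Block until the deployment rollout finishes or the timeout expires."""
    cmd = [
        "kubectl", "rollout", "status",
        f"deployment/{deployment}",
        "--namespace", namespace,
        f"--timeout={timeout}",
    ]
    result = subprocess.run(cmd, capture_output=True, text=True)
    # Forward kubectl's output so it shows up in the Jenkins console log.
    print(result.stdout)
    if result.returncode != 0:
        print(result.stderr, file=sys.stderr)
    return result.returncode

if __name__ == "__main__":
    # Example usage from a pipeline stage: python wait_rollout.py my-app test-env
    sys.exit(wait_for_rollout(sys.argv[1], sys.argv[2]))
```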

Details on tech stack:

  • Linux/*NIX administration
  • Bash/Python scripting
  • Jenkins
  • Kubernetes, Docker, Helm
  • Git or other version control systems
  • GCP
  • Chef/Ansible

Minimum requirements for the candidate:

  • Experience with migrations is a must (GCP, GitLab, etc.)
  • Linux/*NIX administration, Bash/Python scripting
  • Knowledge of CI/CD principles and best practices
  • Git or other version control systems
  • English – at least Intermediate
  • Kubernetes, Docker, Helm
Experience: not important · Senior · Full-time · English: not important · Test task included · Office: Kyiv, Lviv, Kharkiv, Dnipro, Odesa
12.03.2025
  • API
  • Agile
  • AWS
  • GCP
  • Docker
  • Kubernetes
  • Java
  • SQL
  • NoSQL
  • Redis
  • Cassandra
  • Voldemort

We are looking for a highly skilled Back End Java Engineer to join our online web services team, contributing to the development of scalable and efficient backend solutions.

Essential functions

  • Design, implement, and maintain scalable web applications.
  • Build and optimize backend APIs and frontend interfaces.
  • Work with cloud platforms (AWS, GCP) and containerization tools (Docker, Kubernetes).
  • Design and optimize databases (SQL and NoSQL), including large-scale technologies such as Redis, Cassandra, Voldemort, or similar.

Qualifications

  • Proven ability to design, implement, and maintain scalable web applications.
  • Experience building and optimizing backend APIs and frontend interfaces.
  • Proficiency in the Agile software development lifecycle.
  • Hands-on experience with cloud platforms (AWS, GCP) and containerization tools (Docker, Kubernetes).
  • Proficiency in Java for backend programming.
  • Familiarity with databases (SQL and NoSQL) and experience in database design and optimization.
  • Proven work experience with large-scale technologies such as Redis, Cassandra, Voldemort, or similar.
  • Availability to work with an overlap of up to 8 pm EET.
Experience: not important · Senior · Full-time · English: not important · Test task included · Office: Kyiv, Lviv, Kharkiv, Dnipro, Odesa
12.03.2025
  • Apache Spark
  • Scala
  • Hadoop
  • Kafka
  • Oracle
  • PostgreSQL
  • Teradata
  • Cassandra

We are building scalable data pipelines and infrastructure to generate reports based on massive datasets. Our team is responsible for designing, developing, and validating jobs using the latest versions of Scala and Apache Spark, ensuring accurate statistical results for our stakeholders.
This is a distributed team environment, offering an exciting opportunity to collaborate with top Big Data engineers across Europe and overseas.
While this role primarily focuses on Big Data engineering, experience with CI/CD and DevOps practices is a strong advantage, as infrastructure-related tasks will also be part of the job.

Essential functions

  • Design and Develop Scalable Data Pipelines
  • Implement and Validate Big Data Solutions
  • Integrate and Manage Infrastructure
  • Collaborate in a Distributed Environment

Qualifications

  • Strong expertise in Spark and Scala
  • Hands-on experience with Hadoop
  • Proficiency in processing and computation frameworks: Kafka, Spark
  • Experience with database engines: Oracle, PostgreSQL, Teradata, Cassandra
  • Understanding of distributed computing technologies, approaches, and patterns

Would be a plus

  • Experience with Data Lakes, Data Warehousing, or analytics systems
Experience: not important · Senior · Full-time · English: not important · Test task included · Office: Kyiv, Lviv, Kharkiv, Dnipro, Odesa
12.03.2025
  • AWS services
  • AWS Glue
  • Athena
  • EMR
  • EC2
  • IAM
  • MWAA
  • Python
  • PySpark
  • Django
  • Great Expectations
  • Soda

The client is the largest pan-European online car market, with around 1.5 million listings and more than 43,000 car dealer partners, offering inspiring solutions and empowering services. We amaze our customers by delivering real value.

Details on tech stack:

  • Expertise in AWS services, especially Glue, Athena, EMR, EC2, IAM, MWAA
  • Proficiency in Python and PySpark

Key requirements for the candidate:

  • AWS Services: Expertise in AWS services, especially Glue, S3, Athena, EMR, EC2, and MWAA.
  • Programming Languages: Proficiency in Python, PySpark, SQL, and/or Scala.
  • Big Data Technologies: Hands-on experience with Spark, Trino, and Presto
  • Data Platforms: Experience in building data platforms, not just using them

Qualifications

  • Expertise in AWS services, especially Glue, Athena, EMR, EC2, IAM, MWAA
  • Proficiency in Python and PySpark
  • Experience in building data platforms, not just using them
  • Proficiency in data modeling techniques and best practices
  • Experience in implementing data contracts
  • Experience in applying data governance policies
  • Experience with data quality frameworks (Great Expectations, Soda); a minimal hand-rolled sketch of such checks follows this list
  • Familiarity with the data mesh architecture and its principles
  • Django experience
  • Important: Strong Python knowledge.
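
The data-contract and data-quality points above can be illustrated with a small, framework-free sketch. In practice a framework such as Great Expectations or Soda would typically be used; the hand-rolled PySpark checks below (hypothetical table, path, and column names) only show the underlying idea of validating data against an agreed contract before publishing it.

```python
# Hand-rolled data-quality checks in PySpark (illustrative only; real projects
# would usually rely on Great Expectations or Soda). Names are hypothetical.
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F

def check_contract(listings: DataFrame) -> list[str]:
    """Return a list of contract violations for a car-listings dataset."""
    failures = []

    # Contract rule 1: the primary key must be present and unique.
    total = listings.count()
    if listings.filter(F.col("listing_id").isNull()).count() > 0:
        failures.append("listing_id contains nulls")
    if listings.select("listing_id").distinct().count() != total:
        failures.append("listing_id is not unique")

    # Contract rule 2: the price must be positive.
    if listings.filter(F.col("price_eur") <= 0).count() > 0:
        failures.append("price_eur has non-positive values")

    return failures

if __name__ == "__main__":
    spark = SparkSession.builder.appName("contract-check").getOrCreate()
    df = spark.read.parquet("s3a://example-bucket/listings/")  # hypothetical path
    problems = check_contract(df)
    if problems:
        raise SystemExit("Data contract violated: " + "; ".join(problems))
```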
Experience: 3+ years · Senior · Full-time · English: not important · Test task included · Office: Serbia, Belgrade, Novi Sad
09.12.2024
  • AWS
  • AWS Glue
  • Amazon S3
  • Athena
  • EMR
  • EC2
  • MWAA
  • Python
  • PySpark
  • SQL
  • Scala
  • Apache Spark
  • Trino
  • Presto
  • ETL
  • Great Expectations
  • Soda

Grid Dynamics, a global software services company driving enterprise-level digital transformation solutions for Fortune 1000 corporations, is looking for a Senior Data Engineer.

Essential functions

  • Design and optimize data ingestion systems to ensure a consistent flow of high-quality data into the platform, creating a solid foundation for developing data products and supporting comprehensive KPI tracking and analysis.
  • Demonstrate expertise in Infrastructure as Code (IaC), utilizing tools like Terraform or CloudFormation to automate infrastructure deployment and management (a minimal IaC sketch follows this list).
  • Translate business requirements into robust technical architecture, including designing physical schema and logical data models.
  • Engage with stakeholders to understand their needs, providing technical guidance for current and future data platform projects.
  • Analyze user interaction with the data platform, focusing on patterns of use to identify areas for improvement and optimize user engagement over time.
  • Implement and maintain data governance frameworks, emphasizing automated processes for data quality checks, compliance adherence, and secure data handling, while collaborating with engineering teams to integrate governance protocols into data pipelines and platform architecture.
  • Participate actively in all Data Platform Engineering team meetings and knowledge-sharing sessions, contributing to team learning and process improvement.
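
The IaC point above names Terraform and CloudFormation; to keep the illustrations in Python, the sketch below uses the AWS CDK instead, as a hedged example of the same declare-and-synthesize idea. The stack and bucket identifiers are hypothetical, and the snippet assumes aws-cdk-lib (v2) and constructs are installed.

```python
# Minimal Infrastructure-as-Code sketch with the AWS CDK for Python
# (the role mentions Terraform/CloudFormation; CDK is used here only to keep
# the example in Python). Stack and construct names are hypothetical.
from aws_cdk import App, Stack, aws_s3 as s3
from constructs import Construct

class DataPlatformStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        # Versioned landing bucket for raw data ingestion.
        s3.Bucket(self, "RawDataBucket", versioned=True)

app = App()
DataPlatformStack(app, "DataPlatformDev")
app.synth()  # emits a CloudFormation template for deployment
```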

Qualifications

  • AWS Services: Expertise in AWS services, especially Glue, S3, Athena, EMR, EC2, and MWAA.
  • Programming Languages: Proficiency in Python, PySpark, SQL, and/or Scala.
  • Big Data Technologies: Hands-on experience with Spark, Trino, and Presto
  • Data Platforms: Experience in building data platforms, not just using them. (MUST HAVE)

Would be a plus

  • Data Engineering: 3+ years of work experience as a Data Engineer in a cloud-based environment, with a focus on AWS.
  • ETL and Data Modeling: Advanced understanding of ETL processes, data modeling techniques, and best practices.
  • Data Governance and Quality: Experience in applying data governance policies, implementing data contracts, and using data quality frameworks like Great Expectations and Soda.
  • Data Architecture: Strong understanding of data architecture to enable ML and data analytics.
  • Data Mesh: Familiarity with data mesh architecture and principles, with an appreciation for decentralized data management and shared data ownership.
Experience: 5+ years · Senior · Full-time · English: not important · Test task included · Office: Gdansk, Krakow, Warszawa, Wroclaw
17.04.2024
  • Machine learning
  • LLM
  • Python
  • JAX
  • TensorFlow
  • PyTorch
  • Java
  • Scala

We are looking for an ML Engineer to join the Artificial Intelligence and Machine Learning team in the HEIS (Health and Essential Industry Solutions) domain, which focuses on applying cutting-edge machine learning techniques to revolutionize health and essential industry sectors. As a Machine Learning Engineer, you will collaborate with talented researchers and engineers to develop innovative solutions that enhance the product's impact in healthcare, essential services, and related industries.

Essential functions:

  • Build machine learning tooling to facilitate various phases of the ML lifecycle, from model training and data ETL to end-to-end model evaluation and deployment (a minimal training-loop sketch follows this list)
  • Work with technical and non-technical stakeholders to build solutions to align LLMs for specific use cases.
  • Deliver reusable and easy-to-use tooling to integrate with existing data and machine learning systems.
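
To make the tooling point above concrete, here is a minimal PyTorch training-loop sketch; the model, synthetic data, and hyperparameters are placeholders and only illustrate the kind of reusable training step such tooling would wrap and expose to researchers.

```python
# Minimal PyTorch training-loop sketch (placeholder model and synthetic data);
# real tooling would wrap steps like this behind a reusable interface.
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Synthetic regression data standing in for a real feature pipeline.
features = torch.randn(1024, 16)
targets = torch.randn(1024, 1)
loader = DataLoader(TensorDataset(features, targets), batch_size=64, shuffle=True)

model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(5):
    epoch_loss = 0.0
    for batch_x, batch_y in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(batch_x), batch_y)
        loss.backward()
        optimizer.step()
        epoch_loss += loss.item()
    print(f"epoch {epoch}: mean loss {epoch_loss / len(loader):.4f}")
```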

Qualifications:

  • Strong understanding of machine learning principles, especially in the context of LLMs.
  • 5+ years of proficiency in Python, including machine learning packages such as JAX/TensorFlow or PyTorch
  • Skills in Java/Scala (preferred)
  • Experience building scalable deep learning systems
  • Experience with large scale data infrastructure
  • Strong verbal and written communication skills, with the ability to work effectively across internal and external organizations and virtual teams.
  • BS/BA or equivalent degree in computer science or similar (preferred).

Benefits for Grid Dynamics employees

  • English Courses
  • Relocation assistance
  • Flexible working hours
  • Childcare for employees' children
  • Sports expenses reimbursement
  • Training reimbursement
  • Medical insurance
  • Educational programs and courses
