Proxet — vacancies

  • Outsource
  • 251-500
  • 2009
  • Kyiv, Chernivtsi, Wroclaw, Boston
  • Advertising / Marketing, Cloud technologies, E-commerce / Marketplace, FinTech, Healthcare / MedTech / LifeScience, Machine Learning / Big Data, Real Estate, Software Development & Hi-Tech, VR / XR

Current vacancies at the company

3+ years of experience · Middle, Senior · Full-time · Advanced / Fluent / C1 · Test task included · Remote · Ukraine, Poland
27.03.2025
More details
  • Python
  • SQL
  • Palantir Foundry
  • PostgreSQL
  • Redis
  • Snowflake
  • Azure
  • Databricks

We are seeking a skilled and adaptable Data Engineer who is passionate about data infrastructure and long-term career growth. This role offers an opportunity to build and maintain scalable data solutions while developing expertise in Palantir Foundry and other modern data tools. We value individuals who are excited to expand their technical capabilities over time, work on multiple accounts, and contribute to a dynamic and growing team.
You will play a pivotal role in transforming raw data from various sources into structured, high-quality data products that drive business decisions. The ideal candidate should be motivated to learn and grow within the organization, actively collaborating with experienced engineers to strengthen our data capabilities over time.

About the project

This project focuses on building a centralized data platform for a leading investment firm that supports data-driven decision-making for high-growth companies. Currently, data is sourced from multiple locations, including Excel files, third-party tools, and custom applications, managed within separate systems. This decentralized approach creates inefficiencies and introduces the potential for data inaccuracies.
The objective is to integrate these data sources into a single, unified platform that streamlines access and reduces manual errors. By transforming financial, legal, and operational data into structured data marts, the platform will enable advanced analytics and real-time visualization through BI tools on both web and mobile interfaces.

Skills & Experience

  • Bachelor’s degree in Computer Science, Software Engineering, or equivalent experience.
  • Minimum 3 years of experience in Python, SQL, and data engineering processes.
  • Experience with Palantir Foundry or a strong willingness to learn and develop expertise in it.
  • Proficiency in multiple database systems, such as PostgreSQL, Redis, and a data warehouse like Snowflake, including query optimization.
  • Hands-on experience with Microsoft Azure services.
  • Strong problem-solving skills and experience with data pipeline development.
  • Familiarity with testing methodologies (unit and integration testing).
  • Docker experience for containerized data applications.
  • Collaborative mindset, capable of working across multiple teams and adapting to new projects over time.
  • Fluent in English (written & verbal communication).
  • Curiosity and enthusiasm for finance-related domains (personal & corporate finance, investment concepts).

Nice to have

  • Experience with Databricks.
  • Experience with Snowflake.
  • Background in wealth management, investment analytics, or financial modeling.
  • Contributions to open-source projects or personal projects showcasing data engineering skills.

Responsibilities

  • Design and maintain scalable data pipelines to ingest, transform, and optimize data.
  • Collaborate with cross-functional teams (engineering, product, and business) to develop solutions that address key data challenges.
  • Support data governance, data quality, and security best practices.
  • Optimize data querying and processing for efficiency and cost-effectiveness.
  • Work with evolving technologies to ensure our data architecture remains modern and adaptable.
  • Contribute to a culture of learning and knowledge sharing, supporting newer team members in building their skills.
  • Grow into new roles within the company by expanding your technical expertise and working on diverse projects over time.
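As a language-neutral illustration of the pipeline work described above, here is a minimal Python sketch of one ingest-and-normalize step: raw rows with inconsistent keys and types (as they might arrive from Excel exports or third-party tools) are mapped onto a single typed schema. All field names and sample values are hypothetical, not taken from the actual project.

```python
from dataclasses import dataclass

# Hypothetical raw rows as they might arrive from spreadsheets or
# third-party exports: inconsistent keys, numbers stored as strings.
RAW_ROWS = [
    {"Company": "Acme", "revenue_usd": "1200000", "fy": "2024"},
    {"company": "globex", "Revenue": "950000.50", "fy": 2024},
]

@dataclass
class RevenueFact:
    """One row of a structured 'data mart'-style table."""
    company: str
    fiscal_year: int
    revenue_usd: float

def normalize(row: dict) -> RevenueFact:
    """Map inconsistent source keys onto one schema and coerce types."""
    company = (row.get("company") or row.get("Company", "")).strip().title()
    revenue = float(row.get("revenue_usd") or row.get("Revenue"))
    return RevenueFact(company=company,
                       fiscal_year=int(row["fy"]),
                       revenue_usd=revenue)

facts = [normalize(r) for r in RAW_ROWS]
```

In a real Foundry or Snowflake pipeline the same idea scales out: each transform takes loosely structured inputs and emits a well-typed, deduplicated dataset downstream consumers can trust.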
Apply
5+ years of experience · Senior · Full-time · English level: not important · Test task included · Remote, Hybrid · Ukraine
27.03.2025
More details
  • AWS
  • Snowflake
  • Salesforce
  • Workato
  • Microsoft Power BI
  • Python

We are seeking a Senior Data Engineer to lead the design and implementation of a robust data pipeline and warehouse architecture leveraging Snowflake on AWS. This role will focus on ingesting and transforming data primarily from Salesforce (SFDC) and potentially other marketing and sales systems, enabling advanced analytics and reporting capabilities. The candidate will play a key advisory role in defining and implementing best practices for data architecture, ingestion, transformation, and reporting.

About the project

Our client is a global real estate services company specializing in the management and development of commercial properties. Over the past several years, the organization has made significant strides in systematizing and standardizing its reporting infrastructure and capabilities. Due to the increased demand for reporting, the organization is seeking a dedicated team to expand capacity and free up existing resources.

Skills & Experience

  • 5+ years of experience in data architecture, data engineering, or related roles.
  • Proven expertise in designing and implementing data pipelines on AWS.
  • Hands-on experience with Snowflake (ingestion, transformation, and data modeling).
  • Strong understanding of Salesforce (SFDC) data structures and integrations.
  • Deep knowledge of data warehouse architectures, including Medallion architecture and data governance.
  • Good to know: Workato (or similar integration tools) and Power BI for dashboards and reporting.
  • Experience in data validation, cleansing, and optimization techniques.
  • Exceptional communication and stakeholder management skills.
  • Ability to work independently and deliver results in a fast-paced environment.

Responsibilities

  • Design and implement scalable data pipelines on AWS to ingest and transform data from Salesforce (SFDC) and other sources into Snowflake.
  • Integrate SFDC data using tools like Workato (or propose alternative solutions).
  • Provide advisory services on Snowflake architecture and implement best practices for ingestion, validation, cleansing, and transformation.
  • Develop the initial set of data products (analytics, dashboards, and reporting) in Power BI.
  • Guide the creation of a semantic layer and optimize data governance using Medallion architecture.
  • Ensure scalability, efficiency, and performance of the data infrastructure.
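The Medallion architecture mentioned above can be sketched in miniature: raw ("bronze") records are deduplicated and type-coerced into a "silver" layer, which is then aggregated into a "gold" metric ready for a dashboard. The Python below is a hedged illustration with hypothetical Salesforce-like field names, not the project's actual Snowflake implementation.

```python
# Bronze: raw SFDC-like opportunity records (hypothetical field names).
bronze = [
    {"Id": "006A", "Amount": "1000", "StageName": "Closed Won"},
    {"Id": "006B", "Amount": None, "StageName": "Closed Lost"},
    {"Id": "006A", "Amount": "1000", "StageName": "Closed Won"},  # duplicate
]

def to_silver(rows):
    """Silver layer: deduplicate by Id and coerce Amount to a number."""
    seen, out = set(), []
    for r in rows:
        if r["Id"] in seen:
            continue
        seen.add(r["Id"])
        out.append({**r, "Amount": float(r["Amount"] or 0)})
    return out

def to_gold(rows):
    """Gold layer: an aggregate ready for a BI dashboard."""
    won = sum(r["Amount"] for r in rows if r["StageName"] == "Closed Won")
    return {"won_revenue": won}

silver = to_silver(bronze)
gold = to_gold(silver)  # {'won_revenue': 1000.0}
```

In Snowflake the same layering is typically expressed as staged tables or views, with each layer adding validation and structure on top of the previous one.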
Apply
$4000–6000
5+ years of experience · Senior · Full-time · Upper-Intermediate / B2 · Test task included · Office · Kyiv
Referral bonus: $1000
10.05.2024
More details
  • Java
  • NoSQL
  • AWS
  • Agile
  • Scala

About the role:

Are you a Java Developer looking for new challenges? How about working with a high-profile client whose systems serve millions of QPS? If you have experience developing high-performance distributed systems, this is the perfect opportunity to work on a new digital marketing management platform with a world-renowned client.

About the project:

Our client is a leading streaming service based in the United States. With millions of users worldwide, its devices provide easy access to free TV, live news, sports, movies, and more. The client also runs an advertising business and licenses its hardware and software to other companies.

Skills & Experience:

  • Background in computer science or similar quantitative field;
  • 5+ years of professional software development experience;
  • Expert knowledge of Core Java;
  • Experience developing high-scale and high-performance distributed systems;
  • Good understanding of algorithms, data structures, performance optimization techniques, object-oriented programming, multi-threading and real-time programming;
  • Product-focused mindset;
  • Team player with strong interpersonal skills;
  • English – Upper-intermediate or above.

Will be a plus:

  • Experience with cache optimization, distributed caches, and NoSQL databases;
  • Experience with Big Data and AWS services;
  • Experience in the advertising domain is a big plus.
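Cache optimization comes up in the list above; the role itself is Java-centric, but the core idea of an LRU cache with thread-safe access can be sketched compactly in Python (the Java analogue would be a `LinkedHashMap` in access order behind a lock, or a library such as Caffeine). This is an illustrative sketch, not project code.

```python
import threading
from collections import OrderedDict

class LRUCache:
    """Minimal thread-safe least-recently-used cache."""

    def __init__(self, capacity: int):
        self._capacity = capacity
        self._data: OrderedDict = OrderedDict()
        self._lock = threading.Lock()

    def get(self, key):
        with self._lock:
            if key not in self._data:
                return None
            self._data.move_to_end(key)  # mark as most recently used
            return self._data[key]

    def put(self, key, value):
        with self._lock:
            self._data[key] = value
            self._data.move_to_end(key)
            if len(self._data) > self._capacity:
                self._data.popitem(last=False)  # evict least recently used

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")     # "a" becomes most recently used
cache.put("c", 3)  # capacity exceeded: evicts "b"
```

At millions of QPS the same eviction idea applies, but the lock would be replaced by sharding or lock-free structures, and the cache itself would usually be distributed.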

Responsibilities:

  • Work with a highly skilled engineering team in all phases of the Agile development process from design to deployment;
  • Design, develop, and maintain high-scale, high-performance real-time applications;
  • Work with quality assurance, release engineering and product management to deliver quality software;
  • Identify, design, and implement improvements to the current architecture. This may include: internal process improvements, automating manual processes, optimizing data delivery, reducing cost, re-designing infrastructure for greater reliability, etc;
  • Take initiative in improving the development process and working atmosphere; be proactive in suggesting new visions and approaches to platform development, and anticipate problems and propose solutions before issues arise;
  • Deliver constant value back to the business in a highly agile team approaching near-continuous deployment.
Apply

Benefits for Proxet employees

  • English Courses
  • Team buildings
  • Parental leave
  • Flexible working hours
  • Psychotherapist support
  • Sports expense compensation
  • Medical insurance
  • Laptop provided
  • Paid sick leave
  • Educational programs and courses

Follow us on Telegram so you don't miss announcements of new vacancies.