TRIONIKA

TRIONIKA is a product-based traffic-generation company that builds and promotes its own projects and generates high-quality traffic in the EdTech and FinTech verticals through a range of online channels. The company specializes in monetization and traffic acquisition, in particular SEO, media buying, and partnership programs.
  • Product / Startup
  • 101-250
  • 2010
  • Kyiv
  • Advertising / Marketing, Software Development & Hi-Tech

Employee benefits

  • English Courses
  • Work-life balance
  • Flexible working hours
  • Tuition reimbursement
  • Medical insurance
  • Paid sick leave
  • Educational programs and courses
  • Regular salary reviews

TRIONIKA job openings

Unfortunately, we did not find any current openings at this company. You can browse openings at other companies

3+ years of experience · Middle · Full-time · Intermediate / B1 · Test task included · Remote · Ukraine, Poland
15.10.2024
  • ETL
  • DataBricks
  • Apache Spark
  • Python
  • Azure Data Factory
  • Azure Synapse
  • SQL
  • Microsoft Azure
  • AWS
  • Git

Infopulse, part of Tietoevry Create, is inviting a talented professional to join our growing team as a Data Engineer/ETL Developer. Our customer is one of the Big Four companies providing audit, tax, consulting, and financial advisory services.

Areas of Responsibility

  • Design, develop, and maintain ETL processes to support data integration and reporting requirements
  • Work with Databricks, PySpark (Scala or Python) to create scalable and efficient data pipelines
  • Utilize Azure Data Factory for orchestrating and automating data movement and transformation
  • Write, optimize, and troubleshoot complex SQL queries for data extraction, transformation, and loading
  • Collaborate with cross-functional teams to understand data requirements and deliver high-quality data solutions
  • Monitor and ensure the performance, reliability, and scalability of ETL processes
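The responsibilities above all revolve around the extract-transform-load pattern. As a hedged, stdlib-only sketch of that shape (csv and sqlite3 stand in for the Databricks/PySpark and Azure Data Factory stack named in the posting; all table, column, and sample names are invented for illustration):

```python
import csv
import io
import sqlite3

# Extract: read raw rows from a CSV source (here an in-memory sample;
# in the posting's stack this would be a Spark read from cloud storage).
RAW = "order_id,amount\n1,10.50\n2,not-a-number\n3,4.25\n"

def extract(source: str) -> list[dict]:
    return list(csv.DictReader(io.StringIO(source)))

# Transform: cast types and drop rows that fail validation.
def transform(rows: list[dict]) -> list[tuple[int, float]]:
    clean = []
    for row in rows:
        try:
            clean.append((int(row["order_id"]), float(row["amount"])))
        except ValueError:
            continue  # a real pipeline would route bad rows to a quarantine table
    return clean

# Load: write the clean rows into a target table and report the row count.
def load(rows: list[tuple[int, float]], conn: sqlite3.Connection) -> int:
    conn.execute("CREATE TABLE IF NOT EXISTS orders (order_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(RAW)), conn)
print(loaded)  # 2 valid rows survive; the malformed one is dropped
```

The three-function split mirrors how the posting's tools divide the work: orchestration (Azure Data Factory) sequences the stages, while the compute engine (Databricks/PySpark) executes the transforms at scale.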

Qualifications

  • 3+ years of experience in ETL development or data engineering roles
  • Strong experience with Databricks and Spark / Python
  • Proficiency in Azure Data Factory and Azure Synapse
  • Advanced SQL skills, including query optimization and performance tuning
  • Solid understanding of data warehousing concepts and best practices
  • Strong problem-solving skills and attention to detail
  • Excellent communication skills with the ability to work collaboratively in a team environment

Will be an advantage

  • Experience with cloud data platforms (e.g., Azure, AWS)
  • Familiarity with data governance and security practices
  • Experience with version control systems like Git
3+ years of experience · Senior · Full-time · Upper-Intermediate / B2 · Test task included · Remote · Ukraine, Poland
15.10.2024
  • DataBricks
  • Databricks Unity Catalog
  • PySpark
  • Scala
  • Python
  • SQL
  • Azure Data Factory
  • ETL

Infopulse, a part of Tietoevry Create, is looking for a skilled and experienced Senior Databricks Developer to join our growing team. Our customer is one of the Big Four companies providing audit, tax, consulting, and financial advisory services.

Areas of Responsibility

  • Lead the migration of data assets and workloads from legacy Databricks environments to Databricks Unity Catalog
  • Design, develop, and maintain scalable ETL processes using Databricks, PySpark (Scala or Python), and other relevant technologies
  • Ensure seamless data integration and compliance with data governance standards during the migration process
  • Optimize and troubleshoot complex SQL queries for data extraction, transformation, and loading within the new Databricks UC framework
  • Collaborate with cross-functional teams to understand migration requirements and deliver high-quality data solutions
  • Monitor the performance, reliability, and scalability of the new Databricks Unity Catalog environment post-migration
  • Provide administrative support and configuration management for the Databricks platform, ensuring best practices in security and data governance

Qualifications

  • 3+ years of experience in Databricks development, including significant experience with Databricks administration
  • Proven track record of successfully migrating data environments to Databricks Unity Catalog or similar platforms
  • Strong experience with PySpark (Scala or Python) for data pipeline creation and optimization
  • Proficiency in SQL, with advanced skills in query optimization and performance tuning
  • Familiarity with Azure Data Factory and other cloud-based ETL tools
  • Solid understanding of data warehousing concepts, data governance, and best practices in a cloud environment
  • Strong problem-solving abilities and attention to detail, especially in migration scenarios
  • Excellent communication skills, with the ability to work collaboratively with technical and non-technical stakeholders
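The query-optimization bullets above follow a workflow that carries over to any SQL engine: inspect the plan, add an index, confirm the plan changed. A minimal sketch, using sqlite3 from the Python standard library as a stand-in for the Databricks SQL engine (all table and index names are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, ts TEXT, payload TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(i % 100, f"2024-10-{i % 28 + 1:02d}", "x") for i in range(1000)],
)

def plan(sql: str) -> str:
    # Join the engine's query-plan detail column into one readable string.
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT COUNT(*) FROM events WHERE user_id = 42"

before = plan(query)  # full table scan: no usable index exists yet
conn.execute("CREATE INDEX idx_events_user ON events (user_id)")
after = plan(query)   # the planner now searches via the index instead

print(before)
print(after)
```

In Databricks the equivalent step would be `EXPLAIN` on a Spark SQL query plus partitioning or Z-ordering choices rather than a B-tree index, but the inspect-change-verify loop is the same.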
5+ years of experience · Senior · Full-time · Upper-Intermediate / B2 · Test task included · Remote · Ukraine, Sofia, Varna
15.10.2024
  • Python
  • R
  • Scala
  • TensorFlow
  • PyTorch
  • scikit-learn
  • Generative AI
  • GPT
  • BERT
  • DALL-E
  • Hugging Face Transformers
  • OpenAI
  • CI/CD
  • MLflow
  • Kubeflow
  • TFX
  • SQL
  • Apache Spark
  • Hadoop
  • AWS
  • Microsoft Azure
  • GCP
  • Docker
  • Kubernetes
  • Kafka
  • Apache Flink

Infopulse, part of Tietoevry Create, is inviting a talented professional to join our growing team as a Senior Data Scientist.

Areas of Responsibility

  • Solution Design and Architecture
    • Design and architect end-to-end data science solutions that align with business objectives and technical requirements.
    • Develop scalable and maintainable data science workflows, including data ingestion, preprocessing, modeling, and deployment.
    • Ensure the integration of data science solutions with existing systems and platforms.
  • Solution Implementation and Deployment
    • Oversee and participate in implementing data science solutions, including developing and deploying machine learning models.
    • Ensure solutions are robust, scalable, and perform well in production environments.
    • Conduct code reviews and ensure adherence to coding standards and best practices.
  • Performance Optimization and Troubleshooting
    • Optimize the performance of data science solutions, including model accuracy, computational efficiency, and resource utilization. Improve efficiency by creating repeatable and reusable modules.
    • Troubleshoot and resolve technical issues related to data science solutions.
  • Technical Leadership
    • Provide technical leadership and guidance to data scientists, data engineers, and other stakeholders.
    • Stay updated with the latest advancements in data science, machine learning, and AI technologies, and apply them to improve solution designs.
  • Data Strategy and Governance
    • Define data strategy and governance frameworks to ensure data quality, security, and compliance.
    • Establish best practices for data management, including data acquisition, storage, and processing.
  • Collaboration and Communication
    • Work closely with business stakeholders to understand their needs and translate them into technical requirements.
    • Communicate complex technical concepts to non-technical stakeholders clearly and concisely.
    • Foster a collaborative environment to facilitate knowledge sharing and innovation.

Qualifications

  • 5+ years of experience in the field.
  • Strong programming skills in languages such as Python, R, or Scala.
  • Machine Learning Frameworks and Libraries.
    • Expertise in machine learning frameworks and libraries (e.g., TensorFlow, PyTorch, scikit-learn).
  • Generative AI (GenAI).
    • Experience with Generative AI models (e.g., GPT, BERT, DALL-E) and frameworks (e.g., Hugging Face Transformers, OpenAI GPT-3).
    • Knowledge of fine-tuning GenAI models for specific tasks and industries.
    • Ability to design and implement GenAI solutions for various applications such as text generation, image generation, and conversational AI.
    • Familiarity with techniques for training and deploying GenAI models.
    • Experience in leveraging GenAI for tasks such as automated content creation and data augmentation.
  • MLOps.
    • Proficiency in MLOps practices, including model deployment, monitoring, and continuous integration/continuous deployment (CI/CD) for machine learning models.
    • Experience with MLOps tools and platforms (e.g., MLflow, Kubeflow, TFX).
  • Data Manipulation and Analysis.
    • Proficiency in data manipulation and analysis using SQL and data processing tools (e.g., Apache Spark, Hadoop).
  • Cloud Platforms.
    • Experience with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data science and machine learning services.
    • Understanding of cloud infrastructure and services for scalable AI deployments.
  • Containerization and Orchestration.
    • Proficiency in containerization technologies (e.g., Docker) and orchestration tools (e.g., Kubernetes) for deploying and managing data science solutions.
  • Big Data Technologies.
    • Experience with big data technologies (e.g., Apache Kafka, Apache Flink) for handling and processing large datasets.
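Underneath all the frameworks listed above sits the same training loop: define a loss, compute its gradient, update the parameters, check convergence. A hedged, plain-Python miniature of that loop (no TensorFlow/PyTorch; a toy one-parameter linear model y = w·x on invented sample data stands in for a real model):

```python
# Toy dataset: roughly y = 2x with a little noise.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]

def mse(w: float) -> float:
    # Mean squared error of the one-parameter model y = w * x.
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

w = 0.0     # initial parameter
lr = 0.01   # learning rate
for _ in range(500):
    # Analytic gradient of the MSE with respect to w.
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad  # gradient-descent update

print(round(w, 2))  # converges close to the true slope of ~2
```

Frameworks like PyTorch automate the gradient (autograd) and scale the update across tensors and GPUs, and MLOps tooling like MLflow wraps this loop with experiment tracking and deployment, but the core iteration is the one shown here.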

Will be an advantage

  • A degree in Data Science and/or Mathematics.
4+ years of experience · Middle, Senior · Full-time · Upper-Intermediate / B2 · Test task included · Remote · Ukraine
15.10.2024
  • AWS
  • Windows Server
  • MSSQL
  • Python
  • Shell
  • PowerShell
  • Kubernetes
  • Docker
  • Terraform
  • Ansible
  • Datadog

Infopulse, part of Tietoevry Create, is looking for a Cloud Engineer to join our Cloud Operations team for a remote collaboration.
You will collaborate closely with our dedicated Cloud Engineers, Cloud Architect, Security Engineer, and DevOps Engineers. The Regional Cloud Operations (RCO) team in EMEA manages several environments with different setups and complexity.
As a Cloud Engineer on our team, your scope will be broad, and your tasks will vary over time with team deliverables and workload. You will take part in long-term initiatives and feature improvements across our environments, putting standards and theory into practice with a clear understanding of what needs to be improved and implemented based on our deliverables.

Areas of Responsibility

  • Production and targeted migrations
  • Key components upgrade (e.g., VPN solutions, AD controllers where uptime and availability are critical)
  • Detailed performance optimization
  • Cloud infrastructure improvements
  • Environment and components modernization
  • Day-to-day monitoring
  • L2-L3 support including customer on/off boarding and troubleshooting

Qualifications

  • Strong AWS knowledge and clear understanding of main services and security best practices
  • Practical experience of Windows Server administration (AD, MS Remote Desktop services etc.)
  • Experience in L2-L3 support and customer support
  • Strong troubleshooting skills
  • Proficiency in network administration
  • Experience with production monitoring
  • MS SQL administration skills
  • Monitoring & logging tools experience
  • An Upper Intermediate level of English

Will be an advantage

  • AWS certification
  • Scripting experience, preferably with Python 3, Shell or PowerShell
  • Knowledge of Kubernetes/Docker
  • Knowledge of Terraform/Ansible
  • Experience with Datadog
2+ years of experience · Junior, Middle · Full-time · Intermediate / B1 · Test task included · Remote · Ukraine
15.10.2024
  • .NET Core
  • C#
  • NHibernate
  • Web Services
  • OData
  • ASP.NET Core
  • JavaScript
  • JQuery
  • HTML
  • CSS
  • Microsoft SQL Server
  • Oracle
  • Agile
  • Kendo UI

Infopulse, part of Tietoevry Create, is inviting a talented professional to join our growing team as a Junior/Middle .NET Developer to contribute to the development of one of the leading treasury management products in Europe and EEM.
Our customer is the French division of a British multinational enterprise software company headquartered in Newcastle, the world’s third-largest supplier of enterprise resource planning software (behind Oracle and SAP), and the largest supplier to small businesses with more than 6 million customers worldwide.

Areas of Responsibility

  • Designing and developing new functional modules
  • Developing Platform and Core features
  • Doing Unit Testing
  • Building quality code adhering to industry standards of coding practices
  • Providing technical maintenance
  • Supporting customer migrations
  • Coordinating technical activities and documentation throughout the project

Qualifications

  • Knowledge of .NET Core and C#
  • Practical experience with NHibernate, Web Services, and OData
  • Excellent knowledge of ASP.NET Core, JavaScript, jQuery, and HTML/CSS
  • Knowledge of SQL Server and Oracle
  • Knowledge of Agile methodology
  • At least an Intermediate level of English

Will be an advantage

  • Knowledge of Kendo UI
  • Knowledge of French
  • Experience in Cash/Treasury management

Personal skills

  • Structured and open-minded personality
  • Ability to communicate thoughts in a clear way
  • Ability to listen to and accept others' opinions
  • Commitment to software development and IT

Company details

Company: TRIONIKA
Address: 36D Yevhena Konovaltsia St., Kyiv, 01032
Legal name: TRIONIKA UKRAINE LLC
Phone: +38 099 267 17 12
E-mail: info@trionika.com
Official TRIONIKA website: trionika.com

Phone and company contacts

  • 36D Yevhena Konovaltsia St., Kyiv;
  • +38 099 267 17 12;
  • hr@trionika.com


* Information taken from the official TRIONIKA website trionika.com and other open sources.

"TRIONIKA"
  • TRIONIKA
  • Отзывов: 1
  • Рейтинг: 4 з 5
  • Оценок: 3