**Experienced Full Stack Data Entry Specialist – Cloud-Based Data Pipelines and ETL Process Management**

Posted 2026-05-06
Remote (USA) · Full-time · Immediate start

At arenaflex, we're on a mission to revolutionize the retail industry through innovative technology and data-driven insights. As a key member of our Sponsored Search Data group, you'll play a vital role in designing, implementing, and maintaining data pipelines, datasets, and ETL processes for arenaflex's Sponsored Search Advertising platform. If you're passionate about data engineering, cloud computing, and collaboration, we want to hear from you!

**About arenaflex Worldwide Tech**

Imagine working in a dynamic environment where one line of code can make a significant impact on millions of people. That's what we do at arenaflex Worldwide Tech. We're a team of talented professionals, including programmers, data scientists, network security experts, and operations specialists, who are shaping the future of retail. Our mission is to empower individuals and organizations through innovative technology and data-driven insights. We're human-driven and tech-engaged, and we're committed to developing our team members in the skills that will shape the future.

**Key Responsibilities**

As a Full Stack Data Entry Specialist, you'll be responsible for:

  • Designing, implementing, and maintaining robust, scalable data pipelines for ingesting, transforming, and storing large volumes of data to support advertising on arenaflex.com and its subsidiaries.
  • Working on cloud platforms like Azure and Google Cloud for data storage, processing, and analysis.
  • Developing and deploying large-scale, containerized applications using Docker and Kubernetes on public clouds like Google Cloud (GCP) and Microsoft Azure.
  • Collaborating with scrum teams, QA, Product, Program Management, and Partner Operations as part of cross-functional project development teams.
  • Participating in 24/7 on-call rotations to investigate production issues across cross-functional teams.
  • Ensuring data quality, accessibility, and reliability for sponsor analysis and reporting on the self-service platform.

**Essential Qualifications**

  • Bachelor's degree in Computer Science, Information Technology, or a related field.
  • 3+ years of experience in data engineering, cloud computing, and data analysis.
  • Strong understanding of data engineering concepts, dataset design, ETL processes, and data mining.
  • Proficiency with data technologies, including SQL, Python, Spark, Scala, Hadoop, and related tools and frameworks.
  • Experience with ETL tools like Apache NiFi.
  • Solid skills working with relational databases (e.g., Azure SQL) and NoSQL databases (e.g., Cassandra).

**Preferred Qualifications**

  • Master's degree in Computer Science, Information Technology, or a related field.
  • 5+ years of experience in data engineering, cloud computing, and data analysis.
  • Experience with real-time message processing using Apache Kafka.
  • Experience designing and implementing data models for efficient storage and retrieval.
  • A growth-oriented mindset, a willingness to raise the technical bar, and an eye for opportunities to improve existing processes, tools, and frameworks for greater scale and efficiency.
  • Experience with Docker and Kubernetes, with distributed computing services such as Google Cloud (GCP) and Microsoft Azure, and with distributed storage frameworks such as Hive and Hadoop.

**Skills and Competencies**

  • Strong analytical and problem-solving skills.
  • Excellent communication and collaboration skills, with the ability to present to both technical and non-technical audiences.
  • A growth-oriented mindset and a willingness to learn and adapt to new technologies and processes.
  • Strong attention to detail and ability to work in a fast-paced environment.
  • Ability to work independently and as part of a team.

**Career Growth Opportunities and Learning Benefits**

At arenaflex, we're committed to developing our team members in the skills that will shape the future. We offer:

  • Opportunities for career growth and advancement in a dynamic and innovative environment.
  • Access to cutting-edge technologies and tools.
  • Opportunities for professional development and training.

**Work Environment and Company Culture**

At arenaflex, we're committed to creating a positive and inclusive work environment that values diversity, equity, and inclusion. We offer:

  • A dynamic, collaborative, and supportive team environment.
  • Flexible work arrangements, including remote work options.

**Compensation, Perks, and Benefits**

At arenaflex, we offer a competitive compensation package, including:

  • A salary range of $80,000 - $120,000 per year, depending on experience.
  • A comprehensive benefits package, including health insurance, retirement savings, and paid time off.

**How to Apply**

If you're passionate about data engineering, cloud computing, and collaboration, and you're looking for a challenging and rewarding career opportunity, we want to hear from you! Please submit your resume and a cover letter explaining why you're the ideal candidate for this role.
