
Careers

Are you as enthusiastic about innovation as we are? Our portfolio companies are hiring.
OCA Ventures: 59 companies, 207 jobs

Senior Big Data Engineer

Placer.ai

Data Science
Tel Aviv-Yafo, Israel
Posted on Dec 11, 2023

ABOUT PLACER.AI:

Placer.ai is a fast-growing big data startup led by seasoned executives and repeat entrepreneurs who are building the world's first "Google Analytics for the physical world." Placer.ai’s platform provides instant visibility into any property in the U.S., presenting accurate details about visitation patterns and demographic breakdowns of visitors. Placer.ai’s customers can see where visitors have been before, where they go afterwards, where they typically go for sports, entertainment, groceries, etc., and what their interests are. Placer.ai's A.I.-based SaaS platform replaces archaic solutions such as manual surveys, installed cameras, and other people-counting systems, creating a blue ocean market of more than $100B.

Placer.ai has grown 3x year-over-year for each of the past 3 years and counts more than 1,000 paying customers across a range of industries, including 2 of the world’s top-10 retailers, 2 of the top-10 CPG firms worldwide, one of the world’s top hospitality firms, 2 of the world’s top-10 commercial real estate (CRE) firms, and 2 of the world’s top multinational asset managers and hedge funds. Placer.ai recently raised $100M in Series C funding at a unicorn ($1B+) valuation.

SUMMARY:

As a Senior Data Engineer, working within Placer.ai's Foundation Entity Group, you will play a pivotal role in designing, developing, and maintaining the data infrastructure that powers our location analytics platform.

RESPONSIBILITIES:

  • Data Pipeline Architecture and Development: Design, build, and optimize robust and scalable data pipelines to process, transform, and integrate large volumes of data from various sources into our analytics platform (see the illustrative pipeline sketch after this list).
  • Data Quality Assurance: Implement data validation, cleansing, and enrichment techniques to ensure high-quality and consistent data across the platform.
  • Performance Optimization: Identify performance bottlenecks and optimize data processing and storage mechanisms to enhance overall system performance and reduce latency.
  • Cloud Infrastructure: Work extensively with cloud-based technologies (GCP) to design and manage scalable data infrastructure.
  • Collaboration: Collaborate with cross-functional teams including Data Analysts, Data Scientists, Product Managers, and Software Engineers to understand requirements and deliver solutions that meet business needs.
  • Data Governance: Implement and enforce data governance practices, ensuring compliance with relevant regulations and best practices related to data privacy and security.
  • Monitoring and Maintenance: Monitor the health and performance of data pipelines, troubleshoot issues, and ensure high availability of data infrastructure.
  • Mentorship: Provide technical guidance and mentorship to junior data engineers, fostering a culture of learning and growth within the team.
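
For illustration only, a minimal sketch of the kind of Spark-based pipeline this role involves, written in PySpark. The bucket paths, column names, and the daily-visits aggregation are hypothetical placeholders invented for this example, not Placer.ai's actual data model.

    # Illustrative sketch: ingest raw visit events, apply basic validation,
    # aggregate, and write curated Parquet. All names below are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("visits-etl-sketch").getOrCreate()

    # Ingest raw events (hypothetical Parquet source on GCS).
    raw = spark.read.parquet("gs://example-bucket/raw/visits/")

    # Data quality: drop records missing required fields and filter out
    # coordinates that are obviously invalid.
    clean = (
        raw.dropna(subset=["venue_id", "visit_ts", "device_id"])
           .filter(F.col("lat").between(-90, 90) & F.col("lon").between(-180, 180))
    )

    # Transform: daily unique-visitor counts per venue.
    daily_visits = (
        clean.withColumn("visit_date", F.to_date("visit_ts"))
             .groupBy("venue_id", "visit_date")
             .agg(F.countDistinct("device_id").alias("unique_visitors"))
    )

    # Load: write partitioned Parquet for downstream analytics.
    daily_visits.write.mode("overwrite").partitionBy("visit_date").parquet(
        "gs://example-bucket/curated/daily_visits/"
    )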

REQUIREMENTS:

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
  • 5+ years of professional experience in software development, with at least 3 years as a Data Engineer.
  • Spark expertise (mandatory): Strong proficiency in Apache Spark, including hands-on experience with building data processing applications and pipelines using Spark's core libraries.
  • PySpark/Scala (mandatory): Proficiency in either PySpark (Python API for Spark) or Scala for Spark development.
  • Data Engineering: Proven track record in designing and implementing ETL pipelines, data integration, and data transformation processes.
  • Cloud Platforms: Hands-on experience with cloud platforms such as AWS, GCP, or Azure.
  • SQL and Data Modeling: Solid understanding of SQL, relational databases, and data modeling.
  • Big Data Technologies: Familiarity with big data technologies beyond Spark, such as Hadoop ecosystem components, data serialization formats (Parquet, Avro), and distributed computing concepts.
  • Programming Languages: Proficiency in programming languages like Python, Java, or Scala.
  • ETL Tools and Orchestration: Familiarity with ETL tools and orchestration frameworks, such as Apache Airflow (see the orchestration sketch after this list).
  • Problem-Solving: Strong analytical and problem-solving skills.
  • Collaboration and Communication: Effective communication skills and collaboration within cross-functional teams.
  • Geospatial Domain (Preferred): Prior experience in the geospatial or location analytics domain is a plus.
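
For illustration only, a minimal Apache Airflow DAG of the kind referenced under "ETL Tools and Orchestration" above. The DAG id, schedule, and spark-submit command are hypothetical examples, not an actual Placer.ai workflow; the script it submits stands in for the hypothetical pipeline sketched earlier.

    # Illustrative sketch: a daily Airflow DAG that submits one Spark job.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="daily_visits_etl_sketch",
        start_date=datetime(2023, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        run_spark_job = BashOperator(
            task_id="run_spark_job",
            # Submits the hypothetical PySpark pipeline shown earlier;
            # {{ ds }} is Airflow's templated execution date.
            bash_command="spark-submit visits_etl.py --date {{ ds }}",
        )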

WHY JOIN PLACER.AI?

Join a rocket ship! We are pioneers of a new market that we are creating.

  • Take a central and critical role at Placer.ai
  • Work with, and learn from, top-notch talent
  • Competitive salary
  • Excellent benefits
  • Hybrid work model

NOTEWORTHY LINKS TO LEARN MORE ABOUT PLACER

Placer.ai is an equal opportunity employer, and we are committed to building a team culture that celebrates diversity and inclusion.

Placer.ai’s applicants are considered solely based on their qualifications, without regard to an applicant’s disability or need for accommodation. Any Placer.ai applicant who requires reasonable accommodations during the application process should contact Placer.ai’s Human Resources Department to make the need for an accommodation known.

#LI-AC1