Principal Data Architect
Ottometric Inc.
Novi Sad, Serbia
Our global team tackles the challenges of bringing safe ADAS and autonomous automotive systems to market for our customers.
As a Principal Data Architect, you will be the primary visionary for our global data strategy. You will tackle the "unsolved" problems of autonomous vehicle data: how to efficiently store, index, and query petabytes of high-dimensional, multi-modal sensor data.
You will lead the transition of our data infrastructure into a state-of-the-art Open Lakehouse architecture, leveraging Apache Iceberg and the Hadoop ecosystem to create a deterministic, high-performance environment for ML research and safety-critical validation.
This role requires working from our Serbian office for the first two years, with the option of moving to our US office afterward.
Responsibilities
- Lead the design of a data lakehouse that supports the requirements of ADAS/AV, including 4D spatial-temporal querying and multi-modal data fusion.
- Develop custom partitioning schemes, Z-ordering, and hidden-partitioning strategies tailored for LiDAR, radar, and video metadata.
- Solve challenges around data consistency, deterministic "replay" of vehicle logs, and massive-scale data lineage.
- Develop algorithms for data deduplication and "intelligent tiering," ensuring that rare "edge-case" driving data is preserved while optimizing the cost-to-performance ratio of the petabyte-scale lake.
- Partner with ML teams to ensure the data architecture supports emerging paradigms like Foundation Models and End-to-End Autonomous Driving architectures.
Must Have
- PhD in Computer Science, Distributed Systems, Database Systems, or a related quantitative field.
- 5+ years of experience in data systems, with a significant track record of designing large-scale distributed architectures.
- Deep, "under-the-hood" knowledge of Apache Iceberg (specification and implementation) and the Hadoop ecosystem (HDFS, Spark, Trino/Presto).
- Evidence of contributions to the field, such as publications in top-tier conferences (e.g., SIGMOD, VLDB, ICDE, OSDI) or a history of significant contributions to major open-source data projects.
- Expert-level understanding of query optimization, file format internals (Parquet/Avro), and the trade-offs of distributed consensus protocols.
Nice to Have
- Automotive Safety Standards: Understanding of data integrity requirements for ISO 26262 or SOTIF (Safety of the Intended Functionality).
- Geospatial Mastery: Experience with H3, S2, or other spatial indexing systems for high-frequency GPS and trajectory data.
- Cloud Economics: Proven ability to manage the financial architecture of massive cloud deployments (AWS/Azure/GCP).
What's great about the job?
- A great team of smart people in a friendly, open culture
- No micromanagement, no cumbersome tools, no rigid working hours
- No time wasted on enterprise processes; real responsibilities and autonomy
- Exposure to a range of business domains and customers
- Real responsibilities and challenges in a fast-evolving company
What We Offer
Every employee has the chance to see the impact of their work and make a real contribution to the success of the company.