Data Infrastructure Engineer
Who we are:
Semios is a market leader in leveraging internet-of-things (IoT) and Big Data to improve the sustainability and profitability of specialty crops. Our sensors currently collect over 200 million data points per day. These hyper-granular measurements are combined with external data sources in our custom analytics platform to provide tree fruit growers with decision-making tools that empower them to minimize resources and mitigate risks.
We know our journey is only achievable by having a great team who shares ideas, tries new things and learns as we go.
Our innovative work has received several industry awards:
- THRIVE - Top 50 Leading AgTech (2018, 2019) – recognized as exemplifying some of the best in agriculture technology around the globe
- Global CleanTech Top 100 (2018, 2019) – identified as one of the companies best positioned to solve tomorrow’s clean technology challenges
- BC Export Awards - Clean Technology (2018)
- BC Technology Impact Awards - Most Promising Startup (2017)
Who You Are:
You are looking to make a difference; you want to know your work with Big Data has real-world benefits. You are also excited to make your mark by helping drive the future of the Semios Data Infrastructure & Data Platform.
What you will do:
As a Data Infrastructure Engineer, you will help design, implement and deploy data pipelines using cutting-edge tools and products. You will work jointly with our Data Science team to help build grower models and ensure their execution.
The Semios Data Infrastructure Team is currently embarking on a revamp of our Data Platform that puts a fully managed Data Warehouse at the center of our data ecosystem. We are committed to choosing a toolchain and suite of products we believe to be the future of Big Data collection, storage, and analysis. Successful candidates in this role must therefore be comfortable evaluating best-in-class tech and helping drive decisions in this rapidly growing space.
We want you to succeed, so you will need:
- Advanced SQL skills: the ability to write elegant queries, written for humans first and machines second.
- The ability to thrive both autonomously and in a team environment.
- Experience with at least one Data Warehouse (e.g., BigQuery, Redshift, Snowflake, or an on-premises solution).
- Excellent verbal & written communication skills: a talent for distilling complex ideas for different audiences.
- In-depth experience with Big Data: a proven track record of effective collection, storage, and access.
- Hands-on experience with provisioning and developing on cloud platforms, particularly AWS and GCP.
- Proven experience with workflow and scheduling tools like Airflow, Luigi, or Kubeflow.
- Excellent troubleshooting skills to rapidly identify and resolve issues.
Nice to have:
- Significant exposure to at least one relational database (Postgres, MySQL).
- Real-world experience with containers (Docker) & container management systems (Kubernetes).
- Fluency in Python, Node, or another imperative language, or the ability to learn one quickly and with enthusiasm.
- Advanced education in Big Data, whether through academia or certifications.
Why this is the opportunity for you:
- Sleep better knowing you're making the world a better place through more sustainable food production.
- Opportunity to solve meaningful and interesting real-life puzzles.
- Work with a team that values fun, laughter, and each other.
- Competitive salary, performance-based incentives, stock options, and flexible work hours.
- Tech-focused office location, convenient to transit and bike paths.
- Use the latest and best tech for the job.