APERIO is a fast-growing, well-funded data integrity startup looking for the right people to join our growing team. APERIO has built an entirely new critical foundation for validating industrial sensor data at scale. We turn large volumes of dirty, unreliable data into usable, trustworthy data, unlocking the Industry 4.0 revolution. In doing so, we help customers drive profitability and sustainability goals while mitigating risk in their industrial operations.
APERIO’s customer-facing engineering group is passionate about people, technology, and everything in between. We strive to provide technical leadership to our customers, define deployment architectures, use cutting-edge technologies to overcome complex technical challenges while analyzing product gaps, and work closely with our product and development groups to deliver the best solutions to our worldwide customers. We’re looking for a bright, solution-oriented, customer-facing DevOps Delivery Engineer to join the team and build, run, manage, and monitor cloud-based and customer environments.
Roles and Responsibilities
You’re a DevOps Engineer with 2+ years of experience in live production systems who is passionate about automating things to make life simple. You’re also great with Linux system internals and shell scripting (bash/zsh/etc.), have experience running production systems in the cloud, and have spent time managing scalable web applications. You have strong troubleshooting skills in both cloud and physical environments.
As a DevOps Delivery Engineer, you will:
- Build multi-data-center architectures spanning dozens to hundreds of servers.
- Quickly identify and resolve product and infrastructure issues.
- Get the chance to work with virtually all aspects of our rapidly developing solution.
- Drive the success of our systems’ advancements, cooperating with the development, product, and sales groups.
Bonus points if you have:
- Experience with at least one programming language (Python, Go, etc.).
- Experience with configuration management/deployment tools, containers, and Kubernetes.
- Strong networking foundations (TCP/IP, communication protocols, etc.).
- Experience with SQL/NoSQL technologies such as MySQL, Elasticsearch, etc.
- Experience with RabbitMQ.
- Experience with monitoring tools.
Our mission is to unlock the imagination of builders. We’re the data integrity layer of mission-critical systems all around the world. We empower our employees to think like owners and innovate like entrepreneurs. As a hard-working, collaborative team, we demonstrate integrity in everything we do. We love what we do, and we want you to as well.
If you think you would be a good fit for our team, we want to hear from you. Please send us your details and CV below.