Apply Now
Please apply only once to the Planet Expat program.
  • Madrid, Spain

  • Starting in June 2017

  • 2200 EUR/month (based on profile)
  • 12 to 48 months

Company Info

This opportunity is part of the Planet Expat Work Abroad Program, which aims to maximize your chances of joining one of the most innovative companies in Europe and Latin America. Here is more information about the hiring company:

Planet Expat provides career-boosting opportunities in some of the most innovative & dynamic companies in Latin America, Europe, the US and Asia. We are now hiring for the company below:

"We are a tech startup dedicated to providing meaningful data about the hospitality industry. We gather big data from major short-term rental platforms such as Airbnb and from online travel agencies to generate analytics for our customers: occupancy rates, price monitoring, etc.

Our clients include large companies from the hospitality, real estate and financial sectors.

We are an international team of 12 people, mainly based in Spain and California." 


Responsibilities

  • Day-to-day maintenance of Ubuntu servers and storage services on AWS.

  • Constant monitoring of alerts and events to flag potential issues in the system.

  • Scaling infrastructure to match the systems' current needs.

  • Research and design of a distributed database infrastructure based on MongoDB, striking the right balance between cost and performance.

  • Room to grow into Node.js development or other scripting languages if interested.


Requirements

  • Bachelor's or master's degree in computer engineering, information systems, or a similar field.

  • 2+ years of proven professional experience administering large MongoDB databases.

  • 2+ years of proven professional experience with backend platforms such as Node.js, Python, or PHP.

  • Experience managing systems on large IaaS platforms such as Amazon Web Services.

  • Experience writing Bash scripts for Linux systems.

Bonus points for:

  • Experience with Node.js.

  • Experience building crawling systems that gather data from public websites.

  • Experience with data mining platforms such as Hadoop, Spark, or Elasticsearch.