ETL Engineer-Hadoop

Cambridge, MA
Overview:
Exciting and unique opportunity to join an early-stage, mega-funded Hadoop technology startup that has already taken a market leadership position in the fast-growing unstructured data revolution. If you are passionate about working with massive data sets and have experience developing ETLs to extract large quantities of data from a variety of sources, including but not limited to enterprise applications and web properties, we want to hear from you!

Responsibilities:
  • Solid experience with data flow architectures and I/O for large volumes of data.
  • Hands-on programming, including rapid prototyping, Agile development methodologies, sound programming practices, and testing methodologies and tools.
  • Building enterprise-grade applications that perform well and remain resilient in the face of system failures.
  • Developing in object-oriented programming languages such as Java, Python, and/or Ruby.
  • Debugging large-scale distributed systems.
  • Helping to build the Hadoop Ecosystem and participating in building company culture in accordance with the firm's core values.
Experience:
Experience with Hadoop or large-scale Data Analytics, Web Analytics, or Visualization in a developer/user role. Additionally:
  • Experience with NoSQL technologies, including HBase, in a developer/user role.
  • Experience as an Open Source contributor is highly desirable (please include pointers to your contributions).
  • Ability to work in an agile and collaborative setting within an engineering team.
  • Strong oral and written communication skills.
  • Over 5 years of experience developing ETLs or other forms of mission-critical systems integration.
  • A bachelor's degree in Computer Science or equivalent experience.
  • Last but not least, candidates should be excited to help build the Hadoop Ecosystem and to participate in building company culture as expressed by the firm's core values.
