Confluent provides the streaming platform that enables enterprises to maximize the value of data.

Consulting Engineer - Singapore
Job Description / Skills Required

Dubbed an "open-source unicorn" by Forbes, Confluent is the fastest-growing enterprise subscription company our investors have ever seen. And how are we growing so fast? By pioneering a new technology category with an event streaming platform, which enables companies to leverage their data as a continually updating stream of events, not as static snapshots. This innovation has led Sequoia Capital, Benchmark, and Index Ventures to recently invest a combined $125 million in our Series D financing. Our product has been adopted by Fortune 100 customers across all industries, and we’re being led by the best in the space—our founders were the original creators of Apache Kafka®. We’re looking for talented and amazing team players who want to accelerate our growth, while doing some of the best work of their careers. Join us as we build the next transformative technology platform!
Consulting Engineers drive customer success by helping them realize business value from the burgeoning flow of real-time data streams in their organizations. In this role you’ll interact directly with our customers to provide software development and operations expertise, leveraging deep knowledge of best practices in the use of Apache Kafka, the broader Confluent Platform, and complementary systems like Hadoop, Spark, Storm, relational databases, and various NoSQL databases.
Throughout all of these interactions, you’ll build strong relationships with customers, ensure exemplary delivery standards, and have a lot of fun building state-of-the-art streaming data infrastructure alongside colleagues who are widely recognized as leaders in this space.
Promoting Confluent and our amazing team to the community and wider public audience is something we invite all our employees to take part in. This can take the form of writing blog posts or speaking at meetups and well-known industry events about use cases and best practices, or can be as simple as releasing code.
While Confluent is headquartered in Palo Alto, you can work remotely from any location on the West Coast of the United States as long as you are able to travel to client engagements as needed.

A typical week at Confluent in this role may involve:

    • Preparing for an upcoming engagement, discussing the goals and expectations with the customer and preparing an agenda
    • Researching best practices or components required for the engagement
    • Delivering an engagement on-site, working with the customer’s architects and developers in a workshop environment
    • Producing and delivering the post-engagement report to the customer
    • Developing applications on the Confluent Platform
    • Deploying, augmenting, and upgrading Kafka clusters
    • Building tooling for other teams and the wider company
    • Testing performance and functionality of new components developed by Engineering
    • Writing or editing documentation and knowledge base articles
    • Honing your skills, building applications, or trying out new product features

Required skills and experience:

    • Deep experience building and operating in-production Big Data, stream processing, and/or enterprise data integration solutions using Apache Kafka
    • Experience operating Linux (configuring, tuning, and troubleshooting both Red Hat- and Debian-based distributions)
    • Experience with Java Virtual Machine (JVM) tuning and troubleshooting
    • Experience with distributed systems (Kafka, Hadoop, Cassandra, etc.)
    • Proficiency in Java
    • Excellent communication skills, with an ability to clearly and concisely explain tricky issues and complex solutions
    • Ability and willingness to travel up to 50% of the time to meet with customers
    • Bachelor’s degree in computer science, engineering, mathematics, or another quantitative field

Nice to have:

    • Experience using Amazon Web Services, Azure, and/or GCP for running high-throughput systems
    • Experience helping customers build Apache Kafka solutions alongside Hadoop technologies, relational and NoSQL databases, message queues, and related products
    • Experience with Python, Scala, or Go
    • Experience with configuration and management tools such as Ansible, Terraform, Puppet, and Chef
    • Experience writing to network-based APIs (preferably REST/JSON or XML/SOAP)
    • Knowledge of enterprise security practices and solutions, such as LDAP and/or Kerberos
    • Experience working with a commercial team and demonstrated business acumen
    • Experience working in a fast-paced technology start-up
    • Experience managing projects, using a recognized methodology to scope, manage, and deliver on plan regardless of complexity