Signals Analytics is a product intelligence platform that revolutionizes the way decision makers consume and interact with business insights, analytics, and the data behind them. By managing different types of data, performing analyses, and linking insights to real new product development decisions, Signals Playbook™ enables product teams to build better, more successful products, transforming multiple external big data sources into actionable intelligence.
At Signals Analytics, we help the world's leading brands make better decisions faster and with less risk. We're looking for hard-working, passionate people to help us unite the art and science of sound decision-making.
The opportunity to shape how companies create winning products, discover breakthrough cures, and put social good at the heart of their business is within our grasp. We've already proven that we can move faster than anyone else, but right now we're only part of the way there.
Signals Analytics is developing a world-first cloud service that transforms big data into high-value intelligence for our Fortune 1000 customers.
We invite you to join our Data Pipeline team to build our next-generation solution. Working with our highly motivated, top-notch engineering group in a warm, friendly, fast-growing company, you will:
- Build and own high-availability, container-based microservices to support our data streaming and processing infrastructure
- Keep up to date with trends, new technologies, and processes
- Follow software engineering, agile, and data engineering best practices
- Contribute to shaping our engineering practices, innovation and culture
- Work closely with other teams to help create innovative new solutions
- Work with the Products team to define requirements and translate them into well-designed software components with unit tests
Requirements:
- 7+ years of hands-on experience in software development
- Excellent programming skills in Python and Java
- Experience working with Amazon Web Services (AWS) infrastructure
- Hands-on experience developing large-scale distributed software systems
- Experience working with large, unstructured, complex, and diverse data
- Experience with Agile development methodologies and related tools
- Knowledge of large-scale data platforms such as Hadoop and Cassandra
- Knowledge of data stream processing solutions such as Spark, Kafka, and Storm
- Knowledge of other programming languages - a plus
- Team player with a delivery- and goal-oriented approach
- Excellent English communication skills
Additional valuable experience:
- NLP technologies
- Data science, machine learning, data classification, and normalization
- Data warehouse solutions
- Graph databases
- Bachelor's degree or above in Computer Science or an equivalent related field
Location: Poleg, Netanya (the company provides a daily shuttle to/from Tel Aviv)