# This Role:

As StreamNative's Cloud Product Engineer, you will have the opportunity to build and shape a powerful, industry-leading event streaming product from the get-go. The Cloud Product Engineer will be someone who possesses unique engineering talent and is excited to join the StreamNative team, build our Pulsar-as-a-service solution, and help transform the future of the messaging and streaming industry.

# Responsibilities

- Build, orchestrate, and maintain an enterprise-grade cloud-native streaming data service offering of Apache Pulsar.
- Help deliver the Pulsar-as-a-service offering in our customer’s cloud environment by leveraging container orchestration tools like Kubernetes.
- Collaborate with product and engineering teams to improve products, meet new product requirements, and continuously improve product quality and competitiveness.
- Design and develop an automation platform to standardize cloud-native operations.
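
As one concrete illustration of the orchestration work above, a Pulsar broker deployment on Kubernetes might be sketched as a StatefulSet. The resource names, replica count, and image tag below are illustrative assumptions, not details from this posting:

```yaml
# Hypothetical sketch: running Pulsar brokers on Kubernetes as a StatefulSet.
apiVersion: apps/v1
kind: StatefulSet
metadata:
  name: pulsar-broker
spec:
  serviceName: pulsar-broker
  replicas: 3
  selector:
    matchLabels:
      app: pulsar-broker
  template:
    metadata:
      labels:
        app: pulsar-broker
    spec:
      containers:
        - name: broker
          image: apachepulsar/pulsar:latest
          command: ["bin/pulsar", "broker"]
          ports:
            - containerPort: 6650   # Pulsar binary protocol
            - containerPort: 8080   # HTTP admin/lookup
```

In practice a production deployment would also cover ZooKeeper/BookKeeper, persistent storage, and configuration, typically via a Helm chart rather than raw manifests.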


# About You

- Excellent problem-solving skills, good communication skills, and teamwork spirit.
- Understanding of common network protocols and virtualization technologies.
- Experience with containers, cloud-native/microservice architectures.
- Experience with container orchestration technology (e.g., Kubernetes) is a strong plus.
- Experience in using automation tools, such as Puppet, Ansible or Terraform.
- Experience with distributed systems, load balancing, and system monitoring.

# The Perks!

- Competitive salary
- Equity in a fast-growing enterprise startup
- Awesome, supportive coworkers with a good sense of humor
- Working with a globally distributed team of passionate developers, hackers and open-source fanatics
- Remote friendly
- Medical, dental, vision insurance
- Flexible paid time off
- Commuter benefits

This position is for developing the core technologies around Apache Pulsar, the cloud-native event streaming system. To learn more about the project, check out its website: pulsar.apache.org


## What you will get to do


- Develop core technologies around Apache Pulsar and Apache BookKeeper
- Build ecosystems around Apache Pulsar and Apache BookKeeper
- Build developer communities of Apache Pulsar and Apache BookKeeper and publish technical posts
- Develop commercial products on Apache Pulsar and Apache BookKeeper


## About You


- Excellent problem-solving skills, good communication skills, and teamwork spirit;
- Familiarity with at least one of Java, Python, Go, or C++, with good programming skills;
- Familiarity with the principles and key technologies of distributed systems, with knowledge of and practice in distributed messaging middleware (such as Kafka or RabbitMQ) preferred;
- Experience with big data systems (Hadoop, HDFS, Hive, Spark, Flink, etc.) is preferred;
- Experience with Docker, Kubernetes, Ansible/Terraform, etc. is preferred.


## The Perks!

- Competitive salary
- Equity in a fast-growing enterprise startup
- Opportunity to build and shape a powerful, industry-changing event streaming product from the get-go
- Awesome, supportive coworkers with a good sense of humor
- Working with a globally distributed team of passionate developers, hackers and open-source fanatics
- Remote friendly
- Benefits to keep you happy and healthy
- Medical, dental, vision insurance
- Flexible paid time off
- Commuter benefits

Location: Berlin, Germany
 

Our teams are pushing the boundaries of what can be achieved with data stream processing, allowing our users and customers to gain more insights into their data in real-time.

Apache Flink currently powers some of the largest data stream processing pipelines in the world, with users such as Alibaba, Uber, ING, Netflix, and more running Flink in production.

Ververica was founded in 2014 by the original creators of the Apache Flink project, and we’re building the next-generation platform for real-time data applications.

We are looking for passionate software developers that enjoy solving challenging problems and are excited about open source, helping us build world-class data streaming technology.

As a software engineer at Ververica, you will take a dual role: being involved in both the open source Apache Flink project and Ververica’s products. You should be comfortable taking responsibility for designs and features, and be able to work both self-supervised and in cross-functional teams. As a citizen of the Flink open source community, you will be in direct touch with Flink users and participate in the day-to-day open source work. We also encourage engineers to talk publicly about their work at conferences, meetups, and via blog posts.

You will love this job if you …
  • … have a good knowledge of Java (Scala is a plus)
  • … have an aptitude for simple, robust, and elegant designs, including how to design appealing APIs and libraries
  • … have experience working collaboratively on large code bases (a plus)
What we offer … 
  • Competitive salary
  • Great career opportunities in a world-class team with peers from top companies, universities, and research institutes
  • Tech gear of your choice
  • International team environment (10 nationalities so far)
  • Flexible working arrangements (home office, flexible working hours)
  • Unlimited vacation policy, so take time off when you need it
  • Snacks, coffee and beverages in the office
  • Relocation assistance if needed
  • Hackathons and weekly technical Lunch Talks to keep your head full of inspirations and ideas!
  • Subsidized Gym membership (Urban Sports Club)
  • Subsidized German classes in the office
  • Free Lunch 3 times a week in the office
  • Free public transportation ticket
Location: Berlin, Germany

Our teams are pushing the boundaries of what can be achieved with data stream processing, allowing our users and customers to gain more insights into their data in real-time.

Apache Flink currently powers some of the largest data stream processing pipelines in the world, with users such as Alibaba, Uber, ING, Netflix, and more running Flink in production.

Ververica was founded in 2014 by the original creators of the Apache Flink project, and we’re building the next-generation platform for real-time data applications.

We are looking for strong systems builders that have a deep understanding of the internals of distributed systems, database systems, or performance-critical code in general. As a distributed systems engineer at Ververica, you will be working on the deep internals of distributed data stream processing. You should be comfortable with taking responsibility for designs and features, and be able to work both self-supervised and in cross-functional teams.

As a citizen of the Flink open source community, you will be in direct touch with Flink users and participate in the day-to-day open source work.

We also encourage engineers to talk publicly about their work at conferences, meetups, and via blog posts.

You will love this job if you …

  • … have experience in building large data processing or distributed systems during PhD research or prior work experience
  • … have a deep understanding of one or more of the following areas: distributed systems, database systems, performance optimization
  • … have a strong foundation of algorithms and application design
  • … have an aptitude for simple, robust, and elegant designs, including how to design appealing APIs and libraries
  • … have experience in developing systems or working on large code bases in any programming language
What we offer …
  • Competitive salary
  • Great career opportunities in a world-class team with peers from top companies, universities, and research institutes
  • Tech gear of your choice
  • International team environment (10 nationalities so far)
  • Flexible working arrangements (home office, flexible working hours)
  • Unlimited vacation policy, so take time off when you need it
  • Snacks, coffee and beverages in the office
  • Relocation assistance if needed
  • Hackathons and weekly technical Lunch Talks to keep your head full of inspirations and ideas!
  • Subsidized Gym membership (Urban Sports Club)
  • Subsidized German classes in the office
  • Free Lunch 3 times a week in the office
  • Free public transportation ticket
Location: Europe (any location)

Data stream processing is redefining what’s possible in the world of data-driven applications and services. Apache Flink is one of the systems at the forefront of this development, pushing the boundaries of what can be achieved with data stream processing.

Apache Flink currently powers some of the largest data stream processing pipelines in the world, with users such as Alibaba, Uber, ING, Netflix, and more running Flink in production. Flink is also one of the most active and fastest-growing open source projects in the Apache Software Foundation.

Ververica was founded in 2014 by the original creators of the Apache Flink project, and we’re building the next-generation platform for real-time data applications. We are tackling some of today’s biggest challenges in big data and data streaming.

Your role:

Ververica is currently building a new team of Solution Architects in Europe and the US. You’ll be part of a new and fast-growing team helping customers have a great experience using our products and Apache Flink. The role will sit at the forefront of one of the most significant paradigm shifts in information processing and real-time architectures in recent history – stream processing – which sets the foundation to transform companies and industries for the on-demand services era.

You will work with engineering teams inside of our customers to build the best possible stream processing architecture for their use cases. This includes reviewing their architecture, giving guidance on how they design their Flink applications, and helping them take their first steps with our products.

Some of the customer engagements will be carried out remotely via phone and screen share, but the position also includes traveling to customers to help them onsite.

And when you’re not working with our customers, there are plenty of opportunities at Ververica to learn more about Flink, contribute to the products and open source projects, and help evangelize Apache Flink to users around the world.

What you’ll do all day:
  • Use your experience to solve challenging data engineering and stream processing problems for our customers
  • Meet with customers, understand their requirements, and help guide them towards best-of-breed architectures
  • Provide guidance and coding assistance during the implementation phase and make sure projects end in successful production deployments
  • Become an Apache Flink and stream processing expert
You will love this job if you …
  • … are experienced in building and operating solutions using distributed data processing systems in large-scale production environments (e.g. Hadoop, Kafka, Flink, Spark)
  • … are fluent in Java and/or Scala
  • … love to spend the whole day talking about Big Data technologies
  • … have great English skills and like talking to customers
  • … like traveling around Europe and the USA and visiting new places
What we offer:
  • Competitive salary
  • International team environment (10 nationalities so far)
  • Flexible working arrangements (home office, flexible working hours)
  • Unlimited vacation policy, so take time off when you need it
Full-Time
 
We are looking for experienced Senior Java Developers who want to make an impact not only in the office, but in the world around them. We need talented developers to help shape the future of our products. Expect to find great meaning in what you do, enjoy it, and be compensated competitively for your time and talents.
 
The Company
We are passionate about building software that solves problems. Cogility Software is a leading data analytics technology provider focused on enhancing human and system performance by providing actionable intelligence in the most challenging and complex environments.
 
Our Products
Our products include a highly scalable solution that employs the latest in cloud-based technologies, machine learning, advanced semantic analysis, and complex event processing. Solutions developed using the product are principally targeted at law enforcement, big data, logistics, distribution, insider threat, maritime, and government intelligence applications. Apply to learn more about our products!

What you’ll be doing
As a Senior Java Developer, you will architect the components and servers that our customers use to solve their biggest problems. The mission of a Java Developer is to design and build capabilities that allow users to analyze their data to meet their needs. They are involved in all stages of the product development and deployment lifecycle: idea generation, user interviews, planning, design, prototyping, execution, shipping, and iteration:
 
  • Code, test, debug, and install both new programs/technologies and changes to existing programs/technologies of a complex nature with minimal assistance
  • Design programs/technologies under the direction of Technical Leads and Project Managers
  • Work on system architecture and design, understanding where your contribution fits into the overall project scope so you have a big-picture view
What we need
  • Must have a minimum of 4 years of work experience in a similar position or in product development
  • Ability to write clean, maintainable code
  • Strong engineering background
  • Familiarity with data structures, storage systems, cloud infrastructure, distributed computing, and other technical tools
  • Proficiency in Java; experience with Apache Flink, Apache Kafka, Elixir, Phoenix, or other big-data framework(s) is a plus
  • Maintain code integrity and organization
  • Proficiency with server APIs (REST, JS-API, GraphQL, etc.)
  • A good understanding of the software development process, including development and deployment
  • Understanding and implementation of security and data protection
  • Requires a bachelor’s degree or technical certification or equivalent work experience
What we want
  • Skill and comfort working in a rapidly changing environment with dynamic objectives and iteration with users
  • Must be able to meet tight deadlines in a fast-paced environment and handle multiple assignments / projects at once
  • Be able to communicate and work with people of all technical levels in a team environment
  • Be willing to take feedback and incorporate it into your work
  • Be willing to take direction from the team lead but must be self-managing and make decisions with minimal supervision
  • Ability to deal positively with shifting priorities
Additional Requirements:
  • Must work from our Irvine office location
  • Be willing to travel to client (on occasion)
Benefits:
  • Competitive Salary
  • Generous medical, dental, and vision plans
  • Vacation, sick, and paid holidays offered
  • Stand / sit workstations
  • Kitchen stocked with snacks and drinks
  • Work with talented and collaborative co-workers
  • Casual environment
Application:
 
Full-time position – no contractors.
 
Cogility Software Corporation, Irvine CA 92618
 
“Cogility Software Corporation is an Equal Opportunity Employer”

 

 

London, UK

Minimum qualifications:

  • Master's degree in Engineering, Statistics, Mathematics, Economics, an Applied Science or equivalent practical experience.
  • Programming experience in Python and/or R.
  • Quantitative analytics experience, with a focus on marketing analytics, statistical modeling, machine learning, digital attribution, forecasting, optimization and predictive analytics.
  • Ability to speak and write in English and German fluently and idiomatically.

Preferred qualifications:

  • SQL and Big Data experience.
  • Experience in delivering bespoke analytics to stakeholders (e.g., problem scoping/definition, modeling, interpretation, presentation).
  • Experience using and/or deploying digital analytics and measurement solutions.
  • Knowledge of statistical programming languages (R/Python/TensorFlow).
  • Ability to visualize models and results. Ability to debug and troubleshoot code and models.

About the job

gTech’s Professional Services team takes a creative, collaborative, and customer-centric approach to provide foundational services and forward-looking business solutions to top advertiser and publisher customers. Through technical implementation, optimization, and key solutions, gTech Professional Services helps customers attain their business goals while building long-term capabilities.

As a Data Analyst, you'll work closely with clients and produce innovative and actionable quantitative models and analyses to address the challenges of marketing effectiveness, return on investment and prediction. The Global Premium Services team is a solution-generating team that helps our Sales teams and advertisers. In addition to troubleshooting on the customer side, we work with Sales, Product, and Engineering teams within Google to develop better tools and services to improve our products based on the evolving needs of our users. As a cross-functional and global team, it's our job to help keep the lights on and the ads relevant.

Google creates products and services that make the world a better place, and gTech’s role is to help bring them to life. Our teams of solution-oriented trusted advisors support millions of customers globally. Our solutions are rooted in our technical skill, product expertise, and a thorough understanding of our customers’ complex needs. Whether the answer is a bespoke solution to solve a unique problem, or a new tool that can scale across Google, everything we do aims to ensure our customers benefit from the full potential of Google products.

To learn more about gTech, check out our video.

Responsibilities

  • Lead analytics aspects of client engagements for marketing effectiveness and marketing portfolio management using deep modeling knowledge.
  • Work with clients to align Google Analytics 360 suite attribution and analytic solutions with key organizational challenges. Develop value-based roadmaps to solve client business challenges on a continuous and repeatable basis.
  • Collaborate with clients to build an end-to-end Machine Learning framework. Engage stakeholders, assess data readiness and scale a proof of concept to a larger solution.
  • Work with client/internal teams to translate data and model results into strategic insights that are clear, accurate, relevant, understandable and applicable to clients' decision making and needs. Co-present to and work with clients to integrate recommendations into business processes.
  • Collaborate with Product/Engineering teams to increase and optimize capabilities, employing methods which create opportunities for scale, helping to drive innovation.

London, UK


Minimum qualifications:

  • Master's degree or equivalent practical experience.
  • Experience leading/managing a team across multiple locations.
  • Experience in technical consulting and project management in corporate/professional IT services or product development.
  • Ability to travel up to 50% of the time as required.

Preferred qualifications:

  • Experience in leading IT implementations, IT support, deployments and/or business development activities.
  • Experience with relevant cloud technology in infrastructure, application development and developer operations. Demonstrated experience in scalable application development.
  • Experience implementing large-scale cloud or software projects in enterprise environments.
  • Demonstrated organizational, analytical and influencing skills.
  • Effective presentation and communication skills.

About the job

As a Head of Engineering in EMEA, you'll lead a technical Cloud team of Managers and Field Engineers. You will work closely with our regional leads to accommodate the demand for skills, provide technical leadership across the region and mentor a growing team.

You'll have the opportunity to thrive while working in small, highly-performing teams and bringing your demonstrated leadership acumen to a team responsible for ensuring the best experience for customers moving to Google Cloud.

The Google Cloud Professional Services team is responsible for introducing Google’s clients to Google Cloud Platform and G Suite, and assisting them from inception to production. We solve the most challenging and complex technical problems - from infrastructure migration, to network optimization, to security best practices and much more. We also work closely with product engineering and support staff to flawlessly lead client deployments, implementations and the integration of custom features.

Responsibilities

  • Manage and grow a team of Strategic Cloud Engineers across the EMEA region.
  • Enable the team to develop technical tools/assets, document reproducible solutions for customers and partners and provide highly technical implementation support in customer environments, including guidance on implementation feasibility of cross-product integrations.
  • Consult with strategic key customers on technical issues based on in-depth product and technical knowledge.
 

London, UK

Minimum qualifications:

  • Bachelor's degree in Computer Science, a related technical field or equivalent practical experience.
  • Experience developing web applications, and with related programming languages and technologies (Python, Go, Java, JVM HotSpot, HTTP, HTTPS).
  • Industry experience in technical support, professional services, engineering, sustaining engineering or systems engineering.

Preferred qualifications:

  • Experience in consulting/engineering and product support.
  • Background with Middleware, web services and related standards such as HTTP, REST, SOAP and OAuth.
  • Demonstrated network administration skills with an understanding of Linux system administration.
  • Ability to keep pace with rapid changes in products.
  • Demonstrated troubleshooting, follow-through and problem-solving skills. Excellent resourcefulness, attention to detail and communication skills.

About the job

Apigee Edge is an advanced API platform that enables enterprises to create, manage, secure and scale APIs. Using Apigee, enterprises can design and deploy API proxies in public and private clouds, secure data in transit with OAuth 2.0, SAML and two-way TLS, protect against traffic spikes by adding dynamic routing, caching and rate-limiting policies to APIs, and measure the success of both API operations and API business with end-to-end analytics.
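
As one concrete illustration, traffic-spike protection in Apigee is configured declaratively as a policy attached to an API proxy. The sketch below shows a SpikeArrest policy; the policy name and rate are illustrative assumptions, not values from this posting:

```xml
<!-- Hypothetical sketch of an Apigee SpikeArrest rate-limiting policy. -->
<SpikeArrest name="SA-Limit-Traffic">
  <!-- Smooth traffic to roughly 30 requests per second -->
  <Rate>30ps</Rate>
</SpikeArrest>
```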

As part of an entrepreneurial team in this rapidly growing business, you will help businesses of all sizes leverage technology to connect with customers, employees and partners.

As a Technical Solutions Engineer, you'll combine software development, networking and systems administration expertise with an aptitude for technical consulting, troubleshooting and analysis. You'll work across multiple customer-facing teams to improve the product's vision, while also ensuring that global businesses and organizations continue to be successful. You are a technical solutions expert, helping Googlers and customers approach and resolve technology problems and design Google-scale services.

Google Cloud helps millions of employees and organizations empower their employees, serve their customers, and build what’s next for their business — all with technology built in the cloud. Our products are engineered for security, reliability and scalability, running the full stack from infrastructure to applications to devices and hardware. And our teams are dedicated to helping our customers and developers see the benefits of our technology come to life.

Responsibilities

  • Diagnose and resolve customer problems for Google Cloud Platform products as part of a global team.
  • Liaise between engineering and consulting to resolve complex customer cases and recommend solutions.
  • Work on complex customer cases and bring new ideas for innovation and automation excellence into the product support team.
  • Provide customers with support through excellent communication, deep technical skills and consultancy.

Munich, Germany, Europe

AWS EMEA SARL (Germany Branch) 

Description

Are you an analytics and data warehousing specialist? Do you have experience of data ingestion, storage, processing and visualization? Do you like to solve the most complex and large-scale data challenges in the world today? Do you want to have an impact in the development and use of new data analytics technologies? Would you like a career that gives you opportunities to help customers and partners use cloud computing to build new solutions, faster, and at lower cost? Do you like to gain the deepest customer and partner insights using cloud computing technologies? Come join us!

Amazon Web Services (AWS) is looking for highly talented hands-on technical experts to collaborate with our customers and partners on key engagements in the data and analytics space. This is an excellent opportunity to join Amazon’s world class technical teams, working with some of the best and brightest engineers while also developing your skills and furthering your career within one of the most innovative and progressive technology companies. These engagements will focus on AWS Services for analytics in regard to data lake solutions on the AWS cloud, and on helping our customers and partners build innovative solutions and businesses that focus on leveraging the value of data. You will work on the complete set of technologies used in a data lake lifecycle: data integration, data storage, metadata management, data lineage and tiering, entitlements, master data management, data movement, data processing, and data visualization.

As a Data Lake Specialist Solutions Architect (SA), you will be the Subject Matter Expert (SME) for helping customers select the technologies that will support their business requirements, and successfully deploy analytics platforms on the AWS cloud. As part of the Data and Analytics Specialist Solutions Architecture team, you will work closely with the other Specialist SAs on Big Data, Databases, Analytics and Artificial Intelligence, as well as the Business Development teams, to enable large-scale customer use cases and drive the adoption of AWS for their data processing platforms. You will interact with other SAs in the field, providing guidance on their customer engagements, and developing white papers, blogs, reference implementations, and presentations to enable customers and 3rd parties to fully leverage the AWS platform. You will also create field enablement materials for the broader SA population, to help them understand how to integrate AWS solutions into customer architectures.

Candidates must have great communication skills and be very technical, with the ability to impress AWS customers at any level, from executive to developer. Previous experience with AWS is desired but not required, provided you have experience building large-scale solutions. You will get the opportunity to work directly with senior engineers at customers, partners and AWS service teams, influencing their roadmaps and driving innovation.

If you are someone who enjoys innovating, likes solving hard problems and working on the cutting edge of technology, we would love to have you on the team.


Roles and responsibilities
  • Design Customer Solutions: Collaborate with AWS field account teams, training and support teams to help partners and customers learn and use AWS services such as Elastic Map Reduce (EMR), Redshift, Kinesis, Amazon Machine Learning, AWS Lambda, Data Pipeline, S3, DynamoDB, and the Relational Database Service (RDS)
  • Devise Strategy: Engage with solution architects, account managers, professional services and partners to define an analytics engagement strategy for AWS operational territories and key accounts.
  • Thought Leadership: Provide global thought leadership on analytics solutions that benefit customers through the use of AWS Services. This takes the form of contribution to external publications such as the AWS Big Data Blog, whitepapers and reference architectures, as well as internal training of solution architects, professional services consultants, technical account managers, and AWS trainers
  • Serve as a key member of the business development and account management team in helping to ensure customer success in building and migrating applications, software and services on the AWS platform.
  • Assist solution providers with the definition and implementation of technical and business strategies.
  • Capture and share best-practice knowledge amongst the worldwide AWS solution architect community.
  • Understand AWS market segments, and industry verticals.
  • Understand and exploit the use of internal business support systems.

Basic Qualifications

  • The candidate will possess both technical and customer-facing skills that allow them to be the technical support “face” of Amazon within a solution provider’s ecosystem/environment as well as directly to end customers. They will be able to facilitate relationships with senior personnel and have the technical background to easily interact with and give guidance to software developers, IT professionals, and system architects. The ideal candidate will also have a demonstrated ability to think strategically about business, product, and technical challenges
  • The right person will be highly technical and analytical, and possess significant software development and/or IT and networking implementation/consulting experience
  • Strong verbal and written communications skills are a must, as well as the ability to work effectively across internal and external organisations and virtual teams
  • Executive speaking and presentation skills – Formal presentations, white-boarding, large and small group presentations
  • Ability to think strategically about business, product, and technical challenges in an enterprise environment.
  • Understanding of Agile methodologies, and the ability to apply these practices to analytics projects
  • Possessing 5 or more years of data management expertise, spanning ETL processes, master data management or data management platforms experience, and integration in complex environments
  • Highly technical and analytical, possessing 10 or more years of analytics platform implementation or operations experience
  • Knowledge of the underlying infrastructure requirements such as networking, storage, and hardware optimisation
  • Implementation and tuning experience of data warehousing platforms, including knowledge of data warehouse schema design, query tuning and optimisation, and data migration and integration. Experience of requirements for the analytics presentation layer including dashboards, reporting, and OLAP
  • BS level technical degree required; computer science or maths background preferred

Preferred Qualifications

  •  Understanding of application, server, and network security is highly desired
  • Experience working within the software development or internet industries is highly desired
  • Master's degree in computer science or engineering or similar field
  • Hands on experience leading large-scale global data warehousing and analytics projects.
  • Demonstrated industry leadership in the fields of database, data warehousing or data sciences
  • Real time streaming technologies and time series with tools such as Spark, Flink, Samza etc.
  • Hadoop big data knowledge – Hive metastore; storage partitioning schemes on S3 and HDFS
  • ETL – understanding and custom coding
  • NoSQL understanding and use case application – Cassandra, HBase, DynamoDB
  • Understanding and use cases application of columnar data stores
  • Caching and queueing technologies – Kafka/Kinesis, Rabbitmq/SQS, Redis/Memcache etc.
  • RDBMS skills – SQL, optimization techniques, etc.
  • Scripting/Programming skills – Python, Java, Scala, Go
  • Excellent understanding of operating systems including troubleshooting
  • Data warehousing knowledge
  • Security at rest and in transit

Seattle, Washington

Description

The Amazon Kinesis Data Analytics (KDA) team is looking for Senior Engineers who are passionate about building distributed stream processing engines. We are looking for builders who are enthusiastic about data streaming and excited about contributing to open source.

Processing data from a stream in real time requires substantial investment from customers in writing the application and maintaining the necessary infrastructure. The KDA service provides customers with a fully managed stream processing platform on which they can develop their applications using SQL or Java. With the service, all that customers need to do is provide the application code containing the business logic to process the stream; the service takes care of building blocks and abstractions such as processing windows, execution semantics, and checkpoints, as well as infrastructure capabilities such as elasticity and fail-over, eliminating the complexity of stream processing.
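The processing windows mentioned above can be illustrated with a toy tumbling-window aggregation. This is a plain-Python sketch of the concept only — it is not KDA's or Flink's actual API, and the event format is invented for illustration:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_size):
    """Group (timestamp, key) events into fixed-size, non-overlapping
    tumbling windows and count occurrences of each key per window."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        # Each event belongs to exactly one window, anchored at a
        # multiple of window_size.
        window_start = (ts // window_size) * window_size
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

# Clickstream-style events: (timestamp in seconds, page)
events = [(1, "home"), (3, "home"), (4, "cart"), (11, "home"), (12, "cart")]
print(tumbling_window_counts(events, window_size=10))
# {0: {'home': 2, 'cart': 1}, 10: {'home': 1, 'cart': 1}}
```

A production engine like Flink layers event-time semantics, watermarks, and checkpointed state on top of this simple idea — which is exactly the complexity the managed service absorbs.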

As a senior member of the KDA team, you will work on improving the stream processing engine, Apache Flink, to make the KDA service the easiest place to run stream processing applications. Upstream compatibility is a core tenet of the KDA service, and your improvements to the engine will be contributed back to open source. As a member of the KDA service team, you will add new stream processing operators, improve the efficiency and availability of the engine, and push the envelope of stream processing.

The ideal candidate has experience working on large-scale systems, enjoys solving complex software problems, and possesses analytical, design, and problem-solving skills. While not required, an in-depth understanding of data processing technologies such as Apache Flink, Apache Spark, Apache Storm, or the Hadoop frameworks is a plus. Your responsibilities will include collaborating with other engineers to build a large-scale AWS service and working with them to define your team's roadmap, including identifying design and code changes needed in the underlying open source platforms.

Come join us to make stream processing mainstream for our customers.

Basic Qualifications 


  • Bachelor’s Degree in Computer Science or related field
  • Equivalent experience to a Bachelor's degree, based on 3 years of work experience for every 1 year of education
  • 5+ years professional experience in software development
  • Experience taking a leading role in building complex software systems that have been successfully delivered to customers
  • Proficiency in at least one modern programming language such as C, C++, C#, Java, or Perl
  • Excellent communication skills and the ability to work well in a team.
  • Ability to excel in a fast-paced, startup-like environment.

Preferred Qualifications

  • Experience building extremely high volume and highly scalable web services.
  • Experience building highly available systems and operating 24x7 services.
  • Experience with distributed systems, consistent hashing, distributed locking, replication, and load balancing.
  • Working knowledge of Kubernetes, Hadoop, MapReduce, Storm, Spark or other Big Data processing platform.
  • Experience with at least one modern scripting language such as Ruby, Python or PHP.
  • Knowledge of professional software engineering practices & best practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations.
  • Strong customer focus, ownership, urgency and drive.
  • Master’s degree or PhD in Computer Science.

Amazon is an Equal Opportunity-Affirmative Action Employer – Minority / Female / Disability / Veteran / Gender Identity / Sexual Orientation


Cambridge, England

Description 

Come build the future of data streaming with the Amazon Managed Streaming for Kafka (MSK) team!

We are seeking builders for our newly launched Amazon MSK service, a fully managed service that makes it easy for customers to build and run applications that use Apache Kafka to process streaming data. We are looking for engineers who are enthusiastic about data streaming, and are as passionate about contributing to open source as they are about solving real customers' business needs, at AWS scale.

As a member of the Amazon MSK team, you will be making contributions to the entire stack - the APIs and workflows that make up the MSK service, the core Kafka platform, and stand-alone tools that make it easier for the Kafka community to operate Kafka. Upstream compatibility is a core tenet of MSK. Your code changes to the Kafka platform will be released back to open source. As a member of a new AWS service built on top of a popular open source technology, you have a unique opportunity to work on a team that straddles both worlds – open source and Amazon-internal software. You will design and build new features, make performance improvements, identify and investigate new technologies, prototype solutions, build scalable services, and test and review changes to deliver an exceptional customer experience.

The ideal candidate has experience designing large-scale systems supporting millions of transactions per second, enjoys solving complex software problems, and possesses analytical, design, and problem-solving skills. Ideally, you have an in-depth understanding of streaming data technologies like Amazon Kinesis or Apache Kafka, and experience with open-source data processing frameworks like Apache Spark, Apache Flink, or Apache Storm. Your responsibilities will include collaborating with other engineers to build a large-scale AWS service and working with senior leaders to define your team's roadmap, including identifying design and code changes needed in the underlying open source platforms.
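Much of Kafka's scalability model rests on key-based partitioning: records with the same key are hashed to the same partition, which preserves per-key ordering while spreading load across brokers. The plain-Python sketch below illustrates the idea only — it uses crc32 as a stand-in, whereas Kafka's actual default partitioner uses a murmur2 hash:

```python
import zlib

def partition_for(key: bytes, num_partitions: int) -> int:
    """Map a record key to a partition deterministically, so that all
    records with the same key land on the same partition (preserving
    per-key ordering). crc32 stands in for Kafka's murmur2 hash."""
    return zlib.crc32(key) % num_partitions

topic_partitions = 6
# Repeated sends with the same key always hit the same partition:
print(partition_for(b"user-42", topic_partitions))
print(partition_for(b"user-42", topic_partitions))  # same partition as above
```

Because the mapping depends on the partition count, changing the number of partitions reshuffles keys — one reason resizing a Kafka topic is an operational decision rather than a transparent one.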

Basic Qualifications

  • Bachelor's degree in Computer Science or equivalent experience
  • Several years' experience developing production software systems
  • Advanced software engineering skills, including the ability to write expert-level, maintainable, and robust code in C++, Java, or other core object-oriented languages
  • Experience taking product requirements and developing software architectures and designs to bring them to life
  • Proficiency in computer science fundamentals – data structures, algorithms and OO design
  • Good communication skills and ability to work effectively on shared projects with designers, artists, testers, and other developers

Preferred Qualifications

  • Experience building extremely high volume and highly scalable online services
  • Experience operating highly available services
  • Experience with distributed systems, consistent hashing, distributed locking, checkpointing, and load balancing
  • Working knowledge of Hadoop, MapReduce, Kafka, Kinesis, Spark or other Big Data processing platforms

Austin, Texas

Eventador.io is looking for a Java/Full Stack developer to help build a world-class streaming data platform.

Requirements:

  • 3+ years of professional experience as a software engineer/systems engineer
  • Experience writing software in Java, Scala, or other JVM-centric languages. Python knowledge is useful too.
  • Experience troubleshooting/tuning applications in the JVM.
  • Experience with API frameworks such as Jersey, DropWizard, Spring, etc.
  • Experience using GitHub for source code control.
  • Experience building RESTful APIs.
  • Ability to support your projects in production!

Nice to have:
  • Some experience with Apache Flink, Kafka, or other streaming technologies is useful.
  • Experience deploying software in a containerized environment (Docker, Kubernetes, etc).
  • Know your way around a Unix/Linux environment.
  • Previous experience in a fast-paced startup environment (or just the desire to join one!).


Eventador is a small team taking on big problems, and we're having a lot of fun doing it. Our headquarters is located in wonderful Austin, Texas. All the normal perks: plenty of space to work, free snacks, free on-site parking, and an amazing team to work with.

Sound like the kind of environment you'd enjoy? Let us know, we'd love to meet you!

Equity is negotiable.