Private Cloud Specialist - APAC (Based In Australia)

Job Description:

As the world generates ever greater volumes of data from every device and thing, companies need immediate insights from that data: studying recurring trends and patterns over time, and staying competitive by taking predictive actions that yield positive business outcomes.

Cloudera’s CDP Platform enables customers to harness those volumes of increasingly valuable data by securely hosting data lakes and efficiently managing data analytics & computing workloads between public clouds and private cloud solutions in data centers.

As a Cloudera Private Cloud Specialist, you will help customers succeed in deploying the CDP Private Cloud platform on different flavors of Kubernetes, especially Red Hat OpenShift. You will draw on strong technical skills, business competencies, and a customer service orientation to provide the highest level of hands-on solution design and deliver technical value to sales teams, prospects, and customers in support of sales goals.

Job Responsibilities:

  • Create and deliver customer-centric solution designs, proposed architectures, and business outcomes to stakeholders at all levels, from developers and architects to CTOs and CIOs

  • Provide deep-dive explanations of technologies like Kubernetes, Red Hat OpenShift, etc., and the capabilities of our offerings to customers and partners; advise on and oversee proofs of concept.

  • Work with cross-functional teams including Sales, Marketing, Product Management, Services, Support, Training, and Engineering to share Cloudera’s Vision with customers

  • Advise customers on use case patterns through discovery and requirements workshops

  • Transform customer feedback into actionable product roadmap items

  • Work with teammates and management to define technical selling strategy

  • Participate within the Cloudera community, share evangelism activities (blogs, meetups, industry events), and contribute to internal and external knowledge repositories


Job Requirements:

  • 15+ years of professional work experience in a similar position, selling solutions to business leaders and technical champions in the enterprise.

  • 3+ years of hands-on experience with Kubernetes or related container technologies; Red Hat OpenShift is a plus

  • Experience with Platform-as-a-Service (PaaS) offerings like Red Hat OpenShift, Docker, Kubernetes, Tanzu, Anthos, Rancher, Heroku, Elastic Beanstalk, or similar

  • 3+ years of experience working in a DevOps environment; familiarity with Git, agile, continuous integration (CI) and continuous delivery (CD), and DevOps best practices.

  • You have demonstrated problem solving and analytical skills.

  • You have experience in Solution Architecture/Engineering as a field of practice: listening to customer requirements, whiteboarding and proposing solution architectures, and staying hands-on with the technology to design, build, and demonstrate real business value

  • You have experience with the Apache Hadoop ecosystem (HDFS, YARN) and related components (Spark, Hive, Impala, Kudu, Solr, etc.) and can speak to the benefits of a centralized architecture for both data management and data access

  • You have experience with public cloud infrastructures (AWS, Azure, GCP, IBM Cloud, etc.), and you have an interest in achieving your public cloud certification.

  • You have experience with the Linux OS on bare metal, VMs, and distributed container platforms such as Kubernetes or OpenShift. You may have some experience with specific private cloud infrastructures (server hardware, storage, networking).

  • You have some experience with the formation of data lakes, the principles of data warehousing, SQL and relational database patterns, and application integration

  • You have some knowledge of differentiators across the competitive landscape

  • You have some basic software development experience in Java, Scala or Python

  • You may have experience with data flow, queues, and stream processing (NiFi, Kafka, Flink, Spark) in enterprise applications

  • You have an interest in Data Science and understand the difference between Data Engineering and Applied Science

  • You care about your colleagues, and you will get the job done together. You are passionate about what you do and inspire people around you

  • Bachelor’s Degree or equivalent experience in a Technical Field

Bonus Skills:

  • Direct experience deploying Red Hat OpenStack Platform and Kubernetes (Red Hat OpenShift), and advising customers/partners on the integration of technical solutions, is a big plus

  • You understand the challenges in operations and integration of enterprise platforms for Security and Data Governance in the enterprise

  • Experience with automation technologies, especially Red Hat Ansible Automation, is a big plus

  • Experience with, and interest in, open-source software and development practices

  • NoSQL or operational-DB experience (HBase, Phoenix, Druid, Cassandra, MongoDB, etc.). Maybe you can even debate one option over another?

  • EDW experience – Teradata, Netezza, GreenPlum, Exadata

  • Data Science and ML experience – R, Python, Anaconda, Jupyter, deep learning frameworks, etc.

  • Integration Products experience (Talend, DataStage, Informatica BDM, Qlik, Tableau, Zoomdata, Tibco, MuleSoft, IBM, Oracle, Spring Integration, etc.)

The right person in this role has an opportunity to make a huge impact at Cloudera. If this position has piqued your interest and you have what we described, we invite you to apply! An adventure in data awaits.