Data Engineer, Digital Delta


Job no: 503578
Work type: Permanent Full Time
Location: Sydney
Division: Management Consulting

Do you love solving complex problems, designing sophisticated solutions and building innovative technology for Australia's largest organisations?

  • Are you passionate about Digital Transformation?
  • Are you convinced that the Fourth Industrial Revolution is fundamentally changing the way we live, work and relate to each other?
  • Do you aspire to create the best customer experiences across Mobile App & Web, and User Experience (UX) & User Interface (UI)?
  • Do you believe in creating powerful actionable insights from Data and Analytics?
  • Do you want to work in a diverse and flexible working environment?
New digital technologies and disruptive business models mean many organisations are struggling to keep pace with the transformative changes required to drive growth and meet customer demands. KPMG Digital Delta provides end-to-end digital innovation and transformation services to help overcome this challenge.
By designing and implementing new fit-for-purpose operating models, KPMG Digital Delta helps organisations to reframe their business models, improve operational productivity, create the best customer experiences, and enhance employee collaboration. We bring together best practice knowledge and technology, along with deep expertise across all industries.
More specifically, we re-imagine and re-invent organisations to become world class digital enterprises using advanced technologies, data and human insights. We help organisations to embrace Digital Strategy, Artificial Intelligence (AI) & Cognitive, the Internet of Things (IoT), Data, Analytics & Modelling, Mobile App & Web, and User Experience (UX) & User Interface (UI) and more.
We work with clients to:
  • Formulate strategies that re-imagine organisations
  • Harness innovation from the Fourth Industrial Revolution
  • Act on insights from trusted data to make clear decisions quickly and consistently
  • Build adaptive organisations
  • Thrive as a connected enterprise - front, middle and back office
Your new role
The Data Engineer is the designer, builder and manager of data management pipelines, preparing data for analytical or operational use. You have an aptitude for translating business problems into data, infrastructure and resource requirements and solutions. You will design, construct, test and maintain data pipelines that pull together information from different source systems; integrate, consolidate, cleanse and monitor the data; and structure it for use in individual analytics applications. You will actively ensure the stability and scalability of our clients' systems and data platforms. You will strive to bring the best of DevOps practices to the world of data by embracing the emerging practice of DataOps. You will work proactively to:
  • Drive a technical roadmap for the team, covering non-functional requirements such as scalability, reliability and observability
  • Assess new and existing data sources for their applicability to the business issue, and translate the outcomes of the analytical solutions we design into business impacts and benefits.
  • Design, construct, install, test and maintain highly scalable, resilient, recoverable data management systems
  • Recommend ways to improve data reliability, efficiency and quality in our data pipelines by applying DataOps principles, and implement monitoring systems to proactively detect unexpected variation in our data pipelines (see the sketch after this list).
  • Ensure delivered systems meet business requirements and industry practices for automating build, deployment and change management using DevOps and CI/CD patterns
  • Understand, explain and evangelise buzzwords such as serverless, cloud native and PaaS, and how they affect the design of data pipelines
  • Integrate new data management technologies and software engineering tools into existing pipelines
  • Create custom software components and analytics applications as required using a variety of languages and tools
  • Be comfortable with code-based or tool-based data pipelines and understand the pros and cons of each
  • Work closely with Digital Delta Data Scientists to extract and manipulate data from a variety of sources, then cleanse, standardise, scale, bin, categorise, tokenise, stem and transform it into a state suitable for further analysis.
  • Work with our Data Scientists to design, develop and implement optimization algorithms and solutions in areas that might include asset and inventory management, communications, channels, risk and portfolio analysis and supply chain management.
  • Work with our Data Scientists to design, develop and implement predictive analytical models for areas such as customer segmentation, market basket analysis, offer propensity, demand planning & forecasting, fraud detection, inventory management and risk exposure.
  • Design, develop and implement the automated approach for productionising model scoring and the closed-loop feedback paths required to support true test-and-learn.
  • Apply process thinking to achieve scaled efficiencies in the development and implementation of analytic insights.
  • Apply visual analysis techniques and toolsets to extract patterns and meaning from data in a visual format, for example to perform descriptive analytics that support business case development.
  • Select and configure analytics toolsets considering the client's business issue and analytic maturity.
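To make the DataOps expectations in this list concrete, here is a minimal sketch of a pipeline stage with statistical data quality monitoring embedded. It is illustrative only: it assumes pandas, and the source file, column names and thresholds (orders.csv, customer_id, amount) are hypothetical placeholders, not anything prescribed by the role.

import pandas as pd

# Hypothetical source and thresholds, for illustration only.
SOURCE_CSV = "orders.csv"
MAX_NULL_RATIO = 0.05   # alert if more than 5% of a column is null
Z_THRESHOLD = 3.0       # alert on values more than 3 standard deviations out

def extract(path: str) -> pd.DataFrame:
    """Pull raw data from a source system (here, a flat-file extract)."""
    return pd.read_csv(path, parse_dates=["order_date"])

def cleanse(df: pd.DataFrame) -> pd.DataFrame:
    """Standardise and de-duplicate records before downstream use."""
    df = df.drop_duplicates()
    df["customer_id"] = df["customer_id"].astype(str).str.strip().str.upper()
    return df

def monitor(df: pd.DataFrame) -> list[str]:
    """Statistical checks that flag unexpected variation in the feed."""
    issues = []
    for col, ratio in df.isna().mean().items():
        if ratio > MAX_NULL_RATIO:
            issues.append(f"{col}: null ratio {ratio:.1%} exceeds threshold")
    z_scores = (df["amount"] - df["amount"].mean()) / df["amount"].std()
    outliers = int((z_scores.abs() > Z_THRESHOLD).sum())
    if outliers:
        issues.append(f"amount: {outliers} rows beyond {Z_THRESHOLD} sigma")
    return issues

if __name__ == "__main__":
    data = cleanse(extract(SOURCE_CSV))
    for issue in monitor(data):
        print("DATA QUALITY ALERT:", issue)  # in practice, route to alerting/observability

In a real engagement the thresholds would be derived from historical variation and alerts routed to a monitoring system rather than printed.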
In addition to your focus on client engagements, you will contribute to the definition and enhancement of data engineering and DataOps disciplines within the practice.
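As one small illustration of that lifecycle discipline (version control, coding standards, unit tests), a pytest-style unit test for the hypothetical cleanse() step sketched above might look like the following; the module name pipeline is an assumption:

import pandas as pd

from pipeline import cleanse  # assumes the sketch above lives in pipeline.py

def test_cleanse_standardises_customer_ids():
    # Whitespace and casing differences should be normalised away.
    raw = pd.DataFrame({
        "customer_id": [" abc1 ", "xyz9"],
        "order_date": pd.to_datetime(["2020-01-01", "2020-01-02"]),
        "amount": [10.0, 25.0],
    })
    out = cleanse(raw)
    assert set(out["customer_id"]) == {"ABC1", "XYZ9"}

def test_cleanse_drops_exact_duplicates():
    # Two identical rows should collapse to one.
    raw = pd.DataFrame({
        "customer_id": ["A", "A"],
        "order_date": pd.to_datetime(["2020-01-01", "2020-01-01"]),
        "amount": [1.0, 1.0],
    })
    assert len(cleanse(raw)) == 1

Tests like these run under any of the CI/CD tools mentioned below, so every pipeline change is shaken out automatically before deployment.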
You bring to the role
  • A proven ability to undertake the responsibilities and requirements of the role, as listed above.
  • Excellent interpersonal, oral and written communication skills, with a knack for distilling complex and/or technical information for novice audiences.
  • A proven ability to develop and manage enduring client relationships, engendering a sense of trust and respect.
  • Demonstrable industry knowledge: understanding how your primary industry functions and how data can be collected, analysed and utilised, while staying flexible in the face of cloud and data industry developments. Experience in financial services, telecommunications or retail is not mandatory but highly regarded.
  • A disciplined approach to problem solving and an ability to critically assess a range of information to differentiate true business needs from user requests.
  • Experience with a range of technical skills that could include:
  • Knowledge of architecting and engineering cloud-based data solutions using products such as AWS Redshift, RDS, S3, EC2, Lambda, EMR, Glue, DynamoDB, Athena and Kinesis (or their equivalents in Azure or Google Cloud Platform), as well as Databricks and Snowflake, with a particular focus on serverless and cloud-native solutions
  • Big Data technologies such as Hadoop, Spark Streaming, Flink, Hudi, Storm, NiFi, HBase, Hive, Zeppelin, Kafka, Ranger and Ambari.
  • Programming languages such as Java, Node, Go, Python, Scala, SAS, R.
  • Experience with ETL tools and/or code-based data pipelines
  • Experience with DevOps principles and tools, including:
  • Agile enterprise development environments, CI/CD implementation, continuous testing, cloud resource management (CloudFormation, Terraform, Azure ARM, etc.), automation of environment deployment and automated shakeout testing.
  • Continuous Integration/Delivery tools such as Jenkins, the AWS Code* services, Azure DevOps, Bamboo, Cloud Build, Spinnaker, SonarQube, uDeploy or similar
  • Container and deployment automation tools such as OpenShift, Kubernetes and Docker.
  • Version control for data, low-level hardware and software configurations, and the code and configuration specific to each tool in the chain.
  • A proven ability to:
  • Build resilient, tested data pipelines with statistical data quality monitoring embedded (DataOps)
  • Extract knowledge, or insight, from structured and unstructured data.
  • Work with an existing lifecycle management framework to collect metadata, follow coding standards, use version control, complete documentation and write and execute unit tests.
  • Determine the appropriate approach (data collection methods, sampling methods, sample sizes and data processing pipelines) to formulate, execute and analyse a sound and reproducible experiment, including the ability to recognise and construct a closed-loop feedback system.
  • Learn patterns and extract answers from data using algorithms that can build a model based on input data without being explicitly programmed to do so.
  • Apply techniques of statistical inference to test hypotheses and derive estimates of population statistics from sample data (see the sketch at the end of this posting).
  • Appropriately communicate discovered information to consumers, making clear use of visual variables such as shape, colour, hue and orientation.
  • Experience with SQL-based technologies (eg PostgreSQL and MySQL) and NoSQL technologies (eg Cassandra and MongoDB)
  • Data warehousing solutions and architectures
  • Data modelling tools (eg ERWin, Enterprise Architect and Visio)
  • High-level understanding of statistical analysis and modelling, predictive analytics, text analytics and other machine learning applications
  • A sound understanding of digital and cognitive technologies and analytics.
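As a small, self-contained illustration of the statistical inference skill listed above, the sketch below runs a two-sample hypothesis test with SciPy. The scenario (comparing order values between a control and a variant group) and all numbers are synthetic, hypothetical examples, not a client case:

import numpy as np
from scipy import stats

# Synthetic data: order values for a control group and a variant group.
rng = np.random.default_rng(seed=42)
control = rng.normal(loc=100.0, scale=15.0, size=500)
variant = rng.normal(loc=103.0, scale=15.0, size=500)

# Welch's t-test: does the variant mean differ from the control mean?
t_stat, p_value = stats.ttest_ind(variant, control, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# Conventional 5% significance level; in practice this would be agreed up front.
if p_value < 0.05:
    print("Reject H0: the difference in means is statistically significant.")
else:
    print("Fail to reject H0: no significant difference detected.")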