JobsGalore.eu
Data Consultant (Data Modeller / Analyst)

VISA STATUS AND CLIENT EXPERIENCE

-   Permanent Resident of Australia (visa subclass 189, Skilled Independent)

-   Client site Experience at:

  • Australian Energy Market Operator (AEMO) as a Lead Data Modeller since May 2018
  • Parks Victoria, Melbourne as a Senior Data Modeller from Dec 2017 to May 2018
  • University of Queensland, Australia as a Lead Data Modeller from Oct 2017 to Dec 2017
  • Johnson & Johnson, Australia as a Business Data Analyst from Oct 2016 to Mar 2017
  • American Express as a Data Architect from Mar 2011 to Sep 2016
  • Sony Pictures, US as a Data Analyst from Sep 2008 to Mar 2011
  • British Telecom, UK as a Solution Designer from Sep 2006 to Sep 2008

Further experience details are available on request.

PROFESSIONAL SUMMARY  

  • A senior consultant with 15+ years of IT experience spanning Data Modelling, Business Analysis, Data Analytics, Data Visualisation, Data Warehousing, Business Intelligence, Solution Design, Business Modelling, Data Mining and Data Profiling
  • Worked closely with functional teams to review their data requirements and assisted in determining the most effective data access methods, thereby implementing data architecture standards
  • Prepared the road map across Data Architecture, MDM, IM, Business Intelligence and Data Strategies on various computing and integration platforms including Cloud Computing and Platform as a service (PaaS)
  • Developed standards for Data Governance following DAMA's DMBOK and APRA's management principles
  • Communicated client’s business requirements to other teams by constructing easy-to-understand data and process models
  • Demonstrated understanding and successful application of proven database design principles such as: Data Governance, Data integrity, Data quality, Data cleansing, Data remediation, Data normalisation (3NF)
  • Experience in designing relational, dimensional and data vault models, deploying and supporting complex data warehouse / data marts and implementing Data Vault 2.0 methodology
  • Designed the Conceptual Data Model (CDM), logical data model (LDM) and physical data model (PDM) for a product or a service ensuring they conform to Industry standards
  • Hands-on design & development with Microsoft business intelligence using Power BI & implementing Data Warehouse in SQL Azure
  • Exposure to ETL development, implementation, support and testing of data warehousing applications using Extraction, Transformation and Loading (ETL) tools
  • Experience in Business Objects Reporting making sure that the business rules are implemented correctly
  • Addressing issues of Data Migration (Data Validation, Clean-up and Data Mapping)
  • Effectively delivered timely and consistent deliverables and artefacts using waterfall and agile methodology
  • Strong all-round database experience with good SQL & PL/SQL knowledge
  • Strong in analytical problem-solving ability with excellent communication skills

QUALIFICATIONS

  • Master of Information Technology (MIT, 3 years) from Guru Nanak Dev University, Amritsar, with 68%
  • Bachelor of Computer Science (B.Sc.) from Guru Nanak Dev University, Amritsar, with 68%

SKILL-SET

Operating Systems: 

Windows, Unix, Macintosh, Microsoft Azure

Database Systems:

Oracle, SQL Server, MySQL, MS-Access, Microsoft Azure SQL Database

Tools:

ER Studio, Erwin Data Modeler, Business Objects, Informatica, Talend, Oracle Designer, Power Designer, IBM Information Analyzer, IBM RSA (Rational Software Architect), IBM InfoSphere, Apache Hadoop, Microsoft BI stack, Microsoft Visio, SAS, UML, XML, Toad, PL/SQL Developer, SQL Server, Microsoft Visual Studio 2014, Analysis Services, Azure SQL Data Warehouse, Power BI

Domain Expertise:

Banking & Financial Services (BFS), Telecom, Media, Health, Education

PROJECT DETAILS

# Australian Energy Market Operator (AEMO), Melbourne

May 2018 – present

Working as a Lead Data Modeller for the Forecasting group, assisting in the development of demand/supply forecasts and helping AEMO rationalise its existing data structures and implement data governance processes.

Role & Responsibilities:

  • Leading the design of conceptual, logical and physical data models for supply / demand related data for both Gas & Electricity.
  • Analysing data from various internal and external sources and rationalising it to come up with a common data model
  • Helping in developing and implementing end-to-end data collection, consolidation and visualisation solutions
  • Identifying how data will flow through the successive stages involved, thus ensuring data quality
  • Engage with customers and stakeholders to understand and document business requirements, and propose solutions/designs to create enterprise products
  • Undertaking complex analysis to assist in the identification of trends and potential improvement options
  • Ensure consistency and integrity of master data elements which exist in multiple source systems.
  • Prepare product module blueprints/models that demonstrate how product and service solutions can meet requirements
  • Developing Data Mapping and Transformation document for source to target mapping
  • Working closely with ETL team to develop business views and ETL framework / processes
  • Documenting metadata in Data Dictionary to assist in maintaining Data Glossary and Catalogue.
  • Defined object naming conventions and ensured they are implemented across the organisation
  • Working as a liaison between technology and clients to improve processes and ensure their adoption


# Parks Victoria, Melbourne

Dec 2017 – May 2018

Role & Responsibilities:

  • Analysed various data modelling tools and proposed the best fit (between Erwin Data Modeler and ER Studio) for Parks Victoria's custom requirements
  • Ownership of data structures for multiple ongoing projects:

                  ◦   Park Connect (Commercial Agreements, Volunteer Management, Stakeholder Management, Scientific Research, Educational Activities Management)

                  ◦   Asset Information Management System

                  ◦   Digital (Park Web)

                  ◦   Business Intelligence (across PV projects)

  • Establish and maintain Data Dictionary (metadata) across the organisation.
  • Interacting with business and stakeholders to understand business requirements and translating business requirements into conceptual, logical, and physical data models by implementing Data Standardisation and Data Normalisation (3 NF) 
  • Modelled the already developed modules of Park Connect, identified discrepancies in the data structure and proposed changes to ensure data quality, readiness and consistency
  • Prepared the conceptual model for the proposed Parks Victoria Tourism website and obtained approval from the business and all stakeholders
  • Developing the detailed logical model for ParksWeb, which will act as a base for the integration model for API design, development and testing
  • Reviewing the JSON files to ensure synchronisation of data model with integration model
  • Single point of contact for any newly proposed DB changes, analysing the impact of each change before giving the go-ahead
  • Identify and record systems of record for data entities
  • Analysed data from various web analytics services, including Google Analytics, Twitter and Facebook, produced a rationalised structure and delivered performance reporting
  • Implemented data mining, wrangling, mapping and visualisation techniques using Power BI
  • Undertook statistical analysis and delivered predictive modelling/analysis
  • Worked with stakeholders to identify technology and service gaps that need to be addressed to support on-going and future customer requirements
  • Implemented product designs & models addressing customer requirements that drive simplicity, service and product module reuse
  • Developed the dimensional model for Park Connect & Park Web and deployed it to Azure SQL Data Warehouse

# University of Queensland, Brisbane

Oct 2017 – Dec 2017

Role & Responsibilities:

  • Review the CAUDIT Enterprise Architecture Reference Models and Catalogues to broadly understand “university business processes and data”
  • Establish initial Data Dictionary (metadata) requirements and initial metadata collection toolset
  • Identify information domains and the major data entities within those domains that support the business of the university, and
  • Model the relationships between the entities, including relationships across information domains by implementing Industry Data Standards and Data Normalisation (3 NF)
  • For each information domain, identify more fine-grained data entities that are critical to the domain and the business of the university
  • Model the relationships between those domain entities.
  • Identify and record systems of record for data entities
  • Performed data mining and reporting using SAP Business Objects

# Johnson & Johnson, Sydney

Oct 2016 – Mar 2017

Worked as a Business Data Analyst for Pacific DnA and primary responsibilities included:

  • Attended onsite discovery sessions and communicated findings effectively to help the technical team finalise design steps for complex requirements
  • Implemented Master Data Management Solution to ensure the management and integration of master data using IBM Infosphere
  • Assisted in developing business capability model and solution architecture from data perspective.
  • Worked as a liaison between technical teams and the clients to improve processes and support critical strategies by using active listening, questioning and consulting techniques. 
  • Rationalised the existing data model using Erwin Data Modeler, identifying the conformed dimensions & facts for the source systems
  • Developed data mapping documents for the ETL process, taking input from various sources: SSIS packages, SSRS, existing documents, stored procedures, and a rationalised list of fact & dimension tables
  • Managed a Data Lake on Amazon Web Services ensuring flexibility, agility and security
  • Review Data Ingestion process/patterns & Data Transformation process

# American Express

Apr 2011 – Sep 2016

Worked as a Senior Data Architect on the core EDMS (Enterprise Data Management System) team, the group that develops and maintains databases for AMEX's software application systems. I owned the architecture of various AMEX applications:

  • FXIP (Foreign Exchange International Payments)
  • INXS (International Exchange System) – Settlement and Forecasting
  • PPMIS (Partner Payment Management Information System)
  • CRM (Customer Relationship Management)
  • STAR (Standard Technology Application Rewards)
  • TRAVEL & HOTEL

Responsibilities:

  • Worked as a Data Architect to gather business, data and reporting requirements and translate business needs into logical data structures
  • Identify how a logical design will translate into one or more physical databases, and how the data will flow through the successive stages involved
  • Develop and deliver the logical data model and first-cut physical data model for the project using ER Studio, ensuring they conform to Industry standards
  • Aligned the processes with Financial Services Industry Data models
  • Control data model changes through active involvement throughout the project lifecycle (using waterfall or Agile/Scrum methodology), as well as reviewing and approving data designs
  • Led a team of 6 Data Analysts and Data Modellers; prepared project estimates and project plans for various BI initiatives and solutions for AMEX
  • Designed Data Mart using Dimensional Modelling by identifying Facts and Dimensions
  • Working knowledge of Data Vault Modelling by creating Hubs, Links and Satellites (following Data Vault 2.0 Methodology)
  • Implemented Product Information Management (PIM) & maintained models and their related information in product/service catalogue systems
  • Ensuring architectural alignment of the designs and development done by the various technical teams
  • Involved in requirement analysis, design, development and testing of various sources to be imported to adaptive metadata repository & IBM IGC repository
  • Ensure consistency and integrity of master data elements which exist in multiple information systems or documents.
  • Managing large datasets of structured and unstructured data using HDFS (the Hadoop Distributed File System) and MapReduce to analyse big data
  • Queried NoSQL Databases using Apache Cassandra
  • Created a data model addressing business needs for Master Data Management, streamlining data processes and ensuring integration of data across multiple systems
  • Worked with the ETL team to implement solutions using Informatica PowerCenter, Informatica Data Quality (IDQ), Big Data Management (BDM), Enterprise Data Catalog, Data Explorer, and Metadata Manager
  • Review detailed system design documents, data model library and any other supporting documentation
  • Ensuring regulatory and compliance requirements are being met.
  • Support metadata management and data quality management
  • Work with the DBAs to ensure optimum data performance 

# Sony Pictures Entertainment, USA

Sep 2008 – Mar 2011

Part of SPE's PEAQ (Planning Enterprise Architecture Quality) team. Worked for various Sony Pictures lines of business on applications such as SPT, B2B Portal, dtv_security, GPMS, SPIRIT, Ventana, intsales, etc.

Responsibilities:

  • Worked as a Data Analyst to gather business, data and reporting requirements and translate business needs into logical data model using Erwin Data Modeler
  • Worked as a liaison between technology and the clients to improve processes and support critical strategies by using active listening, questioning and consulting techniques. 
  • Reverse-engineered the existing database schema using Erwin Data Modeler, segregated it into subject areas, identified issues with the existing schema and provided recommendations for uplifting the existing structure
  • Created the Conceptual Data Model with high-level entities after requirement analysis
  • Defining and maintaining Data Dictionary
  • Data profiling and test data identification
  • Reviewed Functional Requirements document and Prepared Report Specifications document
  • Performed data mining and reporting using SAP Business Objects
  • Designed/reviewed Universe designs and developed/tested/reviewed BO reports
  • Provide UAT support, user training and Hyper-care for reporting and transition to support team

# TSR Migration – British Telecom, UK

Sep 2006 – Sep 2008

The scope of this telecom project was to migrate customers and products from the old legacy stack (specifically CSS) onto the new target stack, while ensuring compliance with the BT Undertakings.

A new billing platform was being developed, comprising BAL, Antillia (a Geneva instance) and interfaces to Antillia such as CFB, Adder and Call Record Mediation. A Migration Controller was used to migrate data from the existing CSS system to this new billing platform.

Responsibilities:

  • Solution Designer / Data Analyst responsible for analysing the existing system, gathering requirements from the client and developing the logical data model for the project using Erwin Data Modeler, ensuring it conformed to telecom industry standards
  • Control data model changes through active involvement throughout the project lifecycle, as well as reviewing and approving data designs
  • Designed and developed UML Use Case Diagrams, Class Diagrams, Activity Diagrams, Sequence Diagrams, Data Flow Diagrams and Collaboration diagrams using MS Visio
  • Mentored an offshore group of 8 Tech M (another BT vendor) resources
  • Responsible for Service Oriented Analysis
  • Defining Data Dictionary and Product mapping files required for migration controller.


Anonymous

Summary

Posted: 30 January 2019

Location: Melbourne, VIC

Professional area:

  • IT

