Government Pensions Administration Agency (GPAA)
The Government Pensions Administration Agency (GPAA) provides administrative services to the Government Employees Pension Fund (GEPF) and National Treasury. The provision of these services is regulated by Service Level Agreements (SLAs).
The funds and schemes that are currently administered by the GPAA are as follows:
- The GEPF in terms of the Government Employees Pension (GEP) Law of 1996 on behalf of the GEPF's Board of Trustees.
- The TEPF in terms of the Temporary Employees Pension Fund (TEPF) Act 75 of 1979 on behalf of National Treasury's Programme 7.
- The AIPF in terms of the Associated Institutions Pension Fund (AIPF) Act 41 of 1963 on behalf of National Treasury's Programme 7.
- Post-Retirement Medical Subsidies as provided for and regulated by PSCBC resolutions on behalf of National Treasury's Programme 7.
- Military Pensions in terms of the Military Pensions Act 84 of 1976 on behalf of National Treasury's Programme 7.
- Injury on duty payments in terms of the Compensation for Occupational Injuries and Diseases Act 130 of 1993 on behalf of National Treasury's Programme 7.
- Special Pensions in terms of the Special Pensions Act 69 of 1996 on behalf of National Treasury's Programme 7.
- Other benefits payable from National Treasury's Programme 7.
In its efforts to streamline its business processes and provide a more effective defined-benefit pension administration process, the Government Pensions Administration Agency (GPAA) has embarked on a Modernisation Programme.
GPAA Modernisation Programme
The core deliverables of this programme are Data Migration from legacy systems and Master Data Management aligned to an Enterprise Data Model.
The solution will provide for the monitoring and ownership of data quality within the organisation as aligned to key performance indicators and governance frameworks.
The objective of the GPAA Modernisation Programme EDM deliverable is to establish and operationalise the management of information during its entire life cycle, including data creation, processing, consumption, storage and archiving.
The scope of the project includes the following services:
- Profiling of data in order to classify information and capture data quality rules for the identified data sets
- Identification and prioritisation of data sets to be cleaned
- Creation of data quality reports, data cleansing mechanisms and remedial plans
- Data cleansing and enrichment execution
- Migration of information from legacy systems
- Implementation of Master Data Management to provide a single view of information and to give the GPAA data maintenance and monitoring capabilities
- Automation of ETL processes for information from external sources
- Creation of an Enterprise Data Model with a common taxonomy for business and data rules
- Definition of a strategy to sustain, embed and operationalise data maintenance within the GPAA
- Definition of a data and information architecture that delivers operational, tactical and strategic quality information for analysis and decision making
- Exposure and dissemination of migrated information to operational, strategic and corporate domains
The following strategic benefits will be delivered:
- To provide a single view of the customer
- To pay the right person, the correct amount, at the right time
- To administer funds at an economically acceptable cost per member
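The profiling activity listed above, classifying information and capturing data quality rules per data set, can be sketched as follows. This is a minimal illustration; the field names, records and rules are hypothetical and not drawn from the GPAA's actual data sets.

```python
import re

# Hypothetical member records from a legacy extract (field names are assumptions)
records = [
    {"member_id": "M001", "id_number": "7501015009087", "surname": "Dlamini"},
    {"member_id": "M002", "id_number": "", "surname": "van Wyk"},
    {"member_id": "M003", "id_number": "80010150090", "surname": ""},
]

# Data quality rules captured during profiling: field -> validity predicate
rules = {
    "id_number": lambda v: bool(re.fullmatch(r"\d{13}", v)),  # SA ID numbers are 13 digits
    "surname": lambda v: bool(v.strip()),
}

def profile(records, rules):
    """Count rule violations per field; feeds the prioritisation of data sets for cleansing."""
    report = {field: 0 for field in rules}
    for rec in records:
        for field, is_valid in rules.items():
            if not is_valid(rec.get(field, "")):
                report[field] += 1
    return report

print(profile(records, rules))  # {'id_number': 2, 'surname': 1}
```

A real profiling run would also capture distinct-value counts, patterns and cross-field rules, but the shape is the same: rules captured per data set, then counted against the data.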
The requirement is for a solution that enables the GPAA to manage and mature an enterprise data management competency or practice. This includes installation, implementation, configuration and on-going maintenance and support.
At a high level the requirements can be split into the following Functional and Architectural requirements.
The migration of information from the current systems will include data analysis, profiling, classification, cleansing and transformation. The current information producers are a combination of internal and external sources. Internally, the GPAA stores information in Oracle databases, as well as on a mainframe running the z/OS operating system, with Natural programs connecting to an Adabas database (version 8.2.6).
- Infomet shall provide a migration strategy and approach
- Data should be cleansed according to the GPAA administrative rules, statutes, policies and business rules
- On-going data audits shall be conducted against all identified data sources during the cleansing process
- A data analysis report shall be provided from the data audit (e.g. description of the problem, data source, number of occurrences, impact on production data, type of fix applied, number of records fixed, and number of records that could not be fixed)
- Reconciliation reports shall be provided to verify counts and totals between source and target
- Infomet shall retain a history (audit trail), before and after, of all data elements changed through cleansing
- Infomet shall provide the testing process for the migration, including the test plan, test cases, test scenarios and expected results
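The reconciliation reports required above, verifying counts and totals between source and target, reduce to a comparison of the following kind. The field name benefit_amount and the sample rows are assumptions for illustration only.

```python
def reconcile(source_rows, target_rows, amount_field="benefit_amount"):
    """Compare record counts and amount totals between source and target after migration."""
    src_count, tgt_count = len(source_rows), len(target_rows)
    src_total = sum(row[amount_field] for row in source_rows)
    tgt_total = sum(row[amount_field] for row in target_rows)
    return {
        "count_match": src_count == tgt_count,
        "total_match": src_total == tgt_total,
        "source": (src_count, src_total),
        "target": (tgt_count, tgt_total),
    }

# Sample rows; a real run would stream from the legacy extract and the target database
source = [{"benefit_amount": 1000.0}, {"benefit_amount": 2500.0}]
target = [{"benefit_amount": 1000.0}, {"benefit_amount": 2500.0}]
print(reconcile(source, target))
```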
Master Data Management
Business-critical data is fragmented across various departments, functions, processes and systems. To enable data ownership and stewardship, the GPAA requires all critical data entities in a central repository from which a single, 360-degree view of the data is created, so that the business owner does not have to move from one system to another to verify the accuracy, completeness, conformity and integrity of the data.
- The solution should be able to manage multiple business entities.
- Support best practice data governance policies and processes including control and auditing capabilities.
- Integrate to the GPAA security and reporting tools to provide fine-grained access to data and reliable data quality metrics.
- Provide a workflow capability, including the creation and authorisation of reference data and data definitions, and automatic notification of data quality issues.
- Automatically generate changes to data services whenever the data model is updated with new attributes, entities or sources.
- Support a combination of matching techniques (deterministic, probabilistic, heuristic, phonetic, linguistic, empirical, etc.), with each able to address a particular class of data matching.
- Automatically create a golden record for any data entity and provide robust unmerge functionality to roll back any manual errors or exceptions.
- The history of all changes to data and the lineage of how the data has changed needs to be captured as metadata.
- Reference data should be managed and maintained in a central repository to enable reuse across processes and applications.
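As an illustration of combining matching techniques, the sketch below pairs a deterministic rule (exact ID match) with a simplified phonetic comparison: a minimal Soundex that, unlike the full algorithm, treats 'h' and 'w' like vowels. The field names are hypothetical; a production MDM engine would use far more robust matching.

```python
def soundex(name):
    """Minimal Soundex encoding: one example of a phonetic matching technique."""
    codes = {**dict.fromkeys("bfpv", "1"), **dict.fromkeys("cgjkqsxz", "2"),
             **dict.fromkeys("dt", "3"), "l": "4", **dict.fromkeys("mn", "5"), "r": "6"}
    name = name.lower()
    encoded = name[0].upper()
    prev = codes.get(name[0], "")
    for ch in name[1:]:
        code = codes.get(ch, "")
        if code and code != prev:  # skip repeats of the same code
            encoded += code
        prev = code
    return (encoded + "000")[:4]  # pad/truncate to the standard 4 characters

def match(a, b):
    """Deterministic rule first (exact ID), then phonetic surname comparison as a fallback."""
    if a["id_number"] and a["id_number"] == b["id_number"]:
        return "deterministic"
    if soundex(a["surname"]) == soundex(b["surname"]):
        return "phonetic"
    return None

print(soundex("Smith"), soundex("Smyth"))  # S530 S530
```

The point of combining techniques is that each covers a class of matches the others miss: exact identifiers catch clean records, while phonetic codes catch spelling variants such as Smith/Smyth.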
Enterprise Data Model
The enterprise data model is a canonical or standard model of the common data objects or entities, and their key attributes, that are required to run the GPAA. In order to have a central repository (MDM), especially where data is fragmented and spread across various systems, it is important that stakeholders speak a common language. The EDM solution should facilitate this via an Enterprise Data Model around which common business and technical terminology can be created and linked together. The GPAA requires a defined metadata function to manage technical, operational and business metadata. Technical metadata should include legacy platform fields and definitions; operational metadata should include data transfers, movement and mapping; and business metadata encompasses the business glossary and common vocabulary.
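A minimal sketch of such a metadata function, linking the technical, operational and business metadata for a single field, might look as follows. The Adabas field name PN-NR and the mappings are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class MetadataEntry:
    """One glossary entry linking technical, operational and business metadata."""
    technical_name: str   # e.g. a legacy Adabas field (hypothetical name)
    business_term: str    # the agreed business-glossary term
    source_system: str    # operational origin of the field
    target_mapping: str   # where the field lands after migration

glossary = [
    MetadataEntry("PN-NR", "Pension Number", "Adabas/Natural", "member.pension_number"),
]

def lookup(term):
    """Resolve a business term to its technical and operational metadata."""
    return [e for e in glossary if e.business_term == term]

print(lookup("Pension Number")[0].technical_name)  # PN-NR
```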
Data Quality Monitoring
Quality of data has a direct impact on how well the GPAA performs. The GPAA needs to monitor data quality metrics that are linked to key organisational and individual performance indicators. The EDM solution should provide reports and dashboards that reflect
these data quality metrics and known data issues so that stakeholders can perform the required data quality monitoring and react appropriately. Information should be available for trending and forecasting.
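One such metric, completeness (the share of records with a value present), can be computed as in the sketch below. The field names and sample snapshot are assumptions; in practice the fields and thresholds would come from the GPAA's own quality rules, and snapshots would be retained per period for trending.

```python
def completeness(records, fields):
    """Share of records with a non-empty value per field: a typical data quality metric."""
    return {
        f: sum(1 for r in records if str(r.get(f) or "").strip()) / len(records)
        for f in fields
    }

# Hypothetical snapshot; repeated extracts over time would feed trending and forecasting
snapshot = [
    {"surname": "Nkosi", "postal_code": ""},
    {"surname": "Botha", "postal_code": "0181"},
]
print(completeness(snapshot, ["surname", "postal_code"]))  # {'surname': 1.0, 'postal_code': 0.5}
```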
- System stability and availability should adhere to the GPAA requirements and Service Level Agreements
- Solution should support full redundancy.
- Solution should expose sufficient and useful instrumentation to perform system monitoring, debugging and performance tuning.
- Solution should be deployed on-site.
- Solution should be able to handle increases in load without impact on system performance, and provide the ability to be readily scaled.
Authentication and Authorisation
- Integrate into the Oracle Identity and Access Management system for all identity management and role-based access functionality.
- Ensure role-based access management is applied using single sign-on (SSO)
- Solution must provide the ability to communicate and exchange information with other systems.
- Architecture should enable seamless integration of applications within the landscape.
- Applications, as providers, should be decoupled from consumers based on SOA principles to promote the introduction of new applications.
- Real-time / messaging integration should be completed using the ESB.
- Services should encompass business functions. These services should be available for application and process consumption.
- Services should be discoverable to enable business process management.
- Application interfaces and services should be reusable, with a sufficient level of isolation to enable versioning.
Data Architecture Requirements
- Solution must allow integration of OBIEE for MIS
- Bulk data movements should be visible within the Business Activity Monitoring component of the ESB.
- Utilise a data integration monitoring tool to view data transfers across batch and real-time messaging. This is necessary to create a single, consolidated view of data flows across a process.
- Business process management requires real-time data in order to drive process optimisations.
- The architecture should contain a repository of transactions and events that are in flight, completed, cancelled or failed.
- Valid and accurate data sources, such as Excel and Word documents, should be accepted as inputs to the Reporting Architecture.
- Consolidate financial information into a single repository for accurate and timely financial reporting.
- Structured and unstructured content should be integrated to the overarching correspondence and administration processes.
- Data should be available and understandable for discovery.
- Historic information should be available for extraction into other tools for trending and forecasting.
- Architecture should integrate into a centrally maintained user authorisation and authentication component. User access to data should be controlled in that central location and not managed within the reporting solution.
- Information integrity should be maintained, and information should be guarded against improper modification or destruction. This must be underpinned by non-repudiation and authenticity within the architecture.
- Information should remain confidential when required, whether due to member preference or legislation. This implies preserving authorised restrictions on access and disclosure, including means for protecting personal privacy and proprietary information. Information classification should be enabled within the architecture to control user access.
- All access to member information should be tracked and logged for auditing purposes.
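The access-tracking requirement can be illustrated with a simple audit decorator. This is a sketch only: in production the log would be written to a central, tamper-evident store rather than an in-memory list, and the function and user names are hypothetical.

```python
import functools
from datetime import datetime, timezone

audit_log = []  # stand-in for a central, tamper-evident audit store

def audited(fn):
    """Record every access to member information for auditing purposes."""
    @functools.wraps(fn)
    def wrapper(user, member_id, *args, **kwargs):
        audit_log.append({
            "user": user,
            "member_id": member_id,
            "action": fn.__name__,
            "at": datetime.now(timezone.utc).isoformat(),
        })
        return fn(user, member_id, *args, **kwargs)
    return wrapper

@audited
def view_member(user, member_id):
    return {"member_id": member_id}  # stand-in for a real repository lookup

view_member("clerk01", "M001")
print(audit_log[0]["user"], audit_log[0]["action"])  # clerk01 view_member
```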
Background and Motivation
Regulation 28 of the Pension Funds Act was amended with effect from 1 July 2011, with a transitional period provided until 31 December 2011. It imposes limits on the investments of retirement funds, and these limits are intended to protect retirement funds against making imprudent investments. Regulation 28 places limits on investments held by retirement funds (retirement annuity, pension and provident) by limiting exposure to asset classes as well as to individual issuers, in order to prevent over-exposure of policyholders' assets to investment risks. Stakeholders are required to adhere to these limits and to demonstrate compliance by reporting on these exposures quarterly.
This regulation imposes more onerous provisions in that there are now additional requirements:
- Compliance at member level.
- A 'look through' principle to investments.
- New limits on asset class exposure and exposure to the individual issuer of financial instruments.
- Reporting to the Regulator of any breaches on a quarterly basis.
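A member-level exposure check of the kind Regulation 28 requires can be sketched as below. The limits shown are illustrative only; the actual asset-class and issuer limits must be taken from the regulation itself, and a full check would also apply the look-through principle to underlying holdings.

```python
# Illustrative limits only; real limits must come from Regulation 28 itself
ASSET_CLASS_LIMITS = {"equity": 0.75, "property": 0.25, "commodities": 0.10}

def check_exposures(holdings):
    """Flag asset classes whose share of a member's portfolio exceeds its limit."""
    total = sum(holdings.values())
    breaches = {}
    for asset_class, value in holdings.items():
        limit = ASSET_CLASS_LIMITS.get(asset_class)
        if limit is not None and value / total > limit:
            breaches[asset_class] = round(value / total, 4)
    return breaches

# Hypothetical member portfolio: 80% equity breaches the illustrative 75% limit
member_holdings = {"equity": 80.0, "bonds": 10.0, "property": 10.0}
print(check_exposures(member_holdings))  # {'equity': 0.8}
```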
It is important to note that these changes affected both Liberty Corporate and Retail business.
Summary of Requirements
- Infomet was required to establish a solution for sourcing and provisioning of 'On Balance Sheet' asset data for Liberty's four Life Licences that will enable specific reporting with the appropriate frequency, consistency and
flexibility. The solution must specifically cater for Offshore Asset Data and provide 'Look Through' information where appropriate and available.
- We also needed to establish the ability to perform 'Overlays' between the above-mentioned asset data and the liability information available in the existing Infomet Finance & Risk Information Delivery Solution (FRID) at Liberty.
- It was imperative to our client that the information included in FRID as a base for the above is reconcilable between the source system, FRID and the SAP General Ledger for each relevant source system.
Deliverables to Date
In delivering against the requirements described above, the following has been achieved:
- Sourcing investment and investment-related data at Investment Portfolio, Counterparty and Financial Instrument levels from a wide variety of sources, including:
  - Transactional sources (x6)
  - Descriptive market and instrument structure sources (x7)
- Using Rule Sets to ensure proper standardisation and classification of instruments and counterparties without losing traceability to the original source information.
- Successfully utilising the Infomet GBM's Stakeholder and Item Variant Structures to identify, manage and even eliminate duplications.
- Providing users with a comprehensive 'look-through' capability for collective investment instruments, as required by regulators.
- Presenting Investment Management Information to management and regulators in a variety of ways, including:
  - Investment Register
  - Investment Library
  - Credit Risk Information
  - Regulatory Reports (including Solvency and PFA Regulation 28)
A wide variety of end-to-end Investment Management & Administration solutions exist that help their users operate the traditional investment management life cycle effectively, including Strategy & Planning, Trading, Monitoring, Investment Accounting and Reporting. Probably the most challenging aspect of this industry is the availability of accurate, consistent and relevant data. A variety of providers have therefore also established investment reference data solutions to address this issue. Due to the rapidly changing environment, volumes and complexities, some organisations further provide end-to-end back office services to clients to reduce their own exposure to the technology and information challenges experienced in the industry. In order to manage their operations effectively and to provide the information required internally, and externally to their 'clients' for regulatory reporting, investment managers typically have to rely on a combination of these different types of solutions.
Based on Infomet's experiences and IMI delivery at Liberty Life, discussions and cooperation with Deloitte (South Africa), we came to the conclusion that the need exists for an Investment-in-a-Box Solution that does not aim to replace the existing solution
combinations, but to rather augment them in areas where they are less than effective. The following diagram indicates the specific areas that warranted focus in our opinion -
What does IBX do?
The IBX Solution takes data from a variety of sources (industry-wide or specific internal) using a set of preconfigured ETLs (sourcing protocols). This does not alleviate the raw cost of such data (e.g. Bloomberg) but does provide the ability to extract
the best possible value from such cost.
Our solution includes a stakeholder variant and item variant module that identifies probable and possible data duplications and inconsistencies, and assists the user to resolve these, retaining such resolution rules going forward. The output of this module is a 'Master' Counterparty or Instrument record, constituted of the most reliable information for that item from the entire combination of available data sources (per attribute, per source ranking).
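The per-attribute, per-source-ranking construction of such a 'Master' record can be sketched as follows. The source names, rankings and sample records are invented for illustration; the real module derives its rankings from the resolution rules retained by users.

```python
# Hypothetical per-attribute source ranking (earlier = more reliable for that attribute)
SOURCE_RANKING = {
    "name": ["internal_register", "bloomberg", "custodian"],
    "sector": ["bloomberg", "custodian", "internal_register"],
}

def build_master(records_by_source):
    """Constitute a 'Master' record attribute by attribute from the highest-ranked source with a value."""
    master = {}
    for attr, ranking in SOURCE_RANKING.items():
        for source in ranking:
            value = records_by_source.get(source, {}).get(attr)
            if value:
                master[attr] = value
                break  # highest-ranked non-empty value wins for this attribute
    return master

records = {
    "bloomberg": {"name": "ABC Holdings Ltd", "sector": "Financials"},
    "custodian": {"name": "ABC Holdings", "sector": ""},
    "internal_register": {"name": "", "sector": "Banking"},
}
print(build_master(records))  # {'name': 'ABC Holdings Ltd', 'sector': 'Financials'}
```

Note that the master record mixes sources: the name survives from one source and the sector from another, which is what per-attribute survivorship means.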
The solution also contains a generic financial instrument classification module that allows the user to classify financial instruments appropriately and consistently without losing the 'classification' viewpoints taken by the original source of the data.
The intention is for this to be done either internally or with ongoing support from Deloitte and Infomet subject matter experts. Where instrument classification is absent or inaccurate in the source data, the solution uses
a set of 5,000 generic and specific classification rules to 'suggest' a standard classification, which is then accepted or further specified by the user - again retaining the resolution 'rule' for consistent future use.
It should be noted that the system has the ability to record and maintain different subject matter opinions and interpretations for future reference purposes.
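The suggest-accept-retain cycle described above can be sketched as follows, with two stand-in rules in place of the 5,000-rule library; the instrument identifiers, descriptions and classifications are hypothetical.

```python
# Two stand-in rules in place of the full classification rule library
generic_rules = [
    (lambda desc: "bond" in desc.lower(), "Fixed Income"),
    (lambda desc: "equity" in desc.lower() or "share" in desc.lower(), "Equity"),
]
resolution_rules = {}  # user decisions retained for consistent future use

def classify(instrument_id, description):
    """Return a retained user resolution if one exists, otherwise suggest from generic rules."""
    if instrument_id in resolution_rules:
        return resolution_rules[instrument_id]
    for predicate, classification in generic_rules:
        if predicate(description):
            return classification
    return "Unclassified"

def accept(instrument_id, classification):
    """The user accepts or further specifies a suggestion; the decision is retained as a rule."""
    resolution_rules[instrument_id] = classification

print(classify("ZA001", "Government Bond 2032"))  # Fixed Income
print(classify("ZA002", "Convertible note"))      # Unclassified
accept("ZA002", "Hybrid Instruments")             # user specifies the classification
print(classify("ZA002", "Convertible note"))      # Hybrid Instruments
```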
Various role players have different views and interpretations of the same information. From a compliance perspective, this becomes very specific and often confusing. To address this, any solution must provide the ability to view the same data from a variety of viewpoints, and the flexibility to manage changes in these prescriptions with ease.
The system allows the user to specify several benchmarks (e.g. Internal, Regulation 28, Solvency) against which information can be viewed and evaluated. These can be maintained on a 'centralised' basis by Deloitte and Infomet subject matter experts.
The business user merely selects the required viewpoint and the system applies the relevant parameters of that regulatory or internal benchmark view (e.g. post-trade compliant or not). Through the ability to import periodic (hourly, daily or otherwise) investment plans/orders, compliance can also be monitored on a pre-trade basis.
Similarly, the solution ships with the required transformation rules and mappings to represent the information in the most common regulatory output formats, again with the ability to have these managed on a centralised basis by our experts.
The generic business model architecture of the solution also ensures that different versions of these outputs (e.g. with / without rule changes) can be maintained with ease.
IBX therefore does not attempt to replace the back office, trading or accounting functionality of the traditional investment management systems, nor does it represent a 'golden' source of investment reference data. It serves as a tool to manage the standardisation,
classification and further enrichment of investment data from these sources, whilst providing a host of expert-maintained benchmark 'overlays' and regulatory output frameworks to the business.