Introduction to the Digital Appraisal, Selection and Sensitivity Review Maturity Assessment Tool

Distinguishing between records worthy of permanent preservation and those without further value is a significant challenge, especially when it comes to the vast amounts of digital records created by public bodies. It is widely accepted that the appraisal, selection, and sensitivity review of digital records will only be feasible with machine assistance. Without such assistance, transfers of records to the archive will be delayed, which undermines the archive’s ability to support transparency and openness and to help ensure the government is accountable for its actions by providing public access to public records.

Building the capability of digital appraisal and sensitivity review for legacy and current records will require investment in skills, infrastructure, and cross-departmental collaboration. Most departments’ Knowledge and Information Management (KIM) functions reported a lack of digital skills, no influence to obtain the necessary resources for investment in this area, and a lack of engagement with the Government Digital and Data (GDD) profession in finding solutions.

As part of the Information Management Assessment (IMA) programme, The National Archives has developed a Digital Appraisal, Selection and Sensitivity Review (ASSR) Maturity Assessment Tool. The National Archives has worked with the Government KIM community to identify high-level requirements and define what good looks like in elements such as planning, obtaining resources, skills, technology and senior buy-in. The aim of the Digital ASSR Maturity Assessment Tool is to provide targeted governance next steps, rather than to prescribe technical implementations or specific methods.

The Digital ASSR Maturity Assessment Tool is available as an IMA module to assess departments’ current state and progress in building capabilities. The National Archives will provide a report with recommendations to raise awareness and help Departmental Record Officers (DROs) obtain the necessary resources.

The Digital ASSR Maturity Assessment Tool: Scope

The introduction to The National Archives’ Section 46 Code Maturity Assessment Tool states that “public sector bodies, by completing the Self-Assessment, will be able to measure the strengths and weaknesses of their Knowledge and Information Management (KIM) practices, set improvement targets and monitor future progress.”

This applies equally to the Digital ASSR Maturity Assessment Tool which aims to be:

  • A measure of maturity to support conversations and bids for funding and resources to carry out the work required under the Public Records Act and the Freedom of Information Act.
  • A prompt for next actions in organisations. Taken together, the questions whose scores require most improvement give a steer for next steps; they will carry differing priorities and demand differing resources and capabilities to improve upon.
  • A means for The National Archives to understand cross-Government challenges for appraisal and selection and how we can support future development in this area.

In terms of scope, the Digital ASSR Maturity Assessment tool does not include questions on destruction, as these are covered sufficiently in the S46 Maturity Assessment tool. Likewise, it does not contain questions on transfer. It does, however, include questions on cataloguing and the documentation of records to be prepared for transfer.

We considered whether some Digital ASSR Maturity Assessment tool questions should be subdivided into records collections, as some are in the S46 Maturity Assessment tool, but decided against this, at least for the first versions. When answering the two new tools, organisations should therefore consider how they score across all collections, or across the main records collections that undergo appraisal and selection and sensitivity review.

During the development of the Digital ASSR Maturity Assessment tool, fundamental questions arose about why this tool is for digital ASSR and not for paper too. It has been decided that the new tool covers digital records only, because:

  • Paper records responsibilities and reviews are covered by the S46 Maturity Assessment tool.
  • The challenges relating to A&S and SR of paper records in Government are longstanding (dating back at least to the establishment of the PRA in the 1950s) and have been well reported by Departments each year as part of The National Archives’ IMR process. Digital A&S and SR brings its own unique challenges, especially in terms of scale and technology. It is also the area of Knowledge and Information Management (KIM) funding most neglected in Departments, which is why a specific tool relating to digital A&S and SR is needed.

There is an acknowledgement of the need to consider hybrid record collections. Where organisations hold these, they should be assessed alongside all digital record collections against which A&S and SR are required.

The National Archives does not advocate long-term storage of digital files ready for transfer. We instead advise transferring as soon as records are ready and encourage early transfer where A&S and SR have been carried out ahead of schedule. Clearly though, organisations will need to consider storage solutions for legacy digital records, either:

  • When staged appraisal has been carried out (and records need to be held in storage until resources are available to carry out full A&S and SR at or before the 20-year point), or
  • When the corporate file store solution changes or is upgraded, and decisions are made about where legacy records are stored.

An organisation’s ambitions or targets within both tools will depend on several factors, including size (of both the digital heap and the organisation), complexity, funding and resourcing. A ‘mastering’ level of maturity may not be practical or necessary depending on your organisation’s circumstances. This is particularly true of technical capabilities and use of AI, for instance. An organisation may have the necessary understanding of AI uses in records management but have decided not to use AI because the digital heap is manageable manually by people, or because it cannot be funded and a more cost-effective alternative is in use.

The Digital ASSR Maturity Assessment Tool Criteria: Appraisal and Selection

The expectation is that the maturity statements in the Digital ASSR Maturity Assessment tool are easy to understand, but where it is useful, we have provided further explanations and examples below.

Governance

(1.1.1.) Governance: oversight, policies and methodologies in place and in use – at the mastering level, excellent governance will include senior management engagement and involvement, including actively reviewing plans, resources and The National Archives’ plans and updates, and attending compliance meetings with The National Archives.

(1.1.2.) Governance: content asset and liability balance – as an important part of appraisal and selection is to understand value and context, you should identify liabilities as well as assets. ‘Liabilities’ are digital content that has no ‘continuing, evidential and historical value’ and may, for instance, be identified for destruction after 7 years.

(1.1.3.) Governance: understanding of the size of legacy digital records – at its simplest, this is an analysis of the volume, type, and age profile of your digital heap. This maturity statement is linked to those in the ‘technical capabilities’ section; (1.3.1.) visibility; (1.3.5.) analysis; (1.3.6.) dashboard capability.
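
As an illustration only (not something the tool requires), the following minimal Python sketch shows this kind of volume, type and age analysis. It assumes read access to a file share at a hypothetical path and treats file modification time as a proxy for record age.

```python
"""Minimal sketch: profile a digital heap by volume, type and age.

Assumptions (not from the tool): the heap is a readable file share at
HEAP_ROOT, and file modification time is an acceptable proxy for age.
"""
from collections import Counter
from datetime import datetime
from pathlib import Path

HEAP_ROOT = Path("/mnt/legacy-share")  # hypothetical location

files_by_type = Counter()   # file count per extension
bytes_by_type = Counter()   # volume per extension
files_by_age = Counter()    # file count per five-year age band

now = datetime.now()
for f in HEAP_ROOT.rglob("*"):
    if not f.is_file():
        continue
    info = f.stat()
    ext = f.suffix.lower() or "(no extension)"
    files_by_type[ext] += 1
    bytes_by_type[ext] += info.st_size
    age_years = (now - datetime.fromtimestamp(info.st_mtime)).days // 365
    base = (age_years // 5) * 5
    files_by_age["20+" if age_years >= 20 else f"{base}-{base + 4}"] += 1

for ext, count in files_by_type.most_common(10):
    print(f"{ext:15} {count:8} files {bytes_by_type[ext] / 1e9:8.2f} GB")
print("Age profile (years):", dict(files_by_age))
```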

(1.1.4.) Governance: roles and responsibilities – roles and responsibilities will vary in each organisation. They will often include:

  • The records team itself;
  • Subject matter experts such as Information Asset Owners (IAOs) and Senior Information Risk Owner (SIRO) who will have knowledge of the value and context of records;
  • Senior managers with responsibility and accountability for the whole process. This may ultimately be the accounting officer (the Permanent Secretary or Chief Executive Officer).

You may need to refer to your organisation’s decision-making framework to judge which roles and responsibilities you need to have in place.

(1.1.5.) Governance: risk – your organisation will have its own risk framework, most likely based on the Management of Risk in Government framework. You should seek to escalate records management risks to Director level at least, and ideally to your senior risk board.

Skills

(1.2.1.) Skills: value and context of records – The GKIM Skills Framework should be used as a measure of skills obtained. For appraisal and selection, please use Records Management job role skills and the RM10 ‘permanent preservation’ skill in particular.

(1.2.2.) Skills: records collection policy and guidance – The National Archives’ Records Collection Policy and underlying guidance should be reflected in your own organisation’s policies and methodologies.

(1.2.3.) Skills: knowledge transfer – effective knowledge transfer also implies that you have the resources and time to do it. Where team continuity and succession planning have been hampered by resource shortfalls, you will not be able to score as well.

Knowledge transfer will include understanding the impacts of organisational change and structures over time (20 years), which is a significant challenge where there has been rapid and frequent change.

Technical Capability

There is an appreciation that some organisations operate on shared technical platforms and may not have control over how they are used.

Along with the tooling capability required to appraise and select, there is an expectation that the skills to use and interpret the tools are also in place.

(1.3.3.) Technical capabilities: automation and specialist tools – By automation, we mean processes which are run over a set of records or data to achieve some task – for example, identifying all person names in a collection of documents, or filtering by some criteria. Specialist tools are user interfaces which help with some part of the task, beyond a metadata spreadsheet in Excel. For example, something that helps sort or sift documents, or visualise a records collection by topic.
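
As an illustration of the first kind of automation (identifying person names), the following is a minimal sketch using the open-source spaCy library; the library choice, model and documents are assumptions, not tools The National Archives prescribes.

```python
"""Sketch: flag documents containing person names using spaCy NER.

Assumes `pip install spacy` and `python -m spacy download en_core_web_sm`.
The accuracy of results must still be assured (see 1.3.8 below).
"""
import spacy

nlp = spacy.load("en_core_web_sm")

documents = {  # hypothetical document id -> extracted text
    "doc-001": "Minutes of the meeting chaired by Jane Smith on 3 May.",
    "doc-002": "Quarterly storage statistics for the northern region.",
}

for doc_id, text in documents.items():
    people = {ent.text for ent in nlp(text).ents if ent.label_ == "PERSON"}
    if people:
        print(f"{doc_id}: person names found -> {sorted(people)}")
    else:
        print(f"{doc_id}: no person names detected")
```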

(1.3.6.) Technical capabilities: dashboard capability – by dashboard capability, we mean the ability to visualise the data analysed, which could be as simple as graphs produced in Excel, Power BI or AI-produced visualisations. It is implied that, to produce a dashboard, your tooling has data analytics capabilities. This criterion naturally follows from the metadata (1.3.4) and data analysis (1.3.5) criteria, and all three may be met by the same technical solution.
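
For illustration, the age profile produced by the heap-analysis sketch above could be charted with matplotlib; the library and the counts below are assumptions, and Excel or Power BI would be equally valid.

```python
"""Sketch: a simple dashboard-style chart of a digital heap's age profile.

Assumes matplotlib is installed; the counts below are hypothetical and
would normally come from the metadata (1.3.4) and analysis (1.3.5) steps.
"""
import matplotlib.pyplot as plt

age_profile = {"0-4": 120_000, "5-9": 80_000, "10-14": 45_000,
               "15-19": 30_000, "20+": 12_000}  # files per age band

fig, ax = plt.subplots()
ax.bar(list(age_profile), list(age_profile.values()))
ax.set_xlabel("Record age (years)")
ax.set_ylabel("Number of files")
ax.set_title("Digital heap age profile")
fig.savefig("heap_age_profile.png")
```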

(1.3.8.) Technical capabilities: use of AI – ‘No use of AI…’ is not necessarily a poor score. It is more about ‘considered versus unconsidered use of AI’. Where the use of AI has been considered and a documented decision made not to use it in the circumstances of the organisation, that may be a perfectly valid and mature approach.

Automation, algorithm and machine learning capabilities include the following considerations (a minimal sketch follows this list):

  • To what degree might supervised and unsupervised machine learning be used?
  • Is the system assured and the accuracy of results known? This should be a requirement, especially if it is a service delivered by others.
  • Has a level of technical risk been applied to the capability? This should include a statement of risk appetite in using machine learning, as well as an assessment of the volume of content and whether the use of such AI will produce worthwhile results (a simpler technical solution may suit some organisations better).
  • Is there a readiness in the organisation to use machine learning? Can you provide training data?
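
For illustration, the following minimal supervised-learning sketch uses scikit-learn (an assumption; any assured tooling would serve) to show the kind of accuracy measurement the second bullet asks about. The texts and labels are hypothetical.

```python
"""Sketch: supervised classification of records with measured accuracy.

Assumes scikit-learn is installed. The texts and labels are hypothetical
training data; real accuracy would be assessed on a larger, assured set.
"""
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

texts = [
    "Ministerial submission on policy options",
    "Canteen menu for week 12",
    "Board minutes discussing draft legislation",
    "Out-of-office auto-reply",
] * 25  # repeated only to give the sketch enough samples
labels = ["select", "rot", "select", "rot"] * 25  # hypothetical labels

X_train, X_test, y_train, y_test = train_test_split(
    texts, labels, test_size=0.2, random_state=0)

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(X_train, y_train)

# "Is the accuracy of results known?" - measure it on held-out records.
print("Held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```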

You should use the Generative AI Framework for HMG. You may also wish to use the Human-AI interaction design guidelines (annex A) provided by our head of profession.

It is implied in all the lines in this section that records are kept of analytics and outputs from technical processes, which will form part of transparency and accountability expectations.

Capacity

(1.4.2.) Capacity: senior management engagement on budget and resource needs – metrics from the IMR and an understanding of the volume of records for appraisal and selection will help you to understand capacity and resource requirements. Like a ‘technical debt’, this is a way of measuring how much of a ‘PRA debt’ you are building up, especially if funding, resource and technical capability are not optimal.

Digital leaders in your organisation can be engaged in the challenge by highlighting the need to understand the context, content and sensitivity of records in order to exploit knowledge and information through the use of AI and other technologies.

Methodology

(2.1.1.) Methodology and (2.1.2.) Policy – ‘Selection policies’ set out what the department seeks to identify as being of historical value and ‘selection methodologies’ seek to apply those selection policies across multiple systems of the organisation.

For ALBs (arm’s length bodies), seeking senior management support, oversight and governance will most likely require submissions to parent Departments, and a need to follow their policies and approaches.

(2.1.1.) Methodology – The High-Level Appraisal Methodology could include detail of hybrid records where they exist and how they are appraised and selected. Examples of published HLAMs are linked in the introduction above.

(2.1.4.) Value of records – linked to (1.1.2) content asset and liability balance. An assessment of value could be made in an information asset register, at the creation of a record, and as part of the selection process.

Other good practice in Appraisal & Selection methodology includes:

  • Producing a file plan, which sets out the structure of your libraries, folders and files, whether they be in your corporate platform or legacy shared drives.
  • Use of a corporate lexicon or terms store, used in search or to configure tooling, to identify both valuable records for potential permanent preservation and Redundant, Outdated and Trivial (ROT) records (a minimal sketch follows this list).
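
As an illustration of the terms-store idea, the following minimal Python sketch flags documents against hypothetical term lists; a real lexicon would be maintained by KIM professionals and wired into search or specialist tooling.

```python
"""Sketch: triage documents against a corporate lexicon / terms store.

The term lists below are hypothetical; a real lexicon would be maintained
by KIM professionals and used to configure search or specialist tooling.
"""
VALUE_TERMS = {"ministerial", "submission", "board minutes", "legislation"}
ROT_TERMS = {"out of office", "canteen menu", "car parking"}

def triage(text: str) -> str:
    """Very rough first-pass triage; human review is still required."""
    lower = text.lower()
    if any(term in lower for term in VALUE_TERMS):
        return "candidate for permanent preservation"
    if any(term in lower for term in ROT_TERMS):
        return "candidate ROT"
    return "needs review"

print(triage("Board minutes, March 2004"))  # candidate for permanent preservation
print(triage("Re: car parking permits"))    # candidate ROT
```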

(2.2.1.) Appraisal & Selection: staged appraisal – to take appraisal decisions, authorities must:

  • understand the content, context and sensitivity of the records that they hold.
  • review records throughout their life.
  • have a documented process for selecting records for permanent preservation, transfer or destruction. (S46)

The Grigg Report expectation of first appraisal at 5 years remains good practice, even with the move from the 30-year to the 20-year rule on transfer. In practice, the staged appraisal point could be when a retention period (label or tag) is reached, whether that be 1, 5, 7 or 10 years.

Ideally, in the born-digital world, records destined for The National Archives will be tagged as such at creation, or very early in their life, often by virtue of their label or where they are kept in the records management structure. For example, in the Home Office, the business has a choice of two types of SharePoint sites, one of which has a default retention of 15 years on the basis that those records will potentially be of interest to The National Archives and so will be reviewed with that in mind.

The ability to carry out staged appraisal will differ and be dependent on the size and complexity of your organisation’s digital heap, and the tools and resources at your disposal. So ‘no staged appraisal’ may be a valid level of maturity for some.

(2.2.2.) Appraisal and selection – Making decisions at an appropriate level

The National Archives’ Guiding Principles on Appraisal and Selection recommends that records are appraised at the highest appropriate level. It states that, where possible, decisions should be made on groups or categories of information based on the context in which they were created or are being used. This approach reduces the need to review records at a lower or individual item level, which can be time-consuming.

In practice, there may be instances where high-level manual review encounters obstacles due to insufficient context provided by the folder structure. Consequently, reviewers must delve deeper to comprehend the content and its context. This scenario highlights the limitations of scalability in manual reviews, necessitating a data-driven approach for contextual analysis. Such analysis can empower reviewers to make informed decisions at a higher level.

An optimal approach would combine KIM expertise for high-level decisions on content with analytics tools to see context and automated tools to remove ROT, underpinned by robust quality checks and governance overseen by KIM professionals.

(2.2.7.) Cataloguing – The National Archives’ requirements are set out in the digital transfer steps guidance.

Engagement

(3.1.1.) Context and value: internal – engaging with the business, information assurance and the data profession will show the value of a well-funded and effective ASSR process, helping to avoid the accumulation of outdated information and reduce Data Protection and security risks.

(3.1.2.) Context and value: external – the relationship, responsibilities, and engagement between ALBs and their parent Departments will differ. This line may not apply in all cases, but most ALBs will need to engage with their parent Departments to get steers on things like value and context of records, as well as re-using policies and guidance for consistency. And most Departments will have a responsibility to engage with their ALBs for the same reasons.

The National Archives has a project with all Ministerial Departments to understand the public record status of each body and then the relationship (or not) between the Ministerial department and the ALBs, NDPBs and so forth. For further information, please contact: governmenthelppoint@nationalarchives.gov.uk

Another external engagement could be with an organisation that has in the past had ownership or understanding of some legacy records, because of a machinery of Government change or another change of function.

Transparency and Accountability

(4.1.1.) Transparency and Accountability – by ‘transparent reporting on activities and progress against goals’ in the ‘Mastering’ column, we do not necessarily mean external publication. Reporting to The National Archives and the Advisory Council and having shared improvement plans is evidence of maturity.

It may also be useful to think about the measure of public trust in your organisation’s ability to review and transfer to The National Archives records of historic value, and the openness (subject to the necessary sensitivity review) with which transfers happen.

A mature score here would also include technical documentation where algorithms are being used, in the form of an Algorithmic Transparency Recording Standard (ATRS) record, to show that outputs are understood and explainable. You should also consider the selection and transfer of the algorithm itself, as a tool that supported appraisal decisions.

The Digital ASSR Maturity Assessment Tool Criteria: Digital Sensitivity Review

Sensitivity review is the process of determining whether selected records should be:

  • transferred “open”, as no Freedom of Information Act (FOIA) exemptions or Environmental Information Regulations (EIR) exceptions apply;
  • transferred “closed”, either in full or in part, as FOIA exemptions or EIR exceptions apply; or
  • retained under section 3(4) of the Public Records Act, as they are classified above OFFICIAL-SENSITIVE.

Governance

(1.1.1.) Governance: oversight, policies and methodologies in place and in use – at the mastering level, excellent governance will include senior management engagement and involvement, including actively reviewing plans, resources, ACNRA (Advisory Council on National Records and Archives) applications, plans and updates, and attending compliance meetings with The National Archives.

For sensitivity review issues on content and government classification tiers, departments are advised to share policy and procedures, as they are developed, with The National Archives’ compliance team.

(1.1.4.) Governance: risk appetite for published material – within the risk of not understanding the organisation’s risk appetite lies the risk of accidental disclosure. This criterion also covers assurance that the process does not involve sensitivity reviews being carried out outside the UK or by an unrecognised third party.

Skills

(1.2.1.) Skills: to carry out sensitivity reviews – the GKIM Skills Framework should be used as a measure of skills obtained. For sensitivity review, please use Records Management job role skills and the specific skill CG01 ‘Complying with FOI retention rules in relation to archiving records’ in the Information Rights job role.

(1.2.3.) Skills: knowledge transfer – although these statements should be a measure of knowledge transfer in the records team to aid effective sensitivity review, this may be impacted by how well (or not so well) knowledge transfer occurs in the wider business.

Technical Capability

(1.3.2.) Technical capabilities: redaction tools – a consideration in the effective and safe use of redaction tools is the permanence of the redaction. A redaction tool is ineffective if redacted text can be uncovered by some technical means.

For further guidance, please refer to The National Archives’ Redaction Toolkit.
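
To illustrate the permanence point, the following minimal sketch uses the open-source PyMuPDF library (an assumption, not a recommended product). Applying redactions in this way removes the underlying text rather than merely drawing a box over it.

```python
"""Sketch: permanent redaction of a PDF using PyMuPDF (package `pymupdf`).

apply_redactions() deletes the underlying text, so it cannot be recovered
by copy-paste or text extraction. The file names and search term are
hypothetical.
"""
import fitz  # PyMuPDF

doc = fitz.open("record.pdf")
for page in doc:
    for rect in page.search_for("sensitive name"):  # areas to redact
        page.add_redact_annot(rect, fill=(0, 0, 0))  # draw a black box
    page.apply_redactions()  # remove the text beneath the annotations
doc.save("record_redacted.pdf")
```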

(1.3.6.) Technical capabilities: use of AI – by AI, we mean machine learning and any form of automated ruleset.

Automation, algorithm, and machine learning capabilities include the following considerations:

  • To what degree might supervised and unsupervised machine learning be used?
  • Is the system assured and the accuracy of results known? This should be a requirement, especially if it is a service delivered by others.
  • Has a level of technical risk been applied to the capability?

Within this capability is the ‘human in the loop’: the ability of people to engage with and quality-check machine learning.

You should use the Generative AI Framework for HMG. You may also wish to use the Human-AI interaction design guidelines (annex A) provided by our head of profession.

As with appraisal and selection, no use of AI is not necessarily a ‘bad’ thing. It is more about ‘considered versus unconsidered use of AI’. Where the use of AI has been considered and a documented decision made not to use it in the circumstances of the organisation, that may be a perfectly valid and mature approach.

Methodology

(2.1.3.) Classification – it is important that your organisation adopts the Government Security Classifications Policy (GSCP) to classify information assets across your information management environment. This ensures that information is appropriately labelled by end users to protect it, enable understanding of its sensitivity and support future sensitivity reviews. For your main IM environment, your platform supplier will produce guidance; for instance, the Microsoft guidance on protecting Government data using Microsoft Purview.

(2.1.6.) Retention and closure applications – note that there is a need to keep a copy of redacted records as part of any AI dataset being used for machine learning.

(2.1.8.) Storage – this line applies to any record being reviewed before transfer, whether it is an early transfer or at the 20-year point. Dedicated storage for retained records must be secure and must preserve the integrity of records.

(2.1.9.) Assurance – the QA process should include contingency plans for when things go wrong, including escalation routes.

Engagement

(3.1.1.) Consultation on sensitivity internally – includes consultation and engagement between ALBs and their parent Departments.

(3.1.2.) Consultation on sensitivity externally – this consultation should feature as part of transparency of decision making and be recorded; for instance, other parties or Departments may disagree regarding redactions because they have a different risk appetite. Some sets of records will carry a different risk appetite to others, and therefore different handling and closure risks.

(3.1.3.) Advisory Council – please note that you would not have any need to engage with the Advisory Council if your sensitivity review concluded that everything was open.

Transparency and Accountability

(4.1.1.) Transparency and Accountability – maturity in this section must be considered in light of sensitivities of the records in question, and the commercial sensitivities or constraints of services offered. As with the similar question in the Digital ASSR Maturity Assessment Tool, there will not always be an expectation to publish externally, but engagement with The National Archives and the Advisory Council will be a measure of maturity.

Completing the Digital ASSR Maturity Assessment Tool

  1. In the Organisational Details tab of the workbook, please tell us who you are and when you are carrying out the self-assessment.
  2. In the Selection and Sensitivity tabs, you will find in columns B to D a theme (e.g. Governance and Capabilities), a section (e.g. Governance) and a category (e.g. Governance: oversight, policies and methodologies in place and in use).

In cells E3-I3, you will find the five maturity levels on the scale:

  • Beginner
  • Emerging
  • Learning
  • Developing
  • Mastering

Under each maturity level there is a maturity statement. Read each statement and then rate yourself on the Beginner-Mastering scale by selecting from the dropdown in column J. The worksheet will calculate a score for each question in column K.

In the drop-down, there are also options for:

  • Unknown
  • Not Applicable

Unknown should be chosen when the maturity measure cannot be established. An unknown answer will be scored at the same maturity level as ‘beginner’, as it is expected that public record bodies understand enough about their information management practices to give a defined answer.

Not applicable will remove that question from the scoring calculation. It should be used only when a question is genuinely irrelevant to the organisation completing the ASSR Maturity Assessment Tool.

  3. For each question, you will also need to select an Improvement timescale from the drop-down in column AD to choose when you think you need to act on this measure:
  • No Action
  • Next Year
  • Next Two Years
  • Long Term

Where you select ‘Next Year’, you should also make a brief note in column AE as to your planned action in the next 12 months and assign the task to a person responsible for the action.

These entries will be dynamically added to the Action Plan table for managing and reviewing the progress with your improvement plan. This will be generated in the ‘Improvement Action’ tab, which we recommend be reviewed monthly or quarterly, before your next annual self-assessment against the ASSR Maturity Assessment Tool.

The Digital ASSR Maturity Assessment Tool: Scoring

Scoring will be based on 20% bands for each level of maturity:

  • Level 1 – Beginner (poor): 0-20%
  • Level 2 – Emerging (weak): 21-40%
  • Level 3 – Learning (acceptable): 41-60%
  • Level 4 – Developing (good): 61-80%
  • Level 5 – Mastering (excellent): 81-100%

This means that if an organisation scores level 1 ‘Beginner’ against all questions, the overall score would be 20%, an acknowledgement that it has begun the maturity journey. Equally, an organisation that achieved level 5 ‘Mastering’ against all questions would score 100% overall.
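
As a worked illustration of the arithmetic (our reading of the bands above, not the workbook’s actual formulas), each question can be treated as scoring level/5, with ‘Unknown’ counted as level 1 and ‘Not Applicable’ excluded:

```python
"""Sketch: overall score from per-question ratings, per the 20% bands.

This mirrors our reading of the rules above; the workbook's own formulas
in columns J and K remain authoritative.
"""
LEVELS = {"Beginner": 1, "Emerging": 2, "Learning": 3,
          "Developing": 4, "Mastering": 5,
          "Unknown": 1}  # Unknown scores the same as Beginner

def overall_score(ratings: list[str]) -> float:
    """Mean per-question score as a percentage; N/A is excluded entirely."""
    scored = [LEVELS[r] for r in ratings if r != "Not Applicable"]
    return 100 * sum(scored) / (5 * len(scored))

print(overall_score(["Beginner"] * 10))    # 20.0 - begun the journey
print(overall_score(["Mastering"] * 10))   # 100.0
print(overall_score(["Learning", "Unknown", "Not Applicable"]))  # 40.0
```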

A view of the scores for each collection type and section can be found in the summary tab.

Additional Resources

The Digital ASSR Maturity Assessment Tool implements the principles of existing guidance from The National Archives and wider UK Government. These are:

  1. Appraisal and Selection Guiding Principles
  2. Best Practice Guide for Appraising and Selecting Records for The National Archives
  3. Records Collection Policy
  4. Published examples of High-Level Appraisal Methodologies
  5. S46 Code of Practice (part 3)
  6. Sensitivity Review Guiding Principles
  7. Access at Transfer – Sensitivity Review Overview
  8. Redaction Toolkit