Digital lifecycle – Live

BETA. This draft guidance is provided as a ready-reference for business owners and project teams in government agencies planning online delivery. It draws on the approach of the UK government's Digital by Default Service Standard, released under the Open Government Licence v2.0. We invite feedback from agencies.
  • Build services around the needs of users.
  • Think about the lifecycle of digital delivery.

You've moved your site or service into a stable Live phase. It meets the stakeholder needs you identified through earlier phases, you have resource and budget dedicated to maintaining it, and your analytics are configured to monitor its performance against KPIs. But that doesn't mean you're done. You need to regularly review how it's performing against its objectives, and identify ways in which it can better serve its audience.

If you don't have budget allocated for ongoing improvement and development, you should be hearing alarm bells.

How much you need for ongoing development depends on your site and your agency, but your estimate should not be nil: there is always room for improvement.

Periodically through the life of the product, review it to be sure that it remains fit for purpose, continues to meet user needs, remains secure and accessible, and warrants ongoing investment.

Questions you should review during the Live phase

Performance

  • Is continued investment in the site or service justified?
    • Can you point to evidence that it continues to meet user needs and its stated KPIs? (Analytics data can help you do this.)
    • Does it still contribute to meeting your agency's strategic direction?
    • Can you point to evidence that return on continued investment is financially justified?
    • What would happen if it were retired?
  • Does it still fit with the direction of Govt.nz and other government priorities?
    • Could it be delivered through Govt.nz?
    • Are there initiatives in other parts of government (or from the private sector) that you could align or combine with to better address user needs?
  • Can it be improved?
    • Have user needs changed or developed beyond those that the site was originally intended to address?
    • Do user feedback or analytics data indicate ways in which a better or more efficient user experience could be provided?
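Evidence against KPIs is easiest to review when the comparison is scripted. The sketch below is illustrative only: the KPI names, targets, and measurements are hypothetical examples, and your analytics tooling would supply the real figures.

```python
# A minimal sketch of reviewing analytics figures against stated KPIs.
# All KPI names and thresholds here are hypothetical, not prescribed
# by this guidance.

kpi_targets = {
    "task_completion_rate": 0.85,    # minimum acceptable proportion
    "avg_page_load_seconds": 3.0,    # maximum acceptable load time
    "monthly_unique_users": 10_000,  # minimum acceptable reach
}

def review_kpis(measured: dict) -> list:
    """Return the names of KPIs that are not meeting their targets."""
    failing = []
    if measured["task_completion_rate"] < kpi_targets["task_completion_rate"]:
        failing.append("task_completion_rate")
    if measured["avg_page_load_seconds"] > kpi_targets["avg_page_load_seconds"]:
        failing.append("avg_page_load_seconds")
    if measured["monthly_unique_users"] < kpi_targets["monthly_unique_users"]:
        failing.append("monthly_unique_users")
    return failing

# Hypothetical measurements exported from your analytics tool.
measured = {
    "task_completion_rate": 0.78,
    "avg_page_load_seconds": 2.4,
    "monthly_unique_users": 12_500,
}
print(review_kpis(measured))  # ['task_completion_rate']
```

A check like this, run on a regular schedule, gives you a documented record of performance against KPIs to support investment decisions.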

Security

  • Is the risk profile of your product still valid?
    • Have any changes been introduced that might require a review?
  • Are your security procedures still adequate or do they need to be reviewed?
    • Are software updates being regularly reviewed and applied? Do hosting arrangements clearly assign responsibilities for maintenance and updates? How do you gain assurance that they are being met?
    • Do staff with admin access maintain sufficiently strong passwords?
    • Are all code changes peer reviewed and tested for vulnerabilities before deployment?
    • Has a vulnerability scan / penetration test been carried out recently?
    • Who is reviewing the output of any monitoring or protection systems (such as Web Application Firewalls) that are in place?
    • Are staff familiar with your organisation’s Incident Management procedures?
  • When was formal risk assessment of the product undertaken and signed off?
    • When is it due to be renewed?
  • Are you confident that the site or service can be restored if anything were to happen to it?
    • Are recovery procedures documented? How would you test them?
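One routine check that is easy to automate is confirming that your site's responses carry common security headers. The sketch below is a minimal illustration, not a substitute for a vulnerability scan or risk assessment: the header list is a common baseline, and the sample response is hypothetical.

```python
# A minimal sketch: flag common security headers missing from an
# HTTP response. The sample response below is hypothetical.

REQUIRED_HEADERS = [
    "Strict-Transport-Security",  # enforce HTTPS on repeat visits
    "Content-Security-Policy",    # restrict where scripts/content load from
    "X-Content-Type-Options",     # prevent MIME-type sniffing
]

def missing_security_headers(headers: dict) -> list:
    """Return required security headers absent from a response."""
    present = {name.lower() for name in headers}
    return [h for h in REQUIRED_HEADERS if h.lower() not in present]

sample_response_headers = {
    "Content-Type": "text/html; charset=utf-8",
    "Strict-Transport-Security": "max-age=31536000",
}
print(missing_security_headers(sample_response_headers))
# ['Content-Security-Policy', 'X-Content-Type-Options']
```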

Privacy

Accessibility

  • Does the product still meet the Web Accessibility Standard?
    • When did the product last undergo an accessibility audit?
    • Are procedures in place to review the accessibility of new or updated content prior to publication?
    • As new functions are developed, are they tested against the Web Accessibility Standard while in development?
    • How well do content authors understand the content-related aspects of the Accessibility Standard? Do they need a refresher, or assistance with specific requirements?
  • Have deficiencies against the Web Accessibility Standard been subjected to a risk assessment?
    • Are adequate measures in place to address user needs where deficiencies might prevent users from accessing government information or services?
    • What steps are being taken to remove such obstacles?
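Some content-related checks can be automated between full audits. The sketch below covers a single point only, images missing alternative text, using Python's standard-library HTML parser; it is not a substitute for testing against the Web Accessibility Standard, and the sample HTML is hypothetical.

```python
# A minimal sketch: find <img> tags without an alt attribute.
# This checks one accessibility requirement, not the whole standard.
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Collect the src of any <img> tag that lacks an alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        attributes = dict(attrs)
        if tag == "img" and "alt" not in attributes:
            self.missing.append(attributes.get("src", "(no src)"))

# Hypothetical page fragment for illustration.
sample_html = """
<p>Annual report summary</p>
<img src="chart.png" alt="Visitor numbers by month">
<img src="logo.png">
"""

checker = AltTextChecker()
checker.feed(sample_html)
print(checker.missing)  # ['logo.png']
```

Running a check like this over new or updated content before publication helps catch regressions early, while a full audit remains necessary for overall conformance.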

Information and data management

  • Is your content licensed for reuse in accordance with NZGOAL?
    • If not, why not? Is it clear what users can do with your content? Could they, for example, copy and paste sections of it into another document?
  • Are you familiar with the Data and Information Management Principles which were approved by Cabinet?
    • What management processes can you point to as evidence that the information and data published on your service meets the Data and Information Management Principles?
  • Are you confident that metadata is being properly managed?
    • Try sampling a few pages from your site or service. Is it clear when content was last updated or reviewed, and whether it is still current? Does each page have a clear purpose? Will users know who to contact, and how, if they need assistance?
  • Are publishers adhering to the requirements of your style guide?
    • Are Plain English standards being met? Are publishers following consistent practices around voice and style of communications? Is your style guide still appropriate for the nature of the site or service?
  • Are versioning processes adequate?
    • Can you review or recreate previous versions of what was published to the public internet, and how it has changed over time?
    • Are adequate controls in place to manage how content is modified, and who can modify it?
  • Do you have procedures in place for regular review of content that is published to the public internet?
    • What triggers a review or update, and who is responsible for managing it?
  • Are content authors providing non-HTML content in appropriate formats?
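A simple script can support regular content review by flagging pages whose last review date has passed a threshold. The sketch below assumes a hypothetical page inventory, such as an export from your CMS; the URLs, dates, and one-year threshold are illustrative only.

```python
# A minimal sketch: flag pages overdue for content review.
# The page inventory, dates, and threshold are hypothetical.
from datetime import date

pages = [
    {"url": "/services/apply", "last_reviewed": date(2024, 11, 3)},
    {"url": "/about/contact", "last_reviewed": date(2022, 6, 14)},
]

def overdue_for_review(pages, today, max_age_days=365):
    """Return URLs whose last review is older than max_age_days."""
    return [p["url"] for p in pages
            if (today - p["last_reviewed"]).days > max_age_days]

print(overdue_for_review(pages, today=date(2025, 1, 1)))
# ['/about/contact']
```

The output of a check like this can feed directly into your review procedures, answering "what triggers a review" with a documented, repeatable rule.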
