
Insights into creating a successful Disaster Recovery Test – Part 3: Metrics


Organizations with an active Disaster Recovery program conduct DR Tests to validate the Disaster Preparedness component of their IT Service Continuity strategies. Those exercises should validate – among other milestones – the overall recovery time (actual vs. planned RTO), the completeness of DR plan documentation, the level of preparedness among recovery teams, and the overall effectiveness of the DR response.

As a BCP/DR software solution provider, we are often called on to assist customers in the preparation, management and enhancement of their DR Tests. After one of our Utility customers completed their DR Test preparation steps (see Part 2 of this blog), the time came to test their ability to execute. Based on management objectives, in-scope recovery elements included 344 systems (Mainframe, Unix, Linux, Wintel, storage devices, networks, databases, TSM, NBU, SAP) and 57 Tier-1 and Tier-2 applications, and involved more than 275 IT staff. As part of this DR test, 137 distinct DR plans were activated with a planned Recovery Time Objective (RTO) of 72 hours.


Management thinker Peter Drucker is often quoted as saying “You can’t manage what you can’t measure.”

Infrastructure Sub-systems: As part of the DR test, Milestone Dashboards (MDB) were set up to provide real-time status updates on infrastructure subsystem recoveries. A subsystem could be in one of three states:

  • System Down
  • Restoration In-Progress
  • System Restored

The visibility of restored infrastructure components allowed Application Owners to initiate Application Functionality Testing.
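As an illustration, this gating rule can be sketched in a few lines of Python. The subsystem names and the helper function below are hypothetical, not from the actual exercise:

```python
# Hypothetical sketch of the gating rule: Application Functionality Testing
# for an application can begin only once every infrastructure subsystem it
# depends on reports "System Restored". Names below are illustrative.
SUBSYSTEM_STATES = ("System Down", "Restoration In-Progress", "System Restored")

def can_start_functional_testing(app_subsystems, status):
    """status maps a subsystem name to one of SUBSYSTEM_STATES."""
    return all(status.get(s) == "System Restored" for s in app_subsystems)

status = {"db_cluster": "System Restored", "san_storage": "Restoration In-Progress"}
can_start_functional_testing(["db_cluster", "san_storage"], status)  # False
```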

Application (IT Service): Certifying an application as “Available” required (a) restoration of the associated infrastructure subsystems, (b) completion of Application Functional Testing, and (c) completion of End-user Acceptance Testing. Metrics were adopted to measure Service Restoration status:

Application availability (100%) =

  1. Infrastructure sub-systems restored (60%) +
  2. Application Functional Testing complete (30%) +
  3. End-user Acceptance Testing (10%)


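The weighted metric above lends itself to a simple calculation. A minimal sketch in Python, assuming each phase reports a completion fraction between 0 and 1 (the weights and phase names come from the formula above; the function itself is illustrative):

```python
# Minimal sketch of the 60/30/10 weighting described above. Each phase
# reports a completion fraction in [0, 1]; the weights come from the
# formula in the text, the function itself is illustrative.
WEIGHTS = {
    "infrastructure": 60,       # infrastructure sub-systems restored
    "functional_testing": 30,   # application functional testing complete
    "acceptance_testing": 10,   # end-user acceptance testing complete
}

def application_availability(progress):
    """Return the availability score (0-100) for one application."""
    return sum(WEIGHTS[phase] * min(max(frac, 0.0), 1.0)
               for phase, frac in progress.items())

application_availability({"infrastructure": 1.0,
                          "functional_testing": 0.5,
                          "acceptance_testing": 0.0})  # 75.0
```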


DR Plans: 137 IT DR plans were in scope for the exercise, comprising 1,921 distinct Tasks. Each Task had a planned duration and a sequence in which it was to be executed. Based on the intra-plan Task sequence, and taking into consideration other inter-plan linkages, dashboards displayed each Task’s status in a different color indicating: (a) Task ready for execution, (b) Task in progress, or (c) Task completed successfully.


Each of the DR Plans could be viewed as a Gantt chart, with its “critical path” and any inherent “slack” time.
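For illustration, critical path and slack can be computed with a standard forward/backward pass. The task graph below is tiny and entirely hypothetical (the real exercise spanned 137 plans and 1,921 tasks):

```python
# Forward/backward pass over a small, hypothetical task graph to compute
# each task's slack and the critical path. Durations are in hours.
tasks = {
    "restore_storage": {"duration": 4, "deps": []},
    "restore_db":      {"duration": 6, "deps": ["restore_storage"]},
    "restore_app":     {"duration": 3, "deps": ["restore_db"]},
    "config_network":  {"duration": 2, "deps": []},
}

def schedule(tasks):
    # Forward pass: earliest start/finish (assumes deps are listed before
    # their dependents, i.e. the dict is already in topological order).
    es, ef = {}, {}
    for name, t in tasks.items():
        es[name] = max((ef[d] for d in t["deps"]), default=0)
        ef[name] = es[name] + t["duration"]
    makespan = max(ef.values())
    # Backward pass: latest start/finish, then slack = LS - ES.
    ls, lf = {}, {}
    for name in reversed(list(tasks)):
        succs = [s for s, t in tasks.items() if name in t["deps"]]
        lf[name] = min((ls[s] for s in succs), default=makespan)
        ls[name] = lf[name] - tasks[name]["duration"]
    slack = {n: ls[n] - es[n] for n in tasks}
    critical = [n for n in tasks if slack[n] == 0]
    return slack, critical

slack, critical = schedule(tasks)
# critical == ["restore_storage", "restore_db", "restore_app"];
# "config_network" carries 11 hours of slack.
```

Tasks with zero slack form the critical path: any delay there pushes out the overall recovery time, while a slack task like the network configuration can slip without affecting the planned RTO.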

Stakeholders: Our client’s annual Disaster Recovery test involved more than 275 IT staff over three days. Broadly, the stakeholders were grouped by their role or function during the test:

  • Incident Commanders
  • Recovery Teams
  • Application Functionality Testers
  • End-user Acceptance Testers
  • Executives / Observers

Orchestration: The Disaster Recovery test involved plan activation, task allocation, monitoring the critical path, real-time system/services dashboards, staff scheduling and issue management. All required a high level of coordination among the various stakeholders. eBRP’s CommandCentre (a DR automation platform) was deployed to support the annual DR Test. Using CommandCentre to manage the exercise, with its real-time activity logging, integrated notification and array of dashboard displays, improved the overall efficiency of the test and its reporting capacity.


More about this series: Part 1 of this blog series detailed the setting of DR Test scope and objectives, Part 2 focused on Test Preparation, Part 3 covers Test Execution and Metrics, and the series concludes with the Test Review in Part 4.

