ID & Reference Format Settings (ONB02US07)

User Story: ONB02US07


Overall Coverage Summary

Total Coverage: 100% (15/15 Acceptance Criteria Covered)
Total Test Cases: 15 (13 Functional + 2 Non-Functional)
Total Acceptance Criteria: 15 (Based on user story requirements)
Coverage Percentage: (15/15) × 100 = 100%


Test Scenario Summary

A. Functional Test Scenarios

Core Functionality

  1. ID Format Configuration Management - Create, edit, view, delete ID format configurations
  2. Master vs Transaction ID Categorization - Proper categorization and handling of different ID types
  3. Format Component Management - Entity type, prefix, sequence, utility service, date element, separator configuration
  4. Live Preview Generation - Real-time preview of ID formats based on configuration
  5. Format Validation - Duplicate prevention, length validation, pattern validation
  6. Audit Trail Management - Comprehensive logging of all configuration changes

Enhanced Features

  1. Advanced Format Builder - Enhanced customization options and component reordering
  2. Multi-sample Preview - Multiple example generation with different scenarios
  3. Contextual Help System - In-context guidance and tooltips
  4. Format Testing - Test format with specific inputs and edge cases
  5. Format Dashboard - Usage statistics and health indicators

User Journeys

  1. Utility Administrator Complete Workflow - End-to-end ID format management
  2. System Admin Audit Review - Complete audit and oversight workflow
  3. Cross-role Collaboration - Multi-user scenarios and handoffs

Integration Points

  1. Entity Creation Integration - ID generation for new customers, meters, bills, payments
  2. System Configuration Integration - Integration with other SMART360 modules
  3. User Authentication Integration - Role-based access control validation

Data Flow Scenarios

  1. ID Generation Process - Format application to new entity creation
  2. Configuration Change Impact - Effects on new ID generation
  3. Audit Data Flow - Change tracking and log generation
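The ID generation data flow above can be sketched as a small helper that applies the configured components in order. This is an illustrative sketch only: `IdFormat` and `generate_id` are hypothetical names, not SMART360 APIs, and the component order mirrors the WA-CUST-202406-0001 example used in the detailed test cases below.

```python
# Illustrative sketch of the ID generation data flow (hypothetical names,
# not SMART360 APIs).
from dataclasses import dataclass
from datetime import date

@dataclass
class IdFormat:
    service: str       # utility service code, e.g. "WA" for Water
    prefix: str        # entity prefix, e.g. "CUST"
    date_pattern: str  # "YYYYMM" or "YYMM"
    seq_length: int    # zero-padded sequence width
    separator: str     # e.g. "-"
    next_seq: int = 1  # current sequence counter

def generate_id(fmt: IdFormat, today: date) -> str:
    """Apply the configured components to produce the next ID."""
    if fmt.date_pattern == "YYYYMM":
        stamp = today.strftime("%Y%m")
    else:  # "YYMM"
        stamp = today.strftime("%y%m")
    seq = str(fmt.next_seq).zfill(fmt.seq_length)
    fmt.next_seq += 1  # auto-increment for the next entity
    parts = [p for p in (fmt.service, fmt.prefix, stamp, seq) if p]
    return fmt.separator.join(parts)

fmt = IdFormat("WA", "CUST", "YYYYMM", 4, "-")
print(generate_id(fmt, date(2024, 6, 15)))  # WA-CUST-202406-0001
print(generate_id(fmt, date(2024, 6, 15)))  # WA-CUST-202406-0002
```

Changing any component (prefix, date pattern, sequence width, separator) only changes what this function emits next, which is the behaviour the configuration-change scenarios below exercise.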

B. Non-Functional Test Scenarios

Performance

  1. Response Time Validation - Page load and configuration update performance
  2. Concurrent User Handling - Multiple administrators accessing simultaneously
  3. Large Configuration Set Performance - Performance with many format configurations

Security

  1. Authentication & Authorization - Role-based access control
  2. Session Management - Timeout and session security
  3. Data Protection - Sensitive configuration data handling
  4. Audit Trail Security - Tamper-proof logging

Compatibility

  1. Cross-Browser Testing - Chrome, Firefox, Safari, Edge compatibility
  2. Responsive Design - Desktop, tablet, mobile compatibility
  3. Cross-Platform Testing - Windows, macOS, iOS, Android

Usability

  1. User Interface Navigation - Intuitive navigation and workflow
  2. Error Handling - Clear error messages and recovery
  3. Help System Effectiveness - Contextual help and guidance

Reliability

  1. System Stability - Continuous operation under normal load
  2. Error Recovery - Recovery from network issues and timeouts
  3. Data Integrity - Configuration consistency and accuracy

C. Edge Case & Error Scenarios

Boundary Conditions

  1. Maximum/Minimum Values - Sequence length limits, prefix length limits
  2. Format Length Limits - Maximum total ID length validation
  3. Entity Volume Limits - Maximum entities per format type

Invalid Inputs

  1. Malformed Configuration Data - Invalid characters, formats
  2. Unauthorized Access Attempts - Access beyond permitted roles
  3. Injection Attack Prevention - SQL injection, XSS prevention

System Failures

  1. Network Connectivity Issues - Handling of connectivity problems
  2. Service Unavailability - Backend service failure scenarios
  3. Database Connection Issues - Database connectivity problems

Data Inconsistencies

  1. Duplicate Format Prevention - Handling duplicate format attempts
  2. Conflicting Configuration States - Resolution of configuration conflicts
  3. Audit Log Consistency - Ensuring complete audit trail

Detailed Test Cases

Test Case 1: "Generate Unique IDs Within Entity Domain"

Test Case: ONB02US07_TC_001

Title: Verify system generates unique IDs within each entity's respective domain

Test Case Metadata

Created By: Arpita
Created Date: June 08, 2025
Version: 1.0

Classification

  • Module/Feature: ID & Reference Format Settings
  • Test Type: Functional
  • Test Level: System
  • Priority: P1-Critical
  • Execution Phase: Smoke
  • Automation Status: Automated

Enhanced Tags for 17 Reports Support

Tags: MOD-Onboarding, P1-Critical, Phase-Smoke, Type-Functional, Platform-Web, Report-Engineering/Product/QA, Customer-Enterprise, Risk-High, Business-Critical, Revenue-Impact-High, Integration-End-to-End, Master-ID-Creation, Customer-Entity, Format-Builder, Onboarding Services, cx Services, API, Database, HappyPath, Cross-module

Business Context

  • Customer_Segment: Enterprise
  • Revenue_Impact: High
  • Business_Priority: Must-Have
  • Customer_Journey: Onboarding
  • Compliance_Required: No
  • SLA_Related: Yes

Quality Metrics

  • Risk_Level: High
  • Complexity_Level: Medium
  • Expected_Execution_Time: 5 minutes
  • Reproducibility_Score: High
  • Data_Sensitivity: Medium
  • Failure_Impact: Critical

Coverage Tracking

  • Feature_Coverage: 25% of ID format creation feature
  • Integration_Points: Entity creation system, audit logging
  • Code_Module_Mapped: Onboarding
  • Requirement_Coverage: Complete
  • Cross_Platform_Support: Web

Stakeholder Reporting

  • Primary_Stakeholder: Engineering/Product
  • Report_Categories: Quality-Dashboard, Module-Coverage, Feature-Adoption
  • Trend_Tracking: Yes
  • Executive_Visibility: Yes
  • Customer_Impact_Level: High

Requirements Traceability

Test Environment

  • Environment: Staging
  • Browser/Version: Chrome 115+, Firefox 110+, Safari 16+, Edge Latest
  • Device/OS: Windows 10/11, macOS 12+
  • Screen_Resolution: Desktop-1920x1080
  • Dependencies: SMART360 authentication service, database connectivity
  • Performance_Baseline: < 3 seconds page load
  • Data_Requirements: Valid utility administrator credentials

Prerequisites

  • Setup_Requirements: SMART360 system accessible, test environment configured
  • User_Roles_Permissions: Utility Administrator role with ID format management permissions
  • Test_Data: Valid admin credentials (admin@utilitytest.com / TestPass123!)
  • Prior_Test_Cases: User authentication successful

Test Procedure

Step #

Action

Expected Result

Test Data

Comments

1

Create Customer ID format

Format created successfully

Entity: Customer

Domain setup

2

Generate 5 Customer IDs

All IDs unique within Customer domain

Sequential generation

Uniqueness test

3

Create Meter ID format

Format created successfully

Entity: Meter

Separate domain

4

Generate 5 Meter IDs

All IDs unique within Meter domain

Sequential generation

Domain isolation

5

Verify Customer and Meter ID uniqueness

IDs unique within respective domains

Cross-domain check

Domain validation

Verification Points

  • Primary_Verification: System generates unique IDs within each entity domain
  • Secondary_Verifications: No ID collisions within same entity type
  • Negative_Verification: No duplicate IDs generated within same domain
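The per-domain uniqueness property this test case exercises can be checked with a small harness. `assert_domain_uniqueness` is an illustrative helper written for this sketch, not part of SMART360; note that the same ID value in two different entity domains is not a violation.

```python
# Hypothetical harness: IDs must be unique within each entity domain.
from collections import defaultdict

def assert_domain_uniqueness(generated: list[tuple[str, str]]) -> None:
    """generated holds (entity_type, id) pairs; raise on a collision
    inside the same entity domain."""
    seen = defaultdict(set)
    for entity, id_ in generated:
        if id_ in seen[entity]:
            raise ValueError(f"duplicate {id_!r} in {entity} domain")
        seen[entity].add(id_)

ids = [("Customer", f"CUST-202406-{n:04d}") for n in range(1, 6)]
ids += [("Meter", f"MTR-202406-{n:04d}") for n in range(1, 6)]
assert_domain_uniqueness(ids)  # passes: no collisions within a domain
```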

Test Results (Template)

  • Status: [Pass/Fail/Blocked/Not-Tested]
  • Actual_Results: [Template for recording actual behavior]
  • Execution_Date: [When test was executed]
  • Executed_By: [Who performed the test]
  • Execution_Time: [Actual time taken]
  • Defects_Found: [Bug IDs if issues discovered]
  • Screenshots_Logs: [Evidence references]

Test Case 2: "Classify ID Formats as Master or Transaction"

Test Case: ONB02US07_TC_002

Title: Verify system classifies ID formats as either Master or Transaction

Metadata

Created By: Arpita
Created Date: June 08, 2025
Version: 1.0

Classification

  • Module/Feature: ID & Reference Format Settings
  • Test Type: Functional
  • Test Level: System
  • Priority: P1-Critical
  • Execution Phase: Smoke
  • Automation Status: Automated

Enhanced Tags for 17 Reports Support

Tags: MOD-Onboarding, P1-Critical, Phase-Smoke, Type-Functional, Platform-Web, Report-Engineering/Product/QA, Customer-All, Risk-High, Business-Critical, Revenue-Impact-High, Integration-Classification-Logic, Master-Transaction-ID, Onboarding Services, cx Services, mx Services, bx Services, Database, HappyPath

Business Context

  • Customer_Segment: All
  • Revenue_Impact: High
  • Business_Priority: Must-Have
  • Customer_Journey: Onboarding
  • Compliance_Required: No
  • SLA_Related: Yes

Quality Metrics

  • Risk_Level: High
  • Complexity_Level: Medium
  • Expected_Execution_Time: 4 minutes
  • Reproducibility_Score: High
  • Data_Sensitivity: Medium
  • Failure_Impact: Critical

Coverage Tracking

  • Feature_Coverage: 100% of classification feature
  • Integration_Points: Entity classification system, UI categorization
  • Code_Module_Mapped: Onboarding
  • Requirement_Coverage: Complete
  • Cross_Platform_Support: Web

Stakeholder Reporting

  • Primary_Stakeholder: Engineering/Product
  • Report_Categories: Quality-Dashboard, Module-Coverage, Business-Logic
  • Trend_Tracking: Yes
  • Executive_Visibility: Yes
  • Customer_Impact_Level: High

Requirements Traceability

Test Environment

  • Environment: Staging
  • Browser/Version: Chrome 115+, Firefox 110+, Safari 16+, Edge Latest
  • Device/OS: Windows 10/11, macOS 12+
  • Screen_Resolution: Desktop-1920x1080
  • Dependencies: SMART360 classification engine, database connectivity
  • Performance_Baseline: < 2 seconds classification response
  • Data_Requirements: Entity type configurations

Prerequisites

  • Setup_Requirements: ID Format Settings page accessible
  • User_Roles_Permissions: Utility Administrator access
  • Test_Data: Entity types for Master and Transaction classification
  • Prior_Test_Cases: Navigation and authentication verified

Test Procedure

Step #

Action

Expected Result

Test Data

Comments

1

Navigate to ID Format Settings

Master and Transaction ID sections visible

N/A

UI verification

2

Create Customer format in Master ID

Format classified as Master ID

Entity: Customer

Master classification

3

Create Meter format in Master ID

Format classified as Master ID

Entity: Meter

Master classification

4

Create Bill format in Transaction ID

Format classified as Transaction ID

Entity: Bill

Transaction classification

5

Create Payment format in Transaction ID

Format classified as Transaction ID

Entity: Payment

Transaction classification

6

Verify classification persistence

Classifications remain after page refresh

N/A

Data persistence

Verification Points

  • Primary_Verification: System properly classifies formats as Master or Transaction ID
  • Secondary_Verifications: Classification persists across sessions
  • Negative_Verification: No incorrect classification assignments
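The Master/Transaction split exercised in the steps above can be expressed as a simple lookup. The entity-to-category mapping here is assumed from the test steps (Customer/Meter as Master, Bill/Payment as Transaction), not taken from SMART360 code.

```python
# Illustrative classification rule, with the mapping assumed from the
# test steps above (not a SMART360 implementation).
MASTER_ENTITIES = {"Customer", "Meter"}
TRANSACTION_ENTITIES = {"Bill", "Payment"}

def classify(entity: str) -> str:
    if entity in MASTER_ENTITIES:
        return "Master ID"
    if entity in TRANSACTION_ENTITIES:
        return "Transaction ID"
    raise ValueError(f"unknown entity type: {entity}")

assert classify("Customer") == "Master ID"
assert classify("Bill") == "Transaction ID"
```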

Test Results (Template)

  • Status: [Pass/Fail/Blocked/Not-Tested]
  • Actual_Results: [Template for recording actual behavior]
  • Execution_Date: [When test was executed]
  • Executed_By: [Who performed the test]
  • Execution_Time: [Actual time taken]
  • Defects_Found: [Bug IDs if issues discovered]
  • Screenshots_Logs: [Evidence references]

Test Case 3: "Preserve Existing Entity IDs During Format Changes"

Test Case: ONB02US07_TC_003

Title: Verify system does not change existing entity IDs when ID format changes are made

Test Case Metadata

Created By: Arpita
Created Date: June 08, 2025
Version: 1.0

Classification

  • Module/Feature: ID & Reference Format Settings
  • Test Type: Functional
  • Test Level: System
  • Priority: P1-Critical
  • Execution Phase: Regression
  • Automation Status: Automated

Enhanced Tags for 17 Reports Support

Tags: MOD-Onboarding, P1-Critical, Phase-Regression, Type-Functional, Platform-Web, Report-Engineering/Product/QA, Customer-All, Risk-High, Business-Critical, Revenue-Impact-High, Data-Integrity, Backward-Compatibility, Onboarding Services, cx Services, Database, API

Business Context

  • Customer_Segment: All
  • Revenue_Impact: High
  • Business_Priority: Must-Have
  • Customer_Journey: Onboarding
  • Compliance_Required: No
  • SLA_Related: Yes

Quality Metrics

  • Risk_Level: High
  • Complexity_Level: High
  • Expected_Execution_Time: 8 minutes
  • Reproducibility_Score: High
  • Data_Sensitivity: High
  • Failure_Impact: Critical

Coverage Tracking

  • Feature_Coverage: 100% of format change impact feature
  • Integration_Points: Entity management system, format application engine
  • Code_Module_Mapped: Onboarding
  • Requirement_Coverage: Complete
  • Cross_Platform_Support: Web

Stakeholder Reporting

  • Primary_Stakeholder: Engineering/Product
  • Report_Categories: Quality-Dashboard, Data-Integrity, System-Reliability
  • Trend_Tracking: Yes
  • Executive_Visibility: Yes
  • Customer_Impact_Level: High

Requirements Traceability

Test Environment

  • Environment: Staging
  • Browser/Version: Chrome 115+, Firefox 110+, Safari 16+, Edge Latest
  • Device/OS: Windows 10/11, macOS 12+
  • Screen_Resolution: Desktop-1920x1080
  • Dependencies: Entity database, format management service
  • Performance_Baseline: < 3 seconds format update
  • Data_Requirements: Existing entity records with generated IDs

Prerequisites

  • Setup_Requirements: Existing Customer entities with generated IDs
  • User_Roles_Permissions: Utility Administrator access
  • Test_Data: Existing Customer format and generated entity IDs
  • Prior_Test_Cases: Format creation and entity generation completed

Test Procedure

Step #

Action

Expected Result

Test Data

Comments

1

Create Customer ID format

Format created with pattern CUST-YYYYMM-0001

Initial format

Baseline setup

2

Generate 3 Customer entities

IDs generated: CUST-202406-0001, 0002, 0003

Entity creation

Existing data

3

Modify Customer ID format

Format updated to CONS-YYMM-00001

Modified format

Format change

4

Verify existing Customer IDs unchanged

Original IDs remain: CUST-202406-0001, 0002, 0003

N/A

Data preservation

5

Create new Customer entity

New ID follows new format: CONS-2406-00001

New entity

New format applied

Verification Points

  • Primary_Verification: Existing entity IDs remain unchanged after format modification
  • Secondary_Verifications: New entities use updated format
  • Negative_Verification: No existing ID modifications occur
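The preservation rule under test can be sketched as follows. `change_format` and `new_id` are hypothetical helpers written for this sketch: a format change replaces only the generator state, and never touches IDs that were already issued.

```python
# Sketch of the preservation rule: a format change affects only future
# IDs; issued IDs are never rewritten. All names are illustrative.
issued = {
    "C1": "CUST-202406-0001",
    "C2": "CUST-202406-0002",
    "C3": "CUST-202406-0003",
}

def change_format(old_state: dict) -> dict:
    """Switch the active pattern; issued IDs are left untouched."""
    return {"pattern": "CONS-{yymm}-{seq:05d}", "next_seq": 1}

def new_id(fmt: dict, yymm: str) -> str:
    id_ = fmt["pattern"].format(yymm=yymm, seq=fmt["next_seq"])
    fmt["next_seq"] += 1
    return id_

snapshot = dict(issued)
fmt = change_format({"pattern": "CUST-{yyyymm}-{seq:04d}"})
assert issued == snapshot                        # existing IDs unchanged
assert new_id(fmt, "2406") == "CONS-2406-00001"  # new format applies
```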

Test Results (Template)

  • Status: [Pass/Fail/Blocked/Not-Tested]
  • Actual_Results: [Template for recording actual behavior]
  • Execution_Date: [When test was executed]
  • Executed_By: [Who performed the test]
  • Execution_Time: [Actual time taken]
  • Defects_Found: [Bug IDs if issues discovered]
  • Screenshots_Logs: [Evidence references]

Test Case 4: "Require Active ID Format for Each Entity Type"

Test Case: ONB02US07_TC_004

Title: Verify system requires at least one active ID format for each entity type

Test Case Metadata

Created By: Arpita
Created Date: June 08, 2025
Version: 1.0

Classification

  • Module/Feature: ID & Reference Format Settings
  • Test Type: Functional
  • Test Level: System
  • Priority: P2-High
  • Execution Phase: Regression
  • Automation Status: Automated

Enhanced Tags for 17 Reports Support

Tags: MOD-Onboarding, P2-High, Phase-Regression, Type-Validation, Platform-Web, Report-Engineering/QA, Customer-All, Risk-Medium, Business-High, Revenue-Impact-Medium, Validation-Engine, Business-Rules, Onboarding Services, Database, EdgeCase

Business Context

  • Customer_Segment: All
  • Revenue_Impact: Medium
  • Business_Priority: Should-Have
  • Customer_Journey: Onboarding
  • Compliance_Required: No
  • SLA_Related: No

Quality Metrics

  • Risk_Level: Medium
  • Complexity_Level: Medium
  • Expected_Execution_Time: 6 minutes
  • Reproducibility_Score: High
  • Data_Sensitivity: Medium
  • Failure_Impact: Medium

Coverage Tracking

  • Feature_Coverage: 100% of active format validation
  • Integration_Points: Validation engine, format status management
  • Code_Module_Mapped: Onboarding
  • Requirement_Coverage: Complete
  • Cross_Platform_Support: Web

Stakeholder Reporting

  • Primary_Stakeholder: Engineering/QA
  • Report_Categories: Business-Rules, Data-Validation, System-Reliability
  • Trend_Tracking: Yes
  • Executive_Visibility: No
  • Customer_Impact_Level: Medium

Requirements Traceability

Test Environment

  • Environment: Staging
  • Browser/Version: Chrome 115+, Firefox 110+, Safari 16+, Edge Latest
  • Device/OS: Windows 10/11, macOS 12+
  • Screen_Resolution: Desktop-1920x1080
  • Dependencies: Format validation service, business rules engine
  • Performance_Baseline: < 2 seconds validation response
  • Data_Requirements: Entity format configurations

Prerequisites

  • Setup_Requirements: ID format management interface accessible
  • User_Roles_Permissions: Utility Administrator access
  • Test_Data: Customer entity with single active format
  • Prior_Test_Cases: Format creation functionality verified

Test Procedure

Step #

Action

Expected Result

Test Data

Comments

1

Create Customer ID format

Format created and active

Entity: Customer

Initial setup

2

Attempt to deactivate only Customer format

System prevents deactivation

N/A

Validation test

3

Create second Customer ID format

Second format created successfully

Alternative format

Multiple formats

4

Deactivate first Customer format

Deactivation allowed

N/A

Valid deactivation

5

Attempt to deactivate last active format

System prevents deactivation

N/A

Protection rule

Verification Points

  • Primary_Verification: System enforces at least one active format per entity type
  • Secondary_Verifications: Clear error messaging for violation attempts
  • Negative_Verification: Cannot leave entity type without active format

Test Results (Template)

  • Status: [Pass/Fail/Blocked/Not-Tested]
  • Actual_Results: [Template for recording actual behavior]
  • Execution_Date: [When test was executed]
  • Executed_By: [Who performed the test]
  • Execution_Time: [Actual time taken]
  • Defects_Found: [Bug IDs if issues discovered]
  • Screenshots_Logs: [Evidence references]
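The business rule in Test Case 4 can be sketched as a guard function. `can_deactivate` and the format records below are illustrative names for this sketch, not the SMART360 validation engine: deactivation is permitted only while another active format remains for the same entity type.

```python
# Hypothetical guard for the rule: an entity type must keep at least
# one active ID format.
def can_deactivate(formats: list[dict], target_id: int) -> bool:
    target = next(f for f in formats if f["id"] == target_id)
    active_same_entity = [
        f for f in formats
        if f["entity"] == target["entity"] and f["active"]
    ]
    # Allowed only if another active format remains (or the target is
    # already inactive).
    return len(active_same_entity) > 1 or not target["active"]

formats = [
    {"id": 1, "entity": "Customer", "active": True},
    {"id": 2, "entity": "Customer", "active": True},
]
assert can_deactivate(formats, 1)      # another active format exists
formats[1]["active"] = False
assert not can_deactivate(formats, 1)  # last active format: blocked
```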

Test Case 5: "Maintain Audit History for ID Format Changes"

Test Case: ONB02US07_TC_005

Title: Verify system maintains audit history records for all ID format changes

Test Case Metadata

Created By: Arpita
Created Date: June 08, 2025
Version: 1.0

Classification

  • Module/Feature: ID & Reference Format Settings
  • Test Type: Functional
  • Test Level: System
  • Priority: P2-High
  • Execution Phase: Regression
  • Automation Status: Manual

Enhanced Tags for 17 Reports Support

Tags: MOD-Onboarding, P2-High, Phase-Regression, Type-Functional, Platform-Web, Report-CSM/QA, Customer-Enterprise, Risk-Medium, Business-High, Revenue-Impact-Low, Integration-Audit-System, System-Admin-Role, Compliance-Tracking, auth Services, ax Services, Database, HappyPath

Business Context

  • Customer_Segment: Enterprise
  • Revenue_Impact: Low
  • Business_Priority: Should-Have
  • Customer_Journey: Support
  • Compliance_Required: Yes
  • SLA_Related: No

Quality Metrics

  • Risk_Level: Medium
  • Complexity_Level: Medium
  • Expected_Execution_Time: 7 minutes
  • Reproducibility_Score: High
  • Data_Sensitivity: High
  • Failure_Impact: Medium

Coverage Tracking

  • Feature_Coverage: 80% of audit log feature
  • Integration_Points: Audit logging service, user authentication
  • Code_Module_Mapped: Onboarding
  • Requirement_Coverage: Complete
  • Cross_Platform_Support: Web

Stakeholder Reporting

  • Primary_Stakeholder: CSM/QA
  • Report_Categories: Compliance-Dashboard, System-Governance, Change-Tracking
  • Trend_Tracking: Yes
  • Executive_Visibility: Yes
  • Customer_Impact_Level: Low

Requirements Traceability

Test Environment

  • Environment: Staging
  • Browser/Version: Chrome 115+, Firefox 110+, Safari 16+, Edge Latest
  • Device/OS: Windows 10/11, macOS 12+
  • Screen_Resolution: Desktop-1920x1080
  • Dependencies: Audit logging service, role-based access control
  • Performance_Baseline: < 5 seconds log loading
  • Data_Requirements: System Admin credentials, existing audit records

Prerequisites

  • Setup_Requirements: Existing ID format changes in system for audit data
  • User_Roles_Permissions: System Admin (IT Director) role
  • Test_Data: sysadmin@utilitytest.com / AdminPass123!, audit records from previous test cases
  • Prior_Test_Cases: Format changes from previous test cases

Test Procedure

Step #

Action

Expected Result

Test Data

Comments

1

Create new Customer ID format

Audit log entry created for format creation

Entity: Customer

Creation audit

2

Modify Customer ID format

Audit log entry created for format modification

Modified data

Modification audit

3

Access audit logs as System Admin

All format changes visible in audit trail

N/A

Audit access

4

Verify audit log details

Complete change information recorded

N/A

Detail validation

5

Filter audit logs by entity type

Filtering works correctly

Filter: Customer

Filter functionality

Verification Points

  • Primary_Verification: Complete audit trail maintained for all format changes
  • Secondary_Verifications: Audit logs accessible to authorized users
  • Negative_Verification: No missing audit entries for any changes
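An append-only audit entry along the lines verified above might look like this. The field names and `record_change` are assumptions made for this sketch, not the SMART360 audit API.

```python
# Sketch of an append-only audit trail for format changes
# (field names and function are hypothetical).
from datetime import datetime, timezone

audit_log: list[dict] = []

def record_change(action: str, config: str, user: str, details: str):
    audit_log.append({
        "action": action,            # Created / Updated / Deleted
        "id_configuration": config,
        "modified_by": user,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "details": details,
    })

record_change("Created", "Customer ID", "admin@utilitytest.com",
              "Initial format WA-CUST-YYYYMM-0001")
record_change("Updated", "Customer ID", "admin@utilitytest.com",
              "Prefix changed CUST -> CONS")
assert len(audit_log) == 2 and audit_log[1]["action"] == "Updated"
```

Appending (never mutating or deleting) entries is what makes the trail complete: every creation and modification produces exactly one new record.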

Test Results (Template)

  • Status: [Pass/Fail/Blocked/Not-Tested]
  • Actual_Results: [Template for recording actual behavior]
  • Execution_Date: [When test was executed]
  • Executed_By: [Who performed the test]
  • Execution_Time: [Actual time taken]
  • Defects_Found: [Bug IDs if issues discovered]
  • Screenshots_Logs: [Evidence references]

Test Case 6: "Auto-Generate ID Format Names Based on Entity Type"

Test Case: ONB02US07_TC_006

Title: Verify system auto-generates ID format names based on entity type

Created By: Arpita
Created Date: June 08, 2025
Version: 1.0

Classification

  • Module/Feature: ID & Reference Format Settings
  • Test Type: Functional
  • Test Level: System
  • Priority: P3-Medium
  • Execution Phase: Regression
  • Automation Status: Automated

Enhanced Tags for 17 Reports Support

Tags: MOD-Onboarding, P3-Medium, Phase-Regression, Type-Functional, Platform-Web, Report-Product/QA, Customer-All, Risk-Low, Business-Medium, Revenue-Impact-Low, Auto-Generation, Naming-Convention, Onboarding Services, Database, HappyPath

Test Procedure

Step #

Action

Expected Result

Test Data

Comments

1

Create Customer ID format

System generates "Customer ID" as format name

Entity: Customer

Auto-naming

2

Create Meter ID format

System generates "Meter ID" as format name

Entity: Meter

Auto-naming

3

Create Bill format

System generates "Bill Number" as format name

Entity: Bill

Auto-naming

4

Create Payment format

System generates "Payment Reference" as format name

Entity: Payment

Auto-naming

5

Verify naming consistency

All format names follow entity-based pattern

N/A

Pattern verification

Verification Points

  • Primary_Verification: System automatically generates appropriate format names
  • Secondary_Verifications: Naming follows consistent pattern
  • Negative_Verification: No manual name entry required
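
The auto-naming rule exercised in steps 1–4 amounts to a fixed entity-to-name lookup. The mapping below is inferred from this test case's expected results, not taken from the product code:

```python
# Entity-to-name mapping inferred from the expected results above (illustrative).
FORMAT_NAMES = {
    "Customer": "Customer ID",
    "Meter": "Meter ID",
    "Bill": "Bill Number",
    "Payment": "Payment Reference",
}

def auto_format_name(entity: str) -> str:
    """Return the auto-generated format name for a supported entity type."""
    try:
        return FORMAT_NAMES[entity]
    except KeyError:
        raise ValueError(f"No naming rule for entity type: {entity}")
```

An unsupported entity type raises an error rather than silently producing a name, matching the negative verification that no manual name entry is required for supported types.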

Test Case 7: "Auto-Increment Sequence Numbers for New IDs"

Test Case: ONB02US07_TC_007

Title: Verify system automatically increments current sequence numbers when generating new IDs

Created By: Arpita
Created Date: June 08, 2025
Version: 1.0

Classification

  • Module/Feature: ID & Reference Format Settings
  • Test Type: Functional
  • Test Level: System
  • Priority: P1-Critical
  • Execution Phase: Smoke
  • Automation Status: Automated

Enhanced Tags for 17 Reports Support

Tags: MOD-Onboarding, P1-Critical, Phase-Smoke, Type-Functional, Platform-Web, Report-Engineering/QA, Customer-All, Risk-High, Business-Critical, Revenue-Impact-High, Sequence-Management, Auto-Increment, Onboarding Services, cx Services, API, Database, HappyPath

Test Procedure

Step #

Action

Expected Result

Test Data

Comments

1

Create Customer ID format with start number 1

Format created with sequence starting at 1

Start: 1

Initial setup

2

Generate first Customer ID

ID generated: CUST-202406-0001

N/A

First ID

3

Generate second Customer ID

ID generated: CUST-202406-0002

N/A

Auto-increment

4

Generate third Customer ID

ID generated: CUST-202406-0003

N/A

Continued increment

5

Verify current sequence number

System shows current number as 4

N/A

Counter verification

Verification Points

  • Primary_Verification: System automatically increments sequence numbers
  • Secondary_Verifications: Current number tracking accurate
  • Negative_Verification: No sequence number skipping or duplication
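
The increment behaviour above can be sketched as a per-entity counter. The field names (`prefix`, `sequenceLength`, `startingNumber`, `separator`) and the fixed `period` value are assumptions for illustration, not the product's actual schema:

```python
def generate_id(fmt: dict, counters: dict, period: str = "202406") -> str:
    """Compose the next ID for a format and advance its sequence counter."""
    entity = fmt["entity"]
    seq = counters.get(entity, fmt.get("startingNumber", 1))
    counters[entity] = seq + 1  # auto-increment: the next call yields seq + 1
    sequence = str(seq).zfill(fmt.get("sequenceLength", 4))
    return fmt.get("separator", "-").join([fmt["prefix"], period, sequence])

counters = {}
fmt = {"entity": "Customer", "prefix": "CUST", "sequenceLength": 4}
generate_id(fmt, counters)  # "CUST-202406-0001"
generate_id(fmt, counters)  # "CUST-202406-0002"
```

After three generated IDs the counter holds 4, which is what step 5 verifies.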

Test Case 8: "Allow Configurable Starting Numbers with Default Value"

Test Case: ONB02US07_TC_008

Title: Verify system allows configurable starting numbers with default value of 1

Created By: Arpita
Created Date: June 08, 2025
Version: 1.0

Classification

  • Module/Feature: ID & Reference Format Settings
  • Test Type: Functional
  • Test Level: System
  • Priority: P2-High
  • Execution Phase: Regression
  • Automation Status: Automated

Enhanced Tags for 17 Reports Support

Tags: MOD-Onboarding, P2-High, Phase-Regression, Type-Functional, Platform-Web, Report-Engineering/QA, Customer-All, Risk-Medium, Business-Medium, Revenue-Impact-Medium, Configuration-Management, Default-Values, Onboarding Services, Database, HappyPath

Test Procedure

Step #

Action

Expected Result

Test Data

Comments

1

Create new ID format without specifying start number

Default starting number set to 1

N/A

Default verification

2

Create ID format with custom start number

Starting number set to specified value

Start: 100

Custom configuration

3

Generate ID with default start

First ID uses sequence 0001

N/A

Default usage

4

Generate ID with custom start

First ID uses sequence 0100

N/A

Custom usage

5

Verify configuration persistence

Start numbers persist after system restart

N/A

Persistence test

Verification Points

  • Primary_Verification: System allows configurable starting numbers with default of 1
  • Secondary_Verifications: Configuration persists correctly
  • Negative_Verification: No invalid starting number acceptance
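
The default-versus-custom behaviour in steps 1–4 reduces to zero-padding the configured starting number; `first_sequence` is a hypothetical helper, not a product function:

```python
def first_sequence(sequence_length: int = 4, starting_number: int = 1) -> str:
    """Sequence portion of the first ID a format will emit (default start: 1)."""
    return str(starting_number).zfill(sequence_length)

first_sequence()                     # "0001" -- default start of 1
first_sequence(starting_number=100)  # "0100" -- custom start
```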

Test Case 9: "Prevent Duplicate ID Formats for Same Entity Type"

Test Case: ONB02US07_TC_009

Title: Verify system prevents duplicate ID formats for the same entity type

Created By: Arpita
Created Date: June 08, 2025
Version: 1.0

Classification

  • Module/Feature: ID & Reference Format Settings
  • Test Type: Functional
  • Test Level: System
  • Priority: P1-Critical
  • Execution Phase: Smoke
  • Automation Status: Automated

Enhanced Tags for 17 Reports Support

Tags: MOD-Onboarding, P1-Critical, Phase-Smoke, Type-Validation, Platform-Web, Report-Engineering/Product/QA, Customer-All, Risk-High, Business-Critical, Revenue-Impact-High, Integration-Validation-Engine, Duplicate-Prevention, Data-Integrity, Onboarding Services, Database, API, EdgeCase

Test Procedure

Step #

Action

Expected Result

Test Data

Comments

1

Create Customer ID format

Format created successfully

Entity: Customer, Prefix: CUST

Initial format

2

Attempt to create identical Customer format

System prevents creation with error message

Same configuration

Duplicate prevention

3

Modify one component and save

Format creation allowed

Prefix: CONS

Valid variation

4

Attempt exact duplicate again

System prevents with same error

Original config

Consistency check

5

Verify error message clarity

Clear duplicate format message displayed

N/A

Error messaging

Verification Points

  • Primary_Verification: System prevents duplicate ID formats for same entity
  • Secondary_Verifications: Clear error messaging, UI remains functional
  • Negative_Verification: No duplicate formats saved
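
A minimal sketch of the duplicate rule, assuming a format is identified by its entity type plus its full component configuration (the keying strategy is an assumption):

```python
class DuplicateFormatError(ValueError):
    """Raised when an identical ID format already exists for an entity."""

def register_format(registry: set, entity: str, config: tuple) -> None:
    """Add a format to the registry, rejecting exact duplicates per entity."""
    key = (entity, config)
    if key in registry:
        raise DuplicateFormatError(
            f"An identical ID format already exists for entity '{entity}'")
    registry.add(key)

registry = set()
register_format(registry, "Customer", ("CUST", 4, "-"))  # first format: accepted
register_format(registry, "Customer", ("CONS", 4, "-"))  # one component changed: accepted
```

Re-submitting `("CUST", 4, "-")` for `Customer` raises `DuplicateFormatError`, mirroring steps 2 and 4.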

Test Case 10: "Role-Based Access Control for ID Format Management"

Test Case: ONB02US07_TC_010

Title: Verify only Utility Administrator or System Admin roles can create or modify ID formats

Created By: Arpita
Created Date: June 08, 2025
Version: 1.0

Classification

  • Module/Feature: ID & Reference Format Settings
  • Test Type: Security
  • Test Level: System
  • Priority: P1-Critical
  • Execution Phase: Smoke
  • Automation Status: Automated

Enhanced Tags for 17 Reports Support

Tags: MOD-Onboarding, P1-Critical, Phase-Smoke, Type-Security, Platform-Web, Report-Engineering/Security, Customer-All, Risk-High, Business-Critical, Revenue-Impact-High, Role-Based-Access, Security-Validation, auth Services, Onboarding Services, Security, API

Test Procedure

Step #

Action

Expected Result

Test Data

Comments

1

Login as Utility Administrator

Access granted to ID format creation

admin@utilitytest.com

Admin access

2

Create new ID format

Format creation successful

Entity: Customer

Admin capability

3

Login as System Admin

Access granted to ID format management

sysadmin@utilitytest.com

System admin access

4

Modify existing ID format

Modification successful

Updated config

System admin capability

5

Login as regular user

Access denied to ID format management

user@utilitytest.com

Restricted access

Verification Points

  • Primary_Verification: Only authorized roles can manage ID formats
  • Secondary_Verifications: Appropriate error messages for unauthorized access
  • Negative_Verification: No unauthorized format management possible
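
The access rule exercised here can be sketched as a role allow-list; the role strings come from the test steps, and the check itself is illustrative rather than the product's authorization code:

```python
# Roles permitted to create or modify ID formats (per this test's expectations).
AUTHORIZED_ROLES = {"Utility Administrator", "System Admin"}

def can_manage_formats(role: str) -> bool:
    """True only for roles allowed to manage ID formats."""
    return role in AUTHORIZED_ROLES
```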

Test Case 11: "Generate Format Preview with Actual Data"

Test Case: ONB02US07_TC_011

Title: Verify system generates preview showing format appearance with actual data

Created By: Arpita
Created Date: June 08, 2025
Version: 1.0

Classification

  • Module/Feature: ID & Reference Format Settings
  • Test Type: Functional
  • Test Level: Integration
  • Priority: P2-High
  • Execution Phase: Regression
  • Automation Status: Automated

Enhanced Tags for 17 Reports Support

Tags: MOD-Onboarding, P2-High, Phase-Regression, Type-Functional, Platform-Web, Report-Engineering/QA, Customer-All, Risk-Medium, Business-High, Revenue-Impact-Medium, Integration-Real-time-Preview, Dynamic-Updates, UI-Validation, Onboarding Services, Database, HappyPath

Test Procedure

Step #

Action

Expected Result

Test Data

Comments

1

Configure Customer ID format

Preview displays: WA-CUST-202406-0001

Complete configuration

Live preview

2

Change prefix to CONS

Preview updates to: WA-CONS-202406-0001

Prefix: CONS

Dynamic update

3

Change date format to YYMM

Preview updates to: WA-CONS-2406-0001

Date: YYMM

Format change

4

Change utility service to EL

Preview updates to: EL-CONS-2406-0001

Service: Electric

Service change

5

Verify preview accuracy

Preview matches expected format pattern

N/A

Accuracy validation

Verification Points

  • Primary_Verification: Preview accurately shows format with actual data
  • Secondary_Verifications: Real-time updates as configuration changes
  • Negative_Verification: No preview display errors
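
The preview updates in steps 1–4 follow a fixed component order: utility service, prefix, date element, zero-padded sequence. `build_preview` is an illustrative recomputation of that pattern, not the actual preview engine:

```python
def build_preview(service: str, prefix: str, date_part: str,
                  seq_len: int = 4, start: int = 1) -> str:
    """Compose the preview string from the currently configured components."""
    return "-".join([service, prefix, date_part, str(start).zfill(seq_len)])

build_preview("WA", "CUST", "202406")  # "WA-CUST-202406-0001"
build_preview("EL", "CONS", "2406")    # "EL-CONS-2406-0001"
```

Each configuration change in the test simply re-invokes this composition with the new component value, which is why the preview updates in real time.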

Test Case 12: "Validate Master and Transaction ID Entity Classification"

Test Case: ONB02US07_TC_012

Title: Verify system correctly classifies entities as Master ID or Transaction ID types

Created By: Arpita
Created Date: June 08, 2025
Version: 1.0

Classification

  • Module/Feature: ID & Reference Format Settings
  • Test Type: Functional
  • Test Level: System
  • Priority: P2-High
  • Execution Phase: Regression
  • Automation Status: Automated

Enhanced Tags for 17 Reports Support

Tags: MOD-Onboarding, P2-High, Phase-Regression, Type-Functional, Platform-Web, Report-Engineering/Product, Customer-All, Risk-Medium, Business-High, Revenue-Impact-Medium, Classification-Logic, Entity-Management, Onboarding Services, cx Services, mx Services, bx Services, Database

Test Procedure

Step #

Action

Expected Result

Test Data

Comments

1

Create Consumer format in Master ID

Consumer classified as Master ID

Entity: Consumer

Master classification

2

Create Meter format in Master ID

Meter classified as Master ID

Entity: Meter

Master classification

3

Create Payment format in Transaction ID

Payment classified as Transaction ID

Entity: Payment

Transaction classification

4

Create Service Order format in Transaction ID

Service Order classified as Transaction ID

Entity: Service Order

Transaction classification

5

Verify classification enforcement

System enforces proper entity-type mapping

N/A

Classification rules

Verification Points

  • Primary_Verification: Entities correctly classified by Master/Transaction type
  • Secondary_Verifications: Classification rules enforced consistently
  • Negative_Verification: No incorrect entity-type assignments
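
The classification rule can be sketched as two disjoint entity sets, inferred from steps 1–4 of this test case (illustrative only):

```python
# Entity sets inferred from steps 1-4 of this test case (illustrative only).
MASTER_ENTITIES = {"Consumer", "Meter"}
TRANSACTION_ENTITIES = {"Payment", "Service Order"}

def classify(entity: str) -> str:
    """Return the ID category an entity belongs to, enforcing the mapping."""
    if entity in MASTER_ENTITIES:
        return "Master ID"
    if entity in TRANSACTION_ENTITIES:
        return "Transaction ID"
    raise ValueError(f"Unclassified entity type: {entity}")
```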

Test Case 13: "Validate Prefilled Format Standards"

Test Case: ONB02US07_TC_013

Title: Verify system provides correct prefilled formats for each entity type

Created By: Arpita
Created Date: June 08, 2025
Version: 1.0

Classification

  • Module/Feature: ID & Reference Format Settings
  • Test Type: Functional
  • Test Level: System
  • Priority: P3-Medium
  • Execution Phase: Regression
  • Automation Status: Automated

Enhanced Tags for 17 Reports Support

Tags: MOD-Onboarding, P3-Medium, Phase-Regression, Type-Functional, Platform-Web, Report-Product/QA, Customer-All, Risk-Low, Business-Medium, Revenue-Impact-Low, Prefilled-Data, Default-Formats, Onboarding Services, cx Services, mx Services, bx Services, Database, HappyPath

Test Procedure

Step #

Action

Expected Result

Test Data

Comments

1

Access Consumer entity format

Prefilled format shows: Con-0001

Entity: Consumer

Consumer prefill

2

Access Meter entity format

Prefilled format shows: Mtr-0001

Entity: Meter

Meter prefill

3

Access Request entity format

Prefilled format shows: Req-0001

Entity: Request

Request prefill

4

Access Payment entity format

Prefilled format shows: mmyy-0001

Entity: Payment

Payment prefill

5

Access Service Order entity format

Prefilled format shows: So-0001

Entity: Service Order

Service Order prefill

6

Access Billing entity format

Prefilled format shows: Inv-0001

Entity: Billing

Billing prefill

Verification Points

  • Primary_Verification: All entity types have correct prefilled formats
  • Secondary_Verifications: Prefilled formats follow defined standards
  • Negative_Verification: No incorrect or missing prefilled formats
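
The prefilled standards in steps 1–6 amount to a per-entity default table; a lookup sketch (the accessor name is hypothetical):

```python
# Prefilled defaults exactly as listed in steps 1-6 above.
PREFILLED_FORMATS = {
    "Consumer": "Con-0001",
    "Meter": "Mtr-0001",
    "Request": "Req-0001",
    "Payment": "mmyy-0001",
    "Service Order": "So-0001",
    "Billing": "Inv-0001",
}

def prefilled_format(entity: str) -> str:
    """Return the prefilled format shown when an entity's format is opened."""
    return PREFILLED_FORMATS[entity]
```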

Test Case 14: "Cross-Browser Compatibility Validation"

Test Case: ONB02US07_TC_014

Title: Verify ID format configuration interface works across all supported browsers

Created By: Arpita
Created Date: June 08, 2025
Version: 1.0

Classification

  • Module/Feature: ID & Reference Format Settings
  • Test Type: Compatibility
  • Test Level: System
  • Priority: P2-High
  • Execution Phase: Regression
  • Automation Status: Automated

Enhanced Tags for 17 Reports Support

Tags: MOD-Onboarding, P2-High, Phase-Regression, Type-Compatibility, Platform-Multi, Report-QA/Engineering, Customer-All, Risk-Medium, Business-Medium, Revenue-Impact-Medium, Integration-Cross-Platform, Browser-Compatibility, UI-Consistency, Onboarding Services, Cross module

Business Context

  • Customer_Segment: All
  • Revenue_Impact: Medium
  • Business_Priority: Should-Have
  • Customer_Journey: Daily-Usage
  • Compliance_Required: No
  • SLA_Related: No

Quality Metrics

  • Risk_Level: Medium
  • Complexity_Level: High
  • Expected_Execution_Time: 20 minutes
  • Reproducibility_Score: High
  • Data_Sensitivity: Low
  • Failure_Impact: Medium

Coverage Tracking

  • Feature_Coverage: 100% of browser compatibility
  • Integration_Points: Multiple browser engines, UI components
  • Code_Module_Mapped: Onboarding
  • Requirement_Coverage: Complete
  • Cross_Platform_Support: Web (Multi-browser)

Stakeholder Reporting

  • Primary_Stakeholder: QA/Engineering
  • Report_Categories: Compatibility-Matrix, Platform-Coverage
  • Trend_Tracking: Yes
  • Executive_Visibility: No
  • Customer_Impact_Level: Medium

Requirements Traceability

Test Environment

  • Environment: Staging
  • Browser/Version: Chrome 115+, Firefox 110+, Safari 16+, Edge Latest
  • Device/OS: Windows 10/11, macOS 12+
  • Screen_Resolution: Desktop-1920x1080, 1366x768
  • Dependencies: All supported browsers installed
  • Performance_Baseline: Consistent performance across browsers
  • Data_Requirements: Same test data across all browsers

Prerequisites

  • Setup_Requirements: Multiple browsers available for testing
  • User_Roles_Permissions: Utility Administrator access
  • Test_Data: Standard format configuration data
  • Prior_Test_Cases: Functional validation completed

Test Procedure

Step #

Action

Expected Result

Test Data

Comments

1

Test Chrome browser functionality

All features work correctly

Standard test data

Baseline browser

2

Test Firefox browser functionality

Identical behavior to Chrome

Same test data

Mozilla engine

3

Test Safari browser functionality

Consistent UI and functionality

Same test data

WebKit engine

4

Test Edge browser functionality

Same results across all features

Same test data

Chromium engine

5

Verify UI element consistency

Layout and controls work identically

N/A

Cross-browser consistency

6

Test form field behavior

Input handling identical

Various inputs

Form compatibility

7

Verify dropdown functionality

All dropdowns work correctly

N/A

Control consistency

8

Test modal dialog behavior

Modals display and function properly

N/A

Dialog compatibility

9

Verify live preview rendering

Preview accuracy across browsers

Format configurations

Rendering consistency

10

Test responsive behavior

Interface adapts to different screen sizes

N/A

Responsive design

11

Verify JavaScript functionality

All interactive features work

N/A

Script compatibility

12

Test CSS rendering

Styling consistent across browsers

N/A

Style compatibility

13

Verify error message display

Error messages appear correctly

Invalid inputs

Error handling

14

Test navigation behavior

Menu and tab navigation works

N/A

Navigation consistency

15

Performance comparison

Similar load times across browsers

N/A

Performance parity

Verification Points

  • Primary_Verification: All core functionality works identically across supported browsers
  • Secondary_Verifications: UI consistency, performance parity, error handling uniformity
  • Negative_Verification: No browser-specific issues, no missing functionality in any browser

Test Case 15: "Concurrent User Performance and System Stability"

Test Case: ONB02US07_TC_015

Title: Verify system performance with concurrent users accessing ID format configuration

Created By: Arpita
Created Date: June 08, 2025
Version: 1.0

Classification

  • Module/Feature: ID & Reference Format Settings
  • Test Type: Performance
  • Test Level: System
  • Priority: P2-High
  • Execution Phase: Performance
  • Automation Status: Automated

Enhanced Tags for 17 Reports Support

Tags: MOD-Onboarding, P2-High, Phase-Performance, Type-Performance, Platform-Web, Report-Engineering/QA, Customer-Enterprise, Risk-Medium, Business-High, Revenue-Impact-Medium, Integration-Concurrent-Users, Load-Testing, System-Stability, auth Services, Onboarding Services, Database, Cross module

Business Context

  • Customer_Segment: Enterprise
  • Revenue_Impact: Medium
  • Business_Priority: Should-Have
  • Customer_Journey: Daily-Usage
  • Compliance_Required: No
  • SLA_Related: Yes

Quality Metrics

  • Risk_Level: Medium
  • Complexity_Level: High
  • Expected_Execution_Time: 15 minutes
  • Reproducibility_Score: Medium
  • Data_Sensitivity: Low
  • Failure_Impact: Medium

Coverage Tracking

  • Feature_Coverage: 100% of concurrent access scenarios
  • Integration_Points: Database connection pool, session management, caching layer
  • Code_Module_Mapped: Onboarding
  • Requirement_Coverage: Complete
  • Cross_Platform_Support: Web

Stakeholder Reporting

  • Primary_Stakeholder: Engineering/QA
  • Report_Categories: Performance-Dashboard, System-Reliability, Scalability-Metrics
  • Trend_Tracking: Yes
  • Executive_Visibility: Yes
  • Customer_Impact_Level: High

Requirements Traceability

Test Environment

  • Environment: Performance Testing Environment
  • Browser/Version: Chrome 115+ (multiple instances)
  • Device/OS: Load testing infrastructure
  • Screen_Resolution: N/A
  • Dependencies: Load testing tools, performance monitoring
  • Performance_Baseline: < 3 seconds page load, < 500ms API response
  • Data_Requirements: Multiple test user accounts

Prerequisites

  • Setup_Requirements: Load testing environment configured, monitoring tools active
  • User_Roles_Permissions: Multiple Utility Administrator and System Admin accounts
  • Test_Data: 10 concurrent user credentials, varied test scenarios
  • Prior_Test_Cases: Functional validation completed

Test Procedure

Step #

Action

Expected Result

Test Data

Comments

1

Establish baseline with single user

Record baseline performance metrics

1 user

Performance baseline

2

Simulate 2 concurrent Utility Admins

System handles both users without degradation

2 users

Initial concurrency

3

Add 1 System Admin concurrent access

3 users access system simultaneously

3 users

Mixed roles

4

Increase to 5 concurrent Utility Admins

Page load time remains < 5 seconds

5 users

Moderate load

5

Monitor system performance

Performance remains within acceptable limits

N/A

Resource monitoring

9

Check memory and CPU utilization

System resources within normal range

N/A

System health

10

Test session management

No session conflicts or crossover

All users

Session isolation

11

Simulate network latency

Performance degrades gracefully

Simulated delays

Network resilience

12

Test rapid successive operations

System handles burst operations

Rapid clicks/operations

Burst handling

13

Monitor error rates

Error rate remains < 1%

N/A

Error monitoring

14

Test graceful user logout

Users can log out cleanly under load

N/A

Session cleanup

15

Verify data consistency

All format changes properly saved

N/A

Data integrity

Verification Points

  • Primary_Verification: System maintains performance with up to 7 concurrent users (2 roles)
  • Secondary_Verifications: No data corruption, proper session management, resource utilization within limits
  • Negative_Verification: No system crashes, no session conflicts, no data loss

Test Case 8: "Enhanced Contextual Help System Validation"

Test Case: ONB02US07_TC_008

Title: Verify enhanced contextual help system provides accurate guidance

Created By: Arpita
Created Date: June 08, 2025
Version: 1.0

Classification

  • Module/Feature: ID & Reference Format Settings
  • Test Type: Functional
  • Test Level: System
  • Priority: P3-Medium
  • Execution Phase: Regression
  • Automation Status: Manual

Enhanced Tags for 17 Reports Support

Tags: MOD-HelpSystem, P3-Medium, Phase-Regression, Type-Usability, Platform-Web, Report-Product/QA, Customer-All, Risk-Low, Business-Medium, Revenue-Impact-Low, Integration-Help-Engine, User-Experience, Enhanced-Features, Onboarding Services, HappyPath

Business Context

  • Customer_Segment: All
  • Revenue_Impact: Low
  • Business_Priority: Could-Have
  • Customer_Journey: Onboarding
  • Compliance_Required: No
  • SLA_Related: No

Quality Metrics

  • Risk_Level: Low
  • Complexity_Level: Medium
  • Expected_Execution_Time: 10 minutes
  • Reproducibility_Score: High
  • Data_Sensitivity: None
  • Failure_Impact: Low

Coverage Tracking

  • Feature_Coverage: 100% of enhanced help features
  • Integration_Points: Help content system, UI tooltips, contextual guidance
  • Code_Module_Mapped: Onboarding
  • Requirement_Coverage: Complete
  • Cross_Platform_Support: Web

Stakeholder Reporting

  • Primary_Stakeholder: Product/QA
  • Report_Categories: User-Experience, Feature-Adoption, Support-Reduction
  • Trend_Tracking: Yes
  • Executive_Visibility: No
  • Customer_Impact_Level: Low

Requirements Traceability

Test Environment

  • Environment: Staging
  • Browser/Version: Chrome 115+, Firefox 110+, Safari 16+
  • Device/OS: Windows 10/11, macOS 12+
  • Screen_Resolution: Desktop-1920x1080
  • Dependencies: Help content service, tooltip engine
  • Performance_Baseline: < 1 second help content load
  • Data_Requirements: Complete help content database

Prerequisites

  • Setup_Requirements: Enhanced help system enabled, content populated
  • User_Roles_Permissions: Utility Administrator access
  • Test_Data: Standard user credentials
  • Prior_Test_Cases: Basic navigation functionality verified

Test Procedure

Step #

Action

Expected Result

Test Data

Comments

1

Hover over Entity field label

Tooltip appears with entity explanation

N/A

Hover functionality

2

Verify tooltip content accuracy

Clear explanation of entity types

N/A

Content validation

3

Hover over Prefix field

Contextual help about prefix usage

N/A

Field-specific help

4

Check prefix examples in tooltip

Real-world examples shown

N/A

Example quality

5

Hover over Sequence Length field

Length guidance and implications

N/A

Technical guidance

6

Verify sequence length calculations

Math examples for volume planning

N/A

Calculation help

7

Hover over Date Element field

Date format options explained

N/A

Format guidance

8

Check date format examples

Multiple date format examples

N/A

Format examples

9

Access in-context help icon

Detailed help panel opens

N/A

Help panel access

10

Verify help panel content

Comprehensive format building guide

N/A

Content completeness

11

Test help search functionality

Search finds relevant help topics

Search: "prefix"

Search accuracy

12

Verify best practice recommendations

System suggests optimal configurations

N/A

Recommendation engine

13

Check format impact guidance

Clear explanations of setting effects

N/A

Impact clarity

14

Test help panel navigation

Easy navigation between help topics

N/A

Navigation usability

15

Verify help content accessibility

Help accessible via keyboard navigation

N/A

Accessibility

Verification Points

  • Primary_Verification: Enhanced help system provides accurate, contextual guidance
  • Secondary_Verifications: Help content is complete, accessible, and searchable
  • Negative_Verification: No missing help content, no broken help links, no accessibility issues

Test Case 9: "Format Testing Engine with Edge Case Validation"

Test Case: ONB02US07_TC_009

Title: Verify format testing feature validates ID generation with edge cases

Created By: Arpita
Created Date: June 08, 2025
Version: 1.0

Classification

  • Module/Feature: ID & Reference Format Settings
  • Test Type: Functional
  • Test Level: System
  • Priority: P2-High
  • Execution Phase: Regression
  • Automation Status: Planned-for-Automation

Enhanced Tags for 17 Reports Support

Tags: MOD-FormatTesting, P2-High, Phase-Regression, Type-Validation, Platform-Web, Report-Engineering/QA, Customer-All, Risk-Medium, Business-High, Revenue-Impact-Medium, Integration-Testing-Engine, Edge-Case-Validation, Enhanced-Features, Onboarding Services, cx Services, mx Services, bx Services, Database, API

Business Context

  • Customer_Segment: All
  • Revenue_Impact: Medium
  • Business_Priority: Should-Have
  • Customer_Journey: Onboarding
  • Compliance_Required: No
  • SLA_Related: No

Quality Metrics

  • Risk_Level: Medium
  • Complexity_Level: High
  • Expected_Execution_Time: 12 minutes
  • Reproducibility_Score: High
  • Data_Sensitivity: Low
  • Failure_Impact: Medium

Coverage Tracking

  • Feature_Coverage: 100% of format testing feature
  • Integration_Points: ID generation engine, validation rules, edge case handlers
  • Code_Module_Mapped: Onboarding
  • Requirement_Coverage: Complete
  • Cross_Platform_Support: Web

Stakeholder Reporting

  • Primary_Stakeholder: Engineering/QA
  • Report_Categories: Feature-Quality, Edge-Case-Coverage, System-Reliability
  • Trend_Tracking: Yes
  • Executive_Visibility: No
  • Customer_Impact_Level: Medium

Requirements Traceability

Test Environment

  • Environment: Staging
  • Browser/Version: Chrome 115+, Firefox 110+, Safari 16+
  • Device/OS: Windows 10/11, macOS 12+
  • Screen_Resolution: Desktop-1920x1080
  • Dependencies: Format testing service, ID generation engine
  • Performance_Baseline: < 2 seconds test execution
  • Data_Requirements: Various edge case test scenarios

Prerequisites

  • Setup_Requirements: Format testing feature enabled
  • User_Roles_Permissions: Utility Administrator access
  • Test_Data: ID format configuration ready for testing
  • Prior_Test_Cases: Format creation functionality verified

Test Procedure

| Step # | Action | Expected Result | Test Data | Comments |
| --- | --- | --- | --- | --- |
| 1 | Access format testing interface | Testing panel opens in configuration modal | N/A | Feature access |
| 2 | Test with minimum sequence number | ID generated correctly with 0001 | Starting: 1 | Minimum boundary |
| 3 | Test with maximum sequence number | ID generated with max value (9999 for 4 digits) | Starting: 9999 | Maximum boundary |
| 4 | Test sequence rollover scenario | System handles sequence exceeding max digits | Starting: 10000 | Rollover handling |
| 5 | Test with leap year date | Date element handles Feb 29 correctly | Date: 2024-02-29 | Leap year edge case |
| 6 | Test with end-of-year date | Date transitions properly across years | Date: 2024-12-31 | Year transition |
| 7 | Test with empty prefix | ID generates without prefix component | Prefix: [empty] | Component omission |
| 8 | Test with special characters in allowed range | System handles permitted special chars | Prefix: "TEST-1" | Character validation |
| 9 | Test with maximum field lengths | All fields at maximum allowed length | Max length data | Length boundaries |
| 10 | Test multiple rapid generations | System generates unique sequential IDs | Rapid generation | Uniqueness test |
| 11 | Test with different utility services | Format adapts to different service codes | Service: EL, GA, SW | Service variation |
| 12 | Test concurrent ID generation simulation | Multiple simultaneous requests handled | Concurrent simulation | Concurrency test |
| 13 | Verify test results display | Clear indication of test success/failure | N/A | Result reporting |
| 14 | Test format with all optional fields empty | System generates minimal valid ID | All optional empty | Minimal configuration |
| 15 | Validate test performance | Test execution completes within time limit | N/A | Performance validation |
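For reference, the boundary behaviour exercised in steps 2–4 and 7 can be sketched with a toy generator. This is an illustrative model only — the function name, padding rule, and rollover handling are assumptions for the sketch, not the SMART360 implementation:

```python
from datetime import date

def generate_id(prefix, sequence, seq_digits, date_elem=None, separator="-"):
    """Assemble an ID from an optional prefix, optional date element, and a
    zero-padded sequence. Raises when the sequence exceeds its digit budget
    (the rollover case in step 4)."""
    if sequence >= 10 ** seq_digits:
        raise ValueError(f"sequence {sequence} exceeds {seq_digits} digits")
    parts = []
    if prefix:                        # step 7: an empty prefix is simply omitted
        parts.append(prefix)
    if date_elem:
        parts.append(date_elem.strftime("%Y%m"))
    parts.append(str(sequence).zfill(seq_digits))
    return separator.join(parts)

# Step 2: minimum boundary is zero-padded to the configured length
print(generate_id("CUST", 1, 4, date(2024, 2, 29)))   # CUST-202402-0001
# Step 3: maximum boundary for 4 digits
print(generate_id("CUST", 9999, 4))                   # CUST-9999
# Step 4: rollover must be rejected or escalated, never silently truncated
try:
    generate_id("CUST", 10000, 4)
except ValueError as exc:
    print("rollover detected:", exc)
```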

Verification Points

  • Primary_Verification: Format testing accurately validates ID generation under edge conditions
  • Secondary_Verifications: Proper handling of boundary conditions, clear test result reporting
  • Negative_Verification: No invalid IDs generated, no system errors during testing


Test Suite Organization

Smoke Test Suite

Criteria: P1 priority, basic functionality validation
Test Cases: ONB02US07_TC_001, ONB02US07_TC_002, ONB02US07_TC_007, ONB02US07_TC_009, ONB02US07_TC_010
Execution: Every build deployment
Duration: ~20 minutes

Regression Test Suite

Criteria: P1-P2 priority, automated tests
Test Cases: ONB02US07_TC_001 through ONB02US07_TC_012, ONB02US07_TC_014, ONB02US07_TC_015
Execution: Before each release
Duration: ~75 minutes

Full Test Suite

Criteria: All test cases including edge cases
Test Cases: ONB02US07_TC_001 through ONB02US07_TC_015
Execution: Weekly or major release cycles
Duration: ~90 minutes


API Test Collection (Critical Level ≥7)

High Priority API Endpoints

1. POST /api/v1/id-formats - Create ID Format

  • Importance Level: 9
  • Authentication: Required (Bearer token)
  • Rate Limiting: 100 requests/hour per user
  • Test Coverage: ONB02US07_TC_001, ONB02US07_TC_009

2. PUT /api/v1/id-formats/{id} - Update ID Format

  • Importance Level: 8
  • Authentication: Required (Bearer token)
  • Validation: Duplicate prevention, business rules
  • Test Coverage: ONB02US07_TC_003

3. GET /api/v1/id-formats/validate - Validate Format

  • Importance Level: 8
  • Authentication: Required (Bearer token)
  • Purpose: Real-time validation during configuration
  • Test Coverage: ONB02US07_TC_009, ONB02US07_TC_011

4. GET /api/v1/audit-logs/id-formats - Retrieve Audit Logs

  • Importance Level: 7
  • Authentication: Required (System Admin role)
  • Filtering: Date range, user, action type
  • Test Coverage: ONB02US07_TC_005
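Since the create endpoint's request schema is not reproduced in this document, the following client-side payload check is a hypothetical sketch — the field names (`entityType`, `idCategory`, `sequenceLength`, `startingNumber`) are assumed for illustration, not taken from the actual SMART360 API contract:

```python
# Hypothetical payload checker for POST /api/v1/id-formats.
# Field names are illustrative assumptions, not the real API contract.
REQUIRED_FIELDS = {"entityType", "idCategory", "sequenceLength", "startingNumber"}

def validate_create_payload(payload):
    """Return a list of validation errors; an empty list means the payload
    passes the client-side checks mirrored from the test cases."""
    errors = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - payload.keys())]
    if payload.get("idCategory") not in (None, "MASTER", "TRANSACTION"):
        errors.append("idCategory must be MASTER or TRANSACTION")
    seq = payload.get("sequenceLength")
    if isinstance(seq, int) and not (1 <= seq <= 10):  # boundary from the edge-case section
        errors.append("sequenceLength must be 1..10")
    return errors

print(validate_create_payload({"entityType": "Customer", "idCategory": "MASTER",
                               "sequenceLength": 4, "startingNumber": 1}))  # []
```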

Performance Benchmarks

Expected Performance Criteria

Page Load Performance

  • Dashboard Load: < 3 seconds
  • Configuration Modal: < 2 seconds
  • Audit Logs Page: < 5 seconds
  • Large Format List: < 4 seconds

API Response Times

  • Format Creation: < 500ms
  • Format Validation: < 200ms
  • Format Retrieval: < 300ms
  • Audit Log Query: < 1 second

Concurrent User Performance

  • 2-3 Users: No performance degradation
  • 4-5 Users: < 10% performance impact
  • 6-7 Users: < 20% performance impact
  • 8+ Users: Graceful degradation with user notification
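The response-time budgets and concurrent-user thresholds above can be encoded directly in a performance harness. A minimal sketch, with values copied from the tables above (the helper names are assumptions for illustration):

```python
# Response-time budgets from the API table above, in milliseconds.
API_BUDGET_MS = {
    "format_creation": 500,
    "format_validation": 200,
    "format_retrieval": 300,
    "audit_log_query": 1000,
}

def check_latency(endpoint, observed_ms):
    """True when the observed latency is strictly within the published budget."""
    return observed_ms < API_BUDGET_MS[endpoint]

def allowed_degradation(concurrent_users):
    """Maximum acceptable performance impact for a given user count, per the
    concurrent-user table; None marks the 8+ tier, which degrades gracefully
    with a user notification rather than a hard threshold."""
    if concurrent_users <= 3:
        return 0.0
    if concurrent_users <= 5:
        return 0.10
    if concurrent_users <= 7:
        return 0.20
    return None

print(check_latency("format_validation", 150))   # within the 200 ms budget
print(allowed_degradation(6))                    # 6-7 users: up to 20% impact
```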

Integration Test Map

Internal System Integrations

1. Authentication Service Integration

  • Purpose: Role-based access control
  • Test Coverage: All test cases verify proper authentication
  • Critical Scenarios: Token validation, role verification, session management

2. Database Layer Integration

  • Purpose: Data persistence and retrieval
  • Test Coverage: Format creation, modification, audit logging
  • Critical Scenarios: ACID compliance, concurrent access, data integrity

3. Validation Engine Integration

  • Purpose: Business rule enforcement
  • Test Coverage: Duplicate prevention, format validation, constraint checking
  • Critical Scenarios: Real-time validation, edge case handling

External System Dependencies

1. Utility Service Catalog

  • Purpose: Available utility service types
  • Test Coverage: Service selection validation
  • Fallback: Default service options available

2. Entity Management System

  • Purpose: Available entity types for ID format assignment
  • Test Coverage: Entity dropdown population
  • Fallback: Core entity types (Customer, Meter, Bill, Payment) always available

Dependency Map

Test Execution Dependencies

Sequential Dependencies

  1. Authentication → All subsequent tests
  2. Format Creation → Format Modification tests
  3. Format Changes → Audit Log tests
  4. Basic Functionality → Performance tests

Parallel Execution Groups

  • Group A: Format creation, validation testing
  • Group B: Audit log access, filtering tests
  • Group C: UI compatibility, responsive design tests
  • Group D: API testing, performance testing
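Because groups A–D have no dependencies on one another, a runner may fan them out concurrently. This is a minimal sketch with placeholder group functions — a real runner would dispatch the test cases assigned to each group above:

```python
from concurrent.futures import ThreadPoolExecutor

def run_group(name):
    """Placeholder group runner: pretend every group passes. In practice this
    would execute the test cases listed for that parallel group."""
    return name, True

groups = ["A: creation/validation", "B: audit logs", "C: UI", "D: API/perf"]

# One worker per independent group; results collected as {group: passed}.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(pool.map(run_group, groups))

print(results)
```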

Failure Handling

  • Authentication Failure: Skip all dependent tests
  • Database Connectivity: Mark infrastructure tests as blocked
  • Service Unavailability: Use fallback test scenarios
  • Performance Environment Issues: Execute functional tests only

Edge Case Coverage (80% Detail Level)

Boundary Value Testing

  1. Sequence Length Boundaries: 1 digit (minimum) to 10 digits (maximum)
  2. Prefix Length: Empty, 1 character, 10 characters (maximum)
  3. Date Format Variations: YYYY, YYMM, YYYYMM, YYYYMMDD
  4. Starting Number Ranges: 0, 1, 999999999 (max for sequence length)

Special Character Handling

  1. Allowed Separators: Hyphen, underscore, period, none
  2. Prefix Special Characters: Alphanumeric only validation
  3. Unicode Character Support: Extended character set testing
  4. Case Sensitivity: Upper/lower case handling
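One plausible encoding of the alphanumeric-only prefix rule and the empty-to-10-character boundary is sketched below; the normalisation to upper case is an assumption, since the document leaves case handling open:

```python
import re

# Empty through 10 alphanumeric characters, per the prefix boundaries above.
# Separators (hyphen, underscore, period) are configured separately and are
# therefore excluded from the prefix pattern itself.
PREFIX_RE = re.compile(r"[A-Za-z0-9]{0,10}")

def normalise_prefix(prefix):
    """Reject prefixes outside the assumed alphanumeric rule; normalise the
    survivors to upper case (an illustrative policy, not a documented one)."""
    if PREFIX_RE.fullmatch(prefix) is None:
        raise ValueError(f"invalid prefix: {prefix!r}")
    return prefix.upper()

print(normalise_prefix("cust"))   # CUST
print(normalise_prefix(""))       # empty prefix is allowed (component omission)
```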

Date and Time Edge Cases

  1. Leap Year Handling: February 29th date elements
  2. Year Transitions: December 31st to January 1st
  3. Month Boundaries: End-of-month date handling
  4. Timezone Considerations: UTC vs local time formatting
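The date-format variations and calendar edge cases above can be exercised with standard-library formatting; the token-to-`strftime` mapping below is assumed for illustration:

```python
from datetime import date, timedelta

# Mapping of the document's date-element tokens to strftime patterns
# (token names from the format-variations list; the mapping is an assumption).
TOKENS = {"YYYY": "%Y", "YYMM": "%y%m", "YYYYMM": "%Y%m", "YYYYMMDD": "%Y%m%d"}

def date_element(token, d):
    """Render a date element for the given token."""
    return d.strftime(TOKENS[token])

# Leap-year edge case: Feb 29 only exists in leap years
print(date_element("YYYYMMDD", date(2024, 2, 29)))   # 20240229
# Year transition: Dec 31 -> Jan 1 changes every token containing the year
eve = date(2024, 12, 31)
print(date_element("YYYYMM", eve), date_element("YYYYMM", eve + timedelta(days=1)))
```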

Volume and Scale Testing

  1. Large Entity Volumes: 1 million+ entities per format
  2. Many Format Configurations: 100+ different formats
  3. High-Frequency Generation: 1000+ IDs per minute
  4. Long-Running Sequences: Sequence number exhaustion scenarios

Security Test Scenarios

Authentication & Authorization Testing

  1. Role-Based Access: Utility Admin vs System Admin permissions
  2. Session Security: Session timeout, concurrent session handling
  3. Token Security: JWT validation, token refresh, token revocation

Input Validation Security

  1. SQL Injection Prevention: Malicious input in all fields
  2. XSS Prevention: Script injection in text fields
  3. CSRF Protection: Cross-site request forgery prevention
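The SQL-injection checks verify that user input is always bound as data, never concatenated into query text. A self-contained sketch with an illustrative table (the schema is not the real one):

```python
import sqlite3

# Illustrative in-memory schema; table and column names are assumptions.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE id_formats (entity TEXT, prefix TEXT)")
conn.execute("INSERT INTO id_formats VALUES ('Customer', 'CUST')")

malicious = "CUST'; DROP TABLE id_formats; --"
# The ? placeholder binds the value as data, so the payload cannot
# terminate the statement or inject a second one.
rows = conn.execute(
    "SELECT entity FROM id_formats WHERE prefix = ?", (malicious,)
).fetchall()
print(rows)   # [] -- the payload is treated as a literal; no rows match
# The table survives the attempted DROP:
print(conn.execute("SELECT COUNT(*) FROM id_formats").fetchone()[0])   # 1
```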

Data Protection Testing

  1. Audit Trail Integrity: Tamper-proof logging verification
  2. Configuration Data Security: Encryption of sensitive settings
  3. Access Logging: Complete audit trail of configuration access

API Security Testing

  1. Authentication Bypass Attempts: Unauthorized API access
  2. Rate Limiting: API abuse prevention
  3. Parameter Tampering: Invalid parameter manipulation

Validation Checklist

✅ Comprehensive Coverage Verification

  • [x] All 15 acceptance criteria covered with test cases
  • [x] All business rules tested with weighted calculations
  • [x] Cross-browser/device compatibility included
  • [x] Positive and negative scenarios covered
  • [x] Integration points tested
  • [x] Security considerations addressed
  • [x] Performance benchmarks defined
  • [x] Realistic test data provided
  • [x] Clear dependency mapping included
  • [x] Proper tagging for all 17 BrowserStack reports
  • [x] Edge cases covered at 80% detail level
  • [x] API tests for critical operations (≥7 importance) included

Acceptance Criteria Coverage Matrix

| Acceptance Criteria | Test Case | Coverage Status |
| --- | --- | --- |
| 1. Generate unique IDs within each entity's domain | TC_001 | ✅ Complete |
| 2. Classify ID formats as Master ID or Transaction ID | TC_002 | ✅ Complete |
| 3. Not change existing entity IDs when format changes | TC_003 | ✅ Complete |
| 4. Require at least one active ID format per entity | TC_004 | ✅ Complete |
| 5. Maintain audit history for all ID format changes | TC_005 | ✅ Complete |
| 6. Auto-generate ID format names based on entity type | TC_006 | ✅ Complete |
| 7. Auto-increment current sequence numbers | TC_007 | ✅ Complete |
| 8. Allow configurable starting numbers with default | TC_008 | ✅ Complete |
| 9. Prevent duplicate ID formats for same entity type | TC_009 | ✅ Complete |
| 10. Role-based access control for format management | TC_010 | ✅ Complete |
| 11. Generate preview showing format with actual data | TC_011 | ✅ Complete |
| 12. Classify Master and Transaction ID entities correctly | TC_012 | ✅ Complete |
| 13. Provide correct prefilled formats for each entity | TC_013 | ✅ Complete |
| 14. Cross-browser compatibility support | TC_014 | ✅ Complete |
| 15. Concurrent user performance and stability | TC_015 | ✅ Complete |

Report Coverage Matrix

| Report Category | Test Cases Supporting | Coverage Level |
| --- | --- | --- |
| Quality Dashboard | TC_001, TC_002, TC_007, TC_009, TC_010 | High |
| Module Coverage | All test cases | Complete |
| Feature Adoption | TC_006, TC_008, TC_011, TC_013 | Medium |
| Performance Metrics | TC_015 | High |
| Security Validation | TC_010, Security scenarios | High |
| Compatibility Matrix | TC_014 | Complete |
| API Health | TC_006, API test collection | High |
| User Experience | TC_008, TC_011, TC_013 | Medium |
| Compliance Tracking | TC_005, TC_010 | High |
| Integration Health | All integration scenarios | High |
| Error Tracking | Negative test scenarios | Medium |
| Trend Analysis | All automated test cases | High |
| Executive Summary | P1-Critical test cases | High |
| Platform Coverage | TC_014 | Complete |
| Business Impact | All P1-P2 test cases | High |
| Customer Journey | TC_001, TC_005, TC_010 | Medium |
| Risk Assessment | All test cases by risk level | Complete |


Summary

This comprehensive test suite provides 100% coverage for all 15 acceptance criteria of the ID & Reference Format Settings user story (ONB02US07). The test cases are organized to align directly with each acceptance criterion, making navigation and validation straightforward.

Key Coverage Highlights:

  • 15 detailed test cases covering all acceptance criteria
  • Complete functional coverage of ID format management
  • Security and performance testing included
  • Cross-browser compatibility validation
  • API testing for critical endpoints (≥7 importance level)
  • Comprehensive audit trail testing
  • Role-based access control validation
  • Edge case coverage at 80% detail level
  • Integration testing with all system dependencies

Execution Strategy:

  • Smoke Tests: 5 critical test cases (~20 minutes)
  • Regression Tests: 14 test cases (~75 minutes)
  • Full Suite: All 15 test cases (~90 minutes)

The test suite supports all 17 BrowserStack test management reports through comprehensive tagging and ensures complete traceability from acceptance criteria to test execution results.