Infojini Inc.

Workday Test Lead

Contract
Remote
10 - 30 Years
Mar 3rd, 2026
Required Skillset:
Azure DevOps, Workday, Workday Testing Lead, Salesforce, System Testing, Integration Testing, Regression Testing, AI Governance, AI CoE

Job Description

Title: Workday Test Lead
Duration: 3 Months with possible extension/hire
Location: Olympia, WA 98504 (100% Remote)
 

There are four systems that will be remediated:
▪ Apptio: A mission-critical, cloud-based tool and public-facing IT project dashboard that provides cost transparency so the state can meet the statutory requirement to monitor and track IT investments across the enterprise, including cost modeling and IT expenditures by agency.
▪ Financial Invoicing System (FINS): The interface system for all fee-for-service billing. It will need internal Chart of Accounts (COA) alignment with the current financial invoicing account coding.
▪ Human Resource Management System (HRMS): HRMS will reach end-of-life, after which the state will no longer be able to process payroll. The project to remediate HRMS is an OFM ITD project and a dependency of the One Washington Program.
▪ Salesforce: The Washington State IT Dashboard platform integrates with Apptio: an inbound integration of projects and an outbound integration of financials. Remediation of Salesforce may be necessary based on changes made to Apptio.
 

Scope of the One Washington Project:
The project will implement the Workday solution by remediating systems, assessing future-state business processes, and addressing impacts on its stakeholders. The primary objective of this engagement is to ensure the successful planning, execution, and management of all testing and cutover activities related to the Phase 1a Core Financials Workday implementation. This includes the testing and validation of all configured functionalities, integrations, and business processes to confirm that they meet agency needs.

The Test Lead is responsible for developing a comprehensive test strategy and plan, coordinating test activities, managing defects, and ensuring that all identified issues are resolved in a timely manner. The goal is to achieve a high-quality, defect-free deployment of the Workday system while ensuring that our systems, other enterprise systems (HRMS), and business processes work seamlessly together. This will minimize risks and ensure a smooth transition to the new platform.
 

To date, WaTech has completed substantial foundational testing and business process work in support of the project. This includes the development of a comprehensive Testing Strategy and high-level Testing Plan, along with the completion of key deliverables capturing current-state and target-state business processes, gap analysis outputs, and initial test cases.
 

These deliverables reflect extensive effort to document as-is and to-be processes, identify the impacts and gaps between our as-is and to-be business processes, and establish a baseline understanding of system functionality and business readiness. While this work provides a strong foundation for our project testing, the Program’s enterprise testing approach, sequencing, and phases continue to evolve. As a result, the Project-level Testing Strategy and Plan will require ongoing review, refinement, and updates to align with the most current Program guidance, timelines, environments, and dependencies.
 

The Test Lead is responsible for assessing the existing testing artifacts, identifying and addressing any gaps, updating the Testing Strategy and Plan as needed, and leading execution of testing activities through the remaining project phases.
 

The primary focus areas for this work are outlined below. The Project Manager will meet monthly with the Test Lead vendor to identify and prioritize the tasks, activities, deliverables, and the associated work package to focus on in the coming month.
 

Test Strategy and Planning:
▪ Enhance the project testing strategy and plan to comprehensively include all aspects of business process transformation and technology modernization.
▪ Continuously update the Test Plan to reflect Program testing phases, timelines, milestones, and requirements.
▪ Utilize Azure DevOps (ADO) to develop a method for tracking, monitoring, documenting, and reporting testing status and results. Design a dashboard that clearly articulates testing status at any given point.
▪ Collaborate with the Program, particularly the testing director and leads, to pilot concepts on behalf of enterprise agencies impacted by Phase 1 core financials implementation.
▪ Define test metrics and key performance indicators (KPIs) to assess the effectiveness and efficiency of the testing effort.
▪ Establish a risk-based testing approach that considers business criticality, integration complexity, change impact, and frequency of use.
▪ Evaluate project requirements, Program requirements, and Program user stories to determine appropriate test coverage based on risk assessments.
▪ Identify and prioritize high-risk use cases with significant impact on critical business processes, financial data integrity, and system integration points.
▪ Determine the necessary level of testing for each use case based on risk level, complexity, and business impact.
▪ Allocate testing resources and efforts to ensure adequate coverage of business-critical areas.
▪ Plan and coordinate all testing activities, including resource allocation, scheduling, dependencies, and risk management.
▪ Align the test strategy with the Program testing framework.
▪ Sequence and coordinate testing activities across WaTech and vendor teams.
▪ Communicate testing status, risks, and coverage information to support project readiness and decision-making discussions.
▪ Develop documentation for the testing approach that reflects agreed priorities, assumptions, constraints, and alignment with program-level testing.
▪ Continuously monitor and adjust test coverage and prioritization based on project progress, changes in requirements, and identified risks.
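The ADO tracking and dashboard item above amounts to a roll-up of test outcomes by area. A minimal sketch in Python, assuming illustrative record fields rather than the actual Azure DevOps export schema:

```python
from collections import Counter

# Hypothetical test-case records as they might be exported from ADO;
# field names and IDs here are illustrative, not the real ADO schema.
test_results = [
    {"id": "TC-001", "area": "Integration", "outcome": "Passed"},
    {"id": "TC-002", "area": "Integration", "outcome": "Failed"},
    {"id": "TC-003", "area": "Regression",  "outcome": "Passed"},
    {"id": "TC-004", "area": "Regression",  "outcome": "Blocked"},
]

def summarize_status(results):
    """Roll up outcomes per test area for a point-in-time status view."""
    summary = {}
    for r in results:
        summary.setdefault(r["area"], Counter())[r["outcome"]] += 1
    return summary

def pass_rate(results):
    """Share of executed (non-blocked) cases that passed."""
    executed = [r for r in results if r["outcome"] != "Blocked"]
    if not executed:
        return 0.0
    return sum(r["outcome"] == "Passed" for r in executed) / len(executed)
```

A dashboard tile would then surface `summarize_status` per area alongside the overall `pass_rate`, so status can be articulated at any given point.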
 

Expectation: Risk-Based, Enterprise-Aligned Strategy:
▪ The Test Lead is expected to implement a risk-based testing approach, prioritizing business-critical workflows, high-risk integrations, and mission critical financial processes.
▪ Testing scope and prioritization must be informed by:
o Business criticality
o Integration complexity
o Frequency of use
o Change impact
o Historical defect trends (where available)
▪ The project Test Strategy must align with and integrate into the Master Test Plan and schedule. The Test Lead's priority is to prepare WaTech for the testing and implementation of Phase 1a Core Financials.
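As an illustration, the risk factors listed above can drive a simple weighted prioritization. The weights and use cases below are hypothetical, not values prescribed by the Program:

```python
# Illustrative weights; actual weighting would be agreed with the Program.
RISK_WEIGHTS = {
    "business_criticality": 0.35,
    "integration_complexity": 0.25,
    "frequency_of_use": 0.20,
    "change_impact": 0.20,
}

def risk_score(factors):
    """Weighted score from 1-5 factor ratings; higher = test earlier and deeper."""
    return sum(RISK_WEIGHTS[k] * factors[k] for k in RISK_WEIGHTS)

def prioritize(use_cases):
    """Order use cases by descending risk score."""
    return sorted(use_cases, key=lambda uc: risk_score(uc["factors"]), reverse=True)

# Hypothetical use cases for illustration only.
use_cases = [
    {"name": "Payroll interface (HRMS)", "factors": {
        "business_criticality": 5, "integration_complexity": 5,
        "frequency_of_use": 4, "change_impact": 5}},
    {"name": "Apptio cost dashboard refresh", "factors": {
        "business_criticality": 3, "integration_complexity": 2,
        "frequency_of_use": 3, "change_impact": 2}},
]
```

Historical defect trends, where available, could be folded in as a fifth weighted factor.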
 

Expectation: Stakeholder-Driven Planning:
▪ SMEs, technical leads, and stakeholders are expected to actively participate in defining scope, validating priorities, and approving testing artifacts. The Test Lead is responsible for coordinating and maintaining this engagement throughout the testing phases.
 

Test Case Development:
▪ Define test coverage objectives for end-to-end, integration, and specific scenarios
▪ Develop test cases for Workday functionality, legacy system remediation, new and changed business processes, and new and changed integrations with Workday.
▪ Develop test cases that support multiple test types, including functional, integration, end-to-end, performance, and user acceptance testing.
▪ Develop supplemental test cases to address specific gaps not covered by provided scenarios.
▪ Ensure test cases are comprehensive, accurate, and aligned with approved project requirements and business processes.
▪ Develop and maintain a centralized test case repository using Azure DevOps to support traceability, reuse, and consistency.
▪ Develop and maintain traceability between test cases, requirements, and identified risks.
 

Expectation: Test Case Standards:
▪ All test cases developed under this task must meet the following standards.
▪ Test cases must provide end-to-end and integration-focused coverage, including legacy system remediation, Workday business processes, system integrations (API, SFTP), Apptio, and full business workflows.
▪ Specific gaps not covered by provided scenarios must be identified and addressed through supplemental test cases.
▪ All test cases must be traceable to approved requirements and identified risks via a Requirements Traceability Matrix (RTM).
▪ Test cases and the RTM must undergo formal review and approval by the Project Manager and the Test Lead prior to execution
▪ The vendor must document and communicate test case coverage, traceability status, and closure throughout the testing lifecycle.
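The RTM traceability requirement above reduces to a coverage check: every approved requirement must map to at least one test case. A minimal sketch, with made-up requirement and test-case IDs:

```python
# Minimal requirements traceability matrix: requirement -> covering test cases.
# All IDs are invented for illustration.
rtm = {
    "REQ-FIN-001": ["TC-101", "TC-102"],
    "REQ-FIN-002": ["TC-103"],
    "REQ-INT-001": [],  # gap: no covering test case yet
}

def uncovered_requirements(matrix):
    """Requirements with no linked test case -- candidates for supplemental cases."""
    return sorted(req for req, cases in matrix.items() if not cases)

def coverage_ratio(matrix):
    """Fraction of requirements covered by at least one test case."""
    if not matrix:
        return 0.0
    covered = sum(1 for cases in matrix.values() if cases)
    return covered / len(matrix)
```

Running `uncovered_requirements` before execution flags exactly the gaps that must be closed with supplemental test cases.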
 

Test Execution and Defect Management:
▪ Plan and coordinate execution of all applicable test cases across all testing phases
▪ Manage the execution of test cases and coordinate with agency subject matter experts (SMEs) to ensure participation, feedback, and validation of results
▪ Develop an approach for collecting, consolidating, and reporting test execution results across all testers
▪ Develop testing status, defect trend, and risk summaries to support escalation and readiness discussions
▪ Define defect severity, prioritization, and escalation approaches in alignment with Program practices
▪ Manage the defect lifecycle, including defect reporting, assignment, tracking, retesting, and closure
▪ Plan and conduct regular defect review and triage activities with stakeholders to support timely resolution and visibility into testing risks
▪ Coordinate with development and technical teams to support timely investigation and resolution of defects
 

The Test Lead is responsible for managing test execution across:
o System testing
o Integration testing
o Regression testing
o Business-led UAT
▪ Business users are expected to execute and validate real-world scenarios during UAT, not just technical success paths.
 

Expectation: Disciplined Defect Management:
▪ Azure DevOps (ADO) is the system of record for defects and test execution.
▪ Defects must be prioritized by severity with defined resolution targets. Regular triage meetings are expected, with escalation of unresolved blockers to protect the schedule and cutover.
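The severity-with-resolution-targets discipline above can be sketched as a triage ordering plus an overdue check. Severity levels and targets here are illustrative assumptions; actual targets would be agreed with the Program:

```python
from datetime import timedelta

# Illustrative resolution targets by severity; not Program-mandated values.
RESOLUTION_TARGETS = {
    1: timedelta(days=1),   # Sev 1: blocker
    2: timedelta(days=3),   # Sev 2: critical
    3: timedelta(days=7),   # Sev 3: major
    4: timedelta(days=14),  # Sev 4: minor
}

def triage_order(defects):
    """Sort open defects by severity, then age (oldest first within a severity)."""
    return sorted(defects, key=lambda d: (d["severity"], -d["age_days"]))

def overdue(defects):
    """Defects that have exceeded their severity's resolution target."""
    return [d for d in defects
            if timedelta(days=d["age_days"]) > RESOLUTION_TARGETS[d["severity"]]]

# Hypothetical open defects for illustration.
defects = [
    {"id": "D-7", "severity": 2, "age_days": 5},
    {"id": "D-3", "severity": 1, "age_days": 0},
    {"id": "D-9", "severity": 4, "age_days": 2},
]
```

In practice the `overdue` list is what gets walked in each triage meeting and escalated when blockers stall.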
 

Test Data and Environment Management:
▪ Define test data and environment requirements necessary to support planned testing activities.
▪ Plan for the availability, stability, and coordination of test environments and required system integrations.
▪ Develop an approach for preparing, validating, and maintaining test data that supports priority business scenarios and integration workflows.
▪ Ensure test data is representative of production data and covers agreed priority and high-risk scenarios.
▪ Manage, where applicable, test environments to ensure they are available, stable, and configured appropriately for testing.
▪ Coordinate with technical teams to set up and maintain test environments, including databases, servers, and network configurations.
▪ Plan for identifying, tracking, and communicating test readiness dependencies, constraints, and risks that may impact testing activities.
▪ Define data handling and protection approaches to ensure compliance with security and privacy standards.
 

Expectation: Readiness Before Testing:
▪ Environments are stable and configured and data integrity is maintained across regression testing iterations.
▪ Integrations are active.
▪ Test data is available, validated, and privacy-compliant.
▪ Environment or data readiness issues are program risks, not QA failures, and must be escalated accordingly.
 

Expectation: Representative and Secure Data:
▪ Test data must reflect production-like scenarios, including edge cases and integration data.
▪ Sensitive data must be masked or obfuscated in accordance with security standards.
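The masking expectation above is commonly met by keeping data production-like in format while hiding the sensitive values. A minimal sketch, assuming SSN-style fields and a deterministic pseudonym so joins across test datasets still line up; the record below is invented:

```python
import hashlib

def mask_ssn(ssn):
    """Replace all but the last four digits so the format stays production-like."""
    digits = [c for c in ssn if c.isdigit()]
    return "XXX-XX-" + "".join(digits[-4:])

def pseudonymize(value, salt="test-env"):
    """Deterministic pseudonym: same input always maps to the same token."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:10]

# Hypothetical source record for illustration.
record = {"name": "Jane Doe", "ssn": "123-45-6789"}
masked = {"name": pseudonymize(record["name"]), "ssn": mask_ssn(record["ssn"])}
```

Determinism matters for integration scenarios: the same person pseudonymizes identically in the HRMS extract and the Workday load file.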
 

Test Automation:
▪ Evaluate and recommend test automation tools and frameworks for the project.
▪ Develop and maintain test automation scripts for regression testing and continuous integration testing.
▪ Integrate test automation into the overall testing process to improve efficiency and coverage.
▪ Ensure project performance and track progress against timelines and goals.
▪ Plan, monitor, and track key metrics for progress and performance.
▪ Identify root causes of project performance shortfalls and opportunities to improve.
▪ Develop project communications and status reports in the form of Word documents, PowerPoint presentations, website content, and all other materials as required.
 

Expectation: Value-Driven Automation:
▪ Automation must focus on:
o High-risk
o Repetitive
o Regression-heavy scenarios
▪ Automation is intended to increase coverage and efficiency, not replace business validation.
 

Expectation: Governed AI Usage:
▪ AI-enabled tools (automation, synthetic data) may only be used following security and governance approvals.
▪ If AI tools are delayed or denied, manual testing must proceed without reducing quality expectations.
 

Cutover Planning and Execution:
▪ Define testing-related inputs required to support cutover planning and go-live readiness discussions.
▪ Plan testing activities and sequencing to align with Program cutover timelines, checkpoints, and dependencies.
▪ Develop testing status, defect summaries, and risk information to support cutover coordination and readiness discussions.
▪ Plan for and participate in cutover planning, cutover plan review, mock cutover, and cutover execution activities from a testing perspective.
▪ Lead and manage cutover planning and execution activities in coordination with the Program.
▪ Field and coordinate responses to questions and escalations related to agency cutover activities, working with the cutover team to close open items.
▪ Manage cutover tasks, risks, issues, and dependencies to ensure alignment with the approved cutover plan of record.
▪ Coordinate with system owners, business SMEs, technical SMEs, and workstreams to ensure cutover and system remediation activities, deliverables, and milestones are developed and completed. In coordination with the Project Manager, develop and provide cutover and remediation progress reporting to the OneWA Program and WaTech leadership.
 

Expectation: Testing Exit Criteria Gate for Cutover:
▪ Cutover readiness is contingent upon:
o All defects resolved, or formally dispositioned, before go-live
o RTM complete and validated
o UAT summary reviewed and approved
o Formal test sign-off confirming readiness
▪ Testing is an explicit go/no-go input to cutover decisions
 

Expectation: Integrated Agency and Program Cutover:
▪ The Test Lead is expected to coordinate cutover activities with Program timelines, checklists, and mock cutovers
▪ Risks, issues, and readiness gaps must be actively tracked and escalated to leadership

Preferred education, experience, and competencies:
▪ A strong commitment to quality and protecting the customer experience is essential.
▪ Relevant professional experience in program- or portfolio-level coordination and in technical and project management.
▪ Mastery of test case design techniques and tools.
▪ Minimum of ten years as a software testing or development lead for large-scale SaaS implementations.

Experience with large, complicated enterprise resource planning (ERP) implementations is required:
▪ Minimum of ten years of experience with test automation frameworks, scripting languages, and tools for efficient test automation and coverage.
▪ Experience creating comprehensive test plans and a strong knowledge of software testing methodologies, tools, and frameworks.
▪ Skilled in defining test scenarios, selecting appropriate test data, and ensuring comprehensive test coverage.
▪ Strong analytical skills to identify potential vulnerabilities and define appropriate test coverage for optimal defect identification and resolution.
▪ Proficient in Azure DevOps (ADO).
▪ Experience and knowledge of cutover planning and execution principles and necessary activities.
▪ Experience with testing SaaS solutions and integration testing (e.g., API, data migration) using test automation tools and frameworks.
▪ Strong leadership, communication, and collaboration skills with cross-functional teams.
▪ Ability to work independently and manage multiple priorities under tight deadlines.

Best Regards,
Tarik Khanna
xxxxxxxxxxxxxxx

 
